
Ask HN: How to Self-Study Integrated Circuit Design? - hsikka
Hey HN,

I am a graduate student doing ML research, and lately I've been thinking a
lot about designing learning systems from the hardware up through the
software layers.

I have no experience with what goes on at the processor level, and I was
wondering what prerequisite subjects or general curricula I should follow to
learn and reason at these lower levels of abstraction.

To be clear, I'm doing this to build intuition about new computational
systems and how different chips, from ASICs to neuromorphic hardware, may be
designed.

Any resources or advice telling me I'm a fool is welcome!
======
mud_dauber
If you're looking at this from an architectural perspective, consider grabbing
a copy of Hennessy & Patterson's "Computer Architecture: A Quantitative
Approach". It covers topics like branch prediction, instruction set design,
memory/caching, and so on. A classic.

If you want to get really deep into the physics of IC design, one of my
favorites is "CMOS: Circuit Design, Layout, and Simulation" (Baker). It covers
SPICE modeling, physical transistor construction, and a variety of
digital/analog/memory circuit concepts.

Finally: this article was literally right underneath your post when I opened
HN this morning:
[https://medium.com/@veedrac/to-reinvent-the-processor-671139a4a034](https://medium.com/@veedrac/to-reinvent-the-processor-671139a4a034)

~~~
bsder
Edit: Apparently this has replaced Weste and Eshraghian: "CMOS VLSI Design: A
Circuits and Systems Perspective (4th Edition)" by Weste and Harris
[https://www.amazon.com/CMOS-VLSI-Design-Circuits-Perspective/dp/0321547748](https://www.amazon.com/CMOS-VLSI-Design-Circuits-Perspective/dp/0321547748)

Weste and Eshraghian was (and may still be) the Bible for a very long time:
[https://www.amazon.com/Principles-CMOS-VLSI-Design-Perspective/dp/0201733897](https://www.amazon.com/Principles-CMOS-VLSI-Design-Perspective/dp/0201733897)

The Weste and Harris edition is a lot newer (and nicer) than most other
references.

I would avoid Mead and Conway because it's _really_ dated. It's not wrong, but
a beginner won't know which parts to skip.

If you're looking for something about VLSI _layout_ design, "The Art of Analog
Layout (2nd Edition)" by Hastings would be the choice:
[https://www.amazon.com/Art-Analog-Layout-2nd/dp/0131464108](https://www.amazon.com/Art-Analog-Layout-2nd/dp/0131464108)

~~~
lomereiter
The complete beginner should start with Harris & Harris's "Digital Design and
Computer Architecture" (David Harris is one of the "CMOS VLSI Design"
authors). It provides a gentle introduction to the underlying physics but
focuses more on the logic of the whole endeavor, i.e. how to get from
transistors to CPUs.

~~~
dormando
Looks like there's a 2nd edition (MIPS-based?) and an "ARM edition" (more
recent?) of the same title by the same authors. Online reviews are sparse; I
wonder which is better now?

------
orbifold
Unfortunately most of the tools used in the EDA industry are proprietary and
stuck in the 90s. So they are both expensive and pretty painful to use,
although after a while a kind of Stockholm syndrome sets in. In other words,
you really need Cadence and access to a design kit from a foundry in a
relatively recent process (65nm TSMC, 22nm SOI GlobalFoundries, etc.) to
implement something serious. Similar things apply to digital design, although
with tools like Chisel and Verilator (a SystemVerilog-to-C++ compiler) you can
do more with freely available tools. Once you want to tape out your design, a
lot of very expensive proprietary software is still involved. It is also
pretty unrealistic to do everything yourself: just setting up and
understanding a new analog design kit is a multi-year project for a single
person.

So my best advice is to find a group that has already set up most of that
support infrastructure and obtained licenses for all the tools and design kits
(we have one person who does nothing but handle licenses, software
installation, and the ASIC cluster).

Ideally the group has permanent staff with plenty of hardware experience and a
professor with a proven track record of successful chip tapeouts. The number
of groups that are capable of this in the field you are interested in can be
counted on two hands at best.

I can't say much about analog design, because I've only watched people do it
and haven't done any design alongside them myself, but you need to be a very
particular kind of person to enjoy it, especially if you have no one to do
the layout for you.

Oh, and you should think long and hard about whether you really want to go in
this direction; it involves a lot of really hard and tedious work and is not
glamorous at all. A tapeout might fail because someone accidentally checked a
box in the last step before sending the design to the manufacturer, removing
all blackbox pins on the edge of the design. Or someone might have thought it
was a good idea to put a latch in the PLL reset path, which causes the clock
to never turn on, which you only discover in the back-annotated simulation 5
days before tapeout.

------
guidoism
Oh, oh, oh. I can answer this — I’m a software engineer on sabbatical and I’m
doing the same.

First off, everyone says Hennessy & Patterson and Patterson & Hennessy, and
while I think those books are worth reading at some point, for me they
concentrate on things that I don't care about (like what the cutting edge is)
and gloss over the finer details of how the processor actually works.

I’ve found that Mano’s Computer System Architecture is great for the higher
level stuff. I have the 2nd edition from 1982.

I like Hauck and DeHon’s Reconfigurable Computing from 2007 for FPGAs.

I like Hill and Peterson's Digital Systems: Hardware Organization and Design
from 1973 for describing the design programmatically using an RTL; in this
case they use an APL derivative rather than the more recent (and horrible)
VHDL or Verilog. It's a dense text and I'm still working my way through it,
but it's super awesome in my opinion.

I’ve been working through the Visual ARM1 animated transistor-level simulator
and have made my way to Bryant’s 1984 paper A Switch-Level Model and Simulator
for MOS Digital Systems and recursively reading what I need to understand it
thoroughly. I have Mead & Conway on the way.

I also have a copy of Sedra & Smith (2nd Edition) for understanding the actual
electrical circuits but I rarely reference it.

~~~
guidoism
Also, these two blog posts describe what's going on in the Visual ARM1
simulator, and both are really, really well written:

- [http://www.righto.com/2015/12/reverse-engineering-arm1-ancestor-of.html](http://www.righto.com/2015/12/reverse-engineering-arm1-ancestor-of.html)
- [http://daveshacks.blogspot.com/2015/12/inside-armv1-register-bank.html](http://daveshacks.blogspot.com/2015/12/inside-armv1-register-bank.html)

------
CyberFonic
Have a look at the Wikipedia pages for Carver Mead
([https://en.wikipedia.org/wiki/Carver_Mead](https://en.wikipedia.org/wiki/Carver_Mead))
and Lynn Conway
([https://en.wikipedia.org/wiki/Lynn_Conway](https://en.wikipedia.org/wiki/Lynn_Conway)).
Their book "Introduction to VLSI Systems" and other links on those two
articles should be useful jumping off points for your research.

Most university libraries should have copies of "Introduction to VLSI Systems"
or similar on their shelves.

~~~
contingencies
Thanks for sharing. I'd never heard of Lynn Conway; one would have assumed
she'd be a very zeitgeist figure given Chelsea Manning's recent media
presence.

~~~
p1esk
Another notable trans neuromorphic engineer is Jennifer (Paul) Hasler:
[http://hasler.ece.gatech.edu](http://hasler.ece.gatech.edu)

~~~
voltagex_
Hey, I realise you probably didn't mean to do this, but using a previous name
of someone who's transitioned is not a good thing to do.

~~~
metaphor
Why? An overwhelming majority of this academic's papers [1] are published as
Paul Hasler.

[1]
[http://hasler.ece.gatech.edu/Published_papers/index.html](http://hasler.ece.gatech.edu/Published_papers/index.html)

~~~
ben_w
I know a professional accountant who kept using her maiden name in business
contexts after marriage because otherwise people had a hard time finding her
skillset and work history etc., but for all other purposes she’s Mrs
$NEW_NAME.

As regards trans names, I’m told quite a lot of people (not 100%) find their
old names an uncomfortable reminder of an uncomfortable past.

Plus it’s a sign of respect to use someone’s chosen name, and a sign of
disrespect to use any other name — for non-trans examples, what does it say
about someone’s respect for the other party if they call David Tennant “David
McDonald” or Boris Johnson “Alexander Boris de Pfeffel Johnson”? (At least,
without the brackets that the previous poster used; with is _useful_ for the
same reason as my friend using her maiden name, but it’s still often frowned
on).

------
joshvm
If you have no experience in digital electronics at all, I would actually
start with a book like _Code_ by Charles Petzold. It's a pop-science book
about electronics that goes from light bulbs and switches to simple logic
blocks like adders. It covers the basic building blocks of virtually every
CPU in existence: how to turn transistors into calculators.

It's well written and provides some context to otherwise pretty dry textbooks.
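
To make that endpoint concrete, here is the classic one-bit full adder in
Verilog, the kind of block Petzold assembles out of switches and relays (a
minimal sketch for illustration, not taken from the book):

```verilog
// One-bit full adder expressed as gate-level logic.
module full_adder (
    input  a, b, cin,
    output sum, cout
);
    assign sum  = a ^ b ^ cin;               // XOR chain yields the sum bit
    assign cout = (a & b) | (cin & (a ^ b)); // carry into the next bit
endmodule
```

Chain eight of these together and you have the kind of multi-bit adder the
book builds up to.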

There is quite a bit of difference between how an FPGA works and is
programmed versus a totally custom chip. You could also take a look at
reverse-engineering walkthroughs, e.g. Ken Shirriff has a few on his blog.

Nand2tetris is another good thing to look at.

Finally, reverse engineering the MOS6502:
[https://m.youtube.com/watch?v=fWqBmmPQP40](https://m.youtube.com/watch?v=fWqBmmPQP40)

------
anfilt
Not sure how experienced you are with digital logic, but I would start there.
Start with combinational logic, then sequential logic. Then move on to timing
analysis and basic pipelining. Then get an FPGA dev board and design some
basic things, like a simple processor. You could also use the FPGA as a more
concrete way to test what you have learned in the aforementioned topics.
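
For a taste of that first distinction, this is roughly what combinational
versus sequential logic looks like in Verilog (a generic sketch, not tied to
any particular board):

```verilog
// Combinational: the output is a pure function of the current inputs.
module mux2 (
    input  a, b, sel,
    output y
);
    assign y = sel ? b : a;
endmodule

// Sequential: state changes only on the clock edge.
module counter (
    input            clk, rst,
    output reg [7:0] count
);
    always @(posedge clk) begin
        if (rst) count <= 8'd0;
        else     count <= count + 8'd1;
    end
endmodule
```

Timing analysis then asks whether the combinational paths between registers
settle within one clock period, and pipelining inserts registers to shorten
those paths.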

Then, once you have that down, I would recommend looking at the papers and
literature on the topics of interest and on common architectures and
structures. See what optimizations have been tried. The optimization targets
are generally performance, power, and area.

Also, with ASICs the standard cell libraries make things much easier, since
they are generally characterized pretty well. That is, if you have the funds
to make an ASIC. Even then, you spend most of your time just verifying and
testing your designs.

-EDIT-

If you're thinking of custom analog circuits, you're going to have to learn a
lot more, starting with circuit fundamentals, but you're also going to need a
solid understanding of semiconductor physics. There is also the problem that,
outside of simulation, testing ideas/designs is quite expensive, unless what
you have is something that could be constructed out of discrete components.
So unless you've got a company or university backing you, it's unlikely you
would be able to make any physical prototypes, although prices have been
getting cheaper for older processes, and there are services like MOSIS.

------
Junk_Collector
For a solid foundation in microelectronics, Sedra and Smith's "Microelectronic
Circuits" is the classic go-to. Back when I worked at TI, they actually gave
every new engineer a copy.

This is a very entry-level book that covers a huge number of topics well and
will serve as a launching point to more specific topics. It starts with basic
transistors and works up through op-amp design and digital VLSI, touching on
filter theory and clocks along the way.

------
imtringued
I don't know what your current level is, but for complete beginners I
recommend this game: [http://nandgame.com/](http://nandgame.com/) It will
walk you through the basic components of a very simplified CPU.
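
The game's premise translates directly into Verilog's built-in gate
primitives; as a small illustrative sketch, here is XOR built from nothing
but NAND:

```verilog
// XOR composed from four NAND gates, nandgame-style.
module xor_from_nand (
    input  a, b,
    output y
);
    wire n1, n2, n3;
    nand (n1, a, b);   // n1 = ~(a & b)
    nand (n2, a, n1);  // n2 = ~(a & n1)
    nand (n3, b, n1);  // n3 = ~(b & n1)
    nand (y,  n2, n3); // y  = a ^ b
endmodule
```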

Alternatively, if you are interested in designing custom circuits that are
merely used as accelerators, then you could start off with an FPGA and a
RISC-V core to which you add your own designs.

------
laydn
Integrated circuit design is a vast field with many different areas of
expertise.

Given your goal, your best bet is to buy an FPGA development board and learn
an HDL (Verilog or VHDL), and start experimenting.
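
The "hello world" of that route is usually an LED blinker, something like the
following sketch (generic Verilog; the clock frequency and pin mapping are
board-specific assumptions):

```verilog
// Blink an LED by dividing down the board clock.
// Assumes roughly a 12 MHz input clock; widen the counter for faster clocks.
module blinky (
    input  clk,
    output led
);
    reg [23:0] div = 24'd0;
    always @(posedge clk)
        div <= div + 24'd1;
    assign led = div[23];  // top bit toggles roughly once per second
endmodule
```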

When you go deeper into the field and move over to ASICs, you will have to
learn the details of synthesis, placement, routing, DFT (design for test),
and a whole bunch of other details, which usually exceeds the willpower of a
hobbyist :-)

And if you want to really go all the way to the latest and fastest transistor
nodes, you get into custom circuit design and custom gate layout, which is
not fun at all.

------
planteen
Learn Verilog and practice implementing logic on an FPGA. A MicroZed board
(Xilinx Zynq) is a great choice. You can even implement your own CPU.

~~~
arcticbull
Or an Arty! [1] It's got an Arduino shield connector and is super affordable.

[1] [https://www.xilinx.com/products/boards-and-kits/arty.html](https://www.xilinx.com/products/boards-and-kits/arty.html)

~~~
amelius
Which FPGA vendor is most "Linux-friendly"?

~~~
dormando
TinyFPGA BX is pretty good (all open source tooling), but generally every
vendor has Linux tools. Edit: unless you meant which one would boot Linux
best :P The answer is "the expensive ones".

~~~
amelius
Ha, yes, I meant tooling. But what if you want to scale your design to a
different platform? Would the open source tools still work?

~~~
q3k
For simulation and formal verification, yes, absolutely. Icarus Verilog,
Verilator, and SymbiYosys have all been used for large commercial designs.
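
To illustrate the simulation side, a self-contained, self-checking testbench
like this hypothetical sketch compiles and runs with nothing more than
`iverilog -o tb tb.v && vvp tb`:

```verilog
// tb.v: minimal self-checking testbench for Icarus Verilog.
module inverter (input a, output y);
    assign y = ~a;
endmodule

module tb;
    reg  a;
    wire y;
    inverter dut (.a(a), .y(y));
    initial begin
        a = 0; #1;
        if (y !== 1'b1) $display("FAIL: expected 1, got %b", y);
        a = 1; #1;
        if (y !== 1'b0) $display("FAIL: expected 0, got %b", y);
        $display("done");
        $finish;
    end
endmodule
```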

For synthesis, the only FPGA families currently supported by an open source
flow are the Lattice iCE40 and ECP5 [0]. The latter is something you can be
decently productive with and can fit quite a bit of logic (think: Amiga
reimplementation, PCIe interfacing, etc.).

If you'd like to port synthesizable code from the open source world to the
commercial world, this _should_ just work as long as you're willing to rewrite
any physical interfacing code (since it depends on the hardware blocks
available in a particular family) and stick to high-quality Verilog. But
that's the same as porting across any other FPGA families.

Disclaimer: I work with SymbioticEDA, who develop and provide commercial
support for some open source digital logic tooling, like Yosys and Nextpnr.

[0] - [https://github.com/YosysHQ/nextpnr](https://github.com/YosysHQ/nextpnr)

------
rramadass
Here is an online course based on the book "Digital Design and Computer
Architecture, 2nd Edition" by Harris & Harris, which teaches you processor
design using FPGAs:

[https://blog.digilentinc.com/teaching-computer-architecture-with-fpga-boards-harris-harris/](https://blog.digilentinc.com/teaching-computer-architecture-with-fpga-boards-harris-harris/)

The boards are from Digilent, whose "Analog Discovery 2" and "Digital
Discovery" multi-tools are extremely useful for learning embedded systems
programming.

------
drallison
Silicon Catalyst
([https://siliconcatalyst.com/](https://siliconcatalyst.com/)) operates an
incubator that supports IC design and development with access to commercial
tools. Book learning only goes so far; then apprenticeship and experience
become necessary. To really understand how to make integrated circuits, you
have to make a few.

------
blr246
Pick up some datasheets for IC parts that might be useful in a system you'd
like to design. These are published by the manufacturer, and they contain a
lot of design requirements about how to properly lay out a PCB, and some
information about the theory of operation of the parts. You can piece
together a lot of practical information this way.

------
amelius
I see a lot of digital design resources mentioned, but if the application is
ML, then perhaps analog computing is an interesting alternative approach,
since there the intermediate results need not be exact.

------
rramadass
The great Niklaus Wirth wrote a book:

"Digital Circuit Design for Computer Science Students: An Introductory
Textbook"

It is different from other standard texts and well worth studying.

------
p1esk
What kind of ML research are you doing?

------
peter_d_sherman
Start here:

Boolean Logic & Logic Gates: Crash Course Computer Science #3

[https://www.youtube.com/watch?v=gI-qXk7XojA](https://www.youtube.com/watch?v=gI-qXk7XojA)

It's a nice, simple, easily watchable and highly visual overview of what
you're going to start learning.

From there, it's virtually guaranteed that YouTube will suggest more relevant
videos, and you'll be on your way.

Then later, you might want some books on "Digital Logic" (that's the keyword
to search for).

To go up from that level of abstraction, I recommend "The Personal Computer
from the Inside Out: The Programmer's Guide to Low-Level PC Hardware and
Software (3rd Edition)" By Murray Sargent III and Richard L. Shoemaker:
[https://www.amazon.com/gp/product/0201626462](https://www.amazon.com/gp/product/0201626462)

To learn Assembly Language: Randall Hyde's Art Of Assembly Language
[http://www.plantation-productions.com/Webster/www.artofasm.com/index.html](http://www.plantation-productions.com/Webster/www.artofasm.com/index.html)

To learn Compilers: Compiler Construction, by Niklaus Wirth
[http://www.ethoberon.ethz.ch/WirthPubl/CBEAll.pdf](http://www.ethoberon.ethz.ch/WirthPubl/CBEAll.pdf)

Let's Build A Compiler, Jack W. Crenshaw & Marco van de Voort:
[https://www.stack.nl/~marcov/compiler.pdf](https://www.stack.nl/~marcov/compiler.pdf)

A Small C Compiler, by James E. Hendrix (sorry, this one is behind a
pay/registration wall):
[https://www.scribd.com/document/289762150/Small-C-compiler-v1-7](https://www.scribd.com/document/289762150/Small-C-compiler-v1-7)
[https://en.wikipedia.org/wiki/Small-C](https://en.wikipedia.org/wiki/Small-C)

Tiny C Compiler [https://bellard.org/tcc/](https://bellard.org/tcc/)

Wikipedia articles related to compilers:

Compilers, general:
[https://en.wikipedia.org/wiki/Compiler](https://en.wikipedia.org/wiki/Compiler)

BNF:
[https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_form](https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_form)

Shunting-yard algorithm:
[https://en.wikipedia.org/wiki/Shunting-yard_algorithm](https://en.wikipedia.org/wiki/Shunting-yard_algorithm)

Parse trees:
[https://en.wikipedia.org/wiki/Parse_tree](https://en.wikipedia.org/wiki/Parse_tree)

Here's a great way to visually explore how various compilers create assembly
language instructions: [https://godbolt.org/](https://godbolt.org/)

Up the abstraction level from all of that are LISP and LISP-like languages,
TensorFlow, ML, and related high-level abstractions -- but you are probably
more aware of those than of the lower levels...

Anyway, good luck! It's a wonderful journey to undertake!

------
dbuder
Search YouTube for Onur Mutlu; he shares his lectures.

~~~
deepakkarki
For quick reference:
[https://www.youtube.com/channel/UCIwQ8uOeRFgOEvBLYc3kc3g](https://www.youtube.com/channel/UCIwQ8uOeRFgOEvBLYc3kc3g)

and
[https://www.youtube.com/user/cmu18447](https://www.youtube.com/user/cmu18447)

