
Building a Modern Computer from First Principles - xal
http://www.nand2tetris.org/
======
pajju
You start with a simple NAND gate, then build up the hardware (ALU, RAM), an
assembler, a compiler, an OS, and finally the Tetris game itself. Remember,
all of this runs on your own self-designed computer — hardware + software,
all your work! An awesome, once-in-a-lifetime project.

Here's how it goes — you are given just a NAND gate, and from it you construct
the other gates and more complex logic. Then you build the computer's basic
processing and storage devices (the ALU and RAM, respectively). In the next
stage you build an assembler and a compiler for your own defined language. ;)
Finally, a high-level language (Jack) is implemented to run on your machine
architecture, and you build an OS for it: the Jack OS.

And in the last step you build your first application, the Tetris game. ;)
Remember, it is running on your own self-built computer. ;)
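
To give a flavor of that first step: every gate falls out of NAND alone. Here
is a toy sketch of the idea in C (the course itself uses its own simple HDL,
not C):

    #include <assert.h>

    /* The one primitive the course gives you. */
    int nand(int a, int b) { return !(a && b); }

    /* Everything else is derived from it. */
    int not(int a)        { return nand(a, a); }
    int and(int a, int b) { return not(nand(a, b)); }
    int or(int a, int b)  { return nand(not(a), not(b)); }
    int xor(int a, int b) { return or(and(a, not(b)), and(not(a), b)); }

    int main(void) {
        assert(not(0) == 1 && and(1, 1) == 1);
        assert(or(0, 0) == 0 && xor(1, 0) == 1);
        return 0;
    }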

This is one of the most well-thought-out self-learning projects out there for
building a computer from first principles. Kudos to the creators. Pure bliss.

My experience — you come to appreciate the project more as you move up the
layers, and again later in life. It's a life-long experience.

If you are a college student, having a mentor helps a lot in understanding and
appreciating the concepts faster. Worth mentioning: this is one of the best
gifts you can give to a curious soul who has just stepped into computers.

I rate this project very highly: the best self-learning project of all time.

~~~
emacsitor
Seriously, this book should be in every computer engineering curriculum. Hands
down, there's nothing like it.

------
emacsitor
This book is amazing, simply for the fact that it really allows you to
understand the major layers of a computer system in a way that fits in your
mind all at once. This is no easy task, since each layer could be a highly
specialized area of study in its own right. You can spend your life working in
operating systems without ever really understanding what a compiler does, or
work in compilers without ever actually understanding the digital logic that
underlies computer architecture.

For students and hobbyists alike, the task of understanding what a computer
fundamentally does can seem like a truly uphill battle. Approaching this
battle from the top down can feel never-ending: the number and complexity of
layers between application code and executable binaries is daunting to the
newcomer, to say the least. Approaching it from the bottom up is still
difficult, but it lets you see the need for each abstraction layer as the
shortcomings of the layer below present themselves.

This book takes the bottom-up approach, taking you literally from digital
logic to high-level software: from NAND to Tetris. And while each layer in
between is highly simplified, it allows you to understand the system as a
whole rather than concentrating on specific layers. Really, a great read. And
the projects are priceless. If you make it through this book, you will
understand how computers _fundamentally_ work.

------
tptacek
I've been sick in bed for the last couple of days and yesterday read the first
half of this book (_Elements_). It is, I think, the best, tightest description
of how one gets from primitive gates to adders to an ALU to memory. Extremely
well written.
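
(If you haven't seen it, the gates-to-adders step really is tiny. A rough C
sketch of the idea, not the book's HDL: a full adder is just two half adders
plus an OR on the carries.)

    #include <stdio.h>

    /* Full adder = two chained half adders; carry out whenever
       either stage carries. */
    void full_add(int a, int b, int cin, int *sum, int *cout) {
        int s1 = a ^ b, c1 = a & b;  /* half adder 1 */
        *sum  = s1 ^ cin;            /* half adder 2 */
        *cout = c1 | (s1 & cin);
    }

    int main(void) {
        int s, c;
        full_add(1, 1, 1, &s, &c);
        printf("1+1+1 -> carry %d, sum %d\n", c, s);  /* carry 1, sum 1 */
        return 0;
    }

Chain sixteen of those and you have the adder at the heart of the book's ALU.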

------
akiselev
An excellent book, one that I can't recommend highly enough.

A friend at Caltech took this a step further and came up with his own crude
SoC that took input from basic switches, did calculations based on code read
from a small off-the-shelf EEPROM, and displayed the output on segment LEDs.
It took him about three years, but he was eventually able to make a chip with
a 20 micron process using a microscope and a UV DMD development board [1]. He
did have access to wire bonders, IC debugging equipment, professors, etc.,
though.

[1]
[http://ajp.aapt.org/resource/1/ajpias/v73/i10/p980_s1](http://ajp.aapt.org/resource/1/ajpias/v73/i10/p980_s1)

~~~
aswanson
This is awesome. Been thinking of doing the same with an electron gun
(hijacked from an old CRT) and a vacuum chamber.

~~~
akiselev
See if you can get a plasma window for that vacuum chamber, which will allow
you to do e-beam etching down to about 100nm with the wafer outside of the
chamber. Combine it with a DIY 80/20 class-100 cleanroom (if you can make it
small enough to manipulate with isolated gloves, it will be relatively cheap),
some precise X-Y linear stages (you can probably get away with ±10nm precision
at 100nm), an ultrasonic bath with etching fluid, and a high-quality blender
for applying the resist, and VOILA, you have a simple little fab. You can
probably adapt some parts from an SEM to scan the electron beam across the
resist, but long term you're probably better off trying to come up with a way
of etching through a die (much more difficult and slow).

------
eterps
Wonderful book.

For additional opinions also see:

[http://www.mail-archive.com/fonc@vpri.org/msg01614.html](http://www.mail-archive.com/fonc@vpri.org/msg01614.html)

~~~
glurgh
Some of the (somewhat critical) opinions in that thread are Alan Kay's - worth
a read.

------
anoncow
Is [http://www.idc.ac.il/tecs](http://www.idc.ac.il/tecs) down?

Edit: But the slides are available at
[http://www.nand2tetris.org/course.php](http://www.nand2tetris.org/course.php)

A video-lecture-based course would have been great, but this is splendid too!

------
soferio
If you don't need to go as deep or as far, you might want to consider 'Code'
by Charles Petzold:
[http://www.charlespetzold.com/code/](http://www.charlespetzold.com/code/)

I found it a great read, and it covers some of the same ground as this course.

~~~
angersock
I second this recommendation. _Code_ is a really great tour through the
creation of a digital computer, and helps build a fertile ground for later
questioning and research into the field.

------
jared314
I actually started writing my own hardware test runner[0] to more closely
imitate a spec/unit-test runner, and because I really wanted just a folder, a
NAND gate, and a DFF. I found the built-in components confusing, because I
would not implement parts and yet the tests would still pass (the simulator
silently falls back to its built-in chip implementations).

Also, the source (thousands of lines of decent Java) is available at the
bottom of the download page[1].

[0]
[https://github.com/Jared314/n2trunner](https://github.com/Jared314/n2trunner)

[1]
[http://www.nand2tetris.org/software.php](http://www.nand2tetris.org/software.php)
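
(For context: the DFF mentioned above is the course's single sequential
primitive, just as NAND is its single combinational one. A C sketch of what a
DFF does, for illustration only:)

    #include <assert.h>

    /* D flip-flop: out(t) = in(t-1). Registers, counters, and RAM
       are all built on top of this one stateful part. */
    typedef struct { int state; } DFF;

    int dff_tick(DFF *d, int in) {
        int out = d->state;  /* emit last tick's input */
        d->state = in;       /* latch this tick's input */
        return out;
    }

    int main(void) {
        DFF d = {0};
        assert(dff_tick(&d, 1) == 0);  /* previous state was 0 */
        assert(dff_tick(&d, 0) == 1);  /* now emits the latched 1 */
        return 0;
    }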

------
abecedarius
Re the complaints about how idealized the hardware chapters are:
[http://www.amazon.com/Computation-Structures-Electrical-Engi...](http://www.amazon.com/Computation-Structures-Electrical-Engineering-Computer/dp/0262231395)
also covers the full stack from transistors to operating systems, but with
much more depth on the lower levels. It's pretty old, though.

------
pjv
MIT offers a similar undergraduate class called 6.004, which most
undergraduates take. The course materials are available on OCW:

[http://ocw.mit.edu/courses/electrical-engineering-and-comput...](http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-004-computation-structures-spring-2009/index.htm)

------
bachback
Yes, a good approach. It would be very cool to build this from real hardware
instead of a VM, now that we have Arduinos and RPis.

~~~
alanctgardner2
If you built it from real hardware, you'd be going to the store to buy a
couple thousand NAND chips, like this:
[http://www.digikey.com/product-detail/en/SN7400N/296-14641-5...](http://www.digikey.com/product-detail/en/SN7400N/296-14641-5-ND/555975)

And a giant honking breadboard.

~~~
austinz
It might be beyond the scope of a normal hobbyist project, but writing Verilog
or VHDL to drive an FPGA might take interested people half of the way there.
You mentioned Altera's design software in another post; I don't know if
Xilinx's is any better (I'm guessing not really), but Digilent sells FPGA
boards intended for the educational market at a pretty reasonable cost.
Powerful enough, I'd reckon, to allow something of this scope to be built.

~~~
alanctgardner2
I actually considered something like this; there are good dev boards for $150
that can drive a VGA monitor and use USB peripherals. The manufacturer
provides stock blocks for you to integrate, so you can inspect them but you
don't have to build them from scratch (which is super hard).

Doing it on an FPGA, without extensive handholding and in real-world
languages, would be about a year of work by my estimate. This assumes you
build your own CPU (in procedural VHDL, not at gate level) to implement an
existing instruction set, and use the manufacturer's provided memory blocks,
video blocks, etc. For reference, an experienced FPGA programmer would take
about 2-4 months full-time to emulate something like an NES.

It would be a really good experience, and it's the kind of thing a comp. eng.
degree prepares you for (we do a capstone project at my school which is like
this). As a bonus, you'll also cover analog electronics (which are
infuriating) and as much comp sci. and math as you're willing to take on.

------
helloTree
It's an amazing book, and doing all the projects is really fun. However, keep
in mind that the computer you build is really simple, in my opinion maybe even
a little too simple. In particular, there is no decent IO model: reading from
the keyboard is done by simple busy-waiting, and writing to the screen is done
by writing to a special region of memory. This is OK, as the book covers many
topics, but I would enjoy finding a book that covers this topic in detail.
E.g. the famous Patterson-Hennessy book about MIPS covers the implementation
of a RISC processor in great detail but does not go into detail about IO,
which is in my opinion the really hard part.
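
(For reference, the book's IO model boils down to something like this C
sketch; 0x4000 and 0x6000 are the Hack platform's actual screen and keyboard
addresses, but the code itself is just an illustration.)

    #include <stdint.h>

    /* Hack memory map: screen at 0x4000 (8K words, 512x256 pixels,
       32 words per row), keyboard at 0x6000 (a single word). */
    enum { SCREEN = 0x4000, KBD = 0x6000 };
    static volatile uint16_t ram[KBD + 1];  /* simulated address space */

    /* Reading the keyboard: busy-wait until the word is nonzero. */
    uint16_t read_key(void) {
        while (ram[KBD] == 0) { /* spin */ }
        return ram[KBD];
    }

    /* Writing the screen: set bits in the mapped region; each bit
       of a word is one pixel. */
    void draw_word(int row, int col16, uint16_t bits) {
        ram[SCREEN + row * 32 + col16] = bits;
    }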

~~~
kens
Historically, writing to the screen was done by writing either ASCII or pixel
values to a special region in memory (e.g. Apple II, IBM PC). I'm not sure
what you're suggesting the book should do instead? Use a serial port and
terminal? Implement a GPU? (I'm not being sarcastic, just looking for
clarification.)

~~~
helloTree
My point is that after reading the book I kind of knew how I could implement a
simple CPU in real HW (with ALU, memory, etc.), but it was not clear to me how
the IO part (keyboard, screen) would work. Is the keyboard connected directly
to a certain region of memory? How is this implemented? Would there be some
screen/GPU HW directly connected to the memory region? How is the CPU clock
involved?

E.g. if you press a key, hold it, and then release it, it is possible that the
event gets missed because it was too short and the key-press method did not
check the memory region at the right point in time. For me, their
implementation reinforces this bogus "I am in total control of everything that
happens" feeling, which often leads to bad design because it ignores the messy
real world. I would have appreciated something more sophisticated and generic
there that would work for different kinds of HW. Maybe I am missing some
simple bus system, one might say ...

Nevertheless it is a wonderful book and I had lots of fun with it! :)

~~~
angersock
If I'm not mistaken, the keyboard should raise high an interrupt pin on the
CPU, which should cause an interrupt service routine to be called.

That routine should then mask lower-priority interrupts, poll the appropriate
region of memory (assuming memory-mapped IO) for the byte or bytes held down,
push those onto the key-input buffer or into the STDIN equivalent, unmask
lower-priority interrupts, and return.

It is then up to the user program to read in from the buffer and do the
needful.
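
Something along these lines, say. (Everything here is invented for
illustration: the address, the mask/unmask hooks, the buffer. The Hack
platform itself has none of this.)

    #include <stdint.h>

    /* Hypothetical memory-mapped keyboard register. */
    #define KBD_REG (*(volatile uint16_t *)0x6000)

    static uint16_t kbd_buf[64];   /* the STDIN-equivalent buffer */
    static volatile unsigned head;

    static void mask_lower_irqs(void)   { /* platform-specific */ }
    static void unmask_lower_irqs(void) { /* platform-specific */ }

    /* Invoked when the keyboard raises the interrupt pin. */
    void kbd_isr(void) {
        mask_lower_irqs();
        kbd_buf[head++ % 64] = KBD_REG;  /* enqueue for user code */
        unmask_lower_irqs();
    }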

~~~
kragen
The Hack CPU doesn't have interrupts, IIRC.

~~~
angersock
I was actually working on implementing a simple little VM library in C as a
fun exercise, and deciding how to handle interrupts was where I got caught up.

~~~
kragen
Most VMs don't have interrupts. As a less-complex alternative, you might
consider implementing an I/O thread, something rarely done in hardware because
an extra CPU is a lot more hardware than an interrupt mechanism.

One interesting exception is the Unix VM, whose interrupts are called
"signals".

------
femto
Another book in the same space is "Understanding Digital Computers" by Forrest
Mims, originally sold by Radio Shack. It's dated (written in 1978), but the
basics are still relevant, taking the reader from the definition of a bit,
through binary numbers and Boolean logic, and eventually to a complete
microcomputer and its programming.

I read this book as a teenager and I remember it giving me my first "aha!"
moment of understanding how computers really work. In my experience, the
prerequisites for understanding the book are pretty low, but the knowledge
within is sophisticated.

------
stiff
There is a somewhat similar book by none other than Niklaus Wirth:

[http://www.amazon.com/Digital-Circuit-Computer-Science-Stude...](http://www.amazon.com/Digital-Circuit-Computer-Science-Students/dp/354058577X/ref=cm_cr_dp_asin_lnk)

------
MartinMcGirk
Question for those who already have this book: is it full of diagrams? I ask
because I could get it on Kindle instantly (but diagrams suffer there) or in
paperback in a week or so. Is it worth getting the physical book over the
e-book for this?

~~~
cranefly
Several of the chapters are available as PDFs from the site.

~~~
MartinMcGirk
Thanks, I see that now. Paperback version it is.

------
Siecje
So I just buy the book and use the software from the site? That's all you
need?

~~~
eterps
Yes

------
visarga
Well, that's what I call good education. Start from scratch. Understand how
it's made. Then your intuition will have something to grab onto when you do
more complex stuff. It doesn't feel like voodoo any more.

------
bane
I don't really have the time or inclination to do all the hardware stuff with
real hardware, can anybody suggest software I might use to virtually build the
hardware and get the general gist?

~~~
leoedin
Read the link again! The computer is implemented in an HDL and simulated. All
the software needed is supplied.

------
kriro
I own it, I love it. It's great and should be read and worked through by
everyone who is interested in computers.

------
MWil
Total Cost?

~~~
pizza
$0: the hardware (each chip) is implemented in software. The book is about
$30, but most chapters are free on the website.

------
alanctgardner2
I'm going to come across as defensive here, but I'm actually in a Computer
Engineering program (not Computer Science). This book purports to cover as
much material as eight undergrad courses; I feel like it must skimp on depth
to (for example) condense all of 'compilers' into two weeks. Compilers are a
very large topic; a single undergrad course isn't even sufficient to really
understand a real-world project like GCC. Likewise, 'computer architecture'
makes up three classes in my curriculum: one all about building RISC
processors, another about CISC, and a third about modern architectures. Only
in the last one do you approach an understanding of a contemporary CPU
architecture.

My question for people who have done the course is: does it cover even simple
design theory like K-maps? Does it make you account for propagation delay?
Does it explain caching schemes and TLBs? I feel like it probably has to gloss
over a lot of the 'hard stuff' to stay so compact.

Likewise, it sounds like it's all done in custom languages. Half of my first
year was spent struggling with industry-standard, terrible software like
Altera's, which is super powerful but terribly designed. The other half was
spent actually breadboarding circuits and having them fail because of problems
you never see in simulations (or which the simulator solves for you).

I'm not saying it's not an interesting project, but it really is a nice,
abstract diversion for people who work on software all day. People calling for
it to be included in comp. eng. programs probably don't realize the depth of
what actually gets covered in comp eng.

Edit: to sound a bit less whiny, if anyone is doing this course and they want
to dig deeper into a particular area, I'd be happy to point them to the
books/course materials we used.

~~~
ef4
There's nothing wrong with taking a course that covers a very wide area, even
if you're intending to specialize more deeply in all the topics later. It's
actually a very effective learning strategy, because it motivates all the
subsequent deeper dives.

~~~
alanctgardner2
I guess personally I would feel like I'd wasted a semester, since I had no
problem being motivated to deep-dive into the other topics already. I suppose
if someone was undirected and needed to pick a specialization, this might
help.

~~~
ef4
Ah, but if you're actually motivated there's nothing stopping you from going
as deep as you want into any of the topics. There's never an excuse for
"wasting" a semester.

Coursework is the minimum, not the maximum.

~~~
alanctgardner2
What I meant was that I would want to deep-dive into each topic, but we'd be
busy moving on, and I'd cover all that material again the next year anyway. I
don't think survey courses fit my way of looking at topics; I'm very single-
minded. That doesn't mean they aren't valuable or that they can't work for
other people. Just that I wouldn't want them to be mandatory.

------
alanctgardner2
I pointed this out above, but this is really a course for CS people that
glosses over a lot of computer engineering topics that you need to, say,
design your own processor. It's definitely a tremendous accomplishment, but
let's keep our pants on here.

~~~
angersock
Such as? Does it skip over K-maps or something?

~~~
alanctgardner2
If you Ctrl-F, there's squabbling at length below about the fact that they
don't cover K-maps, that K-maps aren't important, etc.

Or you're being sarcastic. I'd hate to assume that, but someone did go through
and downvote all of my posts.

~~~
angersock
I was looking for more of a list of topics it omits that are absolutely
required to implement a functioning processor. A sufficiently simple little
register-based RISC CPU with memory-mapped IO, no interrupts, no caches or
TLBs, and so on is a functioning (if gimped) computer.
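
To make that concrete: even a toy interpreter loop like this one (invented
encoding, far cruder than Hack) already has everything such a "functioning
computer" needs: registers, memory, and a memory-mapped output port.

    #include <stdint.h>
    #include <stdio.h>

    enum { LOADI, ADD, STORE, HALT };  /* 3-byte instructions */

    int main(void) {
        uint8_t mem[256] = {
            LOADI, 0, 2,    /* r0 = 2                            */
            LOADI, 1, 3,    /* r1 = 3                            */
            ADD,   0, 1,    /* r0 += r1                          */
            STORE, 0, 255,  /* mem[255] = r0 (the "output port") */
            HALT,  0, 0,
        };
        uint8_t r[2] = {0};
        for (int pc = 0; mem[pc] != HALT; pc += 3) {
            uint8_t op = mem[pc], a = mem[pc + 1], b = mem[pc + 2];
            if      (op == LOADI) r[a] = b;
            else if (op == ADD)   r[a] += r[b];
            else if (op == STORE) mem[b] = r[a];
        }
        printf("output port: %d\n", mem[255]);  /* prints 5 */
        return 0;
    }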

~~~
alanctgardner2
Sorry about that, you can see I'm kind of getting clobbered below.

I don't think the problem is that the end result isn't a computer (it
certainly sounds like it is), but that the computer only runs in the provided
simulator, and is written in a custom HDL designed to make this project
relatively simple. The simulator itself ignores a bunch of complexities around
timing that a commercial one (like ModelSim) would consider.

Personally I haven't done this class, but I'd be curious to know whether the
students design the control unit and data path themselves. I know that was a
giant pain in the ass when I did it for a gimped RISC processor (as you
described).

~~~
gruseom
Probably you're getting clobbered because you made lots of negative comments
about something that's really cool, and because your criticism (that it falls
short of a complete degree program in computer engineering) misses the point
of the thing.

------
akent
The lecture slides are in Comic Sans, ugh.

~~~
spacemanaki
I used to feel like this, until I saw Simon Peyton Jones use Comic Sans for a
slide deck on Haskell and decided this was such a petty, knee-jerk reaction to
something as superficial as font choice. At this point, I'd almost choose
Comic Sans on purpose for my own presentations, just to weed out and troll the
people who aren't paying attention to what matters.

[http://research.microsoft.com/en-us/um/people/simonpj/papers...](http://research.microsoft.com/en-us/um/people/simonpj/papers/haskell-retrospective/HaskellRetrospective.pdf)

 _edited to add:_ Dug up an actual comment from SPJ on this; I hadn't realized
he uses Comic Sans for all his talks:

[http://www.reddit.com/r/haskell/comments/1bd1ia/spj_and_comi...](http://www.reddit.com/r/haskell/comments/1bd1ia/spj_and_comic_sans/)

~~~
snogglethorpe
I've never understood the enormous enthusiasm people seem to have for
badmouthing Comic Sans. It's not particularly elegant, but it's quite readable
and not particularly ugly. It more or less looks like hand-lettered dialogue
in comics, which seems quite accepted, and even respected.

I suppose type designers or whoever might be particularly sensitive to
whatever transgressions it commits (I dunno), but almost everybody I've seen
indulge in a bit of C.S.-bashing seems otherwise not to care very much about
typography at all.

As far as I can figure, it's just because people love a bandwagon, especially
one that's really easy to hop onto and entails few risks....

