It is incomplete, but I thought I would share my implementation of the software stack in F#. Currently, only the assembler is implemented, but in my opinion it showcases the beauty of F# for domain modeling. When I return to the project, I hope to restart the VM implementation and continue adding to the FPGA implementation as well. My eventual goal is to have the entire software stack built using F# that can then be run on an FPGA implementation of the CPU.
Types to model the instructions and source file expressions: https://github.com/bmitc/nand2tetris/blob/main/dotnet/Nand2T...
The full assembler, mainly consisting of the parser and translator: https://github.com/bmitc/nand2tetris/blob/main/dotnet/Nand2T...
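For readers who don't know F#, here is a rough Python analogue of that style of domain modeling for the Hack assembly language: instructions as a small sum type plus a parser over them. This is a hypothetical illustration of the idea, not the linked code.

```python
from dataclasses import dataclass
from typing import Optional, Union

@dataclass
class AInstruction:
    """@value: load a constant or symbol address into the A register."""
    symbol: str  # a number like "21" or a label like "LOOP"

@dataclass
class CInstruction:
    """dest=comp;jump: compute, optionally store, optionally jump."""
    dest: Optional[str]
    comp: str
    jump: Optional[str]

Instruction = Union[AInstruction, CInstruction]

def parse(line: str) -> Instruction:
    """Parse one line of Hack assembly into an Instruction."""
    line = line.split("//")[0].strip()   # drop comments and whitespace
    if line.startswith("@"):
        return AInstruction(line[1:])
    dest, _, rest = line.rpartition("=") # dest is "" when absent
    comp, _, jump = rest.partition(";")  # jump is "" when absent
    return CInstruction(dest or None, comp, jump or None)
```

Pattern matching over `Instruction` then drives the translator, much as F#'s discriminated unions do in the linked repo.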
I read the first edition in high school, and if I had to choose either this book or my entire undergrad CS education, I'd pick this book.
There can be a lot of overlap between the two, particularly on the software side of things, but CS curricula generally completely omit the electrical engineering portions of a CE curriculum, and that's where ECS puts its focus.
In the first part of the course they focus on computer "hardware" but only on the logical aspects of it (i.e. logic gates etc.). So it probably is considered part computer engineering (though the second part does focus on software) but I wouldn't say it really overlaps with electrical engineering.
> but CS curricula generally completely omit the electrical engineering portions
Where in this book does it talk about electrical engineering?
From page 6 (1st ed.): Of course the layers of abstraction don't stop here. Elementary logic gates are built from transistors, using technologies based on solid-state physics and ultimately quantum mechanics. Indeed, this is where the abstractions of the natural world, as studied and formulated by physicists, become the building blocks of the abstractions of the synthetic worlds built and studied by computer scientists.
1 Boolean Logic 9
2 Boolean Arithmetic 31
3 Memory 45
4 Machine Language 61
5 Computer Architecture 83
6 Assembler 103
These topics are fundamental to computer science, Boolean algebra above all. Just because EE or CE degree courses mention a topic doesn't mean that topic all of a sudden becomes exclusive to them.
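To make that concrete: the book's entire first project falls out of one primitive gate, NAND, and fits in a few lines of Python (a toy illustration, not the book's HDL):

```python
def nand(a: int, b: int) -> int:
    """The single primitive gate everything else is built from."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor_(a: int, b: int) -> int:
    return and_(or_(a, b), nand(a, b))
```

From there the book stacks multi-bit versions of these into adders, the ALU, and eventually the CPU.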
Also, I notice you're using the first edition. May I suggest you look at the second edition, as stated in the title?
What country did/do you study in?
In the US, the research-university intro course offering closest to this book will almost certainly come from the EE side of the house, for reasons like satisfying ABET accreditation requirements or EE programs generally being better postured to support lecture/recitation supplemented by a significant hardware lab component. At Stanford, see EE 108; CMU, 18-240; UF, EEL 3701; and so on.
At my undergrad alma mater, it was the only upper division EE course that didn't have a prerequisite. The senior lecturers who alternately steered the course were notorious for baiting would-be freshmen into taking it early as an effective means to cull the herd. What's funny is CS undergrad advisors publish a suggested sequence with a footnote calling this major course out by name with an explicit recommendation that it "be taken either by itself during the summer or with no more than 13 hours/credits during a Fall/Spring semester."
The part that was hard for me was that the book's language often muddled concept and implementation in the description of the projects. This wasn't too much of a problem for me, as I went through this book several years after school. I had been working as a professional programmer for some time by then and was used to disambiguating concept and detail.
I know a lot of people work through this book as an undergrad, but I must admit I doubt I would have enjoyed it as much with less experience. I can't speak to how much they've improved this aspect, obviously, and otherwise I found the book's language unusually clear for a textbook.
Or perhaps it would be better to say it "has no choice but to omit" them, or it would be ten times thicker than it is.
From your first projects in CS you're sitting atop a huge stack of tech. My own education was fairly low-level compared to most CS programs: several assembly languages, C, debugging crash dumps, register watches, etc. were all part of my curriculum. Even so, there is so much below the level where I work that it is hard not to think magically. Just being given a toy model of how all this might work, even if real hardware doesn't quite work that way, was extremely helpful. This may be why you might find CS majors more fond of this book than CE or EE majors.
These days I work far, far away from that level and much of that knowledge has atrophied, but I often appreciate still understanding the concepts.
I don't know specifically of another school that focuses on this. However, one thing I found over my college career (and you are probably already aware of) is that if you make a connection with department faculty and prove yourself ambitious, they are often accommodating in how they count credits.
You might consider finding a school with friendly and flexible faculty and then seeing if they will let you pursue a CS degree with some of the CS courses replaced by CE courses. Given your existing degree, you might find a fair amount of latitude, since they won't feel they need to babysit your trajectory.
The biggest part they omitted was how flip-flops work. They talk about it briefly in an end-of-unit video, however. They don't really talk in any depth about clock cycles, their interaction with flip-flops, or why they're important.
The course, I think, is better for having left it out. It keeps the first part focused on the combination of elements culminating in a CPU.
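For anyone who wants the missing piece anyway: the behavior of a clocked 1-bit register (what the flip-flop videos build toward) can be modeled in a few lines. This is a toy behavioral sketch, assuming the usual convention that a register's output during a cycle is whatever was latched at the end of the previous one:

```python
class BitRegister:
    """1-bit register: out(t) shows the value latched at the end of t-1.
    If load was asserted then, the input was latched; otherwise it held."""
    def __init__(self) -> None:
        self.latched = 0

    def tick(self, input_bit: int, load: int) -> int:
        out = self.latched                         # visible this clock cycle
        self.latched = input_bit if load else out  # latched for the next one
        return out
```

The one-cycle delay is exactly why clocking matters: without it, feeding a register's output back through combinational logic would race.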
And, most of all, it inspires you. After doing the course, I was more than motivated to understand this myself. And it remains the most educational computing course I've ever done.
I'm looking forward to diving into the second edition and hoping its second half improves on the first edition's.
Even some people actually working on compilers still talk about pipeline stalls in a ye olde Pentium/RISC sense rather than the out-of-order monsters we have today.
We didn't go quite as far as this book seems to, but we definitely got a decent grounding in how computers actually do their thing before it was back to theory.
For example, the portions on virtual machines do not explicitly show you how the JVM works, but they give you an idea of the concerns involved, such as managing object lifetimes and whatnot.
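The flavor of the book's stack machine is easy to convey. A minimal Python sketch (my own simplification, covering only push/add/sub, not the book's full VM command set) looks like:

```python
def run(commands):
    """Execute a tiny stack-machine program; return the final stack."""
    stack = []
    for cmd in commands:
        parts = cmd.split()
        if parts[0] == "push" and parts[1] == "constant":
            stack.append(int(parts[2]))      # push a literal
        elif parts[0] == "add":
            b, a = stack.pop(), stack.pop()  # binary ops pop two, push one
            stack.append(a + b)
        elif parts[0] == "sub":
            b, a = stack.pop(), stack.pop()
            stack.append(a - b)
        else:
            raise ValueError(f"unknown command: {cmd}")
    return stack
```

For instance, `run(["push constant 7", "push constant 8", "add"])` leaves a single 15 on the stack; the book's VM-to-assembly translator emits Hack code that manipulates the stack in memory the same way.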
Often material is either too abstract or far too detailed; ECS managed to find the perfect balance, where someone with a CS background can drill down to transistors and come back up again, really understanding where they are on that journey.
The only thing that rivaled that lightbulb was Theory of Computation: undecidability, and the Turing machine vs. stack machine vs. state machine powers that theoretically limit the von Neumann architecture.
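As a tiny illustration of the weakest rung of that hierarchy, here is a generic DFA runner plus a machine for the regular language "binary strings with an even number of 1s". No finite automaton can do the same for, say, balanced parentheses, which needs a pushdown automaton's stack. (A standard textbook construction, sketched from memory.)

```python
def dfa_accepts(transitions, start, accepting, string):
    """Run a deterministic finite automaton over a string of symbols."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

# DFA accepting binary strings with an even number of 1s
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}
```

`dfa_accepts(even_ones, "even", {"even"}, "1010")` accepts; add a single extra "1" and it rejects, with no memory needed beyond the current state.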
I'll have to pick this up to see what is new. What would really be nice is if they got a bit into SSDs and the modern superspeed networking, I/O, and multicore that weren't as prevalent back in the day.
Regardless, you'd need a different book or course to get what you're asking for in your last paragraph as that changes the scope and target of the book radically.
Might you or someone else have a title that you would recommend for this that is the equivalent of a "Elements of Computing Systems" style book?
The one we used was Introduction to the Theory of Computation by Michael Sipser (not hard to find a pdf online).
I'll try to find my theory of computation text so I can see who wrote that, but again that was a great prof that walked through it really well.
Alas, my bias against Lisp may be traced solely to the Programming Languages prof who loved Scheme but couldn't actually communicate with humans. He could recite pi to 100 decimal places and was esteemed as brilliant, but literally couldn't form sentences when talking to students. I suspect the lambda calculus would have been a good insight as well, but oh well.
Lambda calculus' anonymous functions are fairly simple but massively powerful - try this page? <https://www.inf.fu-berlin.de/lehre/WS03/alpi/lambda.pdf>
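A taste of that power fits in Python's own lambdas: Church numerals encode a number n as "apply f, n times", and addition falls out with no numbers or arithmetic in sight (a standard construction, sketched here for illustration):

```python
# Church numerals: n is represented as "apply f, n times, to x"
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))          # one more application
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # f^m after f^n

def to_int(n) -> int:
    """Decode a Church numeral by counting applications."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
```

Here `to_int(plus(two)(three))` decodes to 5: everything is done purely by composing anonymous functions.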
Its motive is the same as NandToTetris': to make any programming-literate person comfortable with the rough outlines of the theory of computation and its main lore and results. The author uses Ruby as an interactive "illustration language", a notation for describing concepts that happens to be executable. It's the same way I noticed the authors of "Structure and Interpretation of Computer Programs" use Scheme, or the way Niklaus Wirth sometimes uses Pascal in his educational writings. There is a small tutorial chapter at the beginning that crash-courses you through everything you need to know in Ruby to read the book.
The structure is a bit different: the book isn't project-based like ECS, and there is no natural hierarchy to the theory of computation (except maybe the famous state machines ⊂ pushdown automata ⊂ Turing machines, and the book does introduce those topics in that natural order) that would make it feel bottom-up. There are no exercises or nudges toward exploration and tinkering, so you need to come equipped with your own. I stopped trying to digest it in one reading and designated it as one of those books you come back to over and over again to fully absorb its nourishment.
But the main motivation of the book is strikingly similar to that of ECS: to take a complex, jargon-heavy, several-years study topic and distill the most essential lines and edges, so that a minimally educated person motivated enough to understand mostly can. Some parts felt rushed and weren't meant to be digested in full detail (when he was, e.g., discussing a "zoo" of other computational systems that feel superficially different from or less powerful than Turing machines but are equivalent nonetheless, he was fairly hand-wavy), but the book is so great you will feel guilty dwelling on its shortcomings.
The course is at the above website, in case you want to start working through it before this edition is released in July.
Personal opinion: I have found "Computer Systems: A Programmer's Perspective" to be a better introduction to the same set of topics.
The neat thing about Elements is that, despite being shallow in parts, it provides the full map of the subject with projects building on each other. Someone who's gone through it can more easily approach deeper treatments to fill in the details as they know where it's going, which also aids in motivation (removes the "what's the point" question when faced with seemingly esoteric topics).
Part 1 - https://www.coursera.org/learn/build-a-computer (hardware projects/chapters 1-6)
Part 2 - https://www.coursera.org/learn/nand2tetris2 (software projects/chapters 7-12)
From the FAQ:
> Which programming language do I have to use in order to complete the assignments in this course?
> We expect learners to submit assignments in any version of Java, or Python. We will assume that you have basic programming ability in these languages, including a basic ability to understand and write simple object-based programs.
So is their FAQ out of date I guess?
Thanks for letting me know about the extended grader support.
I’d prefer they tested the emitted code; that way you could use anything you want.
The original architecture had some caveats: e.g. no external interrupt capability, so IO was implemented by busy polling, inspecting specific memory-mapped IO regions (I don't blame them, as it kept the book concise, and you can't cover everything). There has been discussion in the community about a HACK-2, so it would be interesting if somebody knows more.
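For readers who haven't done the projects: Hack maps the keyboard to a single memory word (address 24576), so reading a key really is a busy-wait loop. A small Python sketch of the idea, with a hypothetical fake memory standing in for the real chip:

```python
KBD = 24576  # Hack's memory-mapped keyboard register

def wait_for_key(memory) -> int:
    """Busy-poll the keyboard register until a nonzero key code appears.
    With no interrupts on the platform, spinning like this is the only option."""
    key = memory[KBD]
    while key == 0:
        key = memory[KBD]
    return key

class FakeMemory:
    """Stand-in for RAM whose keyboard word changes over time (test scaffolding)."""
    def __init__(self, reads):
        self.reads = iter(reads)
        self.last = 0

    def __getitem__(self, addr):
        self.last = next(self.reads, self.last)
        return self.last
```

An interrupt-capable HACK-2 could instead let the CPU do useful work and jump to a handler when the key arrives, which is exactly the capability the original design trades away for simplicity.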
I thoroughly recommend this book.
Of course, some AD&D fans would say everything past 2nd edition might be an example of it.
Case in point: the vomit-inducing colorful print of the 3rd edition of Axler's Linear Algebra Done Right.
Computing literature published.
A generation renewed.
It is quite fascinating how computers are really just bits of 0's and 1's. What a magical machine built upon a layer of boolean logic. What an incredible feat.
Also loved the nand2tetris course.
I've been wanting to learn more about the fundamentals of how computers work. I'm also interested in exploring FPGAs.
It's also free. Does not cover discrete math and assembly afaik.