The Elements of Computing Systems, Second Edition (mitpress.mit.edu)
574 points by VitalyAnkh on Feb 5, 2021 | 97 comments



I love this book (the first edition), project, and course.

It is incomplete, but I thought I would share my implementation of the software stack in F#. Currently, only the assembler is implemented, but I think it showcases the beauty of F# for domain modeling. When I return to the project, I hope to restart the VM implementation and continue adding to the FPGA implementation as well. My eventual goal is to have the entire software stack built using F# that can then be run on an FPGA implementation of the CPU.

https://github.com/bmitc/nand2tetris

Types to model the instructions and source file expressions: https://github.com/bmitc/nand2tetris/blob/main/dotnet/Nand2T...

The full assembler, mainly consisting of the parser and translator: https://github.com/bmitc/nand2tetris/blob/main/dotnet/Nand2T...
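
For a flavor of what that kind of domain modeling looks like, here is a rough F# sketch of Hack assembly instructions as discriminated unions. The names and shapes here are hypothetical, chosen for illustration; the actual types in the repo differ.

    // A-instruction: @value, where the value is a constant or a symbol
    type AInstruction =
        | Constant of uint16
        | Symbol of string

    // C-instruction: dest=comp;jump, with dest and jump optional
    type CInstruction =
        { Destination: string option
          Computation: string
          Jump: string option }

    // Any meaningful line of Hack assembly source
    type Instruction =
        | A of AInstruction
        | C of CInstruction
        | Label of string

    // Example: translate a constant A-instruction to its 16-bit binary form
    let toBinary instruction =
        match instruction with
        | A (Constant value) -> System.Convert.ToString(int value, 2).PadLeft(16, '0')
        | _ -> failwith "other instruction forms omitted in this sketch"

Exhaustive pattern matching over types like these means the compiler flags any instruction form a translator forgets to handle, which is a big part of why F# is pleasant for this kind of work.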


Buy this book if you're a programmer who wants to know how computers work.

I read the first edition in high school, and if I had to choose either this book or my entire undergrad CS education, I'd pick this book.


This book follows more along the lines of the coursework for a computer engineering degree than a computer science degree-- especially at institutions where CS lives under the math or IT department as opposed to the engineering department.

There can be a lot of overlap between the two, particularly on the software side of things, but CS curricula generally completely omit the electrical engineering portions of a CE curriculum, and that's where ECS puts its focus.


Their video course [1] is supposedly based on the book. In the course they state that they are not focused on (nor do they have expertise in) the "electrical engineering" aspect of computer systems.

In the first part of the course they focus on computer "hardware", but only on the logical aspects of it (i.e. logic gates etc.). So it probably is considered partly computer engineering (though the second part does focus on software), but I wouldn't say it really overlaps with electrical engineering.

[1] https://www.nand2tetris.org/


>This book follows more along the lines of the coursework for a computer engineering degree than a computer science degree

> but CS curricula generally completely omit the electrical engineering portions

Where in this book does it talk about electrical engineering?


The first half of the book is about digital design, which is part of electrical engineering. They call it "Hardware Land" in the book.

From page 6 (1st ed.): "Of course the layers of abstraction don't stop here. Elementary logic gates are built from transistors, using technologies based on solid-state physics and ultimately quantum mechanics. Indeed, this is where the abstractions of the natural world, as studied and formulated by physicists, become the building blocks of the abstractions of the synthetic worlds built and studied by computer scientists."



They actually call it "Hardware." If you look on the left of the website linked in this thread, you'll see a very nice "Table of contents." In this table of contents, it lists all the topics that the book covers.

HARDWARE

1 Boolean Logic 9

2 Boolean Arithmetic 31

3 Memory 45

4 Machine Language 61

5 Computer Architecture 83

6 Assembler 103

These topics are fundamental to computer science. Boolean algebra is fundamental to computer science. Just because E.E. or C.E. degree courses mention a topic, doesn't mean that topic all of a sudden becomes exclusive to them.

Also, I notice you're using the first edition. May I suggest you look at the second edition, as stated in the title?


> Just because E.E. or C.E. degree courses mention a topic, doesn't mean that topic all of a sudden becomes exclusive to them.

What country did/do you study in?

In the US, the intro course at a research university that comes closest to this book will almost certainly be offered by the EE side of the house, for reasons like satisfying ABET accreditation requirements or EE programs generally being better positioned to support lecture/recitation supplemented by a significant hardware lab component. At Stanford, see EE 108; CMU, 18-240; UF, EEL 3701; and so on.

At my undergrad alma mater, it was the only upper-division EE course that didn't have a prerequisite. The senior lecturers who alternately steered the course were notorious for baiting would-be freshmen into taking it early as an effective means to cull the herd. What's funny is that CS undergrad advisors publish a suggested sequence with a footnote calling this course out by name, with an explicit recommendation that it "be taken either by itself during the summer or with no more than 13 hours/credits during a Fall/Spring semester."


"This second edition has been extensively revised. It has been restructured into two distinct parts—part I, Hardware, and part II, Software—with six projects in each part. All chapters and projects have been rewritten, with an emphasis on separating abstraction from implementation, and many new sections, figures, and examples have been added. Substantial new appendixes offer focused presentation on technical and theoretical topics." From the blurb


Wasn't the first edition also in two parts (hardware then software) and with an emphasis on separating abstraction from implementation? That's how I remember it.


Yes on the first count. Kind of on the second. Each project had you finish in a way that would allow you to put it in a box and take it as an abstraction into the next one. That part worked well.

The part that was hard for me was that the book's language often muddled concept and implementation in the description of the project. This wasn't too much of a problem for me, as I went through this book several years after school. I had been working as a professional programmer for some time by then and was used to disambiguating concepts and details.

I know a lot of people work through this book as undergrads, but I must admit I doubt I would have enjoyed it as much with less experience. I can't speak to how much they've improved this aspect, obviously, and otherwise I found the book's language unusually clear for a textbook.


I have both a computer engineering education and have read the book. My recollection is that it's good for what it is but omits[0] a heck of a lot of necessary background knowledge to apply the material outside of the limited context of the book. I'm not saying that it's not good, just be aware that it's not the whole picture, so to speak.

[0] or perhaps it would be better to say "has no choice but to omit" or it would be ten times thicker than it is.


Coming from the pure CS side I definitely got this feeling. But the real value it provided me was to demystify a lot of what I was working on top of as a programmer. Not that I believe that working through this book has given me a complete understanding of the stack below where I work. I think a superscalar processor with a fat OS like Linux or Windows on top and several layers of drivers and firmware between is far far away from what this book teaches. But it did help me to stop thinking magically about a lot of what is going on. It's embarrassing to admit I once thought a lot of tools like debuggers and whatnot were doing some hocus pocus I couldn't hope to understand. This book really showed me that everything going on in a computer is capital D designed.

From your first projects in CS you're sitting atop a huge stack of tech. My own education was fairly low level compared to most CS programs: several assembly languages, C, debugging crash dumps, register watches, etc. were all part of my curriculum. Even still, there is so much below where I work that it is hard not to think magically. Just being given a toy model of how all this might work, even if not how it actually does, was extremely helpful. This may be why you might find CS majors more fond of this book than CE or EE majors.


Just curious, what curriculum gave a low-level CS education? Did you deliberately choose courses such as compiler theory and operating systems? In my school (I'm in my first semester, but I joined as a returning student since I got a Stats master's years ago), they are both optional and we can skip many low-level courses.


Honestly, it was my school. I went to Digipen Institute of Technology. There are a lot of teachers there who worked for Nintendo which has a very old school dev culture. For example, the Wii U was literally their first console with an actual operating system. There are also a lot of Microsoft folks there who worked in jobs like Xbox dev support which gives one a pretty intense education in tackling low level problems.

These days I work far, far away from that level and much of that knowledge has atrophied, but I often appreciate still understanding the concepts.


Thanks. This does sound very interesting. I wonder if there is any other place that offers a combination of many low-level courses as well as game programming ones... Those Nintendo guys must be very cool. Did any of them work in the NES/SNES period?


A couple. There was a range of experiences. The majority had been handheld developers building hardware and games for the GBC, GBA and DS.

I don't know specifically of another school that focuses on this. However, one thing I found over my college career, and you are probably already aware of, is that often, if you make a connection with department faculty and prove yourself ambitious, they are accommodating in how they will count credits.

You might consider finding a school with a friendly and flexible faculty and then see if they will allow you to pursue a CS degree, replacing some of the CS courses with CE courses. Given your existing degree, you might find a fair amount of latitude since they won't feel they need to babysit your trajectory.


I did part one of two of nand2tetris online.

The biggest part they omitted was how flip-flops work. They talk about it briefly in an end-of-unit video, however. They don't really talk in any depth about clock cycles, their interaction with flip-flops, or why they're important.
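
For anyone left curious: the DFF the book eventually hands you is specified simply as out(t) = in(t-1), i.e. the output on each clock tick is whatever the input was on the previous tick. A minimal F# sketch of that contract, assuming a single global clock (my own illustration, not the book's HDL):

    type DFF() =
        let mutable state = false     // value latched on the previous tick
        member _.Out = state          // output visible during the current tick
        member _.Tick(input: bool) =  // advance one clock cycle
            state <- input

    let dff = DFF()
    dff.Tick(true)
    printfn "%b" dff.Out   // prints true: the input from the previous tick

Because every flip-flop latches "at once" on the clock edge, the registers and counters built on top of it behave predictably; that interaction with the clock is the part the videos gloss over.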

The course, I think, is better for having left it out. It keeps the first part focused on the combination of elements culminating in a CPU.

And, most of all, it inspires you. After doing the course, I was more than motivated to understand this myself. And it remains the most educational computing course I've ever done.


Seriously. After reading this book I realized that I didn't understand shit about computers before. Wondering what the second edition adds.


Logged in to say exactly this. I thought I understood computation after my degree, and this book, while far from perfect, did an incredible job of bringing fundamental computation to life in the reader's mind.

I'm looking forward to diving into the second edition and hoping the second half is better than it was in the first edition.


I feel like computer science degrees rarely get into much of the hardware beyond some basic assembly. You might have benefitted from taking some computer engineering courses which cover the topics in this book in much greater detail.


> basic assembly

Even some people actually working on compilers still talk about pipeline stalls in a ye olde Pentium/RISC sense rather than in terms of the out-of-order monsters we have today.


Interesting, my CS degree had a required introductory Computer Engineering course, which itself had a prerequisite first-year course where we learned digital logic (among other stuff).

We didn't go quite as far as this book seems to, but we definitely got a decent grounding in how computers actually do their thing before it was back to theory.


When you've done that, read Ulrich Drepper's piece on memory ("What Every Programmer Should Know About Memory"). It's tough, but it's what sets you apart from a cowboy (not so much the section on DRAM pinouts, but the whole thing).


As someone who programs without a CS undergrad background and always feels insecure about it, this sounds both very interesting and a very good deal.


If you are interested you should dig in. The very best thing about this book or at least its previous edition is that it is really really fun. The only other textbook I've enjoyed as much is PG's ANSI Common Lisp.


I had sort of the opposite experience. The book was really fun, but it felt a lot more like I learned how to play some abstracted low-level game. It just didn't do much to actually inform me about how a modern CPU actually works.


I think that is nearly exactly the purpose of this book though. It is not to teach you how modern computers work. That may even be an impossible task (as in I'm not sure any one person could hold all of that in their head at once). But by giving you a version of a computer you can completely understand, you can begin to relate to how various pieces of the technology you work with on a daily basis might work.

For example, the portions on virtual machines do not explicitly show you how the JVM works, but they give you an idea of the concerns involved, such as managing object lifetimes and whatnot.
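
For concreteness, the book's VM language is a simple stack machine. Here is a toy F# sketch of interpreting a couple of its commands; the command names follow the book, but the code shape is my own, not the book's implementation:

    type Command =
        | PushConstant of int
        | Add
        | Sub

    let execute stack command =
        match command, stack with
        | PushConstant n, _   -> n :: stack
        | Add, x :: y :: rest -> (x + y) :: rest
        | Sub, x :: y :: rest -> (y - x) :: rest
        | _                   -> failwith "stack underflow"

    // push constant 7; push constant 8; add  ==>  final stack is [15]
    let result = [ PushConstant 7; PushConstant 8; Add ] |> List.fold execute []

The real projects add memory segments, program flow, and function calls, and then a translator down to Hack assembly, which is where concerns like calling conventions and object lifetimes start to show up.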


That's good to hear. I completed all but the final chapter of this book. I don't have a CS degree. A colleague told me I seemed to know a lot more about computing than his computer science classmates. I do wish I'd at least done data structures and algorithms in uni. Catching up now on Coursera. Next I want to go through the book Computer Systems: A Programmer's Perspective.


As others have commented, the original book was one of those little gems that, once you read it, you realise how blind you were before, and it is extremely accessible.

Often material is either too abstract or far too detailed; ECS managed to find the perfect balance where someone with a CS background can drill down to transistors and come back upwards again and really understand where they are going on that journey.


I didn't use this one, but my Computer Organization class, which did gates -> circuits -> CPU -> microcode -> assembly, was the lightbulb that allowed me to attempt to deconstruct everything down to at least the gate level if I had to, and it made me much more comfortable with computer science and software in general.

The only thing that rivaled that lightbulb was aspects of Theory of Computation with undecidability, Turing machine vs. stack machine vs. state machine powers that theoretically limit the von Neumann architecture.

I'll have to pick this up to see what is new. What would really be nice is if they got a bit into SSDs and modern high-speed networking, I/O, and multicore, which weren't as prevalent back in the day.


This particular book/course is aimed at being accessible even to high school students. The projects stop short of the depth you'd see in a computer engineering degree because the book ends up being a survey of CMPE. It's a good starting point for people who are either considering that discipline of formal study, or hobbyists/professionals with different backgrounds who want to understand hardware better.

Regardless, you'd need a different book or course to get what you're asking for in your last paragraph as that changes the scope and target of the book radically.


>"The only thing that rivaled that lightbulb was aspects of Theory of Computation with undecidability, turing machine vs stack machine vs state machine powers that theoretically limit Von Neumann architecture."

Might you or someone else have a title you would recommend for this, something in the style of "The Elements of Computing Systems"?


I TAed the class for a while in undergrad and it was one of the few CS classes with a textbook that really actually helped and augmented the lectures/homework.

The one we used was Introduction to the Theory of Computation by Michael Sipser (not hard to find a pdf online).



It was 1995... so the Comp Org was Tanenbaum's Structured Computer Organization. I also used his Operating Systems book in the OS class. I thought they were excellent but I also had great profs.

I'll try to find my theory of computation text so I can see who wrote that, but again that was a great prof that walked through it really well.

Alas, my bias against Lisp may be traced solely to the Programming Languages prof who loved Scheme but couldn't actually communicate with humans. He could recite pi to 100 decimal places and was esteemed as brilliant but literally couldn't form sentences when talking to students. I suspect the lambda calculus would have been a good insight as well, but oh well.


Sounds like you had an excellent romp - was Scheme taught with SICP?

Lambda calculus' anonymous functions are fairly simple but massively powerful - try this page? <https://www.inf.fu-berlin.de/lehre/WS03/alpi/lambda.pdf>


The book "Understanding Computation: From Simple Machines to Impossible Programs" by Tom Stuart was/is my Theory Of Computation Nirvana experience.

Its motive is the same as NandToTetris: to make any programming-literate person comfortable with the rough outlines of the theory of computation and its main lore and results. The author uses Ruby as an interactive "illustration language", a notation to describe concepts that happens to be executable. It's the same way I noticed the authors of "Structure and Interpretation of Computer Programs" use Scheme, or the way Niklaus Wirth sometimes uses Pascal in his educational writings. There is a small tutorial chapter in the beginning that crash-courses you through all you need to understand in Ruby to read the book.

The structure is a bit different: the book isn't project-based like ECS, and there is no natural hierarchy to the theory of computation (except maybe the famous state machines ⊂ pushdown automata ⊂ Turing machines, and the book does introduce those topics in that order) that would make the book feel more bottom-up. There are no exercises or nudges towards exploration and tinkering, so you need to come equipped with your own. I stopped trying to digest it in one reading and designated it as one of those books you come back to over and over again to fully absorb its nourishment.
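
As a tiny taste of the bottom rung of that hierarchy, here is a finite state machine in F# that accepts binary strings containing an even number of 1s (my own example, not from the book):

    type State = Even | Odd

    let step state symbol =
        match state, symbol with
        | Even, '1' -> Odd
        | Odd,  '1' -> Even
        | s,    _   -> s    // '0' (or anything else) leaves the state unchanged

    let accepts (input: string) =
        (input |> Seq.fold step Even) = Even

    // accepts "11" = true; accepts "1011" = false

Pushdown automata add a stack to this picture and Turing machines add an unbounded, rewritable tape; each addition strictly increases what can be recognized.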

But the main motivation of the book is strikingly similar to that of ECS: to take a complex and jargon-heavy, several-years study topic and distill the most essential lines and edges so that a minimally educated but motivated person could mostly understand it. Some parts felt rushed and weren't meant to be digested in full detail (when he was, e.g., discussing a "zoo" of other computational systems that feel superficially different from or less powerful than Turing machines but are equivalent nonetheless, he was fairly hand-wavy), but the book is so great you will feel guilty dwelling on its shortcomings.


That’s a very generous comparison, thank you!


This is fantastic. I will definitely search this title out. Thanks for the wonderfully detailed response. This sounds like exactly what I was looking for. Cheers.


I would add that I like Michael Sipser's Introduction to the Theory of Computation. It provides an introduction that explains a lot of the mathematical concepts and notation, which is helpful for self-study. Covers all the material mentioned, as well as time- and space-complexity and some other topics. However, it is basically a math textbook, so it doesn't have any actual code that you can run (you might code up the theoretical machines as an exercise). Rather, it mainly consists of theorems and their proofs.


How similar is this book to CODE by Charles Petzold? I loved that book because it progressively builds up an imaginary computer using very easy to understand concepts. Even non-technical people can read it to understand how computers work internally.


The Elements of Computing Systems is a project-based course, starting from NAND gates and going through building a computer and then an OS and software on top of it. Code goes deeper into the history leading to modern computers, but doesn't offer this project-based experience from the gate level up. IMO, the two books offer a nice complement to each other.


As a reader of both books, I couldn't agree with you more.


I read CODE before reading The Elements of Computing Systems. I felt like the former provides details that you actually implement in the latter. Highly recommend both and in that order.


"Elements of Computing Systems" is more detailed and involves you actually creating (virtually) a working computer starting from NAND gates. CODE is much more introductory.


Petzold's Code is a popularization aimed at non-technical readers, as you said. ECS is a textbook aimed at college-level CS and Comp. Eng. students and goes into sufficient technical depth for the reader to build their own digital computing systems.


https://www.nand2tetris.org/

The course is at the above website, in case you want to start working through it before this edition is released in July.


My girls, then aged 10 and 12, made it through the Coursera ECS first course. They are not geniuses, but the material is stellar. I helped them with a couple of assignments. They enjoyed it, even though neither of them went the STEM route. They became smart, artsy young women who make art and music with computers.


The first edition is a great book; I completely recommend the first ~4 chapters, but after that the ROI is not as good.

Personal opinion: I have found "Computer Systems: A Programmer's Perspective" to be a better introduction to the same set of topics.


Different books, different aims. Elements is suitable for both self-study and a 2-6 week (depending on time and level of students) survey course for high schoolers or college students in non-EE/CMPE degree programs. The material is freely available (for the 1st edition, I assume the 2nd will be as well). The print version is 1/4 the cost of Computer Systems, which seems to be aimed at college students (to be covered over an entire semester) or more experienced programmers for self-study.

The neat thing about Elements is that, despite being shallow in parts, it provides the full map of the subject with projects building on each other. Someone who's gone through it can more easily approach deeper treatments to fill in the details as they know where it's going, which also aids in motivation (removes the "what's the point" question when faced with seemingly esoteric topics).


Course is available for free on Coursera:

Part 1 - https://www.coursera.org/learn/build-a-computer (hardware projects/chapters 1-6)

Part 2 - https://www.coursera.org/learn/nand2tetris2 (software projects/chapters 7-12)


The downside is that part two is the software stack and the software grader only supports Python and Java.


Not when I took it! I did most of those assignments in Haskell (a choice I came to regret). Double-checking and it seems the grader supports the following languages: C, C++, C#, Elixir, Erlang, Go, Haskell, Java, Lua, Node.js, Perl, PHP, Python 2.7, Python 3, Ruby, Rust, Scala, Swift.


Oh really? I was going off their own FAQ, since the graded items require paying, and I didn't want to do it if I couldn't use the languages I wanted, namely F# and Racket.

From the FAQ:

> > Which programming language do I have to use in order to complete the assignments in this course?

> We expect learners to submit assignments in any version of Java, or Python. We will assume that you have basic programming ability in these languages, including a basic ability to understand and write simple object-based programs.

So is their FAQ out of date I guess?


Yeah, I guess the FAQ is out of date. I think you should be able to sign up for free if you want to check for yourself. I highly recommend it! And F# would be a pretty great language for this course, I think. Looks like no Racket (or any Lisp) support, though.


Yea, I have already done the first part of the course and then the assembler, which is in F# and linked in one of my other comments.

Thanks for letting me know about the extended grader support.


It's reasonable to support one great programming language along with Java, which is kind of a market standard.


I didn’t say it wasn’t reasonable, but it is a downside for those wanting to do something different.

I’d prefer they tested emitted code, that way you can use anything you want.


They'd need to maintain trusted sandboxed environments for every supported compiler/interpreter. That's a lot of work.


Do the lectures offer anything on top of just reading the book?


The first edition is probably the book with the highest mind-expanding content to page ratio I have ever read.


Does anybody know something about the main differences from the first edition? Is it the same HACK computer (and stack, with assembler, high-level language, etc.), or did they change that?

The original architecture had some caveats - e.g., no external interrupt capability, so IO was implemented by busy polling and inspecting specific memory-mapped IO regions (I don't blame them, as it made the book more concise and you cannot cover everything). There has been discussion in the community about a HACK-2, so it would be interesting if somebody knows more.


I emailed the authors a while back when I found out about the second edition, asking about the differences. They stated that the hardware platform and software stack specifications are unchanged.


This is one of the best books you can read as a software engineer. Lessons can be done over weekends and they teach you so, so much. I recommend this to everyone when discussing CS books. Absolutely incredible work - and concise too.


I have the first edition. It was such a great read. I practiced the concepts by building a 4-bit ALU in Minecraft back in the day (~2013).

I thoroughly recommend this book.


Is there any place to purchase a digital copy of this? I want to read it, but don't like paper copies. Besides the Kindle edition, I'm mostly looking for an EPUB or PDF.


The current edition can be read for free; check the "Projects" section at https://www.nand2tetris.org/course. There is also the Kindle version of the current edition.


This is awesome. Thanks for sharing!


I had the pleasure of being taught this course by Prof. Schocken and also of implementing https://www.nand2tetris.org/ myself. It's very important to know that Prof. Schocken is a true hacker - he speaks a lot with his students about hacking and curiosity, which is far more important than academic grades.


I just want to add to what everyone else here is saying. I recommend this book to everyone. Very excited to see how they've updated it.


I love this book. At the same time, isn't the whole point of abstraction to make this knowledge irrelevant for people who build applications on top of it? Like, the knowledge should only have utility insofar as the abstraction designers and implementers at the levels below yours did a bad job, unless you have a use case they didn't design for.


Gah! So mad I just bought the first edition!


Don't be mad. I've done that before with some books I really liked. Work through the one you've got, then optionally pick up the second edition when it's released. I usually throw the "oops, they released an update a month after my purchase" editions onto my wish list and get it within a year as a birthday or Christmas gift.


I have seen books suffer from the "second-system syndrome."

http://www.neilgunton.com/doc/?o=1mr&doc_id=8583


I wonder what might be some good examples of that. IME, it's mostly been college textbooks which suffer from being refreshed for the sake of encouraging new purchases (killing the used market), often by someone other than the original authors and editors.

Of course, some AD&D fans would say everything past 2nd edition might be an example of it.


> past 2nd edition

Case in point: the vomit-inducing colorful print of the 3rd edition of Axler's Linear Algebra Done Right.


Sorry that you do not like the color in the third edition of Linear Algebra Done Right. Most students find it useful, for example, to have definition boxes be a different color than theorem boxes. The color did not increase the price of Linear Algebra Done Right. New hardbound copies of the book usually sell for about $45 on Amazon, much less than most competing books. Furthermore, Linear Algebra Done Right almost always has the best Amazon sales rank of books for a second course in linear algebra. --Sheldon Axler


I have nothing against using color in books. I think that, for example, Topology Illustrated by Saveliev got this right. The use of color is especially helpful in presenting objects (not text) when the density of information is high, e.g. in schematic images found in books on cell or molecular biology. Turning an otherwise excellent text into a shiny toy, on the other hand, only serves to damage its reputation in the eyes of a serious reader (but, sure, not necessarily a younger student who may be used to or even appreciative of shiny things).


Hah. I actually had two linear algebra texts in college, both ostensibly by the same author (I can't recall now who it was, so I can't verify the later edition was actually by the same man and didn't reuse his name). One was my mother's in college some time in 1976-1980, the other was mine from some time in 2000-2004. If it weren't for the exercises, I'd have only used hers because it was a much clearer text; it was also less than half the size (overall smaller in every dimension: shorter, narrower, and thinner).


On a black and white Kindle, I rather like Linear Algebra Abridged. That might be different on paper and in color (always overrated).


Dawkins believes Darwin's "Origin of Species" suffered from that as later editions bowed somewhat to religious pressure.


I did the same thing. This book has been on my radar and I bought it about 3 months ago.


This is an amazing book, the engineering, bottom-up counterpart to the equally marvelous, but top-down, mathematical SICP. Between these two a serious student can get a wonderful foundation in CS and a deep and hopefully enduring connection to the beauty in it.


In July.


Sunshine and Friendship.

Computing literature published.

A generation renewed.


Just pre-ordered the Kindle version


As a side note:

It is quite fascinating how computers are really just bits of 0's and 1's. What a magical machine built upon a layer of boolean logic. What an incredible feat.
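
The book's very first project captures exactly this: every other gate is derived from NAND alone. A tiny illustration of the idea in F# (the book uses its own HDL, not F#):

    let nand a b = not (a && b)            // treated as the single primitive
    let not' a   = nand a a                // NOT from NAND
    let and' a b = not' (nand a b)         // AND from NAND
    let or'  a b = nand (not' a) (not' b)  // OR via De Morgan

    // or' true false = true; and' true false = false; and so on

From there the same trick repeats: gates into adders, adders into an ALU, and an ALU plus registers into a CPU.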


How does this compare to the NAND to tetris course?


It's the book for this course!


I wonder what is different in the Second Edition. I own the first and absolutely love it.

Also loved the nand2tetris course.


Is there any plan to offer the work in progress as an eBook prior to release?


How does this compare to "Digital Design and Computer Architecture" by David and Sarah Harris? Can anyone who has read both comment?

I've been wanting to learn more about the fundamentals of how computers work. I'm also interested in exploring FPGAs.


Better alternative: https://pages.cs.wisc.edu/~remzi/OSTEP/

It's also free. Does not cover discrete math and assembly afaik.


Do you need any external resources to go through this book?


The webpage https://www.nand2tetris.org/software has a link to a file, nand2tetris.zip, which contains projects and tools which are all one needs to go through the book (explained on this page). This open source software can be run on Windows, Unix, and MacOS. I've worked through most of the book on my Mac.


Thank you!



