I just completed part 1, building a computer from gates and writing an assembler, and am in the middle of part 2, where you develop compilers and a stack-based virtual machine.
For someone like me who did not study computer science but has been a self-taught programmer in the analytics space for over a decade, this has been a tremendously useful and accessible introduction to basic CS concepts, and very eye-opening in terms of what's actually happening under the hood. This course is highly recommended if you've not studied CS, though be aware that the second part, where you develop compilers, is quite intense; being an online course, though, you can set your own pace.
I wonder if there are similar courses on CS algorithms for people who didn't study CS in college.
Algorithmic Design and Techniques
Data Structures Fundamentals
Graph Algorithms
NP-Complete Problems
String Processing and Pattern Matching Algorithms
Dynamic Programming: Applications In Machine Learning and Genomics
Graph Algorithms in Genome Sequencing
Algorithms and Data Structures Capstone
Very little of which is presented in a way that helps someone learn how compilers work.
In the actual history of computing very few good compilers were ever written by algorithm and data structures professors; most were written by people who had a pragmatic drive to get something done. Theory is good but without practice it is little more than navel gazing. Theory and practice need each other.
The reason that nand2tetris works so well as a course is that it gives you a concrete example of how all these algorithms and data structures are combined to turn an abstract description (the source code) into a running program, and it also shows how that computation is achieved through the compiler, assembler, linker, runtime, operating system, ISA, and finally the transistors.
The deeper issue it gets at is that most programmers have very little understanding of how computers actually work, and when things break they just do high-order guessing to get things working again rather than truly understanding what went wrong.
As an EE undergrad, I had to design a TTL based processor, program the microcode, etc. I have no idea how a compiler works, or anything above assembly.
> I wonder if there are similar courses on CS algorithms for people who didn't study CS in college.
This might be a bit more challenging, with less hand-holding, than the nand2tetris course, but if you are up for it, why not go with the best? Sedgewick's Algorithms has been one of the standard algorithms textbooks in universities for a long time.
My most striking instance yet of the Baader-Meinhof phenomenon: yesterday, I ran into a colleague I hadn't seen in months, and he recommended I check out this incredibly cool website, nandgame.com, where you build a basic computer starting with logic gates. Just now, I was about to check it out, but thought I'd skim Hacker News first... only to stumble upon this post, and a comment from the nandgame creator himself. Thank you in advance for building such an awesome educational resource!
Just played through it and really liked it! I think this is just the right level of abstraction for a beginning CS student. Makes it easy to grasp the compositional nature of computers without having to deal with all the details that make circuit design hard, like the non-instantaneous response of real components.
This is really awesome. I too love the book Code by Petzold. I was wondering if I could help translate this into German? I have some work colleagues who could really benefit, but I think it would be easier for them if the explanatory text were in German.
I've spent some time playing it. Something that would be interesting is a view switch that expands your design back into NAND gates, so that you can really appreciate the complexity of what's going on.
If I make an XOR circuit with 4 NAND gates, it says I can do better. My 3-gate solution is to use an AND, an OR, and a NAND gate, which would have been 6 NAND gates. However, it gives me full marks for the 3-gate solution.
Yeah, it currently only compares solutions against the minimum number of components of any type. I also intend to compare against the minimum number of NAND gates overall, but this is not yet supported.
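For anyone curious, the 4-NAND XOR mentioned above is easy to check with a few lines of Python. This is just my own truth-table sketch of the standard construction, not the game's notation:

    # Classic XOR from exactly 4 NAND gates, sharing one intermediate signal.
    def nand(a, b):
        return 0 if (a and b) else 1

    def xor_4nand(a, b):
        m = nand(a, b)                        # shared intermediate
        return nand(nand(a, m), nand(b, m))   # 4 NAND gates total

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor_4nand(a, b))
    # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0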
I really cannot recommend this course enough; I find myself referring back to it in so many different fields (security, debugging, programming language design).
When I finished school, I was fairly knowledgeable about physics, including logic gates and electronics, and I was a fairly good programmer, but I had no idea how that virtual world arose from the physical.
When I type on a keyboard, what is physically happening in the universe?
Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines.
... cleverly illustrated and eminently comprehensible story ... No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.
I think you and the GP are using "lower level" in different ways. GP means with a lower barrier to entry, you mean "closer to the electrons" if I understand both of you correctly.
Code goes into how logic gates like the NAND gate are built, so it does start at a lower level than Nand to Tetris, which starts with NAND gates as primitives.
I wish this had existed as a final-year course in college. It recaps a lot of the parts that make up a modern-day computer and likely gives you an understanding of where imperative languages come from.
Once you’ve learned programming, this book is maybe the perfect survey course for understanding systems. It gave me a framework for understanding all the layers from the CPU to ReactJS that 4+ years of college never could so concisely.
The title of the course is misleading, as it suggests that NAND gates are the only required primitive, whereas the course later introduces DFF gates, without which you cannot escape the parallelism of chips.
Thanks, that page is nice, but it too emphasizes that you also need a clock. That's essential, and it was the point I was trying to make. For example, something that is usually glossed over in OS courses (in favor of, let's say, scheduling algorithms and such) is that you simply can't build preemptive multitasking out of pure code. Later, when you ask yourself how such a phenomenon is possible in the first place, you learn that chips are inherently parallel, that sequential code is an abstraction over that so one can have some sort of law and order, and that only then do interrupts enter the scene so that one may break that order when needed, e.g. for preemption.
You can make a clock easily enough: two NAND gates, two resistors, two capacitors. If you rely on parasitic components you will get a pretty wild resonance, so it's better to nail it down with something a bit more stable. But in principle it will work just fine.
Agreed on the 'preemptive multitasking' bit, but that has little to do with this subject. And it is kind of embedded in the name: you need something to do the pre-empting, which by definition has to come from outside.
But building a functional computer with just NAND gates is absolutely possible, even if there are better ways, just like you could use Brainfuck or combinators to do meaningful computation. The whole idea of the course is not to give some kind of purist model for computation but to show that the essence of a computer can be boiled down to some very simple building blocks. So the title of the course is misleading only if you want to pick nits. Course titles are made to attract students and to give broad cover to what the course is all about, not to satisfy language purists.
Couldn't you, in theory, build a clock purely from NAND gates? You create a cycle of NAND gates where the output of one is tied to both inputs of the next. If the cycle has an odd number of NAND gates, each NAND will oscillate with a period depending on the length of the cycle.
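As a rough illustration of that idea (a unit-delay toy model in Python, my own assumption about how to model it, not how any real gate behaves), an odd-length ring has no stable assignment, so the values keep chasing each other around the loop:

    # Toy unit-delay simulation of a NAND ring oscillator: each NAND has both
    # inputs tied to the previous gate's output, so it acts as an inverter.
    def nand(a, b):
        return 0 if (a and b) else 1

    N = 3                  # odd number of gates -> no stable state
    state = [0, 1, 0]      # arbitrary starting values

    for step in range(7):
        print(step, state)
        # every gate updates from the previous gate's old output
        state = [nand(state[i - 1], state[i - 1]) for i in range(N)]
    # the pattern repeats with a period of 2*N gate delays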
You can make a DFF from NAND gates, though it's pretty obvious the simulator they wrote for the class can't perform that emulation. I was a little put off from the class when they tried to explain why they didn't want to cover them "because they were too complicated". It's pretty obvious to anyone who understands them that they're not significantly more complicated than the rest of the material. I gave up on the class right there, thinking the instructors and the rest of the class would be a lot of the same BS. I'm glad I got over it and completed the rest of the class as it was pretty fun.
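To make that concrete, here is a rough Python sketch (my own illustration, not the course's HDL or simulator) of a gated D latch made only of NAND gates; two of these in a master/slave arrangement on opposite clock phases give you an edge-triggered DFF:

    # Gated D latch from exactly 4 NAND gates, settled to a fixed point after
    # each input change (a crude stand-in for real propagation delays).
    def nand(a, b):
        return 0 if (a and b) else 1

    def settle(d, enable, q, qn):
        for _ in range(10):                      # iterate until the feedback stabilizes
            s = nand(d, enable)                  # set, active low
            r = nand(s, enable)                  # reset, active low
            new_q, new_qn = nand(s, qn), nand(r, q)
            if (new_q, new_qn) == (q, qn):
                break
            q, qn = new_q, new_qn
        return q, qn

    q, qn = 0, 1
    q, qn = settle(d=1, enable=1, q=q, qn=qn)    # enable high: q follows d
    print(q, qn)                                 # -> 1 0
    q, qn = settle(d=0, enable=0, q=q, qn=qn)    # enable low: q holds its value
    print(q, qn)                                 # -> still 1 0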
And one thing that is said even less is: why NAND? You can do it all with NOR gates as well if you want. The reason for NAND has to do with switching speed: pMOS switches quicker in parallel (either transistor can get the current flowing) and nMOS switches quicker in series (either transistor can stop the current flowing), and this is how a NAND gate is wired. NOR gates, on the other hand, have the worst of both.
The fact that nMOS switches off quicker is why the most common form of dynamic logic (as opposed to static logic) implements, in effect, the bottom half of the CMOS stack in nMOS and replaces the upper half with a capacitor, which holds the charge, and hence the pull-up, for the length of the clock cycle. (This is why dynamic-logic CPUs can't be single-stepped in hardware: the clock would be too slow for the capacitor to hold its charge.)
During my intro to computer engineering course, the lectures touched on why NAND gates are used, and the conclusion was that you can build anything you need in a basic computer using NAND gates. It boils down to expense: a BOM (bill of materials) with fewer line items tends to be cheaper.
This doesn't really hold in today's world of silicon prints, but at the start I think it was price that drove the decision.
Actually, a naive translation of an XOR gate into NAND gates would use 20 transistors, but if you don't need a large fan-out from the output you can do it with 6 transistors using a pass-gate layout (which is neither NAND nor NOR). So the choice of NAND vs NOR typically isn't driven by transistor count.
Don’t NANDs take up less room too? I took a CMOS course several years ago and laid out a 16x16 shift register. The NAND was simple: orthogonal strips of gate metal and channel. The NOR required a parallel structure with more area.