I've taught this material before, and I find it best to start with Boolean logic and combinational circuits, then sequential ones, but not to jump straight into building a CPU. Instead, after covering clocked/sequential logic and registers/memory ("storage loops"), I introduce a counter, then connect the counter's output to the address inputs of a memory to create a machine that does nothing but read memory in sequence. Adding an "accumulator" and an adder turns that into a rudimentary adding machine, and I note that such a machine was already able to save (or eliminate, depending on your viewpoint, though I rarely go off on an ethical tangent) countless hours of human labour at the beginning of the last century. I then slowly evolve the adding machine into something closer to a CPU: first by making it capable of operations other than addition (also a good time to show the elegance of two's complement), then by gradually turning the numbers stored in memory into not just data to be added but operation codes, and eventually, with the addition of control-flow and indirection instructions, memory addresses too.
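As a rough Python sketch (my own illustration, not part of the original comment), here is the "counter wired to memory, plus accumulator and adder" stage described above. The counter drives the address lines, each word read out is added into the accumulator, and two's complement means the same adder also subtracts:

```python
def run_adding_machine(memory):
    """Each clock tick: read memory[counter], add it into the accumulator."""
    accumulator = 0
    for counter in range(len(memory)):  # the counter drives the address inputs
        # An 8-bit adder simply wraps around; this is what makes two's
        # complement work: adding 0xFD (253) is the same as subtracting 3.
        accumulator = (accumulator + memory[counter]) & 0xFF
    return accumulator

print(run_adding_machine([10, 20, 0xFD]))  # 10 + 20 - 3 = 27
```

The later stages of the evolution then reinterpret some of these memory words as opcodes and addresses instead of summands.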
This "bottom-up" approach tends to bore some students, but in my experience it also leads to far deeper understanding than more superficial approaches that skip over a lot of the intermediate steps. It helps to keep interest by showing these machines actually working, either with real logic chips on a breadboard or in a simulator; and of course, give the students hands-on experience if possible.
As a student, I found the bottom-up approach quite interesting in a lab format. (Though it was an FPGA lab - learning about CPU architecture was a side benefit. And it was at M.Sc. level, so we were somewhat familiar with the basics.)
We built some fundamental blocks in VHDL (e.g. buffers, mux, adder, memory, decoder, ALU), combined them into a basic CPU with fetch/decode/execute stages, added a load/store unit and control-flow instructions to make it Turing-complete, then added optimizations like pipelining. We wrote some code to run on it. Then we switched to an off-the-shelf LEON3 softcore and implemented our own AMBA peripherals to interface with real sensors/actuators.
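The fetch/decode/execute loop plus load/store and control flow can be sketched in a few lines of Python (a toy machine of my own with a hypothetical instruction encoding, not the actual lab CPU): it is exactly the conditional branch that lets the machine loop, which is what the Turing-completeness point hinges on.

```python
# A hypothetical 5-instruction machine: one accumulator, word-addressed data
# memory, and a program counter.
LOAD, ADD, STORE, JNZ, HALT = range(5)

def run(program, data):
    acc, pc = 0, 0
    while True:
        op, arg = program[pc]  # fetch
        pc += 1
        if op == LOAD:         # decode + execute
            acc = data[arg]
        elif op == ADD:
            acc += data[arg]
        elif op == STORE:
            data[arg] = acc
        elif op == JNZ:        # conditional branch: the control-flow step
            if acc != 0:
                pc = arg
        elif op == HALT:
            return data

# Sum data[0] down to zero into data[1]; without JNZ this loop is impossible.
prog = [(LOAD, 1), (ADD, 0), (STORE, 1),   # data[1] += data[0]
        (LOAD, 0), (ADD, 2), (STORE, 0),   # data[0] += data[2]  (data[2] = -1)
        (JNZ, 0),                          # repeat while data[0] != 0
        (HALT, 0)]
print(run(prog, [3, 0, -1])[1])  # 3 + 2 + 1 = 6
```

In the lab, each of these `elif` branches corresponds to a path through the decoder and ALU rather than a Python conditional, but the control structure is the same.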
The ability to view the NAND gates your code synthesizes to was great. Tracking signal propagation delays motivated certain basic design decisions very well. Writing code that runs on a CPU that you also wrote is incredibly rewarding. (And it's interesting when a bug can be either in your program or in your CPU.) And finally, integrating a real-world softcore really showed how practical this can be even outside of Intel/AMD.
I just wanted to say thank you for this. I've been trying to teach myself various things about hardware over the years. I've been working on a virtual machine for a few years now in my spare time, trying to make it as close as possible to how I understand CPUs work, to teach myself more. I have no idea why, but the way you've broken down each step and building block here gives me some really clear ideas of things to start researching more.
I'd like to get into electronics and microprocessor stuff more as a hobby, but a lot of it kind of mystifies me these days. I played with electronics a bit as a kid in high school but never really got into it much after. I do a lot of work repairing and maintaining CNC machines now, and every time an electronics issue comes up it makes me want to get more serious about it.
Ben Eater has an excellent step-by-step tutorial on building a complete (simple) computer on a bunch of breadboards. It's fascinating to watch, and he does a great job explaining all the details. Highly recommended.
I have to mention "But How Do It Know?" here as well. I came across it from a previous HN discussion and picked up a copy. It's really, really good. And it's a really easy read. http://www.buthowdoitknow.com/
This body of knowledge, taught from the early 1960s onwards, is remarkable; its creators accomplished so much in the world. Clever "graphical" techniques like Karnaugh maps, which subsume more theoretical structures such as Boolean lattices, are a delight.