Edit: At the start, the original title is visible: "Richard Feynman: The Computers From the Inside Out", and the lecture is from 1985.
The major quote:
"Computers are not primarily computing machines, they are actually filing systems."
"You can scroll through memory as well" --> Would be nice to have a scroll-bar because I missed this at first and wondered why there were only 6 bytes of memory. (Would also be nice to see a bit more of the memory at once, although then the instructions might be easy to miss. Maybe a "show more memory" check-box?)
It would also definitely be nice to be able to reset or modify the instruction pointer.
And of course the ability to save/load the CPU state (e.g. a simple copy/paste to the clipboard).
Next steps would be to teach about basic I/O, since that's a fundamental part of any computer. (Start with some simulated LEDs/toggle switches that map directly to a special address space?)
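A minimal sketch of that idea in Python, assuming a 256-byte memory and picking two device addresses (0xFE for switches, 0xFF for LEDs) purely for illustration; nothing here is from the actual simulator:

    # Hypothetical memory-mapped I/O: the top two addresses are devices, not RAM.
    SWITCHES = 0xFE  # reads here sample the toggle switches (assumed address)
    LEDS     = 0xFF  # writes here drive the LEDs (assumed address)

    memory = [0] * 256
    led_state = 0
    switch_state = 0b00000101  # pretend the user flipped switches 0 and 2

    def read(addr):
        # Reading the switch address samples the physical switches.
        return switch_state if addr == SWITCHES else memory[addr]

    def write(addr, value):
        global led_state
        if addr == LEDS:
            led_state = value & 0xFF  # drives the lamps instead of storing
            print(f"LEDs now show {led_state:08b}")
        else:
            memory[addr] = value & 0xFF

    # A one-line program: copy the switches to the LEDs.
    write(LEDS, read(SWITCHES))  # prints "LEDs now show 00000101"

The nice part pedagogically is that no new instructions are needed: the same MOVE the learner already knows becomes "read the switches" or "light the LEDs" depending on the address.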
Easily accessible, fun, and if you find it too simple, you can always introduce others to it.
One request: don't introduce the concept of 'nibble' on the first page. It's jargon that no one needs in this general introduction. And it just sounds silly.
And it sounds silly and adds nothing. That too.
(edit: my education on this stuff was in the '00s. Sounds like it was more common in the '80s. I'd definitely support obsoleting the term entirely now, though! It adds unnecessary complexity.)
And replace it with what? Bistable multivibrator?
That was last year. I also got the impression that he liked to fill likely gaps when he could do so with small detours.
I don't think I read or heard about it again for over a decade. It's never sounded silly to me, or at least no sillier than "byte", and it's at least slightly useful when explaining hex conversion (one nibble is exactly one hex digit).
It's not a synonym for byte. In modern systems, a word is typically 4 or 8 bytes.
I started it at the request of some HN users, and I hope it continues to enlighten some software folks who wonder how the CPU they rely on actually works.
So, I plead: if you have cause to write a digital design book, don't write it as a "Verilog for Computer Scientists"! Verilog is a big language, and not all of it is good for writing hardware. What you really want is "Digital Design for Computer Scientists".
- Being able to think in parallel. Software is sequential by default; HDL is parallel by default.
- Understanding what you want to build. If you can't draw what you want, you probably can't write good HDL. There's a reason it's called Hardware Description Language, and not "JS2GATES". In a sense, HDL is really just text-based schematics.
Case in point:
When you think "HDL" you probably think of Verilog or VHDL, but a SPICE netlist is also an HDL.
Totally. When working through the nand2tetris stuff, several of my designs fell on their face because I started thinking of multistep circuit logic as sequential, forgetting that junk results aren't discarded by the "flow" but that their values are always "evaluated" and fed into downstream logic.
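A tiny Python sketch of that trap, modeling one "tick" of combinational logic as a function; the names are made up for illustration:

    # One tick of combinational logic: every path is computed on every tick,
    # whether or not the mux downstream ends up selecting it.
    def tick(a, b, sel):
        sum_path  = (a + b) & 0xFF  # evaluated every tick...
        diff_path = (a - b) & 0xFF  # ...and so is this, even when "unused"
        # The mux doesn't skip the unselected input; it just picks which
        # always-present value to feed downstream.
        return sum_path if sel else diff_path

    print(tick(7, 3, sel=1))  # 10, but the subtract path still "happened"

In a sequential mental model the unselected branch never runs; in hardware both branches always exist on real wires, junk and all.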
Thinking of a transistor as a voltage-controlled switch (obviously simplified to the point of being wrong, but it works for understanding digital logic), seeing how you can use those to build logic gates, and then using gates to build muxes and flip-flops and so on up was essential to my understanding of HDLs.
(Incidentally, all that has also made how computers and machine code work much clearer to me).
I don't think I could've really done it without that foundation. And I certainly couldn't have done it by just thinking of Verilog with my programmer's idea of concurrency.
It's obviously very low level and you'd have to gloss over a bunch of things, but I'd definitely start with transistors if I had to explain programmable logic to somebody.
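For the curious, here's roughly that progression in a few lines of Python. The switch model is the same deliberately wrong simplification, and none of this is meant as real circuit simulation:

    # Transistor as a voltage-controlled switch: conducts when the gate is high.
    def nmos_conducts(gate):
        return gate == 1

    # CMOS-flavoured NAND: the output is pulled low only when both series
    # switches conduct; otherwise the pull-up network wins.
    def nand(a, b):
        return 0 if (nmos_conducts(a) and nmos_conducts(b)) else 1

    # NAND is universal, so the next layer up falls out of it:
    def inv(a):     return nand(a, a)
    def and_(a, b): return inv(nand(a, b))
    def or_(a, b):  return nand(inv(a), inv(b))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", nand(a, b), and_(a, b), or_(a, b))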
00000001 <-- this line is both an instruction and data.
00000001 <-- this line is both data and an address.
Looking at this, I expected that it would read line one, then move the value at the address on line two into the address on line three, i.e. set the instruction pointer to 00000001. This didn't work. What did was:
This program advances to line 4 then remains there. Could someone explain why program two is an infinite loop, but program one isn't? Thanks!
If you modify the last line to "11111101", then when the instruction pointer is increased by 3 it will overflow back to 00000001, and you will get your infinite loop.
The problem is that it moves that value into the instruction pointer, then the instruction pointer increments itself by 3.
The instruction pointer always increments itself by 3 after every instruction.
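Here's a guess at the simulator's inner loop in Python, just to make that ordering concrete. The 3-byte MOVE encoding matches what the thread describes; treating the instruction pointer as memory cell 0 is my assumption, purely for illustration:

    # Assumed: 3-byte instructions (opcode, src, dst), opcode 1 = MOVE,
    # and the instruction pointer is itself a memory cell (cell 0 here).
    IP, MOVE = 0, 1

    def step(mem):
        op, src, dst = mem[mem[IP]], mem[mem[IP] + 1], mem[mem[IP] + 2]
        if op == MOVE:
            mem[dst] = mem[src]        # this can overwrite the IP itself...
        mem[IP] = (mem[IP] + 3) % 256  # ...but the +3 still happens afterwards

    mem = [0] * 256
    mem[IP] = 1
    mem[1:4] = [MOVE, 1, IP]  # "move the value at address 1 (which is 1) into the IP"
    step(mem)
    print(mem[IP])            # 4, not 1: the move lands first, then the increment

Moving a value like 11111101 instead makes that +3 wrap around, which is the trick in the comment above (where exactly it wraps to depends on how the simulator numbers its cells).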
Also - it would be very neat to have a "reset" on the Instruction Pointer, so that you don't have to refresh the current page if you want to start your program over...
Knowing most if not all of this already, I can't say how well it teaches, but it does cover what I spent much longer learning.
Two suggestions for the author: first, identify yourself on the site! Second, interpret "0" as "STOP" or "RESET" to prevent the simulation from running forever. Alternatively, adjust the UI to indicate that the PC is off-screen.
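Grafting that onto the hypothetical fetch loop sketched above (same assumptions: cell 0 is the instruction pointer, opcode 1 is MOVE):

    # Hypothetical tweak: opcode 0 halts instead of executing forever.
    HALT, MOVE = 0, 1

    def run(mem):
        while True:
            op = mem[mem[0]]
            if op == HALT:
                break                      # "0" read as STOP, per the suggestion
            src, dst = mem[mem[0] + 1], mem[mem[0] + 2]
            if op == MOVE:
                mem[dst] = mem[src]
            mem[0] = (mem[0] + 3) % 256

    run([0] * 256)  # halts immediately: the first fetched opcode is 0

Since fresh memory is all zeros, any program that runs off its own end immediately hits a 0 opcode and stops, which is exactly why this would prevent runaway simulations.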
It was quite enlightening to see over the course of three weeks or so how a (simple) CPU could be built from ever-larger building blocks, but starting at the very beginning with transistors.
The CPU jumps to the instruction at #4, which tells it to add the value of #2 to #3. The instructions at #7-#10 reset the instruction pointer to 00000001, so that it'll increment to #4 again. The result is that #3 is continually incremented by one.
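Glossing over the encoding, the described program boils down to this plain-Python equivalent (initial cell contents assumed for illustration):

    # Plain-Python equivalent of the loop described above.
    cell2, cell3 = 1, 0   # assumed starting contents of #2 and #3
    for _ in range(5):    # the real program loops forever
        cell3 += cell2    # the ADD at #4: add #2 into #3
        # #7-#10 put 00000001 back into the instruction pointer; the
        # automatic +3 then lands execution at #4 again.
        print(cell3)      # 1, 2, 3, 4, 5, ...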
I've usually preferred to stick to really high-level stuff (like JS/Node.js), but trying this out makes me want to go learn assembly or something!
But (is there always a but?) it's very unapproachable. It has no welcome mat.
"Let's dive in" is normally an appreciated attitude, but here it makes it sound like we are coming in in the middle, and have missed the introduction. So, what is this thing? The page doesn't really say. There's an about page, but most people are too lazy to click. They want to see the value explained immediately, clearly, succinctly, on the landing page... or they're gone.
If they do click through to the About page, the writing there is stilted and formal. But most won't click, I suspect.
When there are clickable images, it's not clear they are clickable. The text indicates you should click, but there are these huge arrows right there, and the obvious interpretation of the text is that you're supposed to click the arrows to see the image do its animation. Nope.
"CPU" is very jargonish as well. A better title would be something like "How Computers Work".
Still, nice design, beautifully made, good chunking of stuff into bite-sized pieces.