
I don't know how CPUs work so I simulated one in code - boyter
https://djhworld.github.io/post/2019/05/21/i-dont-know-how-cpus-work-so-i-simulated-one-in-code/
======
balls187
The final for Computer Architecture had us building an 8-bit CPU. It was a
multi-week project, starting with designing the instruction set and leading up
to building the CPU in software and implementing bubble sort in assembly.

The first and only time I had to pull an all-nighter (two, actually) in
college was due to that project. Two days before the final presentation, the
CPU didn't work: after a few clock cycles the memory would contain garbage. I
ended up rebuilding it from scratch, debugging every step of the way, only to
find out the 1-bit mux (a primitive supplied with the software) was wired
backwards.

0 corresponded to the B input, and 1 selected the A input.
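
In code terms the bug is tiny, which is what makes it so hard to spot. A minimal sketch in Go (invented names, not the course tool's actual primitive):

    package main

    import "fmt"

    // mux1 is a 1-bit multiplexer: by the usual convention, sel == false
    // (0) passes input a through and sel == true (1) passes input b.
    func mux1(sel, a, b bool) bool {
        if sel {
            return b
        }
        return a
    }

    // mux1Backwards behaves like the miswired primitive: 0 selects the
    // B input and 1 selects the A input.
    func mux1Backwards(sel, a, b bool) bool {
        return mux1(sel, b, a)
    }

    func main() {
        // With sel = 0 we expect input A, but the backwards mux yields B.
        fmt.Println(mux1(false, true, false))          // true  (input A)
        fmt.Println(mux1Backwards(false, true, false)) // false (input B)
    }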

Once I corrected that, the CPU worked like a charm, we nailed the final preso,
and I slept for 16 hours.

~~~
spchampion2
I had a very similar experience, including a very late night trying to fix a
weird integration problem along with my project team. We had spent hours
trying to find the issue, but nobody was thinking effectively and we were
making stupid decisions. Everyone wanted to keep going, but I insisted that we
were not thinking clearly and that sleep would help us better solve the
problem. I even promised I would come back first thing in the morning to start
debugging again on my own. The team begrudgingly agreed and we walked away for
the night.

I slept, came back in the morning, and had the problem fixed in like 15
minutes. I learned so much about CPU design from that class, but I also
learned how important sleep is to thinking clearly.

~~~
sixothree
I woke up one day recently and "knew" what a co-worker had done wrong. He had
pasted the steps of his issue with screenshots. Having slept on it, I realized
he had misspelled a server name. I have no idea how I discovered that while I
was sleeping. But there it was.

~~~
jussij
The mind is an amazing bit of kit.

On several occasions during my IT career I have faced situations where the
problem at hand took days or weeks to solve.

What is most amazing is that on some occasions the solution literally just
pops into my head.

It's almost like there is some _background thread of the mind_ working
subconsciously on the problem, only to rise to a conscious level when a
solution is found.

~~~
ishiz
There certainly seems to be something to it, and it isn't a new phenomenon:
there is a story that Archimedes, around 250 BC, couldn't solve a problem
until he stopped working on it and took a bath. While the story may not be
true, it does suggest that these "eureka moments" were familiar thousands of
years ago, not just today.

I'm not a psychologist, but from what I understand, "dual process theories"
state that humans have two distinct thought processes. System 1 is fast,
instinctive, and unconscious; it answers questions like "2+2=?" System 2 is
conscious, deep thought; it answers questions like "if 2x+3=17, x=?" System 1
is the default thought process, but it can be suppressed by system 2. I'd love
to hear from anyone who knows more about how this might be connected.

~~~
Angostura
To be fair, the problem Archimedes was stuck on was how to measure the volume
of an irregular body. Immersing his irregular body in the bath kinda gave him
a clue.

------
xigency
> So I’m trying to get a better understanding of this stuff because I don’t
> know what L1/L2 caches are, I don’t know what pipelining means, I’m not
> entirely sure I understand the Meltdown and Spectre vulnerability papers.

I wouldn't discourage anyone from learning about hardware from implementing
gates or even looking at simple 8-bit CPUs, but if you are interested in
learning how modern caches and pipelines work, there is a free Udacity course
that goes into excellent detail [0]. You can also find the lecture videos on
YouTube.

This is originally an online course at Georgia Tech and the professor does an
excellent job teaching these concepts.

[0] [https://www.udacity.com/course/high-performance-computer-arc...](https://www.udacity.com/course/high-performance-computer-architecture--ud007)

~~~
dr_zoidberg
There's a very good article [0] that makes the rounds here every now and then.
It gives a very good explanation of many of those concepts, while remaining
understandable for those who haven't taken CPU design courses.

[0]
[http://www.lighterra.com/papers/modernmicroprocessors/](http://www.lighterra.com/papers/modernmicroprocessors/)

Edit: It'd be great to see a 2019 refresh (the last update is from 2016)
covering Ryzen and the newer Intel designs.

~~~
djhworld
Bookmarked, thanks for the link. I think it will help me with the next step.

------
blago
I took an architecture class back in college, and the first time we met, the
professor gave us a couple of programs described in plain English. One of them
was basically sorting; the other was something else.

Each of us spent the rest of the semester picking an instruction set,
designing a system, writing an emulator, and writing the code that would
perform the tasks described on the first day.

I went a little overboard and created a C backend for the emulator along with
an in-browser JS client that was pretty much a full-blown machine language IDE
and debugger.

Needless to say, this feature creep didn't end well. I barely made it work
well enough to get an OK grade, but I learned that overconfidence can be more
dangerous than the lack of it.

Here's what I was able to salvage from the front end on short notice:
[http://blago.dachev.com/~blago/CS-535/stage_2/src/web/](http://blago.dachev.com/~blago/CS-535/stage_2/src/web/)
It's using ext.js which was pretty cool at the time.

~~~
Hendrikto
> I barely made it work well enough to get an OK grade, but I learned that
> overconfidence can be more dangerous than the lack of it.

That is a very important lesson.

~~~
w0utert
Yeah, reminds me of the time I had to write a BSP tree builder as a course
assignment, and ended up spending two months writing an OpenGL renderer plus
an importer for the Apple QuickDraw 3D API instead (hint: don't try to do
that; it's almost a full object-oriented programming language in itself if you
want to support even a moderately complex model).

All just because I found some nice architecture models that were only
available in QD3D format, and I was convinced it would only take me a few
hours to read them... I only got the brilliant idea of just converting them to
something simpler like .3DS after I already finished the assignment >_<

------
flanbiscuit
Very cool!

I had the same thoughts as you and bought "From Nand to Tetris"[1] a while ago
but I did not get as far as you. You've inspired me to pick it back up and
finish the book.

Curious whether you decided to use "But How Do It Know?" over "Nand to Tetris"
for any specific reason, or were you just not aware of the latter?

1. [https://www.nand2tetris.org/](https://www.nand2tetris.org/)

~~~
djhworld
Hi author here!

I actually didn't set out to build anything; it all happened organically! I
honestly cannot remember why I chose the book, I think I just saw the blurb
and figured it was good enough to read.

It was only when I was a couple of chapters in that I figured I could probably
whip something up in code :)

~~~
Hnrobert42
Hello author. I just wanted to heartily thank you for trying this and writing
a post about it. I love seeing people pursuing technology for the sheer joy of
it.

Staring at the blank whiteness of an unwritten post, one wonders whether
anyone else will notice or care what words will come tumbling out. But through
your post, your work is multiplied. It is inspiring.

So, again, thank you.

~~~
djhworld
No worries, thanks for the kind words.

I got some great feedback from friends and colleagues before publishing it so
thanks to them too :)

------
js2
> However, after making my way through But How Do It Know? by J. Clark Scott,
> a book which describes the bits of a simple 8-bit computer from the NAND
> gates, through to the registers, RAM, bits of the CPU, ALU and I/O, I got a
> hankering to implement it in code.

Speaking of code, another excellent book along these lines is _Code: The
Hidden Language of Computer Hardware and Software_ by Charles Petzold.

[http://www.charlespetzold.com/code/](http://www.charlespetzold.com/code/)

~~~
heinrichhartman
yes, that's notably the only publication by Microsoft Press, I can recommend
;)

~~~
segfaultbuserr
> _the only publication by Microsoft Press_

Really?

[https://www.microsoftpressstore.com/store/browse/programming](https://www.microsoftpressstore.com/store/browse/programming)

~~~
C4stor
Odds are the last comma is a typo ^^

~~~
segfaultbuserr
Aha, I see what you meant. Quite interesting to see how ambiguity can be
created by a mere comma in real life.

------
eatonphil
Going through a similar phase, though this is way further along. :) I had a
great course in college on computer architecture that culminated in a
processor in Logisim that could run fairly complex programs. (I recommend the
Harris and Harris textbook for a surprisingly light/easy introduction [0].)
But that was a while ago and I've never done anything for x86/amd64.

I started working on an emulator [1] a few weeks ago but it interprets Intel
x86 assembly rather than ELF files. I found this a great way to get started
since parsing text is easier and the instructions you need to get a basic C
program (compiled to assembly) running take an hour or two: call, push, pop,
add. You can shim _start in without having to implement syscalls.

Conditional jumping and syscalls took another weekend or two and now it can
run some basic fibonacci C programs. I also had to build a graphical debugger
for it to see what was going on... I will probably move to reading ELF files
soon.
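
Not x86e's actual code, but a minimal sketch in Go of the approach described above (a toy two-instruction subset, invented names): interpret textual instructions against a stack instead of parsing a binary:

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // run executes a newline-separated program of textual instructions
    // against a stack; no ELF parsing, no binary decoding.
    func run(program string) int64 {
        var stack []int64
        pop := func() int64 {
            v := stack[len(stack)-1]
            stack = stack[:len(stack)-1]
            return v
        }
        for _, line := range strings.Split(program, "\n") {
            fields := strings.Fields(line)
            if len(fields) == 0 {
                continue
            }
            switch fields[0] {
            case "push":
                n, _ := strconv.ParseInt(fields[1], 10, 64)
                stack = append(stack, n)
            case "add": // pop two operands, push their sum
                stack = append(stack, pop()+pop())
            }
        }
        return pop()
    }

    func main() {
        fmt.Println(run("push 2\npush 3\nadd")) // 5
    }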

I'll be writing up the process in a series on x86/amd64 emulator basics.

[0]
[https://www.goodreads.com/book/show/2558730.Digital_Design_a...](https://www.goodreads.com/book/show/2558730.Digital_Design_and_Computer_Architecture)

[1] [https://github.com/eatonphil/x86e](https://github.com/eatonphil/x86e)

~~~
bogomipz
Your emulator project is really neat. I would love to see more documentation
on the design and process of it. The notes.md file was pretty spartan. Thanks
for sharing. Cheers.

~~~
eatonphil
Here's the first post:

[http://notes.eatonphil.com/emulator-basics-a-stack-and-regis...](http://notes.eatonphil.com/emulator-basics-a-stack-and-register-machine.html)

------
gilbetron
(Way) back in college, we had been learning high-level programming concepts
(UI, OS, Compilers, Algorithms, etc), underlying math behind computing, and
physics up through circuits. My favorite course was Computer Architecture
which was set up to pull everything together, mostly through a series of
simulated computer projects, until, at the end, you realized you knew how a
computer worked from math, through physics, and up to what the person
interacted with. One of my favorite educational experiences ever. The
professor (Yale Patt) was a really interesting guy, a great storyteller, and
still actively worked with Intel. I also thought he was really friendly; turns
out he just liked talking to my girlfriend at the time and I was a tag-along ;)

~~~
sanderjd
This was my favorite part of college too. I think it was near the end of a
really good networking class I took maybe my junior year when I distinctly
remember chatting on AIM and thinking about how I could visualize
approximately how my key presses were going from the hardware through the OS
into memory into instructions on the CPU executing the application back out
onto my screen and across the network to a server and on to my friend's
machine for the reverse trip. It was a powerful moment. Of course as I've gone
further through my career I've realized how far off my approximation was
because everything is even more complex in the details, but I still think it
was a fundamentally valid and valuable moment.

This is one reason I'm ambivalent about the skepticism around computer science
/ engineering programs as a prerequisite for a career in software development.
It bums me out to think of people toiling at this work without experiencing
that kind of bottom-up knowledge of computing. But I think this is largely a
projection of my own personality on others; that would be a bummer for me, but
I think many people don't care about any of that and just want to do valuable
work for good pay.

------
ibeckermayer
I'm working on a similar project, implementing the architecture described in
this book ([https://www.nand2tetris.org](https://www.nand2tetris.org)).

Ultimately I hope to implement it on an FPGA with attached keyboard and screen
and actually use it to compute stuff:
[https://github.com/ibeckermayer/Nand2TetrisFPGA](https://github.com/ibeckermayer/Nand2TetrisFPGA)

~~~
mikevin
Nice project. I've finished most of the course (up until the compiler, because
that's where it became too familiar) and I'm now redoing it on an actual FPGA.
It's a great learning experience and I can fully recommend keeping at it; it's
one of the most rewarding projects I've done. I actually bought a Pynq Z2 last
week to have some peripherals to connect and to drive it easily. Might put it
up on GitHub too.

~~~
ibeckermayer
Nice. I looked around online and it seems like others have attempted something
like this, but I didn't find any project that looked complete. I have a Basys
3 board which has VGA and USB I/O, so I hope to get from NAND to actually
playing Tetris on my homespun platform.

------
hawkjo
Quick shout out to NAND Game, a version of this laid out as a computer game.
Delightful, free, web-based.

[http://nandgame.com/](http://nandgame.com/)

~~~
hathawsh
That is awesome. I plan to challenge my kids to finish that game this summer.

------
thrower123
I still think the single best course I took in college was my computer
architecture course. We built an entire 8-bit CPU up from individual gates
over the course of the semester in Logisim[1], to the point where we could
compile a limited version of C down to the machine code for our simulated
processor and load it into the simulated main memory and run it.

I'm incredibly glad there was that hands-on component to the course, rather
than just theoretical textbook-and-lecture learning. It was hard as hell, and
I actually ended up dropping it the first time and taking it again later, but
at the end I actually felt like I kinda knew what was going on. Pointers were
never mysterious again, at least.

[1] [http://www.cburch.com/logisim/](http://www.cburch.com/logisim/)

------
nstart
As a layman on this topic, I'm curious about the difficulty around this
project:

It seems like a lot of people have gone through similar experiences of
building an 8-bit CPU simulator/emulator. Curious why it's not 64 bits. I can
guess the answer is "it's hard"; I'm just wondering where the difficulty lies.
Is it in the standards? The mechanical feasibility of connections? Actual
limitations of building a 64-bit emulator on a machine powered by a 64-bit
processor?

Apologies if the question seems obvious. I come into this knowing next to
nothing and I could probably find the answer with some research of my own.
Just wanted to ask the community for thoughts here.

~~~
djhworld
For this project? It wouldn't have been difficult to upgrade the machine to
64-bit; it's really just a matter of increasing the bus width and register
size.

But it would slow it down massively, because I wrote it in a general-purpose
programming language and pass booleans around everywhere. There are a lot of
for loops that copy things in and out of buses, and I would have to increase
the size of the ALU to accommodate the bigger numbers.
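
To illustrate (with made-up types, not simple-computer's actual ones): when a bus is a slice of booleans, widening the machine is one constant, but every copy loop gets proportionally longer:

    package main

    import "fmt"

    // Bus models a bus as individual boolean "wires".
    type Bus []bool

    // CopyTo moves every bit one at a time; loops like this are all over
    // a gate-level simulator, so widening the word multiplies the work.
    func (b Bus) CopyTo(dst Bus) {
        for i := range b {
            dst[i] = b[i]
        }
    }

    func main() {
        const width = 64 // the "upgrade" is mostly changing this constant
        src := make(Bus, width)
        dst := make(Bus, width)
        src[0] = true
        src.CopyTo(dst)
        fmt.Println(dst[0], len(dst)) // true 64
    }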

------
pjc50
> [https://github.com/djhworld/simple-computer/blob/master/cpu/...](https://github.com/djhworld/simple-computer/blob/master/cpu/cpu.go#L763)

The author themselves points out that a "big pile of gates" is not the best
approach here, but I'm kind of surprised that at no point did they attempt to
build another representation of the circuit and either execute it in a table-
driven way, or write a short script that spits out Go for compilation.
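
A sketch of the table-driven idea (invented types, not something from the author's repo): the circuit becomes data, a list of gates over wire indices, and one loop evaluates it:

    package main

    import "fmt"

    // gate is one row of the circuit table: an operation over wire indices.
    type gate struct {
        op        string // NAND alone is universal
        a, b, out int
    }

    // eval runs the netlist in order, reading and writing shared wires.
    func eval(wires []bool, netlist []gate) {
        for _, g := range netlist {
            switch g.op {
            case "nand":
                wires[g.out] = !(wires[g.a] && wires[g.b])
            }
        }
    }

    func main() {
        // XOR built from four NANDs: wires 0,1 are inputs, 5 is the output.
        xor := []gate{
            {"nand", 0, 1, 2},
            {"nand", 0, 2, 3},
            {"nand", 1, 2, 4},
            {"nand", 3, 4, 5},
        }
        wires := make([]bool, 6)
        wires[0], wires[1] = true, false
        eval(wires, xor)
        fmt.Println(wires[5]) // true: 1 XOR 0
    }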

~~~
gnode
> but I'm kind of surprised that at no point did they attempt to build another

Perhaps because the existing implementation had already served its purpose: an
educational exercise in understanding CPU design, not an instruction-level
emulator or transcompiler.

------
sehugg
Shameless plug -- hardware engineering students are using the online
8bitworkshop IDE to study and design custom CPUs in Verilog:
[https://8bitworkshop.com/redir.html?platform=verilog](https://8bitworkshop.com/redir.html?platform=verilog)

------
mar77i
I got this link yesterday or so. I breezed through it until the memory latch:
[http://nandgame.com/](http://nandgame.com/)

~~~
rdc12
Did you manage to make the latch in the end? I would imagine feedback would be
a tricky subject to try to learn without help.
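
The tricky part is exactly that feedback: each NAND takes the other's output as an input. A minimal Go sketch of an SR latch, two cross-coupled NANDs with active-low inputs, settled by iterating to a fixed point (this mirrors the circuit, not NAND Game's internals):

    package main

    import "fmt"

    func nand(a, b bool) bool { return !(a && b) }

    // SRLatch holds state in the feedback loop between two NAND gates.
    type SRLatch struct {
        q, qBar bool
    }

    // Update applies the active-low set/reset inputs, iterating until the
    // cross-coupled outputs stop changing (the loop "settles").
    func (l *SRLatch) Update(setBar, resetBar bool) {
        for i := 0; i < 4; i++ {
            q := nand(setBar, l.qBar)
            qBar := nand(resetBar, q)
            if q == l.q && qBar == l.qBar {
                break
            }
            l.q, l.qBar = q, qBar
        }
    }

    func main() {
        var l SRLatch
        l.Update(false, true) // pulse set (active low): Q goes high
        fmt.Println(l.q)      // true
        l.Update(true, true)  // inputs idle: the latch remembers
        fmt.Println(l.q)      // still true
    }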

------
anderspitman
I went back to school for CS after 6 years of professional programming. Just
for being guided through logic gates, assembly, and implementing a simple CPU
in an FPGA I consider it well worth the time and money.

The point when I realized I finally understood (on a surface level) everything
from the electrical signals to my JavaScript reminds me of when I finished the
last video of Khan Academy's series on Euler's formula and "understood" it, if
only for a moment.

------
cptnapalm
You can also do this with UNIX pipes and a few small C programs, apparently.
[https://www.linusakesson.net/programming/pipelogic/index.php](https://www.linusakesson.net/programming/pipelogic/index.php)

------
dustfinger
People who are interested in building their own computer might enjoy the
Bitwise [1] project over at the Handmade Network.

[1] [https://bitwise.handmade.network/](https://bitwise.handmade.network/)

------
baybal2
Especially for people in the line of web development.

Take a look to see how simple the workings of a CPU are, and by how many
orders of _MAGNITUDE_ the code size grows when any modern "programming
paradigm" is involved.

Programs that are brought down to the absolute minimum of arithmetic and
logical operations in assembler can often run a thousand times faster than
when written in higher-level languages.

I remember being shown how a classical computer science problem called the
"Sisyphus dilemma" can be done in a single logic instruction, instead of a
kilobyte-long Java program, which is the smallest solution possible when no
binary operations are allowed.

~~~
pjc50
> classical computer science problem called "Sisyphus dilemma" can be done in
> a single logic instruction

This sounds interesting and I'd like to read about it, but Google isn't
helping.

~~~
baybal2
[https://en.wikipedia.org/wiki/Josephus_problem](https://en.wikipedia.org/wiki/Josephus_problem)

Mistook the name.

Yep, just cyclically shift the binary representation of the total number of
members left by 1 bit: for example, n = 41 is 101001 in binary, which rotates
to 010011 = 19, the safe position.

    
    
    /**
     * Josephus problem: returns the safe position among n people
     * standing in a circle, e.g. getSafePosition(41) == 19.
     *
     * The expression cyclically rotates the binary representation of n
     * left by one bit:
     *   ~Integer.highestOneBit(n * 2)  clears what was the leading bit
     *   (n << 1) | 1                   shifts n left and sets the low bit
     * ANDing the two keeps only the bits present in both operands,
     * yielding the rotated value.
     */
    public int getSafePosition(int n) {
        return ~Integer.highestOneBit(n * 2) & ((n << 1) | 1);
    }

------
Zrdr
Congrats!

This reminds me of this amazing computer done in the cellular automaton
Wireworld: [https://www.quinapalus.com/wi-index.html](https://www.quinapalus.com/wi-index.html)

------
Jipazgqmnm
[https://www.nand2tetris.org/](https://www.nand2tetris.org/)

------
lqet
Nice work. Somewhat related to the Megaprocessor by James Newman [0]

[0]
[http://www.megaprocessor.com/progress.html](http://www.megaprocessor.com/progress.html)

~~~
djhworld
I watched the Computerphile episode about this on YouTube when I was in the
midst of the project. Apparently it's at a computer history museum in
Cambridge now; I really should find the time to go!

------
naringas
There's also a series of YouTube videos by Ben Eater in which he builds a
breadboard computer.

link:
[https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2...](https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2565dvjafglHU)

------
russellbeattie
Ben Eater has a series of videos where he builds an 8-bit computer using
actual chips. He starts from first principles and it's pretty amazing to see
it come together. Definitely worth watching.

Building an 8-bit breadboard computer!:
[https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2...](https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2565dvjafglHU)

------
ixtli
I took a computer architecture course in undergrad where they had us make an
ARM emulator similar to this in C++. It was extremely illuminating.

~~~
djhworld
Cool.

I always find that "learn by doing" works best in these sorts of things :)

------
peter303
That's how Gates and Allen wrote their BASIC interpreter for the Altair
microcomputer and started MicroSoft. They did not have access to a working
8080 microcomputer so they simulated one on a Harvard mainframe from a machine
language specification. Bill took the interpreter by punch tape to MITS in
Albuquerque. And legend has it their BASIC worked only after an hour of
fiddling.

~~~
djhworld
That's such a cool story haha.

Not sure if my thing is remotely close to that, but it's nice to know I'm
standing on the shoulders of giants :)

------
levelist_com
Check out the following:

The Elements of Computing Systems: Building a Modern Computer from First
Principles

[https://www.amazon.com/Elements-Computing-Systems-Building-P...](https://www.amazon.com/Elements-Computing-Systems-Building-Principles-dp-0262640686/dp/0262640686/ref=mt_paperback)

------
blattimwind
> I don't know how CPUs work

Guess introductory courses in universities aren't useless after all.

------
madrox
Reminds me of the early days of unmodded Minecraft when someone made a CPU in
game with lots and lots of redstone. It really makes otherwise arcane concepts
accessible. Super cool!

------
afpx
This is very cool. It would be so fun to run qemu on it.

[https://www.qemu.org/](https://www.qemu.org/).

~~~
djhworld
I think the machine would need to be much _much_ faster, with the stack
pointer register + a few extra instructions to even attempt a project like
that hahahahaha

------
arcticbull
I did this too, when I was in college. I implemented a gameboy CPU emulator.
Then I wanted to know how FPGAs work so I implemented a Chip-8 CPU on an FPGA.
Once I clean it up, I'll throw it up on my GitHub and maybe write up a quick
medium post. This is exactly how I learn: I don't know how something works, so
I make one.

~~~
djhworld
> This is exactly how I learn: I don't know how something works, so I make
> one.

Uh huh, the FPGA Chip-8 thing sounds awesome! Definitely post about it!

------
floor_
Making a chip-8 emulator is a reasonable project for those who are interested.

~~~
merlincorey
I'm in the final stretches of mine right now, in fact.

The article describes an 8-bit CPU with 17 instructions.

CHIP-8 is an 8-bit CPU with 31 instructions, but crucially, it has a large
amount of documentation[0] and a nice corpus of ROMs available[1], making it a
bit easier as you don't necessarily have to write your own programs for a CPU
you don't fully understand yet.

[0] See
[http://devernay.free.fr/hacks/chip8/C8TECH10.HTM](http://devernay.free.fr/hacks/chip8/C8TECH10.HTM)
among many other resources.

[1] For example, this archive of the now-defunct chip8 archive site:
[https://github.com/dmatlack/chip8/tree/master/roms](https://github.com/dmatlack/chip8/tree/master/roms)
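
Part of what makes CHIP-8 approachable: every opcode is a fixed two bytes, big-endian, and decodes mostly on the top nibble. A minimal fetch/decode sketch in Go covering just three instructions (invented struct names, per the documentation above):

    package main

    import "fmt"

    // chip8 is a bare-bones core: 4K of memory, 16 registers, a program
    // counter. Programs conventionally load at address 0x200.
    type chip8 struct {
        mem [4096]byte
        v   [16]byte // registers V0..VF
        pc  uint16
    }

    // step fetches one two-byte big-endian opcode and decodes it by nibble.
    func (c *chip8) step() {
        op := uint16(c.mem[c.pc])<<8 | uint16(c.mem[c.pc+1])
        c.pc += 2
        x := (op >> 8) & 0xF
        kk := byte(op)
        switch op >> 12 {
        case 0x1: // 1nnn: jump to address nnn
            c.pc = op & 0x0FFF
        case 0x6: // 6xkk: set Vx = kk
            c.v[x] = kk
        case 0x7: // 7xkk: set Vx = Vx + kk
            c.v[x] += kk
        default:
            fmt.Printf("unimplemented opcode %04X\n", op)
        }
    }

    func main() {
        c := chip8{pc: 0x200}
        copy(c.mem[0x200:], []byte{0x60, 0x05, 0x70, 0x03}) // V0 = 5; V0 += 3
        c.step()
        c.step()
        fmt.Println(c.v[0]) // 8
    }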

~~~
djhworld
Just looking at the docs, those 31 instructions would definitely help a lot,
along with the stack pointer register.

I'd definitely recommend doing something like that (not at the logic gate
level though; too slow), but an emulator is a fun project to tackle. I wrote
my Gameboy emulator first, which is a bit more complicated than CHIP-8, and at
the time the documentation was a lot more varied.

------
nlowell
Implementing a CPU simulation is a classic college exercise, as you can see in
these comments. Judging by how memorable it was for everyone, it's super
valuable! It really helps demystify computers.

------
elamje
Any Yale Patt alums here?

~~~
kickopotomus
EE360N Labs all over again. Tangentially, does anyone know what Patt has had
to say about Meltdown/Spectre? Speculative execution was/is his bread and
butter, if I remember correctly.

------
guhcampos
It's an impressive, yet symptomatic, feat.

With more and more people getting into coding, and languages of higher and
higher levels, fewer of us are turning to the lower levels of computer
science. Lots of developers did not study CS at all, and the ones who did
mostly neglected the computer architecture classes, myself included.

Sometimes I stop and realize that most of my low-level computing knowledge
comes down to mere luck, because I ended up working in the EDA industry for a
few years. If I had not, I'd be a computer scientist with close to no
understanding of how a computer actually works. We should all know our HDLs.

~~~
cheschire
Don't let yourself get too cynical! Development is becoming more accessible,
and this is skewing the ratio.

I bet there are far more low level developers now than 30 years ago, however.

~~~
baybal2
> I bet there are far more low level developers now than 30 years ago,
> however.

30 years ago, every developer at least knew how bits, bytes, and logic
operations work.

Now, even minimally proficient C/C++ devs are genuinely hard to find. I may
well say that there are fewer of them in total now.

The C development community shrank a lot over the years.

Things have gotten so bad that some people now suggest running whole web
servers on _MICROCONTROLLERS_ just to blink some LEDs!

~~~
Klathmon
> Things have gotten so bad that some people now suggest running whole web
> servers on MICROCONTROLLERS just to blink some LEDs!

Why is that such a bad thing? Even microcontrollers are orders of magnitude
more powerful than they were years ago, so why stick with other "simpler"
methods when you can go with something more secure, easier to write and
understand, and more "standard"?

I'm one of those people running webservers on microcontrollers (you'll hate
this, but I program my esp8266 controllers in JavaScript!), and it seems silly
to lament that. They are cheap (about $3 a piece to my door), power efficient
(battery life is measured in months for the battery-powered devices), and all
of my code amounts to about a dozen simple lines that let me integrate with
the rest of my system.

~~~
keepmesmall
Why save cycles today if you can borrow tomorrow?

------
DonHopkins
I love how you can actually see the different parts of Guy Steele's Lisp
Microprocessor, which he designed in Lynn Conway's legendary 1978 VLSI System
Design Course at MIT.

[http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/MIT78.html](http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/MIT78.html)

The Great Quux's Lisp Microprocessor is the big one on the left of the second
image, and you can see his name "(C) 1978 GUY L STEELE JR" if you zoom in:

"Guy Steele: LISP microprocessor (LISP expression evaluator and associated
memory manager; operates directly on LISP expressions stored in memory)."

[http://ai.eecs.umich.edu/people/conway/VLSI/InstGuide/MIT78c...](http://ai.eecs.umich.edu/people/conway/VLSI/InstGuide/MIT78chip%20photo-2%20L.jpg)

And here is a map of the different parts of the Lisp microprocessor:

[https://imgur.com/zwaJMQC](https://imgur.com/zwaJMQC)

Here is the chalkboard where they kept track of all the students' projects
and where they would go on the chip:

[http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Status%20E...](http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Status%20Em.jpg)

And the layout of the chip with everyone's project (with the big Lisp
microprocessor standing out at the lower left):

"The final sanity check before maskmaking: A wall-sized overall check plot
made at Xerox PARC from Arpanet-transmitted design files, showing the student
design projects merged into multiproject chip set."

[http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Checkplot%...](http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Checkplot%20s.jpg)

This is a photo of one of the wafers they made, with lots of chips:

"One of the wafers just off the HP fab line containing the MIT'78 VLSI design
projects: Wafers were then diced into chips, and the chips packaged and wire
bonded to specific projects, which were then tested back at M.I.T."

[http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Wafer%20s....](http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Wafer%20s.jpg)

And here the classic paper about it, "Design of a LISP-based microprocessor"
by Guy Lewis Steele, Jr. and Gerald Jay Sussman:

[https://dl.acm.org/citation.cfm?id=359031](https://dl.acm.org/citation.cfm?id=359031)

------
vectorEQ
Cool project and nice writing style on the blog. Awesome!

------
ZedIsNotDead
Modern problems require modern solutions.

------
ameyv
This is a really good resource. Thank you!

------
gvand
Nice project!

------
Circuits
Modeling it in code is cool and all, but you skipped out on all the fun and
enjoyment (pain and suffering) of troubleshooting your own shitty wiring job
by not doing this with discrete hardware on a proto board.

~~~
kabdib
I started out doing hardware, as you say. Lots of bad soldering (initially).
Expensive smoke and shitty chips from Radio Shack that appeared to have had
their magic smoke removed prior to being shipped to stores in packaging that
definitely was not anti-static. Capacitors that blew up into confetti, diodes
that became LEDs for a single, glorious instant. Poking around in the back of
old teevee sets and somehow avoiding being electrocuted or thrown against the
wall by 30KV waiting patiently in a circuit where the bleed resistor had
cracked. Realizing that you're going to need another paper route to afford the
buffer chips and board work. Learning that manufacturer data sheets are
sometimes full of lies. Redesigning the support circuitry for the CPU to
remove just one more chip. Failing a bunch of high school courses because you
were writing a BASIC interpreter (in anticipation of working hardware,
someday) instead of doing homework.

Oh, I built a working computer, too. That was fun. But when at the end of that
project you've got a processor and some RAM and a display sitting there in
front of you, well, you realize that you don't really know what to _do_ with
the rig. _Now what?_

Then I realized that I could do far more damage in software (at scale) than
with hardware (just one-off workbench class disasters). And so . . .

~~~
djhworld
> Then I realized that I could do far more damage in software (at scale) than
> with hardware

Haha, this was pretty much it for me too. I'd imagine going down the hardware
route is a lot of fun with a lot of lessons learnt, but they'd be expensive
lessons.

I might play around with some hardware stuff next though.

