Hacker News
Megaprocessor – A micro-processor built large (megaprocessor.com)
731 points by diymaker on Aug 19, 2016 | 88 comments

There's also the MOnSter 6502, an exact copy of the 6502 microprocessor built with discrete transistors on a 12"x15" board. It's operational and Apple Integer Basic runs successfully on it, slightly slower than the real chip. It's quite impressive, since unlike the real processor, the MOnSter 6502 has a bunch of LEDs indicating the internal state.

See http://monster6502.com/

The best part about the MOnSter 6502 is you can control the frequency through a potentiometer :)

A really excellent way to switch between actually seeing the individual instructions being decoded and executed and getting something fast enough to be (barely) usable!

What's really attractive to me is that it's also an existing, "real" architecture instead of a custom one-off (which may have been designed specifically to ease a particular implementation.) It would be fun to see a discrete 8080, Z80, or even 8086, for much the same reason.

The first generation DEC PDP-8 minicomputer was a discrete-component machine the size of a small refrigerator.[1] Probably the smallest pre-IC computer manufactured.

[1] https://en.wikipedia.org/wiki/PDP-8

My very first job after graduating college was working with a PDP-8. At one point we had a hardware failure. We had a maintenance contract, so DEC sent out an engineer. After confirming the fault from the diagnostics I had run, he pulled a circuit board out, unsoldered a transistor, and soldered in a new one.

We also upgraded the hardware while I worked there. We doubled the memory from 4K words (6K bytes) to 8K words. The upgrade cost five thousand dollars.

The smallest pre-IC computer is probably an aerospace computer. These computers are very interesting, but almost totally neglected. For instance, in 1960 Atlas ICBMs (nuclear missiles) were guided by a computer that was a two foot cube and weighed 240 pounds. Side note: PROM memories were invented for this computer, so missiles could be aimed at different targets in the field.

In 1962, Arma created the first microcomputer, or at least the first computer with that name. The Arma Micro Computer was a general-purpose aerospace computer built from transistors and transfluxors (two-hole core memory with a cool name). The computer was a tiny 0.4 cubic feet and 20 pounds. It was a 22-bit machine; while we now think word sizes must be a power of 2, back then people used whatever word size gave them the accuracy they needed.
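The word-size point is easy to check with a quick back-of-the-envelope calculation (my own illustration, not from the thread): a 22-bit two's-complement word gives a little over six decimal digits of precision, which was presumably what the navigation equations required.

```python
# Range and resolution of a 22-bit two's-complement word, illustrating
# why designers picked "odd" word sizes for the accuracy they needed.
bits = 22
lo = -(2 ** (bits - 1))        # most negative representable value
hi = 2 ** (bits - 1) - 1       # most positive representable value

# Interpreted as a fixed-point fraction in [-1, 1), the step size is:
resolution = 2.0 ** -(bits - 1)

print(lo, hi)          # -2097152 2097151
print(resolution)      # ~4.8e-07, i.e. about 6 decimal digits
```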

The original Atlas guidance computer was ground-based.[1] It was the first transistorized computer, but it was a sizable mainframe and not on the missile. Guidance was by radio control.

The ARMA Micro D is really obscure. It shows up in some lists of early computers. Apparently it was inside some versions of the LTN-51 inertial navigation system.[2][3] But this seems to have been around 1969-1970. The Concorde used that navigation system.

[1] http://afspacemuseum.org/displays/BurroughsComputer/Burrough... [2] https://www.flightglobal.com/FlightPDFArchive/1970/1970%20-%... [3] http://www.seaboardairlines.org/aircraft/ins-2.htm

8080 would probably be doable, but the Z80 has over double the transistor count, so it would be (even more) impractical to construct. The 8086 has an order of magnitude more transistors.

Would definitely be very cool to see though.

> which may have been designed specifically to ease a particular implementation

To be fair, so are all the "real" architectures. 8-bit chips especially are nothing but trade-offs to keep transistor count down.

Potentiometer is cool, but not as cool as a crank wheel operated clock https://www.youtube.com/watch?v=0GmY_UrbXnA

I'd forgotten with current 1bn transistor counts that it ran with just 4k!

See http://www.visual6502.org for a great visual sim of a chip operating.

I think it's actually much slower than the original.

>The maximum reliable clock rate is not yet determined, but we expect it to be in the tens to (low) hundreds of kHz.

Those LEDs! Those brightly lighty LEDs!

Never mind the gates. Relaxen und watschen der blinkenlichten.

Indeed the LEDs are the most important part since this project seems to be all about exposition. For example:

> So how big is it ? Well an 8-bit adder is about a foot long (I use five of these)

If he were optimising for size rather than clarity, it would be much smaller. Instead he has nice diagrammatic outlines for the gates, and generous spaces between them. And those LEDs! Those brighty lighty LEDs.
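What those foot-long adder boards compute can be sketched in a few lines of gate-level Python (my own illustration, not the Megaprocessor's actual schematic): a full adder per bit position, with the carry rippling from one board to the next.

```python
# An 8-bit ripple-carry adder expressed with gate-level operations,
# mirroring what a discrete-transistor adder board does in hardware.
def full_adder(a, b, cin):
    """One bit position: two XOR gates, two AND gates, one OR gate."""
    s = (a ^ b) ^ cin
    cout = (a & b) | ((a ^ b) & cin)
    return s, cout

def ripple_add(x, y, bits=8):
    """Add two unsigned integers bit by bit, carry rippling upward."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry  # carry is the final carry-out

print(ripple_add(200, 100))  # (44, 1): 300 wraps to 44, carry-out set
```

The serial carry chain is also why such adders are slow: the top bit cannot settle until the carry has propagated through every lower stage.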

When I got started in electronics, I imagined exactly this: a little LED lighting up when a bit is flipped in a memory chip. The one in this post is a great illustration! Well done!

Dekatrons work like this :) The actual logical state of the thing is stored using a glowing plasma.

https://en.wikipedia.org/wiki/Dekatron https://www.youtube.com/watch?v=iF8CnG7ee-A

The clock speed ramp-up in the video was the coolest part :-)

I've followed this project and really enjoyed it. I've built small versions of various circuits, a core memory bit, a 4 bit adder, etc, but never even considered something of this scale as practical. I really hope the London Science Museum buys it and puts it along the wall next to the Difference Engine[1] (sigh, they closed their exhibit). This definitely belongs somewhere that the public could experience it though!

[1] ARGH! They closed the exhibit -- http://www.sciencemuseum.org.uk/visitmuseum/plan_your_visit/...

Yes, they closed it, because that was the deal: http://www.electronicsweekly.com/news/business/information-t...

and http://www.computerhistory.org/babbage/modernsequel/

"The complete working Babbage engine is on public display at the Science Museum in London. A duplicate engine and printer, a 'second original', the Babbage Difference Engine No. 2 was completed for a private benefactor of the project, Nathan Myhrvold, formerly chief technology officer and Group VP at Microsoft. The Babbage Difference Engine No. 2 was on display and demonstrated from May 2008 to January 2016."

Here's the thing: as I understood it, the Science Museum in London got to keep theirs; the second one would go through the Computer History Museum for a year and then on to Nathan.

Except if you read my link, the London Science Museum has taken theirs (which is not Nathan's) off display. That was what I was saying "Argh!" about.

Apparently the Computing gallery is closed so it can be replaced with a new Mathematics gallery. Hopefully that will manage to incorporate the Difference Engine somewhere.

I went recently and it was still on display, although it was not in the computing exhibit as before, but sort of on its own.

Reminds me of this DIY 4 bit processor: https://hackaday.io/project/665-4-bit-computer-built-from-di...

The paranoid in me wonders if in the future we'll have to resort to projects like this for sensitive tasks to know our hardware hasn't been backdoored...

There's some serious head scratching going on when you watch the video. If you can, wait a day before watching the explanation. I'm not even sure what's going on after watching that!

That is really impressive.

We should soon have a viable lowRISC implementation for those highly sensitive tasks. Honestly though, the Intel ME thing is quite worrisome.

Definitely a step forward, but couldn't either the fab or the FPGA put in a back door? http://www.darkreading.com/threat-intelligence/researchers-d...

Speculating: Yes, but it's probably a bit harder to make a backdoor in an FPGA that can anticipate and manipulate a custom design. The backdoor would have to be smart enough to know it's running a CPU, although if it could just detect that it's running multiple Ethernet ports, it could still act as a silent forwarder for evil packets or a data mangler. It could conceivably also try to send the FPGA design to the backdoor's owner for them to tell the backdoor what to do next.

Yes, the fab could put in a backdoor. That's why you need to build your own. But, what if the machinery you purchase has a backdoor? Now you need to build your own fab machinery! It's backdoors all the way down. There always has to be some level of trust.

I'm not sure how you put a backdoor in an atom.

Easy: You put the backdoor into the atoms it's entangled with.

Doesn't really matter, though. If you have to build your tools up from that level, it's going to cost so much that only governments and huge corporations can do it--and those are exactly the people installing the backdoors in the first place, so we're back to square one.

> The paranoid in me wonders if in the future we'll have to resort to projects like this

This has got me picturing my store-bought laptop with monstrously large, homemade processors bursting from the case that I can use for sensitive tasks. I wonder how close this is to the realm of possibility.

Well, having seen photos of the original Amiga prototype as demoed at the 1984 CES (hidden under a table): each breadboarded custom chip was about twice the size of the machine case. It used about 7200 logic chips, so maybe not enough paranoia-proofing? The 68000 was, of course, standard.

I'm sure a Radeon and i5 would only be a bit larger. The internet may be obsolete by the time you're done wiring. ;)

Photo here http://arstechnica.com/gadgets/2007/08/a-history-of-the-amig...

I hadn't seen the pictures of the Lorraine prototype before. The article you linked refers to breadboards, but that's not what is shown in the picture, and it's unlikely that much more than very basic concepts were built using breadboards.

Instead people would use wire wrap, a much more reliable technique for building prototypes and very low-volume production runs.

Wire wrapping involves boards with pre-drilled holes in which you insert IC sockets with elongated pins. You then wrap a few turns of wire around the pin and route your wire to another one.

That's what we see in the picture and others of the same prototype.


The IBM 1401 was built using "SMS cards"[0] connected using wire wrap.

[0]: https://en.wikipedia.org/wiki/IBM_Standard_Modular_System

The earlier DEC machines had a wire-wrap backplane. The wire-wrap assembly was automated - which was just as well.


I think it's hard to be amazed enough at projects like the Amiga. Custom chips prototyped with discrete logic and a complete multitasking OS - all built from scratch.

Why have I been thinking the same thing?!?

Except, you would only have to use it to verify the security of the on-board processor in my dream . . .

The project is fantastic, and I'd love to see it in person. However, that might be the worst game of Tetris ever memorialized in a YouTube video. I guess he was too busy building this awesome machine.

The amazing thing about this is that you can't critique. I mean, you can say anything you want, but you can't stop it or change this guy's mind. :)

It's pretty neat seeing him change the clock speed: https://youtu.be/z71h9XZbAWY?t=3m38s

We see these types of postings here from time to time and it always brings to mind the famous quote from Edmund Hillary when asked why he climbed Mt. Everest. If you want to go to 39,000 feet you can get on a commercial airliner and do so, or pretty close. Similarly, I could buy a microprocessor board (e.g. an RPi for $25 or so), but this guy climbed the Mt. Everest of computing "because he wanted to".

It was George Mallory who is famously quoted as having replied to the question "Why did you want to climb Mount Everest?" with the retort "Because it's there".


I preferred his words on achieving it: "we've knocked the bastard off".

Those words could have been justly spoken on getting this thing working. Incredible.

29,029' or 8848m.

I guess it's a lot quieter than using relays...


I attended a talk by the guy who built this thing: http://web.cecs.pdx.edu/~harry/Relay/

He included a live demo. It made a very satisfying "kachunk-kachunk-kachunk" sound as it multiplied two numbers. It also made me realize that the threshold of complexity necessary to construct a practical Turing-complete computing device is quite low.

Indeed, it doubled as a beatbox machine, with 99% perfect rhythm! https://www.youtube.com/watch?v=n3wPBcmSb2U

(Unfortunately the small demo doesn't run to completion, perhaps because the source material didn't include it, or because uploading it all would have been nontrivial.)

Doesn't the mega and micro cancel and we just have a processor?


[Edit Addition: This isn't just a math joke, his mega-processor is exactly the size a processor was before micro-technology made them a million times smaller... so making something at 1/million scale scaled up million/1 makes it the same size as the original.]

I'm watching the video thinking, wow, this guy has an amazing shed. Then at the end you realise it's in his living room.

Reminds me of the MOnSter 6502 project that did the same thing with a 6502 CPU.

It was actually shown at the Bay Area Maker Faire:

MOnSter 6502 monster6502.com

Question: is there a market for servers twice the size they are, but a fraction of the cost?

I for example have really consolidated servers in recent years. VMs are fantastic. But the result is that I have lots of Data Center space spare.

I would love to buy servers of similar power consumption but a larger footprint, for fewer dollars than small ones - assuming that a lot of the limitations and cost go into shrinking current systems.

Or are there other issues like distance between components that then come into play?

Due to how IC fabrication works, and the laws of physics, that is not really possible. Larger ICs will definitely cost more simply because fewer fit on a single wafer and defects will be more common; they use more power because (to use the water analogy) there is more 'inertia' to move around the circuits, and they are also slower as a result.
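The cost argument can be sketched numerically. The model below uses a standard dies-per-wafer approximation and a Poisson yield model; the wafer size and defect density are my own illustrative assumptions, not figures from the thread.

```python
import math

# Why bigger dies cost disproportionately more: fewer fit per wafer,
# and the fraction free of defects falls exponentially with area.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Common approximation: gross wafer area minus an edge-loss term.
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Probability a die has zero defects under a Poisson defect model.
    return math.exp(-die_area_mm2 * defects_per_mm2)

# 300 mm wafer, assumed defect density of 0.002 defects/mm^2.
for area in (100, 200, 400):  # die area in mm^2
    n = dies_per_wafer(300, area)
    y = poisson_yield(area, 0.002)
    print(area, n, round(y, 2), round(n * y))  # good dies per wafer
```

Under these assumptions, quadrupling the die area cuts the number of good dies per wafer by far more than a factor of four, since both terms work against you at once.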

You can do this by buying hardware that is a year or two old.

Maybe see if you can turn up old 4U or 8U chassis, and reuse them with old motherboards you have or find lying around. As another commentator said, airflow will be excellent, and you'd have lots of space for local disks too.

If your situation allows it, you could rent out the spare space you have, either by providing standard colocation or by racking old but usable stuff you have spare and making it available (either the real hardware or VMs). Depending on your location and what your pricing could be, this could be a nice side project. One or two cabinets could net a couple hundred (maybe thousand) per month easily, I think.

Another thing: maybe look into engineering-sample (ES) chips (like Xeons) on eBay. For some reason the market is flooded with them, so snap them up while they're around, if you can. You could build some nice kit with them :P

[PS. If the idea of that side project sounds interesting - I wouldn't mind low-scale sysadmin experience, if it would be helpful. My email's in my profile.]

You could put in place servers with larger than normal chassis (4U?) and slow, low speed fans, 55W TDP passively cooled CPUs with big heatsinks on them. But I am curious what your power budget is, what kind of AC circuits are we talking about for each cabinet?

I would guess that, all other things being equal, you'd need more power and generate more heat, because there's more resistive wire to push electrons through. I could be completely wrong, though.

Reminds me of a machine I saw at Tokyo's Miraikan: it demonstrates the Internet (TCP/IP) with colored balls, rails, tubes and whatnot.

Wonderful post!

I saw that! Except when I saw it, it was mostly broken, perhaps under the DDoS of children putting too many bits into it too fast.

When I saw it, you could send a message from one designated terminal to another, but that's just rolling balls on tracks. In the intended design you could choose which terminal it went to using an 8-bit IP address.

Lots of previous submissions, but the major thread (with lots of comments from the author!—maybe he'll show up again) was https://news.ycombinator.com/item?id=9755742. Since that was over a year ago, we won't count this one as a dupe.

Also because it was finished since then.

This would be a fantastic learning tool. I could definitely imagine it being used to great effect in a computer architecture course.

Does he mention the clock speed of this thing? Wonder what the size comparison would be if it was built on say a 20nm scale?

In the video he has it running at just over 1 Hz so the flashing of the LEDs can be seen with the naked eye. As he brings it up to about 8 kHz everything blurs and goes solid (as you'd expect).

Oh, found it in the Reg article - 20 kHz (0.02 MIPS)

0.02 MIPS for real? I'd expect about 0.005 MIPS from a design like that.

Many early CPUs took multiple clock cycles per instruction, anywhere between 2 and 10+.

So all instructions are executed in a single clock cycle?

To add, here's a nice table of Meaningless Instructions Per Second for different CPUs:


It's pretty representative of 8 bit CPU true performance.

I think he mentioned somewhere that he had it running at 50 khz now. http://megaprocessor.com/GBU_speed.html - speed page!

What a madman! I mean this in a good way. This is really awesome. The link for "Stepping Stones" is worth a watch. I hope he finishes and does 8 through 10 on the video to-do list.

I hope this finds a permanent public home at some point. This would be a great addition to any technical museum or exhibition.


Well, that's one way to make completely open hardware :-) I wonder if it can run GPG.

This reminds me of some other fascinating projects: The Clock[1] and The Tower[2]. Several years ago I thought about building a dead-bug style Christmas tree for a decoration competition. Projects like these continually remind me that when we're not optimizing for space, you really can make something functional _and_ beautiful.

[1]: http://techno-logic-art.com/clock.htm [2]: http://techno-logic-art.com/tower.htm

I agree with him that the ~$25 Lattice Mach FPGA breakout boards are outstanding (but these days MachXO3 versions are available). They have an on-board FTDI FT2232 chip which provides JTAG programming of the FPGA, plus very key, an extra UART. With just one cable you power the FPGA, configure it and then talk to your design through the UART.

They are the FPGA version of ST's Nucleo boards (which have built in "ST-LINK" for programming and debugging plus extra UART).

Ha, this is timely for me... last night I laid out an ALU in Eagle using 7400 logic and was going to document the build of a 7400-based programmable processor.

You may find this interesting: http://www.homebrewcpu.com/

That would likely be interesting to quite a few people here, definitely post it if (hopefully when) you get it going.

It belongs in a museum!

Seriously, that's quite an amazing project.

Now you can see what a computer is doing.

Fantastic for understanding!

Fantastic. That's everything I dreamed of when I studied Computer Architecture.

I'd like to see a story of somebody building their own IC fab in their garage.

This channel might be of interest to you: https://youtu.be/-Qph8BNrnLY

One of my favorite channels as she's done many projects I've had interest in.

I've really loved this project from the first time I saw mention of it. I've always dreamed of doing something like this, but have never been able to get the full technical abilities and resources to do it myself.

Like a computer from a 1960s movie - computers as art intended them to be!

An extreme case is LEGO logic gates.


The last picture in the article implies to me that it's running Windows - is that the case, or did I misunderstand something?

I would say it's more likely that's just a separate computer running Windows for research/programming/design/cat videos/etc.

Watching the guy play Tetris made me cringe.

Seems like an implementation of the book Code: https://www.amazon.com/dp/B00JDMPOK2/ref=dp-kindle-redirect?...

Hahaha this is so complicated xD

Interesting stuff
