Megaprocessor – A micro-processor built large (2016) (megaprocessor.com)
266 points by amadeuspzs on Nov 18, 2021 | hide | past | favorite | 92 comments



The mega processor is one of my all-time favorite computers, along with the Magic-1 https://homebrewcpu.com/

The megaprocessor is just absolutely wonderful in how it bridges from 'here is a transistor, it lights an LED' to 'here is a computer, it plays Tetris'. I always struggled to unwind the layers of abstraction in a modern computer, from atoms in the CPU to running Python, but being able to just look at a bunch of literal transistors (with LEDs on each gate!) wired up and playing Tetris shows how a computer really works in a profound and awe-inspiring fashion.

Magic-1 is sort of the next level of complexity up: it's made out of very simple TTL (the most complicated chip function is the ALU, a circuit I had to build as an EE undergrad out of OR and AND gates) and it hosts a webpage. The site currently seems to be down, but you can see it on the Wayback Machine https://web.archive.org/web/20210815180101/http://www.magic-...

I will never forget when I came across that site and realized that I was interacting with a wirewrapped pile of RAM and NOR gates over the internet. There was even a time when you could telnet in and play some retro text-based adventure games. To this day, the only time I have played Adventure was on Magic-1.


I'm partial to the Gigatron[0] myself. Built entirely from a mere 34 TTL ICs available in the 70s (930 logic gates), it's capable of driving a VGA monitor and 4-bit sound while running at 6.25 MHz. In my opinion, it is beautifully simple and elegant.

[0] https://gigatron.io


Charles Petzold's book Code: The Hidden Language of Computer Hardware and Software explains a computer from the ground up.

I don't know if the ideas still apply to modern computers, but it's pretty cool understanding how things like addresses are decoded and instructions are constructed and executed at the gate level in a very basic microprocessor.
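For instance, gate-level address decoding of the sort the book covers boils down to a handful of gates. A toy 2-to-4 decoder sketch (names and structure are illustrative, not Petzold's):

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b

def decode_2to4(a1, a0):
    """2-to-4 address decoder: exactly one output line goes high
    for each of the four possible 2-bit addresses."""
    return [
        AND(NOT(a1), NOT(a0)),  # address 0b00 selects line 0
        AND(NOT(a1), a0),       # address 0b01 selects line 1
        AND(a1, NOT(a0)),       # address 0b10 selects line 2
        AND(a1, a0),            # address 0b11 selects line 3
    ]

# Each address asserts exactly one select line:
for addr in range(4):
    lines = decode_2to4(addr >> 1, addr & 1)
    assert lines == [1 if i == addr else 0 for i in range(4)]
```

Widen the same pattern to 16 address bits and you have the chip-select logic of a simple microprocessor.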


Love this kind of build, so I'm working on a little breadboard one myself https://www.youtube.com/IanWard1 similar to the Ben Eater 8-bit CPU videos https://www.youtube.com/watch?v=HyznrdDSSGM&list=PLowKtXNTBy... (which are amazing and everyone should watch)


I often wondered whether I could build some sort of general computing machine if we were pushed back to the dark ages or something. I guess you have to define exactly what level of technological achievement we were pushed back to. But with the knowledge we have today, without ICs (or advanced manufacturing facilities), and with only "simple electronics" (whatever that would be), would this be possible? Fun stuff to think about!


First gen transistor computers often used standard functional units - gates, flip flops, and such - packaged into small modules with edge connectors and wired together with wire wrap on a backplane. Like this DEC PDP-8.

http://www.oldcomputers.arcula.co.uk/files/images/pdp8104.jp...

It's fairly easy to design a computer like this.

Later TTL/CMOS designs replaced the packaged modules with much smaller 74xx/40xx ICs.

You can make basic logic gates with just diodes and resistors, but you need transistors for inversion, buffering, and a usable flip flop.

That's probably the minimum level for useful computing/calculating. If civilisation has ended and you have no transistors you probably don't have the resources to make glass valves either, so that's going to be a problem.

Of course there's always clockwork...
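The diode-vs-transistor point can be sketched numerically. A minimal model (the 0.7 V silicon diode drop and the 1.0 V switching threshold are assumed illustrative figures): diode-resistor gates compute OR/AND but lose a diode drop per stage and can never invert, while a single transistor stage restores full logic levels:

```python
DIODE_DROP = 0.7  # volts, typical silicon diode (assumed)

def diode_or(*inputs):
    """Diode-resistor OR: output follows the highest input,
    minus one diode drop. No gain, no inversion."""
    return max(0.0, max(inputs) - DIODE_DROP)

def transistor_not(v_in, v_supply=5.0, v_threshold=1.0):
    """Transistor inverter: inverts AND restores full logic levels."""
    return 0.0 if v_in > v_threshold else v_supply

# Chain three diode ORs: the '1' level sags by 0.7 V per stage...
v = 5.0
for _ in range(3):
    v = diode_or(v, 0.0)
print(round(v, 1))  # 2.9 V -- drifting toward the '0' region

# ...but one inverter pair snaps it back to a clean rail voltage.
print(transistor_not(transistor_not(v)))  # 5.0
```

That level restoration (plus inversion, which no passive network can provide) is why diode-only logic can't be cascaded indefinitely.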


A famous example of a modular design is IBM's Solid Logic Technology that was used in their System/360:

https://en.wikipedia.org/wiki/IBM_Solid_Logic_Technology


These modules seem to be the primary influence on sci-fi movie computer design, starting with HAL in "2001".

When sci-fi writers need to create some plot tension around getting a computer either up and running or down and disarmed, the characters will inevitably be plugging/unplugging colorful modules at some point.


We could use vacuum tubes instead of transistors.

(I googled this to make sure I wasn't misremembering what I read 40 years ago in an already outdated book at the library and I was suddenly filled with a sense memory of the smell of the interiors of old electric appliances loaded with tubes and dust.)


Yes...there were a couple of generations of what we would recognize as vaguely 'modern' computers (say...roughly ENIAC to the IBM 704/709) built completely out of stuff that looked like this:

http://www.righto.com/2018/01/examining-1954-ibm-mainframes-...


Yes, that was the first all-electronic generation after the very earliest relay designs.

They were shockingly unreliable and incredibly expensive. Tubes have a very low mean time between failure, so any design that uses tubes exclusively can't work for more than short periods without breaking down - possibly minutes, maybe hours, probably not days, and absolutely not months or years.

And each failure means a cycle of fault finding, which can take hours or days in turn.

As a technology it sort of works in a prototype way - you can get some work done until you can't. But the unreliability means it's qualitatively different to a modern laptop or server farm.

The wonderful thing about integration on silicon is that it's the opposite - it's incredibly reliable, as long as you keep the thermals reasonable.


Well...certainly true of the original tube computers (ENIAC was famously temperamental), but that module comes from an IBM 700-series, which was a production product. Tube machines from IBM, Burroughs, Univac, Bendix, Ferranti and many others were in no way mere prototypes; hundreds were built. The tube-based AN/FSQ-7 was for years the basis of the USAF SAGE air defense network.

Tube reliability improved radically over the 15-20 years tube computers were a thing; it had to. And just like you point out about silicon, reasonable thermal management became recognized as important to tube reliability, and designs changed accordingly. MTBF was lower than a modern computer's, but they certainly ran for days or weeks and more. And debugging was usually fairly quick: you ran some diags that pinpointed the module (not the single tube) that failed and replaced the whole thing.

I have an acquaintance with a Bendix G15 that still runs. Admittedly, the G15 is much simpler than an IBM 700, but it's a nearly 65 year old tube machine.


We could use telegraph relays instead of vacuum tubes - might be better reliability and repairability.


Except that mechanical contacts are the bane of all things electrical. Vacuum tubes are lightyears ahead of relays in this regard.


Repairability, yes. Reliability, no.


I forgot which book it was (maybe "The Three-Body Problem"?) but there was a science fiction story where a Chinese king makes his soldiers act as logic gates and his army becomes a computer. I was like, wow, I didn't think about that, but it totally makes sense!!
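The premise holds because any Boolean function can be built from a single primitive gate. A sketch where each 'soldier' computes NAND, and a one-bit full adder falls out of nine of them:

```python
def nand(a, b):
    """One 'soldier': raises a flag (1) unless both inputs are flags."""
    return 0 if (a and b) else 1

def full_adder(a, b, cin):
    """One-bit full adder built from nine NAND soldiers."""
    t1 = nand(a, b)
    s1 = nand(nand(a, t1), nand(b, t1))        # a XOR b (3 soldiers)
    t4 = nand(s1, cin)
    total = nand(nand(s1, t4), nand(cin, t4))  # (a XOR b) XOR cin
    carry = nand(t1, t4)                       # majority(a, b, cin)
    return total, carry

# Exhaustive check against ordinary arithmetic:
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, c = full_adder(a, b, cin)
            assert c * 2 + s == a + b + cin
```

Chain 16 of these adders and you have the arithmetic core of a 16-bit ALU, whether the NANDs are transistors, relays, or people with flags.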


That is the three body problem, and while avoiding spoilers, not exactly a Chinese king.


There's an XKCD for everything :-) https://xkcd.com/505/


In that case, if you want a somewhat entertaining very-high-level overview of what would need to be done, then there's a manga that showed this off a few chapters ago, it's called Dr. Stone. What stuck with me the most was that the purity needed for the silicon used in processors was absurdly high, so much so that they couldn't quite do that just yet, so they made a processor out of parametrons and used magnetic core memory. I knew semiconductors had to be very pure, but it was a bit discouraging to realize just how much effort it would take if you started from zero.


Dr. Stone is great but I also found it to be a bit too hand-wavy. In real life you can't just build steam engines with a small village worth of labor + a "master craftsman". Mining, transporting, and refining iron ore alone is a huge task that could easily consume every drop of the village's labor resources and still not produce much iron. Fuel is also a huge task. Unless you have a high quality coal mine nearby, you have to create charcoal which is also very labor intensive (see: https://www.youtube.com/watch?v=GzLvqCTvOQY). I just can't fathom how Senku realistically makes processors unless he has a nation state worth of labor at his disposal.

But yeah, it is a fun "what if".

"What if a super genius with the entirety of wikipedia in his brain were sent back to the stone age? Could he rebuild modern society?"


IIRC a key obstacle to why steam engines were not used earlier, despite the concept being known for at least a millennium, was the requirement for quite advanced metallurgy. You can make a nifty proof of concept from copper or iron, but a useful steam engine needs to be (a) relatively high pressure and (b) large, so you can do it only if you can reliably and cheaply make large quantities of decent steel. If you can't make large quantities of steel, your steam engine doesn't work; if your steel-making process has unpredictable results, your boiler blows up at a weak spot; and if that steel is expensive, you're better off having the same people work a literal treadmill instead of building a steam engine.


At least with iron, you'd have the benefit of the existing refined ore lying all around you in a post-apocalyptic setting. There's little need for actually mining iron ore anymore if your population has been reduced by 99% or more. You can walk down any abandoned street and find sources of iron and other metals. Now, there's still the refining process (but it would be shorter from something already processed) and fuel to contend with.


Also, making glass is not just combining sand and seashells and fire.

I don't doubt that they could have made glass plates or something, but they start turning out vacuum tubes and borosilicate beakers next to each other like it was all a matter of knowing the recipe.


>the purity needed for the silicon used in processors was absurdly high

Yes. Silicon wafers are cut from a monocrystalline boule, a single flawless silicon crystal with no defects or inclusions. A big chunk of silicon atoms, nothing else at all. (Doping happens later.) To the extent any physical object can be called "perfect", a semiconductor wafer is perfect.

(Of course after manufacturing it will start picking up embedded hydrogen and helium atoms from cosmic rays and alpha particle background radiation.)


Wow, no idea Dr. Stone was that hardcore. That sounds watchable!

edit: didn’t pay attention that you were talking about the manga. That makes more sense. Sounds highly readable!


The first computing machines used relays, which are electromechanical switches. Current flows into an electromagnet, which pulls a switch closed and completes a circuit, thereby switching something "on." By placing these switches together in different configurations you can form the equivalent of logic gates.

Sometimes insects or moths would get stuck in the relays which would screw up the system. This is the origin of the word "bug."

Prior to incorporating logic into electronics, computing machines were hand cranked or motor cranked gear machines. See: https://www.youtube.com/watch?v=fhUfRIeRSZE. The YouTube video literally is a hand cranked portable calculator.

The world you envision has already existed.


The use of the term "bug" in engineering predates automatic computers by nearly a century; the Wiki article [1] on the topic gives a pretty good summary of its history.

[1] https://en.wikipedia.org/wiki/Bug_(engineering)#History


> The YouTube video literally is a hand cranked portable calculator.

It can even do square roots?! That's amazing. And it fits the palm of your hand!

Now we're down to specks of sand calculating so fast they melt without cooling. Seriously wtf.


Then the next question is; what would you do with it?

You need a source of problems to solve, and until you've bootstrapped the rest of society at least to the point where something like high-resolution trigonometric tables, desktop publishing, high-speed accounting, (for example) are needed, the effort isn't going to keep you fed...


Calculate ballistics, like some of the original computers were created for? Never too early post-apocalypse to start thinking about the next war.


I agree it wouldn't be high on the list, but I also imagine there would be practical needs. Like command/control. So, voice only radios first, but some sort of messaging that doesn't need a live listener on the radio would then be a nice next step. And that could be done with a simple computer.


No need for electronics whatsoever - mechanical computing is a sufficiently advanced engineering discipline, as, incidentally, is fluidics!

https://en.wikipedia.org/wiki/Fluidics


Looking historically, you have a bunch of options for a pre-IC computer; there were lots of pre-IC computers. Transistors, of course, or vacuum tubes give you a useful computer. You can build a computer from relays, but the performance is pretty bad. Memory is also very important. Magnetic core memory is the way to go if you don't have ICs. None of this is going to help you if you went back to the dark ages, though.

As far as mechanical devices, mechanical calculating machines didn't arise until the late 1600s and weren't reliable for many years. It's unlikely that you'd be capable of building a mechanical computer until the industrial revolution. Note that Babbage was unsuccessful in building his machines even in the late 1800s.

If your goal is to build a Turing-complete machine of some sort, even if totally impractical, you could push the date back a lot. But that would be more of a curiosity than a useful computer.


For arithmetic, pinwheel calculator (aka "Odhner's arithmometer") [0] is a pretty decent and reliable mechanical device. You can even give it an electric motor for doing the rotations for you and a numerical keyboard.

[0] https://en.wikipedia.org/wiki/Pinwheel_calculator


You can even build a binary machine without electronics, have a look here: https://en.wikipedia.org/wiki/Z1_(computer)


CollapseOS is a Z80-based Forth that is targeted at bootstrapping computing from scavengeable components in old electronics.

https://collapseos.org/


Relay computers are relatively simple to make, and require just electromechanical relays.

Some semi-random examples

https://web.cecs.pdx.edu/~harry/Relay/

https://relaycomputer.co.uk/

Main issue is memory. It takes a lot of space to make any usable amount of memory out of relays.


On that note, I was wondering on several occasions whether it would have been technologically possible to build neon lamp logic circuits in Babbage's time. Aside from the problem of building an air liquifier a few decades early, I don't see any really major technological hurdles there. That would have nicely solved his problems with mechanical manufacturing...


Good point!

I used to play that same thought experiment with more basic utilities, like my toaster with its various settings and electronic controllers. Then I was given a Dualit. No more philosophical dilemmas!

Kidding aside, it's always staggering how far removed we really are from operating on (humanly) first principles. Humbling.


There are many people on the internet researching how basic things can be made in a low-tech fashion. I particularly enjoy https://simplifier.neocities.org/ for example.

But if you read those blogs you still notice the mind boggling height of the giants whose shoulders the bloggers stand on. Having access to simple chemicals like acids or various salts for example is huge. I wouldn't even know where to start if I had to bootstrap a highschool chemistry kit starting with nothing but my hands and my knowledge.


The Dr. Stone manga provides an interesting perspective on how you'd bootstrap that chemistry kit.


219 chapters and ongoing. Whew. This confirms my suspicion that bootstrapping is really hard.


Whoa this is great! Thanks for the link!


I mean, you can apparently trick swarms of crabs into being logic gates. https://phys.org/news/2012-04-scientists-crab-powered.html


Babbage's analytical engine comes close, and doesn't even use electricity.


In the dark ages you can build gears. Gears can do arithmetic and calculus.


The catch is that there are tolerance issues. Doron Swade's account of building the two existing Difference Engine #2 models (http://www.amazon.com/exec/obidos/ASIN/0670910201/donhosek) is a good example of where the challenges lie. It was just barely possible to do with 19th century technology. Physical mechanisms deviate from theory by quite a bit.


The Antikythera mechanism was built well before the Dark Ages. He only wanted to go back as far as the Dark Ages, and precision work of that caliber was already possible by then, so he could probably build a battleship fire control system.

edit: https://www.youtube.com/watch?v=gwf5mAlI7Ug


Now he just needs to build a battleship to go with it!


In the same space of using discrete components instead of ICs, the Monster6502: https://monster6502.com/

Note: Well, there are some quad transistor array chips, but that seems still in the same spirit.


Missed opportunity to call it a "macroprocessor".


I think the minicomputers of the 70s well-represent the halfway point between there and what we have today.

At Basic Four Corporation I worked on systems built from 8"x11" circuit boards. A CPU might consist of two such cards joined on the front by a couple flat 50-pin cables and to the other components by a backplane.

Disk Controller: 1 board Terminal controller: 1 board etc

https://www.ricomputermuseum.org/collections-gallery/equipme...

Would be interesting to see some enterprising soul recreate a modern computer in such a form factor.


I am unaware of many Hackernews who had even heard of Basic Four, let alone worked there! Did you know Chuck Milden?


If I did I don't remember him. :) How are you familiar with Basic Four?


Chuck was president of ICS, which was acquired by Basic Four in like the mid-70s. I only met him long after, but he told me stories. Including one about how he wheeled an Apple II into the Basic Four boardroom and demonstrated it, saying in effect "this is the future, and if you're not on board with the microcomputer revolution you'll be left behind". They decided to pass, and continue figuring out ways to sell $50,000 hard disks to existing customers. And that's why most of Hackernews hasn't heard of Basic Four :)

I do know that MAI ended up selling microcomputer based products eventually, but by that time they were well into day-late-dollar-short territory and would continue to lose ground along with all the other minicomputer vendors like PRIME that hardly anyone these days has heard of.


Basic Four was about 300 employees when I landed there and having spent most of my time in manufacturing I didn't rub elbows with upper management. Although they did rub elbows with me once when they thought I was stealing their operating system. But that's a whole nother story. :)


That is fantastically impressive and reminds me of something Sam Altman said once

"Alan Kay gave me an Alto. That’s not the very last computer that I think is within my capability to understand everything that’s happening in there, but it’s getting near the end." https://mastersofscale.com/sam-altman-why-customer-love-is-a...

This is a visual representation of about what I understand about a processor and still outside of what I could actually make without a lot of reference material.


This is currently sitting in the computing history museum in Cambridge. 10 pounds entry. They have loads of old computers and consoles


Massachusetts or UK?


UK. Here's the Megaprocessor on the museum's website: https://www.computinghistory.org.uk/det/43063/The-Megaproces...


Would a museum in the United States have an admission fee denominated in pounds?


No. It’s a computing museum so the admission fee would be denominated in “pound coin”.


MOnSter 6502 for the curious.

https://monster6502.com/


Beautiful.

Inspired by this great submission, I was instead at http://visual6502.org/JSSim/index.html - 6502 simulator in HTML5 with visual changes on the virtual circuitry.


There is now https://github.com/floooh/v6502r, as in "remixed".

Supposedly better and faster, and it has a Z80, too.


Tom Scott has done videos where this has been either used directly as part of the video or has been in the background.

For example - https://www.youtube.com/watch?v=Z5JC9Ve1sfI - It certainly makes for a cool background.


The quality of not only the product, but the accompanying explanations is outstanding. I think it's a work of art, because not only is it visually impactful (especially at 1 Hz, as in the demo), it also uses the medium to convey an idea that would be difficult to convey in any other way.

I'm interested in making (stochastic) algorithms fast, which always seems to eventually lead back to looking at code in compiler explorer. The extent of my knowledge there is basically "short assembly good, long assembly bad". But I've always lacked some "tactile" feeling (for lack of a better phrase) for what a register like "eax" or "rax" is. I hope that learning more about the megaprocessor might help get a glimpse of this.
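For a bit of that tactile feeling: rax is just a 64-bit latch, and eax/ax/al are narrower views of the same bits. A toy model (assuming the standard x86-64 behavior that a 32-bit write zeroes the upper half, while 8-bit writes merge):

```python
class Register:
    """Toy model of an x86-64 general-purpose register: one 64-bit
    value, with the named sub-registers as masked views of it."""
    def __init__(self):
        self.bits = 0  # the actual 64-bit latch

    @property
    def rax(self): return self.bits
    @property
    def eax(self): return self.bits & 0xFFFFFFFF
    @property
    def ax(self):  return self.bits & 0xFFFF
    @property
    def al(self):  return self.bits & 0xFF

    def write_eax(self, value):
        # x86-64 rule: writing the 32-bit view zeroes the upper 32 bits.
        self.bits = value & 0xFFFFFFFF

    def write_al(self, value):
        # Writing the 8-bit view leaves the other 56 bits alone.
        self.bits = (self.bits & ~0xFF) | (value & 0xFF)

r = Register()
r.bits = 0x1122334455667788
print(hex(r.eax))        # 0x55667788 -- low 32 bits of the same latch
r.write_eax(0xDEADBEEF)
print(hex(r.rax))        # 0xdeadbeef -- upper half zeroed by the 32-bit write
```

On the megaprocessor the equivalent latch is literally visible: each register bit is a flip-flop with its own LED.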


Previous submission, 88 comments

https://news.ycombinator.com/item?id=12317217


This is AWESOME


It is very, very cool :)


If the ISA is sufficiently efficient, 8 kHz is fast enough to run interpreters. An 8 kHz machine can be useful as a calculator, running something similar to FORTRAN and, if it has suitable I/O, maybe even a BASIC or CHIP-8 interpreter.
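A rough back-of-envelope (all figures here are assumptions for illustration, not from the site) of what 8 kHz buys an interpreter:

```python
clock_hz = 8_000         # Megaprocessor-class clock speed
cycles_per_instr = 4     # assumed average for a simple non-pipelined ISA

native_ips = clock_hz / cycles_per_instr  # native instructions per second
print(native_ips)        # 2000.0

dispatch_cost = 40       # assumed native instructions per interpreted opcode
interpreted_ips = native_ips / dispatch_cost
print(interpreted_ips)   # 50.0 interpreted opcodes per second
```

Tens of interpreted opcodes per second is glacial, but it's enough for a line-at-a-time calculator or a very patient game of CHIP-8 Pong.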


Alternatively it works if you are sufficiently patient.


I'm curious what is the significance of 8kHz for running an interpreter? Am I missing something obvious?


This is so good. I've just watched his 8 videos explaining from transistors to logic and memory. Wonder why he unfortunately stopped at SS8 : Time and Memory now...


That giant slider pot to adjust clock speed is awesome.


Can it run Doom? Or NetBSD? or whatever people would normally run on a PC?


It seems to have an address width of 16 bits, there's no mention of a memory management unit, and the website lists its RAM size as 256 bytes, so I'm going to say no, regardless of speed.

The RAM size is of course the most limiting out of those three, but even if it were larger (and one could somehow build it without resorting to integrated circuits), you'd probably run into problems with the other fundamental limits if you wanted fancy things like memory in the order of megabytes.

Of course that's not the point, though, because building an entire general-purpose CPU from scratch at such a human-visible scale and from basic components is a feat in itself.
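The arithmetic makes the mismatch concrete (Doom's commonly cited minimum requirement was 4 MB of RAM; the 16-bit address width and 256 bytes of RAM are from the comment above):

```python
address_bits = 16
addressable = 2 ** address_bits       # 65,536 bytes reachable at all
installed_ram = 256                   # bytes, per the website

doom_minimum = 4 * 1024 * 1024        # 4 MB, Doom's usual stated minimum
print(doom_minimum // addressable)    # 64 -- entire address spaces short
print(doom_minimum // installed_ram)  # 16384 -- RAM shortfall factor
```

So even before considering speed or the custom ISA, Doom would need 64 complete copies of the machine's entire address space just to fit in memory.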


Wait - this thing runs Windows?! I figured it ran some kind of bytecode or toy OS. That's very neat that it runs a commercial operating system.

I'd seen this post before but I'd never noticed the monitor with the Windows login screen.


Hah, of course not. The PC acts as a terminal/controller for this machine. Running Windows 7 on an 8 kHz CPU is impossible, even on an x86-compatible one. WinXP has been shown to run on a Pentium underclocked to 8 MHz, booting in half an hour: https://winhistory.de/more/386/xpmini_en.htm


If you have enough RAM and patience, you can emulate anything even on a small CPU, e.g. https://dmitry.gr/?r=05.Projects&proj=07.%20Linux%20on%208bi...


30,000 minutes are more than 20 days. That's a lot of patience to boot Windows.


It has only 256 bytes of RAM


It definitely does not run Windows. It has 32 kB of RAM, 16-bit registers and a custom instruction set. That will never run Windows 1, let alone Windows 7 :)


That's still more memory than my old Vic20 had. :)


I doubt it. There seem to be several SBCs in the picture with the Windows terminal.

The Windows terminal is probably used to communicate with the megaprocessor.


Curious what factors limit clock to 8kHz.



Worst case propagation delay.
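To put rough numbers on that (both figures below are assumptions for illustration; discrete-transistor gates are far slower than on-die ones, and the Megaprocessor's long board-to-board wiring adds further delay):

```python
gate_delay_s = 3e-6           # assumed delay per discrete-transistor gate
critical_path_depth = 40      # assumed gate levels on the longest path

# The clock can't tick until the slowest combinational path has settled:
critical_path_s = gate_delay_s * critical_path_depth
f_max = 1 / critical_path_s
print(round(f_max))           # 8333 -- the same order as the 8 kHz clock
```

A USB cable carries one signal point-to-point; a CPU's critical path is a chain of dozens of gates that must all settle, in sequence, within one clock period.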


This does not make any sense. I can get way better delay / frequency wise on long USB cables. Something here does not compute.


Then you don't understand digital logic or how processors are designed, single cycle or pipelined.


I'm astonished it's running Windows !




