How the World's First Computer Was Rescued from the Scrap Heap (2014) (wired.com)
59 points by 8bitsrule on Feb 16, 2021 | 53 comments



This is quite sad. It seems that the person tasked with doing something with the surviving panels quickly gave up on anything other than making them into a visual prop.

It's not ENIAC any more, it's a bunch of DMX blinkenlights mounted on an ENIAC frame. If you're going to do that, what's the point of using the original hardware? That's just destroying it further.

Making the whole machine work like it originally did is obviously not going to happen without, well, the rest of the machine. But we can fit a whole ENIAC in a fingertip these days. What I would have done is attempted to reverse engineer the logic of the available pieces, collected any existing documentation, and then made it do something. Real calculations. Wire it up to a virtual reproduction of the rest of the machine.

Sure, maybe there isn't enough existing documentation to do it faithfully. Maybe it would all be a giant pile of guesses. But at least it would be computing something, even if it's just our best guess of how it was supposed to work, instead of just flashing lights randomly.

It just feels really sad knowing that not a single one of those vacuum tubes is actually performing any logic functions, the way they wired it up.


I think it's really amazing to see old computers work. The Computer History Museum has managed to do that with some '50s and '60s era computers-- you can actually see them in action, teletype and everything.

But it's really just a curiosity to have the physical machine literally running. It's merely a nostalgic artifact. We have photos, stories, documentation, and historians writing books and articles. That's what matters-- not the blinky lights, even if they're running 70-year-old "code" on an emulator or miraculously preserved vintage electronics.

I think programming up a Raspberry Pi (or whatever) to run a mainframe emulator and hooking it up to old iron panels is something akin to building animatronic dinosaurs. It's great to do for some representative artifacts, but it can be overdone.

Keeping the IBM 1401 running at the computer history museum is fine too. The room even smells great, it's a spectacle for geeks. It's feasible for as long as some old-timers can keep replacing the components that age-out. But at some point it's going to be a better idea to unplug the thing, remove the leaky caps, and keep the dust off it.

It will be interesting to see what happens to VMs and containers in the future. Are these things going to be preserved and displayed like a "brain-in-a-jar"? Will people in the future marvel at our convoluted code from 2020, our turgid (but quaint) frameworks? Will our APIs be comparable to crazy Victorian contraptions?


Is there even a source for replacement vacuum tubes? I thought they burned out on a consistent basis.


Most vacuum tube computers used extremely common pentode and dual triode tube types as their primary logic elements; ENIAC made extensive use of the 6SN7, for example, which is still manufactured today for guitar amplifiers. One caveat is that the originals were a subtype specifically burned in and selected for computer reliability, which obviously isn't done anymore.

For the types where that's not an option, surprisingly (at least to me), most tubes commonly used historically still have ample new old stock tucked away in warehouses across the US and Europe. There was no major problem finding all the tubes required to recreate EDSAC in the UK: https://www.tnmoc.org/notes-from-the-museum/2013/1/9/product... for example.



In addition to ENIAC being built after the German Z1 (1938) and Z3 (1941), as others here have stated, it was also built after the UK's Colossus[0] (1943).

If we're talking "first real computer", though, and depending on your definition of "real", the Manchester Baby[1] (1948) was the first stored-program computer. The Colossus and ENIAC machines all needed to be re-wired to change what computations they could make and the Z3's instructions came only from the input tape.

[0] https://en.wikipedia.org/wiki/Colossus_computer

[1] https://en.wikipedia.org/wiki/Manchester_Baby


The ENIAC was originally not Turing complete, and as you say, had to be rewired for each new problem. In 1948 it was rewired into a stored-program computer, which made it Turing complete in the sense that it was essentially programmed to emulate a von Neumann machine. This made it considerably slower, but it was much more practical, as it was now possible to change programs frequently; previously it took weeks to do so. This configuration was used until it was retired.

Betty Jean Jennings Bartik, one of the original ENIAC programmers, led this work, which was one of the main reasons she received the Computer Pioneer Award in 2008. Better late than never, I guess.

https://www.computer.org/profiles/betty-jean-bartik

The Manchester Baby was the first computer that was built to be Turing complete from the start.


> The Z3 [...] needed to be re-wired to change what computations they could make.

Are you sure? Wikipedia says:

> Program code was stored on punched film.


Oops, yeah, you're right. I've updated my comment, thanks for pointing that out.


> In addition to ENIAC being built after the German Z1 (1938) and Z3 (1941), as others here have stated, it was also built after the UK's Colossus

Don't forget the ABC computer (1937-1942), which also pre-dates ENIAC:

https://en.wikipedia.org/wiki/Atanasoff%E2%80%93Berry_comput...


So in this, does the definition of "World" include both West coast and East coast? What about Hawaii and Alaska?


Colossus was (while extremely important and criminally underutilized after the war) not Turing complete by itself.


Right. Colossus wasn't a general purpose computer at all. It was a key-tester, like a Bitcoin miner.

The big problem in the early days was memory. ENIAC had nothing writable that could hold a stored program. IBM had electronic arithmetic in R&D before WWII, and that came out as products after the war: the IBM 603 (1946) and 604 (1948). Those were plugboard-programmed machines with electronic arithmetic. Workarounds for memory cost, basically. Everybody involved recognized that with enough memory, they could get rid of the plugboards, but the hardware wasn't ready.

Many memory devices were tried. Delay lines. (Slow, not random access.) Williams tube electrostatic storage. (Big and expensive.) High-speed drums. (Slow, not random access.) Finally core memory was developed, the first thing that went fast and had random access.


Yes, the difficulty of implementing memory back then often isn't appreciated. Core memory was a really important technology for making the modern computer possible. (Everyone I've talked to who used Williams tubes mentions how unreliable they were.) As for ENIAC, it was upgraded to core memory (built by Burroughs) in 1953.


A few thoughts: The historian view is that ENIAC is the "first electronic, general purpose, large scale, digital computer" (with all those adjectives required). The paradoxical thing is that ENIAC is clearly the first in an important way, but it's really hard to nail down how. You really can't overstate the importance of ENIAC: it's the machine that triggered the construction of numerous other computers and started the computer revolution, while earlier systems had very little impact.

Another paradox is that although ENIAC directly caused the modern computer, ENIAC was completely different architecturally from modern computers and missing most of the things you'd think of as necessary. It wasn't stored-program or really Turing-complete (although I think those properties are overrated). It didn't have RAM or even use binary or have instructions or programs in the normal sense. Even its inventors recognized that ENIAC's implementation was the wrong approach. But despite all this, modern computers are offspring of ENIAC.

My view is that ENIAC's important "first" is that it was the first to show the world that computers were both practical and revolutionary, so much faster that it could solve entirely new classes of problems.

As far as the Zuse Z3, the historian view is that the proof that it was Turing-complete "was an impressive party trick, but diverged entirely from the way the machine was designed, how it was actually used, or indeed from anything that would have made sense in the 1940s."

If you're interested in ENIAC, the book to read is "ENIAC in Action: Making and Remaking the Modern Computer".


My impression is that having ENIAC running made this event the place to be for anyone interested in computers:

https://en.wikipedia.org/wiki/Moore_School_Lectures

This had a major impact on how computers evolved in the next few years.


Parts of the ENIAC are in at least six different museums around the world. For ENIAC’s 75th birthday (yesterday) I collected a short video from the curators of each. https://youtube.com/playlist?list=PL0IDvwajM_78t0VKt3tatNejX... The multiplier had a wild ride getting to the University of Michigan.


The ENIAC was not the world's first computer.

Konrad Zuse's Z1 was first: https://en.wikipedia.org/wiki/Z1_(computer)

Even if you want it to be electrical, not mechanical, then Zuse's Z3 was first: https://en.wikipedia.org/wiki/Z3_(computer)


AIUI, the Z3 has no conditional branches. Without those, it's not much like a computer as we understand it. Rather, I think the Z3 is an "automatic calculator" - it can carry out a fixed sequence of operations, but it can't do arbitrary computation. In that respect, it is like the Atanasoff–Berry computer and the Harvard Mark I.

The prehistory of computing has a bunch of really interesting machines like this (and Colossus, and early ENIAC really) which are technical marvels, but miss at least one of the key things which make a computer really a computer.

People often focus on the electrical vs mechanical distinction, but that's a red herring. If you made a 386 out of Lego, it would still be a computer. Similarly, binary vs decimal is irrelevant.

To me, there are three key characteristics:

1. There is a program comprising a sequence of symbols which are interpreted by the machine (so not Colossus or early ENIAC, which were controlled by plugboards)

2. There are symbols for conditional branches, or some equivalent construct, and the language is Turing-complete, or some practical projection of it (so not the Z3, ABC, or ASCC)

3. The program is stored in a memory which can be modified by the program

Once you have those things, you can do anything. The first two points are the most important, but I think the third is also really significant: it is what allows things like operating systems and linkers to arise. If the first two are the definition of life, the third is the definition of multicellular life.
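
To make those three points concrete, here's a toy Python sketch (entirely my own illustration, with an invented instruction set) of a machine that has all three: a program of symbols interpreted by the machine, a conditional branch, and a program held in the same memory it could overwrite:

    # Toy stored-program machine: one flat memory holds both the
    # program and its data, so instructions could also rewrite it.
    def run(mem, pc=0):
        while True:
            op, a, b = mem[pc], mem[pc + 1], mem[pc + 2]
            if op == "HALT":
                return mem
            elif op == "ADD":        # point 1: symbols interpreted by the machine
                mem[b] += mem[a]     # mem[b] = mem[b] + mem[a]
                pc += 3
            elif op == "JNZ":        # point 2: conditional branch
                pc = b if mem[a] != 0 else pc + 3
            elif op == "STORE":      # point 3: the program can rewrite memory,
                mem[b] = mem[a]      # including its own instruction slots
                pc += 3

    # Count mem[13] down to zero by repeatedly adding mem[14] (which holds -1).
    program = [
        "ADD", 14, 13,   # 0: mem[13] += mem[14]
        "JNZ", 13, 0,    # 3: loop back while mem[13] != 0
        "HALT", 0, 0,    # 6: stop
        0, 0, 0, 0,      # 9-12: unused padding
        3,               # 13: counter
        -1,              # 14: decrement
    ]
    print(run(program)[13])  # -> 0

The countdown loop only works because JNZ can jump backwards depending on data; take that away and you're back to a fixed sequence of operations, i.e. a calculator.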


3 is not necessary. OSes and linkers are not necessary. We did without for many years.

The program should be in some kind of more-or-less addressable memory, to support conditional branching. That's all.


It's not necessary in a strict theoretical sense to be Turing complete, yes. But in practice? Most of the magic of modern computing is in programs which can themselves manipulate, translate, and even write new programs. Nearly all we've built up depends on that. You can only get so far in practice by flipping switches to encode binary numbers.


No you didn't. You can't even write a bootloader without 3. All practical computers have always had 3.


You reveal that you have never programmed a microcontroller. Most do not have an OS, and cannot run the compiler used to program them. Yet, they are practical: the world would grind to a stop if they vanished.

Most today are far more powerful than any among the whole first decade of practical computers. Those ran compilers in as little as 4k words of memory. The output of the compiler was not written to memory; it was punched directly to tape or cards. (On some, it physically could not be written to program memory, even if there were room: no bus connected there.) The tape was later read into program memory--often onto a spinning magnetic drum--and executed directly, no OS in sight.

If you imagine these were not practical computers, consider that people paid $millions for them; they earned their keep. Have you ever operated a computer doing work worth that much?


> it can carry out a fixed sequence of operations, but it can't do arbitrary computation

This is incorrect. The Z3 is Turing-complete, as has been shown by Raúl Rojas in 1998 [1]. It requires some hacking though.

[1] https://ieeexplore.ieee.org/document/707574


That is a cool paper (and there is an unpaywalled PDF [1]), but what it shows is not that the Z3 is Turing complete, but that a larger machine comprising a Z3 and a piece of sticky tape is Turing complete. The Z3 as built and used was not Turing complete.

Moreover, the technique in the paper is purely a stunt. You couldn't do practical branching computation that way. It is definitely cool, but it doesn't change the fact that the Z3 was a calculator, not a computer.

[1] http://www.inf.fu-berlin.de/inst/ag-ki/rojas_home/documents/...
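
For a flavour of the kind of trick the proof leans on (this is my own toy illustration in Python, not Rojas's actual construction): with the program tape glued into a loop, every instruction runs on every pass, so you compute both alternatives and cancel the unwanted one arithmetically rather than ever taking a branch:

    # Branch-free "if": compute both alternatives, then select one with arithmetic.
    # This is roughly how a machine with no conditional jump can still fake one.
    def select(cond, if_true, if_false):
        # cond must be 0 or 1
        return cond * if_true + (1 - cond) * if_false

    x = 7.0
    cond = int(x >= 5)             # on the real Z3 this 0/1 flag would itself
                                   # have to come out of arithmetic, not Python
    y = select(cond, x - 5, 0.0)   # behaves like: y = x - 5 if x >= 5 else 0
    print(y)                       # -> 2.0

Obviously that gets unwieldy fast, which is why it's fair to call it a stunt rather than a practical way to program the machine.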


Doesn't "Turing Complete" require 1 and 3 already?


Wikipedia says that the Z3 was "electromechanical", though, while the ENIAC is described as "electronic".

You can always play the "who was first" game if you just add enough distinguishing categories.


Without electronic speeds, it's kind of like inventing a printing press that's slower than writing by hand.

The invention isn't much of an invention until the whole thing is electronic. Reading off a tape doesn't cut it. That's why, you'll notice, the entire computer industry bloomed out of the work done on the ENIAC. In other words, if your client had invented facebook, they would have invented facebook.


Yes, the Z3 used relays and the ENIAC used tubes. Relays were more reliable but slower.


The Manchester Baby was the first stored program computer. IMHO that makes it the first computer and not just a giant digital calculator.


ENIAC was the first general-purpose electronic computer. You can argue about “first computer” for years, as people have, but the term computer is not well enough defined to have one answer. (And as if the battered monster reached out from the grave, the ENIAC just slapped down the Manchester Baby. Recent research by Haigh, Priestley and Rope has revealed that the ENIAC, not the Baby, was the first computer to run a stored program.)


Glad it was saved. Other early contenders for the 'first' title include the ones from Manchester: https://en.wikipedia.org/wiki/Manchester_Mark_1


The Smithsonian had a few panels. They had a working counter. At one time, they had a visitor control panel where you could push buttons and watch it count. But in time they had to power it off.

It's well documented. I once saw schematics. The counters really are decimal, with 10 tubes per digit, set up so that only one at a time is conducting. It's not even binary-coded decimal.
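
If it helps to picture the decade counters: ten tubes per digit, exactly one conducting at a time, and each incoming pulse shifts the conducting state one position along the ring, with a carry pulse into the next decade on the wrap from 9 back to 0. A rough Python sketch of that behaviour (my simplification, not taken from the schematics):

    # One-hot decimal ring counter: ten positions per digit, exactly one "conducting",
    # like one of the ten tubes in an ENIAC decade (greatly simplified).
    class Decade:
        def __init__(self):
            self.ring = [1] + [0] * 9        # position 0 starts out conducting

        def pulse(self):
            i = self.ring.index(1)           # which position is conducting now
            self.ring[i] = 0
            self.ring[(i + 1) % 10] = 1      # conduction moves to the next tube
            return i == 9                    # carry pulse when wrapping from 9 to 0

        def value(self):
            return self.ring.index(1)

    units, tens = Decade(), Decade()
    for _ in range(37):                      # "add 37" by sending 37 pulses to the units
        if units.pulse():
            tens.pulse()
    print(tens.value(), units.value())       # -> 3 7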


Discussed at the time (of the article, that is): https://news.ycombinator.com/item?id=8657651

Also 2015 (1 comment) https://news.ycombinator.com/item?id=8855782


If the internal workings are going to be simulated, why not just build fake panels that look like the original machine? The ENIAC can be emulated. A few years ago, they did a VLSI version of it. Why did they have to hack up the original hardware?


Obligatory comment that the Z3 was the world's first digital computer. Not sure what the "real" refers to in the headline, but it turns out that the Z3 was indeed Turing complete.

https://en.wikipedia.org/wiki/Z3_(computer)


> Not sure what the "real" refers to in the headline

We all know what they mean - they mean it was American.


Well, Britain decided to hide everything behind the Official Secrets Act, which workers proudly maintained.

And the US didn't, and here we are today on HN from California.


If you didn't know, you've been hell-banned for months. That's why so few people reply to your comments.

But anyway, keeping something secret doesn't mean it didn't happen. And it's now public knowledge for anyone who cares to look outside the history of the US.


Didn't know HN has shadowbanning.


You have to have a certain level of karma to be able to see shadow-banned people. Otherwise... yeah not knowing about it is kind of the point, isn't it?


No, you just need to enable showdead in profile settings. Certain karma amount is required for vouching permissions.


> If you didn't know, you're been hell-banned for months.

Of course I'm banned on HN - I'm not a leftist, right dang?

I stand by what I say, and won't appease moderators.

> But anyway, keeping something secret doesn't mean it didn't happen.

My point was that Britain missed out on most of the IT industry growth because of the OSA. Instead that success went first to Boston, then SV.

I do understand that all Britain has left is its past glory, which its citizens continually remind the US about, while we're busy doing.


> Of course I'm banned on HN - I'm not a leftist, right dang?

Very unfair. As somebody who has regularly commented here against the excesses of the left: while the silent downvoting has sometimes been telling as to the sympathies of some, the moderation has always been scrupulously fair.


> I do understand that all Britain has left is its past glory, which its citizens continually remind the US about, while we're busy doing.

Not sure why you needed to turn snide there after I took the time to vouch for your comment and reply to you in good faith.

You can be happy about making a success of current technology, without pretending what other people did before you never existed.


> Well, Britain decided to hide everything behind the Official Secrets Act, which workers proudly maintained.

Fun fact: GCHQ still refuses to talk about some of the earliest computing work on Colossus and Enigma decryption. I was quite surprised to hear this when I was on a tour of The National Museum of Computing at Bletchley Park[0].

I hope that if/when they do decide to declassify that information that they'll hand it over to TNMOC.

[0] https://www.tnmoc.org/ - It's well worth a visit so when they're open again post-COVID see if you can get down there.


GCHQ are embarrassed because they lost / destroyed it. "Refusing" is less embarrassing than "ooopsie!".


do you have a source for this? I imagine you are right, but is there an interesting story about it you've seen?


I recommend "Between Silk and Cyanide" by Leo Marks.


It's... complicated. The title of first digital computer can be argued to belong to the ABC, the Z3, Colossus, or ENIAC, and which one you give it to depends on how precisely you define "first digital computer." Given that there is also an element of nationalism to the decision (is the first computer German, British, or American?), it's not entirely unreasonable to suppose that the criteria are partly chosen to suit whom one wants to crown.

Personally, I don't think the fact that the Z3 and the Colossus were discovered to be Turing-complete long after they were last used should really qualify them for a title based on "first Turing-complete machine." The long time it took to establish Turing-completeness indicates that they weren't designed to be Turing-complete, and that they are is more a reflection of just how low a bar it is to be Turing-complete than the capabilities of the machines themselves. In other words, I would submit that the phrase "first Turing-complete machine" should really be understood as "first intentionally Turing-complete machine."

Another framework that makes sense is to look at the role these computers played in the later development of the field. ENIAC clearly had a massive influence, since it's the one that spawned more recognizable computers as its progeny. The influence of the ABC on ENIAC only came out much later (and is still somewhat debated). The Z3 had little impact on the field later because it was on the wrong side of WW2. Colossus, I believe, did influence the Manchester machines, but this link was not known at the time because of the secrecy around Colossus.


Who knows, perhaps if the minds behind Paperclip had been a little more data-minded:

* today, Z3 might be considered "first real"

and

* today, we might be stuck writing code in a fossilized language lovingly called Plankalkül 58 by its inventors, one with so much inertia back then that it stifled all further development in the 1960s, with the result that all of our 2021 computing would be about as fresh as the 2021 B-52.


> we might be stuck writing code in a fossilized language lovingly called Plankalkül 58 by its inventors

Substitute ALGOL 58 and that's pretty much what happened. You can't trust those Germans with language design!


Haha, got me, I was just picking any "year used to name a language version" suffix at random, without realizing what corner of computing history this one is associated with. Is there anybody from that era who didn't have their own ALGOL?



