There's a very lively startup scene in Berlin, but no-one in that scene seems to know that the world's first digital computer was built in Berlin.
There's an obscure and easy to miss sign (https://en.wikipedia.org/wiki/Konrad_Zuse#/media/File:2007-0...) on the house he lived in while working on it, and another sign at the technical university, but that's about it. No big festivities in his name, no commemorations, nothing.
Just imagine if the first digital computer had been built somewhere in the US or the UK. I'm sure we'd hear about it until the end of time. Strange how that works.
The Deutsches Technikmuseum commemorates his contributions too. It was one of the things I marveled at in that geek paradise of a museum, because otherwise the only time I'd ever heard of Zuse was in a few paragraphs of the cursory, customary "History of Programming Languages" overview that opened my PL class at uni.
(The claims below are pure opinion, very unsubstantiated; your mileage may vary. Take it as light-hearted commentary, not some hill I'd be willing to die on.)
What amuses me further is how the question of the earliest computer is largely political, for lack of a better term. Most of the world would say it's Alan Turing and his work breaking Enigma; some would claim Charles Babbage. Growing up in SE Asia, some of my elementary school books would even claim the abacus as the earliest "computing device" (I disagree on principle). And of course, Germany would say it is Konrad Zuse.
It seems to me this is because there is a very loose definition of "achieving computing". I won't get into Turing vs Zuse, but consider the arguments I remember for the abacus:
- it aids computation, was used as, basically, a calculator
- yes, you can still make mistakes, but hey, garbage in, garbage out!
- interpreting the configurations of the device to mean something is not so different from interpreting a computer's state through the configuration of lights on your monitor
Of course, if you qualify the debate as achieving electronic computing, then we have a bit more to chew on. But I imagine you could hammer arguments to go either way even with the "electronic" qualifier.
I suspect counting on your fingers predates the abacus. Granted, the abacus is much more powerful, and it is crafted instead of naturally occurring, but the principle of operation seems very similar.
Just because Germans are taught not to be proud of their history using the above method (perfectly sound argument), doesn't mean they aren't. It's just more quiet and subtle.
Apologies for the confusing video, but you're watching a master of dark humor.
Many people I know in Berlin know that one of the first mechanical calculators is in the Technikmuseum Berlin, but those people don't know about concepts like Turing equivalence, so they call it a computer.
Many great inventions have a nationalist folklore surrounding them, often ascribing a development to some national hero while ignoring the process of development as a whole, or parallel developments in other countries (e.g. the car, the airplane, the telephone).
On the other hand, Zuse is indeed obscure in Germany overall. Maybe it has to do with the fact that he wasn't really commercially successful; he did not build a computing empire out of his inventions. He wasn't a Silicon Valley style entrepreneur. Something to consider for the engineers among us: it's not enough to build/invent a thing.
I knew that there was a proof, but had never read the paper before. Now that I have, I think the proof is garbage, tbh.
For Turing completeness, a fixed, constant-sized program needs to be able to handle arbitrary inputs and use arbitrary amounts of memory. It's important to distinguish the model from the implementation here. Theoretically, a language like Python is Turing complete. (And BTW you don't need infinite precision integers or anything fancy: you can simply simulate the tape with an object graph.) Practically, every Python program running on a real computer will run out of memory eventually.
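For the curious, here's roughly what "simulate the tape with an object graph" could look like in Python. This is a minimal sketch of my own, with made-up names (Cell, Tape), not code from the paper; the point is that the tape grows lazily as the head moves, so the only bound is the interpreter's heap:

```python
# Minimal sketch: an unbounded Turing-machine tape built from plain linked
# objects, so no arbitrary-precision integers are needed.
class Cell:
    def __init__(self, symbol="_"):
        self.symbol = symbol   # blank by default
        self.left = None       # neighbours are created lazily
        self.right = None

class Tape:
    def __init__(self):
        self.head = Cell()

    def read(self):
        return self.head.symbol

    def write(self, symbol):
        self.head.symbol = symbol

    def move_left(self):
        if self.head.left is None:          # grow the tape on demand
            new = Cell()
            new.right, self.head.left = self.head, new
        self.head = self.head.left

    def move_right(self):
        if self.head.right is None:
            new = Cell()
            new.left, self.head.right = self.head, new
        self.head = self.head.right
```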
Anyway, the translated Z3 programs in this paper contain one case statement for every possible memory location. This means that for a given program size, you're limited to run in O(1) space. That's only as powerful as a finite state machine, not a Turing machine.
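To make the objection concrete, here's a toy version of that translation style (my own illustration, not code from the paper): without indirect addressing, every address the program can ever touch has to appear literally in the program text, so for a fixed program size the reachable memory is a fixed constant.

```python
# Hypothetical sketch: "load from address a" compiled into one explicit
# case per possible address. N is baked in at "compile" time, so a fixed
# program can never touch more than N cells -- O(1) space.
N = 4
mem = [0] * N

def load(a):
    # stands in for N hard-coded cases, one per address
    if a == 0: return mem[0]
    elif a == 1: return mem[1]
    elif a == 2: return mem[2]
    elif a == 3: return mem[3]
    raise ValueError("address not covered by any case")
```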
I see nothing in the paper that convincingly addresses this limitation. There are some arguments to the effect that practical CPUs have limited address space as well. Which is true, but that argument only shows that those aren't Turing complete either, not that the Z3 magically is.
Maybe there's a subtle point that I overlooked, but at the moment, I'm simply not convinced.
Ultimately there is no real computer that is Turing complete, because all real computers are finite. You can represent any real computer with a (very large) DFA.
The two things missing from what would normally be considered a universal computer are conditionals and indirect addressing. The Z3 is also limited in that it executes a specific finite set of instructions in linear order, with no branching.
In the proof, they get around the finite length of a program by literally creating a loop -- they glue one end of the paper tape of the program to the other so that the computer can keep executing the same instruction stream forever. Then they get around the lack of indirect addressing by accessing every memory location in every loop. I agree that it's very much a stretch to call the Z3 Turing complete.
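For illustration, here's the general flavour of that trick in Python -- my own toy sketch, not Rojas's actual construction: compute both outcomes on every pass and let arithmetic decide which one sticks, so the instruction stream itself never branches.

```python
def select(flag, new, old):
    # flag is 0 or 1; both "branches" are computed, arithmetic picks one
    return flag * new + (1 - flag) * old

# Every pass over the looped tape executes the same straight-line code;
# a write only "takes effect" when its flag is 1:
mem = [5, 7]
flag = 0
mem[0] = select(flag, 42, mem[0])   # flag == 0, so mem[0] stays 5
flag = 1
mem[1] = select(flag, 42, mem[1])   # flag == 1, so mem[1] becomes 42
```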
Wouldn't a DFA for a computer have to have 2^n states for n bits of memory, so its size would be exponential in the memory size of the computer? Even for a pedestrian 640K of memory, that's 2^5,242,880, roughly 10^1,600,000 states. Sounds a lot like trying to simulate a non-deterministic Turing machine on a deterministic machine: the state space explodes exponentially, making it intractable.
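A quick back-of-the-envelope check of those numbers (my own, in Python):

```python
import math

bits = 640 * 1024 * 8           # 640 KiB of memory, in bits: 5,242,880
digits = bits * math.log10(2)   # number of decimal digits of 2^bits
print(f"2^{bits} has about {digits:,.0f} digits")  # ~1,578,107 digits
```

So the state count is about 10^1,578,000 -- not writable down, let alone constructible.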
I clicked through the first link because I wanted to see what was "funny" about it. Turns out, the way you simulate a Turing machine on the Z3 is very similar to the execution model of AWS step functions.
Historians say: the "argument that Konrad Zuse's 1943 Z3 computer was universal was an impressive party trick, but diverged entirely from the way the machine was designed, how it was actually used, or indeed from anything that would have made sense in the 1940s... To us, the real lesson of his analysis is that the Z3 could have been Turing complete with only minor design changes, but that it wasn't because the concept and its benefits were not yet widely understood."
I agree with that analysis and think Turing-completeness is overrated.
It's an interesting what-if to consider what would have happened if Zuse had had support from the Allies post-war in the same way that Volkswagen did.
Surprised, in a way, that Zuse wasn't snapped up by Operation Paperclip.
I think that's because he didn't have a lasting impact on Berlin (or Germany) besides being the first.
Zuse built the first computers in Berlin, but they were only used in the war-time government ministries, afaik. After the war, he sold his first computer, which I would consider commercially available, to ETH Zurich in Switzerland. He founded two companies, but neither was successful compared to, e.g., Intel. He would be really famous and well known if he had managed to create a computing giant. He never held a university position, so his papers were not well known compared to Turing's, and there weren't a lot of them. He didn't even publish his PhD thesis, in which he invented the first programming language and created an elaborate demo in which he programmed the first chess engine (also afaik). I think in science being first is overrated and not that important: you have to publish and popularise your idea to really have an impact (Schmidhuber would disagree). So his intellectual legacy was merely that he was first, but not really influential, and his commercial success was also limited.
I think it's tragic; it was an enormous waste of potential for Germany. It may be that he never was the great entrepreneur with the skills to build a tech giant, but he was undoubtedly a genius who invented the future. He could have had an enormous impact if he had been equipped with a big lab and funding, or if he had taken the academic route. But Germany wasn't (and still isn't) able to support geniuses who don't complete the usual strict path to professorship (study -> PhD -> habilitation/post-doc).
Zuse is one of the more underappreciated figures of the foundational period of computer science because of this.
I find it quite interesting that being on the losing side of that war meant that he had to make do with limited resources, which positively influenced his design choices and as such shaped his perspective on what computers can and should be.
Neil Gershenfeld at the MIT Center for Bits and Atoms has often said that 'computer science is one of the worst things to happen to either computing or science. The canon of computer science that's currently taught prematurely froze a model of computation based on 1950s technology'
I think you can see a sort of stunted alternate path of computation, and the connection it has to the physical world, in the kinds of projects that Zuse participated in post-WWII.
For various reasons Turing captures the contemporary pop-culture imagination, but I have a feeling that historians in 50-100 years will look back at Zuse as equally influential, if not more so.
I bet that Von Neumann would get a chuckle out of how influential his throwaway architecture ended up being to our society, and would agree with Gershenfeld about how unfortunate it is that its success prematurely froze the model of computation.
I think that's also because Zuse was relatively obscure even during the war; he was never a "Nazi celebrity" like Wernher von Braun. Zuse's computer experiments were not considered "vital to the war effort", funding was denied, so it always remained a "basement project". And after the war, computers were hardly the most important thing to work on in Germany either. Also, AFAIK Zuse himself didn't consider his inventions all that important and earth-shattering, at least before and during the war; he "just" needed a tool to help with computations, so he built one.
I'm not sure that the then German government was fully supportive of Zuse's work, from the wikipedia article:
A request by his co-worker Helmut Schreyer—who had helped Zuse build the Z3 prototype in 1938—for government funding for an electronic successor to the Z3 was denied as "strategically unimportant".
Well, there's the Zuse Institute Berlin (ZIB) in Berlin, named after him, and quite prominent on the campus of FU Berlin.
Further, there was a conference/show to celebrate Zuse's 100th birthday, organized by the Tagesspiegel newspaper. The event has been repeated each of the last two years, I think.
As far as I know it's just an abbreviation that, by accident, is close to "Zuse", but has nothing to do with it (someone might want to correct me here if I'm wrong)
It was developed by Germans, and SuSE stands for "Software und System-Entwicklung" ("software and systems development" in English).
I'm German, and it never occurred to me as a pun. SuSE started out with a hippy-ish corporate style in the 90s so I guess they didn't intend to associate with old school computers but I may be wrong. I remember Siemens/Nixdorf financing/advertising the Konrad-Zuse-Museum, though.
Very unlikely. “Suse” is the short form of “Susanne” in German, similar to “Suzie” for “Suzanne” in English. For that reason Germans wouldn’t perceive it as a pun on “Zuse”, and also because in German pronunciation the soft “s” in “Suse” is phonetically very different from the hard “ts” in “Zuse”.
Germans don't care much about those things, because there is so much history everywhere. A small invention like a computer doesn't have much relevance in that environment.
Additionally, Zuse collaborated with and was supported by the Nazi regime, even though he himself was not a Nazi, nor did he directly support their crimes. But this stain diminished education about his work for a long time.
It's fascinating that beside instructions on the conceptual level of how to operate the machine from a programming perspective, it also contains arts&crafts instructions on how to manually glue pieces of the command tapes together: https://www.e-manuscripta.ch/zut/content/pageview/2856533
The Deutsches Museum has a great collection of computers, including a Cray-1 and an original Apple 1 kit donated by Steve Jobs. However, most of the exhibitions in the Deutsches Museum - including the computer science one - seem to be stuck in the 90s.
The Deutsches Museum is currently in a renovation and reconfiguration phase, has been for four years, and will be for at least another year. Things are about to change.
I can but hope it won't change too much. I love the exhibit on nuclear power (with a life-sized cutaway of a containment vessel), the air&space exhibit (which includes a V2, and a piece of the moon) and the Newtonian physics exhibit.
I love their underground rebuild of a coal mine and their dated Faraday cage show, with humans exposed to high voltage and engineers in white knits turning keys.
The Z3 in the Deutsches Museum is only a replica, but one built by Konrad Zuse and his company in the 1960s. The original machine was destroyed in WW2. The Z4, however, is the one and only original machine.
I think they mean "stuck in the 90s" in the way of presentation and exhibition design, which I'd tend to agree with. They have a lot of cool artifacts, and I can totally see why older relatives raved about it, but to me it felt stuffy and drab compared to other museums when I was there a few years ago.
Exactly. Stuffy and drab is a great way to describe it. I was reminded of my school days a lot when I was there. There is an interactive game in the astronomy section which belongs in a museum for reasons other than the ones it's there for.
I just recently learned how underappreciated Zuse is. Besides the Z3 and Z4 being remarkably modern machines while _also_ being the first programmable computers (load/store architecture, fixed-length instructions, multi-stage execution with overlapping stages, 32-bit floating point [which he invented independently; it had been described once before, in 1914, but never implemented], branch delay slots), he suggested the structure for an optimizing compiler for numerical code and tried to patent super-scalar execution -- in the 1930s; the patent was eventually rejected as "obvious" in the 60s, years before the invention of Tomasulo's algorithm.
"Zuse is the nerd's nerd. Not only did he build the first programmable computer, he also was the first dude to grok digital physics and the idea that the universe may be a computer.
Zuse's book 'Calculating Space' was written in the 60s, years before Fredkin et al. People thought he was insane. Consequently, he thought his intellectual life was over and actually went into startups later in life.
Today, his ideas about the universe being computation are very trendy. Before his death, he was invited out to MIT and shown some love for his pioneering work." [0]
The Deutsches Museum in Munich is the most amazing engineering museum I've seen. I highly recommend it. I had all but forgotten about Zuse when I saw the computer mentioned in the article.
Thanks! That submission didn't get as high on the front page as this one already has, and didn't get as much discussion as maybe it could, so I think we'll leave the current one up rather than marking it a dupe. I've de-baited the title though.
The Deutsches Technik Museum in Berlin is pretty incredible. They’ve got a reproduction of the entirely mechanical Z1-computer. (The original was destroyed in the war if I’m not mistaken.) If you’re ever in Berlin, be sure to check it out. Take the U1/U2/U3 to Gleisdreieck. It’s within walking distance from there.
Just to clarify the accusation: contrary to what the article says, the computer wasn't invented by the Nazis. Being German does not make you a Nazi. I hope it's just an editorial mistake by The Verge.
Note the Vice article doesn't link to the manual directly, and the site it DOES link to has an incorrectly formatted link to the manual. Pretty sad show all around.
It's ironic that we have such a hard time preserving information about a technology that runs the world today, a technology called "information technology".