Konrad Zuse: Nearly the German Turing (linuxvoice.com)
133 points by nkurz on March 13, 2015 | 33 comments



Zuse is the nerd's nerd. Not only did he build the first programmable computer, he was also the first dude to grok digital physics and the idea that the universe may be a computer.

Zuse's book 'Calculating Space' was written in the 60s, years before Fredkin et al. People thought he was insane. Consequently, he thought his intellectual life was over and actually went into startups later in life.

Today, his ideas about the universe being computation are very trendy. Before his death, he was invited out to MIT and shown some love for his pioneering work.


Zuse was way ahead. He had working programmability by 1938. Atanasoff in the US got close, but didn't have general programmability. None of the WWII cryptanalysis machines were general purpose computers. Their modern counterpart is a Bitcoin ASIC.

None of those machines put the program in main memory. Memory was a huge problem in the early days. Paper tape and punched cards were slow. Everything else was expensive per bit and hard to address. Atanasoff had a drum of capacitors, but it required a discrete capacitor for each bit and rotating electrical contacts. Tauschek in Austria invented the magnetic drum in 1932, but it wasn't used for computing until after WWII. The big bottleneck in computing for a long time was simply that there was nothing like RAM - there were no random access memory devices.

There were lots of special purpose machines before computers. Much early work went into electronic multiplication, because mechanical multiplication was really slow. IBM introduced the IBM 603 Electronic Multiplier in 1946, the first electronic digital computing device sold as a commercial product. I/O was punched cards, but the multiplier was tubes. There's a long history of mechanical multipliers, going back to Leibniz and ending with the IBM 602A. My favorite is the McClure Multiplying Punch, which had a table-driven mechanical multiplier for pounds/shillings/pence amounts. The ENIAC was a lot of multipliers and adders programmed with huge plugboards, but it didn't store programs in memory.
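
For a feel of what that last machine had to do, here is a toy sketch of pre-decimal currency arithmetic in Python. This is my own illustration of the arithmetic only, not a model of how the punch actually worked:

    # Pre-decimal currency: 12 pence = 1 shilling, 20 shillings = 1 pound.
    def lsd_to_pence(pounds, shillings, pence):
        return (pounds * 20 + shillings) * 12 + pence

    def pence_to_lsd(total_pence):
        pounds, rest = divmod(total_pence, 240)
        shillings, pence = divmod(rest, 12)
        return pounds, shillings, pence

    # Multiply 3 pounds 7s 6d by 14:
    total = lsd_to_pence(3, 7, 6) * 14
    print(pence_to_lsd(total))  # (47, 5, 0), i.e. 47 pounds 5s 0d

The mixed radix (12 and 20) is what made table-driven mechanical multiplication of such amounts impressive.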

Von Neumann put all the pieces together in 1945 in the "Report on the EDVAC" - general purpose, branching, program and data in main memory, binary, external I/O devices. [1] He saw that memory was the limitation on general purpose computing, as did others. He wanted 250K bits of memory in the EDVAC. He only got 44K; the mercury tank delay line approach didn't offer much capacity. Von Neumann proposed a CRT memory, the first true random access memory device, which was made to work at the University of Manchester in 1946 as the Williams tube. That technology powered several early machines, including the IBM 701 and the UNIVAC 1103, and competed with delay lines until magnetic core memory started working. Both Williams tubes and mercury tanks were dead-end memory technologies, but they worked well enough to get something done.
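
To make the "program and data in the same main memory" idea concrete, here is a minimal toy sketch in Python. The three-instruction machine is invented for illustration; it is not the EDVAC's actual instruction set:

    # Toy stored-program machine: code and data share one memory array.
    def run(memory):
        pc = 0  # program counter
        while True:
            op, a, b = memory[pc], memory[pc + 1], memory[pc + 2]
            if op == 0:                  # HALT
                return memory
            elif op == 1:                # ADD: mem[b] += mem[a]
                memory[b] += memory[a]
            elif op == 2:                # JNZ: jump to b if mem[a] != 0
                if memory[a] != 0:
                    pc = b
                    continue
            pc += 3

    mem = [1, 9, 10,   # ADD mem[9] into mem[10]
           0, 0, 0,    # HALT
           0, 0, 0,    # (unused)
           5, 37]      # data words: mem[9] = 5, mem[10] = 37
    print(run(mem)[10])  # 42

Because instructions are just numbers sitting in the same memory as data, a program can in principle even modify itself - the property that separates this design from plugboard machines like the ENIAC.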

Turing was involved with the ACE effort in the UK, but it was under-funded. Pilot ACE only had 4K bits of memory and wasn't running until 1950.

[1] https://archive.org/stream/firstdraftofrepo00vonn


We've come to talk about the Von Neumann architecture, but the information contained in his report on the EDVAC was largely lifted from the work of John Mauchly and Presper Eckert. Von Neumann's contribution was the relatively pedestrian task of translating their findings into mathematical language. He inappropriately put it into the public domain without attribution, simultaneously diminishing their place in history while unjustly cementing his own. Not to mention, he destroyed their chances of earning a patent. It's a real shame that we tend to gloss over this nowadays for expedience's sake.


For more on this, I recommend "ENIAC: The Triumphs and Tragedies of the World's First Computer" by Scott McCartney. I read it a few years ago and found it fascinating. I don't remember how much it gets into Zuse, but it has a lot about the ENIAC vs. Atanasoff battle, and about von Neumann and the First Draft of a Report on the EDVAC. These things are never black and white. All of these people did valuable work.


Well, Eckert and Mauchly went on to form the Eckert-Mauchly Computer Corporation and produce the UNIVAC I. That company became UNIVAC, later Unisys, and is still around.

Mauchly was one of the founders of the ACM.


I remember going to the Deutsches Technikmuseum in Berlin. They had an amazing exhibit on the development of mechanical computers, from automated looms and the origin of punch cards, which were used in automated stitching machines. They had a model of the Z1. I have never encountered another presentation of the development of technology that made progress seem so orderly and inevitable (obviously massive hindsight bias).

The Z1 was absolutely incredible and it really drove home the point that switches are switches, regardless of whether they are implemented as a transistor or a metal lever.
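
That point can be made concrete: any two-input device implementing the same truth table computes the same function, whether it's sheet metal, a relay, or a transistor. A toy sketch in Python (my own illustration, not anything from the exhibit):

    # "Switches are switches": the substrate never appears in the logic.
    def nand(a, b):
        return not (a and b)

    def xor(a, b):  # XOR composed entirely from NAND gates
        c = nand(a, b)
        return nand(nand(a, c), nand(b, c))

    for a in (False, True):
        for b in (False, True):
            print(a, b, xor(a, b))

Anything that behaves like nand here - Zuse's sliding metal plates included - supports the same construction.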


I visited that exhibit in Berlin too. There were two or three Zuse computers on display, as well as a bunch of other early European computers. It was a very nice exhibit.

It was particularly enlightening to see the various technologies in use. Today all computers are made of silicon chips, but the exhibit had electro-mechanical, electromagnetic (memory), vacuum tube, and solid state computers on display.

I don't know if it is a touring exhibit or if it is still in Berlin, but I recommend that anyone visit it.


It was in Berlin at the new year, and didn't look like it was going anywhere. :-) Konrad Zuse's son is a retired computer science professor who built one of the replicas in the exhibit, and still sometimes lectures on it and demonstrates it.

http://www.horst-zuse.homepage.t-online.de/


Here is a photo of the machine in his parents' living room: http://www.computerhope.com/jargon/z/z1.jpg


I took German in college but I'm far from fluent. Is this exhibit accessible to an English speaker who knows ein bisschen Deutsch?


A good online resource is the Konrad Zuse Internet Archive, which is a collaboration with the Deutsches Technikmuseum:

http://zuse.zib.de/

Z1: http://zuse-z1.zib.de/


I saw Konrad Zuse’s son, Horst Zuse, give a presentation on the history of these computers at the Deutsches Technikmuseum in Berlin. It was fascinating (this article covers the same information). He's currently building a Z3 out of modern parts, out on the museum floor. I took some pictures and video of the Z1 replica and the in-progress Z3:

http://imgur.com/a/effAn

https://vid.me/FUQL


The article has only an image of the Z4 in Munich. I find the reconstruction of the Z1, which is in the museum in Berlin, much more interesting.

Excerpt from the Wikipedia article[1]:

    Construction of the Z1 was privately financed. Zuse got money from his 
    parents, his sister Lieselotte, some students of the fraternity AV Motiv 
    (cf. Helmut Schreyer) and Kurt Pannke (a calculating machines manufacturer 
    in Berlin) to do so.


    Zuse constructed the Z1 in his parents' apartment; in fact, he was allowed 
    to use the bathroom for his construction. In 1936, Zuse quit his job in 
    airplane construction in order to build the Z1.


    Zuse used thin metal sheets to construct his machine. There were no relays 
    in it. The only electrical unit was an electric motor to give the clock 
    frequency of 1 Hz (cycle per second) to the machine.


    The machine was never very reliable in operation due to the precise 
    synchronization required to avoid undue stresses on the mechanical parts.

[1] https://en.wikipedia.org/wiki/Z1_%28computer%29


> The article has only an image of the Z4 in Munich.

And it is an image of the Z3 if one looks closer at the description on the left.


> The article has only an image of the Z4 in Munich.

Looks like I have to visit the Deutsches Museum the next time I'm in München.


Some trivia from Wikipedia [0]:

  The movie Tron: Legacy, which revolves around a world inside a computer system, 
  features a character named Zuse [1], presumably in honour of Konrad Zuse.

[0] https://en.wikipedia.org/wiki/Konrad_Zuse#Zuse_Year_2010

[1] http://www.imdb.com/character/ch0209988/?ref_=tt_cl_t7


Some years before his death, Konrad Zuse wrote a book about his long and interesting life called "Der Computer - Mein Lebenswerk", which was later translated into English as "The Computer - My Life".


Lebenswerk means life's work, not life.


Zuse's independent development of ideas that paralleled other early computer scientists' makes me wonder how much of our modern computer science (von Neumann architecture, lambdas, clocks, registers, and all) was inevitable discovery and how much was shaped by our pioneers.

If we did it all over again, but with different people, would we end up with something similar, or something very different? Would we have digital, basically serial computer cores with layers of memory around them? Would our languages be like assembler and C and Lisp?

Or are there potentially very different but equally practical alternatives out there?


If there were ever a tour of machines that moved worlds, I would recommend the following:

Z1 - DT museum

Difference Engine - Science Museum, London

Bombe - Bletchley Park

Harrison's H1 - Greenwich Observatory


Anyone visiting Bletchley should try to come on a day when the National Museum of Computing is open. It is housed next door to the Turing museum, and is probably more fun to visit.

It's full of ancient hardware, including a computer lab full of working BBC Micros and an original BBC Domesday machine, which is an unholy combination of LaserDisc and BBC Master that produces a sort of 80s StreetView.

It has the oldest working computer (the Harwell WITCH), a rebuilt Colossus, and quite a lot more. They're working on a reconstruction of EDSAC.


There's also a Difference Engine at the Computer History Museum in Mountain View, a block or two down from Google.

The interesting thing about the Difference Engine is that it was Babbage's unrealized design; it wasn't built until the early 2000s, funded by HN favourite Nathan Myhrvold. He funded one for the Science Museum in London, and another for himself. The one at the Computer History Museum is Myhrvold's own, on loan.


Almost. Myhrvold provided funds for the Science Museum to complete their machine, and to build him a complete machine of his own. From Wikipedia:

"After the Science Museum in London successfully built the computing section of Charles Babbage's Difference Engine #2 in 1991, Myhrvold funded the construction of the output section, which performs both printing and stereotyping of calculated results. He also commissioned the construction of a second complete Difference Engine #2 for himself, which has been on display at the Computer History Museum in Mountain View, California, since May 10, 2008."

I've seen both machines and they are marvels. I was especially excited to finally see the second machine in operation last fall. It is mechanical poetry in motion.


They also have a number of early analog computers, which looked to my musician's eye like modular synthesizers. Yes, I know it's the other way around.


Plankalkül [0] is cool. I was impressed when I learned it had lambdas.

[0] http://en.wikipedia.org/wiki/Plankalk%C3%BCl


I've visited the Technikmuseum in Berlin too; the Zuse exhibit is fascinating. A history I never knew. Honestly, if I hadn't seen the things on display, I'd doubt it a bit. But the mechanical computer parts are there.

The fine article linked here is down at the moment, so here are a few other links of interest.

Berlin museum: http://www.sdtb.de/Mathematics-and-Computer-Science.1256.0.h...

Chess computers: https://chessprogramming.wikispaces.com/Konrad+Zuse

Wikipedia: https://en.wikipedia.org/wiki/Konrad_Zuse


Looks like the site went down. Here's a cached link:

http://webcache.googleusercontent.com/search?q=cache:4wvG8zC...


In episode 60 [1] of the great (and sadly, ending soon) podcast Pragmatic, John Chidgey talks some about the history of programming and more specifically some about Turing, arguing that Turing is overrated. I largely agree, and since I send John feedback after every episode, I sent him a longish rant, reproduced as follows:

When talking about the history of programming, there's always a difficulty because of the difference between computing and programming. It's nearly undeniable, for example, that Vannevar Bush had a greater impact on computing than John von Neumann did (I'll take the haters on with that). As We May Think is more important than von Neumann's draft, because what von Neumann was talking about had already been created and codified, whereas As We May Think was much more of a visionary work. Put another way, I think von Neumann was a brilliant synthesist, bringing together the ideas of many into the next step in the evolution of whatever he was working on at the time. Bush, on the other hand, was a management/strategic-level thinker who saw the revolutionary step ahead of whatever he was working on. But Bush was not a programmer in any way, and his impact on programming is limited.

Similarly, I think that Turing is overly lionized for his programming contributions because of the Turing test and Eliza. He provided a lot of the theoretical underpinnings, but I would argue that C.A.R. Hoare's contributions have had a farther-reaching practical impact than Turing's. I suspect that Turing and von Neumann had political connections in the ACM and IEEE that led to their improved historical standing. But as always, historical/political credit is almost as much a function of who you know and where you are as of what you've done. (I'm a huge fan of Shannon, Mauchly, Adm. Hopper, Hoare, and Dijkstra, over others like Turing and von Neumann.)

It's also very difficult, when just surveying computer history, to give proper credit to Zuse, Lebedev, Scherbius, Rejewski, etc., due to the lack of English-language resources on their accomplishments, and lingering bias against the governments some of them worked for. One could make a very compelling argument for Scherbius as having created _the_ most pivotal invention of the twentieth century, because Enigma drove the large investment in cryptanalysis, which led to the large investment in devices that eventually became the general purpose computing machines of today.

There's also a large, long discussion to be had about the impact figures like Marvin Minsky, Bill Gosper, Richard Greenblatt, Dennis Ritchie, Ken Thompson, Richard Stallman, Linus Torvalds, Brendan Eich, Donald Knuth, Peter Norvig, or Alan Kay had on programming, and also about the impact figures like Jack Goldman, Doug Engelbart, Steve Wozniak, Thomas Watson, Robert Noyce, Gordon Moore, David Packard, Bill Hewlett, Steve Jobs, Nolan Bushnell, Larry Ellison, Bill Gates, Vint Cerf, or Jim Clark had on computing. Few of those guys cross over from programming to computing or vice versa, yet each has critical contributions to be discussed and looked at.

I don't know of a lot of good books or resources, though, that really tackle this. Steven Levy's Hackers is the canonical example, but it is heavily biased toward the AI Lab crowds, Lisp hackers, and the early Unix pioneers, without touching on the big industry/engineer types more than tangentially or even scornfully, and it almost completely ignores the military/NASA. I also really appreciated Peter Seibel's Coders at Work, which was more inclusive but not really a history, more a set of conversations. I'm told Petzold's CODE is good, and of course deeper into history there's Gödel, Escher, Bach: An Eternal Golden Braid, which I have on my bookshelf and have to admit not getting too far into because I don't really like math.

Either way, as I'm sure you know, there is a wide, wide history of programming and computing that could be explored more, and I find it a shame that no one has done so with the historical rigor that I would like. (Such a book would probably be $100+ because sales would be so small, since few people would be interested, and that would discourage others from taking up future projects.)

[1] http://techdistortion.com/podcasts/pragmatic/episode-60-or-w...


I agree that Turing has had limited practical impact, and he probably did have political connections related to the war effort. But I don't feel that he is overrated. He was a first-class theoretician. Some non-trivial results that I know of:

1. A version of the central limit theorem [1]

2. A fixed-point combinator in his proof of the equivalence of lambda calculus and computable functions. Anyone who has tried to derive the Y combinator (appropriate for this website) knows how tricky it is to get one; there's a sketch of the idea after this list. Turing seems to have done all this in an appendix. [2]

3. LU decomposition. [3]

4. I don't understand his work on morphogenesis, but it seems to be his most cited paper. [4]
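
To give a feel for item 2, here is the strict-evaluation variant of the fixed-point idea (the Z combinator) in Python. This is a sketch of the concept only, not Turing's own combinator, which uses a different two-part construction:

    # Z combinator: the Y combinator adapted for a strict language like Python.
    Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

    # Anonymous recursion: no function refers to itself by name.
    fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
    print(fact(5))  # 120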

I largely agree with the list of pioneers that you have pointed out, without berating Turing.

[1] http://ebooks.cambridge.org/chapter.jsf?bid=CBO9780511614293...

[2] http://en.wikipedia.org/wiki/Fixed-point_combinator#Other_fi...

[3] https://micromath.wordpress.com/2012/02/25/alan-turing-and-l...

[4] http://en.wikipedia.org/wiki/The_Chemical_Basis_of_Morphogen...


There's always a tendency to focus on a few "pivotal creators" when the reality is much more diffuse. People like to know who the inventor of X was, when the reality is that the enshrined inventor, at best, made a particular advance in commercialization or practicality--or simply got the good press.

Hackers is a good read, but it arguably sacrifices historical completeness for narrative flow. You mention the focus on the AI Lab, but it also makes the argument that everything that happened on the East Coast was eclipsed. But then, that's what good stories do. I watched The Imitation Game last night and it took enormous historical liberties--probably too many. Breaking the Code is better in that regard. On the other hand, I saw a historical play about the invention of ether last year that I felt was harmed by too-literal attention to less important threads of the central story.


What about Rudolf Von Hacklheber?


[deleted]


The best way to complain about an HN title is to suggest a better one. "Better" here means more accurate and neutral. Even when a title is baity, we can't always think of a better one, so community suggestions help a lot here.




