Where Code Comes From: Architectures of Automatic Control from Babbage to Algol [pdf] (tomandmaria.com)
36 points by hcs on May 21, 2016 | 4 comments

The thing I find fascinating is that there was no single machine which was definitively the first computer. Every successive machine added some feature we now consider essential, but each was only a small increment over what came before. If we consider the machines that were actually built:

Mark I aka ASCC, ABC, Z3: execute arithmetic operations from a program, but read it from a separate memory, with no control flow

Colossus: all-electronic, but operations are hardwired, not programmed

ENIAC: operations are configurable, and include flow control, but cannot be programmed

EDVAC: operations, including flow control, can be programmed, but from a fixed memory

SSEM aka Baby: operations, including flow control, can be programmed, from the same memory as data

The SSEM was the first machine that is basically the same as a modern computer, with all the limitless potential of a machine which can operate on its own program. But it was only a trivial increment over EDVAC!
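That "trivial increment" is easy to sketch. Here's a toy stored-program machine in the SSEM spirit, with code and data in one memory so the program can rewrite its own instructions. The instruction set is hypothetical, not the SSEM's actual one, but the trick it shows is historically real: before index registers, machines looped over arrays by modifying the address field of their own instructions.

```python
# Toy stored-program machine: code and data share one memory, so the
# program can rewrite its own instructions. Hypothetical instruction set.

def run(mem):
    pc, acc = 0, 0
    while True:
        op, a, b = mem[pc]
        pc += 1
        if op == "HALT":
            return mem
        elif op == "ADD":            # acc += mem[a]
            acc += mem[a]
        elif op == "STORE":          # mem[a] = acc
            mem[a] = acc
        elif op == "INCADDR":        # self-modification: bump the address
            mem[a][1] += 1           # field of the instruction in cell a
        elif op == "DECJNZ":         # mem[a] -= 1; jump to b if nonzero
            mem[a] -= 1
            if mem[a] != 0:
                pc = b

# Program: sum the three data cells at 10..12 into cell 13.
memory = [
    ["ADD", 10, 0],      # 0: acc += mem[10]  (address field rewritten below)
    ["INCADDR", 0, 0],   # 1: point instruction 0 at the next data cell
    ["DECJNZ", 9, 0],    # 2: loop three times
    ["STORE", 13, 0],    # 3: write the sum
    ["HALT", 0, 0],      # 4
    0, 0, 0, 0,          # 5-8: unused
    3,                   # 9: loop counter
    5, 6, 7,             # 10-12: the data
    0,                   # 13: result
]
run(memory)              # memory[13] is now 5 + 6 + 7 = 18
```

On a fixed-memory design like EDVAC as described above, instruction 1 would be impossible: the program couldn't touch its own storage.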

There's a similar story with programming languages and their compilers.

Short Code - translation is partly manual, interpreted, kind of a fancy assembler

A-0 - fully compiled, but more of a linker or fancy macro assembler

Autocode - fully compiled, but still close to the machine, explicitly naming registers

FORTRAN - fully compiled, abstracted from the machine, but no recursion
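The "no recursion" limit falls out of how early FORTRAN allocated storage: each routine's locals and return address lived at fixed static addresses rather than on a stack, so a recursive call clobbered its caller's state. A sketch simulating that discipline in Python (illustrative only, not actual FORTRAN semantics):

```python
# Why static storage breaks recursion: there is exactly one slot for the
# routine's argument, shared by every activation. Illustrative sketch.

static_n = 0   # the one and only storage cell for fact_static's argument

def fact_static(n):
    global static_n
    static_n = n                 # every call overwrites the same slot
    if static_n <= 1:
        return 1
    r = fact_static(static_n - 1)
    return static_n * r          # static_n was clobbered by the inner call!

fact_static(3)   # returns 1, not 6: each multiply sees static_n == 1
```

A stack (one fresh slot per activation) is what later languages like Algol added to make recursion work.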

Overall a good intro and exploration that I'll let others discuss, as I don't study this part of history as much. I will counter and clarify one point:

"These machines [analog computers] were not following instructions as they computed, and so the concept of a program... does not apply to them."

Not true: the concept just shifts to mean something different. If a device was fixed in function, as many were, then the author is right that it's essentially dedicated hardware without a program. Yet there are both configurable and general-purpose analog systems. The configuration of the former, or part of the input to the latter, constitutes the program in these machines. It's not the same as in digital computers, but non-fixed-function analog computers are still programmed. Some examples below.


Note: The program above is the number and placement of the wires you plug into its holes. One could represent that on paper as a program; that could even be automated further.


Note: The configuration of that would be the program.


Note: The brain is likely an analog or mixed-signal computer. The program in an analog NN is probably the synapses, weights, and such.

So, general and semi-general computers in the analog domain have programs. They just direct the behavior of simple, real components operating on real quantities, rather than serving as instructions for abstract machines operating on 1s and 0s.

The part you left out of the ellipses is pretty important to the meaning: "according to which devices carry out a sequence of different operations over time". The article is about transfer of control, which is why it focuses on looping and branches.

edit: Maybe sequencing mechanisms on analog computers could have been examined more closely. It's my understanding that Claude Shannon's master's thesis on Boolean logic in hardware came out of his work with Vannevar Bush's differential analyzer at MIT, so things must have been very complex.

"The article is about transfer of control, which is why it focuses on looping and branches."

Perhaps my comment has less application for that reason. Early analog systems either computed functions or were used in control systems. They certainly have looping mechanisms. They can emulate branches, since a branch is just a set of functions with some decision circuitry choosing which output to use. Given that these can be configurable, I could still see programs in computers that used the analog model with the features you describe.
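To make the branch emulation concrete: both candidate functions are computed continuously, and a comparator signal selects which one reaches the output. A sketch treating signals as plain numbers (all names and the specific functions are illustrative):

```python
import math

def comparator(x, threshold=0.0, steepness=50.0):
    """Smooth 0-or-1 decision signal. A real analog comparator saturates
    rather than switching instantly; clamp the exponent to avoid overflow."""
    z = max(-700.0, min(700.0, steepness * (x - threshold)))
    return 1.0 / (1.0 + math.exp(-z))

def analog_branch(x):
    """Emulate: if x > 0 then f(x) = 2x else g(x) = -x, with no jump."""
    f = 2.0 * x                      # both "branches" run all the time
    g = -x
    s = comparator(x)                # decision signal, ~0 or ~1
    return s * f + (1.0 - s) * g     # the signal blends the two outputs
```

Unlike a digital branch, the untaken path still consumes hardware, and near the threshold the output is a blend of both, which is exactly the saturation behavior real comparators exhibit.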

" It's my understanding that Claude Shannon's master's thesis on Boolean logic in hardware came out of his work with Vannevar Bush's differential analyzer at MIT"

Smart observation. Claude Shannon invented the first formal model of the general-purpose analog computer (the GPAC), modeled on the differential analyzer.


To get you up to speed, this is a nice, short write-up on the history of analog computers and some of their advantages. You'll also see the difficulties involved versus throwing some Verilog together on top of Boolean logic.


They have a lot of untapped potential, which is why I'm looking into them. So I try to correct any misinformation about them as I discover the truth through my digging. They certainly have drawbacks and issues, but also remarkable traits that make them a competitive advantage to this day. Today they're called "mixed-signal ASICs", which use analog where it's best and digital for the rest. That work is still mostly done by hand, though, despite advances in synthesis.
