
Aristotle's influence on computational thought - sharmajai
https://www.theatlantic.com/technology/archive/2017/03/aristotle-computer/518697/?single_page=true
======
zeteo
> The history of computers [...] is better understood as a history of ideas,
> mainly ideas that emerged from mathematical logic

I have an advanced degree in CS and have done my fair share of theory, but
ideas from logic are the cherry on the cake and did not create the computer.
The real unsung heroes are those who invented relays, amplifiers and punched
card machines. Babbage had a design by 1840 and Boole a theory by 1854, but
without components, neither made a practical impact, and both were nearly
forgotten.

Once relays and the like became widely available in the 1930s, it was anyone's
game.
Konrad Zuse built his first machine in his parents' apartment [1]. Feynman did
complex computations at Los Alamos with punched card machines. And, as the
article mentions, Shannon created a theory of relay circuits because they were
_already_ building complex circuits.

[1] https://en.wikipedia.org/wiki/Konrad_Zuse

~~~
maxander
A large part of this, though, is that the groundbreaking mathematical insights
that paved the way for computers are things any modern ten-year-old takes for
granted. The idea that we could design a single set of fundamental
computational steps (a CPU instruction set) capable of executing any program
was key; that allowed people to abstract over all the different "math engines
people have made out of whatever components" and arrive at the unified concept
of "a computer."

More concretely: you can definitely get an advanced CS degree without seeing
much math, since CS is a giant and heterogeneous field. But if you study
something like language design, you quickly run into mathematics that Boole
would have felt very comfortable with; a small taste follows.
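
For instance, here is a minimal sketch of the kind of finite, mechanical
reasoning Boole's algebra supports: exhaustively checking De Morgan's law over
every truth assignment. This is plain Python written for illustration, not
anything from the article.

```python
from itertools import product

# Verify De Morgan's law: not (a and b) == (not a) or (not b),
# by checking all four possible truth assignments.
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
print("De Morgan's law holds for all assignments")
```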

------
maxander
> Shannon himself encountered Boole’s work in an undergraduate philosophy
> class. “It just happened that no one else was familiar with both fields at
> the same time,” he commented later.

Never underestimate the power of being one of the few people to hold a
particular set of individually common skills. If you know molecular biology
and group theory, or chemical engineering and architecture, or web design and
sign language, or _whatever_ and _whatever_, there are special opportunities
open to you and almost no one else.

------
chis
I honestly don't think that machine learning deserves to be on this timeline.
Despite the hype, we haven't seen much indication that it's anything other
than a fad, propped up by its ability to do more complex statistical
regressions than ever before thanks to modern hardware.

Maybe ML truly is our best shot right now, but juxtaposing it with the massive
leap that computable logic represented makes it seem pretty inconsequential.

------
woodandsteel
I am glad that someone is explaining how Aristotle is the great-grandfather of
modern computing.

You know, when you look at the histories of the various fields of study in the
modern university, it's quite remarkable how many of them got their start, or
at least a large part of it, with something Aristotle wrote 2,300 years ago.

