
A Man Who Tried to Redeem the World with Logic - andreshb
http://nautil.us/issue/43/heroes/the-man-who-tried-to-redeem-the-world-with-logic-rp
======
smallnamespace
I'm absolutely blown away by the fact that the possibility of recurrent neural
networks encoding memory was discovered and published _before_ electronic
computers existed, and that the construction was entirely theoretical, based
on first principles alone.

~~~
alimw
The article seems to be describing something more like traditional computer
memory (encoded in an electrically stable configuration of switches) than the
memory of a recurrent neural network (encoded in the pattern of underlying
connections). In biological terms that might be comparable to the distinction
between short-term memory and long-term memory.
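The loop-as-memory idea from the article can be sketched with a McCulloch-Pitts threshold unit whose own output feeds back as an input. This is only an illustration (the function names, weights, and threshold are my own choices, not from the original paper): a single "set" pulse is retained by the feedback loop until an inhibitory "reset" pulse clears it.

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires (1) iff the weighted input sum
    reaches the threshold, otherwise stays silent (0)."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def latch(set_bit, reset_bit, prev_state):
    """One time step of a self-exciting loop. The unit's previous
    output is fed back as an input, so a momentary pulse on
    `set_bit` is remembered until `reset_bit` inhibits it."""
    return mp_neuron(
        [set_bit, prev_state, reset_bit],
        [1, 1, -2],   # set and feedback excite; reset strongly inhibits
        1,            # any single excitatory input is enough to fire
    )

state = 0
state = latch(1, 0, state)  # set pulse: state becomes 1
state = latch(0, 0, state)  # pulse gone, but the loop holds the bit
state = latch(0, 1, state)  # reset pulse: state returns to 0
```

In this toy version the "memory" lives entirely in the circulating activity of the loop, not in any change to the weights, which matches the short-term-memory reading above.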

------
virtualwhys
"Not only did Russell write back [about the proposed corrections to be made to
Principia Mathematica], he was so impressed that he invited Pitts to study
with him as a graduate student at Cambridge University in England. Pitts
couldn’t oblige him, though — he was only 12 years old."

------
damptowel
McCulloch interview video:
[https://youtu.be/MTmR6X2w8Tg](https://youtu.be/MTmR6X2w8Tg)

He seems quite the figure.

------
coldcode
Genius is often coupled with depression; sometimes, in order to see the world
in a different light, you need a different brain to see it with. I often
wonder how a genius like Pitts can appear among common people with no apparent
genius in them - genetics is a cruel process where amazing results are often
coupled with not-so-amazing side effects.

------
sundarurfriend
> In a way Pitts was still 12 years old. He was still beaten, still a runaway,
> still hiding from the world in musty libraries. Only now his books took the
> shape of a bottle.

In an otherwise fine article, this was a staggeringly insensitive statement
about the man's struggle with depression - and that in an era when it was even
more poorly understood than it is today.

------
oriel
> The higher the probability, the higher the entropy and the lower the
> information content.

Is this a known and/or understood statement? It seems to be saying a lot for
so few words. Would love any other sources that could expand on it.

~~~
tromp
It is known if we fix it to read "the lower the entropy".

Information theory equates entropy with the (expected) negative logarithm of
probability.

------
auggierose
I read that article at least a year ago, but "past" doesn't show a previous
submission here on HN. A bug?

~~~
dajohnson89
> This article was originally published in our “Information” issue in
> February, 2015.

~~~
auggierose
Ah, thanks.

------
JacksonGariety
Didn't this same thing happen between Wittgenstein and Russell two decades
earlier?

------
pvsukale3
Thank you for sharing this article.

------
kemiller
How sad.

------
taosx
I don't know why I don't have nautil.us bookmarked.

