
The Synthesis of Reliable Organisms from Unreliable Parts (1956) [pdf] - rndn
http://www.dna.caltech.edu/courses/cs191/paperscs191/VonNeumann56.pdf
======
techbio
First, I love that this is from over 60 years ago. Here I am in the 21st
century, happily thinking that genetic algorithms are relatively new
techniques, and along comes Von Neumann on HN, reminding me that, no, of
course this is foolish: they are, if not billions of years old, then older
than old, newer than new, infinitely permanent.

Second, more specifically, error correction is a natural extension of two
techniques/mandates:

a) The need for energy: A machine that is not provided plentiful energy will
soon cease to operate as a machine unless it can gather its own, and so
graduate to organism status.

b) Reproduction: many copies give the opportunity to survive errors that are
catastrophic to a single organism (e.g. lightning strikes), and to reproduce
with mutation to reduce the effects of minor but still limiting errors (e.g.
cold-bloodedness). A toy version of this redundancy idea is sketched below.

Thanks for the link.

------
Synesthesis
Can anyone comment on the relevance of this in modern CS or related fields? As
a non-CS major, whenever I see material from Von Neumann, I can't help but be
amazed by it. Yet I get the impression that while many people appreciate his
work (he presents very compelling concepts, in my opinion), it didn't have the
impact on CS or other fields that you would expect. Am I right or wrong? Where
are Von Neumann's ideas applied today?

~~~
nickpsecurity
I'd have to read and understand the paper to be sure. However, high-level and
logic synthesis tools do a job that sounds pretty similar. The Synthagate HLS
tool converts [1] an algorithm into automata that are simplified, combined,
used to generate data paths, used to generate control paths, integrated, and
synthesized into hardware. They have a book [2] on the method that I'm trying
to get HW experts to peer review in the interest of imitating (or not) its
techniques. Further, there are plenty of academics working with neural nets,
error correction, and probabilistic systems on FPGAs. So, old as it is,
elements of his writings might aid modern researchers or just be an
interesting look at prior ideas.

[1] [http://synthezza.com/logic-synthesis-in-synthagate-4/](http://synthezza.com/logic-synthesis-in-synthagate-4/)

[2]
[http://synthezza.com/download/AboutUs/Book2011.zip](http://synthezza.com/download/AboutUs/Book2011.zip)

------
a3n
Readable text pdf:
[http://www.sns.ias.edu/pitp2/2012files/Probabilistic_Logics.pdf](http://www.sns.ias.edu/pitp2/2012files/Probabilistic_Logics.pdf)

And commentary:
[https://duckduckgo.com/?t=lm&q=probabilistic+logics+and+the+synthesis+of+reliable+organisms+neumann](https://duckduckgo.com/?t=lm&q=probabilistic+logics+and+the+synthesis+of+reliable+organisms+neumann)

------
reasonattlm
[https://en.wikipedia.org/wiki/Reliability_theory](https://en.wikipedia.org/wiki/Reliability_theory)

[https://dx.doi.org/10.1006%2Fjtbi.2001.2430](https://dx.doi.org/10.1006%2Fjtbi.2001.2430)

"Reliability theory is a general theory about systems failure. It allows
researchers to predict the age-related failure kinetics for a system of given
architecture (reliability structure) and given reliability of its components.
Reliability theory predicts that even those systems that are entirely composed
of non-aging elements (with a constant failure rate) will nevertheless
deteriorate (fail more often) with age, if these systems are redundant in
irreplaceable elements. Aging, therefore, is a direct consequence of systems
redundancy. Reliability theory also predicts the late-life mortality
deceleration with subsequent leveling-off, as well as the late-life mortality
plateaus, as an inevitable consequence of redundancy exhaustion at extreme old
ages. The theory explains why mortality rates increase exponentially with age
(the Gompertz law) in many species, by taking into account the initial flaws
(defects) in newly formed systems. It also explains why organisms “prefer” to
die according to the Gompertz law, while technical devices usually fail
according to the Weibull (power) law. Theoretical conditions are specified
when organisms die according to the Weibull law: organisms should be
relatively free of initial flaws and defects. The theory makes it possible to
find a general failure law applicable to all adult and extreme old ages, where
the Gompertz and the Weibull laws are just special cases of this more general
failure law. The theory explains why relative differences in mortality rates
of compared populations (within a given species) vanish with age, and
mortality convergence is observed due to the exhaustion of initial differences
in redundancy levels. Overall, reliability theory has an amazing predictive
and explanatory power with a few, very general and realistic assumptions.
Therefore, reliability theory seems to be a promising approach for developing
a comprehensive theory of aging and longevity integrating mathematical methods
with specific biological knowledge."
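
The redundancy-exhaustion claim is easy to check numerically: give a system N
redundant, non-aging parts (constant per-part hazard LAM) and its failure rate
climbs with age like a Weibull power law before leveling off at LAM once the
spares are gone. A rough simulation in Python (the constants are arbitrary,
purely for illustration):

    import random

    LAM = 1.0        # per-component failure rate: constant, i.e. "non-aging"
    N = 5            # redundant copies of the irreplaceable element
    TRIALS = 200_000

    # the system dies only when the last of its N redundant parts dies
    lifetimes = sorted(max(random.expovariate(LAM) for _ in range(N))
                       for _ in range(TRIALS))

    # empirical hazard: fraction of age-t survivors dying in [t, t + dt)
    dt = 0.25
    for i in range(16):
        t = i * dt
        at_risk = sum(1 for x in lifetimes if x >= t)
        died = sum(1 for x in lifetimes if t <= x < t + dt)
        if at_risk > 100:
            print(f"age {t:5.2f}: hazard ~ {died / at_risk / dt:.3f}")

The printed hazard starts near zero, rises steeply through the middle ages,
and flattens toward LAM late in life, matching the mortality plateau described
above. (The closed form for this parallel system is h(t) =
N*LAM*e^(-LAM*t)*(1 - e^(-LAM*t))^(N-1) / (1 - (1 - e^(-LAM*t))^N), which
tends to LAM as t grows.)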

