First, I love that this is from over 60 years ago. Here I am in the 21st century, happily thinking that genetic algorithms are relatively new techniques, and along comes Von Neumann on HN, reminding me that, no, of course this is foolish: these ideas are, if not billions of years old, older than old, newer than new, infinitely permanent.
Second, more specifically, error correction is a natural extension of two techniques/mandates:
a) The need for energy: A machine that is not provided plentiful energy will soon cease to operate as a machine unless it can gather its own and graduate to organism status.
b) Reproduction: many copies have the opportunity to survive catastrophic errors that would end a single organism (e.g. lightning strikes), and to reproduce with mutation to reduce the effects of minor but still limiting errors (e.g. cold-bloodedness). (A toy sketch of this follows below.)
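To make (b) a bit more concrete, here is a toy simulation (the 10% catastrophe rate and the reproduction numbers are invented, and mutation is left out entirely) comparing a lone, non-reproducing machine with a population of copies facing the same catastrophic errors:

    import random

    P_CATASTROPHE = 0.1  # chance per generation that any one copy is destroyed (e.g. a lightning strike)

    def lone_machine(generations=100):
        """A single machine that cannot reproduce: one catastrophic error ends the line."""
        for _ in range(generations):
            if random.random() < P_CATASTROPHE:
                return 0      # destroyed, nothing left to carry on
        return 1

    def reproducing_population(generations=100, start=10, offspring=2, cap=1000):
        """Many copies that reproduce: individual losses get replaced, so the line persists."""
        population = start
        for _ in range(generations):
            survivors = sum(random.random() >= P_CATASTROPHE for _ in range(population))
            if survivors == 0:
                return 0
            population = min(survivors * offspring, cap)
        return population

    print("lone machine left standing:", lone_machine())
    print("population left standing:  ", reproducing_population())

The lone machine almost never makes it through 100 generations (0.9^100 is tiny), while the population practically always does.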
Can anyone comment on the relevance of this in modern CS or related fields? As a non-CS major, whenever I see material from Von Neumann, I can't help but be amazed by it. Yet I get the impression that while many people appreciate his work (he presents very compelling concepts in my opinion), it didn't have the impact on CS or other fields you would expect. Am I right or wrong? Where are these ideas of Von Neumann applied today?
I'd have to read and understand the paper to be sure. However, high-level and logic synthesis tools do a job that sounds pretty similar. The Synthagate HLS tool converts [1] an algorithm into automata that are simplified, combined, used to generate data paths, used to generate control paths, integrated, and synthesized into hardware. They have a book [2] on the method that I'm trying to get HW experts to peer review in the interest of imitating (or not) the techniques. Further, there are plenty of academics working with neural nets, error correction, and probabilistic systems on FPGAs. So, old as it is, elements of his writings might aid modern researchers or just be an interesting look at prior ideas.
Yes, the ideas are applied in CS in a very practical way. For example, in order to have a fault-tolerant web service you have to assume every component and server in your stack can fail at any time, so you have to design automated failover into each layer ... You assume the parts are unreliable, but the system as a whole is designed to be reliable despite them.
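For a rough idea of what that looks like in code, here's a minimal sketch (the host names, failure probability, and function names are all made up for illustration) of a call that fails over across redundant replicas, so the service as a whole succeeds unless every unreliable part happens to fail at once:

    import random

    # Hypothetical replica endpoints; in a real stack these would be load-balanced
    # app servers, database replicas, message-broker nodes, and so on.
    REPLICAS = ["app-1.example.com", "app-2.example.com", "app-3.example.com"]

    def call_replica(host, request):
        """Stand-in for a real network call; fails randomly to model an unreliable part."""
        if random.random() < 0.3:  # pretend this individual server is down 30% of the time
            raise ConnectionError(f"{host} unavailable")
        return f"response from {host} for {request!r}"

    def reliable_call(request, replicas=REPLICAS):
        """Try each replica in turn; the whole call fails only if *all* replicas fail."""
        last_error = None
        for host in replicas:
            try:
                return call_replica(host, request)
            except ConnectionError as err:
                last_error = err  # record the failure and fail over to the next replica
        raise RuntimeError("all replicas failed") from last_error

    print(reliable_call("GET /health"))

The same shape repeats at every layer: unreliable parts, plus enough redundancy and automated failover on top of them.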
Well, he did create the Von Neumann architecture [1] and various other concepts that are still used in modern computers. So I would say that his work is very influential in CS.
"Reliability theory is a general theory about systems failure. It allows researchers to predict the age-related failure kinetics for a system of given architecture (reliability structure) and given reliability of its components. Reliability theory predicts that even those systems that are entirely composed of non-aging elements (with a constant failure rate) will nevertheless deteriorate (fail more often) with age, if these systems are redundant in irreplaceable elements. Aging, therefore, is a direct consequence of systems redundancy. Reliability theory also predicts the late-life mortality deceleration with subsequent leveling-off, as well as the late-life mortality plateaus, as an inevitable consequence of redundancy exhaustion at extreme old ages. The theory explains why mortality rates increase exponentially with age (the Gompertz law) in many species, by taking into account the initial flaws (defects) in newly formed systems. It also explains why organisms “prefer” to die according to the Gompertz law, while technical devices usually fail according to the Weibull (power) law. Theoretical conditions are specified when organisms die according to the Weibull law: organisms should be relatively free of initial flaws and defects. The theory makes it possible to find a general failure law applicable to all adult and extreme old ages, where the Gompertz and the Weibull laws are just special cases of this more general failure law. The theory explains why relative differences in mortality rates of compared populations (within a given species) vanish with age, and mortality convergence is observed due to the exhaustion of initial differences in redundancy levels. Overall, reliability theory has an amazing predictive and explanatory power with a few, very general and realistic assumptions. Therefore, reliability theory seems to be a promising approach for developing a comprehensive theory of aging and longevity integrating mathematical methods with specific biological knowledge."
Thanks for the link.