An interesting question, because all of this genetic encoding and expression is probabilistic at its very core (it has to be, otherwise there wouldn't be evolution), whereas it would be very bad if basic operations in a CPU had a similar error rate.
Interesting to me how the top comment has talked about constructing logic gates out of biological circuits. I wonder if anyone has done the opposite, i.e., written a probabilistic programming language whose operations are under the same amount of noise as a cell?
> probabilistic programming language whose operations are under the same amount of noise as a cell
The reason for the probabilistic nature is that biological "computations" are electrochemical reactions and feedback loops, which do not map very well to the concept of "executing code" as in programming languages. I think a closer analogy could be a hardware description language that synthesizes analog circuits for computations (cf. analog computers), which are then subject to noise from electromagnetic radiation in the environment.
So in a certain sense, this has already been done in a very rudimentary way during the pre-digital age of computing.
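You could fake the digital version of it pretty easily, though. Here's a minimal Python sketch, assuming every primitive operation independently flips each bit of its result with some probability (the 1e-4 is just an illustrative knob, not a measured biological rate):

    import random

    P_FLIP = 1e-4  # assumed per-bit corruption probability, purely illustrative

    def noisy(value, bits=32, p=P_FLIP):
        # flip each bit of `value` independently with probability p
        for i in range(bits):
            if random.random() < p:
                value ^= 1 << i
        return value

    def noisy_add(a, b):
        # even the primitive operations occasionally return a wrong answer
        return noisy((a + b) & 0xFFFFFFFF)

    # run the "same program" many times and see how often it disagrees with itself
    results = [noisy_add(1000, 2345) for _ in range(100_000)]
    print(sum(r != 3345 for r in results) / len(results))  # roughly 32 * P_FLIP

Whether that counts as a "probabilistic programming language" or just an unreliable one is a fair question; real probabilistic programming languages are about reasoning over distributions, not about unreliable primitives.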
That isn't how evolution works (or at least it is a gross generalization that is borderline wrong).
Your own cells mutate over your lifetime, and those mutations may trigger genetic features across your entire body. Sometimes these mutations cause errors that we call cancer, but not always.
Evolution is when a certain genetic feature provides better fitness (which doesn't necessarily have to come from a mutation) and then gets selected for through mating. For example, if people with brown hair suddenly made better mates, it would be much more likely for the next generation to have brown hair, until non-brown hair was "evolved" out of the gene pool completely and it became impossible not to have brown hair.
This is how we got orange carrots, for example, which are distinct from the earlier, non-orange varieties.
So evolution is copying the state of a running executable (one that’s poorly memory managed and dependent on external variables) instead of the file itself, and evolution favors the copies that survive and maybe even run faster or are otherwise better.
Not exactly; evolution doesn’t happen instantly. A digital comparison to an organism would be something like this:
An organism is a collection of billions of threads, all running the same code but starting at seemingly random entry points. Before dying, each thread forks one or more new threads (depending on external factors like available memory).
In order to reproduce, the “father” code sends a copy of its current code, but only the low nibble of every byte. The “mother” program combines this with the high nibble of every byte and then executes the result in a chroot jail to make sure it will actually run. Once it is confident it will run, it “births” it onto its own machine.
Every thread in this process is running on non-ECC memory, with cosmic rays bit-flipping things, though there are threads running around making fixes to broken threads (actually, just killing them before they can fork).
The implementation details of the threads (modeling proteins, ATP pumps, and such) aren’t relevant here.
In this analogy, evolutionary fitness would mean less cancer (fewer fork bombs), doing usable work, mating successfully, and so on. The selection pressure might look like preventing mating until a program has reached a certain age or hit certain milestones, plus a “score” of how well it has done its work (both partners want a good “life score” in the other, but not an implausibly high one, since liars could exist, before they proceed with mating).
Evolution would occur naturally over many generations. From one generation to the next, they look nearly identical, but after hundreds of generations they might still look identical, or they might not. The “not” part is evolution.
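If you actually wanted to run that analogy, a stripped-down sketch might look like this (Python; everything here is invented for illustration: byte-string “genomes”, an arbitrary fitness target, nibble-mixing for mating, and a made-up per-bit copy-error rate):

    import random

    GENOME_LEN = 64     # bytes per "organism" (arbitrary)
    BIT_FLIP_P = 1e-4   # assumed per-bit copy error, not a measured rate
    TARGET = bytes(random.randrange(256) for _ in range(GENOME_LEN))

    def fitness(genome):
        # toy "life score": count bits that match an arbitrary target "program"
        return sum(8 - bin(g ^ t).count("1") for g, t in zip(genome, TARGET))

    def mutate(genome):
        # cosmic rays / non-ECC memory: flip individual bits at random
        out = bytearray(genome)
        for i in range(len(out)):
            for bit in range(8):
                if random.random() < BIT_FLIP_P:
                    out[i] ^= 1 << bit
        return bytes(out)

    def mate(mother, father):
        # high nibble from one parent, low nibble from the other, plus copy noise
        child = bytes((m & 0xF0) | (f & 0x0F) for m, f in zip(mother, father))
        return mutate(child)

    population = [bytes(random.randrange(256) for _ in range(GENOME_LEN))
                  for _ in range(100)]
    for generation in range(300):
        population.sort(key=fitness, reverse=True)
        survivors = population[:50]   # the selection pressure
        population = [mate(random.choice(survivors), random.choice(survivors))
                      for _ in range(100)]
    print(max(fitness(g) for g in population), "out of", GENOME_LEN * 8)

None of this models real biology; it just shows the shape of the argument above: noisy copying plus differential survival is enough for a population to drift toward whatever the “score” rewards, one generation at a time.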
> it would be very bad if basic operations in a CPU had a similar error rate.
Given how slow evolution is, and how many times DNA is copied and/or transcribed in any one individual, my intuition is that the error rates for genetic processes are actually incredibly low.
There are numerous correction mechanisms, and other mechanisms that destroy malformed proteins and mRNA chains. Wikipedia claims one out of every 1,000 to 100,000 amino acids added to a forming protein is wrong. Each ribosome adds amino acids to a growing chain at about 10 per second, and millions of proteins (or far more) are being produced all the time.
My intuition says that error rates are extremely low, but that errors are common - due to the incredible number of iterations that occur in a very short time period.
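Putting the quoted numbers together, assuming independent errors and a (made-up) typical protein length of 400 residues:

    # chance that a single protein comes out with at least one wrong amino acid
    length = 400  # assumed "typical" protein length, for illustration only
    for per_residue_error in (1e-3, 1e-4, 1e-5):
        p_flawed = 1 - (1 - per_residue_error) ** length
        print(per_residue_error, round(p_flawed, 3))  # ~0.33, ~0.039, ~0.004

So even at the low end of the quoted error rates, roughly 1 in 250 proteins of that length comes out with a mistake, and because millions are being made all the time, flawed copies are produced constantly. Rare per operation, common in aggregate.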
I just mean that cells in an organism experience much higher noise rates than a CPU, yet we still consider them capable of sophisticated computation. It's hard to put an exact number on how much more, but for reference, many bacteria have a transcriptional error rate of around 10^-4 per base. Apparently fiber optic engineers target a 10^-12 bit error rate, and I assume the error rate of transferring data between CPU registers is even lower. So it's probably a decent estimate to say that biological "computations" occur under at least a million times more noise than a computer's.
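To make that comparison concrete:

    transcription_error_rate = 1e-4      # per-base figure for bacteria, as above
    fiber_optic_bit_error_target = 1e-12
    print(transcription_error_rate / fiber_optic_bit_error_target)  # 1e8

which is about a hundred million, so "at least a million times more noise" is a conservative way to put it.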