Could neural networks in general be compared to the way information is stored in DNA?
In this metaphor, the encoder/decoder corresponds to the biological machinery and the data loop to life itself. The training conditions of the NN constrain it to store small variational changes, and success accumulates through survival over many trial-and-error experiments. Like animals in nature, the NNs from DeepMind and OpenAI have been shown to evolve into sophisticated local-optimum solutions for specific game environments.
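To make the trial-and-error framing concrete, here is a minimal evolution-strategy sketch: random variations of a parameter vector are scored, and the better-scoring mutations pull the "genome" in their direction. This is a toy, not DeepMind's or OpenAI's actual training setup; the `fitness` function and all hyperparameters are hypothetical stand-ins for an episode reward in a game environment.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(weights):
    # Toy fitness: how well a linear "policy" y = w @ x hits a target.
    # In a real game environment this would be the episode reward.
    x = np.array([1.0, 2.0, 3.0])
    return -abs(weights @ x - 10.0)

weights = rng.normal(size=3)          # the "genome": network parameters
population, sigma, lr = 50, 0.1, 0.05

for generation in range(200):
    # Sample small random variations (mutations) of the current weights.
    noise = rng.normal(size=(population, weights.size))
    scores = np.array([fitness(weights + sigma * n) for n in noise])
    # Normalize scores, then let better-scoring mutations pull the weights:
    # "survival" accumulates small successful changes over generations.
    scores = (scores - scores.mean()) / (scores.std() + 1e-8)
    weights += lr / (population * sigma) * noise.T @ scores

print(fitness(weights))  # approaches 0 as the policy settles into a local optimum
```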
In my experience, it is more useful to view neural nets as geometric transformations (implemented as stateful functions) that map points in an input space (e.g. a sentence written in English) to points in some other space (e.g. the same sentence written in French).
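A minimal sketch of that view, assuming NumPy: a small net is just a chain of affine maps and pointwise nonlinearities that carries points from one space into another. The layer sizes here are arbitrary; real translation models use learned embeddings and far deeper stacks, but the principle is the same composition of parameterized maps.

```python
import numpy as np

def transform(x, W1, b1, W2, b2):
    # Each layer is an affine map (rotate/scale/translate) followed by
    # a pointwise nonlinearity that bends the space.
    h = np.tanh(x @ W1 + b1)   # map input space into a hidden space
    return h @ W2 + b2         # map hidden space into the output space

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input space: R^2
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)   # output space: R^3

points = rng.normal(size=(5, 2))                 # five points in input space
print(transform(points, W1, b1, W2, b2).shape)   # (5, 3): same points, new space
```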
By viewing neural nets (and machine learning in general) from a mathematical perspective, you can readily exploit an entire field of tools and techniques (e.g. numerical optimization) and clearly define objective functions to train against; these are benefits you don't necessarily get by viewing ML from a biological perspective.
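For instance, here is a minimal sketch of what "clearly define an objective function and train against it" means: a mean-squared-error objective minimized by plain gradient descent. The data and learning rate are made up for illustration; the point is that the whole procedure is just numerical optimization of a well-defined function.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))               # inputs
y = X @ np.array([3.0, -1.0]) + 0.5         # targets from a known linear map

w, b = np.zeros(2), 0.0

def objective(w, b):
    # Mean squared error: a clearly defined function to train against.
    return np.mean((X @ w + b - y) ** 2)

for step in range(500):
    err = X @ w + b - y
    # Analytic gradients of the objective w.r.t. the parameters.
    grad_w = 2 * X.T @ err / len(y)
    grad_b = 2 * err.mean()
    w -= 0.1 * grad_w                       # plain gradient descent
    b -= 0.1 * grad_b

print(objective(w, b))  # near zero: the optimizer recovered the mapping
```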