Very cool to see this sort of thing.
Does this work also allow cyclic structures such as recurrent neural networks (RNNs)?
Does this work generalize to graphs in general?
If you have open-sourced this in any form, I would enjoy checking it out and/or contributing.
• We are currently working on a neuroevolution framework
based on Cartesian genetic programming (CGP) and existing
work on CGP-based ANNs.
• The algebraic framework introduced in this work will be the
basis for the genetic representation and operators.
• Evolved networks will be a mix of de novo evolved modules
and existing modules in the form of ANN layers, relational,
and functional programs.
• The representation will be based on a mapping between
algebraic expressions and a recursive, modular adjacency
matrix.
I think it's odd that programming languages and code have not replaced math here. I agree that you need math for ML, but that doesn't mean you can't also show what you do with pseudocode.
Math is fine for proving things or simplifying a formula, but when the subject is computation, why not use code instead? Math is at home in physics, but when you're dealing with computing and data, I think it's a little misplaced.
Not to mention that the language of mathematics is poorly defined. I would love to be able to learn math by learning its syntax, the way you learn a programming language.
For example, how do you write 'one layer'? W_i? L_i?
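This is exactly the commenter's point: written as code, "one layer" has a single unambiguous meaning. A minimal sketch (the layer shape and ReLU choice are my own illustration, not anything from the discussion above):

```python
import numpy as np

def dense_layer(x, W, b):
    """One fully connected layer: affine map W @ x + b, then ReLU."""
    return np.maximum(0, W @ x + b)

W = np.array([[1.0, -1.0],
              [0.5,  0.5]])
b = np.array([0.0, -1.0])
x = np.array([2.0, 1.0])

print(dense_layer(x, W, b))  # [1.  0.5]
```

Whether the paper calls this W_i or L_i, the code pins down what is multiplied, what is added, and where the nonlinearity sits.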
And then there are those who ignore convention entirely and use the Hadamard or Kronecker product symbol to denote convolution, which has caused me no end of trouble in the past.
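The overloading complained about above is easy to demonstrate: the Hadamard (elementwise) product and convolution are different operations with different output shapes, so reusing one symbol for the other is genuinely ambiguous. A small NumPy comparison (my own example):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.0, 1.0, 0.5])

# Hadamard product: elementwise, same shape as the inputs.
hadamard = a * b

# Discrete convolution: output length is len(a) + len(b) - 1.
conv = np.convolve(a, b, mode="full")

print(hadamard)  # [0.  2.  1.5]
print(conv)      # [0.  1.  2.5 4.  1.5]
```

If a paper writes a circle-dot or cross-in-circle symbol, only the surrounding definitions tell you which of these it means.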