
Neural Network Quine - crazyoscarchang
https://arxiv.org/abs/1803.05859
======
OtterCoder
Quine, the descriptor in the actual article, is both more accurate and more
appropriate for a forum made of programmers.

The fact that they made it into a quine with non-trivial, useful side effects
is actually really interesting, regardless of the language/paradigm.

------
ttul
What’s the significance of this paper?

~~~
TekMol
Sex sells.

~~~
stanfordkid
Would be interesting to use this as a training algorithm -- have multiple
self-replicating networks that mate and copy portions of their weights in
random patterns... eerie.

~~~
Choco31415
We already have the mating part with NEAT. Link:
[http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf](http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf)

Essentially it's a neural network represented as a series of genes. New genes
arise through mutation: a new link forms between two neurons, an existing
link is disabled, or a link is split in two (with a new neuron in between).
This, as you can see, allows for a wide range of network representations.
Because these variations are tagged with sequential IDs, we can mate two
neural networks by combining their genes in order. Matching genes never
conflict, and the offspring will be similar to both parents.
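
A rough sketch of that crossover step, assuming the innovation-number
alignment from the linked Stanley & Miikkulainen paper (the `Gene` fields and
the inherit-from-the-fitter-parent rule are illustrative, not lifted from the
paper's code):

```python
import random
from dataclasses import dataclass

@dataclass
class Gene:
    innovation: int   # sequential ID marking when this link first appeared
    src: int          # source neuron
    dst: int          # destination neuron
    weight: float
    enabled: bool = True

def crossover(fitter, weaker):
    """Mate two genomes by lining their genes up on innovation numbers."""
    weaker_by_id = {g.innovation: g for g in weaker}
    child = []
    for gene in fitter:
        match = weaker_by_id.get(gene.innovation)
        if match is not None:
            # Matching genes: inherit from either parent at random.
            child.append(random.choice((gene, match)))
        else:
            # Disjoint/excess genes: inherit from the fitter parent.
            child.append(gene)
    return child

p1 = [Gene(0, 0, 2, 0.5), Gene(1, 1, 2, -0.3)]
p2 = [Gene(0, 0, 2, 0.7), Gene(2, 0, 3, 0.1)]
child = crossover(p1, p2)   # gene 0 from either parent, gene 1 from p1
```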

------
_0ffh
Isn't this essentially looking for a fixpoint?

~~~
vanderZwan
> _In mathematics, a fixed point (sometimes shortened to fixpoint, also known
> as an invariant point) of a function is an element of the function's domain
> that is mapped to itself by the function._

[https://en.wikipedia.org/wiki/Fixed_point_(mathematics)](https://en.wikipedia.org/wiki/Fixed_point_\(mathematics\))

Looks like it. And they need to resort to some mathematical trickery to
achieve this because, as they explain in the paper:

> _A neural network is parametrized by a set of parameters Θ, and our goal is
> to build a network that outputs Θ itself. This is difficult to do directly.
> Suppose the last layer of a feed-forward network has A inputs and B outputs.
> Already, the size of the weight matrix in a linear transformation is the
> product AB which is greater than B for any A > 1._
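
A toy version of the workaround makes the fixed-point reading above concrete:
instead of emitting all of Θ in one pass, the network is queried one weight
coordinate at a time through fixed random encodings, as the paper describes.
The sizes, the naive replace-and-repeat loop, and the collapse to a near-zero
quine are artifacts of this sketch, not the paper's training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 8, 4                       # projected input size, hidden width
p = h * d + h + h + 1             # total number of parameters in f itself

E = rng.normal(size=(p, d)) / np.sqrt(d)  # fixed random coordinate encodings

def f(theta, c):
    """Predict the value of parameter number c using the parameters theta."""
    W1 = theta[:h * d].reshape(h, d)
    b1 = theta[h * d:h * d + h]
    w2 = theta[h * d + h:h * d + 2 * h]
    b2 = theta[-1]
    hidden = np.tanh(W1 @ E[c] + b1)
    return float(w2 @ hidden + b2)

theta = rng.normal(scale=0.1, size=p)
for step in range(10):
    # "Regenerate": replace every weight with the network's own prediction.
    theta_next = np.array([f(theta, c) for c in range(p)])
    print(step, np.linalg.norm(theta_next - theta))  # distance to a fixed point
    theta = theta_next
# This naive iteration typically collapses to a trivial near-zero quine;
# the paper optimizes a self-replication loss instead.
```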

------
popcorncolonel
So genetic programming, but for NNs? Hasn't that been done?

~~~
qiemem
GP for NNs has, but this isn't GP; "self-reproduction" (as in the original
title) here means that the neural network is learning to output a copy of
itself. In other words, the NN is learning to be a quine (as indicated by the
current title). This is very different from just applying genetic
programming to NNs. In GP, you're updating the NN (or whatever) by copying it
with mutations/recombinations. Here, the NN itself is learning to create a
copy of itself.

The fact that they manage to also train the network to simultaneously
perform somewhat complicated tasks is super crazy.
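
A minimal sketch of that combined objective, assuming the paper's setup of
adding an ordinary task loss (MNIST classification there) to the
self-replication loss; the toy model, data, and unweighted sum below are
purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(scale=0.1, size=10)   # stand-in parameter vector

def predict_own_weights(theta):
    # Stand-in for f_theta(coordinate) evaluated at every coordinate.
    return np.tanh(theta)

def task_predictions(theta, X):
    # Stand-in task model sharing the same parameters.
    return X @ theta

X = rng.normal(size=(32, 10))
y = rng.normal(size=32)

quine_loss = np.sum((predict_own_weights(theta) - theta) ** 2)
task_loss = np.mean((task_predictions(theta, X) - y) ** 2)
total = quine_loss + task_loss   # the paper trades these two objectives off
print(total)
```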

------
std_throwaway
This is a tautology.

