
Show HN: Neural Network Evolution Playground with Backprop NEAT - hardmaru
http://blog.otoro.net/2016/05/07/backprop-neat/
======
danjoc
"Forget Torch, Tensorflow, and Theano. I decided to implement Backprop NEAT in
Javascript, because it is considered the best language for Deep Learning
according to the Data Science Dojo."

That's a bold statement with no elaboration. I would expect a link to DSD's
statement at the least. Based on what metrics? How does Javascript go about
accessing GPUs for training?

~~~
lawyao
I think it was meant to be sarcasm.

Kind of like [http://www.deepexcel.net/](http://www.deepexcel.net/)

~~~
danjoc
Okay. My sarcasm detector is broken :D Thanks for the clarification.

------
deepnet
Nice summary of NEAT.

Given that neuroevolution evolves efficient structure, this suggests that
training NEAT on a dataset produced by a pre-trained deep network could
distill highly optimised functions.
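
Roughly, such a distillation setup might look like the sketch below: the
pre-trained net acts as a teacher whose outputs become regression targets,
and the fitness of an evolved candidate is just agreement with those targets.
A toy sketch only, with teacher and candidate standing in for whatever
networks are actually used:

    // Toy sketch (not from the post): build a distillation dataset from a
    // pre-trained teacher network, then score NEAT candidates against it.
    // teacher and candidate are assumed to be plain functions that map an
    // input array to an output array.
    function buildDistillationSet(teacher, inputs) {
      return inputs.map(function (x) {
        return { x: x, y: teacher(x) };  // the teacher's outputs are the targets
      });
    }

    // Fitness of an evolved candidate: negative mean squared error against
    // the teacher's outputs, so matching the teacher more closely scores higher.
    function distillationFitness(candidate, dataset) {
      var total = 0;
      dataset.forEach(function (ex) {
        var out = candidate(ex.x);
        for (var i = 0; i < out.length; i++) {
          total += Math.pow(out[i] - ex.y[i], 2);
        }
      });
      return -total / dataset.length;
    }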

Evolving specialised components like LSTMs is an intriguing possibility.

Great to see Karpathy's Recurrent.js making prototyping easy and immediately
accessible.

Ken Stanley's innovation markers, which allow successful crossover of
augmenting topologies, are a powerful tool.
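
For anyone unfamiliar: the innovation numbers are what let two parent
genomes be lined up gene-by-gene during crossover, even after their
topologies have diverged. A toy sketch of that alignment (not Stanley's
actual implementation; it assumes parentA is the fitter parent):

    // Toy sketch of NEAT-style crossover via innovation numbers. Each
    // connection gene looks like { innovation: 7, from: 2, to: 5, weight: 0.3 }.
    // Genes with the same innovation number describe the "same" connection in
    // both parents, so they can be paired up even though the genomes grew
    // different structures. Matching genes are inherited from either parent at
    // random; genes only the fitter parent (parentA) carries are kept as-is.
    function crossover(parentA, parentB) {
      var bGenes = {};
      parentB.genes.forEach(function (g) { bGenes[g.innovation] = g; });

      return {
        genes: parentA.genes.map(function (gA) {
          var gB = bGenes[gA.innovation];
          var pick = (gB && Math.random() < 0.5) ? gB : gA;  // match: coin flip
          return { innovation: pick.innovation, from: pick.from,
                   to: pick.to, weight: pick.weight };
        })
      };
    }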

------
tansey
Very nice write-up. Several years ago, I took a graduate class that covered
NEAT and ended up doing a little project [0] to see if you could apply similar
ideas (recurrent nets evolved via NEAT). My idea was to use it for multi-agent
control problems, though, where you want agents to try and "teach" other
agents whenever they learn something useful.

[0] [https://github.com/tansey/social-learning](https://github.com/tansey/social-learning)
(note: not as clean of a structure as OP's code!)

------
yarosal
Back in 2009, I implemented NEAT in C# and applied it to the TORCS CIG 2009
challenge. It worked well, but took a lot of time to train (reinforcement
learning). IMO NEAT will not be able to practically compete with something
like a deep Q-network on visual input. I recommend taking a look at the CIG
2016 challenges.

~~~
Houshalter
This is not normal NEAT. It uses backprop, so it should train much faster.
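
Evolution only proposes structures; the weights of each candidate get fit by
gradient descent before its fitness is read off, so evolution never has to
search over weight values itself. A self-contained toy illustration (not the
OP's code; a one-input net whose genome just picks the number of hidden
units):

    // Toy illustration of the hybrid idea: the genome decides structure (here
    // only the number of tanh hidden units) and plain gradient descent fits
    // the weights before the fitness is measured.
    function backpropFitness(genome, data, steps) {
      var H = genome.hiddenUnits, lr = 0.05;
      var w1 = [], w2 = [];                     // 1 input -> H hidden -> 1 output
      for (var i = 0; i < H; i++) {
        w1.push(Math.random() * 2 - 1);
        w2.push(Math.random() * 2 - 1);
      }
      var loss = 0;
      for (var s = 0; s < steps; s++) {
        loss = 0;
        data.forEach(function (ex) {
          var h = w1.map(function (w) { return Math.tanh(w * ex.x); });
          var y = h.reduce(function (acc, hi, i) { return acc + w2[i] * hi; }, 0);
          var err = y - ex.y;
          loss += err * err;
          for (var i = 0; i < H; i++) {         // backprop the squared error
            var gradW1 = err * w2[i] * (1 - h[i] * h[i]) * ex.x;
            w2[i] -= lr * err * h[i];
            w1[i] -= lr * gradW1;
          }
        });
      }
      return -loss / data.length;               // evolution only sees this number
    }

    // e.g. backpropFitness({ hiddenUnits: 4 },
    //                      [{ x: -1, y: 1 }, { x: 0, y: 0 }, { x: 1, y: 1 }], 200)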

