
Neural Complete – A neural network that autocompletes neural network code - kootenpv
https://github.com/kootenpv/neural_complete
======
minimaxir
A note: looking at the code, this isn't a seq2seq Keras model. The core model
code is a fork of the base Keras text generation example
(https://github.com/fchollet/keras/blob/master/examples/lstm_text_generation.py),
which works like a char-rnn: the previous 80 characters predict the 81st
character, and each generated character is fed back into the model. The server
keeps predicting characters until it hits a break character, then serves the
generated characters to the user.
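
A minimal sketch of that sampling loop (names are illustrative; it assumes a
trained Keras char-rnn model, char/index lookup tables, and a seed of at least
80 characters):

    import numpy as np

    WINDOW = 80  # the model sees the previous 80 characters

    def generate_until_break(model, seed, chars, char_to_idx, break_chars='\n'):
        # Repeatedly predict the next character and feed it back in,
        # stopping once the model emits a break character.
        generated = seed  # assumes len(seed) >= WINDOW
        while True:
            # one-hot encode the last WINDOW characters
            x = np.zeros((1, WINDOW, len(chars)))
            for t, ch in enumerate(generated[-WINDOW:]):
                x[0, t, char_to_idx[ch]] = 1.0
            preds = model.predict(x, verbose=0)[0]
            # greedy decoding here; the Keras example samples with a temperature
            next_char = chars[np.argmax(preds)]
            generated += next_char
            if next_char in break_chars:
                return generated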

In a seq2seq implementation, you need to predict all output characters
_simultaneously_ (i.e. the model input is all characters in a line, and the
model output is all characters in the next line), which in Keras involves a
TimeDistributed(Dense()) layer (see the Keras seq2seq example:
https://github.com/fchollet/keras/blob/master/examples/addition_rnn.py).
This also requires more sequence ETL and a _lot_ more training time.
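
For reference, the model shape in that example looks roughly like this (layer
sizes are illustrative, not taken from the repo):

    from keras.models import Sequential
    from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense, Activation

    INPUT_LEN, OUTPUT_LEN, N_CHARS = 80, 80, 100  # illustrative sizes

    model = Sequential()
    # encoder: read the whole input line into one fixed-size vector
    model.add(LSTM(128, input_shape=(INPUT_LEN, N_CHARS)))
    # repeat that vector once per output timestep
    model.add(RepeatVector(OUTPUT_LEN))
    # decoder: emit a hidden state for every output position
    model.add(LSTM(128, return_sequences=True))
    # the same softmax classifier is applied at each timestep, so all
    # output characters come out of a single forward pass
    model.add(TimeDistributed(Dense(N_CHARS)))
    model.add(Activation('softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam')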

~~~
kootenpv
You are totally right: I was mixing it up with another project I'm working on
where I _am_ using seq2seq (using only TensorFlow) :) I will update the text
of the repo. Thank you!

------
asrp
This reminds me a bit of auto-sklearn [1], which automatically selects the
machine learning algorithm to use and its parameters (so it isn't quite
working at the code level like this one).

[1] https://github.com/automl/auto-sklearn
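
Its high-level API is sklearn-compatible; a rough sketch (check the
auto-sklearn docs for current options):

    import autosklearn.classification
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

    # searches over sklearn estimators and their hyperparameters automatically
    automl = autosklearn.classification.AutoSklearnClassifier()
    automl.fit(X_train, y_train)
    print(automl.score(X_test, y_test))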

~~~
kootenpv
Yeah, I also have a similar project in the making:
[https://github.com/kootenpv/xtoy](https://github.com/kootenpv/xtoy) . This
one searches for a machine learning model using evolutionary search, but
mainly focuses on taking any kind of data (missing data, text data, date/time
data, etc.) and coming up with a prediction. It's a lot of fun :)
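
The evolutionary part boils down to something like this toy loop (not xtoy's
actual code, just the general idea: score a population of hyperparameter
"genomes" by cross-validation and keep mutating the survivors):

    import random
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def random_genome():
        # a genome here is just a hyperparameter dict
        return {'n_estimators': random.choice([10, 50, 100]),
                'max_depth': random.choice([3, 5, None])}

    def mutate(genome):
        child = dict(genome)
        key = random.choice(list(child))
        child[key] = random_genome()[key]
        return child

    def evolve(X, y, generations=5, pop_size=8):
        population = [random_genome() for _ in range(pop_size)]
        for _ in range(generations):
            # rank by cross-validated score
            population.sort(key=lambda g: cross_val_score(
                RandomForestClassifier(**g), X, y, cv=3).mean(), reverse=True)
            survivors = population[:pop_size // 2]        # selection
            children = [mutate(random.choice(survivors))  # variation
                        for _ in range(pop_size - len(survivors))]
            population = survivors + children
        return population[0]  # best genome found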

------
potomushto
> It would be very fun to experiment with a future model in which it will use
> the Python AST and take variable naming out of the equation.

So what if we used the AST as the source for the code structure? There is also
other metadata, such as the filename (e.g. reducer.js), the path
(./components), project dependencies (package.json for JavaScript projects),
and the number of GitHub stars and forks.
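
The variable-renaming idea could be as simple as an ast.NodeTransformer that
maps every identifier to a canonical placeholder (a hypothetical sketch, not
anything from the project; note it naively renames builtins like print too):

    import ast

    class NormalizeNames(ast.NodeTransformer):
        # Rewrite every variable name to var_0, var_1, ... so a model
        # learns code structure rather than identifier spellings.
        def __init__(self):
            self.mapping = {}

        def visit_Name(self, node):
            if node.id not in self.mapping:
                self.mapping[node.id] = 'var_%d' % len(self.mapping)
            node.id = self.mapping[node.id]
            return node

    source = "total = price * quantity\nprint(total)"
    tree = NormalizeNames().visit(ast.parse(source))
    print(ast.unparse(tree))  # ast.unparse needs Python 3.9+
    # var_0 = var_1 * var_2
    # var_3(var_0)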

~~~
kootenpv
Yes, these things should ideally be taken into account. I have been contacted
by a company that is actually working on it:
[http://sourced.tech/](http://sourced.tech/)

------
teomoura
and so it begins

------
z3t4
I imagine a future where the programmer is just there in case the AI makes a
mistake.

~~~
iloveneptune
> in case the AI makes a mistake

such as using tabs for indentation

~~~
jboggan
At some point in the future there will be a silicidal holy war between
different sentient AIs and it will be fought over Tabs versus Spaces.

~~~
kootenpv
Haha! That's gonna be scary.

------
m-j-fox
I will train on Tim Pope's dataset and see if the computer comes up with a
useful plugin.

