
DeepCoder: Learning to Write Programs - lopespm
https://arxiv.org/abs/1611.01989
======
SNBasti
Well, it's obvious that there will be improvements in how we specify computer
programs, just as there have been over the last decades.

e.g. Assembler -> C -> C++

There was recently a post on HN about the missing programming paradigm
([http://wiki.c2.com/?ThereAreExactlyThreeParadigms](http://wiki.c2.com/?ThereAreExactlyThreeParadigms)).
With the emergence of smarter tools, programming will get easier in one way or
another, relieving the coder of a lot of pain (as C and C++ relieved us of
tedious, painful assembler). However, I am quite sure that it won't replace
programmers, since our job is actually not to code but to solve a given
problem with a range of tools. Smarter tools will probably boost the
productivity of a single person to handle bigger and more complex
architectures, or other kinds of new problem areas will come up. Research will
go faster. Products will get developed faster. Everything will kind of speed
up. Nevertheless, the problems to solve / implement will remain until there's
some kind of GAI. If there's a GAI smart enough to solve our problems,
probably most jobs have been replaced by then.

~~~
petra
>> our job is actually not to code but more to solve a given problem with a
range of tools.

Actually, often the most valuable thing a programmer can do is build a
tool/DSL that lets the domain experts (or cheaper labor) solve problems.

So if a programmer just solves problems, he will probably have some serious
competition.

~~~
golergka
Best programmers are domain experts. The line will just continue to blur.

~~~
wiz21c
spot on

------
glangdale
It's intriguing, particularly in cases where the code written could be
verified against a specification or at least comprehensively fuzz-tested
against a 'known good' version of the program. A mildly terrifying thought for
a performance programmer - the idea that, perhaps, someone could just write
some sloppy Python code to solve a problem "at some speed" and have an
automated system iterate it into some tuned, SIMD-laden, C/C++ horror-show.

While of course we have optimizing compilers to do this sort of thing now, you
could imagine automated systems that attempt to preserve clarity and
simplicity as they do it - and such a system could work semi-supervised,
iterating with a human in the loop to steer the system towards more
comprehensible solutions.

~~~
petra
Well, for math we're already here:
[http://www.spiralgen.com](http://www.spiralgen.com)

It optimizes math code, both at the math level and the implementation level,
and Intel has used it to optimize their performance libraries.

~~~
amelius
> Well, for math we're already here

No, _they_ are here:

> SpiralGen holds the exclusive license to the Spiral software generation and
> optimization technology.

But I wonder how that can be true if:

> Spiral was developed under the lead of Carnegie Mellon University

~~~
nickpsecurity
It's called Technology Transfer to private sector. Happens all the time. The
CompCert certifying compiler for C was a landmark achievement of INRIA that
lots of CompSci is building on. A company called AbsInt controls the I.P. now.

~~~
petra
The thing I don't get: this seems like useful but rarely used tech.

Even if there are patents involved, I'm sure it isn't patented everywhere
around the globe, and it's just software, so why is there no real
competition? Something like a reasonably priced SaaS service?

------
wybiral
So this is how the uprising happens.

First they came for the factory workers. Then they came for the accountants.
Then they came for the drivers...

Then they came for me.

~~~
tonmoy
I mean, that's how most circuit designers lost their jobs to CAD/EDA tools;
why should software engineers be different?

~~~
kevinnk
Circuit designers didn't lose their jobs to EDA tools any more than software
developers lost their jobs to compilers.

~~~
partycoder
A programmer translates requirements into implementation. Which is translating
natural language into a programming language.

Now, as automation increases, that layer gets thinner each time.

But it will be some time before we see AI capable of doing requirement
analysis, negotiating requirements, etc.

------
argonaut
For actual in-depth technical discussion / criticism of the paper, you can
read its ICLR peer reviews (and other public comments):
[https://openreview.net/forum?id=ByldLrqlx](https://openreview.net/forum?id=ByldLrqlx)

------
ainiriand
Programming is easy if the specs are frozen. I want to see that neural
network discussing inputs/outputs with my managers. Good luck.

~~~
bnegreve
Machine learning algorithms are actually quite good at solving poorly
specified problems. They learn from examples and don't require any definite
problem specification.

For example, recognizing people in a picture is almost impossible to specify
but relatively easy to learn, given enough examples.

Next time you're facing a poorly specified problem, don't ask for a clear
spec; instead, ask for a million examples and train a deep net to solve it :)
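The "spec by examples" point can be made concrete with even the simplest learner: nothing below writes down a rule for what class 1 means, yet a 1-nearest-neighbour model recovers it from labelled points. The data, labels, and the hidden rule are made up for illustration.

```python
# Toy "specification by examples": classify a point by copying the label of
# the closest labelled example. No explicit rule is ever written down.
def nearest_label(examples, x):
    """examples: list of (point, label); returns the label of the closest point."""
    return min(examples, key=lambda e: sum((a - b) ** 2
                                           for a, b in zip(e[0], x)))[1]

# Unstated rule behind the labels: class 1 iff the coordinates sum to > 1.
train = [((0.1, 0.2), 0), ((0.9, 0.8), 1), ((0.2, 0.9), 1), ((0.4, 0.1), 0)]
print(nearest_label(train, (0.85, 0.9)))   # → 1
print(nearest_label(train, (0.05, 0.1)))   # → 0
```

Swap in a deep net and a million examples and the shape of the workflow is the same: examples in, behaviour out, no spec required.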

------
EvgeniyZh
It's interesting that the paper has been on arXiv since November (
[https://arxiv.org/abs/1611.01989](https://arxiv.org/abs/1611.01989) ) and
people only paid attention just now.

~~~
lostmsu
That's because, AFAIK, no meaningful problems have been solved by this
algorithm yet. I'm not even sure it can come up with Max(array) on its own.
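For what it's worth, finding Max(array) from examples is exactly the setting the paper studies: enumerate programs in a small DSL and keep one consistent with the input-output examples (the neural net's job is only to prune that search). Even the unguided search is easy to sketch; the tiny DSL below is my own illustration, not the paper's actual DSL.

```python
# Minimal enumerative program search over a toy list DSL, guided only by
# input-output examples. Primitive set and naming are illustrative.
from itertools import product

PRIMS = {
    "head": lambda xs: xs[0],
    "maximum": max,
    "minimum": min,
    "reverse": lambda xs: list(reversed(xs)),
    "sort": sorted,
}

def run(prog, xs):
    val = xs
    for name in prog:              # apply primitives left to right
        val = PRIMS[name](val)
    return val

def synthesize(examples, max_len=2):
    """Return the shortest primitive sequence matching all examples."""
    for length in range(1, max_len + 1):
        for prog in product(PRIMS, repeat=length):
            try:
                if all(run(prog, i) == o for i, o in examples):
                    return prog
            except (TypeError, IndexError):
                pass               # ill-typed composition; skip it
    return None

# From two examples alone, the search recovers max-of-array:
print(synthesize([([3, 1, 4], 4), ([2, 7, 5], 7)]))   # → ('maximum',)
```

The catch, and the paper's whole point, is that this enumeration blows up combinatorially as the DSL and program length grow, which is where the learned guidance comes in.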

~~~
zump
Then why's it worth discussion?

Why is there no "general" deep learning algorithm?!

~~~
eli_gottlieb
>Why is there no "general" deep learning algorithm?!

Because "deep learning" is almost always just neural networks with many layers
and some tweaks to the unit types to alleviate vanishing-gradient issues,
trained by stochastic gradient descent. In rather theoretical terms, "deep
learning" involves using gradient descent, and gradient-increasing tweaks, to
search for a specific continuous circuit in a given space of continuous
circuits. It's only as general as the hypothesis class you search through,
which here is, again, continuous "circuits" composed out of "neuron" units of
specific types, numbers, and arrangements.

Now, in computability terms, given a large enough (continuous or discrete)
circuit, possibly a recurrent one, you can represent any computable function.
However, in learning terms, that doesn't make a _useful_ computable function
at all easy to find in a very-high dimensional space.
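The "search over continuous circuits" framing can be made concrete: a two-layer network trained by gradient descent is literally a descent through a parameterized circuit space. A minimal NumPy sketch, with the toy problem (XOR), sizes, seed, and learning rate all being arbitrary illustrative choices:

```python
import numpy as np

# Gradient-descent search for a continuous "circuit" (2-layer tanh net)
# that computes XOR. All hyperparameters here are arbitrary.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0., 1., (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0., 1., (8, 1)); b2 = np.zeros(1)

def forward():
    h = np.tanh(X @ W1 + b1)       # hidden layer of "neuron" units
    return h, h @ W2 + b2          # linear output unit

_, p = forward()
initial_loss = float(((p - y) ** 2).mean())

for _ in range(2000):
    h, p = forward()
    dp = 2 * (p - y) / len(X)      # gradient of mean squared error
    dh = (dp @ W2.T) * (1 - h ** 2)  # backprop through tanh
    W2 -= 0.1 * h.T @ dp; b2 -= 0.1 * dp.sum(0)
    W1 -= 0.1 * X.T @ dh; b1 -= 0.1 * dh.sum(0)

_, p = forward()
final_loss = float(((p - y) ** 2).mean())
print(initial_loss, "->", final_loss)
```

Nothing here searches over discrete programs; the hypothesis class is exactly the fixed circuit family picked up front, which is the generality limit described above.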

------
vikiomega9
Would a potential roadblock for such systems be automatic verifiability?
Consider a system that provides a ranked list of possible code snippets:
someone would still need to pick from those choices and test them.

~~~
tluyben2
Just like when humans program: you write code and have unit tests (if you are
lucky). What is different about this? It has the same inputs / outputs, so the
unit tests will still be there. Formal verification would be better, i.e.
having a model of the original and having the computer prove that the new
version is mathematically identical to it. But both the formal verification
and the transformation proof are far off for almost all software projects in
practice.

~~~
dualogy
Who's writing the unit tests, the same bot as writes the code? The customer?

~~~
tluyben2
The customer. The bot cannot do that (and if it could, it would not need to I
guess).

------
faragon
My two cents:

1) Write programs easy to understand at first glance

2) Write code easy to delete

3) Write considering costs: target zero (or as low as possible) maintenance
costs

4) Write code for both fun and profit

------
qeternity
> The components of LIPS are (1) a DSL specification

LIPS with DSLs you say?

------
sgt101
Surprised that Muggleton and ILP aren't referenced.

