
Will humans write code in 2040? - dbennett
https://arxiv.org/abs/1712.00676
======
vadimberman
I mostly get downvoted for questioning the AI hype, but I've never tried it
here on HN.

The paper is... how do I put it mildly... not in the realm of making sense.

They mention simple question answering (which still stumbles on more complex
questions, even structured ones), then code generators that have existed
since the 1980s (at least), and then pretend the two are the same thing.

Today's AI is about approximating human judgment from datasets. The knowledge
representation part, which is essential for tasks like coding, has not
advanced much in the past decade.

This statement, however, takes the cake:

> Some early results from Facebook this year suggest that machines are capable
> of developing their own more efficient methods of communications

It's about the hype created by technically illiterate journos. (My TechCrunch
piece on that: [https://techcrunch.com/2017/09/06/the-secret-language-of-chatbots/](https://techcrunch.com/2017/09/06/the-secret-language-of-chatbots/))
Could the authors have done some minimal due diligence before making this
kind of claim?

Re: the subject, if anything, coding today requires a much higher level of
abstract thinking. Of course, there are tools for coding Oompa Loompas too,
like WordPress, but there are a lot of human decisions to be made there, too.

~~~
wslh
I think it is not about AI but about improving our work with new tools. There
are zillions of developers doing the same thing every day, making the same
mistakes, and a lot of this work can be encapsulated at different abstraction
levels.

~~~
malux85
I agree, the level of abstraction will just change. We might end up writing
something more like formal logic in a declarative language, and the AI will
then implement and optimise it.

Kind of like the way a compiler generates optimised machine code from our
higher-level languages now.
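
A toy sketch of the idea in Python, using the z3 solver purely as a stand-in
for that hypothetical AI (the constraints are made up for illustration): you
declare what must hold, and the engine works out how to satisfy it.

    # "Declare the what, let the machine find the how" (pip install z3-solver)
    from z3 import Int, Solver, sat

    x, y = Int('x'), Int('y')
    s = Solver()
    # The declarative "spec": what must hold, not how to compute it.
    s.add(x + y == 10, x - y == 4, x > 0, y > 0)

    if s.check() == sat:
        m = s.model()
        print(m[x], m[y])  # the solver "implements" the spec: 7 3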

~~~
npgatech
Also don't forget: we still have to write certain routines in assembly even
after so many years. Adding abstraction layers doesn't mean complete
automation.

~~~
zitterbewegung
Going from high-level to low-level code will not go away. But for the most
part you would probably try to get something working first and then optimize it.
[http://wiki.c2.com/?PrematureOptimization](http://wiki.c2.com/?PrematureOptimization)

------
maemre
I think they will, and the paper is too naive in assuming that the techniques
can scale and that computers can do the translation from a vague list of
requirements to a precise specification. This comic[1] explains the issue
well, I think. There is a lot of hot new research going on in program
synthesis; most of the stuff I see is about one of the following:

\- Programming by example, where we give some example inputs and outputs and
let the machine synthesize a program generalizing the input-output function
(a toy sketch follows at the end of this comment). It works well for small,
pure mathematical functions without many subtle edge cases, but I don't think
we can make it handle the edge cases as easily, at least not without human
intervention that would be analogous to programming the synthesizer.

\- Synthesis from specification, where we give a tight spec of the function
we want plus all the available functions, then ask the computer to synthesize
the program. Some researchers have produced very cool examples of this
working, e.g. by taking specifications encoded with dependent types and
generating conforming programs. The catch here is that if the spec is not
precise enough, you may end up with a non-conforming function. And who is
going to write such a precise spec? The programmer!

[1]: [http://www.commitstrip.com/en/2016/08/25/a-very-comprehensive-and-precise-spec/](http://www.commitstrip.com/en/2016/08/25/a-very-comprehensive-and-precise-spec/)
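
The toy sketch promised above: brute-force programming by example over a
tiny made-up DSL, in Python. Real synthesizers (FlashFill and friends) prune
the search far more cleverly, but the shape of the problem is the same.

    # Enumerate short pipelines over a toy DSL until one fits every example.
    from itertools import product

    DSL = {
        'inc':    lambda x: x + 1,
        'double': lambda x: x * 2,
        'square': lambda x: x * x,
        'neg':    lambda x: -x,
    }

    def run(names, x):
        for name in names:
            x = DSL[name](x)
        return x

    def synthesize(examples, max_depth=3):
        for depth in range(1, max_depth + 1):
            for names in product(DSL, repeat=depth):
                if all(run(names, i) == o for i, o in examples):
                    return names
        return None

    # Learn (x * 2) + 1 purely from input/output pairs:
    print(synthesize([(1, 3), (2, 5), (10, 21)]))  # ('double', 'inc')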

------
skywhopper
Turns out, machines already do write most or really all of the “code”, and
we use higher-level, more human-like languages to tell magic programs what
code to generate. Those magic programs are called compilers and interpreters,
and they already do a lot to protect humans from the extreme heterogeneity of
hardware.
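
To make that concrete with a stdlib-only Python example: a human writes one
readable line, and the machine emits the actual instruction stream.

    # The CPython compiler already "writes" the low-level code for us.
    import dis

    def add(a, b):
        return a + b

    # Prints LOAD_FAST a, LOAD_FAST b, BINARY_ADD (BINARY_OP on newer
    # Pythons), RETURN_VALUE: code no human typed.
    dis.dis(add)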

~~~
mr_toad
There has been research going back to the very early stages of ML to apply it
to compiler heuristics. For example, applying decision trees to loop
unrolling.
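
A minimal sketch of what that could look like; the features, labels and
library choice here are all illustrative, not drawn from any real compiler
study.

    # ML-driven compiler heuristic, toy version (pip install scikit-learn).
    from sklearn.tree import DecisionTreeClassifier

    # Made-up features: [trip_count, body_size_in_instructions, has_call]
    X = [[4, 10, 0], [1000, 8, 0], [16, 50, 1], [8, 12, 0], [2, 200, 1]]
    y = [1, 1, 0, 1, 0]  # 1 = unroll, 0 = don't (hypothetical labels)

    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(clf.predict([[32, 9, 0]]))  # likely [1]: small body, no call inside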

------
glangdale
The idea seems a bit outlandish, but I could easily see many coding jobs
vanish. We wrote a performance-oriented library (a regex matcher) and invested
huge numbers of person-hours tweaking and tuning and building new, faster
subsystems. It's not a giant leap of faith to imagine a 'Gold' version
(correct but not fast) being written by a human and most of the
tuning/tweaking/test generation/etc being automated.

I doubt that there is a simple AI approach to generate significant programs
out of thin air, but I suspect a lot of the mechanical work of making programs
fast/robust/small/whatever could be automated.
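
A toy sketch of that loop in Python (the candidates here are trivial
stand-ins for real subsystems): every candidate must agree with the Gold
version on the test set, and the fastest survivor wins.

    # "Gold version + automated tuning": correctness-checked, then timed.
    import timeit

    def gold_popcount(x):        # correct but not fast
        return sum((x >> i) & 1 for i in range(64))

    def cand_kernighan(x):       # clears the lowest set bit each step
        n = 0
        while x:
            x &= x - 1
            n += 1
        return n

    def cand_str(x):
        return bin(x).count('1')

    tests = [0, 1, 0xDEADBEEF, 2**63 - 1]
    correct = [f for f in (cand_kernighan, cand_str)
               if all(f(t) == gold_popcount(t) for t in tests)]
    best = min(correct,
               key=lambda f: timeit.timeit(lambda: f(0xDEADBEEF), number=20000))
    print(best.__name__)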

Sounds like a fun startup, honestly.

~~~
simplyluke
Sounds a lot like a compiler.

~~~
glangdale
Hah. It's a fair cop.

That being said, the system I had in mind was considerably more elaborate
and extensive than you would expect from a compiler, unless your -O flags
were numerically open-ended (-O99, anyone?).

------
filleokus
Fundamentally, I see programming as the task of describing what a computer
should do. The way we create these instructions has definitely changed over
the past 20 years, especially with the large open source movement and stuff
like pip and npm. We have higher levels of abstraction now; the LEGO blocks
we assemble are larger and more reusable.

In my experience it feels like most of the low-hanging fruit, when it comes
to abstractions, has already been picked. The databases, ORMs, web servers,
UI frameworks, parsers/encoders, crypto stuff etc. are all pre-built LEGO
pieces I use and connect with each other. And it feels like the connection
part is my main job.

I would predict that we will code mostly the same way in 2040 as we do now,
at approximately the same level of abstraction, but with much better tooling.
I want an AI that can continuously read my code as I write it and tell me
when I've made a stupid mistake and turned the LEGO block the wrong way.
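
A primitive taste of that already fits in a few lines of stdlib Python;
here's a toy sketch that flags one classic wrong-way LEGO block, the mutable
default argument.

    # Toy "read my code and flag mistakes" pass, stdlib only.
    import ast

    SOURCE = """
    def append_to(item, bucket=[]):
        bucket.append(item)
        return bucket
    """

    for node in ast.walk(ast.parse(SOURCE)):
        if isinstance(node, ast.FunctionDef):
            for default in node.args.defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    print(f"line {default.lineno}: mutable default in {node.name}()")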

Also, isn't a main problem of professional software development
understanding what another human wants me to instruct the computer to do? If
my project leader / customer can't manage to instruct another human on how
the program should work, can we expect an AI to do better? Of course, the AI
can respond quicker on Slack, ask questions based on knowledge it has from
previous projects, and have sane defaults based on experience, but still.
I'm sceptical.

~~~
jsmthrowaway
> The databases, ORMs, web servers, UI-frameworks, parsers/encoders, crypto
> stuff etc. are all pre-built LEGO pieces I use and connect with each other.

Keep in mind there is far more programming than Web programming in the world.
It's important to keep that perspective when ruminating on programming at
large, and based on the pieces you mentioned, it sounds like you're limiting
yourself to that mindset. (That's OK.)

~~~
filleokus
Yeah, of course. It's what I mostly do these days. As mentioned in other
threads, some people still work at a much lower abstraction level, and even
hand-optimise stuff that is really critical and can't be left to a stupid
compiler.

But isn't the trend towards larger (/ "more useful") abstractions spreading
even outside the sphere of web stuff? I'm thinking of things like game
engines, ML frameworks, or all the standard libraries for mobile platforms.

I would really like to know what the life of a network card firmware
developer is like, or that of someone writing drivers for graphics cards, or
software for highly reliable systems like cars. I guess their work is quite
different from mine.

------
drawkbox
There are already many layers of hardware/software that we don't have to
code in order to use, all the way down the OSI stack. However, the topmost
layer will always require programming and architecture, because that is
where the innovation that matters to humans resides.

Even machine learning, big data, artificial intelligence, language processing
and more all take some initial design and goals, i.e. you have to train the
machine learning model and architect what it is used for.

AI/automation/generation are currently great at reproduction, but not at the
inception/prototyping/creative aspects of development. There are areas that
constantly move into automated layers, though, where machine code is better.
For instance, optimizations like WebAssembly are solidifying application-layer
standards/platforms that may not need to be built by humans much longer, just
as assembly wasn't used as much directly by programmers once
C/C++/Objective-C came along in that wave in the 80s. When layers are
standardized and automated, programmers just move up the stack to the new
blue ocean of development built on top of everything else.

People have been saying for a long time that visual coding will take over
actual coding, and the same goes for automation/generation/AI. The automation
still takes architecture, choreography, creativity and innovation to come up
with something. Coding is automation once it is developed and live. Coding is
part of the takeover, and there will always be a topmost layer that AI can't
do for some time, and possibly never on the creativity side, for things that
matter to a human or market need.

Programmers are also the first to really take advantage of AI. I am hoping
for a future where small teams or individuals can build AI armies that code
the way they want on a massive scale. Here's to coding for your bot army that
will do your bidding in the future and keep a small team competitive through
the help of automation, machine learning and artificial intelligence.

------
flukus
Programmers have been trying to make programmers redundant for as long as
our industry has existed, but have made very little progress in doing so
(none, if you don't count productivity improvements).

I've yet to see a codeless platform that can handle anything more complicated
than a sick-leave form.

------
ggm
Feels like one of those particular/general cases. In particular, few people
routinely code in assembler any more. I imagine you can do degree-level
courses in "computing" which don't even go there. I know you can do courses
with no VLSI or microelectronics, both of which were required subjects on many
courses at one point.

So yes, in particular terms, we don't do things which machines (programming
language compilers, VHDL->VLSI compilers) can do better.

But in the "general" case, no: we still "do" these things. They're just
specialized.

So in the particular case, will everyone write code in imperative languages?
No. But in the general case, yes, people will still write programs.

"Siri, I've lost my wallet: ask Roomba to sweep the floor one time,
exhaustively, and see if it's stuck under a piece of furniture" is, at one
remove, "code".

~~~
cjalmeida
One thing I can see happening is assisted UI design for websites. A lot of
manpower today is spent going from PSDs or wireframes to HTML/CSS.

It's quite feasible that, given a sketch and some style transfer, an AI could
output decent HTML and even some scaffolding for your JS framework of choice.
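
Even setting the AI part aside, the last mile is mechanical. A toy sketch in
Python, assuming a hypothetical recognizer that emits a nested layout
description; rendering it to scaffolding is the easy half.

    # Render a layout tree (the kind a hypothetical sketch-recognition
    # model might emit) to HTML scaffolding.
    layout = {"tag": "div", "cls": "page", "children": [
        {"tag": "nav", "cls": "navbar", "children": []},
        {"tag": "main", "cls": "content", "children": [
            {"tag": "button", "cls": "cta", "children": []},
        ]},
    ]}

    def to_html(node, depth=0):
        pad = "  " * depth
        kids = "".join(to_html(c, depth + 1) for c in node["children"])
        return (f'{pad}<{node["tag"]} class="{node["cls"]}">\n'
                f'{kids}{pad}</{node["tag"]}>\n')

    print(to_html(layout))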

------
mythrwy
In the future humans won't have to write code because machines will do it all.

So all that will be left for humans is setting up configs, like webpack. This
will take a tremendous amount of time and effort. So there will be huge
demand for configers, and thus configer bootcamps, and Medium articles on how
there is so much sexism in the configing industry, and others bemoaning the
prevalence of config-bros.

Sarcasm aside, at the end of the day someone has to tell a machine what is
wanted. Unless machines get much smarter, and as near as I can tell they
aren't about to get that smart any time soon. Those someones are called
programmers.

------
userbinator
_that machines, instead of humans, will write most of their own code by 2040_

Regardless of whether it actually happens, a future where humans can no
longer have meaningfully detailed control over what machines do (to them)
sounds extremely dystopian, and certainly much sci-fi has been based on that
vision, so I hope it _doesn't_ happen...

~~~
cjalmeida
This already happens. Compilers and interpreters hide most of the details.

Unless you're writing embedded code, you usually don't have a say in a lot of
what's going on: things like memory allocation, scheduling, interrupts,
addressing.

Things like Kubernetes take this even further.

------
dep_b
That AI would ask the client / designer "what the hell do you mean by this?"
until an implementable design emerges. The client / designer then decides
it's much more efficient to throw a half-baked idea over the fence and blame
a human programmer afterwards.

------
bfirsh
If you’re on a phone, here’s an HTML version of the article:
[https://www.arxiv-vanity.com/papers/1712.00676/](https://www.arxiv-vanity.com/papers/1712.00676/)

------
nothis
The more I think about this, the more the only counter-argument seems to be
that natural language isn't precise enough. But all coding tasks start with a
natural language description at some point. Hmmm.

------
xbmcuser
I think humans will still be writing code 40 years from now, but not as they
do today, because natural language processing will get far enough that you
can tell an AI what you require and it will write the code for you. When a
tech advances, the number of its users increases but the percentage who
understand its internals drops. Programming is no special snowflake to be
protected from this; the same will happen to it. We are already getting small
hints of this with specialized algorithms that optimize code or find bugs.

~~~
michaelmrose
Citation needed. Decades ago, human-level AI was coming "real soon now" and
COBOL was going to let managers write code without needing programmers.

Are you specifically asserting that the need to write code will be obviated
by human-level or greater AI? That seems a safe prediction in the long run,
although I cannot imagine how we could accurately establish a timeline.

------
trophycase
Yes. Technological progress is almost always slower than people imagine.

------
jonahx
Nice counterexample to Betteridge's law.

------
robertc2017
Yes.

------
shahbaby
It's called Prolog.

~~~
convolvatron
It's closer. In the end, I think it's too fussy. What do you think of the
Datalog-inspired languages?

------
juanmirocks
In the end, the only remaining jobs will be those that require interacting
with other humans.

~~~
blank_2
What current jobs don't require interaction with other humans?

~~~
juanmirocks
Well, better phrased: jobs that MOSTLY require interaction with other humans.

