
The future is fewer people writing code? - pratap103
https://techcrunch.com/2016/07/22/dear-google-the-future-is-fewer-people-writing-code/
======
dasil003
Non-programmers make this mistake all the time: thinking that the syntax is
the hard part of programming.

No, the hard part of programming is understanding in very specific and rigid
details how to accomplish a task. What the author doesn't realize is the
enormous amount of processing power, shared culture and empathy that goes into
human interaction.

Even mighty Google doesn't have the compute power or the architecture to
replicate this. Until computers can _understand_ humans at a human level,
this will not be possible. Several breakthroughs will be necessary, and even
then I expect decades or centuries before it happens, if it happens at all.
Essentially we're talking about the Singularity.

~~~
bdrool
> Non-programmers make this mistake all the time: thinking that the syntax is
> the hard part of programming.

Programmers make it all the time too, or at least something very similar. See
all those people who are on the endless quest for the "perfect" language,
which usually means one which allows for writing the shortest code, sometimes
at a very high cognitive cost for very little payoff. No, I really don't want
to do a massive amount of mental backflips just to save a few characters,
thanks.

~~~
brianwawok
Do you write in assembly?

If not, you clearly see some gain in more terse syntax.

Seems like a good thing to explore, although perhaps it should not be one's
sole focus in life.

~~~
openasocket
The difference comes down to syntax vs. semantics. The problem with assembly
isn't its syntax (mov eax, ebx is fairly readable), it's that the underlying
semantics are too low level. Exploring new abstractions/semantics to use is
very useful (local variables, first-class functions, algebraic data types,
etc.), but optimizing solely for source program size leads to a language
that's great for code golf and not much else.
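
To make the distinction concrete, here is a minimal sketch (variable names
made up) of the same computation at both levels, with the assembly shown in
comments since the point is semantics, not syntax:

    // Low-level semantics, spelled out move by move:
    //   mov eax, [x]
    //   add eax, [y]
    //   mov [sum], eax
    // Higher-level semantics (local variables, expressions) say the same
    // thing in one line:
    const x = 2, y = 3;
    const sum = x + y;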

~~~
brianwawok
Not sure. I mean, yes, you can abuse some languages and fit an entire program
on one line (see any golf project). But I can generally grok a 100-line Scala
program in the same time as a 100-line Java program, and the Scala program is
usually about twice as dense. (So I am grokking Scala twice as fast.)

~~~
i_are_smart
I generally find this to be true for myself, too. Although I would add the
qualifier that I find Java to be an exceptionally wordy language, for lack of
a better term. It seems to take a lot of talk to do a little in Java even
compared to other object oriented languages.

------
pmlnr
“50 years from now, I can’t imagine people programming as we do today. It just
can’t be."

Dear writer, let me introduce you to FORTRAN, COBOL, LISP, and BASIC:
languages that are all still alive, and all 50+ years old.

Coding hasn't changed much. The languages, the methodologies, the ideas
change, but the approach is the same, and whoever thinks this will change
soon (50 years is not _that_ far away) has never had to debug something
nasty. Doing that with voice commands would, in my opinion, be significantly
harder than what we have now.

We will have accessible, easy tools, the Arduinos and Pis of the future,
sure. But they will not replace, eliminate, or reduce the amount of code
written.

~~~
nickbauman
There's a serious gap in the writer's mind about computation and programming.
It's like the author is suggesting that "eventually we won't need writing: it
will be replaced by writing-thinking or picture-writing". It's completely
absurd. Specific, complex ideas can only be described and communicated in
text. Not pictures. Blueprints, for example, have a pictorial element to them,
but their fundamental value is our ability to use the formal _language_ to
analyze what's on the plan and whether it is correct or not. A picture or a
motion graphic can formally accomplish this only to the degree that it is
supported by a specific language under the covers. Not the other way around.

~~~
imgabe
Blueprints/schematics are far, far superior at conveying the information they
do compared to a written narrative. Given how much easier it is to prepare
written text than to draw schematics, nobody would go to the trouble of
drawing them if that weren't the case.

~~~
jrapdx3
An interesting angle on the topic comes from my father, who at one time was a
project manager and designer in the construction industry. In the days before
computers he would painstakingly hand-draw the designs that were reproduced
as blueprints; that was the role of the "draftsman".

But the drawing wasn't the source of stress; rather, it was the project
"specification" that he sweated over. The spec was a _legal, text-format
document_ detailing the size of beams, type of wire, plumbing, fixtures, etc.
He had to ensure that beams were sufficient to support the structure, that
electrical wiring was safe and up to code, and so on. A mistake could expose
the contractor and himself to legal liability if a component failed, so an
accurate spec was a task he took seriously.

Of course, the subject of program specifications is commonly discussed,
though it often doesn't have the significance that my father experienced. I
guess in most cases a program crash doesn't have the impact that a roof
caving in would. In situations where crashing can't be tolerated, the spec
means a whole lot more.

~~~
imgabe
I work in the same construction design industry. The drawings themselves are
also contractually binding. Many smaller jobs forgo the written specifications
altogether.

~~~
jrapdx3
My father mostly worked on larger projects, tract housing and the like. Of
course, times change; my recollection is of how things were a long time ago.
My comment was just illustrating an instance where relying on a text
description was still important even though there was a graphic format as
well.

Your info is relevant to the idea that at some level of complexity it becomes
necessary to use text rather than graphics alone. Maybe in construction that
occurs when there are more than a few elevations to juggle, but you probably
know much more about it than I do.

------
winstonewert
Dear article writer,

Natural language sucks: it is ambiguous, difficult to manipulate, verbose,
and has too many non-functional degrees of freedom. After all, that's why
mathematics left natural language behind and adopted the mathematical syntax
we have today.

Diagrams suck: they are ambiguous, difficult to manipulate, verbose, and have
too many non-functional degrees of freedom. That's why cookbooks don't use
diagrams to describe recipes.

Syntax will never die; it is the only sensible means we have to define
programs.

~~~
n00b101
> Diagrams suck

And yet anytime two or more programmers get together to talk about what they
are creating, they start drawing diagrams on whiteboards.

~~~
winstonewert
And it is on a whiteboard because it is not useful enough to record in a
longer-term medium.

I'm not saying diagrams are useless, they just make a poor substitute for
syntax.

~~~
krapht
I'm going to disagree with that. Every day I wish I could intermix textual
and pictorial representations of logic in the programming I do. In
particular, any series of computations that can be represented as a directed
graph, e.g. a streaming data workflow or a state machine, is much more easily
understood pictorially than textually.

The flowchart and the decision tree exist for a reason: to describe
algorithms.
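
For what it's worth, here's a minimal sketch (states and events invented for
the example) of a three-state machine as a plain JavaScript transition table.
The graph is all in there, but you have to replay it in your head instead of
seeing it:

    // A state machine as a transition table. Drawn as a diagram, this is
    // three circles and four labeled arrows, readable at a glance.
    const transitions = {
      idle:    { start: "running" },
      running: { pause: "idle", finish: "done" },
      done:    { reset: "idle" },
    };

    function step(state, event) {
      const next = transitions[state] && transitions[state][event];
      if (!next) throw new Error("invalid event " + event + " in state " + state);
      return next;
    }

    console.log(step("idle", "start")); // "running"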

~~~
pessimizer
> In particular, any series of computations that can be represented as a
> directed graph, e.g. a streaming data workflow, or state machine, is much
> more easily understood pictorially than textually.

As long as it is very simple. Electronics already has a highly developed
visual language for describing function - but if what goes on inside every
chip were illustrated the same way as what goes on between chips, the result
would be entirely unintelligible. Instead, any visual representation is at a
particular scale, and well-known portions are represented as blocks with
cryptic textual notes next to each interface (ACK, EN, V0+, CLK, PT2, HVSD,
WTFBBQ, etc.), labels to identify company or type, and an expectation that
you know what they do or can find out on your own (and not an expectation
that you understand how they do it).

Anything simple enough to be completely expressed in human-comprehensible
pictures should be exposed to the user and modifiable (even if not by using
pictures, but forms.) I totally agree, if that's what you and this article are
trying to say. My experiences in trying to encode actual human workflows in
BPMN have taught me that when using pictures it's harder to express things of
any sophistication than in words - because of words like "with" and "each" and
"all", "if" and "when," and because of ways things change over time, and
because of separate but overlapping/interacting flows that languages can
express easily but pictures not so much.

In pictures, that involves looking all over your picture for different things,
trying to figure out how to draw lines to them; if the condition is once or
twice removed from the object of the search, it involves trying to untangle
massive knots with your eyes and memory. Theoretically, that is. What it
involves in practice is scrawling words all over your picture (just like in a
circuit diagram.) Words that express the same types of relationships over time
and type as the picture is trying to express projected onto a plane, words
that could be easily expanded to include those relationships and eliminate the
18 types of lines, the 25 types of shapes, the 12 types of shape borders, the
16 color schemes and the long list of rules for connecting them that had to be
invented to avoid coming up with a textual syntax.

~~~
krapht
Yes, and that's how I would use a language that would allow mixed picture and
text logic flows. At a certain level of abstraction block diagrams greatly
assist understanding program flow, and it is redundant that I have to write
the code and then draw the block diagram later for documentation.

Going back to electronics, I don't think anyone would argue that schematic
block diagrams are inferior to reading the raw netlist. Similarly, I feel
programming could be improved if IDEs for popular langauges would allow
connecting functions together in a streaming manner. Of course, I am aware
this exists, Simulink, LabView, FPGA schematic workflow, but these are niche
languages that I don't work in.

~~~
winstonewert
"I don't think anyone would argue that schematic block diagrams are inferior
to reading the raw netlist."

Well, no, but some may well argue that reading the HDL is better than a
diagram. I have experience working with both HDL and schematics in the FPGA
world, and in my estimation text-based HDL is way better than working with a
diagram.

Of course, YMMV; my brain may just be more optimized for processing text
instead of images.

~~~
wott
Many times I have wished there were an HDL for PCB design input instead of
schematic tools, now that there are often very few discrete/analog parts on a
board: large chips include almost everything needed, so you mainly spend time
connecting them together, possibly with a bit of plumbing but not much, and
the only remaining discrete components are very repetitive: a ton of similar
decoupling capacitors, pull-up/down resistors, termination resistors, a
couple of voltage-divider resistors and a few other common functions.

That should be a great fit for a textual HDL instead of labouring through a
schematic, mainly linking pins to pins again and again. It would even be much
more expressive, now that we often have chips so big that they cannot be
represented efficiently as a single symbol on a single sheet, but are split
into smaller blocks that look like HDL ports without the flexibility; and now
that µCs, SoCs and other kinds of chips have pins so heavily muxed that they
don't have a clear, expressible function, meaning that grouping them into
blocks is more of a random choice than a good solution. This multiplexing
also means that you'll often have to change, and change again, the
connections of your wires in the schematic, and that would be much easier to
do with an HDL.

-----

That's why my mind was blown when a software job forced me to use a graphical
tool like SCADE. It felt like going 20 years backwards, to when HDLs were not
yet popular in electronics and we had to design FPGAs and such with
schematics. And it was even worse, because the graphical representation looks
parallel and concurrent, as an electronic schematic does, except that this
matches nothing on the software side: the specification/design document you
have to implement is generally sequential, not concurrent, and the generated
code and the way the CPU/computer works are sequential as well, not
concurrent. So you have this weird-looking graphical part in the middle,
which looks parallel but isn't really, and it messes with your brain because
you have to perpetually translate from the sequential specification to it,
and from it to what it really does sequentially.

It was appalling to do this job and discover that they considered it an
improvement on C/Ada/whatever regular programming. And I haven't even
mentioned the tooling, like when what could have been a simple textual diff
turns into an epic nightmare where you are never sure you can trust the
result, if you manage to get a result at all.

------
buckbova
> To get there, programming tools should first use our language. For instance,
> to turn a button red, we shouldn’t have to write code. We should just be
> able to point to the button and select a shade of red.

We've had that for over twenty years.

~~~
throwaway2016a
Yeah, but web developers still tend to do it the "hard way" by editing the
CSS directly. Why do web developers still do it that way even though visual
tools exist?

My guess is that working with text is more efficient. The cognitive load of
finding a setting in the UI, moving your mouse to it, and selecting the value
is far greater than that of just typing.
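
For instance (a minimal sketch of the same task both ways), typing the
one-line rule

    button.warning { background-color: red; }

into a stylesheet takes seconds; the GUI route means finding the element,
finding the color panel, and picking the right swatch.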

~~~
Pfhreak
Why do people play D&D instead of WoW?

Perhaps, in part, because one allows for a greater expressiveness -- albeit
with a little more planning and cognitive load.

~~~
seangrogg
This does make me wonder if developers tend to be involved in more forms of
creative expression (roleplaying, art, etc.) than in creative consumption
(gaming, movies, etc.), what intersection(s) exist and why...

------
tribune
"...so why are we having a serious conversation about grooming children to
become software developers before they’ve even gone to middle school?"

We're not, really, but given the pervasiveness of computing technology we're
recognizing that it's important for children to have some formal experience
with software design concepts regardless of which career path they choose.

I'm a firm believer that at least some coding ability is beneficial in any
profession. It's like writing, or vocabulary; you don't "need" it for some
professions, per se, but being a good writer enhances both your professional
and personal life in many ways, so it's worthwhile to teach. It's much the
same with coding.

~~~
circlefavshape
This. Whether or not we need more professional devs is beside the point. At
coder dojos, for example, they're not teaching design patterns; they're
teaching the basics - assignment, loops, conditionals - the kinds of things
that allow you to automate computation in other fields of endeavour.

~~~
rimantas
There are also a million other skills that would be beneficial.

------
maker1138
Programming is done in code for the same reason mathematics is done in
notation: for specificity.

Doing programming in plain English would be just as cumbersome as doing math
in plain English.
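
To illustrate: in notation, the quadratic formula is x = (-b ± √(b² - 4ac)) /
2a. In plain English it becomes "x equals negative b, plus or minus the
square root of b squared minus four times a times c, all divided by two times
a", which is longer, harder to scan, and much easier to misparse. A whole
program written that way would be exactly that nightmare.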

I don't think anyone can quite fully imagine the nightmare of trying to
program in a recursively enumerable language[1].

[1]
[https://en.wikipedia.org/wiki/Chomsky_hierarchy](https://en.wikipedia.org/wiki/Chomsky_hierarchy)

~~~
sdegutis
Hell, even trying to _explain_ how a relatively simple shopping cart web app
works, unambiguously, in plain English, to an executive, is extremely tedious
and verbose, requires defining a lot of specific terms, and at the end of the
day it still confuses the hell out of him.

~~~
beachstartup
you should work for better executives.

------
codingdave
Some of my coworkers have been using similar concepts to teach K-5 coding in a
few school districts for a few years now. And the big, surprising impact has
not been about the code, but about the problem solving... I'm getting the
stories second-hand, but apparently the concept of debugging problems becomes
so ingrained that they "debug" all their efforts. When they make a mistake in
math, they debug their process to fix it. They debug what is wrong with their
handwriting to improve it. They think of everything as a problem to be solved,
work towards solving it, and are doing wonderfully across all subjects.

So the question of whether these kids will be coders as they grow up really
doesn't seem to be that important of a question -- they are being taught how
to succeed at whatever they try. I'm excited to see these types of programs
move forward and become prevalent throughout our educational system.

~~~
marklubi
> the concept of debugging problems

I've been seeing this same thing happen with my 7 year old son (I'm a
programmer), and I always encourage him to find out _why_ something didn't
work out the way he expected it to.

Whether they grow up to become coders or not, I'll be happy when more people
look for the cause of a problem rather than simply treating the symptoms.

------
mafuyu
I think the author is missing the forest for the trees here. Schools often
don't have much of a computing curriculum, and these classes are great for
improving general programming literacy. And if one kid finds out he really
enjoys it when he wouldn't normally have through school, that's a win. It's
true that most of them won't become programmers, but we don't teach biology
in high school on the assumption that everyone will become a biologist,
either.

------
deathcakes
"To add an example for clarity, think of the field of typography — until the
Digital Age, typography was a specialized occupation. But with new programs
like Microsoft Word coming into existence, typography (e.g. formatting a
document, setting the margins, making sure the lettering is appealing, etc.)
became something everyone could do easily without much thinking."

"Without much thinking" pretty much encapsulates what Word did to
presentation standards, at least in my experience. Let us never forget
WordArt.

Part of me always wants to make the argument that these things are difficult
not just because of abstract syntax and arcane rules, but because they are
genuinely difficult to reason about - attempts to make difficult things
easier by papering over the cracks result in a lot of pain for a lot of
people. Bits ping off and people are left unable to even begin to solve the
problem.

However, the very, very obvious flip side of this is that lowering barriers
to entry is pretty much always a good thing. It invites unconventional
perspectives and novel approaches - how could that be a bad thing? Sure, some
people will make crappy things that shouldn't ever have existed, but by the
same token some people will make great things that never would have existed
without the lowered barriers.

------
lmm
Coding in a good language already consists of writing about the things you
care about, not the things the computer cares about. Text (with a few symbols)
turns out to be the best way to express computations, not to a computer but to
a human reader/maintainer.

The future is most professionals writing code as part of their job, just as
the present is most professionals writing as part of their job.

------
mlashcorp
It saddens me that laypeople equate software development with writing code;
that's like saying an architect just knows how to draw.

~~~
jasonthevillain
Which is an excellent example. Architects still do architecture, the tools
just evolve.

------
dkopi
"The future I imagine is a world in which programming is self-explanatory,
where people talk to computers to build software. To get there, programming
tools should first use our language."

But is

"For every button on the page that is a "warning" button, change the
background color to red."

necessarily better than

$("button.warning").css("background-color", "red") ?

~~~
mrob
Inform 7 is a programming language that looks similar to natural English. As
far as I can tell, the only benefit is tricking beginners into thinking it's
easy. By the time they figure out that it's a normal programming language,
only with extra-verbose syntax, it's too late: they're already a programmer.

If we had a real natural-language-based programming language, it would have
all the problems of law. Laws are written in a very formal style that takes a
lot of training to understand, and despite this they contain enough ambiguity
to support a whole industry of lawyers arguing about them. Making programming
similar to law would not make anything easier.

~~~
AnimalMuppet
For a previous attempt at the same idea, see COBOL. The original design goal
was to make professional programming obsolete.

~~~
ktRolster
Applescript is worth looking at for the same reasons.

------
throwaway2016a
I remember going to a mobile conference in the early 2000s and every single
vendor there was saying that developing mobile apps using UML was the future.
No code, just map out everything in a diagram.

Granted, smartphones were unheard of at that point, so most mobile apps of
the day wouldn't even be called apps by today's standards.

A decade and a half later, mobile developer is a highly skilled _coding_
position.

~~~
ktRolster
_in the early 2000s and every single vendor there was saying that developing
mobile apps using UML was the future. No code, just map out everything in a
diagram._

UML in later versions got so bad that even the original creators disowned it.
It turned out that putting everything into a diagram was just as hard as (and
less convenient than) writing it as code.

------
Kinnard
> For instance, to turn a button red, we shouldn’t have to write code. We
> should just be able to point to the button and select a shade of red.

Someone is going to have to write the code so that the end user can just
click buttons. So if this future of programmatic interfaces is coming, it's
going to require more people writing code to build it - not fewer!

~~~
xigency
Just point to the button and select a shade of being able to point at the
button and click a color. (Yes, that is confusing.)

I don't think the author or the person interviewed has used Visual Basic, or
understands why a developer might not want that.

~~~
Kinnard
Essentially what he's saying is that in the future people will just read and
not write . . .

------
mattnewton
I don't know if I agree with the author, since they didn't seem to provide
any evidence or argument for why the future is codeless. They did warn me
that I wouldn't be able to understand from my vantage point in Silicon
Valley, though, so maybe other people see the argument? From my perspective,
code is only growing, without showing signs of abating.

~~~
madelinecameron
First off, I agree. Everyone who continually says "programming" is dying and
that in 5-10 years "AI" will be writing code (or whatever else they can dream
up) has no idea what they are talking about.

Programming is definitely going to get easier and I don't doubt that it will
become a common skill.

I just doubt this idea that professional programming is a dying art or
something. It is silly to say that because tools that lower the barrier to
entry are becoming more common, the entire profession will soon be dead.

(Also, I am like 75% sure that it was a sponsored article to advertise for the
company named in the article)

~~~
rimantas
Programming won't get any easier. It's like claiming that in the future
thinking will get easier. I'd claim otherwise: given the recent trend of
"outsourcing" knowledge, many brains just don't have enough building blocks
in working memory to do complicated thinking.

------
PaulHoule
Progress happens slowly in the short term and fast in the long term.

In the 1980s, there was a consensus that "software components" enabled by
object orientation were a pipe dream.

They were, so long as you were using C++, which was barely binary-compatible
and where you couldn't reuse objects in a .so file without also having the .h
file. It was awful; not at all a minimum viable product.

Then Java came along and a number of other languages that adopted essentially
the same model for OO programming such as Python, PHP, Ruby, C#, etc.

Now you can cut and paste a few lines of XML into Maven and woohoo... You've
incorporated a software component into your system.
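
(For anyone who hasn't seen it, the stanza in question looks like this; the
artifact and version are just an illustrative example:

    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>19.0</version>
    </dependency>

and the build tool fetches the component, plus its transitive dependencies,
for you.)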

People bitch that it has to be XML, but the sheer ease of doing so means it
is not hard at all to end up with 100+ dependencies in a project, and now the
problem is dealing with the problems that come with having 100+ dependencies.

(And of course the same is true with npm and every other language that has
similar tools.)

Two big themes are: (i) tools that reduce the essential difficulty of software
development and (ii) antiprofessionalism in software engineering.

Compilers like FORTRAN mean you don't need the intimacy with the machine that
you need to write, say, macro assembler. That is mainstream, but other
technologies, such as logic programming and rules engines, are still
stillborn. In theory, tools like those mean the order of execution does not
matter so much, so you don't need the skill to figure out what order to put
the instructions in. In practice, they have yet to become vernacular tools
that are palatable to programmers and non-programmers. (Anything programmers
can't stand will be 10x more painful for non-programmers, I can tell you
that!)

Anti-professionalism is another big theme. Had computers come around 20 years
earlier, we would probably have a programmers' union, licensing, and other
things that would make a big difference in our lives. As it is, the beef that
programmers have is not that we don't get paid enough; it is that we are
often forced into malpractice by management.

------
sp527
This is a very amusing article because I went in expecting fairly
sophisticated arguments about Cloud, PaaS, DevOps, layers of abstraction,
higher-level languages/paradigms, etc. Perhaps this was naive, given the
domain of origin.

The irony is that the future probably _will_ enable individual programmers to
have an even more outsized ability to create value, and fewer programmers will
be necessary to accomplish the same set of tasks. Sadly, cogent arguments
about meaningful issues aren't exactly TechCrunch's forte.

------
hackguru
There is an upper bound on how abstract a general-purpose programming
language can become. Programming languages exist mainly because of their
ability to remove ambiguity. Our natural language, on the other hand, is very
vague: many people might read the exact same article and interpret it
differently. This is natural language's great feature, and it is why a kid,
without fully formed thoughts, can learn and use a natural language.

Hence I don't see a day when programming languages will completely fade away.
Programs are the result of a careful thought process that crystallizes a
concept into a process, and that process is only complete when you can
describe it in an unambiguous language. One may argue that natural languages
are capable of being unambiguous; a subset of a natural language can indeed
be used without ambiguity, but that is just the definition of a programming
language. Arguing that programming languages will fade away is the same as
saying that one day math will not be necessary because we can explain all the
concepts of physics and the other sciences in natural language.
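
A small illustration of the gap (the selector and the shade are my choices,
not the sentence's): the English request "make the warning buttons red"
leaves shade, scope, and timing open, while the code version pins every one
of those choices down:

    // Standard DOM API; "which buttons", "which red", and "when" are all
    // decisions the English sentence never actually made.
    document.querySelectorAll("button.warning").forEach(function (btn) {
      btn.style.backgroundColor = "#cc0000"; // one specific red, applied now
    });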

------
ISL
Are there graphical languages that advocates like?

The only graphical language that I've encountered professionally is LabVIEW,
and I've yet to see an instance/programming style where it has been superior
for anything but quick prototyping.

A language that's editable in both flowchart and traditional formats could be
very useful, if executed in a way that doesn't cripple the traditional side of
things.

~~~
pp19dd
LabVIEW was eerily satisfying to write code in. Or at least some types of
code. Loops and logic blocks got a tiny bit weird, but overall you could do a
lot with sub-VIs.

I find myself craving it even 15 years later.

~~~
maxxxxx
LabVIEW gets really weird once you get to things like loops or, even worse,
threading. You will start craving some good old fashioned code very quickly.

I haven't seen any kind of visual coding environment that didn't fall apart
quickly once you got to more complex scenarios.

------
whack
The writer presents an ideal vision for the future - one where people can
"build software" purely through abstract thought, without needing to know the
semantics of specific tools and programming languages.

If such a future is possible, that would be great; I would be all for it. But
the people who are already in the field, working in the trenches, don't think
it's a realistic vision for the near future. All attempts thus far to produce
"layman-friendly programming" have been either failures or relegated to
non-functional toys. Hence we don't want to waste _our_ scarce time and
resources on such moonshots.

If the author and his peers disagree, they are free to found or invest in
such ventures. And if they're right, they can make a fortune for themselves
in the process. But just sitting on the sidelines and armchair-quarterbacking
is a pointless waste of time for everyone involved.

------
random3
That's a bit cocky, considering the amount of knowledge and effort Google has
behind AI, quantum computing, and the rest.

But I guess the main point of the article was not that; it was the Bubble
plug.

------
gerbilly
Think of the other context in which people want to be precise: contracts.

Contracts are written down in text so they can be edited, carefully read, and
referred to later.

He might as well say that in the future there will be no written contracts;
we'll go back to debating and settling our issues verbally in public.

~~~
audleman
> that in the future there will be no written contracts, we'll go back to
> debating and settling our issues verbally in public.

This is a great metaphor. Only I would swap "go back" for the idea that he
thinks we are going to invent a magical technology that lets lawyers from two
companies drag and drop a few images together and BOOM, there's a legally
binding contract.

~~~
fourthark
The lawyers will explain what they want to the computer. They will have no
idea what the computer made of it, but the result will look close enough that
they won't be able to argue they didn't get what they asked for.

------
madelinecameron
Nice sponsored article there.

~~~
jmagoon
I actually clicked the link to see whatever visionary platform Bubble is.
It's a workflow engine! Real groundbreaking stuff there.

------
Animats
Visual Basic was a success until Microsoft killed it.

Remember when you could write HTML with a WYSIWYG editor?

Most sites can be built with Wordpress.

The hosting side of things is mostly automated and becoming more standardized.

------
nercht12
To each their own. Yes, I'm sure one day more people will use software that
allows them to put together amazing programs using complex GUIs that do
wonderful magic. These may be business programs that do statistics, or games,
and we've already seen both (hello Excel, hello game studios). The problem,
as it always has been, is the point when you want to do more and suddenly
find yourself in asm land or looking to talk to mysterious "drivers". After
all, what's a port anyway? Does that mean my computer has Docker built in? As
I'm hinting, people will hit that wall of mystery, and lower-level
programming will become inevitable. After that happens, they eventually
become accustomed to this idea of telling a computer what to do with strict
commands according to a specific protocol. For the average guy who never gets
into it, you could tell him it's kind of like talking to a dog. "Sit." It
sits. "Love." It wags its tail. "You're so smart, pooch." It gives you a
blank stare... and keeps wagging its tail.

These days, when I look for a programming language, it's not about the syntax
sugar; it's more about the feature set that comes with the language: things
like the module and build system (e.g. Java's is pretty easy), the ease with
which both complex and trivial tasks can be accomplished, and the
availability of a support community and libraries.

------
lnanek2
If it was another company doing this, there might be a chance it is
meaningful. Google is not that company, however. They started and killed
visual programming projects before, dumping everyone's data. The last one was
called App Inventor. At least they made the code open source so MIT could
write App Inventor 2 based on it. The best thing they could do to make me
think Project Bloks will be meaningful is to give it to another company in a
similar fashion. Otherwise I expect they would just kill it next year anyway
like other projects of theirs.

~~~
lern_too_spel
The difference is that Bloks isn't a service that can be shut down. It's
essentially a set of open source schematics.

------
dkarapetyan
Programming is really formalization. That is the hard part: the kind of
thinking required to take something and express it as a set of logical,
computable constraints. It doesn't matter how much money you throw at the
problem; we are never going to have the entire population able to "program".

It is for the same reason that, despite all the training in mathematics, only
a few people go on to get a PhD and come up with something novel in
mathematics. The rest of the population gets by with basic algebra; not even
calculus is required.

------
emblem21
Dear writer, the future is fewer people writing sponsored content.

------
spotman
Maybe an unpopular opinion here, but I agree with the article overall.

My view is less that programming is going away, and more that all jobs are.
Not immediately or anything, but I don't think we are going to magically
produce programming jobs for all the masses who are going to need a job.

Having been at this for over 15 years, I have single-handedly automated
thousands of jobs, and a healthy handful of those automations made things
efficient enough that a project needed fewer programmers, etc.

So while we will probably always need programmers, I'm not sure why people
think that the number of programming jobs will do anything but stay the same
or decrease, while the number of candidates increases.

Tooling has come so far, and it's going to go farther. You don't need to know
a lot to make something meaningful anymore.

Expecting to train all children to be programmers, and thinking that by the
time they are our age it will still be a lucrative field, is silly, I think.
My prediction is that all the new programmers coming into the field,
intersecting with the tooling getting better, intersecting with a lot of
other markets needing fewer people, leaves us with a crowded field of players
where the average skill level is lower, because the tasks no longer require
it to be all that high.

Programming is the new carpentry: probably jobs for a long time, but training
children now as if it's going to be the most amazing career path is
short-sighted, I think.

Considering this, I hope my kids don't pick programming as a career. I would
love to be wrong.

~~~
freerobots
Supporting your opinion, the BLS puts the job outlook for programmers at -8%
between 2014 and 2024.

I hold the same opinion as you. I think that new languages and tooling will
empower people to do more with less while being easier to learn. Combine this
with globalization plus stagnating economies, and the outlook of programming
as a career seems less lucrative.

------
bbctol
It seems like this is focused a lot on what people want to do, as opposed to
what provides value. Not that that's a bad focus! But if you want to predict
the future, I think market forces are a better indicator; as much as we might
want to move to a future where programming involves more intuitive tools, I
still think it will be more powerful, and thus more valuable, to be able to
muck around in the code.

------
maus42
>But with new programs like Microsoft Word coming into existence, typography
(e.g. formatting a document, setting the margins, making sure the lettering is
appealing, etc.) became something everyone could do easily without much
thinking.

Correction: most people do it without thinking. This does not mean the
majority of the typography on the internet (and, eventually, in print) is
_good_ or _well done_.

My LaTeX setup produces fairly good typography for math work without too much
thinking on my part, after setting up the packages, fonts, etc. and learning
LaTeX, which did take some time. (And ironically enough, writing LaTeX feels
quite a bit like coding...) But first of all, deciding to use and learn LaTeX
(or some other workable solution for producing good typography; it probably
isn't the only one, but it's the one I'm familiar with) requires you to think
about typography and realize that good typography is needed in the first
place.

------
50CNT
This article rubs me the wrong way. If any of the things described in there
were possible, why the hell would I be writing any code? I'm a lazy
programmer, for god's sake, and code I don't have to write is a win in my
book.

Matter of fact, once the novelty of writing lots of code wore off, I started
spending most of my time trying to write less code.

The side step all these miracle solutions take - the ones for bringing coding
to the masses in one fell swoop and eliminating its tedium - is: "Hey,
technically, if you draw pictures instead, it doesn't count as writing." Yes,
technically true; totally useless. I personally believe that whoever comes up
with this again and again deserves to be bludgeoned with a copy of "K&R The C
Programming Language" turned into a picture book. All 70,000 pages of it,
with the big glossy full-page, double-page foldout prints.

/rant

------
Forge36
(On mobile, so I apologize in advance for bad grammar and typos.)

I don't think the author understands the purpose of the project. Google wants
more coders. Coders are likely one of Google's largest expenses as a company
(at my job, staffing is 25% developers, and staffing is ~80% of our current
budget). Does Google necessarily need more developers writing code? No, but
they could use more people who can code to solve the small problems they face
daily.

I think it's more likely we'll have people writing code informally, and as a
small part of their overall job.

>Writing code will become less and less necessary, making software development
more accessible to everyone.

I agree with that sentiment; however, I fail to see the link between more
accessible development and fewer people writing code. This process has been
happening for years.

>The real benefit of something like Project Bloks is that it actually removes
the code.

But is that new? What if something more advanced is needed?

Excel is a good example of writing code for a job. Access is an example of
programming without writing code (it's SQL with a GUI). Both tools are
popular, yet people have a hard time doing advanced things with them. This is
also perhaps due to the high cost of creating the building-block
interface/software.
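
(Even a single-cell formula like =IF(B2>B1, "over budget", "ok") is a tiny
program: a condition, two branches, and cell references as variables, whether
or not the person typing it would call it code.)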

By thinking logically, people may not write code formally, and they may not
write any code at all, but it will encourage them to create solutions to the
problems most applicable to them. Maybe their solution is 90% the blocks
provided by their program and 10% code they wrote to handle their edge case.
Perhaps it's something they only engage in one day a month.

In the end I think we'll see more code, more people writing code, and
programs that handle more of the common tasks as building blocks, with the
ability to write code for the complex parts and plug it in where needed.

------
pessimizer
Thinking that turning buttons red is the major problem of programming is
mistaking the interface for the substance. It's as if you wanted to teach
people how to develop new automobile technology by selecting the shape of the
steering wheel, the fabric in the interior, and the color of the paint job.

The problem with programming is that computers can't understand your
intention. What amateur programmers need is side-effect free functions,
efficient abstracting away of cores and memory-management, and static analysis
that makes functional bugs as obvious as a leak in a plastic bag. Computers
will never understand your intention; programmers barely understand your
intention and they have a lot more in common with you.

------
modeless
He's right but he doesn't understand why. We'll write less code in the future
but it won't have anything to do with visual programming languages or other
fancy tools for "programming without code". Code will always be the best way
to write programs.

The reason we'll write less code in the future is we won't need as many
programs. The future is in machine learning and machine teaching, which enable
a single program to perform a huge variety of different tasks. We'll train the
computers of the future by showing examples and correcting mistakes, as we do
with our fellow humans. Machine teaching is a different thing entirely from
programming.

------
linguistbreaker
Did the author not notice the text on the Project Bloks homepage he linked
to, which says

"creating new ways to teach computational thinking to kids"?

Pretty disingenuous, or arrogant, to act like you're setting Google straight
xD.

Also, "intersectionality."

------
jfe
Any time a new idea comes along, there's always someone who claims everyone
needs to learn it and that it needs to become part of the school curriculum.

In my opinion, the ability to think laterally is far more valuable than the
ability to think 'computationally'. The latter comprises essentially one
pattern of thinking -- procedural -- while the former opens one up to an
infinite set of patterns with which to think.

The computer is a decent vehicle for exploring patterns or modes of thinking
once you've discovered them, but the goal should be to explore the pattern,
not the vehicle.

------
jwatte
"Writing code will become less and less necessary, making software development
more accessible to everyone."

I heard that same argument 30 years ago. "4GLs" and "expert systems" and
"application generators" and "visual programming" were going to do away with
the "engineering" aspect of software engineering.

However, in reality, we write more complex code for a simple business app now
than ever before.

Once hard AI can extract requirements and transform them into systems, we can
retire from coding, but probably not before then.

------
ankurdhama
Programming is the act of "describing" a computation to a computing machine.
In the past, people used gears and levers to do that for mechanical
computers. Ways of describing computation exist in many forms: formulas in an
Excel sheet, a timeline describing an animation, C used to build a device
driver. Each of these ways has its own particular context in which it is
useful. So it doesn't make sense that in the future we will have "one
particular way" of describing computations.

------
justinlardinois
Is "computational thinking" really something that isn't already being taught
in schools? There's a link to an article that spends three pages defining it:
[https://www.cs.cmu.edu/~15110-s13/Wing06-ct.pdf](https://www.cs.cmu.edu/~15110-s13/Wing06-ct.pdf)

It alludes to computer science, but really it just boils down to solving
problems by breaking them into parts. In other words, problem-solving skills.
It doesn't sound like anything special to me.

~~~
grossvogel
I don't think it's being taught adequately to lots of people who aren't
destined for careers in tech or academia. Look at the nearly universal disdain
for math and especially "story problems," as well as the mystical aura of
"coding" outside of those tech / academic circles.

In that context, I do think early encounters with programming _might_ be a
great entry point to "computational thinking" for some who aren't well served
by existing curricula. It certainly makes more sense to me than trying to
train every child to write code for its own sake.

------
seangrogg
"[...] so why are we having a serious conversation about grooming children to
become software developers before they’ve even gone to middle school?"

For the same reason we groom children to become mathematicians, scientists,
readers, writers, historians, musicians, actors, etc. before middle school.
Project Bloks grooms your child to be a Software Developer about as much as
their first grade teacher grooms them to be a Quantitative Analyst.

------
ktRolster
Once upon a time, Excel didn't exist, and if you wanted to do that kind of
thing, you needed to program it. So Excel replaced a segment of programmers.

At the same time, many other programming opportunities opened up. We have
code in every single doorknob (well, at least a lot of them). Some coding
uses will be replaced, but others will keep opening up, at least for the
foreseeable future.

------
wentoodeep
The future isn't about writing less code or using building blocks, but about
using AI and NLP to produce results from user intents. A perfect example
would be Wolfram|Alpha and the Wolfram Language. It's no longer about
developing software, a mini app, or a microservice, but more about getting
answers, either textually or graphically.

------
meira
Well, I disagree. When I started to program, in 2000, it was so easy that a
12-year-old child could do it (HTML, PHP, FTP). Although anyone can deploy
something online today, I doubt it's as easy for a 12-year-old kid to learn
the modern equivalents of what I used, like React, Node, and Git.

Edited: conclusion, programming is becoming harder, not easier.

~~~
ktRolster
_programming is becoming harder, not easier._

So true. I'm not sure why, it shouldn't be that way.

------
nijiko
Both will exist. You need one to create the other. Eventually the graphical
will become the textual representation used to create the next higher power.

There is always a lower and a higher power, even if they are equal in design.

------
zwieback
The linked article by Jeannette Wing is more interesting than the TechCrunch
piece. She wrote really interesting stuff on subtyping in the 90s, so her
name jumped out right away.

------
mempko
I'm sorry, but the future is clear: Perl 6 will be the last programming
language. The only way coding will become visual is when we use Unicode
symbols to draw diagrams in Perl 6.

------
WorldMaker
As someone who has had to "productionize" or "modernize" an Excel spreadsheet
or Access database many times now, this is one of those prognostications
people make from time to time and, thus far in my experience, tend to be
incorrect about.

The issue with "computational thinking", insofar as this article wants to
teach it and as most schools already do teach it in the real world, is the
tendency to stop at the basics: Office applications and just enough
VBA/macros to give people a feeling of competency without a glimpse into the
real depths of programming and what software developers _really_ do.

I keep wanting to make an XKCD-style sketch graph of the idea. There are a
lot of Dunning-Kruger overconfident business people who think all the
software they need to run their business is spreadsheets, and spreadsheets
pretending to be databases, like Access. To them, real software developers
seem overpaid, based on their experience of the Lovecraftian "systems" they
can hack together given what they think they know.

That's a very real and dangerous place for business people to be, but it is
unsurprisingly common. Those people don't respect programming as a discipline
and a _craft_, and sometimes they are the very people out in the corporate
world controlling software developer salaries and morale...

It's also the same lack of knowledge about software as a _craft_ (as
engineering, in a very classical sense) that leads people time and time again
to the well of "well in the future people won't be coding because [ Excel will
do it all | There will be a visual tool everyone will easily understand | AI
will do all the programming based on natural language queries | Insert some
other magic idea here ]".

There's as much art to software development as there is science, and
forgetting that the art will still need artists and will not make itself is a
strange oversight that is surprisingly common.

To be fair, there are a lot of software developers themselves that have played
into this delusion, and it's something of a trap that a software developer can
easily fall into. We're trained to break down systems and try to automate them
to their fullest potential and it's hard sometimes to avoid that meta-leap to
wanting to do it to our own systems. We fall into building "Business Rules
Engines" that we think some business users might be able to understand and
comprehend and might obfuscate away the need for programming. We experiment
with boondoggles like visual programming languages and "auto-coding"
experiences. We get grandiose visions of the machine or software product or
great AI that will make it all more accessible...

The future will probably look like the present in that regard. We'll still
have the Dunning-Kruger folks building mission-critical applications out of
complex webs of Excel and Access and whatever past and future productivity
tools we build with the goal of making programming more accessible. We'll
still have software developers eventually hired to clean up the messes and
craft versions that can last sustainably or operate reliably outside the
hacked-together environment in which they were originally built. There will
continue to be software developers who think they can build the one
environment that will rule them all and save everyone time (and that will
meanwhile eat up so much of the software development budget and time to
build)... And all of these groups will still have a hard time communicating
to each other the real risks and efforts involved in any of it.

------
pklausler
So long as it's the right people no longer writing code, that future sounds
pretty nice to me.

------
dudul
Is the future of math to get rid of notation, greek letters, etc?

------
hyperbovine
Anyone else stop reading at "intersectionality"?

------
fapjacks
Well, that's what they said forty years ago, so...

------
Falkon1313
It's not about code. When you, as a programmer, consider the purpose of
programming, and how computer-illiterate people view it, things are different.
A programmer's job is not to type code. It's to find out what is holding other
people back from achieving what they want and find a way to remove that
barrier. It's to solve problems and help people make more valuable use of
their time.

It could be the analyst that spends hours of drudgery printing out things from
one system and retyping them into another (then going back to fix the typos)
when what they want to be doing, what would enable them to provide value, is
analyzing the output from the second system. It could be the junior exec who
spends countless hours manually collating data to build spreadsheets and
presentations about their projects, when what they want to be doing is trying
out new ideas and refining those projects.

Many people were raised to "write each vocabulary word 30 times" and see
drudgery as a necessary, albeit frustrating, part of their jobs. Programmers
automate that away so those people can do more important and useful things and
produce more value. It's not just drudgery though. People have hard problems,
often vaguely defined, and programmers help them understand, specify, clarify,
and solve those problems.

Younger generations are more computer literate, but still often just use
computers and don't realize how much control they could have over them. They
may use programs that don't do quite what they need, not realizing how easy
that would be to fix. Even if they're not the ones doing the programming, just
recognizing that a programmer could help solve their problems is valuable.

At the same time, many non-programmers don't realize how hard some things are
to program, or how clearly, precisely, and unambiguously things need to be
defined for computers to produce the desired result. They assume something
must be simple when they can't even define what the something is. Or they
assume that because a program exists, any programmer could make an equivalent
but slightly different program quickly and easily.

We don't need to teach everyone to code, and we certainly don't need to teach
everyone the syntax of some specific language. But we really do need to teach
them to think in these terms. What problems do you face? Which of those could
be automated or streamlined? How would you specify them clearly and
unambiguously? What edge cases and special conditions do you need to deal
with? Etc. Given that line of thinking, those who are interested will learn
to program and those who aren't will at least understand it. Some
hypothetical pictocode/vocalcode/AIcode doesn't really matter. People need to
understand the basic concepts of problem-solving and automation, how they can
be useful, and what makes them relatively easy or difficult.

------
LoSboccacc
same as always right?

------
cnfjdnx
And in this thread you can see programmers getting defensive and flustered
over the notion that they too might be vulnerable to automation.

~~~
walshemj
Or recall all the other times a silver bullet has been hyped - first I recall
was "the Last One" in the late 70's early 80's

