
How will you be programming in a decade? - clessg
http://dave.cheney.net/2015/12/07/how-will-you-be-programming-in-a-decade
======
danieldk
_That’s fine, nobody said you’re wrong, but you are increasingly a minority,
and the economics of scale are not working in your favour._

This is not backed up by any facts. First of all, the graph shows desktop PC
shipments. Laptop shipments haven't drastically changed:

[http://www.statista.com/statistics/272595/global-shipments-forecast-for-tablets-laptops-and-desktop-pcs/](http://www.statista.com/statistics/272595/global-shipments-forecast-for-tablets-laptops-and-desktop-pcs/)

Secondly, the other logical explanation is that most machines of the last five
years are still fast enough for general purpose operating systems. So, people
are simply upgrading far less often.

 _The other outcome is the developer PC continues to exist, in an increasingly
rarified (and expensive) form as workstations migrate to the economies of
scale that drive server chip sets._

Or the same cheap processors that run embedded devices simply become fast
enough to handle normal desktop workloads. So, why would they be more
expensive? The Raspberry Pi will probably be good enough as a modest
development workstation in a couple of iterations at 30 Euro. Heck, some
people do their work on Chromebooks running Crouton.

I don't think hardware is the problem. Every kid will be able to afford some
device to start tinkering. More worrisome is the move away from general
purpose operating systems to walled gardens, plus reliance on proprietary web
applications. For this reason, we need a flourishing Linux/BSD/FLOSS
ecosystem.

~~~
StavrosK
> worrisome is the move away from general purpose operating systems to walled
> gardens

And, on the other end, open hardware is absolutely booming with _computers
that cost less than shipping_. I hope to see open software and hardware
flourishing even more as this trend continues.

------
pjc50
If I stay at my current employer, in 2025 I'll most likely be maintaining an
MFC application in VS2015, quite possibly on the same workstation I'm using at
the moment.

We don't update very often. Currently I'm using VS2008.

~~~
pluma
I've heard of developers today using the 2005 version of their IDE because
they need to support widespread legacy systems that won't be updated to newer
software for the foreseeable future (and modern versions of the IDE have
become incompatible).

Legacy is here to stay. There'll likely still be people hacking on COBOL in
2025.

~~~
dracul104
At my last job, we were still supporting .NET 1.1 apps using Visual Studio
2003.

~~~
nemmons
At my last job, we were still supporting .dlls written in VB6. Last I heard
they had plans to upgrade them to .NET by Q2 2016.

------
pgz
_How will you be programming in a decade?_

With Vim (or Emacs)... that's kind of obvious.

~~~
bechampion
hahaha nice

------
vidarh
What I need to work is a reasonably large screen, a good keyboard, and a
computer powerful enough to run a decent editor, browser and ssh connection
from that machine to one or more machines running the rest of my stuff.

In other words: I need a terminal. There's no need for me to have a beefy
computer on my desk, because I can leave that under my stairs, not have to
worry about where I work on it from, and mix and match between my home server
and cloud instances.

The laptop I type this on basically functions as a luggable terminal (it's a
17" one; I often pack my 11" Chromebook with it when travelling so I don't
need to pull out the 17" one for minor stuff...), and not much more.

The only thing that will make me upgrade my laptop is improved graphics
performance, better screen or better keyboard - everything else is pointless
for me.

Meanwhile, almost every mid-range Chinese tablet today (mid-range = ca.
$100-$200 these days) seems to be Atom based, supporting 4K HDMI output and
dual-booting Android and Windows... It's getting _very_ close to the point
where a tablet or phone + Bluetooth keyboard + an HDMI connection is sufficient
for me to work. 4K LED TVs are so cheap now that I'm tempted to experiment.

~~~
noir_lord
Agreed.

My main laptop is a 17" Vostro with an older i5 (2430 I think) and 8GB RAM.
It's absolutely fine for all the development stuff I do on it; if I could get
something of that speed with an awesome screen and keyboard for a decent price
I'd be so happy.

I keep looking at the ThinkPads coming onto the market second-hand, but while
they hit two out of three (similar spec, good keyboard), they are still all
1600x900 or so.

I want 4K already; it drives me crazy that it's easier to get a tablet with an
insane screen than a laptop.

~~~
imtringued
I don't see a future in tablets. They removed the most important component for
a programmer (the keyboard). Sure, you can get a Bluetooth keyboard, but then
why didn't you get a laptop in the first place? If I were to remove something
from a laptop, it would be the screen, replacing it with a HoloLens or some
other holographic device.

~~~
vidarh
The "future" in tablets is the same as in phones: It's a lighter, smaller
computer.

Keyboards and screens are interchangeable.

> Sure you can get a bluetooth keyboard but then why didn't you get a laptop
> in the first place?

Why not both? It is nice to have a lightweight option that can fit in a large
coat pocket, yet be able to "scale up" to a full-size laptop or a 42" (or
larger) 4K display without having to move between computing devices. As hard
as e.g. Google and others try to make transitioning between devices seamless,
we're nowhere near the point where that is good enough, but we _are_ very
close to the point where a tablet or phone is good enough to act as a terminal
of the form I described, with the help of a keyboard and screen.

------
lmm
Ten or twenty years ago video or audio editing required an expensive
specialized workstation. Today you plug a few specialized peripherals into
your regular computer.

I see it as the same thing. Maybe keyboards and monitors will be rare,
expensive, specialized devices. But you'll buy them and connect them up to the
same kind of computing device that regular people use. There's no reason for
the underlying computer hardware to be anything other than commodity.

------
osullivj
I'll extrapolate from my personal coding history. Late 70s to mid 80s was
Basic, Z80 & Forth on Pets, ZX81 & Lynx. Professional career started in 85,
and that was Fortran & C up to 1992 when I started coding in C++. Added Java
in 97, Python in 2000 and C# in 2002. I've dabbled in R & Cobol at various
times. I'm coding in C++ and C# today. I expect I'll still be coding in C++,
C#, Java & Python in 10 years' time. Maybe Rust will be in the mix too.

------
dagw
Given how little has changed in the way I'm programming compared to a decade
ago, I'm going to guess very little.

edit: what has changed is the size of the data sets I can work with and the
speed at which I can manipulate them. 10 years ago 300-500 GB was quite a lot
of data; now it's something I trivially work with on a middle-of-the-road
computer using normal software, without really having to worry too much about
performance. Loading 20-30 GB data sets into memory is something I just do
without really thinking about it, as is visualizing them in 3D.

Basically everything is the same, it's just 10 times bigger and 10 times
faster.
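
For a concrete flavour, here's a minimal sketch of one common way to handle
data at that scale: memory-mapping with numpy. The file name and shape below
are hypothetical, not a real data set:

```python
import numpy as np

# Memory-map a large on-disk array instead of reading it eagerly; the OS
# pages in only the slices actually touched. File name and shape are
# hypothetical -- roughly 30 GB of float32 values.
data = np.memmap("scan.dat", dtype=np.float32, mode="r",
                 shape=(30_000, 1024, 256))
slab = np.array(data[1000])   # copies a single ~1 MB slice into RAM
print(slab.mean())
```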

------
sitkack
The premise that video editing is done primarily on dedicated hardware is
incorrect. I know two video editors (people, not things) that create trailers
for blockbuster movies, and they do it on a Mac laptop using Final Cut.
Software can be written anywhere now; most TVs costing over $800 are running
some form of Android, many with access to the Google Play store.

In 10 years, I will most likely be using a keyboard connected to a screen. The
environment will be a combination of local and remote, just as it is now. Are
you saying keyboards are going away? Will I need to run fingers 2.0?

~~~
hodwik
> "I know two video editors (people not things) that create trailers for block
> buster movies and they do it on a mac laptop using final cut."

I find that hard to imagine. I'm running a pretty powerful desktop, and I was
having a hard time just editing and grading RAWs in Resolve yesterday.

------
teambob
I am not sure about this article. Mobile and tablet have had massive rates of
adoption, but even they are getting saturated now.

There are still large numbers of businesses that use desktops. Everywhere I
look there are desktops. Tablets are displacing some but not all in business.

~~~
pilsetnieks
You seem to forget, as does the author of the article, that there are the
increasingly popular laptops (at the expense of desktops) between desktops and
tablets. If anything, it's laptops that are usurping the market share that
previously belonged to desktops.

~~~
pluma
Laptops and docking stations. And why shouldn't they replace desktops?
Miniaturization is an almost universal trend in technology. The only place it
doesn't work is screens, which is why we see a convergence in screen sizes
between smartphones and tablets (although some devices are trying to replace
the screen as a visual UI with varying degrees of success).

~~~
VLM
Laptops also equal VPN access and work-from-home.

Depending on how cheap your employer is, you may not get both a desktop and a
laptop. I had no desktop from '00 to '05 so that some beancounter could get a
bonus for not "wasting" a tiny fraction of our total salary on giving my team
both a laptop and a desktop. I don't miss that place, but most of the world
works under conditions like that.

So desktops are for people who absolutely cannot ever be imagined to work from
home, for practical or primate dominance reasons. That's a rapidly shrinking
segment of the employment population.

Ironically by plugging a large display and decent mechanical keyboard into a
"laptop" we're just reimplementing desktops the hard and expensive way.

~~~
pluma
I'd say you need to distinguish between "laptop" and "laptop with docking
station" (or just lots of cables).

I mostly work on my laptop wherever I am -- in office, at a client, at home,
on the commute, at user groups, etc. I'm comfortable with using my laptop's
keyboard although I do enjoy having additional external screens when they are
available.

But I know plenty of people (including non-developers at non-tech companies)
who do most of their work at their desk with the laptop plugged into a docking
station with external peripherals -- keyboard, mouse and screens -- and only
use the laptop on its own for meetings or as part of a BYOD policy.

Desktops are non-portable. Laptops are portable. Tablets are ultra-portable
but 1) not powerful enough on their own (good luck relying on a remote desktop
with a dodgy Internet connection) and 2) not as comfortable to use for many
professions (due to size constraints, even if you have a physical keyboard --
less so if you just need a dumb device to click through PowerPoint
presentations).

Laptops are a compromise between a desktop and a tablet. Laptops with docking
stations can replace most desktops (except for high performance scenarios --
but for reference my 15.4" laptop has 4 cores, 32 gigs of RAM and an nVidia
graphics card; good enough for most things other than bleeding edge AAA
gaming).

The benefit of a laptop with docking station over a desktop is portability.
Desktops are only really required if you need levels of performance you
absolutely can not gain with a laptop or that would make the laptop
insufficiently portable (e.g. due to battery life -- I can generally get away
with a meagre three hours away from wall outlets but many would prefer a
longer battery life over the raw performance).

------
ctstover
How will you read books in 50 years? How will you play violin in 25 years? How
will you paint on canvas in 75 years? How will you steep tea in 150 years? This
generation that grew up watching the Jetsons is obsessed with this idea of
"evolving" to the point of not doing anything.

------
onion2k
Most of the grunt work of coding will be a matter of writing a set of Cucumber
tests (or whatever language you prefer, but it'll be something close to formal
English) and having a solver/fuzzer do the actual coding for you. There'll be
some things that a solver can't do, but for the majority of simple things
(generating models, CRUD operations, database interaction, networking,
testing, maybe some aspects of the UI) I don't imagine we'll need to write any
actual code ourselves. A lot of this stuff already exists; we just don't use
it because it's unreliable. That will change.
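
For a sense of what that "formal English" layer already looks like today,
here's a minimal sketch using Python's behave library; the feature text and
step names are made up for illustration:

```python
# users.feature (hypothetical) -- the "formal English" layer:
#   Scenario: Creating a user
#     Given an empty user database
#     When I create a user named "alice"
#     Then the database contains 1 user

from behave import given, when, then

@given('an empty user database')
def step_empty_db(context):
    context.db = []  # in-memory stand-in for a real database

@when('I create a user named "{name}"')
def step_create_user(context, name):
    context.db.append({"name": name})  # the part a solver would generate

@then('the database contains {count:d} user')
def step_check_count(context, count):
    assert len(context.db) == count
```

Today a human writes the step definitions; the bet above is that a
solver/fuzzer eventually fills them in from the feature text alone.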

~~~
pluma
This would be more convincing if this hadn't been the predicted future for
decades already, with various "human readable" or AI-based solutions
repeatedly labelled the new messiah, only to realize that non-programmers are
not interested in learning to program and that getting rid of the language
doesn't change that.

Human language is ambiguous. Ambiguity results in errors. We do not accept
errors in machines because we expect them to outperform humans. When you begin
to formalize human language in such a way that it becomes unambiguous, you
effectively just create a new programming language.

The unreliability isn't a quirk of the system. It's an inherent limitation of
the medium. Even the best AI can't read minds and even if we had the
technology to do it we'd still have to deal with the fact that even mind
reading won't yield the correct solution (it'll just yield the solution you
think is correct at the time).

We won't be able to get rid of programming languages (or programmers) until we
create AIs that can improve themselves autonomously.

What you are right about, though, I think, is that we'll see a shift towards
more declarative programming. While declarative programming has been around
for ages it seems to be making a comeback.
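
SQL is probably the most familiar example of that declarative style: you state
what you want and the engine works out how. A minimal, self-contained
illustration in Python:

```python
import sqlite3

# Declarative: state *what* you want; the engine decides *how* to get it.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, age INTEGER)")
con.executemany("INSERT INTO users VALUES (?, ?)",
                [("alice", 34), ("bob", 27), ("carol", 41)])
over_30 = con.execute(
    "SELECT name FROM users WHERE age > 30").fetchall()
print(over_30)  # [('alice',), ('carol',)] -- no loop written by the caller
```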

~~~
humanrebar
> When you begin to formalize human language in such a way that it becomes
> unambiguous, you effectively just create a new programming language.

Exactly. This is why professional jargon (a) exists. Pick a noun in computers
and programming. It has another meaning in another context.

This is why contracts are written in "legal English" (b) and not in normal
English.

a)
[https://en.wikipedia.org/wiki/Jargon#Examples](https://en.wikipedia.org/wiki/Jargon#Examples)

b)
[https://en.wikipedia.org/wiki/Legal_English](https://en.wikipedia.org/wiki/Legal_English)

~~~
pluma
Legalese is not unambiguous. It is minimally ambiguous. Ambiguities can still
derive from the way the (ideally completely unambiguous) terminology is used.
In order to have a perfectly unambiguous legal document it would need to
consist of a single well-defined phrase with a single well-defined context.

If legalese were as unambiguous as you seem to think it is, most case law
could be fully automated to the point we'd barely ever need actual lawyers and
courts.

The only perfect language is the language of mathematics (if we ignore
conflicting notations and definitions often being relative to a specific --
though typically well-defined -- problem domain).

As soon as it involves the real world, things get messy and the margin of
error widens. Physics and Chemistry tend to work relatively well (unless you
try to accurately predict complex systems). Biology a bit less so. Psychology
is where things start to fall apart. Sociology is where it completely breaks
down.

But even legalese isn't as well-defined as mathematics. Simply because it has
to refer to pre-existing concepts from the start or else it wouldn't be of any
actual use (e.g. what's the point of defining "theft" without referring to any
real-world interactions -- but once you involve real-world interactions you've
left the rational objective domain and rely on the subjective experience of
human observers).

------
marcosdumay
Yet another thin client lover. I really expected them to go extinct by the
early 2000s, but it looks like I was falling into the same kind of mental trap
they do.

Well, my phone is perfectly capable of compiling my software today, so why
would I offload it to a server 10 years in the future? If your answer has any
mention of "battery", please explain how using a wireless network to transmit
the entire software twice will use less energy than simply compiling it on the
device.
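
To make the comparison concrete, here's the back-of-envelope form of that
question as a Python sketch; every value is a placeholder to be filled with
measured numbers, not a claim about real hardware:

```python
# Offloading a build only saves energy if shipping the source up and the
# binary back costs less than compiling locally. Every value here is a
# hypothetical placeholder, not a measurement -- the answer depends
# entirely on the real constants for your radio and CPU.
def offload_saves_energy(source_bytes, binary_bytes,
                         radio_joules_per_byte, local_compile_joules):
    radio_cost = (source_bytes + binary_bytes) * radio_joules_per_byte
    return radio_cost < local_compile_joules
```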

Servers, desktops, laptops, tablets and phones are all here to stay. They'll
all get more powerful, cheaper, and more convenient. None will go away.

~~~
imtringued
A phone can do anything your desktop or laptop can do given sufficient memory.
You didn't bring anything new to the table.

I don't even know why you would care about the energy consumption caused by
wireless transfer, except if your software doesn't need to be compiled because
it's interpreted, or your codebase is tiny. Also, since you never talked about
speed: compiling on a server can still be significantly faster than compiling
on a phone.

By the way, why do you pretend to be a caveman? Can't you just think a little
bit ahead? You don't send the entire software twice; you only send the
changes, and assets don't have to be transferred back to your phone.
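
A minimal sketch of the "send only the changes" idea: hash each file and
upload only what the build server hasn't seen. The server-side hash set is a
hypothetical stand-in for whatever a real transport would report:

```python
import hashlib
from pathlib import Path

def changed_files(root, server_hashes):
    """Return source files whose content hash the build server doesn't
    already have; only these need to be uploaded. server_hashes is a
    hypothetical stand-in for the server's inventory."""
    to_upload = []
    for path in Path(root).rglob("*.c"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest not in server_hashes:
            to_upload.append(path)
    return to_upload
```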

------
tokai
I will probably be programming much like now, but with Julia instead of
Python.

------
structAnkit
We already have the "specialized device used in software development": the
laptop (read: MacBook). No longer do we need to be tethered to wires and the
grid in order to get meaningful work done, as laptop battery efficiency,
processing power, and mobile internet connectivity are substantially better
than they were ten years ago, even five. With dedicated cloud servers we do
a lot of the compiling and automated test running and deployment on machines
other than our personal ones while simultaneously writing new fixes and
features ready for the next cycle of server-side code building.

Ten years ago we were in the middle of switching from wired desktops to
laptops, but laptops just weren't portable enough due to weight and longevity
when disconnected from the wall. Today this is still the case, as our
processing needs have increased even as our laptops have become more
formidable. I expect in ten years we'll still be in the same position.

What has been happening on the phone/tablet/watch/etc. side of things,
however, is that we're able to do all of our non-coding tasks on the go in a less
distracting way, such as responding to emails, retriggering failed code
builds, or even reviewing code and merging pull requests. But actual
programming will be crazy slow without a tactile full sized keyboard, things
we've relied on to code with since the dawn of calculators/computers, and that
won't change until we're programming software graphically, i.e. as if we were
using Origami with Quartz Composer or RelativeWave Form, as opposed to
textually.

~~~
davecheney
Do you think Apple will still be making MacBook Pros in 2025, and if they do,
will they still have a power hungry (but fast) Intel processor?

~~~
coldcode
I expect in the near future Apple will build their own family of processors
for all their devices. After you build your own OSes, invent your own
language, and design most of your internal chips, what remains is the CPU/GPU.

------
toddan
I probably won't be programming that much, if I do not work with a legacy
system. The type of programming I do is mostly putting a simple GUI on top of
a database; in a decade we will have automated that task, and making a
database application will be as simple as doing a PowerPoint presentation. If
I do not move up to management or find a niche, I will be out of a job,
because I do not have the capacity to figure out the hard parts of
programming.

~~~
hodwik
There will still be jobs for "configurers", who define the inter-field
dependencies and so on, in support of the business logic. As it gets easier to
make these forms the amount of logic in them will increase.

Your job will be to understand a business process, and build the app out so
that the app supports the process.

------
drewm1980
I could see visual programming becoming less of a domain-specific rarity, if
some really good VR goggles hit the market. 2D visual programming hasn't
really gone viral because complex graphs look like spaghetti when you project
them down into a plane, but if they are floating in space I suspect that
becomes less of a problem. Our brains evolved for taking in spatial
information and manipulating 3D objects, after all.

There are domains where visual programming is common (i.e. Control in
Simulink, Data acquisition in Labview, pure data in audio), but they are
domains where you don't really need to scale the complexity of the code, so
the parts of the ecosystem that enable scalability (i.e. automatic and
deterministic formatting of the visual representation of the code, version
control) are underdeveloped.

~~~
VLM
The future being unevenly distributed, you can program today in Minecraft, yet
it's very unpopular. Or, rephrased: it's very popular to look at what
statistically almost no one produces. Also, VLSI hand design is kinda like
programming and is also not popular.

There is a cognitive load issue where most people are not very good at 3D
(source: the intro to drafting class I took). There is a self-selection effect
such that all the people "we" know are good at 3D games, but most people in
general are extremely bad at 3D. Then again, most people in the general public
are also awful programmers or procedure designers in general. Most people are
not draftsmen or programmers, although this is a political opinion in direct
opposition to the stated goals of the "everyone must code" movement.

I'm old enough that I was around in the previous 3D fad, the VRML era, and
there were language tokenizers that could translate source code into very
abstract 3D structures, although in practice they were completely useless.

Looking with a liberal arts perspective, it's like asking why sculpture never
replaced literature, now with computer assistance. Could you look at Dyce's
famous painting and have any idea it is King Lear unless someone told you
ahead of time it's Shakespeare's King Lear? A 3D-printed sculpture of Dyce's
painting wouldn't be much of an improvement over looking at the original
painting; well, it might look cool, but it wouldn't help comprehension of the
original theater play. Modern tools and energy sources mean marble sculpture
is cheaper and more available to the masses than ever, yet despite the new
easy tools there are more book authors than ever...

I'll just admit defeat WRT some non-fictional topics. As a kickstarter or
movement or experiment, someone should make a "Thingiverse for geometry" site.
Nothing but endless 3D-printable models to use to physically demonstrate 2D
and 3D geometry proofs. Can all of Euclid's proofs be 3D printed in an
educational format beyond the ridiculous (ridiculous being 3D artificial clay
tablets of a translation)? In my infinite spare time, etc...

------
w_t_payne
Using model-driven development on a gaming PC or console, with a VR headset
and a 3D input device or gesture recognition to control and navigate a 3D
representation of a data-flow network, and speech recognition to input
symbolic logic into the processing nodes.

------
bechampion
I love my MacBook, but IMHO devices like Chromebooks make a quite decent
workstation today; in 10 years' time, who knows. Sometimes I feel like we are
(in some way) going back to the AS/400 and dumb terminal days...

------
super_mario
In terminal VIM with local compilers/interpreters.

------
ams6110
Given that I've been using a PC, a keyboard, and Emacs for the past 3 decades,
I expect I'll be using the same a decade hence.

------
hias
I hope not ;-)

