The Quiet Revolution in Programming (drdobbs.com)
78 points by ternaryoperator on Apr 3, 2013 | 60 comments



At the risk of sounding like a nitpicker, as a statistician, I'll have to comment on the methodology. I believe the crucial clause in the article is

when the research and analysis firm Forrester recently surveyed our readers about how much time they spend writing in any given language

When the results change over time as indicated by the two charts, that can mean one of two things: either a lot of people who worked in just one language in 2010 now work in several languages, or a lot of people who work in just one language have stopped reading Dr. Dobbs. In order to support the claim that it's mainly the first and not the second possibility, one would have to submit at least some supporting evidence. (Edit: at first I thought that one would want to contact the same people in both surveys, but no, that's not good. It leaves out the effect of new people entering the arena. This doesn't seem to be easy.)

Also, though this is perhaps a matter of taste, I find it questionable to have a caption like "Fraction of programmers..." underneath the charts. As much as I respect and admire Dr. Dobb's (I have published there): not every programmer is a Dr. Dobb's reader.


Agreed, there are many, many publicly available methodologies to rate language popularity, none of which are great, but this one seems especially odd.

It'd be nice to see a time-lapse of the TIOBE index from 2009 to 2013: http://www.tiobe.com/content/paperinfo/tpci/index.html

Or any of these dozen other measures (including mentions on delicious, or IRC): http://www.langpop.com/

Or employer challenges posted on CodeEval: http://visual.ly/most-popular-programming-languages-2013

That said, the underlying thesis may be true that more projects today require more languages; it's hard to say.


I agree, the thesis sounds plausible. A more thorough investigation would probably reveal good significance in its favor.


I'm the author of the article. Let me try to address your thoughtful observations.

>This doesn't seem to be easy.

Quite agreed. It's not an easy thing to measure accurately. I believe the explanation for the numbers is indeed the first of the two options you present, as I wrote in the original piece. As the Dr. Dobb's readership has grown vs. 2010, both in terms of unique visitors to the website and subscribers, I don't think the second option is likely.

>not every programmer is a Dr. Dobb's reader

Quite true. This is a problem inherent in all surveys. The sample sizes for these two questions were 1143 in 2010 and 500 in 2012, which, statistically speaking, would be fairly representative samples. The real rub is that programmers are not a homogeneous group, so the results will change a lot from one type of programming to another. For example, Dr. Dobb's does not cater much to embedded developers, so the effect they would have on the charts is not captured.
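As a rough sanity check on those sample sizes, the worst-case sampling margin of error for a simple random sample can be sketched (assuming p = 0.5 and a 95% confidence level; this bounds only the sampling error within the readership, not any selection bias):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1143, 500):
    print(f"n={n}: +/-{margin_of_error(n):.1%}")
# n=1143: +/-2.9%
# n=500:  +/-4.4%
```

So both samples give estimates within about five percentage points, provided the people sampled actually resemble the population of interest.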

If I assess the 2012 numbers based on what I know anecdotally, they seem accurate in capturing the broad trend towards polyglot programming. What was counterintuitive, at least to me, was how much the trend accelerated in the last two years.


>This is a problem inherent in all surveys

Sorry, I'll have to very strongly disagree on that one. If, for example, you conduct a public opinion poll in the United States, then all that matters is the size of the sample and your method of selecting the sample. If those are both adequate, then you can draw conclusions (with margins of error) about the entire population. Furthermore, you can repeat the same poll at different points in time and draw conclusions on the changes that you observe. What's happening here is different. Your population is all programmers. Of that, a subset is taken, namely, the set of all Dr. Dobbs readers. From that subset, you take your sample. I trust that the size of your sample and the method of taking it are fine. But we don't know if the subset from which your sample is taken is sufficiently random, and we don't know if and how it changes over time.

In short, you cannot draw conclusions about the entire population if your sample is taken from a subset that does not qualify as a random sample. That problem is very definitely not "inherent in all surveys," as you claim.
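The subset problem can be made concrete with a toy simulation (all the numbers here are invented purely for illustration): even a perfectly random sample drawn from a biased subset recovers the subset's rate, not the population's.

```python
import random

random.seed(0)

# Hypothetical numbers: 30% of all programmers are polyglot, but polyglot
# programmers are twice as likely to be readers of the publication.
population = [True] * 30_000 + [False] * 70_000          # True = polyglot
readers = [p for p in population if random.random() < (0.6 if p else 0.3)]

sample = random.sample(readers, 500)                      # the survey
print(f"population rate: {sum(population)/len(population):.0%}")   # 30%
print(f"survey estimate: {sum(sample)/len(sample):.0%}")           # ~46%
```

The survey's sample size and randomness are fine; the estimate is still far off because the subset it was drawn from is not representative.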


You remind me of this[1] interesting piece of recent research.

It turns out that if you perform social science experiments to determine human behaviour and your sample is almost entirely drawn from a population of white American grad students, your results tell you an awful lot about the psychology of white American grad students, but not so much about human beings in general. Oops. Bang goes almost the entire edifice of modern behavioural science.

[1] http://www2.psych.ubc.ca/~henrich/pdfs/Weird_People_BBS_fina...


>I'll have to very strongly disagree on that one.

Then you spend the rest of your comment restating exactly what he said--namely that programming is not a homogeneous community. And, as he points out himself, the Dr. Dobb's community is known to not be representative of the whole and he gives examples.


One more thing that stuck out like a sore thumb to me: you do not comment on the fact that the first graph (the chronologically later one) splits up the programming languages in a different way than the second one! It won't weaken your point much to just use the same key in both graphs, but you don't do it, and you don't comment on it, which is a little sloppy.

(C/C++ are two different entries in the first graph, while they are the same (adding up their scores, thus exaggerating the upwards tendency of the curve in the higher-value range of the x-axis) in the second. Something similar happened for VB.NET.)


Do people realize the huge hidden cost of this "polyglot programming" thing? Imho, we have two huge "black holes" these days that suck resources from everything that matters: the "social everything, center around advertising, give stuff for free" black hole at the business level, and the "uberpolyglot everything" black hole, from languages to OSs, at the technical level.

> We have had to become polyglots to develop almost any current application.

...in my experience, by working in 2 languages instead of one, the time it takes to deliver something grows 3-4 times (!!), and there's still all the "lost" training time: instead of learning new paradigms, patterns and ways to solve problems, people spend inordinate amounts of time learning the details of new languages. I used to love the polyglot way, but now I see language proliferation more as part of the problem than the solution. There was an advantage to all this, but now most paradigms of concurrency and everything else are cross-language, and even if some languages like Clojure or Haskell injected new or better refined ideas into the mainstream, these ideas are no longer tied to a particular language.

I think we should invest as much as we can in unifying languages and technologies, like:

- using one language by making a traditional client side language work server side: the Node way

- cross-compiling: the X-to-JavaScript compilers (like ClojureScript and the Python-to-JS or Ruby-to-JS techs)

- making languages use libraries from other languages: not too many exciting things happening here yet :( ...but I'd love to have a way to, for example, mix C, Python and Ruby libraries inside a Clojure program, regardless of the performance or security issues

...they all seem like "boring plumbing work", I know. They are neither "exciting big problems to solve" nor "amazing business ideas", and you'll most likely neither change the world nor get rich by working on such things, but you can probably make a huge difference by working on such plumbing and leaving the really smart people free to invent really new things!
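On the "libraries from other languages" point, one mechanism already exists in nearly every language: the C FFI. A minimal sketch in Python using the standard-library ctypes module, assuming a Unix-like system where the C math library can be located:

```python
# Calling a C library from Python via ctypes, one basic form of the
# cross-language library reuse discussed above.
import ctypes
import ctypes.util

libm = ctypes.CDLL(ctypes.util.find_library("m"))  # the C math library

# Declare the C signature so arguments are converted correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

This only covers C-compatible libraries, of course; mixing, say, Ruby and Python libraries in one process is a much harder plumbing problem, which is rather the commenter's point.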


My reply is that you may be able to drive a nail with a brick, but a hammer works much better.

Same for programming. The more you are aware of, the better able you are to choose the better way of proceeding on a project.

I'm probably scarred, though; I work with people who believe that C# and SQL Server can solve all the world's programming problems. (I like C# and SQL Server, but they aren't always the best solution for a given problem.)


>The more you are aware of, the better able you are to choose the better way of proceeding on a project.

This is part of the problem. "The right tool for the job" has become an end in itself, rather than a means to an end. People spend so much time trying to develop more than a superficial understanding of a vast array of languages and frameworks, simply so they can say they are using the "right" tool for the job (perhaps so their future "Show HN" post can be sexy). The "right" tool should always be judged against the real goal: successfully executing your project. Sometimes the right tool is simply the one that you or your team has expertise with, or that has a deep body of experience built up on StackOverflow. The "right" tool in absolute terms is completely meaningless. Oftentimes it seems like people forget this.


I didn't mention a "right" tool, I actually made sure to use the term "better", because "right" has those negative connotations.

For our purposes, there are "better" tools available than those that were chosen. That's a pain we're feeling now, a couple of years on, as we start to migrate a live application to those other platforms. It's a big loss of productivity for us, because choices were made to use tools that were familiar instead of figuring out whether they actually made the most sense.


Heh you're right. It's interesting how one's expectations can totally change how you read a statement. I see the "right tool for the job" tripe repeated so much that I start to see it everywhere :)


...but there's essentially no learning curve for either the brick or the hammer! If it takes you one month to learn how to use the hammer, you'd better use the brick. Now, if your "hammer" is actually more of a general-purpose tool than the "brick", it's the other way around; but if it's a hammer specially designed to work only at pounding nails into wood...


Most of the time (anecdata) a language is not used to solve programming problems, it's used to solve business problems.


>Note the big spike on the left and the mostly sub-2% numbers for programmers coding more than 50% of the time in one language. I expect, after some reflection, that most readers will find this chart unexceptional.

Am I missing something? I find the first graph in the article quite difficult to comprehend in terms of what it signifies. Although the numbers are "sub 2%" after the 50% line, isn't it an aggregation across 5 data points on the x-axis and 7 languages that will still add up to a substantial number of developers who program mostly in a single language?

I have yet to meet someone who switches heavily between languages on a daily basis in a given 12-month period. A typical occurrence is probably working in Python and JavaScript for a webapp and writing an occasional Chef cookbook in Ruby.


In the last 12 months I have written Java, Python, C++ and Go for work. I don't switch daily, but it's not unusual for the switching to be weekly or semi-monthly. That seems pretty polyglot to me. I'm not unusual at work either. Many of us switch among the big 3 of Java, Python, and C++.

It is possible that Google is unique in this though.


How do you like Go? I've been bouncing between Python, Clojure, and Go lately.


Go is my current favourite. It is opinionated in ways I appreciate and it is very low friction.


fwiw I switch between Javascript, Ruby, SQL, Objective C, and Java pretty frequently (all at one job), and I don't work at Google.


Going back around 15 years, I was mostly in VB5-6, with some VBScript, some JavaScript and some SQL. 10 years ago that expanded to include C#, VB.Net, ActionScript/Flash... most of those used at least a few hours a week, often switching between them... Today, it's about half .Net web projects (C#/HTML/CSS/JS) and the other half NodeJS backend code (JS), with more movement towards full-stack JS (NodeJS, MongoDB, web front end).

I like looking at different languages/tools/platforms, and am a bit into different languages. I've actually liked JS for a long time. The nature of web-based apps makes it a requirement, so it is nice to see decent frameworks take hold server-side that use it.

Just pointing out that, for me personally, if you add up all the web stuff, more than half my time is in JS these days, and five or ten years ago I could not say the same.


There's no way that "HTML/CSS/Javascript" should be lumped together.


Agreed.. I don't recall seeing SQL variants in there either. I do think that HTML/CSS could be removed, or lumped together. There's a lot of growth/movement in server-side JS between NodeJS and MongoDB alone.


I quite agree. I expect in future surveys JS will be broken out separately. (Note: I'm the author of the original article.)


I've never met a one language programmer. I have, however, met some who were only proficient in one. That's the big difference these days. You now have to be proficient in various languages in order to build stuff.


If I am mainly proficient in just C++ or Java (for examples), it really isn't true that I can't build stuff. I can build stuff. Whether it's the stuff you want to build is none of my concern.


If you don't consider HTML a programming language (and it's really a markup language anyway...), I know dozens of single-language programmers who only know PHP.

In fact, I think the majority of developers I know fall into that category.


As a data point, I'm strongly proficient in one (Objective-C) but use PHP, HTML, CSS and a small amount of JS to supplement the data handling, web services, and the web-site side of my mobile apps. I can get done what I need to in those other four, but I can't sit down and hack out a whole project like I can in Obj-C.

I'm also strongly proficient in Adobe Illustrator, Photoshop, and InDesign though, so I don't know if that offsets my proficiency in only one language.


All of my coding is in C# (for work). I don't work in anything else.

Of course, this is probably a lot rarer outside of business apps.


I bet you have had to write some SQL... (:


Nope. I write XML messages that are then sent to services that handle that kind of thing. (I know SQL, due to a previous job, but I don't write it in my current position.)


Many years ago, I was working with a Perl programmer and, making conversation over a lunch or coffee break, I asked him what other languages he knew. He replied, "only Perl." I was shocked and persisted in my questioning, even offering up other common languages in the hopes that he was being merely forgetful, but he was quite clear that he knew only a single programming language. This, I find very scary.

This is probably not far from the norm, where a programmer "knows" multiple languages but is only expert in one. That is almost as scary to me.

At least for me, there's a refreshingly synergistic effect from knowing and understanding multiple languages well.

In fact, I would go so far as to claim that a good programmer should know and understand a variety of languages and, ideally, use them frequently.


>I would go so far as to claim that a good programmer should know and understand a variety of languages and, ideally, use them frequently.

How frequently? Can you describe the average week of this programmer?


Given that the circumstances differ, I don't think I could provide any specific guidelines.

I am not averse to using multiple languages in a project if each serves a need so, in the past two months, I have used gsl, TCL, Python, C, and Go for various components of an ongoing project. But not necessarily all at the same time. Most days, I focus on one and only deal with two simultaneously during integration or code generation.


I'm not really opposed to using different languages in a project... but what you just described sounds like a lot of unnecessary complexity for a given project. Given that you could use a subset of two of those to reasonably accomplish what you need, it seems to me like you are dramatically increasing maintenance costs in favor of toying around.


This line of reasoning is quite standard, and for many purposes, quite valid. However, in my particular case, it is not. Each language served a specific purpose that could have been accomplished by another, but with greater effort. Sometimes, it's better to use multiple languages in a system than to use just one throughout. Unfortunately, I can't go into more detailed reasons, but I assure you, it is not for "toying" around.


I didn't mean to offend... I honestly like toying around with different languages, and have never been afraid to add more. I mean, a typical web project has data persistence, a server-side language, a server-side platform, a client-side language, common markup, and styling. Each of those could use a different language or set of tools... let alone more backend connections/services.

I've been leaning towards full-stack JS as much as possible, if only because it makes some of the communication channels much easier to work across. There's still templating (html/jade/mustache/markdown) and styling (css/less/sass/stylus) to consider, not to mention interaction with other systems/services (mongodb/sql/salesforce). Bringing as much of that under one umbrella as possible is useful.

On the flip side, if you have well documented services, and workers/queues you can break your load up and use a lot of different platforms.


You didn't mention differing circumstances, just that the skills are necessary to be a good programmer.

Also, "platform spread" is a real concern.


Why should it be scary? If you need someone to write some scripts in Perl, it should be just fine to use someone who mainly just knows Perl. Anyway, they can learn other stuff as they need to, once they know one language well, but you haven't made a case that this is actually necessary.


It's scary because, knowing only one language means that he can only think about, and express solutions to, any problems in terms of that language. In this case, he was not expert at Perl either, so the net result was ... not good. I remember seeing Perl programs that would have been better written as very short shell scripts.

Note that "mainly just knows Perl" is quite different from "only knows Perl".

Since then, I have never met another programmer who knows only one language; I think they are pretty rare. As for proving the case, I don't think I can: there may well exist a good monolingual programmer out there. It's just not likely.


> I remember seeing Perl programs that would have been better written as very short shell scripts.

This is really common. Anyone who spends time with code should take the time to learn the *nix toolset. It's a shame to see so many people reinventing tools that have been around since the 70s.


Scary for the person perhaps. Language fads come and go. You don't want to be a person with only one language on your resume when nobody uses that language anymore. Perl is probably not one of those, though.


Which languages makes all the difference. It's more important to know a language from each of a broad range of paradigms than to know a bunch of C-syntax-derived object-oriented languages (I say this, but I'm just as guilty of it as many others, despite trying to rectify it).

Frequent usage is more difficult, though. I'd like to think I know C++, Java, C and Python all very well, and SQL to a reasonable level. However, add the "dabbling" languages (Racket, Haskell, Scala, C#...): where exactly would the time for using these frequently come from?

Arguably, languages are only important to an extent - libraries surrounding languages are the truly important part.


I think knowing a given language well can help a lot... for example, the addition of generics and lambdas to C# has dramatically changed how I may approach a given problem in C#.

I know there is often a cost to using, for example, the LINQ extension methods, but they will look cleaner than the loop syntax a lot of the time.

Beyond this, simply using a new framework in an existing language can shape your views... for example, the differences in using JS with say jQueryUI, NodeJS, or AMD modules.

Knowing a language, and some platforms/libraries/approaches can greatly shape how you use that language. I once saw a VB app written by a COBOL convert... well, it ran in VB, but definitely looked like a COBOL application.


Assuming that the article is correct (and perhaps it is not), I think this is bad for software development.

It takes a lot of work to become really proficient in any one language. Multiple languages? For most people, that means only adequate proficiency — or even barely adequate proficiency — in some of them. Don't we already have enough lousy programmers?

I do expect a programmer to be able to handle at least a few languages; let's say C, HTML, CSS, Java, PHP, Emacs Lisp. Now throw in SQL, JavaScript, C++, ASP.Net, Python, Ruby, etc. It's not just syntax. How can anyone know most of the ins and outs of all of these, learn best practices, understand the libraries, and keep up to date on new versions?


From my (non-coder) point of view, it's hard to understand what drives the continual expansion in programming languages.

Is there a general principle for creating a whole new language, as opposed to adding features? Could you have a language with a setting for something like static typing vs. dynamic typing? The language would get more complex, but it seems like it would still be less complex than a whole new syntax.


Is this 'news'? I thought everyone understood that this has happened for some and is happening for many. I wrote this a year ago and didn't think the topic was groundbreaking. http://jobtipsforgeeks.com/2012/03/16/the-future-polyglot-pr...


Web programmers tend to take this for granted, but there are whole swaths of the software world that aren't web applications. For developers in those areas it may very well be something they haven't considered.


I switch between JavaScript, Python and PHP on a daily basis. If I have no other choice, I use Java or C# for the task given... I think that heterogeneous programming and language interoperability will be the next nuts we'll have to crack: learning that languages are tools and that we should use the proper one for the task.


All the currently successful languages became so because of their universal character: you can write a server, a GUI app, or an ML library that implements a neural network in C, C++ or Java; they all offer the balance of abstraction tools and performance needed to be universal.

The proper-tool-for-the-task way is pushing things backwards! By going this way you're running towards a brick wall, the wall of your limited brainpower: at some point you will not have enough brainpower to keep juggling the increasing number of "proper tools", and you'll have to stop growing, stop learning and enclose yourself in a warm niche/bubble. Yeah, this is exactly what your manager wants: you'll become a good but replaceable "programmer specializing in X, Y and Z" instead of the "uncategorizable freak" who embeds a Scheme interpreter only he understands in his C++ code, can do in a day what the rest of the team does in a month, but who may bring the whole company close to bankruptcy if he's hit by a bus because, well ...nobody can replace him :)

"Universal" tools are always better overall, even if they are worse for every particular situation you can imagine, because the "problem space" in software is expanding. In an "all is open-ended" world, universal tools are the only way to avoid being trapped in a "bubble" (yeah, you may be warm and cozy in a "web dev with Rails" bubble or a "frontend dev in JavaScript" bubble, but you're still in a bubble!).


>By going this way you're running towards a brick wall, the wall of your limited brainpower: at one point you will not have enough brainpower to keep juggling the increasing number of "proper tools" and you'll have to stop growing, stop learning and enclose yourself in warm niche/bubble.

I couldn't disagree more with this statement; it seems to me you have this backwards. The purpose of "proper tools" is exactly to encapsulate the complexity and provide a simpler interface with which we can address the problems. In fact, as the level of complexity rises, you need tools that provide the right level of abstraction. In many cases, those tools are specialized because they address specific aspects of the problem. Each such tool pushes out the boundaries of our limited brain power.

For instance, I use a lot of code generation in my work. A complicating factor in code generation is the fact that you are working at two or more levels simultaneously; the code generator and the target language. However, with the proper tools, the problem becomes very tractable. In this case, I am referring to explicit code generation, but exactly the same issues arise when generating code with Lisp style macros.

As to the universality of languages, I can only say that practice gives the lie to the theory. Once you've used enough languages, their differences (and fitness for particular problems) become clearer.


Let me get this straight: the guy who can write high-performance scalable async code in Erlang/Go for the backend, design a snappy and responsive UI in JavaScript or C#, and build modeling/analytics/stat tools in R, Haskell or Python, all with easily defined, usable and language-agnostic interfaces, is easily replaceable and living in a bubble, but write all of this only in Java and you are a programming god?

The pipeline was what made Unix great: the ability to chain different small stuff together. Huge monolithic systems are something I dislike. Personal taste.


I don't mean monolithic, I mean homogeneous / not heterogeneous. The Unix way was great while all the tools were written using a small set of languages and techs (C, Bash, maybe C++ here and there...) because a guy with a certain set of skills could just pop the hood up, get the code, and delve into the guts of any program... now you have "shit, I chose Ruby but that cool lib is Python" or "why is that huge Perl script used to configure and start this Erlang program?".

By "Java" (a bad example, I agree) I mean any multi-paradigm, multi-platform, general-purpose language: it can just as well be Scala on the JVM, or Python if its performance is good enough for you and you can put up with the way it annoys you out of your mind when you try to do functional programming in it. All languages claim to be so now, but in practice they are not! If you use Haskell for something, Erlang for something else, and integrate with some R code by means of some Python or C code, you pay a huge price in increased complexity, and you'd better make sure the advantages given by using "the right tool for the job" are worth this price... If you can master the kind of polyglot skills you talk about, great, you're definitely not in a bubble; but that is because you are in the top 0.1% in terms of "brainpower", not because you used your brainpower efficiently. Imagine what you could do if all the mental energy spent understanding all the little tech details of all those things were focused on solving just one problem with a smaller number of tools!


Funny conclusion: Java stays roughly the same between the two graphs; if you "join" C and C++ in both graphs, they, too, seem rather similar; and JavaScript probably got a bump because the second time, "HTML/CSS" was added as a "programming language". Apart from the other errors mentioned here, this seems very much like "only believe the statistics you have faked yourself"...


If you're looking at how individual languages compare, you're missing the point. If all these languages were folded into a single line, the editorial's main point would be exactly the same.


I wonder how this would correlate with different development stacks. It's possible that "language" here is used to mean something more like an ecosystem.


Seems like a big decline in developers working exclusively in the Windows ecosystem: C#, VB.NET.


I'm guessing that reflects a shift in enterprise system front-ends from the rich client craze to the webapp one.


What rich client craze? In 2010?


What was the actual question asked? (And did it change at all between surveys?)



