
Joe Armstrong on Programmer Productivity - mdevilliers
https://groups.google.com/d/msg/erlang-programming/OiyGQ4UHqxw/HgGma01CGqYJ
======
tom_b
FTA (favorite part for me)

 _Most time isn't spent programming anyway - programmer time is spent:

    
    
        a) fixing broken stuff that should not be broken
        b) trying to figure out what problem the customer actually wants solving
        c) writing experimental code to test some idea
        d) googling for some obscure fact that is needed to solve a) or b)
        e) writing and testing production code
    

e) is actually pretty easy once a) - d) are fixed. But most measurements of
productivity only measure lines of code in e) and man hours._

For me, b) is a bottleneck. c) is far and away my favorite part. Nothing like
green-fielding...

~~~
georgeoliver
At the larger software companies, at least, do they really only measure LOC
and man hours? It seems like you could develop decent estimates if you tracked
every asset on a job, its performance, and the overall conditions of the job
itself.

~~~
fusiongyro
The problem is that the assets are dynamic. If you give me a single complete
requirement for some little feature I can give you a pretty accurate estimate.
But most of my requirements are things like "Write an app to do X." The only
part of that I can estimate is the part I've done before, but if the app were
just like all the apps that came before there would be no need for a new app.
But the juicy candy center is an abject mystery and there is nothing to
compare it to from which to derive an accurate estimate. Once you've dug
halfway into it, you'll be tossing out new requirements. Maybe you can make
good estimates on those, but your a priori estimate is now probably really
off.

------
lkrubner
This one sentence sums up the sociological and psychological problems that
retard progress in the tech industry:

"Experiments that show that Erlang is N times better than "something else"
won't be believed if N is too high."

That is exactly it. I've had those arguments where I was forced to tell a
manager about another project I did for another company, a similar project
where things went quickly and smoothly, using technology the manager was not
familiar with. And every word I spoke was met with disbelief.

~~~
colanderman
Ya, I had a manager laugh at me when I told him I had learned Erlang over the
weekend (I am the resident "language lawyer", and Erlang's really not that
complex). He didn't seem to be able to grok that a language could be so
simple…

~~~
jefffoster
Languages are simple syntactically, but I'd have a hard time believing that
anyone could pick up the set of idioms necessary to effectively use a new
language in a few days.

Norvig covers this well in his Teach Yourself Programming in Ten Years
([http://norvig.com/21-days.html](http://norvig.com/21-days.html)) essay.

~~~
colanderman
Without knowing any other language? Sure.

Coming from a strong background in similar languages? Sorry, you're wrong
there. Perhaps you aren't familiar with Erlang; it is the "everyman" of
functional languages. The things I think you'd call "idioms" (not really sure
what you mean, as that's a very loose term) - let's say recursive programming
and pattern matching - carry over as-is from any number of other functional
languages. Pattern matching? It's a mix of OCaml and Prolog. Terms? Imagine
Scheme had tuples in addition to lists. Etc. The only "new" concept is the
message system, which, if you know anything about networking, is pretty
straightforward.

In other words, if you know pretty much any other functional language, it's
almost trivial to translate basic programs to and from Erlang knowing little
more than its syntax, which, as you concede, is simple.

OTOH, if you're coming from, say, PHP, sure, learning Erlang will be difficult
as first you have to understand functional programming. But you're asserting
impossibility, so counter-examples are moot.

Ten Years… I've been coding over twenty years in more languages than I can
count… I have a good idea how long it takes to learn a language well. Maybe
I'm biased because it's the latest language I've studied, but Erlang was the
first non-toy language where I skimmed the reference manual, said "huh, that
was all unsurprising", and started coding effectively.

------
scotch_drinker
The corollary to a) that I deal with all the time is third-party integrations
that are poorly documented and that, after some time X, magically start
working even though the third party changed nothing on their side (according
to them). Management always thinks "We've integrated things with this vendor
before, it will be a snap to do again on a totally different endpoint" and
this is never the case.

~~~
Finster
Then there's Facebook.

"We integrated with Facebook a few months ago, it should be easy to do it
again."

Famous last words. I've lost count of how many times I've become an "expert"
at some aspect of Facebook integration to have it be completely different just
a few months later. Google is also really bad about this, at least in the past
year or so. It almost makes me want to quit webdev and be a dba or something.

~~~
rantanplan
Feel ya. So many APIs and yet it seems like there is no _i_nterface, since
they change all the time. Might as well not have them at all.

DBA it is then.

------
lifeisstillgood
This is partly why I advocate a _slow code movement_, similar to the slow food
movement. Instead of sprinting I would like to walk to the destination,
avoiding the pains of broken ankles and of repairing shoes whilst running in
them.

I would like to explore to find a sane and reasonable approach, and to be
driven not by artificial deadlines guessed at three months ago but by today's
business need.

I would like to actually be measured on business value generated, not lines
of code written.

I would like to improve and simplify, deliver real value and savour the joy of
actually creating.

This may of course explain why I feel totally unproductive at times.

~~~
mbrock
I think that is entirely in line with the true values of agile, which haven't
really gotten through.

Slow food isn't about slowness for the sake of slowness. It's just about being
realistic and interested in the long-term. It's about saving energy and mental
stress.

One beautiful expression of this ideal is in "Domain-Driven Design."

~~~
ams6110
Agile in the sense you are describing really has the wrong name. Agile implies
fast; from Apple's Dictionary:

 _able to move quickly and easily; able to think and understand quickly_

Agile should have been called Adaptable, or some other name with less
implication of speed and more implication of accommodating unknown or
changing requirements.

No putting that toothpaste back in the tube at this point though.

~~~
tetha
Though this has the same problem as "cheap". People always equate "fast" or
"quick" with "quick right now" and "cheap" with "cheap right now". There are
enough examples in long-term projects to show that this is not true (though
the converse holds as well, and the problems that transition from short-term
to long-term are just nasty).

------
metabrew
I rewrote a (~10k lines) C++ app in Erlang back when I was learning Erlang,
and saw a ~75% reduction in lines of code.

[http://www.metabrew.com/article/rewriting-playdar-c-to-erlan...](http://www.metabrew.com/article/rewriting-playdar-c-to-erlang-massive-savings)

I expect the 'N' value varies wildly depending on what you are building, and
whichever language you are comparing against.

~~~
mightybyte
At a previous job I rewrote a (~9k lines) Java app in Java and ended up with a
77% reduction in lines of code! So it would appear that the "smart programmer
effect" is likely larger than the effect of most languages.

~~~
metabrew
In this case, I was re-implementing the C++ app I had written - so it was the
same programmer authoring both codebases.

I expect if I rewrote either codebase again, it might shrink even more.

------
jeffdavis
I've often wondered what the software development world would be like if we
used prototypes. Experimental hacking seems like a cut-down version of this
concept.

But what if we actually built things that we already agree in advance to throw
away? Maybe even start from two or three different plausible designs, and push
on each one for a while until one seems to be winning? Then rewrite it,
learning from the other contenders and from the prototype itself, all before
shipping.

The obvious answer is that it would take too long. But I'm not so sure. They
say designing software takes too long, also, but when I spend a few weeks
designing, the implementation ends up going smoothly and hitting the target.
Usually the features that are designed thoroughly at the start of the release
hit the target, and other "quick" features that are added in later, sans
design, take longer than the designed features and end up pushing the release
out. Any overruns or missteps in the implementation of the well-designed
features seem insignificant in comparison to things that skimped on design
work.

~~~
Kronopath
This is a well-discussed concept already - see the concept of "Spikes" in
Agile development, or to a lesser extent, the idea of "Tracer Bullets" from
_The Pragmatic Programmer_.

Of course, what's done in practice tends to differ, unfortunately.

~~~
jeffdavis
"Of course, what's done in practice tends to differ, unfortunately."

Exactly... I've never really seen this happen beyond what I would consider
"experimental hacking". Maybe it happens somewhere.

~~~
lifeisstillgood
I like Fred George's _developer anarchy_ in this regard - especially the
concept of micro-web-services. They are the equivalent of a unix command - do
one thing (well) and join up with MQ. If you make a service small enough you
can be confident putting it up as a prototype and rewriting it next weekend
in node.js.
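
As a rough sketch of "small enough" (this uses only node's built-in http
module; the port and the service's one job are made up for illustration, and
a real one would hang off the MQ instead):

    
    
        // One service, one job: upper-case whatever it is sent.
        var http = require('http');
        http.createServer(function (req, res) {
          var chunks = [];
          req.on('data', function (c) { chunks.push(c); });
          req.on('end', function () {
            res.writeHead(200, {'Content-Type': 'text/plain'});
            res.end(Buffer.concat(chunks).toString().toUpperCase());
          });
        }).listen(8080);
    

Something that small is cheap enough to throw away and rewrite.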

------
gruseom
There's a problem with language productivity comparisons that Armstrong
mentions: it's impossible to write the same program in two different
languages. If you have the same team write it, their second try will benefit
from everything they learned the first time, and that is so huge a part of
programming that it is sure to distort the outcome and may even dwarf any
language effects. But if you use different teams instead, you've traded one
confounding variable for another—the effect of switching teams—which is also
hugely influential. Thus it's impossible to do an apples-to-apples comparison,
and most such experiments deserve high skepticism. It's too easy to
consciously or unconsciously engineer the outcome you expect, which is
presumably why we nearly always hear that the experimenter's pet language won
the day. Has the experimenter's pet language ever _not_ won the day?

That makes me think of a more modest way to do these experiments that might
return more reliable results: use the same team twice, but have them solve the
problem in their favorite language first. That is, if A is the pet language
and you want to compare A to B, write the program first in A and then in B.
This biases the test in B's favour, because A will get penalized for all the
time it took to learn about the problem while B will get all that benefit for
free. Since there's already a major bias in favor of A, this levels the
playing field some.

Here's why I think this might be more reliable. If you run the experiment this
way and A comes out much better, you now have an answer to the charge that the
second time was easier: all that benefit went to B and B still lost.
Conversely, if A doesn't come out much better, you now have evidence that the
language effect isn't so great once you account for the learning effect.

~~~
arh68
This approach strikes me as the most realistic. If there is an existing
codebase in the preferred language A, the obvious question, if B wins, is "Do
we rewrite the code in B?"

The only downside - and I think adding a third language, C, tries to defuse
it - is that whatever the team writes first will always stick. For example:
if I wrote a version with dynamic typing (say in Ruby), then redid it with
static typing (Haskell), of course I'm going to try to reuse types. The
extreme example (Greenspun's tenth rule) is if my pet language A is Lisp:
regardless of what B is, the team could try to write a half-baked Lisp
runtime on top of B. The style carries over, and sometimes it doesn't
translate exactly. I don't know how to solve this.

~~~
gruseom
Good point—there are more effects than just "learning about the problem" that
carry over to the next time you write the program. Once your brain has
imprinted on a particular design for solving the problem, you'll probably
carry that over to the next implementation. It may not be the design you'd
have come up with if you were thinking in B in the first place and, short of
erasing your memory and starting over, there's no way to test that.

------
beat
I'm not sure the growth in the amount of software in the environment
(contributor to A) is really that much of a problem. After all, he says we
might have "thousands" of times more software, but we certainly aren't
spending thousands of times more fixing it. That's because today's software is
significantly less broken than the software of yore.

We're standing on the shoulders of giants. Underlying most of our environments
is Unix (or Linux, same diff), which is basically the same as it was 20, 30
years ago. We're also running on http, an astoundingly good design. There are
lots of other basics, none of which are all that complex, and all of which are
finely tuned.

More to the point, a) represents a practical limit. If our environments get
too flaky due to poorly understood configuration or bugs, we ultimately can't
get programming done. But not pushing to where it hurts some means not using
the latest, greatest tools that can amplify our power as programmers - the
same tools that let us have thousands of times more software than we used to,
_without_ losing any more time to configuration/bugs than we did decades ago.

And beyond that, a lot of the business opportunity in the industry lies with
running along just behind the bleeding edge, being firstest-with-mostest to
the new and powerful technologies. So unless you're in a safe business
relatively independent of new tech, you're going to bleed a bit.

~~~
pessimizer
>might have "thousands" of times more software

He said thousands of times more lines in each piece of software, not thousands
of times more software.

>That's because today's software is significantly less broken than the
software of yore.

As a per-line measurement, yes. As a per-program measurement, not even close.
Nor as a per-feature measurement, because those 1000x LOC are largely going
into abstraction layers at the bottom of the stack and chrome at the top.

------
ricardobeat
> 30 years ago there was far less software, but the software there was usually
> worked without any problems - the code was a lot smaller and consequently
> easier to understand

Pardon the interruption, but this is one of the areas where node.js shines
IMO. I can read the complete source code for a very complex application, or
at least know that each module has a reasonably-sized source and _is_
readable if the need arises. Very small modules and composition are
encouraged, not restricted to a fringe community, and you also get to share
tooling and libraries with the browser. All that on top of a friendly,
functional language. It really feels like a step forward.

~~~
GuiA
Never used nodejs before - is node's packaging system different from Python's
or Ruby's?

~~~
grncdr
Yes. The biggest difference is that there is no global namespace for modules.

    
    
        require('my-dependency')
    

returns a value, rather than aliasing identifiers into the current scope. So
you end up with code like:

    
    
        var myDep = require('my-dependency')
        exports.doSomething = function (x) {
          myDep.doSomethingToAnX(x);
          // more code here
        }
    

Python's import (and Go's, and probably others') works in a similar way,
except that in node, even the module names (the argument to require) are
local to each module. That is: `require('my-dependency')` can return
different values depending on the location of the file that called it. This
means your project can depend on two different libraries that both depend on
conflicting versions of "my-dependency". The mechanism by which this is
accomplished is simple and straightforward (sketched below). Another nice
side-effect (as compared to Ruby) is that you can easily isolate a copy of
any given dependency to monkey-patch or otherwise modify it, without
affecting anybody else. (Not true of built-in globals such as String or
Function, but that's a shortcoming of JavaScript, not Node.)
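
To make that concrete, here is a sketch (the library names are invented, and
this assumes npm's nested node_modules layout):

    
    
        // app/node_modules/lib-a/node_modules/my-dependency  (1.x)
        // app/node_modules/lib-b/node_modules/my-dependency  (2.x)
        // require() resolves relative to the file that calls it,
        // so each library quietly loads its own copy:
        var a = require('lib-a');  // sees my-dependency 1.x internally
        var b = require('lib-b');  // sees my-dependency 2.x internally
    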

Of course, the event+callback architecture and the preference for writing
everything in manual continuation-passing style make working in Node suck for
a host of other reasons (see the sketch below), but the module system is
really quite nice.
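
For the uninitiated, "manual continuation-passing style" looks like this
(the file names are contrived; the fs callbacks are standard):

    
    
        var fs = require('fs');
        // Each async step nests inside the previous callback, and
        // errors have to be checked by hand at every level.
        fs.readFile('a.txt', 'utf8', function (err, a) {
          if (err) return console.error(err);
          fs.readFile('b.txt', 'utf8', function (err, b) {
            if (err) return console.error(err);
            fs.writeFile('ab.txt', a + b, function (err) {
              if (err) return console.error(err);
              console.log('done');
            });
          });
        });
    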

~~~
doublerebel
Modified globals are the fault of bad coding, not Node. There are valid
reasons for using globals, but it's well known to be bad practice to modify
standard classes these days.

If someone wants not-yet-released features, they can compile to JS/Node with
source maps (e.g. Traceur, CoffeeScript). This also solves the CPS/callback
issue (e.g. Iced CoffeeScript).

npm is also _incredibly_ easy to publish to (one line in the CLI). That
leads to a ton of packages being released that would otherwise hit friction
in the release processes of other package managers.

------
zwieback
Nice post, although I disagree that 30 years ago stuff mostly worked. That's
not how I remember it, but I guess his main point was that there was a lot
less software then.

------
6ren

      The problem is we don't do similar things over and over again. Each new unsolved
      problem is precisely that, a new unsolved problem.
    

And if tasks _are_ similar, and they can be made mechanical (or at least
have their commonalities factored out), the work should be turned over to
the machine itself.

------
perfunctory
> I've been in this game for many years now, and I have the impression that a)
> is taking a larger and larger percentage of my time.

Have the same feeling.

------
jezclaremurugan
And it also depends on the editors, debuggers and other tools they use or
know how to use.

~~~
Patrick_Devine
Tools are definitely important; however, proficiency with those tools _and_,
most importantly, proficiency with the language are the largest determining
factors. It takes months (if not years) to get to the point where you're
really comfortable in a new language and can crank out volumes of code
relatively quickly.

If you made me write with Eclipse, I'd probably be horribly inefficient.
It's not that Eclipse is a bad tool; it's just that it lends itself to
languages that I'm not super fond of writing in (e.g. Java, ActionScript)
and it's much different from the way I'm used to working (with vi, git and a
command line). If you transplant a C# programmer into the vi and Python
world, they'll usually run away screaming. Neither is better; they're just
different.

------
brucehauman
Investments in complexity bring fewer and fewer benefits, until maintenance
alone consumes all resources. [http://t.co/CL87QjJNjg](http://t.co/CL87QjJNjg)

------
avty
Nothing kills programmer productivity faster than management.

------
sgustard
How about the perspective of Joe Armstrong (Green Day)?

~~~
richardjs
He's sleeping right now. Try again in twelve days.

