
An empirical study of working speed differences between software engineers [pdf] - mikebike
http://page.mi.fu-berlin.de/prechelt/Biblio/variance_tse2000.pdf
======
jondubois
Initial implementation speed is one of the least important parameters in
software engineering. Senior engineers are generally a bit slower but produce
higher quality code.

It's MUCH better to have an engineer who takes 3 days to implement a feature
in such a way that it doesn't need to be revised/fixed again for at least 1
year than to have an engineer who takes 4 hours to build that same feature
but in such a way that the feature has to be revised 10 times by 5 different
engineers over the course of the year.

The second approach actually consumes much more time in the medium and long
term. By putting too much pressure on engineers to implement features quickly,
you encourage them to create technical debt which another engineer will have
to deal with later.

It's basically a blame-shifting strategy.

~~~
modarts
How do you explain that to companies that follow the Google interviewing
model?

~~~
fredophile
What do you mean by "Google interviewing model"? Are you suggesting that this
is a problem at Google and companies that interview the way it does?

------
naveen99
I would think the speed ratio between the best and worst programmers has to be
infinite: some programmers simply cannot accomplish a task. Once they die, you
may as well divide by infinity.

For every task you need some minimal IQ. Some tasks need a higher IQ than
others. A programmer is someone who can at least do the least-IQ-requiring
task. He will fail on some more difficult tasks.

IQ is just a stand-in for mental compute capacity.

~~~
Dr_tldr
Surely you must realize how ridiculous that sounds. You can build basically
anything (including a kernel) by working from tutorials and starter projects,
then hacking through it. Someone's solution may be extremely sub-optimal, but
I've yet to see a programming task that wouldn't yield to googling and
persistence.

Inventing Calculus was hard, but virtually anyone can learn to pass a Calculus
test given enough time and incentives.

~~~
Hydraulix989
Tutorials don't cover things like corner cases, and they neglect nearly every
aspect of designing and integrating a large system, particularly scaling it.
Some of the worst code I've ever seen was "tutorial code."

I've encountered concurrency challenges in my day-to-day work that have no
cookie cutter solutions, are specific to my application (particularly in
constraints and requirements), and I haven't even been able to fully convince
myself of the correctness of my own solution, let alone contrive a way of
systematically proving or even testing it.

On the other hand, programmers are routinely faced with technical challenges
such as optimizing and approximating in the face of limited resources, where
the necessary heuristics and intuition are only discovered and learned through
experience and trial and error (pattern building and "finding" at its best).
These meta "design patterns" aren't found in any books yet, especially because
they are pretty hard to articulate in English.

There is an inventive element to the sort of problem solving that good
programmers perform that goes well beyond pasting solutions from
StackOverflow. This is why duct tape and bubble gum cobbling of out-of-the-box
solutions rarely works in practice (and why good engineers are paid so well).
Clever hacks are actually pretty damn clever.

It's scary how many "best practices" I've only learned by building my own
production systems and having actual users test what I've coded. Even after a
fair deal of sitting in classrooms being inundated with CS theory from the
very minds who conceived it, and years of building pet projects of my own,
nothing prepared me for the challenges I am encountering now.

~~~
Hydraulix989
I might add that the difference between a hacked together kernel and the Linux
kernel is HUGE. "Hacked together" projects are full of weird bugs, incorrect
concurrent code that still works 99+% of the time but then rarely crashes and
burns when that ill-anticipated failure case execution interleaving randomly
strikes -- to say nothing of problematic and unrefined UX. Well-designed
projects are a work of beauty.
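
A minimal sketch of that failure mode (all names mine, nothing from the
thread): two Python threads increment a shared counter without a lock. The
read-modify-write is not atomic, so increments are occasionally lost, yet most
runs look fine, which is exactly why this class of bug survives so long.

    import threading

    counter = 0

    def worker(iterations):
        global counter
        for _ in range(iterations):
            counter += 1  # read-modify-write: not atomic, racy

    threads = [threading.Thread(target=worker, args=(100_000,))
               for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Expected 200000; most runs look right, but an unlucky thread
    # interleaving silently drops increments.
    print(counter)

Depending on the interpreter's thread scheduling you may need many runs or
more iterations before a lost update shows up, which is rather the point.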

~~~
ForHackernews
> full of weird bugs, incorrect concurrent code that still works 99+% of the
> time but then rarely crashes and burns when that ill-anticipated failure
> case execution interleaving randomly strikes -- to say nothing of
> problematic and unrefined UX.

That sounds like an accurate description of 95% of the software in existence.

~~~
smaddox
Closer to 99.99999999%, I would estimate.

------
ivan_ah
In summary, the difference between "fast" programmers and "slow" programmers
is not 28 (as per Grant and Sackman folklore), but more in the range 1--7.
Specifically, _The work time variability tends to be larger for task type
“test /debug” (SF50 = 2.4, SF25 = 3.2) and even more for “programming” (SF50 =
2.4, SF25 = 7.1) than it is for “maintain” (SF50 = 1.7, SF25 = 2.4) or for
“understand” (SF50 = 1.8, SF25 = 2.9). Task type “review” is special. It
exhibits both low variability (SF50 = 1.1, SF25 = 1.3) and low skewness._
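
For concreteness, here is a small sketch of how I read the paper's SF measure
(the median work time of the slowest k% of subjects divided by the median of
the fastest k%); the work times below are made up for illustration.

    import statistics

    def sf(times, k):
        """SF_k: median of the slowest k% over median of the fastest k%."""
        n = max(1, round(len(times) * k / 100))
        ordered = sorted(times)
        return statistics.median(ordered[-n:]) / statistics.median(ordered[:n])

    times = [2, 3, 3, 4, 5, 6, 8, 12]  # hypothetical work times in hours
    print(sf(times, 50))  # ~2.3
    print(sf(times, 25))  # 4.0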

~~~
jsvaughan
It also says this:

"Thus, if we ignore the most extreme cases, the differences between the
slowest and the fastest individuals are by far not as dramatic as the 28:1
figure suggests"

Why would you ignore the most extreme cases?

------
lifeisstillgood
I want to start the "Slow Code" movement, after the Slow Food movement.
Yesterday, in a client's code base, I was under pressure to fix a thing, and
the obvious approach was adding gobs more code, quickly implementing the most
obvious route.

And after an (inordinately long) time reading, the light bulb moment happened
and I added a single line of code.

In my view, when you are in the business of making Seven-League Boots, you
don't need to sprint.

Yes we want to deliver products quickly, but the link between good products,
effective business and speed of code writing is tenuous at best. Take your
time, line up your shots, and be sure you are a value multiplier. (That's the
real 10x programmer. 10x as valuable. That might mean using good SEO
techniques to get paying customers, but using boring old SQL back ends.)

(Link to the years old article I have not written yet, comments welcome
[http://www.mikadosoftware.com/articles/slowcodemovement](http://www.mikadosoftware.com/articles/slowcodemovement))

~~~
codingmyway
That's so true. The 10x programmers are those that make everyone else's work
faster and more accurate.

To me the main job of a lead dev is setting up all the boring processes of
build scripts, continuous deployment, package libraries, testing environments,
etc., that make the other developers' jobs easier.

------
umanwizard
I suspect that the most important difference between great and typical
engineers is that the great ones make a better product, not that they make it
faster. If my hunch is correct, worrying about concepts like "10x" is missing
the point.

~~~
snoman
I'm not saying anything about you here, but I feel like this is what a typical
engineer would like to believe to make themselves feel better. Getting things
done faster leaves more time for doing them better; once you're at that stage,
it's a simple matter of discipline.

~~~
JimDabell
> getting things done faster leaves more time for doing them better

It's the other way around. If you take the time to do things better, then you
build up momentum and make future work easier. If you want to improve
productivity, slow down and do things right, consistently. It'll pay off in
the long run.

~~~
collyw
One example of this I see all the time: you need some new functionality, but
altering database tables is a pain in the arse compared to adding more code.
So you add more code. Rinse, repeat, until you have an unwieldy mess of code.
Changing the database model would have been more difficult initially, but it's
often worth the effort.
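
A hypothetical sketch of that trade-off (table and column names are mine),
using sqlite3: the "add more code" path stuffs a new attribute into an
existing free-text column and parses it back out everywhere, while changing
the model is one migration followed by plain queries.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, notes TEXT)")

    # Workaround: encode is_admin inside the existing notes column.
    conn.execute("INSERT INTO users (notes) VALUES ('is_admin=1;likes=cats')")
    notes = conn.execute("SELECT notes FROM users WHERE id = 1").fetchone()[0]
    is_admin = "is_admin=1" in notes  # fragile parsing, soon scattered around

    # Changing the model: one migration up front, then plain queries forever.
    conn.execute("ALTER TABLE users ADD COLUMN is_admin INTEGER DEFAULT 0")
    conn.execute("UPDATE users SET is_admin = 1 WHERE id = 1")
    is_admin = bool(conn.execute(
        "SELECT is_admin FROM users WHERE id = 1").fetchone()[0])

Every feature built on the workaround adds another ad-hoc parser; the
migration costs more once and nothing after.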

------
projectramo
I think they still found a large difference between the best and worst
developers. Another summary, quoting the paper:

The main findings from this investigation of the dataset variance.data can be
summarized as follows:

> The interpersonal variability in working time is rather different for
> different types of tasks.

> More robust than comparing the slowest to the fastest individual is a
> comparison of, for example, the slowest to the fastest quarter (precisely:
> the medians of the quarters) of the subjects, called SF25.

> The ratio of slowest versus fastest quarter is rarely larger than 4:1, even
> for task types with high variability. Typical ratios are in the range 2:1
> to 3:1. The data from the Grant/Sackman experiment (with values up to 8:1)
> is rather unusual in comparison.

> Caveat: Maybe most experiments represented in variance.data underestimate
> the realistic interpersonal variability somewhat, because in practical
> contexts the population of software engineering staff will often be more
> inhomogeneous than the populations (typically CS students) used in most
> experiments.

> Still only little is known about the shape of working time distributions.
> However, variance.data exhibits a clear trend towards positive skewness for
> task types with large variability.

> The effect size (relative difference of the work time group means) is very
> different from one experiment to the next. The median is about 14%.

> The oft-cited ratio of 28:1 for slowest to fastest work time in the
> Grant/Sackman experiment is plain wrong. The correct value is 14:1.

------
questionr
"The ratio of slowest versus fastest quarter is rarely larger than 4:1, even
for task types with high variability. Typical ratios are in the range 2:1 to
3:1. The data from the Grant/Sackman experiment (with values up to 8:1) is
rather unusual in comparison."

~~~
douche
Even if it isn't the mythical 10x ratio, hiring somebody that can do 3 times
more than the next person is a no-brainer. Although I honestly wonder if that
baseline is dragged down so hard by all the people that really can't do the
job they are supposed to be doing. Programming is hard, and doing it well is
even harder. "Everyone can code" movements are great propaganda, but I'll be
honest, I've never known anybody that actually could code who wasn't a couple
of standard deviations smarter than the average Joe or Jane.

~~~
Scarblac
I'm much slower these days than I used to be, and it's because I'm bored. The
fact that I need to do my work in a web browser, usually of necessity with
internet access, doesn't help.

~~~
drumdance
For the last year I've been working on a project that has a complex toolchain,
such that every time I hit cmd-s I have to wait 2-5 seconds before I can
reload the browser to see the changes. It's remarkable how easily I can get
distracted in that short interval, especially if I experience it dozens of
times per day.

For the last couple of days I've been doing Project Euler in Ruby, and the
lack of lag time translates into much better focus.

------
SFJulie
The Grant/Sackman experiment: often quoted, rarely reproduced.

To «prove» the existence of the 10x programmer, you would need a bimodal
distribution of worker speed.

Grant/Sackman, Peopleware, and The Mythical Man-Month all try to answer a
tricky question: what makes a creative person productive?

People focus on the speed, but they are forgetting the most important part of
the experiment.

One of the most important parts of the G/S experiment, which everybody
forgets, is the lack of correlation between performance and:

1) diploma

2) experience after 2 years of practice.

Having worked more than one job, in other fields that are also
creativity-based, my «feeling» is that this applies not only to coders but to
musicians, intellectual professions, journalists, project managers...

What are the implications of this lack of correlation with diplomas and
experience?

1) Diplomas are overpriced; the job market is artificially skewed in favor of
those who have the money for them;

2) New devs are underpaid, old devs overpaid.

The burden of proof that a diploma or experience is relevant to a job should
rest on those selling the diplomas. Diplomas, especially in computer science,
seem to be a SCAM.

The effects of this scam are:

1) young workers enslaved by loans, in jobs they may not be good at or may not
like;

2) a rigid job market that prevents people from moving, artificially creating
obstacles to full employment;

3) artificially exacerbated competition, resulting in cheating from both
sides.

------
philangist
I'm pretty sure that I'm a "slow" developer. Not in terms of actually coming
up with the core fix to the problem I'm working on (that's actually the most
straightforward part of any project) but everything else that follows: writing
clean, well-tested, documented and maintainable code. Is this something to be
concerned about long-term, or should I just accept that my work will always
take a little longer to complete than my fellow developers'?

~~~
BurningFrog
Find fast people to pair program with. You'll learn some good speedup
techniques.

~~~
fapjacks
This is really a fantastic suggestion! Pair programming can't be advocated
enough. I used to think it was stupid many years ago, and then I made friends
with a guy who'd been programming longer than I'd been alive. We shared an
office and would both stay late, and in the evenings he started mentoring me
and we'd pair program. I think there is _no_ better way to bring programmers
up to speed (whether junior-senior or slow-fast or new-old or whatever). Also,
having tried a variety of interviewing techniques, I find pair programming to
have one of the highest concentrations of useful information about a
candidate. Work sample being another.

~~~
BurningFrog
Are you not allowed to pair during the day?

~~~
fapjacks
Well, this was years ago, and we were working on different tasks during the
workday. It started with me asking questions and him showing me how to do
something, and that turned into pair programming. As in many workplaces, there
wasn't much time for teaching moments during the workday.

------
alephnil
From the article:

> Three of the twelve subjects did not use the recommended high-level
> language JTS for solving the task, but rather programmed in assembly
> language instead. Two of these three in fact required the longest working
> times of all subjects. One might argue that the decision for using assembly
> is part of the individual differences, but presumably most programmers would
> not agree that doing the program in assembly is the same task as doing it in
> a high-level language.

In my experience, making the right decisions like that is the real difference
between good and not-so-good programmers. Good programmers on average make
better choices: choices that result in less code, in code that is easier to
maintain and reason about, and in languages and architectures that fit the
problem at hand. It is usually not that good programmers develop so much
faster.

------
iamleppert
It would be interesting to have biographical sketches of the best and worst
developer in this study.

------
Fede_V
I have a feeling that implementation speed is an instance of Goodhart's Law
([https://en.wikipedia.org/wiki/Goodhart%27s_law](https://en.wikipedia.org/wiki/Goodhart%27s_law)).
If you keep everything constant (code quality, amount of tests, documentation,
etc.) then a faster engineer is obviously better. However, if you start using
speed as a criterion to judge engineers, then the easiest way to increase
speed is to sacrifice the things which make code maintainable and modular.

Finding metrics which work well even when people try to game them is
incredibly difficult (if not impossible).

------
kpil
I think this test is only indicative, and the long-term real-world ratio is
much higher.

Inefficient and complicated solutions build up, and the mediocre developer
ends up fixing old problems.

(Slow is just an indication of mediocre.)

Unfortunately the long-term effects are not visible until after a long time
(duh), hiding the individual differences.

------
davidgerard
(2000)

