
Should we all be looking for marginal gains? - anishkothari
http://www.bbc.com/news/magazine-34247629
======
jacobolus
This has some cute anecdotes, but is overall a pretty intellectually flimsy
argument. The summary is basically “when given a concrete metric, humans are
really good at optimizing to it. These improvements compound, and dramatic
qualitative changes can result from many tiny incremental steps.”

For anyone in computing, where we’ve seen improvement of 5+ orders of
magnitude in the past 50 years, this should hardly be a new insight.

No effort is made to examine the trade-off between tackling tiny marginal
problems vs. rethinking more fundamental assumptions and practices. Likewise,
there’s no consideration of whether the metrics involved (e.g. reduction of
liability insurance premiums, number of hot-dogs eaten during a contest,
success at exams, or toolbar click-through rate) are the most important things
to optimize, or indeed if by spending great effort optimizing for those
specific criteria we might create unexpected costs and side effects that we
won’t necessarily even know about.

I shouldn’t be too hard on the author I guess... he’s just trying to promote
his new self-help book.

~~~
wpietri
I think the contrast shouldn't be with rethinking fundamental assumptions,
which is another good but rare activity. It should be with assuming that
things are pretty much as good as they get, or assuming that small wins aren't
worth bothering with. Because that's where a lot of businesses are all the
time. Computer hardware is very much the exception.

If you want an example of where this matters, the car industry is a great
example. In Rother's "Toyota Kata", he has a graph showing productivity per
worker among big auto firms. The large car companies all rise together until
the 60s and then plateau. Toyota, which had a very strong focus on continuous
improvement, just kept on rising for decades. This enabled them to go from
Japan's post-war decimation to become the world's largest car maker while the
others stagnated.

And I should add that continuous improvement is a good way to drive
fundamental rethinking. The only way to get long-running continuous
improvement is to pay a great deal of attention to how things actually are
working. Eventually you will say, "Well, now that we've solved a bunch of
little things, the biggest bottleneck is X". But changing X forces fundamental
reevaluations.

For example, consider software release. Unreleased software is like inventory
just sitting around the factory. You've paid money to make it, but it's not
earning any money for you.

Years ago, quarterly and annual release cycles were common. The last part of
the cycle was when a large, manual QA team would beat up the whole product. As
we have kept squeezing release cycles, this has forced all sorts of
innovation. At my last couple of companies, we've used continuous deployment,
where every commit to master is automatically tested and deployed to
production. These small teams averaged a few releases a day and had no QA
people. Etsy has 50+ releases a day [1]. If you had described this to people
20 years ago, they'd have called you insane; it's a fundamentally different
view of how software gets made. But we've gotten there via 15 years of
continuous improvement.
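The core of that workflow can be sketched in a few lines (illustrative only; the function and callable names here are hypothetical stand-ins, not any real company's pipeline):

```python
def deploy_if_green(commit_sha, run_tests, deploy):
    """Continuous-deployment gate: every commit to master is tested
    automatically and reaches production only if the suite passes.
    run_tests and deploy are injected callables standing in for a real
    CI runner and deploy script (hypothetical, for illustration)."""
    if not run_tests(commit_sha):
        return "blocked"      # a red build never ships
    deploy(commit_sha)
    return "deployed"

# Usage with fake test/deploy steps:
shipped = []
status = deploy_if_green("abc123",
                         run_tests=lambda sha: True,  # pretend the suite is green
                         deploy=shipped.append)
```

The point is the gate, not the plumbing: no human sign-off stands between a green build and production.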

[1] [http://www.infoq.com/news/2014/03/etsy-deploy-50-times-a-day](http://www.infoq.com/news/2014/03/etsy-deploy-50-times-a-day)

~~~
landryraccoon
To be fair, quarterly and annual release cycles were common when software was
a physical product. Games had to sell retail, and enterprise software still
needed to deliver disks. Until the early 2000s there was no point in
continuous delivery because your release process still had to be waterfall
- you were gated by extremely slow physical processes like retail outlet
logistics.

Once continuous deployment to the end user was possible (via the internet),
the industry started following suit. And nowadays there is still something
similar for iPhone developers - you're gated by the Apple app store, which
doesn't have to be slow, but somehow still is. (This has important
implications, like having to front-load a huge amount of testing before app
submission, because you can't be certain of how quickly you'll be able to
release a frontend patch.)

~~~
wpietri
Sure, the shift in delivery mechanisms provided a lot of incentive to move to
faster deployment.

But I should note that a great deal of software in ye olden dayes was in-house
software, and that still ran on 3-18 month cycles. And that wasn't because of
technological limitations; my dad was delivering new versions of software
every few days even in the 70s. It's just that the conceptual model and the
dogma pushed in the direction of long, heavily-planned release cycles. The
Internet's main contribution here was to make short cycles not only possible
for commercial releases, but competitively advantageous in an obvious way.

------
cperciva
Patrick doesn't seem to be here yet, so I'll deliver his line for him: This is
the human version of making lots of small tweaks to your website and to your
advertising. If you can find a 1% improvement to your advertising clickthrough
rate and a 1% improvement to your conversion rate every day, by the end of the
month you'll have nearly doubled your sales.
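A quick sanity check of that compounding arithmetic (a sketch; the two independent 1%-per-day gains and the 30-day month are the comment's hypothetical figures):

```python
# Clickthrough and conversion each improve 1% per day; sales scale with
# their product, so each day multiplies sales by 1.01 * 1.01 = 1.0201.
daily_multiplier = 1.01 * 1.01
sales_factor = daily_multiplier ** 30  # after a 30-day month

print(round(sales_factor, 2))  # ~1.82, i.e. "nearly doubled"
```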

~~~
throwaway_exer
A smart friend of mine in publishing (now a retired multi-millionaire) once
said to me, "Successful publishing businesses never stop looking for ways to
improve revenue 5% at a time."

------
lazyant
This article looks like somebody read some books and summarized parts of them
around a common theme without adding anything; I know because I've read some
of those books. Still, it's a good summary.

~~~
x43b
I felt the same way. The sport optimizations felt like they were from Faster,
Higher, Stronger. The hot dogs I read about in a chapter on outsiders and
innovation, maybe in a Freakonomics book. I know I've read about the hospital
checklists before too.

------
scottw
Marginally related: an entertaining exhibition table tennis match between the
author (Syed) and Barney Reed (worked at Google)
[https://www.youtube.com/watch?v=4ug0qbmi2Kw](https://www.youtube.com/watch?v=4ug0qbmi2Kw)

------
erikb
For the public at large this might be a new argument. For us here on HN it's
old news, though. Since around 2012 we've often seen articles showing that you
shouldn't focus only on the biggest optimizations. Remember the article
showing that Facebook's PHP compiler was slowly overtaken by their
interpreter, because people optimized the interpreter step by step and simply
did more of that than the compiler team, and the compiler team could only
fight back by doing all the little things too, just a lot of them?

------
kaonashi
When you've taken care of the larger gains, sure.

But don't start with the minutiae, start with the fundamentals.

------
x43b
I was surprised at the book promoted at the end. I was positive this was going
to be a revised excerpt from Faster, Higher, Stronger by Mark McClusky.

~~~
meatysnapper
Given the Tour de Fukushima this year, you would be forgiven for thinking such
a thing.

British cycling is hilarious; whatever they are doing is going to trickle into
all power/weight sports.

