One thing I've enjoyed so far about being a contractor (but not a consultant) is that even as a relatively junior developer, I get to see the dirty internals of many different companies, and spot patterns between them. This is one of them.
I've been at companies that decided to switch to hot new NoSQL distributed fault-tolerant join-free key-value vector clock databases, when really they only needed to add a couple of indexes to a few heavily-queried fields. I've seen language switches and full re-architectures driven by perceived performance problems, where the complexity of the new architecture made request latencies worse (cf. https://en.wikipedia.org/wiki/Second-system_effect).
The best approach I've seen so far is described as 'list, rank, iterate'. Profile your problems aggressively, rank the issues in descending order of importance, and greedily work your way down the list, fixing them.
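The "list, rank, iterate" loop can be sketched in a few lines of Python. The function names here are hypothetical stand-ins for real code paths, and the crude timing is a placeholder for a real profiler such as cProfile:

```python
import time

# Hypothetical hotspots standing in for real code paths.
def slow_hotspot():
    return sum(i * i for i in range(200_000))

def fast_path():
    return 42

def measure(fn):
    """Time one call; a real profiler (e.g. cProfile) gives richer data."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# 1. List: measure every suspect.
timings = {fn.__name__: measure(fn) for fn in (slow_hotspot, fast_path)}

# 2. Rank: sort in descending order of cost.
ranked = sorted(timings.items(), key=lambda kv: kv[1], reverse=True)

# 3. Iterate: fix the top entry, re-measure, repeat.
for name, cost in ranked:
    print(f"{name}: {cost:.6f}s")
```

The greedy part is the loop: you only ever attack the current top of the list, then re-profile, because fixing one hotspot often reshuffles the ranking below it.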
Before NoSQL was a thing, I saw people break their data out into 1000s of .txt files to get around this.
Whenever I run into a fresh technical problem, I often think of this:
"Why does Grandma remove the chicken's legs when she makes soup?"
Aunt Dorothy: "It's easier to cut the chicken when it's cold than when it's hot."
Uncle Bob: "Greater surface area better infuses the broth with fat."
Aunt Sue: "To let the dark meat and white meat develop flavor on their own."
Aunt Jean: "Smaller pieces allow the chicken to cook faster and more thoroughly."
Grandma: "So that it fits in the pot."
"In The Periodic Table, Primo Levi tells a story that happened when he was working in a varnish factory. He was a chemist, and he was fascinated by the fact that the varnish recipe included a raw onion. What could it be for? No one knew; it was just part of the recipe. So he investigated, and eventually discovered that they had started throwing the onion in years ago to test the temperature of the varnish: if it was hot enough, the onion would fry."
Love this! So true.
I'll never forget my first job out of college, a Senior Developer told me something along the lines of "Nothing we do is cutting edge. You will almost never need anything more sophisticated than brute force algorithms and database I/O." And yet we managed to create some pretty cool products for our users. Good article.
Other than that, the first two paragraphs made me think of my own startup, and how we can get users to engage more often and attract more users. They immediately transported me into the head of our main users, and I had to message the developers on the team for thoughts.
Great post, and thanks for sharing!
Iannucci also made The Thick of It, which you should really watch if you haven't seen it. One of the best pieces of political satire ever made.
The first question I ask myself, as an engineer approaching a new problem, is "How can I cheat? What's the 90% solution that will take 10% of the time?" (The next question is, "Is 90% enough?").
I'm always surprised by engineers who don't think this way.
(From The Camel Book (Programming Perl) in about 1990 or '91, from memory)
I always try to start my day with something really boring and basic for this reason: things like "Add X to DB class" are tedious, but they help bootstrap my brain into a place where I can tackle the bigger problems (if I need to).
I've taken it as a sign that I am officially "old" that this article was stunningly obvious. I remember being introduced to a profiler and similar instrumentation techniques by my first software engineering mentor.
Value, including your reputation, is built on applying intellectual leverage to get results that are faster and larger than expected.
The quick fixes (low-hanging fruit) buy time to run a Gedankenversuch (thought experiment) on your big architectural solution, so you can apply your critical mind to tearing it down, rebuilding it, boiling the ocean for tea, and tearing it down again while washing your underarms in the shower. With a new team, it builds trust so that "rudely" doesn't come into it when you open the topic.
 This is the leading cause of having to jump back into the shower because I forgot to rinse.
It's a virtue too rarely seen, that you should try to solve a problem with the minimal set of code possible.
Everything should be as simple as possible, but not simpler. (Sometimes attributed to Albert Einstein, though he probably never uttered those exact words.)
Edit: discussion on Hacker News
The author gives a few examples where they got stuck in and fixed the "obvious" problems. My experience is that what is a problem is not always obvious. Furthermore, while the big wins might be obvious, you'll miss many small wins if you're not carefully tracking stats. These are the main lessons of A/B testing.
It didn't take Chi-square analysis for Microsoft to notice that nobody could figure out how to start a program in Windows 95; they just sat down and watched a few people, and it was suddenly very clear that they needed a big arrow pointing to the start button.
Generating hypotheses often involves qualitative techniques. Validation must be quantitative.
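As a minimal sketch of what quantitative validation looks like, here is a two-proportion z-test on made-up conversion counts, using only the standard library (the numbers are illustrative, not from any real experiment):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: equal conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: variant B converts 2.6% vs A's 2.0% on 10k users each.
z, p = two_proportion_z(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"z={z:.2f}, p={p:.4f}")
```

A qualitative hunch ("the new checkout feels better") generates the hypothesis; a test like this decides whether the observed lift is real or noise.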
A lot of this comes from having experience, you know the right place to start looking for the problem.
People see blog posts about changing button colours and think that is all there is to A/B testing. I've seen far more success with much more radical changes. I work in the A/B testing industry (see my startup Myna: http://mynaweb.com) and this experience is reflected by others in the industry. E.g. see this video by Dan Siroker of Optimizely (a competitor of ours): http://www.youtube.com/watch?v=lDfoVxHud7Y&feature=youtu... He directly addresses this issue.
There is no reason you can't test the example you give of streamlining the checkout flow. So long as you have sufficient resources you can test anything you can change and measure -- landing pages, websites, business models...
A/B testing is an important tool to aid optimization, but knowing where to focus your efforts is key. I have over a decade's worth of experience improving e-commerce conversion, so generally my first step isn't introducing testing but fixing the obvious things I know will have an impact because they always have an impact, e.g. fixing a confusing checkout; improving search and navigation; embarking on a data cleanup and improving product content; etc.
"The obvious things I know will have an impact" sometimes don't have an impact and sometimes have a negative impact. Experience makes the chances better, but nowhere near perfect.
To give your full decision-making power over to an algorithm is stupid. To ignore quantitative power when it's available to you in your decision-making is stupid.
You'd be surprised how many big strategic decisions are made on top of a mountain of opinions. Try to be guided by what is happening not what should happen.
So while I agree that we can overcomplicate a lot of things, it's not safe to say "just go with your intuition." It doesn't have to be as cut and dried as "once users get 10 friends they keep coming back," but don't use that as an excuse not to look at your data.
On the other hand, for the problems written about in the article no such methods are known, so they are far more complicated. The only thing that lets us apply them in our world is that our society brainwashes people into common sense (which is far, far away from good solutions; common sense only delivers solutions that barely work).
So I still believe (more than ever) that hard problems require hard solutions. But I define "hard solutions" differently than the article.
"But I wear a kilt you insensitive clod!"
(Just kidding, it was a great read apart from that!)
Better simple than easy