Check you're wearing trousers first (robertheaton.com)
321 points by FailMore 1693 days ago | hide | past | web | 58 comments | favorite



Yes, 100%.

One thing I've enjoyed so far about being a contractor (but not a consultant) is that even as a relatively junior developer, I get to see the dirty internals of many different companies, and spot patterns between them. This is one of them.

I've been at companies that decided to switch to hot new NoSQL distributed fault-tolerant join-free key-value vector clock databases, when really they only needed to add a couple of indexes to a few heavily-queried fields. I've seen language switches and full re-architectures based on perceived performance problems, but the complexities of the new architecture made request latencies worse (c.f. https://en.wikipedia.org/wiki/Second-system_effect).

The best approach I've seen so far is described as 'list, rank, iterate'. Profile your problems aggressively, rank the issues in descending order of importance, and greedily work your way down the list, fixing them.


If there's one thing that drives me potty it's people who claim that MySQL is too slow for their requirements (usually a low-to-medium-traffic website), then I look at their implementation and see no indexes on non-primary-key fields.

Before NoSQL was a thing, I saw people break their data out into 1000s of .txt files to get around this.
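A minimal sketch of what that fix usually looks like, using SQLite here so it's self-contained (the table and column names are invented; the same idea applies to MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

# Before: a lookup on a non-primary-key field has to scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?", ("a@example.com",)
).fetchall()

# The one-line fix: add an index to the heavily-queried field.
conn.execute("CREATE INDEX idx_users_email ON users (email)")

# After: the planner uses the index instead of scanning.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?", ("a@example.com",)
).fetchall()

print(plan_before)
print(plan_after)
```

Run EXPLAIN before reaching for a new database; the plan usually tells you everything.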


There are so many good reasons to avoid MySQL that one shouldn't even need to bring up speed.


Agreed, but in this case "mySQL" meant "SQL" (vs. NoSQL).


The single most useful thing to address perceived performance problems: Measure your performance. An often neglected part of the "list, rank, iterate" cycle is generating performance indicators that can be quantitatively assessed.
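A sketch of what "measure, then rank" can look like in practice, using Python's stdlib profiler (the slow_endpoint and fetch_rows functions are stand-ins, not anyone's real code):

```python
import cProfile
import io
import pstats

def fetch_rows():
    # Stand-in for a hot spot, e.g. an unindexed query.
    return sum(i * i for i in range(200_000))

def slow_endpoint():
    return [fetch_rows() for _ in range(5)]

profiler = cProfile.Profile()
profiler.enable()
slow_endpoint()
profiler.disable()

# Rank callees in descending order of cumulative cost, then fix from the top.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

The sorted report is the "list, rank" half for free; "iterate" is re-running it after each fix.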


Nice post.

Whenever I run into a fresh technical problem, I often think of this:

"Why does Grandma remove the chicken's legs when she makes soup?"

Aunt Dorothy: "It's easier to cut the chicken when it's cold than when it's hot."

Uncle Bob: "Greater surface area better infuses the broth with fat."

Aunt Sue: "To allow the dark meat and white meat to develop flavor on their own."

Aunt Jean: "Smaller pieces allow the chicken to cook faster and more thoroughly."

Grandma: "So that it fits in the pot."


There's an even more effective version of this story that ends with grandma saying something like "back when I was a girl, we couldn't afford a big enough pot."


Yes, I believe the intended lesson of the story isn't about simple solutions or explanations, but about how the reasons for doing things in certain ways are forgotten or misconstrued as they are handed down through generations or across organizational boundaries. The mother tells her daughter they cut the legs off because it cooks more evenly; the grandmother says it's because it makes for a better flavor; the great-grandmother says it's because she couldn't afford a big enough pot.


In the version I've come across, it's "I don't know; that's just how my Grandma did it."



Perhaps the word you are thinking of is apocryphal. But I do like what the spelling apocraphyl implies--it sounds like some kind of plant compound. May I borrow this spelling?


This should be the name of the pigment that imbues those cheap bunches of flowers at the local deli with their artificially bright neon petals.


I too rather like this, but more for the reason that it emphasizes the "crap" that apocryphal stories oftentimes turn out to be :)


I just registered apocryph.al last week. I think it'd make a great blog URL.


What? Next you'll be telling me that nobody's grandma ever actually sucked eggs either!


A pg essay repeats a similar story about onions in the varnish. http://www.paulgraham.com/arcll1.html

"In The Periodic Table, Primo Levi tells a story that happened when he was working in a varnish factory. He was a chemist, and he was fascinated by the fact that the varnish recipe included a raw onion. What could it be for? No one knew; it was just part of the recipe. So he investigated, and eventually discovered that they had started throwing the onion in years ago to test the temperature of the varnish: if it was hot enough, the onion would fry."


It’s a pleasant delusion to believe that all our problems require hard solutions.

Love this! So true.

I'll never forget my first job out of college, a Senior Developer told me something along the lines of "Nothing we do is cutting edge. You will almost never need anything more sophisticated than brute force algorithms and database I/O." And yet we managed to create some pretty cool products for our users. Good article.


The "STOP SPENDING SO MUCH MONEY ON HELICOPTERS AND MANAGEMENT CONSULTANTS" line reminded me of this piece, from the excellent Armando Iannucci Shows:

http://www.youtube.com/watch?v=MSJggp-mbiA&t=50s


Funny thing is that I was a consultant before, and was brought into a company to look at their procurement decisions. And lo and behold, they actually had a helicopter. Needless to say, we said to get rid of it.

Other than that, the first two paragraphs made me think of my own startup, and how we can get users to engage more often and get more users. Immediately transported me into the head of our main users, and I had to message the developers on the team for thoughts.

Great post, and thanks for sharing!


This is surprisingly entertaining.


That show was great. Not as well known as it should be because it had the bad luck of premiering on 9/11/2001.

Ianucci also made The Thick of It, which you should really watch if you haven't seen it. One of the best pieces of political satire ever made.


If you haven't seen Yes Minister/Yes Prime Minister, you really should. 30 years on and still great political satire.


There's some fantastic parts like the bit about leading questions: http://www.youtube.com/watch?v=G0ZZJXw4MTA


+1 for The Thick of It. Which is, miraculously, free to watch on Hulu (at least in the US).


Elephant in the room indeed.


I hadn't actually made that connection!


Excellent post.

The first question I ask myself, as an engineer approaching a new problem, is "How can I cheat? What's the 90% solution that will take 10% of the time?" (The next question is, "Is 90% enough?").

I'm always surprised by engineers who don't think this way.


I think this is the essence of the phrase "a good programmer is a lazy one."


The three qualities of any great programmer: laziness, impatience, and hubris.

(From The Camel Book (Programming Perl) in about 1990 or '91, from memory)

http://www.hhhh.org/wiml/virtues.html


It ain't hubris when it's objectively true. :-)


Working on trouser-color problems is also a form of procrastination. It's much more "fun" to tackle big problems with lots of code that will get Big Results.

I always try to start my day with something really boring and basic for this reason - Things like "Add X to DB class" are tedious, but they help bootstrap my brain into a place where I can tackle the bigger problems (if I need to).


That's a great technique. It's a good application of most prolific writers' advice about writer's block: just write, dammit!

I've taken it as a sign that I am officially "old" that this article was stunningly obvious. I remember being introduced to a profiler and similar instrumentation techniques by my first software engineering mentor.

Value, including your reputation, is built on applying intellectual leverage to get results that are faster and larger than expected.

The quick fixes (low hanging fruit) buy time to gedankenversuch your big architectural solution, so you can apply your critical mind to tearing it down, rebuilding it, boiling the ocean for tea, and tearing it down again while washing your underarms in the shower[1]. With a new team, it builds trust so that "rudely" doesn't come into it when you open the topic.

[1] This is the leading cause of having to jump back into the shower because I forgot to rinse.


This mentality more than anything else helped me get through consulting projects with nebulous priorities. Tackling the "Am I wearing trousers?" (read: boring) pieces meant freeing up dendrites for the rest.


Great post. It's a trap I tend to fall into myself far too often. And it points in the same direction as DHH's famous rant about winning in Vegas.

It's a virtue too rarely seen, that you should try to solve a problem with the minimal set of code possible.

Everything should be as simple as possible, but not simpler. (Sometimes attributed to Albert Einstein, even if he probably never uttered the exact words.)


Thanks, I had to look up that article, and it's definitely a good one:

http://37signals.com/svn/posts/3384-winning-is-the-worst-thi...

Edit: discussion on Hacker News

https://news.ycombinator.com/item?id=5002454


I enjoyed this article and I agree with the main point it makes. I want to quibble about a smaller point, which is that intuition is not always a reliable guide.

The author gives a few examples where they got stuck in and fixed the "obvious" problems. My experience is that what is a problem is not always obvious. Furthermore, while the big wins might be obvious, you'll miss many small wins if you're not carefully tracking stats. These are the main lessons of A/B testing.


The whole point of the article is "don't A/B test (or do anything else fancy) until you run out of obvious problems."


Sorry, let me be clearer. What I may believe to be an obvious problem may not actually be a problem for customers. What may be an obvious problem for customers may not be an obvious problem for me. Intuition is often wrong.


There's no impassable wall between you and your customers. Watch a few of them use your product. If 80% of them stumble somewhere, that's an obvious problem.

It didn't take Chi-square analysis for Microsoft to notice that nobody could figure out how to start a program in Windows 95; they just sat down and watched a few people, and it was suddenly very clear that they needed a big arrow pointing to the start button.


Agreed!

Generating hypotheses often involves qualitative techniques. Validation must be quantitative.


It's more that A/B testing will find a local maximum; you might get a 0.1% improvement to conversion by making the "Add to cart" button a better color, but you could get a 1% improvement by streamlining the checkout flow.

A lot of this comes from having experience, you know the right place to start looking for the problem.


This argument comes up quite regularly, but it is based on a misunderstanding of A/B testing.

People see blog posts about changing button colours and think that is all there is to A/B testing. I've seen far more success with much more radical changes. I work in the A/B testing industry (see my startup Myna: http://mynaweb.com) and this experience is reflected by others in the industry. E.g. see this video by Dan Siroker from Optimizely (a competitor of ours): http://www.youtube.com/watch?v=lDfoVxHud7Y&feature=youtu... He directly addresses this issue.

There is no reason you can't test the example you give of streamlining the checkout flow. So long as you have sufficient resources you can test anything you can change and measure -- landing pages, websites, business models...


I think maybe my point wasn't clear enough; I agree with you. I was trying to point out that with experience you know where to start testing, so you can focus on fixing the pain points that will have the biggest impact.

A/B testing is an important tool to aid optimization, but knowing where to focus your efforts is key. I have over a decade's worth of experience improving e-commerce conversion, so generally my first step isn't introducing testing, but fixing the obvious things I know will have an impact because they always have an impact, e.g. fixing a confusing checkout, improving search and navigation, embarking on a data cleanup and improving product content, etc.


Well, but that's exactly what A/B testing should be used for - instead of optimizing to a local optimum, you use it to verify if the 'new and improved' checkout or search or navigation is really an improvement, or is insignificant, or happens to make things worse.

"The obvious things I know will have an impact" sometimes don't have an impact and sometimes have a negative impact. Experience makes the chances better, but nowhere near perfect.
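For the record, verifying that kind of change doesn't require much machinery. A two-proportion z-test on before/after conversion counts is enough to tell "real improvement" from noise (the numbers below are invented for illustration):

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented numbers: old checkout converted 480/10000, new one 560/10000.
z, p = two_proportion_z(480, 10_000, 560, 10_000)
print(round(z, 2), round(p, 4))
```

If p comes out large, the "obvious improvement" may have been neither; that's the whole point of running the test.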


The problem, I feel, is impedance. You should be A/B testing constantly and using it to find obvious problems, complement qualitative testing, and inform decisions about next steps.

To give your full decision-making power over to an algorithm is stupid. To ignore quantitative power when it's available to you in your decision-making is stupid.


Yes. I have seen nothing cause more wasted time than failing to carefully determine what the question (i.e. problem) is. It takes discussion and thought and creativity and getting it right is more important than any other aspect of a software project.


One accurate measurement is worth a thousand expert opinions.

You'd be surprised how many big strategic decisions are made on top of a mountain of opinions. Try to be guided by what is happening not what should happen.


While I get the point of the article, I have to say that nearly every time I run split-testing or multivariate testing I am surprised at the results. Honestly, at this point I'm somewhat convinced that sites that are uglier convert better.

So while I agree that a lot of things we can overcomplicate, it's not safe to say "Just go with your intuition." It doesn't have to be as cut and dry as "once users get 10 friends they keep coming back," but don't use that as an excuse to not look at your data.


Chances are that's just because your data is insufficient or non-independent, but I do agree that intuition isn't the be-all. A big problem with multivariate testing is explaining your results, which is a huge time sink for low or negative gains - see the other post about hero images, for example.


If you're just a dude or a small start-up then sure, avoid analysis paralysis. But if you work at a bigger company, you have to test and bring in data to make changes -- otherwise, your stuff will lose out to other priorities.


Also, you need to measure and test to know whether it worked, and thus to move quickly on to the next problem. If you can't demonstrate that a problem is solved, you'll likely be told to keep working on it long after it stopped being the most pressing issue.


I don't think these solutions are simple. I personally consider the highly technical solutions far simpler, since there are scientific properties like falsifiability and lower bounds that you can use to analyze those kinds of problems.

On the other hand for the problems written about in the article there are no such methods known. Thus they are far more complicated. The only property that enables us to apply them in our world is that our society brainwashes people into common sense (which is far, far away from good solutions; common sense only delivers solutions that barely work).

So I still believe (more than ever) that hard problems require hard solutions. But I define "hard solutions" differently than the article.


My dad worked his way up (several companies) from being an engineer to a technical manager and most of his success was from repeating 'keep it simple' at key junctures, so far as I can tell.


In short: identify the problem, then solve it. Don't choose a technology you like and hope it's going to solve your vaguely defined problem.


"He says, there are no easy answers! I say, he's not looking hard enough!" Bart Simpson


Okay, I'm gonna take a huge karma hit for this, but I had flashbacks to slashdot mentality here:

"But I wear a kilt you insensitive clod!"


Want happy blog readers? Check you're not hijacking people's back button first!

(Just kidding, it was a great read apart from that!)


TLDR:

Occam's Razor

TLDR2: Better simple than easy



