
Considering performance in the small is not “premature optimization” - mef
http://www.joshbarczak.com/blog/?p=580
======
syntheticnature
Physician, heal thyself.

It always seemed evident to me that Knuth's exhortation was about making sure
that the thing you were optimizing was in fact a major contributor to runtime
-- and that your change was an improvement.

Telling people to avoid, for example, bounds checking because it might turn
out to be a cycle soak later sounds like a good way to make your software
worse, not better, in the hopes that a few instructions saved will make the
difference. I once worked on a code base with three different half-right
hand-hacked versions of date formatting code. I replaced them with strftime().
Certainly the call was slower, but I was provably better off optimizing the
timer routine that ran 50 times a second than worrying about hand-formatting
dates into strings.
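The parent's tradeoff -- a slightly slower library call in exchange for correctness -- is easy to see in any language. Here is a minimal sketch of the same idea in Java rather than C (class and method names are made up for illustration), contrasting a half-right hand-rolled formatter with the standard library's DateTimeFormatter:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateFormatting {
    // Hand-rolled version: easy to get half-right (no zero padding, etc.)
    static String handRolled(LocalDate d) {
        return d.getYear() + "-" + d.getMonthValue() + "-" + d.getDayOfMonth();
    }

    // Library version: a bit slower per call, but correct and maintained for you
    static final DateTimeFormatter ISO = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    static String library(LocalDate d) {
        return d.format(ISO);
    }

    public static void main(String[] args) {
        LocalDate d = LocalDate.of(2015, 3, 7);
        System.out.println(handRolled(d)); // 2015-3-7  (missing zero padding)
        System.out.println(library(d));    // 2015-03-07
    }
}
```

The per-call cost of the library version is exactly the kind of "slowness" that only matters if the call site turns out to be hot, which is what measuring is for.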

~~~
thaumaturgy
I think the principle of charity that dang has recently been trying to promote
applies here. It's probably safe to assume that the author isn't suggesting
that software be allowed to critically fail more often in the interests of
micro-optimization.

The PDF he linked to for bounds-checking is a series of slides noting that
conventional approaches to bounds-checking incur a 60% runtime performance
hit, and that there is another approach that works as well but incurs only
about a 23% runtime performance hit.

Aside: rapidly improving energy efficiency has probably been one of the
greatest technological developments of the last 20 years; it would be
_amazing_ if software development practices could be similarly adapted. Like,
is it possible to have a demoscene approach to software development while
still keeping portability and maintainability? Imagine the transformation in
software we could see in the next 10 years if someone figured out how to do
that.

~~~
jacquesm
Mobile battery life might be just the thing that pushes this over the critical
point. The first party to create a smartphone with a four week battery life
because they made their (tightly integrated) system work on a 5 MHz CPU will
make a lot of money.

~~~
com2kid
Smart phones have poor battery life due to background data sync (radios are
expensive to power up) and screen usage.

Turn airplane mode on, leave your screen off, and your phone will last a long
time.

~~~
marvy
Airplane mode is not fair. After all, "dumb" phones last for many days even
without airplane mode, as long as you don't overuse them. The fair test is
this: how long will it last when used as if it were a dumb phone. Answer:
pretty good, but much less than an actual dumb phone.

~~~
ryanbrunner
Dumb phones typically don't need to use their radio to do things like check
mail though.

~~~
marvy
That's because they typically don't check mail, at least not in the
background. That's fine. Turn off the mail check feature of the smart phone.
Now will it last as long as the dumb one? Not quite. I would know, since I do
have that feature turned off. As well as location tracking.

------
knightofmars
I had a professor that used to say, "Make it work and then make it work fast."
The point being that you need to both understand and solve the problem before
you can figure out how to make the solution faster. It is the reason that the
concept of a prototype exists in every engineering discipline.

As an additional perspective, compare the solution of an engineer with 2 years
of experience to that of an engineer with 5 years of experience. If the
solutions are drastically different then interpretation of a rule such as
"avoiding premature optimization" will be drastically different as well.

Like any overly simplified statement, it is actually highly subjective. The
author of the article even calls out the specific context in which their
interpretation of Donald Knuth's rules is being applied, "I’ve listed lots of
relatively low-level things up there, but that’s just because it’s the level I
work at." and as such their interpretation doesn't necessarily apply in other
contexts.

Premature optimization is a problem if you're approaching it from a place of
ignorance. If you're doing it mindfully based on experience and domain
knowledge then it starts to make sense. But even under these conditions your
best intentions can be wrong. I've been in plenty of situations where I
thought I'd identified a code bottleneck only to have a far easier, cheaper,
and better solution completely unrelated to code come to light.

~~~
davidw
> Make it work and then make it work fast.

The version I've always heard is:

Make it work, make it right, make it fast.

[http://c2.com/cgi/wiki?MakeItWorkMakeItRightMakeItFast](http://c2.com/cgi/wiki?MakeItWorkMakeItRightMakeItFast)

------
eterm
Today I had to debug code with some database calls within 5 levels (at least)
of for-each loops.

I stopped measuring at ~40,000 database round-trips.

"Engineering time costs more than CPU time" was the attitude, and for the
original problem in its original specification, the solution was clearly OK.

But here we are, now needing to work out the original specifications, work out
the current implementation (in case they differ anywhere), and work out
whether it's worth re-writing it top-down or just fixing the worst of the
loops.

And I'm not blaming whoever wrote this originally, it must have done its job
to make it into the code base, but it really sucks to have to unpick it
because an assumption of "database calls are free" is an assumption that
unravels in a really messy way.
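The shape of the problem above is the classic query-per-row pattern. A hedged Java sketch (the table contents, names, and fetch helpers are all made up; a counter stands in for the database) of how the round-trip count differs between a query inside the loop and one batched query:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RoundTrips {
    // Toy stand-in for a database: every call counts as one round-trip.
    static int roundTrips = 0;
    static final Map<Integer, String> ORDERS_BY_CUSTOMER =
            Map.of(1, "widgets", 2, "gears", 3, "sprockets");

    // One query per call: round-trips grow with the number of rows.
    static String fetchOrder(int customerId) {
        roundTrips++;
        return ORDERS_BY_CUSTOMER.get(customerId);
    }

    // One batched query for the whole set (think WHERE id IN (...)).
    static Map<Integer, String> fetchOrders(List<Integer> ids) {
        roundTrips++;
        Map<Integer, String> out = new HashMap<>();
        for (int id : ids) out.put(id, ORDERS_BY_CUSTOMER.get(id));
        return out;
    }

    public static void main(String[] args) {
        List<Integer> customers = List.of(1, 2, 3);

        roundTrips = 0;
        for (int id : customers) fetchOrder(id);       // query inside the loop
        System.out.println("per-row: " + roundTrips);  // 3 round-trips

        roundTrips = 0;
        fetchOrders(customers);                        // one batched query
        System.out.println("batched: " + roundTrips);  // 1 round-trip
    }
}
```

With five nested loops, the per-row version multiplies out to the tens of thousands of round-trips described above, while the batched version stays flat.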

~~~
knightofmars
"...for the original problem in its original specification, the solution was
clearly OK."

That statement leads me to believe this was technical debt that went unpaid
and not a failing of design. Additionally, if your company is in a successful
enough position to spend the time required to pay down this technical debt,
then I would count myself lucky to be in your position.

"...it must have done its job to make it into the code base...", exactly
right! Every time I hit one of these situations I view it as a way for me to,
"Try and leave this code a little better than you found it." I'm all about
making things easier for the next guy who comes along.

------
mmahemoff
"The website is temporarily unable to service your request as it exceeded
resource limit. Please try again later."

Cache:
[https://webcache.googleusercontent.com/search?q=cache:zjuUa-...](https://webcache.googleusercontent.com/search?q=cache:zjuUa-xmVy0J:www.joshbarczak.com/blog/%3Fp%3D580+&cd=1&hl=en&ct=clnk)

~~~
jwmerrill
This was a very nice essay, and I'm glad to have read it.

But it's impossible not to gloat at least a little bit about someone lecturing
us about low level efficiencies from a site that can't manage to serve static
blog content to a few hundred visitors a minute (or whatever it is that the
top slot on HN drives these days).

~~~
mef
The essay is about software you write. The author most likely didn't write the
blog software nor the web server software.

~~~
vinceguidry
No, he didn't write it. He had a much easier job: picking software that does do
it right.

~~~
danielweber
He likely doesn't give a shit about that.

If my blog is unreadable for five minutes a month, I am perfectly happy and
I'm not going to change anything.

~~~
vinceguidry
Doesn't give a shit? Seriously?

I wouldn't dream of keeping a blog that couldn't handle a HN traffic spike.
That's the whole point of having a blog.

Failing at this is like launching your startup's MVP to an audience of the
first fifty people who were able to load your product launch page before it
crashed under the load. In other words, a giant, easily-avoidable waste of
effort.

~~~
shadowfox
> I wouldn't dream of keeping a blog that couldn't handle a HN traffic spike.
> That's the whole point of having a blog.

It seems possible that some people don't care that much about HN traffic
spikes.

------
rajat
All too often premature optimization is brought up as an argument against
thinking carefully about what you're implementing prior to actually opening up
the IDE and madly coding tests.

Thinking is hard, and takes time, and we want to get the feature out now,
immediately, and worry about performance later. If at all.

And, sad to say (for an engineer), it's not clear that from a "business"
perspective it's wrong. Hard to argue when accumulating features seems to
matter more than avoiding crappy software. We have a lot more software these
days, to
run all of these bright new pieces of hardware, and perhaps because I'm an
old-timer, the general quality seems to have degraded significantly. But the
novelty of the stuff certainly has exploded, and I'm continually delighted by
the twists and features that folks are coming up with, while being saddened by
crashes, slowdowns, need for restarting, etc.

------
hindsightbias
For some reason, the vast majority of developers take 'premature' to be a
synonym of 'any'.

If CS majors spent a fraction of the time learning how to optimize the way
EE/CEs do, we'd need a lot less magic from the latter.

~~~
TillE
A huge amount of CS is algorithms and data structures. My undergrad degree
also required two semesters of learning how a computer works from a logic gate
level upwards, culminating with writing a simple pipelined CPU in Verilog.
After three years I had a pretty good understanding from both the high level
(algorithms) and the low level (caches, etc).

CS programs vary, but basically all of them give you the tools needed to write
efficient code. They generally don't teach you much about software
"engineering", though, which remains a chaotic mess of a field.

~~~
ajarmst
Part of the problem is that Dijkstra's complaint is often still quite
relevant: "Software engineering, of course, presents itself as another worthy
cause, but that is eyewash: if you carefully read its literature and analyse
what its devotees actually do, you will discover that software engineering has
accepted as its charter 'How to program if you cannot'."

------
jayvanguard
Yes, yes it is. Simply asserting the opposite doesn't suddenly invalidate an
entire industry's decades' worth of experience.

One good point from the essay, though: Knuth's example of a 12% speed increase
for low effort is definitely worth doing. I agree.

A better way of putting things is:

Considering low effort, small performance improvements that don't affect other
factors such as code readability or system maintainability is not "premature
optimization".

If you are considering performance "in the small" and it affects
maintainability, you are indeed prematurely optimizing.

------
pekk
"Premature" means "before measuring"

~~~
eropple
There are many optimizations that make sense to do "before measuring" because
they're free and easy to do as a matter of habit. For example, the use of
value types in C# is just as easy as classes--you just need to know how your
runtime works and pick the correct allocation method.

It takes me literally no time whatsoever to know intuitively that, yes, this
should be a struct rather than a class. But I've seen developers dismiss this
as "premature optimization" rather than _knowing how my software works_.

~~~
chc
Can you quantify the % performance gains you've gotten from this on a given
project? Because from what you've said here, it seems entirely possible you're
not actually writing appreciably more performant code than they are. This is
why measuring is usually how you find out how your software works.

~~~
mark-r
I think you missed the point of the parent comment. If there are two ways of
doing the same thing, and one way is clearly more efficient, there's no reason
not to do it the efficient way. Get in the habit, and you won't even waste
brainpower thinking about it. 99% of the time it might not make a difference,
but for that 1% you're ahead. You're also covered if a change somewhere else
causes your code to be executed a lot more often than you expected it to be.

~~~
eropple
Agreed. Though, to answer your parent post: when helping friends with XNA or
MonoGame projects, I've shaved entire milliseconds off of frame times
(particularly notable when you have a 16ms budget) by removing needless
reference types, either by unpacking and using primitives or by replacing
those reference types with structs.

Or, when I was at TripAdvisor, we (not me originally, though I inherited it)
greatly improved typeahead performance by dropping the built-in collections
classes (which used boxed primitives, because lolbrokengenerics) in favor of
Trove (or Colt, I can't remember) and the reified primitive types. If I write
Java today, I reach for Trove as a matter of course just because it means I
never write List<Integer>.

This stuff matters. Getting in the right habits is not hard and will pay off
in the ten percent of cases where it matters.
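The `List<Integer>` cost mentioned above is easy to demonstrate without any third-party library. A minimal, self-contained sketch in plain Java (names are made up; Trove's primitive collections make the same point with a List-like API):

```java
import java.util.ArrayList;
import java.util.List;

public class Boxing {
    // Boxed: every element is a separate Integer object on the heap,
    // with a pointer chase and an unboxing on every read.
    static long sumBoxed(List<Integer> xs) {
        long total = 0;
        for (Integer x : xs) total += x;  // implicit unboxing per iteration
        return total;
    }

    // Primitive: one contiguous allocation, no per-element objects,
    // cache-friendly sequential reads.
    static long sumPrimitive(int[] xs) {
        long total = 0;
        for (int x : xs) total += x;
        return total;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        List<Integer> boxed = new ArrayList<>(n);
        int[] primitive = new int[n];
        for (int i = 0; i < n; i++) { boxed.add(i); primitive[i] = i; }
        // Same result either way; the primitive version avoids roughly a
        // million object headers and the associated GC pressure.
        System.out.println(sumBoxed(boxed) == sumPrimitive(primitive)); // true
    }
}
```

Choosing `int[]` (or a primitive collection) over `List<Integer>` is exactly the kind of zero-effort habit being described: same code shape, same readability, meaningfully less allocation.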

------
sp332
[https://xkcd.com/163/](https://xkcd.com/163/)

~~~
rnhmjoj
It's incredible how there is always a relevant xkcd comic.

------
stevebot
The problem I have found is not early optimization, but knowing what to
optimize early and what can wait. Everything in a system can be optimized, but
there obviously isn't the time to do this.

I work in Android, so the optimizations I look at first are bitmap loading,
backgrounding tasks, and SQLite queries. Usually, this is where 90% of the
performance benefit comes from.

~~~
Pxtl
Exactly. I've been working on a codebase where we're trying to make a rush-job
project into something mature and reusable. One of the developers on my team
was concerned about our obviously and grossly inefficient business-logic layer
that applied business rules to a given record. I told him not to worry about
it - lo and behold, we ran a profiler and the business layer was barely a blip
on our performance problems, and our dynamic GUI stuff represented the lion's
share of wasted time.

------
PythonicAlpha
He is right.

He is right about premature optimizations and also right about craftsmanship.

Many performance problems stem from bad overall design decisions, bad
abstractions and bad data structures (summarized: craftsmanship). I always
wonder how fast today's processors have become and how slow they feel with
today's software.

One of my first computers had 64k of memory and a processor that was so
incredibly slow compared to today's processors that it is unbelievable. And
still, so much good software was made with it. For today's programs, the
resources of that computer would not suffice for the idle loop.

The first Unix computer that I worked on had only 8MB of memory, yet it ran
X11, LaTeX and plenty of other stuff!

We have come a long way down the road of abstractions when today's computers
can hardly get by with 1GB of memory and a 1GHz processor (2 cores, please,
please!) for basic usage.

The real art in computer science is to know where to optimize -- to use your
time most effectively.

------
lnanek2
Hmm, he complains about the awesome bar being coded inefficiently, but I'm on
a several year old laptop and the awesome bar is pretty much instant at
showing results for me when I type a key. So his evidence is not compelling in
my experience. Reading his links, it sounds like his memory error checking
tool is what causes the slowdown, not the usual Chrome code. Kind of bizarre he
is pointing fingers at others when his stuff is the problem that needs to be
worked on.

His whole argument boils down to:

> Considering performance in the small is not “premature optimization”, it is
> simply good engineering, and good craftsmanship

But that's the same reason German industry tends to compete so badly right
now. They do a lot of hand trimmed and finished components that really could
have been better designed for automation and require less custom
craftsmanship. His argument seems to boil down to aesthetics.

------
dyadic
I'm very happy to see this article and more people taking this mindset.

Many people don't even know the context of the original quote. And many times
I've seen discussions about improving a piece of code shot down by a single
incantation of this Knuth quote. It's sad.

------
bcheung
Considering performance is a basic tenet of software architecture and
engineering. Obsessing about it needlessly is the "premature" part.
Considering the business case is the most important thing to remember.

------
michaelvkpdx
Brilliant blog. Thank you for writing this. Optimization is important! You
don't have to tune everything to the fastest possible speed. But remember, in
web programming, you may be writing a function that is called 10 or 100 or 500
times per request. That little optimization will add up quickly.

Be smart when developing, and know where the easy optimizations and common
pitfalls are in your language and toolset. Optimize as you can without
sacrificing maintainability.
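A classic example of an easy, maintainability-neutral win in a function called hundreds of times per request is hoisting regex compilation out of the hot path. A small Java sketch (hypothetical names) of the habit:

```java
import java.util.regex.Pattern;

public class Hoisting {
    // Recompiles the pattern on every call -- fine once, wasteful when
    // called hundreds of times per request.
    static boolean slowMatch(String s) {
        return s.matches("[0-9]+");  // String.matches compiles each time
    }

    // Compile once, reuse everywhere. Pattern instances are thread-safe.
    static final Pattern DIGITS = Pattern.compile("[0-9]+");

    static boolean fastMatch(String s) {
        return DIGITS.matcher(s).matches();
    }

    public static void main(String[] args) {
        System.out.println(slowMatch("12345")); // true
        System.out.println(fastMatch("12345")); // true
        System.out.println(fastMatch("12a45")); // false
    }
}
```

Both versions are equally readable; one just stops paying a per-call compilation cost that multiplies with request volume.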

------
current_call
"As a developer, I am tired of my IDE slowing to a crawl when I try to compile
multiple projects at a time. I am tired of being unable to trust the default
behavior of the standard containers. I am tired of my debug builds being
unusably slow by default."

Don't forget web browsers. Web browsers are horrible.

------
ky3
Nobody's arguing that speed isn't important. What's more important is to get
things right.

And no, incorporating speed in the specification doesn't make any difference.
Suppose the spec says, "page must load in 200ms." Fine, if you don't care what
correct page loading means, you're perfectly served with a blank one.

What's at the root of such intellectual capitulation? Complacency? Absence of
skills? "Correctness is hard, let's just randomly perturb settings instead
while fiddling with a stopwatch. Correctness is hard, let's just conflate
motion with progress."

Hence the shabby treatment of correctness, like porn: I'll know it when I see
it.

------
Toine
TLDR : sometimes you need to optimize.

I don't understand what's new.

~~~
mark-r
It's not a question of optimize/don't optimize. There's a continuum, and the
pendulum swings back and forth. The author makes a case that the prevailing
industry attitude is too far on the "don't optimize" side.

And stop quoting Knuth to justify it.

------
skybrian
This article is not that well optimized. It could have been much shorter. I
think just the lesser-known Knuth quote would be enough.

------
0xdeadbeefbabe
Your small is another man's big. So, "don't prematurely optimize" could be
said another way, "a different point of view is like losing 80 IQ points" or
you could say it positively like Kay does
[http://en.wikiquote.org/wiki/Alan_Kay](http://en.wikiquote.org/wiki/Alan_Kay),
but I think he's only positive because he drinks once in a while.

------
dang
We changed the title to a representative sentence from the article. If anyone
can suggest a better one, we'll change it again.

~~~
lmm
Please don't. As a reader, not seeing the same article under a different title
when I return to the homepage is much more valuable than having the "right"
title.

~~~
dang
The article's title is both linkbait ("Stop Doing Bad Thing, Plus Famous
Person") and misleading: people are mostly not misquoting Knuth, as the
article itself points out ("Donald Knuth once wrote these unfortunate words").
To leave that title unchanged would violate both of the HN guidelines about
titles.

~~~
lmm
Then please violate and/or change the guidelines.

~~~
dang
Misleading and linkbait titles on the front page would make Hacker News much
worse.

