

Ask HN: Something to be said for slow software? - evanrmurphy

Does quick software always leave the best impression on users? In many
cases, yes of course: people hate when a slow computer gets in the way of
what they want to do. But I wonder if an application that's too zippy can be
perceived as cheap in some cases.

Analogy: Everybody loves low prices, but if you sell your product at a
bargain (or give it away for free), its perceived value can suffer. People
see a competitor who's charging more than you and assume their product is
higher quality.

Compare:

  * "It's free so I guess it's not worth much."
  * "It takes no time to load so I guess it's not doing much."
======
gruseom
I would make an opposite point: software that's distinctly faster than its
peers impresses users not only as faster but also as _generally higher-
quality_ in ways one would not think had anything to do with performance.

Users are sensitive to tiny differences in speed, much tinier than most
software developers consider significant. For example, there have been many
studies showing dramatic drop-offs in web usage with what seem like micro-
differences in response times.

A good example is how Chrome is gaining rapidly in market share primarily by
being more responsive. There was a fascinating post recently on a political
blog I read where the author was talking about switching to Chrome even though
he couldn't quite figure out why. I read it as an example of the psychological
spill-over effect of performance. (Edit: here it is:
[http://www.talkingpointsmemo.com/archives/2009/12/the_case_f...](http://www.talkingpointsmemo.com/archives/2009/12/the_case_for_mind_control.php).
Well worth reading for anyone interested in the psychology of software.)

It's a fascinating conundrum that while hardware has gotten orders of
magnitude faster, software has remained sluggish. I know it's fashionable to
explain this by saying "but software is doing orders of magnitude more than it
used to", but I think this is only a partial explanation; it's also the case
that we're leaving huge amounts on the table because of the way we write code.
I used to believe in the dream of an ideal high-level language abstracted away
from the hardware altogether. That now seems to me a harmful delusion. What we
need is a better way to combine high-level constructs with low-level ones.
This in turn (veering wildly off on a tangent here...) explains the persistent
appeal of C++ despite its manifest perversity. No other mainstream language is
even in that game.

~~~
evanrmurphy
Thanks for sharing that article. The Chrome example is definitely relevant and
compelling.

No doubt there's a strong case for better performance improving the user's
overall perception of the software. My question, though, is: is there really
_nothing_ to be said for slowness anywhere? Nothing for the occasional well-
placed delay? Nothing for dramatic pause? Were all those accumulated hours of
my childhood waiting at loading screens on the PlayStation 1 really for
nought?

~~~
davidcuddeback
I suppose an algorithm that takes a little longer to run might make the user
think that the software is doing something more advanced, but that would only
apply when the software leaves room for the user to make qualitative
judgements about the answer.

Take the filters in photo editing software for example. In Photoshop, you can
apply many types of filters. Let's say you apply a noise-reduction filter in
Photoshop and it takes 3 seconds to run, then you switch to the Gimp and apply
a noise-reduction filter and it runs instantaneously. [1] The slowness may
lead users to believe that Photoshop's algorithm is more advanced. Of course,
this may partially be due to branding. Photoshop is the $1000 professional
photo-editing software, whereas the Gimp is free editing software put together
by hackers. This ties it to your analogy in the original post. Now if the
situation were reversed, and Photoshop's algorithm ran faster than the Gimp's,
I suspect it would be less likely for people to believe that the Gimp's
algorithm is more advanced.

This may work for a noise-reduction algorithm, because there is no one correct
answer. People may believe that the microscopic differences in the pixels
represent a difference in quality. Obviously, if you compare two algorithms
that are supposed to compute an answer for which there is an oracle (e.g.,
Fibonacci or factorial), then slowness is just slowness. If I write a slow
implementation of fib(n) and you write a fast one, people will know yours is
better because it's faster and there's no ambiguity in the answer that could
allow room for people to believe that mine is "more advanced."
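
To make that concrete, here's a quick Python sketch (my own illustration,
not from any real product): a naive recursive fib and a memoized one that
return identical answers, so the only observable difference is speed:

    import functools

    # Slow: naive recursion recomputes the same subproblems over and
    # over -- exponentially many calls.
    def fib_slow(n):
        if n < 2:
            return n
        return fib_slow(n - 1) + fib_slow(n - 2)

    # Fast: memoization caches each fib(k), so each value is computed
    # once -- linearly many calls.
    @functools.lru_cache(maxsize=None)
    def fib_fast(n):
        if n < 2:
            return n
        return fib_fast(n - 1) + fib_fast(n - 2)

    # Both agree on every input; speed is the only visible difference.
    assert fib_slow(25) == fib_fast(25) == 75025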

Of course, this is just me speculating. I don't know of anyone that's
conducted research on this.

[1] This is hypothetical. I know nothing about the performance differences
between Photoshop and the Gimp.

------
_delirium
I think it can be the case if people have a perception that it's supposed to
be a computationally intensive task. It's kind of a standard joke in AI
circles that if you have a demo coming up, one way to make your dumb system
seem smarter is to make sure it takes a bunch of time and displays diagnostic
messages while doing so. Ideal is to slap a fancy sci-fi-movie front end on it
that displays both nice graphics (for flashiness) and a console window spewing
a bunch of technical garbage (for scientificness).
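
For what it's worth, the trick costs almost nothing to implement. A
tongue-in-cheek Python sketch (stage names invented for illustration):

    import random
    import time

    STAGES = [
        "Initializing inference engine",
        "Loading ontology",
        "Propagating belief network",
        "Pruning hypothesis space",
    ]

    # Pad a fast computation with delays and pseudo-technical console
    # output so the demo "feels" smarter than it is.
    def run_impressive_demo(solve, *args):
        for stage in STAGES:
            print(f"[{time.strftime('%H:%M:%S')}] {stage}...")
            time.sleep(random.uniform(0.5, 2.0))  # the "thinking" pause
        answer = solve(*args)  # the actual work takes milliseconds
        print("Converged after", random.randint(3, 17), "iterations.")
        return answer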

Of course, there are some legitimate reasons people might come away with the
impression that a system like that is smarter, mainly that they have more idea
what it's actually doing, because it was slowed down and visualized in a way
that let them see it operate (instead of just spitting out the solution).

The somewhat less legitimate reason is that there are a certain group of
people really enamored with high-end computing, and so if your thing runs in 2
seconds on a desktop, they assume it must suck, because why weren't you doing
something fancier that needed 30 minutes on a huge cluster?

------
rmundo
I think a UI with well-thought-out transitional animations certainly
enhances the user experience. It might be slower than just switching to a
new window/view/focus, but it helps the interface seem more intuitive. The
rapid expansion/zoom that iPhone apps do when you open them, for example.

As for slowing things down for no reason, no. I don't think anyone enjoys
having their time and attention put on hold.

------
jsz0
People are very task oriented. They want their tools to work quickly and
effectively. They don't spend much time thinking about how the tool works so I
don't think it's possible for software to be too quick. It opens the door to
your customers feeling like they're not getting the optimal value out of their
hardware over what would basically be a marketing gimmick. It also leaves your
venerable to your competition who can simply offer a faster alternative which
has the coat tails of representing their software as being more efficient,
better designed and sold by a company with better programmers & management. On
a slightly different topic I do think it's possible for GUIs to poorly
represent a fast application is working correctly. The first h264 video encode
I did on an octo-core machine left me with that feeling. I was sure something
had gone wrong. It _was_ too fast. Once I realized everything went well I was
delighted of course.
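
A simple mitigation (a Python sketch below; the minimum-duration threshold
is a value I made up for illustration, not a known guideline) is to keep
the progress indicator visible for some minimum time, so near-instant work
still visibly registers as having run:

    import time

    MIN_VISIBLE_SECONDS = 0.75  # illustrative threshold, not a standard

    # Run a task while keeping a progress indicator on screen for at
    # least MIN_VISIBLE_SECONDS, padding only the runs that finish fast.
    def run_with_min_feedback(task, show_progress, hide_progress):
        show_progress()
        start = time.monotonic()
        result = task()
        elapsed = time.monotonic() - start
        if elapsed < MIN_VISIBLE_SECONDS:
            time.sleep(MIN_VISIBLE_SECONDS - elapsed)
        hide_progress()
        return result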

