
What “Worse is Better vs The Right Thing” is really about - m_for_monkey
http://www.yosefk.com/blog/what-worse-is-better-vs-the-right-thing-is-really-about.html?
======
loup-vaillant
Linus's and Alan's quotes aren't incompatible. Actually, I think they're
both true. Yes, massively parallel trial-and-error works wonders, but if you
favour the first solutions, you'll often miss the best ones. Effects
such as first time to market, backward compatibility, or network effects often
trump intrinsic quality by a wide margin. (Hence x86's dominance on the
desktop.)

Yes, Worse is better than Dead. But the Right Thing dies because Worse is
Better eats its lunch. Even when Worse actually becomes Better, that's because
it has more resources to correct itself. Which is wasteful.

The only solution I can think of comes from the STEPS project,
at <http://vpri.org> : _extremely late binding_. That is, postpone decisions
as much as you can. When you uncover your early mistakes, you stand a chance
of correcting them, and of deploying the corrections.

Taking Wintel as an example, that could be done by abstracting away the
hardware. Require programs to be shipped as some high-level bytecode that
your OS can then compile, JIT, or whatever, depending on the best current
solution. That makes your programs dependent on the OS, not on the hardware.
Port the compiling stack of your OS, and you're done. If this were done, Intel
wouldn't have wasted so many resources on its x86 architecture. It would
at least have stripped the CISC compatibility layer from its underlying RISC
design.
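
A minimal sketch of the idea, assuming a made-up stack-machine bytecode
(none of the opcodes are from STEPS or any real OS): programs ship in a
hardware-neutral form, and only the interpreter/compiler needs porting.

```python
# Toy illustration of hardware-independent distribution: programs ship as
# bytecode; the host "OS" interprets (or could compile/JIT) it on whatever
# hardware it runs on. The opcode set here is invented for the example.

def run(bytecode):
    """Interpret a list of (op, arg) pairs on a simple stack machine."""
    stack = []
    for op, arg in bytecode:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError("unknown opcode: %s" % op)
    return stack.pop()

# The same program runs anywhere `run` has been ported; only the
# interpreter is architecture-specific, never the shipped program.
program = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]
print(run(program))  # (2 + 3) * 4 = 20
```

A real system would compile or JIT such bytecode rather than interpret it,
but the portability argument is the same.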

But of course, relying on programmers to hand-craft low-level assembly would
(and did) let you ship faster systems, sooner.

~~~
icebraining
Debian runs on many architectures without being slowed by bytecode JITs. I'd
say that's a technically better solution.

~~~
lnanek2
Yes, but the software has to be compiled to the architecture too using that
solution. So it is just putting the problem onto others: the users, the
software developers, etc., many of whom will happily just use something else.

~~~
derleth
> the software has to be compiled to the architecture too using that solution

Right. Which is why Debian has a package management system. Most people never
have to compile any of the software they use.

------
cs702
Great essay -- I agree with its main point: "worse" products triumph over "the
right thing" when they are a better fit for the evolutionary and economic
constraints imposed by an evolving competitive landscape.

Some examples:

* In the case of the rise of Unix, the market of the 1960s and 1970s valued simplicity and portability over "correct" design.

* In the case of the rise of the x86 architecture over the past three decades, the market valued compatibility and economies of scale over the simplicity and elegance of competing RISC architectures.

* In the case of the current rise of ARM architectures for mobile devices, today's market values simplicity and low power consumption over compatibility with legacy x86 architectures.

~~~
dgreensp
Yes, a great explanation, especially the wrap-up at the end.

You can see how people with too narrow a view of technical design would
misconstrue this point. If you're hell-bent on (shallow) perfectionism, you
perceive the dichotomy as perfect vs. imperfect. One step healthier is
simple vs. complex, because it means you've recognized that time is finite,
and too much complexity can legitimately kill a project; it's one of the first
real-world constraints on viability that intrudes. "Viability" is really the
goal, though, typically, once you include the motivations of the humans
involved, like seeing their project be successful and have an impact.

The more you can align your moral compass with viability rather than turning
smaller issues into battles between good and evil, the more successful you
will be.

~~~
loup-vaillant
Actually, making it simple is often _harder_. If it weren't, things like the
STEPS project[1] would be widespread by now.

Unix didn't win because it was simpler. It won because it was _easier to
implement_. In terms of _overall_ simplicity, Lisp systems were probably far
ahead.

[1] The punchline is "Personal computing in 1 book (20KLoC)", compiler suite
included, of course. <http://www.vpri.org/pdf/tr2011004_steps11.pdf>

~~~
kragen
There _were_ Lisp systems that were simpler than Unix, such as AutoLISP,
Scheme, and later XLISP, but the ones that were competing with Unix in the
1970s were things like MACLISP, Zetalisp, and Interlisp, which were much more
complex than Unix was at the time. I mean, Zetalisp had its own _microcode_,
its own hypertext documentation, its own GUI, transparent persistence, and a
WYSIWYG text editor, at a time when Unix had a couple of C compilers, man
pages (its own typesetting system, to be fair), @ and # as the defaults to
erase a line and a character, and ed as the standard text editor.

------
gruseom
I never really got the "Worse is Better" essay. It obviously doesn't mean what
everyone says it means and what it does mean isn't clear. This post points
some of that out. For example, Worse in the essay was associated with
simplicity. But the classic examples of Worse triumphing in the marketplace
(the OP cites x86 as an example) are anything but simple: they are
hypercomplex. Not only that, their complexity is largely what _makes_ them
Worse. Simplicity is rather obviously Better, not Worse. Smalltalk (which the
OP cites as Better) is far simpler than its more successful peers. The more
you look at the original essay, the more its conceptual oppositions seem
muddled and at odds with history. I've concluded that it boils down to exactly
one thing: its title. "Worse is Better" is a catchy label that touches on
_something_ important about technology and markets and means different things
to different people.

~~~
praptak
Such essays are always about concepts that cannot be precisely defined. That
aside, I believe that it wasn't really _simplicity_ that "worse" was
associated with, but _easiness_.

This distinction is well described by Rich Hickey (of Clojure fame) in his
"Simple Made Easy" talk. The key point is that simple-complex and easy-hard
are two separate axes, and that "easy" usually leads to "complex" (the case of
x86, I believe), but you can invest some effort in shaping the environment so
that "simple" can be "easy".

------
jeffdavis
I often think about software development in similar terms -- evolution versus
intelligent design.

The weakness of evolution is that it takes millions of years, it's heavily
dependent on initial conditions, there's lots of collateral damage, and most
lines die out.

The weakness of intelligent design is that we're only so intelligent, which
places a pretty low limit on the possible achievement. (And intelligence is
generally regarded as close to a normal distribution, meaning that the
smartest people can only handle a small multiple of the complexity of the
average person).

Obviously, evolution and design need to be combined somewhat. The question is:
how much of each, and at what times during a project? Do you spend 10% of the
time quietly planning, 10% arguing with a small group of designers, and 80%
trying things and trying to get feedback? Or is it more like 40%, 40%, and
20%? And how do you mix trying things with designing things?

~~~
sitkack
Evolutionary process is not just genetic.
[http://www.youtube.com/watch?v=dOZ3Xt6ZMBA&t=6m15s](http://www.youtube.com/watch?v=dOZ3Xt6ZMBA&t=6m15s)
David Sloan Wilson on the evolution of egg-laying hens.

See the <http://en.wikipedia.org/wiki/Replicator_equation>
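
For reference, the replicator equation linked above is usually written as
follows (this is the standard textbook form, not something specific to the
video): the share x_i of type i grows exactly when its fitness exceeds the
population average.

```latex
% Replicator dynamics: x_i is the frequency of type i, f_i(x) its fitness,
% and \phi(x) the population's average fitness.
\dot{x}_i = x_i \left[ f_i(x) - \phi(x) \right],
\qquad \phi(x) = \sum_j x_j f_j(x)
```

The analogy to "worse is better" is that selection acts on relative fitness
in the current population, not on any absolute notion of quality.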

------
sedachv
Thank you Yossi for writing this piece. It's about time the Worse is Better
argument was debunked. Worse isn't better; portable and free (libre, gratis, or
at least really cheap) is better.

What many people forget is that during the time frame Worse is Better talks
about, Lisp machines cost as much as two or more houses. You couldn't get a
decent Lisp system on affordable hardware until MCL, and then you still needed
a fairly high-end Mac to run it on.

OTOH, Unix and C-based software ran on a bunch of different machines, which
you either already had or could acquire inexpensively. The software was easy
to get and inexpensive as well. Then 4.3BSD and Linux came along, and you
couldn't beat that on price.

------
ScottBurson
Interesting to note, in this connection, the rising popularity of Haskell,
which is _way_ off at the "Right Thing" end of the spectrum.

Maybe it is really possible to come up with the Right Thing eventually -- it
just takes a lot of research.

~~~
jonathansizz
What makes you think that Haskell is rising in popularity? While this may be
the case on sites like HN and Lambda the Ultimate, more broadly Haskell is
nowhere. It certainly hasn't come anywhere near anything that could be
described as success in the marketplace.

So your example actually supports the author's thesis, not your own.

------
gajomi
An enjoyable and stimulating read. The original essay, by virtue of a few
semantic ambiguities (what is "simple", anyway?), is apt to invite this sort of
commentary. If I have read this correctly, the author eventually agrees that
worse really is better, with the clarification of what this means outlined in
the first part of the essay.

However, I was hoping to see a deeper analysis of how the nature of the
evolutionary pressure in his domain contributed to the worse-is-better effect
(I am an evolutionary biologist, so I find this kind of thing interesting).
For example, if the "product" in question were a mathematical concept of
interest to professional mathematicians, there would almost certainly be a
niche space in which versions of the concept exhibiting "consistency,
completeness, correctness" would dominate over the competition. For
mathematicians, consistency and correctness are strongly selected for
(completeness, broadly defined, is usually much harder to obtain). For the
average iPhone app, these things still matter, but in a very indirect sense.
They get convolved (or low-passed, as Alan Kay describes it) with other
concerns about shipping dates and usability and so on. I would be interested
to see a classification of the domains in which the "worse is better" and
"the right thing" philosophies each dominate, and those in which they are
represented in roughly equal proportions.

------
drblast
It's not too instructive to look back on things that occurred mostly due to
happenstance and try to assign reasoning to them.

And it's a bit of a stretch to associate Linux with "Worse is Better." A major
reason for using Linux in the early days was that it was the best alternative
to Windows 95 because it got process isolation on x86 right.

~~~
DenisM
I don't see how to reconcile your first paragraph and your second paragraph.
First you dismiss your virtual opponent's argument as trying to assign meaning
to a random (or incomprehensible) outcome, then you turn around and assign
meaning to the same thing.

------
j-g-faustus
This actually reminds me of the Plato/Aristotle difference. Plato held that
there was an ideal, perfect version of everything in a sort of Idea Heaven,
and the goal of the philosopher was to get ever closer to understanding that
ideal.

Aristotle, on the other hand, thought that Heaven was too remote, and held
that we could learn more by measuring what we see in this world, as opposed to
the presumably ideal, but inaccessible, concepts in Heaven.

The medieval church loved Plato; the scientific revolution loved Aristotle.

My point is that the difference between these two frameworks for interpreting
the world seems to be fundamental. Fundamental in the sense that the
distinction has been with us for at least a couple of millennia, and we are
apparently not likely to agree on a single answer anytime soon.

~~~
sedachv
It's funny you bring this up, because Platonic idealism has been thoroughly
debunked in the past century. Some of the latest thinking on the subject is
known as "new materialism," and its core tenet is exactly the "technical
evolution" that Yossi talks about in the article. I recommend Manuel de
Landa's _War in the Age of Intelligent Machines_ for an introduction (it's as
near to a hacker's philosophy book as I've seen).

------
PaulHoule
I try really hard to not take a left vs right view in software design.

I sometimes build systems that are overengineered and I sometimes build
systems that are underengineered.

I do believe that every line of code, every function, every class, every
comment, every test, everything, is like a puppy that you have to take care
of.

If a team adds a "dependency injection framework" that adds a parameter to
each and every constructor in a system that has 800 classes, that's a
real cost that's going to make doing anything with that system harder.

I'm a big believer in "cogs bad" because I've seen large teams live the
lesson.

From my viewpoint the perfect system is as simple as possible, but well
engineered.

------
kemiller
Another way to put this is that myopically optimizing for perfection along one
axis may fatally de-optimize another.

------
DenisM
My angle on the problem is the concept of "engineering debt": a well-designed
product is the state of being "debt-free", and a deviation from good
design is a unit of "engineering debt". That debt will have to be serviced in
the form of contortions you have to make to work around the design flaws, and
then eventually paid down in the form of a rewrite, or discharged in an
engineering bankruptcy (such as abandoning the product).

Engineering debt, much like financial debt, is an instrument one can use to
trade a smaller present expenditure for a larger future expenditure. Where
one makes sense, so often does the other.

Sadly, engineering debt is much harder to account for. Old companies carry
huge amounts of debt and are oftentimes oblivious to it.

I think we could advance the state of the art if we were to find a way to
quantify engineering debt. As a starting point, I suggest the ratio of line
changes aimed at servicing the debt vs. line changes aimed at creating new
features. If 100 lines of new functionality require 10 lines of base-code
changes, the debt is low; if the opposite is true, the debt is high. I believe
such a metric could speak to both business managers and engineers, so it
provides a good common ground for the two groups to reach consensus and
prioritize work.
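
As a sketch, the proposed metric is just a division; the function name,
counts, and thresholds below are illustrative, not part of the proposal.

```python
def engineering_debt_ratio(servicing_lines, feature_lines):
    """Ratio of line changes spent working around the existing design
    to line changes that deliver new functionality. Lower is better."""
    return servicing_lines / feature_lines

# 10 lines of base-code changes to land 100 lines of new features:
print(engineering_debt_ratio(10, 100))   # low debt: 0.1
# The opposite case, 100 lines of workarounds for 10 feature lines:
print(engineering_debt_ratio(100, 10))   # high debt: 10.0
```

In practice one would have to classify each diff hunk as "servicing" or
"feature", which is itself a judgment call; the ratio only summarizes that
classification.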

~~~
olliesaunders
The woes of high debt have always been my argument against all debt, ever
since very early experiences as a programmer where I had to recover from a
high-debt situation. I thought I was being clever in pointing out a cost that
is overlooked by divorced-from-details managers, and I blew it out of all
proportion. What I forgot, and it is the main point I take from this essay, is
that the cost of keeping debt low increases exponentially as the debt
decreases.

> I think we could advance the state of the art if we were to find a way to
> quantify engineering debt. As a starting point I suggest a ratio of line
> changes aimed at servicing vs. line changes aimed at creating new features.
> If 100 lines of new functionality require 10 lines of base code changes, the
> debt is low. The opposite is true, the debt is high

That’s the thing with this debt. You can only quantify it once you’ve paid it
back because its quantity is predicated on the cost of paying it back, which
differs depending on your aptitude for doing so. And because it’s invisible
neurotic programmers like me can start to actively fear it, leading to poor
decisions.

------
direllama
I just don't think "simple" is quite the right word for Worse is Better.

"... It's called Accessibility, and it's the most important thing in the
computing world.

The. Most. Important. Thing.

..."

[https://plus.google.com/112678702228711889851/posts/eVeouesv...](https://plus.google.com/112678702228711889851/posts/eVeouesvaVX)

~~~
lnanek2
What's really funny is that he says Google is ruled by people who only care
about product and that its platform is suffering, whereas Amazon is all
services by edict. But the most successful Android tablet out there is the
Amazon Kindle Fire. So Google didn't even beat Amazon on product, and even
ended up having to copy it with the Nexus 7.

------
charlieflowers
I think the lesson is: "whatever is available tends to propagate, even if it
is shit. Especially if it is for a large mass of humans, who tend to act
stupidly en masse."

You can see it all over the place. How great of an Internet provider was AOL,
for example?

------
hkon
If he is a perfectionist and has been for many years, then his worst can only
be so bad...

------
dreamdu5t
There are so many generalizations in this article I don't know where to
begin...

~~~
DenisM
Begin by listing one that you think is unwarranted. Otherwise your comment
detracts from the conversation rather than contributing.

