
The Rise of Worse Is Better (1991) - ropable
https://www.dreamsongs.com/RiseOfWorseIsBetter.html
======
segfaultbuserr
This article was originally written in a tone critical of C & Unix: it said
they have a lot of problems, but with only 50% of the functionality, a lot of
people accepted them as good enough, and they then spread like "the ultimate
computer viruses". After its publication, many Unix people have reinterpreted
it as a celebration of Unix's successful approach.

I have always strongly disliked JavaScript for its inconsistencies and ugly
syntax. I find it unfortunate that everything now comes with a JavaScript
engine and forces development in JavaScript regardless of the developer's
native programming language, and I have always dreamed of the alternative
timeline in which Scheme became the language of the web.
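
To be concrete about the inconsistencies I mean, here are a few well-known
ones that all date back to the language's original design (runnable as-is in
Node or any browser console):

```javascript
// typeof has a famous historical quirk: null reports as "object".
console.log(typeof null);        // "object"
console.log(typeof undefined);   // "undefined"

// Loose equality (==) coerces operands in surprising ways,
// and is not even transitive.
console.log(0 == "");            // true
console.log(0 == "0");           // true
console.log("" == "0");          // false

// The + operator concatenates strings, while - always does arithmetic,
// so mixed string/number expressions behave asymmetrically.
console.log("1" + 1);            // "11"
console.log("1" - 1);            // 0

// NaN is the only value in the language not equal to itself.
console.log(NaN === NaN);        // false
```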

But I've changed my position in recent years, because of this article. Now I
think the same principle applies to JavaScript. It was something with 50% of
the functionality, and so it became the ultimate virus on the Web, and since
then a huge amount of manpower has been spent on improving JavaScript. As a
result, it still has the same inconsistencies from its original design, but
thanks to all that effort, its overall usability is actually higher than that
of alternative systems with a "cleaner" / "better" design. Just as a
Unix-like OS such as FreeBSD or Linux is still one of the most usable systems
in existence. So I think I'd just accept JavaScript...

~~~
jstimpfle
> After its publication, many Unix people have reinterpreted it as a
> celebration of Unix's successful approach.

The other comment below links this:
[https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf](https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf)

Look on book page 219 (224 in my pdf viewer), and read around the section that
starts with "It is far better to have an underfeatured product that is rock
solid ..."

It's probably not a "reinterpretation by the Unix people". More likely, the
author did too good a job of not taking sides explicitly, so everyone just
interpreted it the way they liked. If anything, the author argued why "worse
is better" really is better. Taking the "worse" approach just means not
expending all the unnecessary effort, which results in a product that
actually exists.

In other words, it's an essay about how big bang approaches don't work out.

------
akkartik
I recently discovered my favorite summary of "Worse is Better". It's by the
author, but it isn't anywhere in articles by that name.

 _“It is far better to have an under-featured product that is rock solid,
fast, and small than one that covers what an expert would consider the
complete requirements.”_

[https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf](https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf)
(pg 219)

~~~
ChrisSD
Hm... when I think "worse is better" I'm not thinking of software that is
"rock solid, fast, and small". Have I been misunderstanding the essay?

I thought most of these applications start out small but they're nowhere near
rock solid and haven't been optimised for speed. It just gets the essential
features in the hands of the people who need it and works just well enough to
be useful.

~~~
akkartik
I was surprised as well! But this is from the author so one can't argue with
it.

As I think about it more, it makes a lot of sense. The only people willing to
put up with something flaky are programmers. For most people, it can do little
but it _has_ to be reliable.

~~~
ChrisSD
Also by the author (under a pseudonym), Worse is Better is Worse:
[http://dreamsongs.com/Files/worse-is-worse.pdf](http://dreamsongs.com/Files/worse-is-worse.pdf)

> I’ve always told him that his penchant for trying to argue any side of a
> point would get him in trouble.

------
mjw1007
I think the central point here is "completeness must be sacrificed whenever
implementation simplicity is jeopardized".

Indeed, sometimes omitting a feature which really ought to be there makes life
easier for the end user, not just the implementor, even though the user has a
good reason to want the feature.

For example, Subversion allows versioning empty directories while Git (like
CVS) doesn't.

On the face of it this is just a deficiency in git, but the fallout from
Subversion doing the Right Thing is quite extensive: because Subversion treats
directories as first-class objects rather than just part of a file's name, you
can get a Subversion repo into all kinds of strange and confusing states.

With git the user is never going to be confused by the result of something
like "I deleted the directory then tried to merge a branch which added a file
inside it".

------
gerbilly
My main criticism of the 'worse' development approach (rush out a simple MVP,
then iterate) is that it only works for simple problems.

Not all problems can be solved by incremental refinement.

~~~
workthrowaway
that's interesting. what are some example problems that incremental
refinement can't solve?

~~~
wry_discontent
The flaws of incremental refinement should be obvious to anybody who's worked
in a codebase more than 5 years old built on that approach. Maybe it could
work in theory, but in practice, the iterations on crap produce more crap.

As far as a specific example, I remember a discussion with Uncle Bob where he
specifically mentions banking and accounting as systems that shouldn't use
that approach because you'll build the wrong thing.

~~~
AnimalMuppet
And yet the conventional wisdom is that "a complicated system that works is
almost always found to have been derived from a simple system that works".

But you're right, evolving code often turns it into a mess. The only way that
doesn't is if, at each stage, the people working on it keep the architecture
and code clean. That takes discipline, not just by the programmers, but also
by management - they have to give the programmers time to do the cleanup that
is needed, not just time to shoehorn something in.

------
Isamu
The title itself lets you know the desire of the author to occupy some
philosophical high ground, while admitting some hard truths.

They could have asked more honestly, "why is C/Unix winning hearts and minds
while Lisp-based systems are not?", but first they wanted to establish as a
given that C/Unix was clearly inferior to Lisp-based systems. That was not up
for debate.

So that sets the bounds for the discussion that follows, and frames the
discussion as "why are people choosing the clearly inferior over the clearly
superior?"

You could write a similar essay from the point of view of what you might call
the "original intent" of C/Unix, which is that simplicity is chronically
undervalued and everybody, everywhere, all the time, tries to add "just one
more feature" to make it better.

~~~
jonjacky
That essay has already been written: Rob Pike's "UNIX Style, or cat -v
Considered Harmful" [1]. It inspired a web site and a project devoted to
simplifying Unix [2].

[1] [http://harmful.cat-v.org/cat-v/](http://harmful.cat-v.org/cat-v/)

[2] [http://cat-v.org/](http://cat-v.org/)

------
ChrisSD
> C is therefore a language for which it is easy to write a decent compiler,
> and it requires the programmer to write text that is easy for the compiler
> to interpret.

Ironically this is no longer true. Not so much because C has changed but
because the hardware underneath it has.

~~~
chrchang523
I would instead state that the standard for what constitutes a "decent
compiler" has risen. For any given performance target, I'd still bet that it
takes less work to hit it with a C compiler than with a compiler for most
newer languages. (Partly because the success of C has resulted in C-related
hardware design constraints...)

------
wayoutthere
Worse isn't better, _simple_ is better. Simple is much easier to adapt to
complex use cases because it can be easily understood at a high level. Simple
avoids bikeshedding, which Lisp developers (the intended audience of this
article) are notorious for.

------
dang
A thread from 2018:
[https://news.ycombinator.com/item?id=16716275](https://news.ycombinator.com/item?id=16716275)

2014:
[https://news.ycombinator.com/item?id=7202728](https://news.ycombinator.com/item?id=7202728)

2011:
[https://news.ycombinator.com/item?id=2725100](https://news.ycombinator.com/item?id=2725100)

------
DannyB2
I remember reading this, when it was first published, I think maybe in AI
Magazine.

Being a huge Common Lisp fan at the time, I immediately adopted the idea that
Correctness is the most important single thing above all else. I don't care
what's in the box. The external interfaces on the box should be correct.

This seems like a basic thing we take for granted in tools, libraries,
languages and other things we use. Dishwashers. Thermostats.

------
tboyd47
There's also the hiding-in-plain-sight explanation that C was just easier than
Lisp for English-speaking people to learn and use because, like English, it's
SVO instead of VSO. Then object-oriented languages overtook C for the same
reason.

------
chooseaname
It's like literary fiction vs popular fiction. One might think literary
fiction is the right way to do fiction, but popular fiction is where the money
is.

~~~
AnimalMuppet
But define "better". Literary fiction is "better" in the sense of
"communicating profound ideas better". Popular fiction is "better" in the
sense of "being something that people want to read".

