
The Rise of Worse Is Better (1991) - otoolep
http://dreamsongs.com/RiseOfWorseIsBetter.html
======
pjmlp
Here is another example of worse is better, regarding how C's industry
adoption caused a regression in compiler-optimization research:

"Oh, it was quite a while ago. I kind of stopped when C came out. That was a
big blow. We were making so much good progress on optimizations and
transformations. We were getting rid of just one nice problem after another.
When C came out, at one of the SIGPLAN compiler conferences, there was a
debate between Steve Johnson from Bell Labs, who was supporting C, and one of
our people, Bill Harrison, who was working on a project that I had at that
time supporting automatic optimization...The nubbin of the debate was Steve's
defense of not having to build optimizers anymore because the programmer would
take care of it. That it was really a programmer's issue....

Seibel: Do you think C is a reasonable language if they had restricted its use
to operating-system kernels?

Allen: Oh, yeah. That would have been fine. And, in fact, you need to have
something like that, something where experts can really fine-tune without big
bottlenecks because those are key problems to solve. By 1960, we had a long
list of amazing languages: Lisp, APL, Fortran, COBOL, Algol 60. These are
higher-level than C. We have seriously regressed, since C developed. C has
destroyed our ability to advance the state of the art in automatic
optimization, automatic parallelization, automatic mapping of a high-level
language to the machine. This is one of the reasons compilers are ...
basically not taught much anymore in the colleges and universities."

\-- Fran Allen interview, Excerpted from: Peter Seibel. Coders at Work:
Reflections on the Craft of Programming

~~~
mgummelt
I'm not sure the usage of C for apps is an example of worse is better so much
as network effects. It's easier to write apps in the same language as the OS.

~~~
bsder
In addition, C progressed because it was _free_.

People forget that compilers in the 1980's were expensive.

If you wanted something compiled other than assembly language, your choices
were C (1987--gcc could generally bootstrap) and Pascal (1984--mostly
TurboPascal) and ... nothing.

Look at what was sitting in Dr. Dobb's Journal in the 1980's. People were
typing in enormous programs in BASIC(!) in order to play with things like LISP
and Prolog. Not a recipe for adoption ...

I remember getting stuck using some Macintosh Lisp thing in 1986/1987--I pined
for my OS9 machine and C and the dulcet (not!) grind of its floppy drive.
That says something.

~~~
pjmlp
C was not free at all.

Small-C was probably the only dialect freely available, everything else was
either commercial, or came for "free" with a UNIX workstation price tag.

In fact, it was due to Sun's change of heart regarding developers that some
devs started to contribute to GCC.

[https://news.ycombinator.com/item?id=8988021](https://news.ycombinator.com/item?id=8988021)

[https://news.ycombinator.com/item?id=8984454](https://news.ycombinator.com/item?id=8984454)

I have a few editions of Dr. Dobb's Journal comparing C compilers, all
commercial; GCC was yet to be relevant.

~~~
bsder
Okay, maybe C wasn't _free_ at the time, but you could get it for anywhere
between $50 and $100 in 1985 dollars--that qualifies as fairly cheap,
especially for the kinds of technically competent individuals who would buy
such a thing.

So, what "better" language could you get for that price at that time?

~~~
pjmlp
Actually, without going diving back into old magazines, it was more like $150
or more, even with a student discount.

Pascal compilers like Turbo Pascal, for example.

~~~
AdieuToLogic
Aztec C[0], by Manx, cost about $200 USD in 1988. It came with a "complete"
dev environment for PC-DOS (the version I have) which included:

\- C compiler

\- Assembler

\- Linker

\- Make

\- Grep

\- Z (an editor kinda like vi IIRC)

I remember it being able to make very small executables, which meant a lot at
that time.

0 -
[https://archive.org/details/bitsavers_manxAztecClApr88_25693545](https://archive.org/details/bitsavers_manxAztecClApr88_25693545)

------
jancsika
I wonder if "Worse is Better" simply wins because developers experience
failure at every part of the process "for free," as the common case. You
design a crappy API, test it, hack around it, ship it, debug it, criticize it,
design something better, deprecate it, and continue through another iteration
of the entire dev process.

Meanwhile the other school is trying to design an elegant and complete API,
but the elegance and completeness cannot leverage a full iteration of real
tests, hacks, release, real-world debugging, real-world criticism from
arbitrary devs/users, plus all the pressure to design something better so the
awful idea can be deprecated and forgotten before it brings the whole project
to its knees.

~~~
convolvatron
Worse does win.

But I don't get why a well-designed API can't go through the same kind of
real-world feedback that you describe. Are you implying, without saying it,
that the good version just never makes it out into the world because it's
being perpetually polished?

~~~
jancsika
Here's some rank speculation after reading the first few weeks of the W3C SVG
group archive:

A well-designed API typically arrives at the dev cycle _from somewhere else_.
This causes longer iterations in the dev cycle because new devs have to
contend with two sources of truth-- the dev cycle itself and whatever
unreachable realm used unreachable arguments to bring the well-designed API
into existence.

So you get these weird situations where certain topics are in scope, but
others are out of scope because they were decided in the unreachable realm.
For example, a discussion about using CSS for SVG shape attributes (fill,
stroke, etc.) gets cut short because the unreachable realm already decided
that they don't want to have to "parse twice" in order to get at the CSS data.

Oops: now modern browsers have CSS props, SVG presentation attributes whose
same-named CSS props override them, _and_ other SVG attributes which don't
have corresponding CSS props. _Except they do in the SVG 2.0 spec_.

And now you're stuck because you've got this wonderful new(ish) (actually
well-designed) API for web animation but it's unclear how to treat SVGs
because of the ambiguity among the above categories. Chromium seems to already
be doing things the SVG2 way while last I checked Mozilla still uses SVG1.1
and had prefixes for the svg presentation attys (perhaps behind a feature
flag, can't remember).

Slowwww...

Compare that to just deciding early on to put _all_ attys that could
conceivably be related to "presentation" in CSS.

But that's hard to argue when the counter-argument is that the issue was
already considered in an unreachable realm in a process that isn't available
for inspection.

I also read threads where both SVG path syntax and the awful path arc command
parameters could have been improved if the unreachable realm hadn't existed.

On the other hand, devs who write self-described "crappy apis" (or some such
self-deprecating business) don't typically have an unreachable realm. Or if
they do, their self-deprecation makes it clear that it rarely if ever trumps
the open realm.

So my idea is that a single iteration through crappy-API-land is faster
because there's only one source of truth, which leads to greater clarity and a
greater incentive to participate at any level.

------
fullsage
One of the best product post-mortems that I've seen is the RethinkDB post from
Slava:

[http://www.defmacro.org/2017/01/18/why-rethinkdb-failed.html](http://www.defmacro.org/2017/01/18/why-rethinkdb-failed.html)

He does a great job describing how the "worse is better" essay plays out in
the modern world, and how it played out for them in the OSS DB market.

~~~
ScottBurson
Previous HN discussion:
[https://news.ycombinator.com/item?id=13421608](https://news.ycombinator.com/item?id=13421608)

------
vilhelm_s
I also highly recommend Yossi Kreinin's "What _Worse is Better vs The Right
Thing_ is really about"
([https://yosefk.com/blog/what-worse-is-better-vs-the-right-thing-is-really-about.html](https://yosefk.com/blog/what-worse-is-better-vs-the-right-thing-is-really-about.html))

> Some view [markets and competition] as mostly good and others as mostly
> evil. This is loosely aligned with the "right" and the "left" [...] I will
> try to show that the disagreement about markets is at the core of the
> conflict presented in the classic essay, _The Rise of Worse is Better_ [...]
> and not the trade-off between design simplicity and other considerations as
> the essay states. So the essay says one thing, and I'll show you it really
> says something else. Seriously, I will.

------
api
Modern examples include JavaScript and the web as an application platform.

While the arguments here have some validity, I think a more important reason
for the triumph of worse-is-better is human cognitive overhead. The correct
solution being more complex requires more cognitive overhead to support,
maintain, port, and implement when new implementations are needed.

Yet another reason is probably political. Correct solutions are harder to
develop, so they are often developed by commercial entities and subject to
restrictive licenses. Free software has fewer resources, so it tends toward
the New Jersey approach. It's hard to compete with free.

~~~
chubot
Yeah I'd argue that a prime example of "worse is better" is JavaScript and
APIs like document.write("<p>..."). [1]

In other words, starting out with a browser that can render HTML and CSS, and
then adding an API like document.write() is a huge hack.

Exposing the entire rendering engine with a textual interface is hacky. For
one, the lack of escaping makes security bugs the default (somewhat fixed with
template strings in ES6). It's also really hard to reason about performance,
because you don't know when layout changes are triggered and so forth.

The whole "virtual DOM" thing is an example of fixing the flaws in this API.

But on the other hand, it's sort of "simple" in that you don't have to learn
as much to use it, and it gives you the POSSIBILITY of doing everything
dynamically.

[1] NOTE: There are 2 distinct points in the essay. One is that "It is better
to get half of the right thing available so that it spreads like a virus."
That is, ship first, and then evolve the software.

The other is whether you should expose the implementation details of the
underlying platform, or if you should wrap it up in a "nice" interface.
JavaScript does the former.

Also, neither of these points is "crappy software wins", which the title seems
to imply. It's more subtle than that.

~~~
etatoby
JavaScript was designed in 10 days in 1995 to get _something_ out ASAP. It was
simple and hacky and it spread like a virus in the following 23 years,
covering almost every niche of programming.

It is only now getting such (basic) language features as variables and
constants being lexically scoped by default, a generic flow control primitive
(async/await + exceptions which is almost, but not quite, as powerful as
call/cc), as well as useful features such as multiline/template strings,
proper classes, symbols, a shorter lambda syntax, and dedicated hashmap/set
containers.

JavaScript is the quintessential example of why Worse is Better.

PS: even though CSS was first proposed in 1994, the availability of JavaScript
and document.write() in browsers, as well as their use on the web, predates
the adoption of CSS by several years.

------
zakum1
This is one of the most insightful pieces about human behavior ever written,
and it is timeless in its applicability. It is not just about languages and
computer science, but about understanding adoption in a digital world. We can
all apply it to our products as much as to our technology.

------
kbp
It's mentioned at the top of the linked page, but just to highlight it, this
is only an excerpt from the longer piece "Lisp: Good News, Bad News, How to
Win Big" which was written in 1991 and talks at length about the then-current
state of the Lisp world and its future. The full article is well worth a read:
[https://dreamsongs.com/WIB.html](https://dreamsongs.com/WIB.html)

~~~
zeveb
> The full article is well worth a read:
> [https://dreamsongs.com/WIB.html](https://dreamsongs.com/WIB.html)

Agreed. It’s notable (among _many_ other things) for its comment in 3.1,
‘Scheme is a smaller Lisp, but it also suffers from the MIT approach. It is
too tight and not appropriate for large-scale software.’ I’m always a bit
saddened when I see a production system written in Scheme instead of Lisp.

His thoughts in 3.6 are very interesting, too. From his thoughts on method
definitions, it sounds like maybe he wanted a language-specified dynamic
editing environment. Either way, it’s sad that the Lisp world never came out
with a proper successor to Common Lisp. It’s by far the best general-purpose
language out there today, but it is not the best possible.

------
Animats
That's from the late 1970s or the early 1980s.

The modern version is the shift from "shut up until you can demo" to "market,
then build".

------
mad44
To give a recent example: compared to Borg, Mesos exemplifies the "worse is
better" approach.
[https://muratbuffalo.blogspot.com/2017/02/mesos-platform-for-fine-grained.html](https://muratbuffalo.blogspot.com/2017/02/mesos-platform-for-fine-grained.html)

What are other recent examples you can think of?

~~~
danieldk
I think Go is a pretty good example of Worse is Better. Go's designers
preferred simplicity over correctness (some copying and pasting here and
there is better than introducing the complexity of generics).

(This is without negative sentiment, I use Go for some work.)

~~~
mgummelt
How does Go lack correctness?

~~~
danieldk
Some examples:

\- You can accidentally implement an interface.

\- Lack of a proper ownership model can lead to unexpected results, which
leads to bugs. Consider e.g. growing of slices:

[https://play.golang.org/p/3qfd5B8_ktK](https://play.golang.org/p/3qfd5B8_ktK)

\- nil is not equal to nil:

[https://golang.org/doc/faq#nil_error](https://golang.org/doc/faq#nil_error)

(Obviously, it is easy to move the goalposts without a proper definition of
'correctness' ;).)

~~~
cdoxsey
I think correctness is the wrong word. Maybe safety would be better?
(Verbosity? Precision?)

Go is quite a stable language. Well-specified, rigorously tested, and
conservatively updated.

I think all the behavior you listed is well-defined, albeit perhaps
unexpected.

Lack of generics doesn't make code incorrect. Less clear maybe, or easier to
make a mistake. Or clearer and a lot more code.

I guess you could argue that it makes writing correct code harder?

------
ripap
See also Gabriel's (Nickieben Bourbaki is an anagram) rebuttal of his own
paper, "Worse is Better is Worse"
([http://dreamsongs.com/Files/worse-is-worse.pdf](http://dreamsongs.com/Files/worse-is-worse.pdf)).

~~~
B1FF_PSUVM
Or realising that "perfect is the enemy of good" is an adage for a reason ...

[https://en.wikipedia.org/wiki/Perfect_is_the_enemy_of_good](https://en.wikipedia.org/wiki/Perfect_is_the_enemy_of_good)

------
pchristensen
I was just thinking about this, comparing the Uber (worse) vs Waymo/Google
(Better) approaches to self-driving cars.

------
fooblitzky
I wonder how much impact the availability of libraries has, compared to the
elegance of the design?

