
The problematic culture of “Worse is Better” - jamii
http://pchiusano.github.io/2014-10-13/worseisworse.html
======
haberman
> “Worse is Better”, in other words, asks us to accept a false dichotomy:
> either we write software that is ugly and full of hacks, or we are childish
> idealists who try to create software artifacts of beauty and elegance.

I think this criticism is slightly misattributed. _Ideologues_ ask us to
accept a false dichotomy. Ideologues usually exist on both sides of an issue.
For example, there are most definitely ideologues who believe this same
dichotomy exists, but believe that the only option is to choose beauty and
elegance, even at the expense of practicality. These are the people who (for
example) wish Lisp machines had won and that we lived in a monoculture of
_only_ Lisp.

I like C++ because I think it's useful, but I'm not a Worse is Better
Ideologue. For example, I like Rust as a potential "better is better"
replacement for a lot of C++'s use cases. I like Rust particularly because it
seriously addresses nearly all of the practical advantages of C++. Many
"Better is Better" ideologues would rather just hand-wave away the negatives
of the compromises required to achieve their vision (like mandatory GC).

Rust proves that "better is better" design can address real-world practical
challenges, while also providing an escape hatch back into a more primitive
"Worse is Better" kind of world (unsafe blocks). It will be interesting to see
if this approach gains the traction that I hope it will.
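
To make that escape hatch concrete, here's a minimal sketch (just an
illustration): the default path is bounds-checked, and an `unsafe` block is
the explicit, clearly-marked opt-out back into the raw-pointer world.

    fn main() {
        let xs = [10u8, 20, 30];

        // Safe by default: indexing is bounds-checked, so an out-of-range
        // index panics rather than silently reading past the array.
        let second = xs[1];
        println!("checked read: {}", second);

        // The escape hatch: inside `unsafe`, raw-pointer arithmetic and
        // dereferencing are allowed, and the programmer vouches for them.
        let raw = xs.as_ptr();
        let third = unsafe { *raw.add(2) };
        println!("unchecked read: {}", third);
    }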

~~~
handsomeransoms
Great point, just a nit: Rust does _not_ have mandatory GC.

~~~
steveklabnik
It doesn't even have optional GC.

~~~
dllthomas
Arguably, it has mandatory _manual_ GC.

------
al2o3cr
"Other professions, like medicine, the law, and engineering, have values and a
professional ethic, where certain things are valued for their own sake."

[citation needed]

I can't parse what exactly this is supposed to refer to. Doctors treat the
patients they get - and frequently wind up with no choice but to use
treatments that are only barely worse than the disease (see the history of
chemotherapy drugs, for instance). Outside of legal academia, the legal
profession is always working with imperfect information, imperfect systems,
imperfect people making the calls (ffs, I feel a Law and Order episode
breaking out)...

To me, the giant disconnect is that we've still got two threads of thought
mixed together under "Computer Science": the actual science-y bit, and
the "shovel bits from A to B" software construction part. It's as if materials
science, structural engineering, and construction management were all lumped
together. Putting a new sidewalk in does not require the development of an
entirely new method for making concrete.

~~~
karmelapple
How long did it take these three different fields to fully separate and have
their own concentrations in universities? That may give us a timeline for how
long it could take in the computer science, software engineering/architecture
(does such a concentration exist involving architecture?), and software
development methodology (project management... which is almost nonexistent in
academia) fields.

~~~
dreamfactory2
Computer science used to be quite separate from application development and
the nearest it would come to actual software was low level OS componentry. Any
conglomeration is more recent.

------
jasode
Discussing the meme "worse is better" is difficult because there are different
interpretations and usages of that phrase. It's description vs prescription.
The blog author chose the prescriptive interpretation. I can't tell whether
the author did this unknowingly because he did not see the separate evolution
of that meme.

First interpretation is the descriptive usage. We could explicitly prefix the
meme and qualify it as RGWIB (Richard-Gabriel-Worse-is-Better). The original
label was the observation that "simpler" software was more successful than
more full-featured software with ambitious goals. Richard's thesis wasn't
about "hacks" but about small and simple things that satisfy users and builds
momentum.

Second interpretation is the prescriptive (or self-justifying) stance which I
might call HOBWIB (Hacks-Ok-Because-Worse-is-Better). This appears to be what
the blogger is complaining about.

However, HOBWIB isn't Richard Gabriel's thesis. That "worse is better" has
taken on a life of its own and been _repurposed_ by others who are unaware of RG's
original meaning is just a circumstance of adopting snappy soundbites.
Whatever behavior the author is complaining about would exist whether the
exact phrase "worse is better" existed or not.

> “Worse is Better”, in other words, asks us to accept a false dichotomy:
> either we write software that is ugly and full of hacks, or we are childish
> idealists who try to create software artifacts of beauty and elegance.

That label does not have that power over us. For example, we have a label for
certain human behavior and call it "passive aggressive." The existence of that
phrase _did not force us to choose_ whether to be passive aggressive or not.
Likewise, thinking that the existence of 3 words "worse is better" is forcing
us into a dichotomy of bad vs good design is flawed analysis.

~~~
akkartik
Totally.

RPG's WIB said _nothing_ about bad design. C didn't do better than lisp
because it was a hack. It was an incredibly clean design, as clean as the
original lisp. RPG was comparing two clean designs, and trying to describe the
evolutionary properties of one that made it more fit in its environment (ie
us) than the other. The biggest such property is simplicity of implementation.
(Later C compilers have gotten more complex and ++, but a modern compiler
wouldn't have been as competitive in an earlier world where C wasn't already
adopted: [http://www.ribbonfarm.com/2011/09/23/the-milo-criterion](http://www.ribbonfarm.com/2011/09/23/the-milo-criterion))

------
pron
I can argue with many points made in this article, but there's one assumption
at the very core that I think is false: _that we know how to do it better_.

There have been many ideas in software development floating around in the last
couple of decades, from pure functional programming to interactive programming
and programming by example. All of these approaches -- and the many others I
haven't mentioned -- have explored interesting and possibly promising future
directions, but all have yet to demonstrate _consistently superior results_,
across various domains, to the "broken" system we have today.

We are still exploring, and maybe some of the ideas people play with today
will serve as the seeds of future, revolutionary development procedures. But
it's not like we _know_ today how to do it significantly better even if we
wanted to do away with the "old" ways right now.

~~~
sanderjd
Yep, in my experience with my own software, today's "revolutionary, way better
designed approach" is tomorrow's "hacky awfulness that needs a revolutionary,
way better _re_ design", ad infinitum. It's much cheaper to accept and
incrementally improve an imperfect solution. But I recognize the possibility
that this is purely a personal experience and better developers than I really
do know "how to do it better".

~~~
RyanZAG
_> better developers than I really do know "how to do it better"._

They don't. They're just selling something. Not to say some approaches aren't
better than others; it's just that the only way to know which approach is really
better is to implement the same problem space in both and compare. Anybody
telling you that X is better than Y at Z without personally using both X and Y
to do Z is just trying to sell you something. Beware.

It's also why the best test when deciding on a programming language or
framework is to look at what others have actually created with it.

~~~
dllthomas
_" It's also why the best test when deciding on a programming language or
framework is to look at what others have actually created with it."_

To a point. If everyone followed that rule, no one would ever build anything
in anything new. At which point, looking at what others have actually created
with it would merely be a test of the longevity of the programming language or
framework.

------
jamii
Calling people out as #worseisworse is almost certainly going to be
ineffective - attacking people just makes them defensive and argumentative.
Instead, can we find useful principles to steer us away from the two bogeymen
(hack-it-till-it-works vs useless-ivory-tower-wankery)? Some ideas:

 _The real world exists._ Successful solutions usually come from actually
understanding the problem and all its constraints. Solutions that are
dismissed as overly academic tend to fall afoul of solving just the immediate
problem and ignoring e.g. compatibility, switching costs, and accessibility to
beginners. Similarly, the Linux community spent a long time burying its head
in the sand w.r.t. usability and appearance. Sure, people _shouldn't_ care
how pretty your UI is. But they do and you can't change that by burying your
head in the sand.

 _Everything has a cost._ The benefits of your new solution and the switching
cost have to outweigh the pain of the old solution by a large margin. The
original 'worse is better' was really an observation that simple solutions
that do half the job are often cheaper than a hugely complicated solution that
can do everything. If you spend every day coming up with better ways to make
widgets, it's easy to believe that everyone wants to have as much widget-
customising power as possible and is willing to invest time in learning to get
that power. If you actually talk to your users you might find that only having
three kinds of widget isn't annoying enough to make them spend time learning
something better.

~~~
afarrell
I agree with this except possibly one thing.

Saying "Sure, people shouldn't care how pretty your UI is." mischaracterizes
user interface design as being about prettiness rather than interaction. But
that might in fact be your very point, and you might actually be talking about the
mischaracterization of others.

~~~
jamii
I was thinking of the fact that beautiful software is perceived as more
usable, all else being equal ([http://www.ergonomicsclass.com/wp-content/uploads/2011/11/Tr...](http://www.ergonomicsclass.com/wp-content/uploads/2011/11/Tractinsky_2000.pdf)). This is irrational but not
controllable. Unfortunately many good solutions lose out in the market because
their developers make the mistake of expecting users to be rational econs,
rather than accepting reality and working around it.

------
Htsthbjig
What this man calls "worse" actually means "something that is better in ways I
fail to understand in my simplistic reduction of the complex world".

I love Lisp. I used it a lot for creating code that elegantly modifies itself.

But my entire software company succeeded because of languages like C/C++ and
Python, not because of Lisp or any "functional programming language of the
week".

C gives you raw power that nothing else does. The fact that we could use C in
C++ became very useful many times.

Some people believe that forcing other people not to use things like pointers
is a good thing, it is "better".

For me it is the same as forcing people not to use sharpened edges on knives
for their own good.

Yes, it is better in the sense that people won't cut themselves with knives.
But it is worse in other areas too.

And if you let people choose, they will choose to continue using their knives
until something genuinely better appears.

This is exactly what people like this man can't stand: people choosing on
their own to use something they don't like, so they want to force their
"better" way (my way or the highway).

If you have something better, well, show us the code, instead of ranting, and
you will discover the fact that making anything that people actually want to
use is way harder than ranting in a blog.

~~~
dllthomas
_" For me it is the same as forcing people not to use sharpen edges on knifes
for their own good. Yes, it is better in the sense that people wont cut
themselves with knifes."_

People cut themselves far worse trying to use a knife that's too dull for
their task than a knife that's too sharp. Dull knives are only safer when
you're not using the knife (more likely to be cut by a sharp knife in a drawer
than a dull knife in a drawer, &c).

------
yuliyp
The viewpoint the author seems to expound reeks of second system syndrome.
That is, he seems to argue that incremental evolution is broken, since the
state at any given point is suboptimal given what someone building from
scratch would build, having all the lessons of hindsight with them.

Of course it is. Yet computer science history is full of overly ambitious
projects (Plan9, Vista, Lisp) falling by the wayside as inferior solutions
which were more incremental continued to chug along. The author didn't stop to
consider why those failed, only to blame a myopic community; as if Lisp's
superiority would have proven itself if only more people had given it a chance.

Incremental changes are how we progress. We take the lessons we have learned
from what we're doing now, and then try something a little bit different. We
can't try and redesign everything at once, so we pick a few things, and other
compromises get left in. Compromise is an essential requirement to getting
anything big done.

~~~
samstokes
_he seems to argue that incremental evolution is broken_

I think he argues that this sort of absolutism is broken.

The author isn't proposing that we discard incremental change, just the
uncritical assumption that incremental is the only reasonable change.

See his analogy with portfolio theory: he's not even challenging
incrementalism as the default, any more than he'd suggest putting 90% of your
money into emerging markets.

~~~
sanderjd
It seems more like a historical observation than an uncritical assumption,
that incremental change has shown itself to be more successful. The author's
claim seems to be that incremental change has been more successful
historically because it has more mindshare. I think that causality is
backward.

------
Animats
The one-line summary: "Basically, no one seems to grasp that when stuff that's
fundamental is broken, what you get is a combinatorial explosion of bullshit."

Exactly. A few examples.

\- C's "the language has no idea how big an array is" problem. Result: decades
of buffer overflows, and a whole industry finding them, patching them, and
exploiting them.

- Delayed ACKs in TCP. OK idea, but the fixed timer was based on human typing
speed, copied from X.25 accumulation timers. Result: elaborate workarounds,
network stalls.

\- C's "#include" textual approach to definition inclusion. Result: huge, slow
builds, hacks like "precompiled headers".

- HTML float/clear as a layout mechanism. Result: Javascript libraries for
layout on top of the browser's layout engine. Absolute positioning errors.
Text on top of text or offscreen.

- The UNIX protection model, where the finest-grain entity is the user. Any
program can do anything the user can do, and hostile programs do. Result: the
virus and anti-virus industries.

- Makefiles. Using dependency relationships was a good idea. Having them be
independent of the actual dependencies in the program wasn't. Result: "make
depend", "./configure", and other painful hacks.

- Peripheral-side control of DMA. Before PCs, IBM mainframes had "channels",
which effectively had an MMU between device and memory, so that devices
couldn't blither all over memory. Channels also provided a uniform interface
for all devices. IBM PCs, like many minicomputers, originally had memory and
devices on the same bus. This reduced transistor count back when it mattered.
But it meant that devices could write all over memory, and devices and drivers
had to be trusted. Three decades later, when transistor counts don't matter, we
still have that basic architecture. Result: drivers still crashing operating
systems, many drivers still in kernel, devices able to inject malware into
systems.

- Poor interprocess communication in operating systems. What's usually needed
is a subroutine call. What the OS usually gives you is an I/O operation. QNX
gets this right. IPC was a retrofit in the UNIX/Linux world, and still isn't
very good. Fast IPC requires very tight coordination between scheduling and
IPC, or each IPC operation puts somebody at the end of the line for CPU time.
Result: elaborate, slow IPC systems built on top of sockets, pipes, channels,
shared memory, etc. Programs too monolithic.

- Related to this, poor support for "big objects". A "big object" is
something which can be called and provides various functions, but has an
arm's-length relationship with the caller and needs some protection from it.
Examples are databases, network stacks, and other services. We don't even have
a standard term for this. Special-purpose approaches include talking to the
big object over a socket (databases), putting the big object in the kernel
(network stacks), and trying to use DLL/shared object mechanisms in the same
or partially shared address space. General purpose approaches include CORBA,
OLE, Google protocol buffers and, perhaps, REST/JSON. Result: ad-hoc hacks for
each shared object.
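
To make the first bullet concrete with the Rust mentioned upthread (just an
illustrative sketch, not a prescription): when the language does know how big
an array is, the length travels with the slice, and a size mismatch is caught
at the call site instead of becoming a silent overflow.

    // A copy routine whose buffers carry their own lengths. The C equivalent
    // takes two bare pointers and has no idea how big either buffer is.
    fn copy_into(dst: &mut [u8], src: &[u8]) {
        // The mismatch is detected up front instead of scribbling over
        // whatever happens to live after `dst` in memory.
        assert!(src.len() <= dst.len(), "destination too small");
        dst[..src.len()].copy_from_slice(src);
    }

    fn main() {
        let src = [1u8, 2, 3, 4];
        let mut dst = [0u8; 8];
        copy_into(&mut dst, &src);
        println!("{:?}", dst); // [1, 2, 3, 4, 0, 0, 0, 0]

        // let mut tiny = [0u8; 2];
        // copy_into(&mut tiny, &src); // panics instead of corrupting memory
    }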

~~~
rm445
Blaming UNIX for the virus industry is a stretch. A per-user protection model
is better than none at all.

~~~
tensor
The anti-virus industry is a result of DOS and Windows _not_ having a unix
permission model. Running as the user with highest privileges was the norm.

Now that a unix permission model is the norm, viruses are comparatively gone
and replaced by malware that simply tricks the user into installing it. No
permission model will help you against this. As a partial result, now we see
things like iOS where we remove control from the user, or OS X where we try to
make it inconvenient to be duped into giving access.

There are certainly still exploits that don't require duping the user, but the
anti-virus industry certainly wasn't established based on these.

~~~
dllthomas
_" No permission model will help you against this."_

That's not really true. No permission model will be 100% effective, but a more
fine-grained permission model might lead to more users saying "Um, no,
mysteriously executable pornography, I _don't_ want to give you my bank
records and the ability to email my friends."

~~~
abstrakraft
Like how (non-tech) people pay attention to the permissions required by
Android and iOS apps they want to install?

~~~
ObviousScience
Let's imagine that I have privilege grouping sub-users, something like
name.banking, name.work, etc. Now my work files can't see my banking unless a
window pops up going "Would you like Experimental Thing for Work to have
access to name.banking?"

I think being able to explain to the computer how my data is grouped, and
access patterns in it, is more natural for users than most of the security
models we have today.

It's also much easier to have two copies of the browser load, depending on
whether I'm invoking it through name.banking or name.general. And much easier
to explain to grandma that you do banking when you use name.banking and you
look at cat photos in name.general.

Grandma isn't stupid; she just doesn't understand how technology works. Making
permissions based around how she categorizes her information and how she
divvies up tasks is more natural for her than insisting security only work if
she understands how computers work.

------
dasil003
I was heavily critical of the author's original rant about CSS, and my
citation of worse-is-better probably at least partially inspired him to write
this post ([http://pchiusano.github.io/2014-07-02/css-is-unnecessary#com...](http://pchiusano.github.io/2014-07-02/css-is-unnecessary#comment-1468335855)).

However, he completely misunderstood my point as suggesting that worse-is-better
is a conscious design quality. In no way did I mean that, and I don't believe
Richard Gabriel meant that either. Worse-is-better is about the reason certain
solutions win in the market; it's about an evolutionary trait, not a design
philosophy.

Look at it this way. If there are 100 possible technical solutions to a given
computing problem, the ones that solve the problem more comprehensively are
naturally going to tend to be more complex; this complexity comes with an
adoption cost, and that cost works against the likelihood of adoption.
Furthermore, when you are talking about something that solves the scope of
problems which CSS solves, there is no way you can just sit down and design a
better system, it will need to go through many iterations to solve all the
things that CSS solves (which no single human being is comprehensively aware
of btw). In order to see that kind of investment a system needs buy-in from
downstream users over a long period of time.

So the strawman that the author sets up is the idea that someone is out there
selling a worse solution on purpose because of this meme which is simply an
observation of how technical adoption markets play out. Of course such foolish
people may be out there, but no intelligent developer sets out to create
something that is deliberately worse. Rather, each tradeoff is considered in
its specific contemporary context of the imperfect information available.
Whether a solution gains traction and wins in the marketplace has nothing
to do with the subjective qualities of "worse" or "better", but rather is a
confluence of the state of the market at various points in time: how well it
solves immediate problems, how well it works with existing tech, how easy it is
to adopt, and of course some amount of hype loosely related to the
aforementioned.

What doesn't play a huge role is how ugly the evolution of this tech is going
to look 20 years down the line when the entire landscape has evolved.

It's a little tiresome when a young idealist comes into a 25-year-old field
(I'm guessing the author is not much older than that himself), pisses on the
hard work of thousands of people pushing web tech forward bit by bit over
decades, and says it should be replaced wholesale because it's just rubbish,
and then, when someone tells him he's welcome to try but will never get any
traction in that Sisyphean task, responds that people aren't engaging in a
rational discussion.

~~~
grey-area
_Furthermore, when you are talking about something that solves the scope of
problems which CSS solves, there is no way you can just sit down and design a
better system_

CSS has proved an awful system for layout, design and practically every domain
it attempts to address over the last couple of decades. Having worked with
both for a long time, here's the fundamental difference between HTML and CSS,
as I think it's a useful comparison and highlights when worse is truly better,
and when it is just worse:

HTML was limited and simple by design, and has gradually improved (the
original worse is better ethos)

CSS was broken by design, and hasn't much improved (maybe in a few years with
flexbox and grids, maybe).

I don't find it at all surprising that someone thinks CSS could and should be
replaced, and I'm skeptical that if the best minds of our generation took
another look at it, they couldn't find some far better ways to lay out content
than an inconsistent and confusing box model with concepts like floats tacked
on to it and lots of module proposals to try to glom on additional layout
modes. I can't think of many problems CSS solves well, apart from separation
of style/layout and content, and even there, at present, we have div soup for
grids instead of table soup - hardly a huge improvement on what went before.
Some of the problems it attempts to solve are not even real-world problems and
are of its own devising (for example the Cascading Style priorities in its name - what
a bizarre focus for a layout language).

Perhaps the answer is a Turing-complete layout language (personally I doubt
it), or perhaps it's another more informed declarative one, but I'm quite sure
we can improve on it, and the comparison to HTML is apposite, because that has
stood the test of time rather well compared to its companion technologies.

 _It's a little tiresome when a young idealist comes into a 25-year-old
field_

We don't live in the best of all possible worlds, and to make it better, we
sometimes have to take a step back from a local maximum and look at the bigger
picture; that involves listening to 25-year-olds coming up with something
better - most of the time they won't, but sometimes they will. If you find
yourself not even listening, and more concerned with sunk costs, work already
done, and expertise already gained, that's not healthy and not a convincing
riposte.

~~~
ZenoArrow
"that involves listening to 25-year-olds coming up with something better"

That doesn't involve listening to them, it involves them putting in the work
to develop something as a proof of concept, and that proof of concept evolving
through feedback and collaboration into something people want to use.

To put it more crassly, people don't stop using something because others say
it's shit; they stop using something when there's a better option on the
table, and that better option needs to be more than just architecturally
improved: it needs to be genuinely useful. This is what people who twist
"worse is better" fail to understand... legacy code sticks around when it's
genuinely useful.

And to address the original author... yes, the idea with writing software is
to get the job done, but if you want to work on tooling to improve that
process be my guest.

~~~
grey-area
 _That doesn't involve listening to them, it involves them putting in the work
to develop something as a proof of concept, and that proof of concept evolving
through feedback and collaboration into something people want to use._

Well I think it's always useful to listen - even if you disagree or think your
interlocutor mistaken. Of course as you say ideas are worth a lot less than
implementation, but the one point I fully agree with the original author on is
this:

CSS is not best of breed, existing for 20 years does not make it good in any
sense, and it is a terrible example of worse is better in the original sense.
However arguing over "worse is better" is just going to end in arguing about
what that vague phrase means, so I won't enter that particular rabbit-hole - I
agree the original author misunderstood or has not encountered the original
meaning.

In the case of CSS, what holds back adoption of alternatives is almost
entirely browser-vendor inertia and the institutional barriers to producing a
better solution, not some technical superiority of CSS, so what I object to in
the parent comment is the implication that CSS won because it is technically
superior to other layout methods and is complex because it deals with lots of
complex domain problems which a 25-year-old couldn't possibly fathom. It
introduces needless complexity, doesn't even properly address the domain
problems (design, layout, grids etc), was badly designed from the start and
has become even more complex with age, and I'd argue it has succeeded mostly
by riding on the coat tails of HTML.

------
slashnull
I'm just gonna go full Poindexter and say that Richard Gabriel did try to
distance himself from worse is better, at first, but then he veered wildly in
the following years between the two positions.

And the fundamental idea of Worse is Better is far older than UNIX,
even in the field of computer science; John von Neumann himself said that the
Von Neumann machine was a temporary workaround and that a better architecture
would quickly replace it.

I live in a near constant fight in the rebel factions of Better and so far, I
have to reckon we get constantly massacred by the Empire of Worse.

------
orangeduck
I'm a big advocate of "worse is better" - in fact I put it in my bio even at
the risk of scaring off potential colleagues or employers. The real meaning of
"worse is better" is of course subjective and complicated. And as has been
pointed out in these comments, "worse is better" is not about creating "worse"
software and then marketing it as "better". It is something much more
philosophical.

In my eyes "worse is better" is about the mindset of approaching a task. It is
about diving right in and learning through production - without being
paralysed by the idea of introducing hacks or ugly design. It is the idea
that, for the moment, there isn't a need to be worried about covering every
edge case or possible failure option. It persuades you to focus on something
simple and easy to explain, with a single purpose or intent. It is better to
produce something (anything) and see where it takes you.

It also says how important it is to embrace contribution and collaboration.
How important it is to, after some threshold, release yourself from feelings
of ownership.

But if I had to nail down exactly why I believe "worse" is so successful
("better") it is because those that create "worse" software don't focus on the
software - they focus on the idea. From that the software is painfully drawn.
The software might suck, but I believe ideas are better. They are more
persistent, easily explored, dynamic, and shareable than software. Ideas that
are good, simple, and easily taught are far more important than well designed
software. That is why they survive.

There are lots of programmers and hackers who don't believe in "worse is
better". Sometimes you seen them on HN with a fantastic new programming
language (or something) they have designed and built in isolation - perfect in
every aspect (at least to them). Nothing quite hurts like their confusion when
interest dwindles and their software is forgotten. All they had seen on HN
were "worse" links every day, and after years they had provided "better" \- to
them it is criminal that it hasn't been picked and gained momentum.

Worse is better is not going away, and I think you can either engage yourself
in it as a philosophy, or struggle.

~~~
empthought
"Sometimes you seen them on HN with a fantastic new programming language (or
something) they have designed and built in isolation - perfect in every aspect
(at least to them). Nothing quite hurts like their confusion when interest
dwindles and their software is forgotten."

I suspect you've described Rich Hickey to a tee here. /s

------
dbenhur
The author's Closing Remarks give rather short shrift to the nuanced thinking
Dick Gabriel has contributed to this idea. Gabriel's "Worse is Better" page
[https://www.dreamsongs.com/WorseIsBetter.html](https://www.dreamsongs.com/WorseIsBetter.html)
concludes with:

"You might think that by the year 2000 I would have settled what I think of
worse is better - after over a decade of thinking and speaking about it,
through periods of clarity and periods of muck, and through periods of multi-
mindedness on the issues. But, at OOPSLA 2000, I was scheduled to be on a
panel entitled "Back to the Future: Is Worse (Still) Better?" And in
preparation for this panel, the organizer, Martine Devos, asked me to write a
position paper, which I did, called "Back to the Future: Is Worse (Still)
Better?" In this short paper, I came out against worse is better. But a month
or so later, I wrote a second one, called "Back to the Future: Worse (Still)
is Better!" which was in favor of it. I still can’t decide. Martine combined
the two papers into the single position paper for the panel, and during the
panel itself, run as a fishbowl, participants routinely shifted from the pro-
worse-is-better side of the table to the anti-side. I sat in the audience,
having lost my voice giving my Mob Software talk that morning, during which I
said, "risk-taking and a willingness to open one’s eyes to new possibilities
and a rejection of worse-is-better make an environment where excellence is
possible. Xenia invites the duende, which is battled daily because there is
the possibility of failure in an aesthetic rather than merely a technical
sense."

Decide for yourselves."

------
neito
Honestly, I think it's deeper than this. Even today, we have people who
intentionally seek out ugly, imperfect designs because they think they work
better. We have a cultural idea (at least, I can speak for the United States,
hopefully some others will chime in) that "real" work is ugly, gross, and
messy. Heck, I even had a friend who chided people who played a particular
version of CounterStrike because it was "Too polished". We're measuring in one
dimension something that's two-dimensional: Work vs. Not work and Beauty vs.
Ugliness. We steep in this cultural broth of the idea that "real work is
ugly", and then wonder why the tools we decide to do "work" in are "ugly".

------
dools
For the record I think that writing software commercially should only consider
the bottom line. What you do in your own time is your business; that's why
open source is good for innovation through revolution. Ethics for legal and
medical professions is a broken analogy because all of their ethical
considerations focus on outcomes for clients/patients, not the details of how
their services are executed.

------
jhallenworld
"Perfect is the enemy of better"

------
cwyers
"This 'Worse is Better' notion that only incremental change is possible,
desirable, or even on the table for discussion is not only impractical, it
makes no sense."

It makes all kinds of sense. Consider for a second the sort-of parable of
Chesterton's fence. As G.K. Chesterton wrote:

"In the matter of reforming things, as distinct from deforming them, there is
one plain and simple principle; a principle which will probably be called a
paradox. There exists in such a case a certain institution or law; let us say,
for the sake of simplicity, a fence or gate erected across a road. The more
modern type of reformer goes gaily up to it and says, 'I don’t see the use of
this; let us clear it away.' To which the more intelligent type of reformer
will do well to answer: 'If you don’t see the use of it, I certainly won’t let
you clear it away. Go away and think. Then, when you can come back and tell me
that you do see the use of it, I may allow you to destroy it.'"

It's basically that whole Spolsky thing about why you shouldn't rewrite code
from scratch[1]:

"Back to that two page function. Yes, I know, it's just a simple function to
display a window, but it has grown little hairs and stuff on it and nobody
knows why. Well, I'll tell you why: those are bug fixes. One of them fixes
that bug that Nancy had when she tried to install the thing on a computer that
didn't have Internet Explorer. Another one fixes that bug that occurs in low
memory conditions. Another one fixes that bug that occurred when the file is
on a floppy disk and the user yanks out the disk in the middle. That
LoadLibrary call is ugly but it makes the code work on old versions of Windows
95.

Each of these bugs took weeks of real-world usage before they were found. The
programmer might have spent a couple of days reproducing the bug in the lab
and fixing it. If it's like a lot of bugs, the fix might be one line of code,
or it might even be a couple of characters, but a lot of work and time went
into those two characters.

When you throw away code and start from scratch, you are throwing away all
that knowledge. All those collected bug fixes. Years of programming work."

Broadly speaking: you, for ANY definition of you, are unable to design a
perfect system at the first go. You will not account for all use cases, there
will be edge cases you don't consider, so on and so forth. But it goes further
than that: if you want to design something better than C++, you need to
understand why so many people use C++. If you want to replace CSS, you need to
understand why CSS is popular. If your thinking on the matter hasn't evolved
much past "CSS is unnecessary" and blaming a single catchy essay for every
decision you disagree with, then instead of writing essays about how everyone
else is doing it wrong, maybe you should spend more time trying to understand
what everyone is trying to do, why they're trying to do it and what resources
they have to do it with. And THEN maybe I'll let you tear that fence down.

1)
[http://www.joelonsoftware.com/articles/fog0000000069.html](http://www.joelonsoftware.com/articles/fog0000000069.html)

~~~
empthought
"If you want to design something better than C++, you need to understand why
so many people use C++."

No, you don't. You just need to design something better than C++.

"If you want to replace CSS, you need to understand why CSS is popular."

No, you don't. You just need to replace CSS.

Now, having people _adopt_ better-than-C++ and better-than-CSS might take a
little bit of the psychoanalysis (marketing?) you propose. But understanding
_why_ someone uses C++ is not really relevant at all in order to design a
superior replacement.

I assure you, the designers of Java did not really care _why_ people used
COBOL for so much banking code, and yet here we are, with Java having
supplanted COBOL for much of that work.

Every time someone trots out Chesterton's fence, I trump it with Sturgeon's
law.

Edit: heck, an even better example is JavaScript app frameworks. Do you think
their designers understand _why_ people programmed in Smalltalk or Swing or
VB.NET or Delphi? No. Judging from their approach and idiom, obviously not.
And yet here we are.

~~~
Nacraile
""If you want to design something better than C++, you need to understand why
so many people use C++." No, you don't. You just need to design something
better than C++."

And how do you define "better than C++"? C++ didn't get to be as popular as it
is for nothing. It got to be as popular as it is because it is a very good
tool for solving certain classes of problems. If you don't understand why
people still choose C++ for new projects, you're going to have a tough time
building something that will change those decisions. And if you think people
only choose C++ for new projects because they're idiots, frankly, you're the
idiot.

"Everybody is an idiot" is a much less likely hypothesis than "I'm missing
something". Before you throw out a popular tool whose appeal you don't
understand, you would be well advised to spend some time comprehensively
falsifying "I'm missing something". Hence, "If you want to replace X, you need
to understand why X is popular."

~~~
empthought
It wasn't my example. But there is no a priori reason why "something better
than C++" must be designed by someone with a deep understanding of C++. This
holds for anything you might put in place of C++:

- must someone who designs a car understand why people ride horses?

- must someone who designs a phone understand why people use smoke signals?

- must someone who designs a gun understand why people use swords?

------
dkarapetyan
Isn't Twitter also an example of worse is better? Why send the people willing
to engage with your argument to Twitter, then?

------
dreamfactory2
I always associated worse is better with VHS vs Betamax rather than
incrementalism, i.e. utility realised > potential utility.

------
slashnull
But seriously, it takes some nerve to write a blogpost _against_ Worse is
Better that's in every respect worse than the paper that originally codified
Worse is Better.

Oh wait, no, is this lack of quality justifiable because this content-free
2-minute read is more efficient and easier to write than a serious exploration
of the consequences of whatever the author was ranting about?

~~~
account_306
I'll probably get shadow banned for agreeing with this, but who cares anyway.

The article ignores LEAKY ABSTRACTION. That's all there is to say. Now let's
have some marketers tell us why that's a good thing and we can get back on
this irony loop, this irony loop, this irony loop.

------
michaelochurch
"Worse Is Better" has evolved into a monster.

Originally, it looked like this:

* MIT philosophy: never compromise on correctness, even in bizarre corner cases. Aim for conceptual beauty _to a programmer_.

* New Jersey (Bell Labs) philosophy: compromise on correctness for simplicity or performance.

In 1985, six years before "Worse Is Better" was originally written, the "New
Jersey" attitude was probably more useful. Most people, if they wanted to
write acceptably performant software, had to do it in assembly. C was less of
a leap, conceptually and in terms of average-case performance, than Lisp was.
People who'd been writing assembly programs for years could learn how to write
performant C. Writing _performant_ Lisp would be much harder. A contemporary
Common Lisp executable is at least 40 MB (obviously, that wasn't the case in
the 1980s); at that time, 1 MB was a powerful machine. "Worse is better"
_worked_ in the 1970s and '80s. If every piece of computing work had to be
perfect before it could be shipped, we'd be far behind where we are.

Also, quite a number of the original Unix programs were for natural-language
processing (at a level that'd be primitive today) and paper formatting. With
the resources of the time, it would've been impossible to get much of that
stuff perfectly right anyway.

Bell Labs wasn't full of the anti-intellectual idiots who invoke worse-is-
better and lean-startup tripe today. They knew what they were doing. They knew
the compromises they were working under. They bet on Unix and C rather than
Lisp machines, and they were right. In 2014, thanks to our ability to stand on
the shoulders of giants that were built using C, we have machines that can
efficiently run code in pretty much any language, so the C programmers _and_
the Lispers have won. At least, on that front.

However, the "worse is better" lesson doesn't apply nearly as well in 2014. We
can do about 500,000 times as much computation per dollar as we could in 1991.
That's 500,000 times (at least!) as many opportunities for things to go wrong.
A bug that happens once per 100 billion operations used to be negligible and
now it's often not.

Unfortunately, we have an industry beset by mediocrity, in which commodity
developers are managed by commodity business executives to work on boring
problems, and low software quality is simply tolerated as something we'll
always have to deal with. Instead of the knowing compromise of Bell Labs,
"worse is better" has evolved into the slipshod fatalism of business people
who just assume that software will be buggy, ugly, hard to use, and usually
discarded after about 5 years. Yet we're now in a time where, for most
problems, we _can_ affordably do them correctly and, because things happen so
much faster now, we often put ourselves and our businesses at serious risk if
we don't.

~~~
qznc
It is about economics. The just-good-enough wins against excellence because
it is cheaper. When excellence finally pays off, the good-enough competition
has already been profitable for years and has solved five more problems.

What was good enough in the 80s is not good enough in 2014. Our views about
good enough have changed. For example, security requirements are much higher today.

The question is not whether Better or Worse is better. The question is: what
is good enough?

~~~
dustingetz
> The just-good-enough wins against excellence, because it is cheaper.

It's not clear to me that this is the case anymore. The whole startup thesis is
that a small, excellent team can outperform vastly larger teams of average
people. This is the opposite of how it was in the 80s.

------
Fuzzwah
The font used on this page is difficult to read. I'm running current Chrome on
Win7.

[http://i.imgur.com/QB8RJwl.png](http://i.imgur.com/QB8RJwl.png)

note: I checked that screenshot on another machine and it didn't seem so
bad.... so maybe this is also something strange with the monitor / res I'm
using... still thought I'd mention it.

~~~
rkuykendall-com
The same website on Chrome, OS X, Retina:

[http://i.imgur.com/KdvPoEB.png](http://i.imgur.com/KdvPoEB.png)

~~~
Fuzzwah
Thanks for taking the time to take a screenie and share it. Looks like I need
to investigate why some sites' fonts look so bad on this machine.

~~~
scholia
Firefox, Windows 7 - I don't like the text either
[http://i.imgur.com/s3V1dwf.jpg](http://i.imgur.com/s3V1dwf.jpg)

------
zubairismail
I agree with the points 100%.

