
The bomb in the garden - alipang
http://unitscale.com/mb/bomb-in-the-garden/
======
bct
The consistency of PDF rendering has nothing to do with it being an ISO
standard. It's based on a completely different model that made different
tradeoffs with respect to end-user flexibility. It was also controlled by a
single company for 15 years.

TimBL and the W3C _were_ pushing for new payment models 15 years ago (and
probably earlier). This is why HTTP has 402 Payment Required. There was a W3C
micropayments working group back in 1998:
[http://www.w3.org/ECommerce/Micropayments/](http://www.w3.org/ECommerce/Micropayments/).
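For what it's worth, 402 is still sitting in the spec, reserved "for future use." A toy sketch of the paywall logic it was presumably reserved for (the paths, tokens, and function name here are invented for illustration, not any real protocol):

```python
# Hypothetical paid resources and the tokens that unlock them.
PAID_PATHS = {"/essay": "demo-token"}

def status_for(path, token):
    """Return the HTTP status code a micropayment-aware server might send."""
    if path in PAID_PATHS and token != PAID_PATHS[path]:
        return 402  # Payment Required: defined in HTTP/1.1, still unused
    return 200      # paid up, or not paywalled at all

print(status_for("/essay", None))          # 402: no payment presented
print(status_for("/essay", "demo-token"))  # 200: "payment" accepted
```

The code that was standardized is trivial; what never materialized was the payment infrastructure behind it, which is the point of the comment above.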

This stuff would be hard to get right in a vacuum. It's even harder when
you're trying to get groups with competing interests to agree on something,
especially when you have as little leverage as standards organizations tend to
have. The view put forward in the last half of this essay is naive.

~~~
minikites
Xanadu had it even earlier:
[http://en.wikipedia.org/wiki/Project_Xanadu](http://en.wikipedia.org/wiki/Project_Xanadu)

> Every document can contain a royalty mechanism at any desired degree of
> granularity to ensure payment on any portion accessed, including virtual
> copies ("transclusions") of all or part of the document.

------
kolektiv
It's a great talk, and I wish I'd seen it. I found myself nodding quite a lot,
despite not being a designer (though I used to play one before I got into
tech/code). And in many ways he's right.

But I did find myself questioning the premise towards the end. Why do we want
to save "the web" in the form it was? Apps, etc. are different. Certainly they
raise the barriers between the average user and creation in the way that the
web traditionally didn't.

But to do well on the web these days - is that really as easy as it was
anyway? The standards are more complex. If you want to stand out in a real
way, that's probably going to take something like an app's worth of effort.

So do we want to encourage the web - a small and quite specific (and, perhaps,
time-bound) layer? Or do we want to be focusing on the internet? Because
that's doing great. I hear it's huge. And all of these apps - well, they use
the internet, just maybe not the web.

I don't know what the next 20 years will look like (I do remember what things
looked like from about '96 onwards, when I first started dabbling) but I think
we should stand a little further back than trying to maintain something
because "it was great". Maybe it was, but maybe it's not the right answer for
the future.

EDIT: (To reply to some of the comments/downvotes)

I wasn't suggesting a world of Apps. I was suggesting a world of "other"
approaches. The internet is also open and standards based. There are many
standards which don't sit on top of HTTP. What I meant to point out is that
the internet is not the web - and perhaps a new world of standards between
connected devices, physical things, etc. will be a much richer place than a
document-centric world. It's merely an idea, not a statement against openness.

~~~
GeneralMayhem
The big thing that the Web provides that apps don't is the ability to move
smoothly from one page to another. Apps live in their own silos most of the
time, and provide an extra barrier to use in that you have to specifically
download and install that one app in order to run it.

You would need something like a browser that masks the download process like
it does for Web pages. Then you're left with Web pages that spawn independent,
standalone programs. Flash, anyone?

Of course it's possible (and likely) that something completely different will
be invented, but I don't think individual apps for everything that's currently
a site is the answer.

~~~
7952
I am not sure why the division between apps and the web is even a useful one.
What is the difference between a tab in my browser and an app on my taskbar?
The app was likely downloaded from the web and sends/receives data from the
web. The difference is how it caches data, which has always had a huge impact
on user experience.

The most important thing is to find better ways of synchronising data between
a user's machines and the web. It should be possible to download a complete
document as a file, use a web cache to render it, and then submit edits to the
cloud as a diff. The user shouldn't need to worry if this is happening locally
or not.
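That pull-edit-push-a-diff loop can be sketched in a few lines (the in-memory `cloud` store and the function names are invented for illustration, not a real sync protocol):

```python
import difflib

# Stand-in for the remote store; contents are made up for the example.
cloud = {"doc.txt": "line one\nline two\nline three\n"}

def pull(name):
    """Download the complete document as a local copy."""
    return cloud[name]

def push_diff(name, original, edited):
    """Submit only the user's edits, as a unified diff, not the whole file."""
    patch = list(difflib.unified_diff(
        original.splitlines(keepends=True),
        edited.splitlines(keepends=True),
        fromfile=name, tofile=name,
    ))
    cloud[name] = edited  # a real server would apply the patch instead
    return patch

local = pull("doc.txt")
edited = local.replace("line two", "line 2")
patch = push_diff("doc.txt", local, edited)
print("".join(patch))
```

The diff is tiny compared to the document, which is exactly why the user shouldn't have to care whether the edit happened locally or in the cloud.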

------
ringmaster
He makes a good point in that design generally sucks on the web, though I
disagree with practically everything else in this article, which seems to lay
all of the blame for bad design on poor web standards and the need for
revenue-earning advertising. Design needs to support the content - be it text,
photo galleries, or advertising - not vice-versa. The complaint is valid, but
the targeted cause is completely incorrect. Instead, try taking aim at factory
CMSes that cram all content into a poorly-designed blog-like form factor and a
lack of imagination, knowledge, and skill on designers' part for integrating a
good design with those tools.

And... Should web designers learn to code? Absurd question. If you don't code,
you're merely a hack graphic designer, not a web designer. You don't need to
code all your designs, but you need to know code well enough to know that your
design can be implemented; therefore, you need to know how to code.

~~~
jfb
Disagree. Why are those CMSes so god-awful? What are the incentives they're
designed to achieve? Badness in tooling is usually a _symptom_, not a cause.

~~~
ringmaster
It's a byproduct of a one-size-fits-all-content design for a CMS that all of
their output looks like the same drivel. In contrast, if a designer had the
opportunity to hand-design each content page distinctly, such a site would
look markedly better.

CMSes are not built to produce beautiful design, but to expedite content
publication into a designed-once template, so it's no shock that their output
typically exemplifies bottom-barrel design.

~~~
jerf
"It's a byproduct of a one-size-fits-all-content design for a CMS that all of
their output looks like the same drivel."

I think this is a really important point, and want to further expand on it.
The New York Times was cited as having a bad website. However, on occasion,
when they put their mind to it, they produce a truly beautiful page with some
fantastic visualization or something. But it's a one-off; you can't produce a
beautiful page for every story, so you get the lowest-common-denominator page
for most stories.

A literal paper magazine is a concrete artifact, and a set of designers can be
tasked with making each page perfect. A web page on these sites is dynamic and
may literally not serve the same page twice. If, as a designer, you don't
internalize the idea that these are two _extraordinarily_ different domains,
despite the superficial visual similarities of the final product, then you're
going to have a hard time accurately diagnosing the core problem. And I do
believe many designers make this error on one level or another, when I see
these complaints.

In Matthew Butterick's defense, I do think he implicitly made one very
concrete suggestion for how those sites could improve, at least in terms
of design: get rid of the ads. It's hard, probably to the point of impossible,
to create a coherent design for a website when you are selling to the highest
bidder the right to put a bright yellow and orange banner with a poorly-
photographed item described in a random non-anti-aliased font unaligned with
anything else on your page in the viewer's highest-priority visual space, and
a different such thing for _each page load_. Against an uphill battle like
that, is it really any wonder that the web designers of the world have
collectively given up?

------
ISL
The author does a great job of identifying concerns, but the final thesis,
that a new standard is the answer and the old standards are fragmented, has
been repeated throughout history.

A successful new standard must be so good that all the fragmented platforms
scramble to implement it exactly.

TeX is _the_ underlying standard in typography for the physical sciences and
mathematics (and I hope CS :) ). It's an immovable standard, moving from
version 3.0 in 1990 to version 3.1415926 in 2008 exclusively through
bugfixes.

A web standard the likes of TeX could yield the results the author is looking
for. Alas (fortunately?), TeX is rarely used alone. Libraries and macros that
build upon it suffer variable stability, consistency, and clarity.

To achieve the author's aims, what's needed isn't a call for a new web
standard, it's a web standard so good, complete, and obviously stable that
everyone can't help but use it.

------
aidenn0
Hell yes you need to test PDFs in different readers on different devices.
Otherwise you end up with really odd behaviors in them, including things like
text that is impossibly small to read when the illustrations are small enough
to fit on the screen.

------
retrogradeorbit
Although I agree with a lot of it, I also think he's off the mark on a lot of
it. It's a long article, so I'll just pick a few.

"And if you worry, as some do, that the alternative to no W3C is chaos, or if
you worry that the alternative is a web ruled by Google and Microsoft and
Apple—I don’t think so."

Did he completely miss the browser wars? Or the MPEG-LA controversies? This is
not just a fear, this is exactly what will happen.

"I think the alternative is a web that’s organized entirely as a set of open-
source software projects. And we didn't have that choice 15–20 years ago, at
the beginning of the web. Because open source hadn't quite arrived as a way of
doing things."

Open Source arrived well before the web. Maybe it took the web to arrive for
him to notice it. But so much amazing work pre-dates the web. As for open-
source creating standards... well, that's some kind of joke. Compare the look
of Linux desktop versus OSX. In Linux, applications can look radically
different, all on the same platform. Your desktop ends up looking like a
hodgepodge of different designs. Each app looks and operates completely
differently.

"Linux, Apache, Perl, Python, WordPress—all of these technologies have
succeeded without a W3C-style supervising authority."

This is just not true. Linux, Apache, Perl and Python all have central
authorities. And have benefited from combining this with a more randomised set
of contributions. I'm not sure about WordPress.

------
msandford
Is there anything we can practically do to support this guy? I think he gets a
lot of stuff right and I want to show my support. But I have no idea how to do
that other than posting a comment on HN. Probably not what he had intended.

~~~
mbutterick
Hi, this is Matthew Butterick. I wrote "The Bomb in the Garden."

Yes, there is something you can do to support me — well, forget me, I'm not
important. But the question I'm interested in is important. So if more people
started thinking about the question, started having conversations about the
question, tried to answer the question (on HN or elsewhere), that would be
great.

The question is this:

Based on its 20-year track record, is the W3C equipped to keep the web
competitive over the coming 5, 10, and 20 years? (And if not, what should
replace the W3C?)

The issue of "20-year track record" is important. The W3C has been around long
enough that it can be — it must be — evaluated by its performance, not by its
promises.

Furthermore, the notion of "competitive" is increasingly important, as
alternative media platforms make inroads against the web (as I discuss in the
article).

Bottom line, I want the web to win. And it's not winning. It's falling behind.
But we need to be willing to ask the hard questions about how it got here. We
need to hold the W3C's feet to the fire. We need to agitate for the web we
want. Because in the end, it's not the W3C's web, it's not Google's web, it's
OUR web. And if we don't like the web we end up with, we'll have no one to
blame but ourselves.

------
badman_ting
I didn't want him to be right, but he is brutally, painfully, utterly right.
All those examples, my god.

------
BenoitEssiambre
God I hate large grey text. Why are you trying to defeat the contrast ratio of
my monitor? I paid good money to get it! If you want to enhance my reading
experience on longer articles, give me the option of a lower _background_
brightness (a gray background) so that I am not staring at a bright light for
long periods of time. The text should stay black.

Also why is the text column to the right instead of in the center? There is
nothing on the left using that space! This is a lost opportunity for pleasing
symmetry.

------
mattmanser
Don't be at all put off by the opening statement.

Most definitely worth reading even if you end up disagreeing with some of it.

~~~
pstuart
Is there a tl;dr?

He says we need to move away from an advertising model but not what will
replace it. Who's going to pay for all of this?

------
jvdh
Google cached version:
[http://webcache.googleusercontent.com/search?q=cache:uTRn2_o...](http://webcache.googleusercontent.com/search?q=cache:uTRn2_oncF4J:unitscale.com/mb/bomb-in-the-garden/+http://unitscale.com/mb/bomb-in-the-garden/&cd=1&hl=en&ct=clnk&lr=lang_nl%7Clang_en)

~~~
mumbi
Text:
[http://webcache.googleusercontent.com/search?q=cache:uTRn2_o...](http://webcache.googleusercontent.com/search?q=cache:uTRn2_oncF4J:unitscale.com/mb/bomb-in-the-garden/+http://unitscale.com/mb/bomb-in-the-garden/&lr=lang_nl%7Clang_en&hl=en&tbs=lr:lang_1nl%7Clang_1en&strip=1)

------
richardwhiuk
Oh dear. This feels like the hobby horse of someone who's just realised that
browsers render markup differently and decided to arbitrarily blame the W3C.

This article starts out in the right place - which is a good thing. There are
a large number of websites shown which aren't the best designed in the world.

Unfortunately, it then goes dramatically off track.

"Because we’re going to have to do it cheaply, with the advertising pushing
costs down. This was supposed to be one of the virtues, I think, of web
standards."

No, the point of web standards was never to make web design cheap because
there is less money to pay for it. That's never been the objective. The point
of web standards is to make it possible to write a single article which can
be viewed in a similar way on a number of different devices.

"And the misery exists because of the W3C—the World Wide Web Consortium.
That’s the organization that supervises web standards." - No, that's
incorrect. Web standards aren't poorly enforced because of the W3C; they are
poorly enforced because browser makers extend standards in different ways, and
because some browser makers underinvested in maintaining compatibility with
the rest of the web (e.g. IE between 1999 and 2008).

"... And way too lenient in enforcing them." - The W3C has NO power to make
any browser manufacturer enforce web standards. None. Zilch. Nothing. They can
say whether an implementation is standards compliant, but historically that's
had no effect on either the adoption rate of the browser, or on whether the
browser manufacturer will do anything.

"No. Because PDF is an ISO standard." - No, that's not why. Being a standard
doesn't magically make everyone suddenly implement it exactly. PDFs work
across platforms because there's a single reference implementation (Adobe
Reader) which all other readers must match. Also, the PDF standard (as far as
the functionality being implemented goes) has been essentially unchanged for
the past 15 years. A PDF written today will almost certainly open in a PDF
reader written 10 years ago. There's nowhere near the same pace of innovation
- a browser of 15 years ago is unlikely to be able to perform AJAX, let alone
<video>, <audio> or WebGL.

"I love that, Tim. Did you say that 15 years ago? No. Well, did you say it
five years ago? No. When did you say it? You said it three months ago?" -
Aside from the fact that it's way easier to pay for things on the web than at
pretty much any point in recent history (compare completing an arbitrary
online transaction, which might take on the order of two minutes, without a
preset price, and which may recur, to anything older than about five years),
Tim Berners-Lee is hardly the only person allowed to innovate on the web.
There are a dozen companies pioneering online payment protocols. Mozilla is
currently looking at WebPayment for Boot to Gecko:
[https://wiki.mozilla.org/WebAPI/WebPayment](https://wiki.mozilla.org/WebAPI/WebPayment)

"And if you worry, as some do, that the alternative to no W3C is chaos, or if
you worry that the alternative is a web ruled by Google and Microsoft and
Apple—I don’t think so." - Yes, it would be. Please do yourself a favour and
look up the WHATWG, which took HTML5 away from the W3C due to disagreements
over process. It was pretty much a group of Mozilla, Google, Microsoft, Apple
and Opera deciding what happened. Disband the W3C (through some mighty act of
God) and that's what they'll go back to. To be fair, it's still those five,
just under some other grouping.

"The W3C could refuse to renew the membership of organizations that took
actions contrary to the spirit of web standards, including repeatedly failing
to implement them. If the W3C made participation in the consortium contingent
on timely implementation, members would either comply or quit." - You don't
seem to get that the W3C needs organizations such as Microsoft, Apple, etc. a
hell of a lot more than they need it. If the W3C rejected them because they
happen to have different implementations of gradients, they would re-form the
WHATWG, and the W3C would have $n less to promote web standards. The W3C does
what the member organizations vote that it should do, or they leave. It's
literally that simple. Hence the situation with DRM and Do Not Track.

------
ableal
_" Solving problems is the lowest form of design."_

Well, that may be right or wrong, but it is thought-provoking. In a Karl
Kraus, _Half-Truths and One-and-a-Half Truths_, sort of way.

