
Why did Alan Kay say the Internet was done well, but the Web was by amateurs? - yitchelle
http://programmers.stackexchange.com/questions/191738/why-did-alan-kay-say-the-internet-was-so-well-done-but-the-web-was-by-amateur
======
liotier
The whole reason the HTML world took off while the existing technologies
lingered is that amateurs could grab whatever they could and run with it,
incompetence be damned! Innovation often happens when amateurs find
accessible technology and bend it in strange ways that make professionals
cringe. Professionals are then needed for security, scaling, cost reduction
and all the things that require maturity, but left to themselves, instead of
innovating, they would have refined the state of the art to new and still
stagnating levels of perfection. Hurray for amateur enthusiasts who do
interesting new things because no one told them it was the wrong way to do it!

~~~
qw
I agree. Many underestimate how much the lack of strictness helped early
users. You could start with plain text and add a few <br> or maybe a <p>
(without end tags) to create paragraphs.
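
For example, a page as loose as this (a made-up sample, but representative of
the era) rendered fine in the early browsers:

    <title>My fish page</title>
    Some text about my fish.
    <p>
    A new paragraph, no closing tag needed.<br>
    And a manual line break.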

Anyone with minimal knowledge could start sharing content. If they had to pass
any kind of validation, many would not have bothered. Yes, it was a mess, but
it helped us get enough information online to reach critical mass. I would
rather have a web with messy HTML than FTP servers filled with Word documents
or a proprietary AOL / MSN.

~~~
KMag
The problem of broken markup would have been better solved on the server side,
rather than the client side.

As a general principle, pushing denormalized data as far toward the edges of
the system as possible minimizes the complexity of the system as a whole.
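
For instance, the publishing step could have normalized sloppy author markup
once, before serving it, rather than every client guessing at it. A one-line
sketch using the real HTML Tidy tool (flags from memory, so treat this as
approximate):

    tidy -q -asxhtml broken.html > clean.html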

------
trotsky
[http://1997.webhistory.org/www.lists/www-
talk.1993q1/0182.ht...](http://1997.webhistory.org/www.lists/www-
talk.1993q1/0182.html)

[http://1997.webhistory.org/www.lists/www-
talk.1993q1/0197.ht...](http://1997.webhistory.org/www.lists/www-
talk.1993q1/0197.html)

Andreessen: I'd like to propose adding this in this way.

Berners-Lee: I don't think that's a good idea.

Andreessen: Oh, I totally agree. By the way, we're doing it anyway.

~~~
rgbrenner
The entire thread, starting with your first link, is really great. Tim
Berners-Lee even suggests 2 alternatives for the IMG tag. Thanks for that.

~~~
podperson
The alternatives are not much better (overloading the A tag is more elegant
and would probably have eliminated the later mess of needing embed tags and
plugins for additional types of media).

Even now we have img, audio, video, etc. instead of, say, a single well-behaved
and extensible media tag that could also be used to include content from
another page.

~~~
drzaiusapelord
I wonder if this was a concession to the limited browser code and slow
networks at the time. With the img tag you know it's going to be a JPEG, GIF,
or PNG. If it was a generic media tag then you have no idea what it is.
Imagine downloading a 10MB QuickTime file or Flash object on dial-up just so
the MIME type can be read while you wait for the page to render. It's only
recently that anyone has bothered to handle video in the browser.

Could web servers back then just send the client the header of a file? I think
these guys were working with a lot of ugly limitations that we've only
recently overcome. Andreessen recently said that he expected the back and
forward buttons to be temporary, but no one thought of a good replacement for
them. We still haven't.
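
(As it happens, HTTP/1.0 did standardize a HEAD method for exactly this:
fetch only the headers, no body. A rough sketch of the exchange; the filename
and sizes here are invented:)

    HEAD /intro.mov HTTP/1.0

    HTTP/1.0 200 OK
    Content-Type: video/quicktime
    Content-Length: 10485760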

------
jiggy2011
The technologies that could be called "the internet" solve better-defined
problems than the technologies that could be called "the web".

TCP/IP, for example, is a description of an electronic locomotive network. It
ferries packets around. What is in the packets? Who knows? Who cares? What
about privacy, authentication, authenticity? Not our problem!

The web, on the other hand, is a mishmash of ad-hoc parts: HTTP, HTTPS, HTML,
WebSockets, the DOM, etc. Are we a client/server document protocol? Are we a
distributed runtime? Do we guarantee secure communication? Perhaps. Are we
stateless? Technically, but state tracking is what people use us for! We're
shipping a Turing-complete language around everywhere; is this an integral
part of the web, or just something we slapped on top for good measure?

~~~
czr80
It's interesting to note that choosing not to worry about those things
("privacy, authentication, authenticity") was very much a design choice;
perhaps the most consequential design choice ever?

~~~
DanBC
...and it's a shame that the WWW is good enough for people to attempt to
kludge something on top of it, and continually fail, while not being bad
enough for anyone to succeed in launching alternatives that fix those
problems.

See also SMTP, which has flooded the Internet with huge amounts of garbage
traffic but which is simple and works well enough that all alternatives fail.

~~~
Silhouette
_See also SMTP, which has flooded the Internet with huge amounts of garbage
traffic but which is simple and works well enough that all alternatives fail._

I agree with your main point, but I wanted to add that there are plenty of
alternatives to sending e-mails using SMTP these days. They mostly go by names
like "Facebook" or "Google+" and run over protocols like HTTPS talking to a
centralised service provider, and they solve the fundamental problem --
letting people send text and picture messages to friends/colleagues for
viewing at their own convenience -- in a different and mostly incompatible
way.

~~~
jiggy2011
Decentralisation is quite important for a basic business communication
process.

If businesses switched to Facebook as their primary method of communication,
then Facebook would become critical infrastructure for the economy.

If it suffered downtime there would be a massive economic cost, so "move
fast and break things" would no longer be a reasonable motto.

~~~
Silhouette
Of course anyone who needs reliability and privacy will question the use of
the specific public social networks I mentioned as examples, but the same
principle applies when businesses deploy centralised in-house systems. They
still handle messaging (and often a lot more besides) for their key functions
using dedicated tools, so general purpose e-mail is no longer necessary for
those functions. Increasingly these organisations also use VPNs to allow
employees and other associates who are not physically present to hook into the
same centralised systems remotely.

~~~
jiggy2011
Sure, but I think you would be hard pressed to find a company that didn't have
some form of email system for dealing with people outside the company, even if
internal communication is done with some other tool. It's a network effects
problem: you could build a better email system, but you can't get rid of email
completely until the replacement is as commonly used as email is.

Such a common tool needs to be highly resilient to catastrophic failure, in
the same way email is. For example, the DDoS on SendGrid affected SendGrid and
their customers, but the rest of us were unaffected.

The implications of a system that allowed groups like Anonymous to cripple
worldwide trade at will via DDoS would be quite scary.

~~~
Silhouette
I agree with this too. I'm not suggesting that e-mail is dead or anything,
just that a lot of use cases aren't so much being replaced by an alternative
to SMTP as being replaced by another communication channel entirely.

------
dasil003
The critique is a bit unfair, because the Internet had clear design goals and
criteria with which to apply rigorous engineering practice. The web, by
contrast, was essentially a document sharing and linking protocol plus a
document format. It looks bad mostly because it outgrew its design, but at
the same time, if it had been designed correctly for its eventual use _it
probably never would have taken off to begin with_. Granted, it has
fundamental warts as well, but I attribute those to the relative uncertainty
of designing a higher-level application that has at once more constraints and
more unanticipated usage.

I don't know, perhaps Kay is right, he being 1000 times the engineer I am, but
the web just seems such an amorphous thing compared to TCP/IP.

------
jbert
Standards bodies ossify over time.

In Europe in the 80s/early 90s, X.25 comms, the full seven-layer OSI stack,
and X.400 email were the "real thing": standardised by ISO, mandated by
government procurement, and implemented by telcos. This was "commercial grade"
networking and email, with TCP/IP and SMTP derided as "academic".

The IETF were the scrappy, pragmatic upstart. "We believe in rough consensus
and working code".

I've not followed many later IETF standardisation efforts, but I did take a
passing interest in CAP (Calendar Access Protocol) a few years back, since
there wasn't a good open source calendaring stack (not sure there is now,
either, tbh).

It seemed to me, as a distant observer/potential implementor, that CAP would
be very hard to get started on. It mandated the use of the otherwise
unused-at-the-time BEEP [<http://tools.ietf.org/html/rfc3080>] (for good
reasons; the same good reasons OSI mandated their full 7-layer stack...),
embedded unspecified parts of SQL92 (without a grammar!), and was altogether a
daunting prospect. It looked to me as though they were really saying "everyone
doing CAP has to have an SQL backing store with a particular schema, and you
pretty much have to pass these queries through to it".

So I see OSI->IETF->W3C as a progression over time. Not sure where to now...

[I'm sorry if I'm harsh to the people who put a lot of effort into CAP. I
didn't contribute, so I don't really have the right to criticise, but from a
potential implementor's point of view it seemed daunting. I'm also not
tracking any current IETF work, so I know I have a narrow view and am likely
wrong in many ways.

update: ...and I've just skimmed the final RFC, and they do have a grammar for
the SQL subset. I recall that it wasn't present in the drafts available at the
time.]

~~~
kabdib
A standard should reference working code, preferably more than one
implementation by different parties, and ideally with at least one
implementation open source.

This doesn't mean that a standard based on working code is automatically
superior, but it makes it less likely you'll get an absolute turkey.

------
acdha
I read this as Kay being unfamiliar enough with the lower-level protocols to
assume they're significantly cleaner than the higher-level web. The “designed
by professionals” era he's talking about still had major problems with
security (spoofing is _still_ too easy), reliability and performance, which is
why there's still new work being done tuning everything for high-speed or
high-packet-loss links. Go back just a little further and hostnames were
resolved by searching a text file which people had to distribute!

Both are complex heterogeneous systems with significant backwards
compatibility challenges any time you want to fix a wart. It's easy to spot
problems, hard to fix them, and as the array of failed competitors to either
shows, it's surprisingly hard to design something equivalent without going
through the same learning curve.

As a biologist might tell an intelligent design proponent, if you look at one
and see genius design you're not looking closely enough.

~~~
noblethrasher
Interestingly enough, Alan Kay, who has an undergraduate degree in the
subject, was inspired by biology (particularly the way that field discovers
and organizes knowledge) when he designed Smalltalk.

------
xradionut
Having been involved with computers since the 70's, the internet before the
WWW, and the web since, I've come to the conclusion that the Web succeeded
because it was barely good enough to work for amateurs, but no better. As
hypertext it sucks. As an application platform it sucks too. But for allowing
free exchange of information, it's been pretty damn good for such a crufty
piece of tech.

------
qompiler
HTML5 is being designed by "professionals" and I have never seen such a mess.
Functionality is being introduced that shouldn't even be part of HTML; just
look at the number of new input types.

~~~
Ygg2
At the risk of downvotes I must ask: why? The new input types are probably
added to bring forms more in line with the native inputs of the underlying
operating systems.

~~~
qompiler
HTML should be a markup language, CSS should be the styling language and
JavaScript should be used to handle matters such as input, processing and
output. The principle of separation of concerns is very simple and elegant. In
the long run it makes sure you don't get a mess of a framework in which things
can be done in 50 ways, of which 25 work only in browser X and the other 25
behave differently in each browser.

~~~
huskyr
You'd rather have the current situation of thousands of badly implemented
datepickers, and people loading the complete jQuery UI suite just for the
datepicker, instead of a simple <input type="date" />?
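
Native support can also be feature-detected, so a script fallback is only
needed where the browser lacks the control. A minimal sketch (the fallback
hook is left hypothetical):

    <input type="date" id="when">
    <script>
      // Ask for type="date" on a throwaway input; browsers without
      // support silently fall back to type="text".
      var probe = document.createElement('input');
      probe.setAttribute('type', 'date');
      if (probe.type !== 'date') {
        // No native picker: load a JS datepicker of your choice here.
      }
    </script>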

~~~
qompiler
It doesn't solve anything. When you move such functionality into HTML you will
just get "thousands" of browser-specific datepicker implementations, in each
version of said browser.

If there is a badly implemented datepicker written in JavaScript, you can use
a different library. A good JavaScript library will make sure it works
consistently and as expected in different browsers.

~~~
Shish2k
> When you move such functionality into HTML you will just get "thousands" of
> browser-specific datepicker implementations, in each version of said
> browser.

My phone has one phone-optimised date picker; my PC has one PC-optimised date
picker; my screen-reader has one screen-reader-optimised date picker. Sure,
there might be other versions for other platforms, but I as a user would only
interact with three, and those three are all consistent and optimal for their
situation.

Seems much better than our current variety of scripts, which all target the
desktop PC, fail at it, and don't even attempt to work on any other
platform...

> If there is a badly implemented datepicker written in JavaScript, you can
> use a different library.

Not if it's on somebody else's site I can't.

> A good JavaScript library will make sure it works consistently and as
> expected in different browsers.

A library written and deployed today will work flawlessly with the browsers of
tomorrow, including things like screen readers which the library author hasn't
even thought about?

Taking your logic further, why bother having HTML parsing, layout, or
rendering in the browser? In fact, if you object to the idea of semantic
markup and would rather manually specify every detail of every element for
every situation, why are you using HTML and not giving users a .exe? :P

------
fpp
The full quote from Alan Kay's interview with Dr Dobbs last year:

( [http://www.drdobbs.com/architecture-and-design/interview-
wit...](http://www.drdobbs.com/architecture-and-design/interview-with-alan-
kay/240003442) )

"...The Internet was done so well that most people think of it as a natural
resource like the Pacific Ocean, rather than something that was man-made. When
was the last time a technology with a scale like that was so error-free? The
Web, in comparison, is a joke. The Web was done by amateurs..."

------
fulafel
The top answer attributes the difference to the web pioneers being "not
standards people". Wtf? Professional standards committee attendees aren't
exactly known for technical excellence.

------
hp50g
I agree entirely with this. The world wide web lacks any "engineering" at all.
It just happened, and is, as a result, a bit of a mess.

Even standards such as HTML5 don't get anywhere near making something coherent
and consistent.

~~~
BerislavLopac
Actually, the underlying foundations of the WWW -- primarily HTTP and
hypertext -- were excellently designed from the beginning. The problem is in
fact that they were designed too well: it was too easy to build good-enough
content that was easily accessible to a lot of users, and the real harm was
done not by the amateurs, but by the "wrong" kind of professionals -- graphic
designers and developers from other corners of the field (databases, systems,
finance) who failed to understand the capabilities and specifics of this new
platform.

EDIT: Removed the assertion that the Web was implemented excellently from the
beginning; I understand that HTTP 1.0 was inefficient, and there were other
early issues. But the design was sound, and the implementation improved
quickly.

~~~
hp50g
I disagree with the first bit. HTTP is an ambiguous, overcomplicated mess. I
regularly have to wedge myself into the stack and deal with things like cookie
directives, pipelining, persistent connections, utterly broken caching
semantics, etc. It's just horrible from end to end.

Hypertext itself is a fundamentally well-engineered concept (I mean, it worked
fine for HyperCard etc.), but basing the public-facing WWW on SGML was just a
plain horrible idea.

For a number of years I actually preferred Gopher, WAIS and Usenet. I still do
now, when I think about it, for the sheer simplicity and the fact they're
designed to push indexes and information to you rather than to ooze marketoid
vomit.

Agree with the "wrong kind of professionals" statement though.

~~~
dasil003
I see where both of you are coming from, but I can't help but notice the
number of competing network protocols and formats that HTTP/HTML defeated in
the market. Did the web succeed _despite_ its technical failings or because of
them?

~~~
dredmorbius
Possibly a mix. The fact that the protocols were free and unencumbered, and
that reference implementations of both server and client software were
provided, helped immensely as well. Any idiot could come along and either pick
up working pieces or modify them. And they did.

Tim O'Reilly has had a few things to say about watching the Web take off. Its
primary competition was closed systems: either entirely closed networks, or
proprietary protocols, or both. He wasn't willing to bet his company on any
such thing. When the Web emerged, it was clear to him that it was the solution
he'd been looking for.

------
atirip
Well, in my book FTP is The Absolute Worst Protocol Ever Invented :-)

~~~
vidarh
It's high on the list, certainly... I might be inclined to hate more on the
SOAP designers these days, but FTP does very well on the "idiotic complexity
in few pages" metric.

------
robomartin
Here's an interesting and very relevant TED talk:

[http://www.ted.com/talks/danny_hillis_the_internet_could_cra...](http://www.ted.com/talks/danny_hillis_the_internet_could_crash_we_need_a_plan_b.html)

The Internet was not conceived to be what it has become. Having had a front
row seat to watch it go from academia to what it is today, one can only be in
awe of what -- I'm gonna say it -- private enterprise, capitalism and the
drive to succeed can produce. Yes, it is not perfect. What project at this
scale is?

~~~
cma
> private enterprise, capitalism

Shouldn't you also include public spending's role; e.g. CERN?

~~~
robomartin
The explosion of the Internet into every nook and cranny was 100% due to
massive private investment at all levels. It was a gold rush. Fortunes were
made and lost.

EDIT: Strange words replaced. Damn iPad auto-text!

~~~
cma
The Internet is different from the Web. It is off topic, but if you want to go
into that, the Internet as distinct from the Web had an even bigger public
kick-start.

~~~
robomartin
Of course the two are interdependent. And both were grown to what we have
today by private enterprise, not government. Knowing what I know, I'll venture
to say that the public sector's contribution to building either the web or the
Internet is just about microscopic when compared to private investment, in
both time and money.

Yes, there was a kick-start phase, but it was nothing compared to what came
later.

Furthermore, it was done without intent, purpose or even an idea of what it
could become. It was simply a communications network, for the military at
first and for a select few universities later.

In the TED video I linked, the presenter holds up a small book that listed
every single person using the Internet at the time.

What's interesting is that he registered the third Internet domain, so he was
there from the very start.

I was peripherally involved with a couple of interesting private efforts in
the early days of the gold rush. One of them was a group led by Phil Anschutz,
who owned the right-of-way along major railway routes. They trenched the heck
out of those routes to lay in massive amounts of dark fiber: a huge investment
in the future that I am sure paid off handsomely.

------
walshemj
The internet was done by amateurs; ISO was done by professionals :-)

Which is a good job, otherwise we would all be using X.400, X.500 et al., and
you subscribers would take what your PTT gave you and like it.

Of course, I would have a cool mail address -- I was root on the UK's ADMD at
one point ;-)

------
andrewbinstock
I did the interview with Alan. Let me give the larger context. The interview
was done shortly after Kay had shown a demonstration, made roughly 50 years
ago, of one of the very first GUI programs. He wondered then, and later in
this interview, why GUIs had essentially not evolved much in the intervening
period, despite the huge jumps in technology in other areas.

With me, he discussed how UIs were uninviting and mystifying to people who
don't know how to use them (children and seniors). In addition, he felt the
Web provided very little that would foster individuals' learning on their own
terms.

So while I suspect he'd agree with many of the comments posted here about the
Web, he was not primarily speaking about protocols and bits and bytes, but
about the end result: a presentation quality that only serves up what a TV can
do, rather than deeply engaging individuals.

------
pfraze
Alan Kay was talking about the programming model. There's no unit of isolation
below the document namespace, which leads to monolithic web apps and a lack of
user-configurability. The browser needs to run more than one program at a time
to be a serious OS.

I imagine he compared the browser pretty directly to Smalltalk.

------
Lerc
There seems to be rather a lot of consensus that the web is a mess, but it is
what we have.

Have there been any serious attempts at producing a better-designed solution,
one that does not simply add new things to the spec? An argument can be made
that if a better system existed it would not be able to compete because of
inertia, but surely it would be worth at least making the attempt.

I have scribbled notes and designs for years now, exploring ideas of what
could be done with a clean slate and the hindsight that was unavailable to
early browser designers. I can't be alone in this. Who else has explored the
possibilities of what browsers could be?

~~~
Ygg2
This was some time ago on HN <http://axrproject.org/>

~~~
Lerc
That does indeed look interesting. I shall take a look. I certainly hope there
are more projects like this out there. There are many approaches that might be
better or worse which might not be apparent until someone attempts it.

------
AnthonyMouse
It seems like people are talking about a bunch of stuff like the ability of
amateurs to create a web page easily and that sort of thing.

Maybe some of that stuff helped. But I don't think those were the big reasons.
Look at how the web works:

1) Web pages work on all relevant platforms. If someone creates a new browser,
most existing web pages will render on it, and failures are in the nature of
bug fixes rather than complete rewrites. Compare this with e.g. new operating
systems and C programs.

2) If you change the web page, the update is available immediately to
everyone. Compare this to the morass of outdated, incompatible and vulnerable
software on your typical Windows box.

3) All the code effectively runs in a sandbox. Users generally don't have to
worry about clicking on a link and having it send virus spam to everyone in
their contact list.

Those are some huge advantages; advantages that more than make up for the fact
that web development is a huge pile of kludges built against each other, one
that has escaped being discarded and replaced entirely only because of the
massive amount of mutual interdependence.

The primary competitor to the web was Java. But Java was hampered by horrible
mismanagement on the part of Sun and repeated malicious attempts to destroy it
on the part of Microsoft, so the web won. Which is probably unfortunate.
Imagine a web where Java (or Go) rather than JavaScript was the primary
language.

------
ronilan
He is right. But that's why the web is so wonderful - it is clearly man-made.

~~~
jeremyswank
Agree. Beyond that, I'd say that it is more an expression of culture than an
expression of technology. (Carefully noting that these cannot be easily
separated, anyway.) "The Web" as currently practiced is more of a craft (and
at best, an art) than it is a technical exercise.

Certainly, it is inelegant and even ugly in places, much like the tool we
humans (maybe should) value most: natural language. Natural language is
fraught with inconsistencies and wtf's; we have kind of made it up as we've
gone along (standards academies notwithstanding).

Think of how lovely an efflorescence it is of the workings of the mind, and
how well it serves one of our deepest aspirations: to connect with another
thinking being and share our thoughts.

------
grey-area
I think the most interesting question with regard to the web is how it
compares to other sets of protocols/formats which tried to do the same thing,
not how it compares to the underlying transport (the internet).

Why did the web (i.e. HTML over HTTP) succeed where so many other document
markup formats and protocols had failed? Before the web we had hypertext
systems, SGML, Gopher, FTP, etc., but the web has overtaken them all (while
indebted to all of them, of course).

HTML was the simplest solution we've yet found to a potentially very complex
problem: how to share and read documents in a common format. It's certainly
not perfect, but part of the reason for its popularity is this simplicity.

------
dredmorbius
The Titanic was built by professionals. Noah's ark was built by an amateur.

~~~
tincholio
With the difference that the Titanic actually existed.

~~~
dredmorbius
The allegory is nonetheless useful, and there are other, more concrete
examples. In the technical world, there's the contrast, say, of Linux with
various non-free OSes, ranging from BeOS to VMS to WinNT.

There's Wikipedia vs. Britannica or Encarta. There's free markets vs. command
economies (or both ends vs. a mixed social market economy). Many 17th, 18th,
and 19th century advances and institutions in both science and letters were
founded significantly on the efforts of amateurs.

The deeper point is that the root of "amateur" carries no connotation of
naivety, inexperience, or lesser quality; it is "amore", literally, love. And
love of task is a strong motivator for producing high-quality work.

------
pfortuny
Pufff, there is a tremendous bias toward 'raw functionality' in that
statement: the Internet was so well done BUT FOR SECURITY issues, which were
never taken into account.

This detail has hindered development to a huge extent.

One could say that the Internet was so well done for 'utopian citizens'. Not
that it is bad, but we have learnt a lot for the future from those mistakes. I
_hope_ we have learnt.

The routing protocols, DNS, etc. (which are essential to the Internet) have so
many gaping security holes that today we wonder how naive one can be when
designing communication protocols...

~~~
gizmo686
>One could say that the Internet was so well done for 'utopian citizens'.

Even in a utopian society, some of the Internet's security problems would
still be problems. The main issue is that, for the most part, the protocols
assume that information provided by another source is correct. This could
clearly be exploited maliciously, but it also allows small, innocent mistakes
to take down the network.

For example, back in 2008, Pakistan decided to block YouTube within the
country. While this specific example would not happen in a utopian society,
there could be valid reasons for a sub-network to want to disable or redirect
requests going to a particular server.

What happened was that one of the ISPs, instead of simply blocking outgoing
requests to YouTube, advertised a route claiming to be YouTube. This
advertisement propagated through the network, and YouTube went down globally.
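
If I remember the details correctly, the hijack worked by longest-prefix
match: the leaked route was more specific than YouTube's own, so routers
everywhere preferred it.

    YouTube announced:          208.65.152.0/22
    Pakistan Telecom announced: 208.65.153.0/24   (more specific, so it wins)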

Still, when these issues do come up without any malice, they should be
relatively easy and painless to fix.

------
lukego
I saw an interesting talk recently by internet visionary Scott Shenker, and I
wondered if he was suggesting the opposite: that the internet is a mess, and
networking people need to learn from software people.
<http://www.youtube.com/watch?v=YHeyuD89n1Y>

------
martinced
The Web, and all the technologies that come with it, is a special kind of
hell: a Rube Goldberg machine whose level of complexity and craziness is hard
to match.

The associated mediocrity is kind of jaw-dropping too: it's 2013 and the main
"engine" used to run all these client-side apps in the browser is still...
single-threaded. Seriously. I'm not saying "language", because JavaScript in
itself has so many warts that there are several technologies out there that
try to dodge the language altogether and directly generate JavaScript source
code... Even Douglas Crockford himself criticizes JavaScript a lot.

But it's not just JS... The madness simply never stops: HTML, (X)HTML, CSS
(oh, that one...), JS. And all the incomplete and overly complicated
"standards" and, of course, all the competing implementations.

I'm always amazed that it's 2013 and a _lot_ of the demo pages posted on HN
still cannot serve videos correctly to people who don't have Flash in their
browsers. That's the current state of affairs: most web devs still aren't able
to serve various video formats to please everyone. Why isn't this something
that comes for free with the webapp server? Is it rocket science to encode in
different formats and serve the correct one? Apparently it is. So imagine how
it goes for more complicated things...

To me the most fascinating thing is that hardly anyone steps back and wonders
_"Why do we have such an incredible mess?"_.

But that's what we have, and a big part of software development is now for the
Web, and the trend ain't stopping anytime soon: be it server-side or client-
side, it's a browser's world nowadays.

So let's stop bitching, let's learn the warts and ins and outs of these uber-
shitty technologies and let's get back to work ; )

~~~
pornel
Video is not a technical problem, but a legal/political/economic one
(technically good codecs are covered by patents, and corporations are
unwilling to adopt other codecs).

