

The Web Is Wrong - evincarofautumn
http://evincarofautumn.blogspot.com/2011/12/web-is-wrong.html

======
Turing_Machine
"Programmers may be surprised to note that the vast majority of users do not
ever view the sources of a web page."

I can't imagine anyone being surprised by that. Programmers, on the other
hand, look at source _all the time_. Being able to view the source when you
wonder "how did they do that?" is a big factor behind the rapid evolution of
the web.

"The source of the desktop applications you use isn’t typically human-readable
unless the source is open and you seek it out."

Which, I would argue, is one reason why most innovation is now occurring on
the web rather than the desktop.

"Beginners need consistency, and the majority of developers are always going
to be beginners, so we have no choice but to help them."

By concealing the way things are done from them and eliminating the terabytes
of publicly viewable example code?

Sorry, that doesn't seem very helpful to me.

~~~
nicholasreed
I agree completely. My first thought after reading this article was that while
99% of users may not view the source of a page, the 1% (web devs) that
actually create additional pages find View Source, Firebug, etc. extremely
useful. When I'm helping any new web developer out, one of my first steps is
to have them use and understand Firebug. Preaching to the choir here, but it
cannot be overstated how important access to and readability of source code
were to me when I first began.

Compiling and delivering in an unreadable format is possibly the worst
solution to this problem. Who ends up benefiting from this? The end user who
saves a minuscule amount of time loading the page? Is hampering innovation and
progress really worth that amount of time?

Below, the author makes the point that viewing the actual source is unreadable
anyway. Very true, but tools like Firebug and the Chrome Developer Tools help
combat that problem to an extent by nicely formatting HTML, CSS, JS, etc. And
while many large sites (Gmail, etc.) have compact and utterly horrendous
source, I would argue that novice developers are not looking there first for
help; instead they look at less complex sites that are not minified (because
they don't need the extra page loading speed).

~~~
kls
I would argue that a lot of what were called "mash-ups" a few years ago came
about because people could view the underpinnings. Long before companies
started documenting official APIs, developers were pulling apart their code,
mixing and matching features from various web app providers. I think there is
a culture of openness on the web, and part of that, I believe, is due to the
readability of the underpinnings.

That being said, the web has felt like technology soup for some time now;
shortly after getting away from CGI, it just felt like we were doing it wrong
(at least the apps part). The newer JavaScript / CSS / HTML front-ends
communicating with a REST back end feel a little more right to me than all the
proprietary server-pages mess (ASP, JSP, PHP) that we contorted the document
language with to turn pages into apps. But it does still feel like there
should be a better way to deliver apps. I think this is why iOS apps are doing
so well: they use a common REST back end with the web, yet make development of
the UI a lot less complex. I think the web could learn a lot from Apple's
successes on the mobile platform, but there is just so much inertia behind web
technologies that a radical departure is just not in the cards.
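
A minimal sketch of that shared-backend shape, purely for illustration (the
endpoint URL and the Post fields are invented here): the browser front end and
a native app hit the same REST resource, and only the UI layer differs.

    // Hypothetical REST endpoint shared by a web UI and a native app.
    // The URL and the Post shape are illustrative, not from the thread.
    interface Post {
      id: number;
      title: string;
      body: string;
    }

    async function fetchPosts(): Promise<Post[]> {
      const res = await fetch("https://api.example.com/posts");
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.json() as Promise<Post[]>;
    }

    // The web front end renders these into the DOM; an iOS client would
    // decode the same JSON into its own native views.
    fetchPosts().then(posts => {
      for (const post of posts) {
        console.log(`${post.id}: ${post.title}`);
      }
    });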

------
georgemcbay
FTA: "At the very least, we should stop serving HTML, CSS, and JavaScript to
users. Let’s instead serve things in concise, binary format—compiled
documents, compiled stylesheets, and bytecode-compiled JavaScript that runs in
a standard-issue stack-based virtual machine."

I'm right there with him in wishing there were a chance in hell web browsers
would move to an (as language-neutral as possible) VM for client-side code,
but as for those other things, they basically _are_ served in "binary
compiled" versions most of the time, in the form of gzipped versions of the
documents.

Using a different custom binary format for the same data would save very
little over a gzipped HTML/CSS file and would add an immense amount of
complexity, so it really isn't worth it, IMO.
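
For a rough sense of scale, here is a hedged sketch using Node's built-in zlib
(the numbers depend entirely on the document and are not a benchmark):
repetitive markup compresses very well, which is why a bespoke binary encoding
tends to buy little over gzipped text.

    // Compare raw vs gzipped size of an HTML snippet (Node's zlib module).
    import { gzipSync } from "zlib";

    const html = "<!DOCTYPE html><html><body>" +
      "<p>Hello, world!</p>".repeat(200) +
      "</body></html>";

    const raw = Buffer.from(html, "utf8");
    const gz = gzipSync(raw);

    console.log(`raw:     ${raw.length} bytes`);
    console.log(`gzipped: ${gz.length} bytes`);
    // The gzipped payload is a small fraction of the raw size, and this is
    // what servers already send when the client accepts gzip encoding.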

~~~
evincarofautumn
Well, if you say I made one good point, then at least I made one good point. A
standard VM is the most obvious thing we need. But I think there is definitely
something to the notion of web applications as real applications, which can be
queried in a structured way and composed (think pipes) with one another and
with desktop applications.

~~~
magicalist
If you haven't checked them out, you might find web intents interesting:
<http://webintents.org/>

~~~
evincarofautumn
I hadn’t, but it seems to be a very good solution within the current model of
things. Thanks for putting me on to this.

------
ams6110
_Let’s instead serve things in concise, binary format_

We tried this already in the 19[89]0s; it was called client-server.

------
guard-of-terra
What problem does it solve for the end-user?

If it doesn't, we should skip it. Programmers love large-scale infrastructure
refactorings, and they always get them wrong, both in axiomatics and in
implementation.

For example, I would argue that if a unified web bytecode had been adopted in
2002, its performance today would be inferior to what the leading JS engines
provide. Optimizing high-level code is actually simpler than optimizing a
low-level kludge built upon wrong assumptions. They would not have gotten it
right in 2002, and I'd argue they wouldn't get it right today either.

------
andrewflnr
Apparently people are having trouble with this: just because it's binary
doesn't mean you can't read it. It just means you can't read it with the tools
you already have on your computer. We would have to build the editing tools to
read and modify whatever data format comes out of this, but _we can do that_
and it won't be that hard.
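
As a toy illustration of that point (the record layout here is invented), a
length-prefixed binary record takes only a few lines of tooling to read back:

    // Toy decoder for an invented binary record: a 4-byte big-endian length
    // followed by that many bytes of UTF-8 text. The point is only that
    // "binary" is perfectly readable once a small tool exists for it.
    function decodeRecord(buf: Buffer): string {
      const len = buf.readUInt32BE(0);
      return buf.subarray(4, 4 + len).toString("utf8");
    }

    const record = Buffer.concat([
      Buffer.from([0, 0, 0, 5]),
      Buffer.from("hello", "utf8"),
    ]);
    console.log(decodeRecord(record)); // "hello"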

I think we should do it anyway. I dream of a composable, computable data
format that can represent anything and be sent anywhere. We could build web
apps with it. We could make desktop data formats with it. I know it's
ambitious, but I don't think it's impossible. I'm not expressing it terribly
well because it's late. But I think this is a good idea.

------
mkup
A text format allows various forms of security analysis and on-the-fly
modification of HTML/JavaScript in personal firewalls, browser plugins, etc.
(for the purpose of blocking malicious scripts, advertisements, Flash, images
from other domains with HTTP 401, and other stuff).

A compiled binary executable blob with fixed address references will be
non-modifiable. You either run it or you don't, but you can't cut a part of it
off while it passes through a firewall.

So the proposed change is a step backward from the security perspective.
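
To make the contrast concrete, here is a minimal sketch of the kind of
in-flight rewriting a plugin or proxy can do today precisely because the
payload is text. The regex approach is only for illustration; real content
blockers parse the document properly.

    // Naive text-based filtering: strip <script> blocks and third-party
    // images from an HTML payload as it passes through a proxy or plugin.
    function filterHtml(html: string, allowedHost: string): string {
      const foreignImg = new RegExp(
        `<img[^>]+src="https?://(?!${allowedHost})[^"]*"[^>]*>`, "gi");
      return html
        .replace(/<script[\s\S]*?<\/script>/gi, "")
        .replace(foreignImg, "");
    }

    const page = '<p>ok</p><script>evil()</script>' +
      '<img src="https://ads.example.net/banner.png">';
    console.log(filterHtml(page, "example.com")); // => <p>ok</p>

That kind of surgery is exactly what an opaque compiled blob would rule out.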

------
BerislavLopac
The Web is wrong as a software platform -- because it was never supposed to be
a software platform:
http://berislav.lopac.net/post/4726275775/of-course-web-development-is-broken

While the Web model was good enough for a time with slow and expensive
Internet connections, in the era of ubiquitous broadband it becomes more and
more of a bottleneck.

In my opinion, the future lies in a more equal interconnection of Internet
devices, which would not be divided into servers and clients as on the Web,
but would connect more on a peer-to-peer level. BitTorrent is a primitive example of
one aspect (storage) of what I have in mind, but it would expand into
processing and all other kinds of data exchange. One can easily imagine a
networked TV being used as a display device for any other networked appliance,
from mobiles to microwaves.

Whoever defines the best standard for that to be built upon will hold the keys
to the future. And you may call this SkyNet if you like. ;)

------
sprobertson
I like to imagine a flipped scenario, where the content, style, and
interaction are even further separated, allowing better inter-app sharing of
each aspect.
For example, if the content of your blog just came through in JSON, I could
very simply quote your post on my blog, with my layout & style. It seems that
compiling it all to binary would make sharing harder instead.
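
A hedged sketch of that flipped scenario (the endpoint and field names are
invented): pull the post as JSON and drop it into the quoting site's own
markup, so layout and style stay entirely on the consumer's side.

    // Illustration only: fetch a blog post as plain JSON and render it with
    // the quoting site's own template.
    interface BlogPost {
      title: string;
      author: string;
      body: string;
    }

    async function quotePost(url: string): Promise<string> {
      const res = await fetch(url);
      const post = (await res.json()) as BlogPost;
      return `<blockquote class="my-style">
        <h3>${post.title}</h3>
        <p>${post.body}</p>
        <cite>${post.author}</cite>
      </blockquote>`;
    }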

~~~
evincarofautumn
But if all web applications were to serve content as meaningful objects, then
the underlying format could be anything, so long as it were standardised. VMs
could transmit everything as JSON if they so chose, but they might as well use
a binary format because no human need ever see it.

If all applications are communicating using the same abstractions, then it
doesn’t even matter what language they’re written in. My site can be written
in Haskell and it’ll serve blog entries just as well to Haskell on the Firefox
VM on Windows as to Ruby on the Chrome VM on a Linux toaster. And we never
need to agree on an interchange format, because that’s the VM’s job.

------
GilbertErik
Is it just me or does it sound like this guy REALLY wants everyone to move
from html/css/js back to Flash apps?!?

~~~
evincarofautumn
You mean a proprietary platform that offers little to no interoperability,
accessibility, composability, or searchability? No, that’s precisely the
opposite of what I want.

------
nchuhoai
Welcome to the world of backwards compatibility.

~~~
andrewflnr
Backwards compatibility only goes so far. It has to break someday. We need to
make sure it's worth the pain when it happens.

