
Mozilla’s Rejection of NativeClient Hurts the Open Web - chadaustin
http://chadaustin.me/2011/01/mozillas-rejection-of-nativeclient-hurts-the-open-web/
======
mahmud
Completely disagree.

I don't even think Google "adopted" NaCl, it's still more a private project of
a few, compared to Chrome(OS) and Android.

NaCl is one of those technologies that I secretly hope fails. Pull the plug on
x86 already, FFS.

Saying that the Open Web needs an app-store to compete with the walled gardens
is also a fallacy. Quite the contrary, the web competes with the walled
gardens because it _doesn't_ have such a master record. You will see this
pattern repeat itself; the chaotic nature of the web beats any attempts to
impose "order" upon it. Cf. the dismal failure of dmoz and other curated
directories, compared to search engines.

If you want to use performance critical software that pushes the limits of
your machine, download it, use Java or whatever.

I doubt the app-store model will survive long term; individual merchants and
vendors tend to move faster than the infrastructure that houses their shops
and supports their business (almost literally). The moment a few merchants
diverge from the prescribed machinery of the market, and exploit its
inefficiencies, as they will, is the moment you will see this model come apart
at the seams. That is, if the entire model isn't killed along with its
platform by changes in technology first.

The one good example I can give you is the buildings built throughout South
East Asia to house street-vendors. Especially in Cambodia and Vietnam, the
governments built multi-story shopping centers, just piles of concrete, and
invited vendors to push their carts in and congregate. The markets might have
looked like proper malls at some point, but with time, they're just shaded
bazaars like any other street market, except this one isn't dried by the sun
and there is no drainage. Over time, the building has all the appeal of a
street market, except you have to climb up some slippery stairs tightly packed
with pickpockets. However, most higher-end merchants gather _around_ the
vertical-bazaars and open proper walk-in shops with more products, better
presentation, and all the amenities of a private property (fan, seats, water
cooler, bathroom, TV/radio, etc.)

~~~
1337p337
I couldn't agree more if I were suddenly granted the magical ability to concur
with the power of a thousand suns.

As far as NaCl goes, the absolute last thing the web needs is
portability/endianness issues. I'm writing this comment on a phone (ARM), and
I also browse on my desktop machine (x86-64) and my netbook (Atom, x86). (Not
to even mention PS3/XBox 360/Wii/older Macs, all of which come with browsers.)
Wanna make the web _worse_? Force developers to target several
architectures for their apps, or greet users with "Your CPU isn't supported!"
pages. Lots of Webapp developers have trouble with _cross-browser_ issues!

If you look at Firefox plugins (native) versus extensions (Javascript), you
can see fairly plainly that, even though it's slower, the ease of development
in Javascript (even with all its quirks) outweighs the benefits of native
code, even for heavyweight extensions.

The best solution, in my opinion, is to expose a (fast) bytecode VM to web
developers. Javascript is okay for a lot of purposes, but it's a messy
language, and can be pretty clunky. A bytecode VM (one that at least has a
Javascript front-end) could solve most of these problems. Off the top of my
head, the JVM has Rhino for JS (as well as support for Ruby, Python, Clojure,
and a host of others) and is easily restricted, if somewhat heavyweight. The
Lua VM is faster than the JVM, is portable to a number of platforms (having a
clean, portable, lightweight implementation), and I suspect that a Javascript
front-end would not be hard, given the similarities between the languages (Lua
could be described as a cleaner, faster Javascript). The language of choice
would be up to the developer, as long as it could target the VM (and we'd not
have to put together hacks like minifiers or $language-to-JS compilers like
CoffeeScript and Objective-J). Getting Google, Microsoft, and Mozilla to agree
on a VM is trickier, although I hear MS loves Lua.
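For what it's worth, the structural similarity between the two languages shows up even in a toy snippet. Here is a JavaScript fragment with a rough (near line-for-line) Lua equivalent in the comments; purely illustrative, not a claim about either VM:

```javascript
// A JS fragment with its approximate Lua equivalent in comments, to show
// how closely the two languages map onto each other (first-class
// functions, closures, and table/object duality).

// Lua: local account = { balance = 0 }
const account = { balance: 0 };

// Lua: function account.deposit(n) account.balance = account.balance + n end
account.deposit = function (n) { account.balance += n; };

// Lua: account.deposit(5); print(account.balance)  --> 5
account.deposit(5);
console.log(account.balance); // 5
```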

I also agree with the app store assessment (1000 suns, etc.), but have little
to add.

~~~
junkbit
NaCl works on both ARM and x86 using the same downloadable bytecode. The long
term goal seems to be to translate the bytecode in the client using a
sandboxed LLVM to target the local architecture.

See the following video from the recent LLVM Developers' Meeting

Portable Native Client, David Sehr, Google [mp4,269mb]
[http://www.llvm.org/devmtg/2010-11/videos/Sehr_NativeClient-...](http://www.llvm.org/devmtg/2010-11/videos/Sehr_NativeClient-
desktop.mp4)

~~~
othermaciej
As I understand it, if you use Native Client now, you get an architecture-
specific binary. The LLVM bitcode based portable version is still a work in
progress. So it's more accurate to say "long term goal" than "works".

The problems with Native Client go beyond its currently x86-specific nature
though. The Web is based on open standards, and a requirement for most
standards is having multiple independent implementations. Native Client is a
complicated enough technology that it might be completely impractical to spec
it in sufficient detail and independently reimplement.

You might think this is only a technical issue of standards process, but a
standard with only one implementation ends up de facto controlled by a single
entity, even if the implementation happens to be open source. In practice,
you'll get support for the platforms and CPU architectures that Google cares
about, in their priority order. You can see how that might not be so appealing
for an entity that doesn't want Google to be setting the agenda quite that
much.

In addition to this, the security architecture of the whole thing seems pretty
dubious. It does have a better attempt at security design than ActiveX. But
given the basic approach (binary-level validation of binary code), it has a
lot of attack surface.

All that being said, it's a neat project with a lot of hack value. It just
doesn't seem like a great fit for the Web. It is likely more driven by Chrome
OS at this point.

~~~
chc
So are you arguing against Native Client or pointing out a couple of niggles
with the current implementation of Native Client? I don't think anyone was
suggesting that the current experimental builds are perfect and infallible.

And if a single implementor is a bad thing, shouldn't you _want_ Mozilla in on
it?

------
DjDarkman
It's ironic to me how self-contradictory the title is. NativeClient is not a
way to make the web more open; in fact, it's a way to make the web more
binary/obscured.

Low level memory access, pointers and the likes are the 'horrors'
Java/C#/<name your high level language> programmers are running away from. The
author fails to point out why anyone would want low-level memory access.

> Preemptive response: But NativeClient is x86! Basing the open web on a
> peculiar, old instruction set is a terrible idea! That’s why I point to LLVM
> and Portable NativeClient (PNaCl). It’s not a burden to target PNaCl by
> default, and cross-compile to x86 and ARM if they matter to you.

This seems to imply that the browser should have a compiler that compiles the
low-level bytecode into real machine code. The author should realize that this
would be almost identical to running an SWF or a Java plugin, which makes the
whole idea pointless.

~~~
chadaustin
Hi, original author here,

Responding to your bytecode argument, modern JavaScript JITs are already
compiling JavaScript to machine code. That means JavaScript is becoming the de
facto bytecode of the web. Then the argument becomes "what's the most
appealing bytecode for the web?" I'd argue that SWF isn't (closed), Java isn't
(for the same memory layout and language translation issues I discussed), and
JavaScript isn't (memory layout and language translation). Sandboxed LLVM
makes a much better intermediate format in a world where web applications have
the same capabilities as native applications.

"The author fails to point out why anyone would want low-level memory access."

Please read Tom Forsyth's postings that I linked at the top of mine.
Basically, in the last 30 years, clock speeds have gone through the roof, but
memory latency has improved far more slowly. Thus,
memory is a primary concern in any application where low-level performance
matters, like the ones I listed (games, simulations, video, DSP).
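To make the layout point concrete, here is a hedged JavaScript sketch (names and data are illustrative, not from a real benchmark): the second function walks one contiguous buffer instead of chasing per-object pointers, which is exactly the kind of control over memory layout being argued for.

```javascript
// Two ways to store 2D points. Same arithmetic, very different memory
// behavior: the object array scatters data across the heap, while the
// typed array keeps it in one contiguous, cache-friendly block.

function sumInterleaved(points) {
  // Array of {x, y} objects: each element is a separate heap allocation,
  // so traversal chases pointers.
  let s = 0;
  for (const p of points) s += p.x + p.y;
  return s;
}

function sumPacked(coords) {
  // Float64Array laid out as [x0, y0, x1, y1, ...]: one contiguous
  // buffer, traversed sequentially.
  let s = 0;
  for (let i = 0; i < coords.length; i += 2) s += coords[i] + coords[i + 1];
  return s;
}
```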

~~~
BonsaiDen
> like the ones I listed (games, simulations, video, DSP).

Those should not run in a Browser; they should run on a real OS (or Emacs).

What's next VMWare inside your Browser? Then it's OS -> Browser -> VM -> OS ->
Browser...

Sorry but just because things are possible, doesn't mean that they should be
done...

~~~
mahmud
but you can't just right-click on a binary executable and "view source", bro! ~

Seriously, people managed to write successful, cross-platform software without
expecting everyone with any kind of gadget or device to run it with one click.

~~~
chc
When was this? Atlantis? I don't remember any point in history where this was
simple and commonplace.

------
EGreg
I disagree with the article's sentiment. I'm glad the web is focusing on
Javascript. Here's why: the web is a nice, text-based environment that is safe
to execute on your computer. Each web resource may contain active scripts but
they are pretty innocuous.

Native applications are binary and can do all sorts of nasty things. Sure,
this sandbox is supposed to be safe, but what if it's not? When an application
is delivered over the web, one should really make sure that it wasn't somehow
changed or sabotaged. Right now this is impossible. At the very least, this
proposal would have to be implemented so we can trust what is being
downloaded: <http://news.ycombinator.com/item?id=2024164>

Here is what I suggest: native libraries should expose their objects to
javascript, which should do the majority of the work. Kind of like PhoneGap
does on the phones. These native libraries (like OpenGL, say), should be
served from cdns over httpc:// and the user agent can verify that they are
safe after downloading them.
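The pattern being described might be sketched roughly like this; `nativeCodec` and `jsEncode` are hypothetical names made up for illustration, not a real API:

```javascript
// PhoneGap-style pattern: probe for a hypothetical natively-provided
// object and fall back to a pure-JS implementation when it is absent.

function jsEncode(frame) {
  // Trivial stand-in for a portable pure-JS encoder.
  return frame.map((v) => v & 0xff);
}

function encodeFrame(frame) {
  // If a verified native library exposed `window.nativeCodec`, use it;
  // otherwise do the work in JavaScript.
  const native = typeof window !== "undefined" && window.nativeCodec;
  return native ? native.encode(frame) : jsEncode(frame);
}
```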

~~~
tzs
According to Brendan Eich, the designer of Javascript:

    JS had to "look like Java" only less so, be Java's
    dumb kid brother or boy-hostage sidekick. Plus, I had
    to be done in ten days or something worse than JS would
    have happened.

(See comments from [http://www.jwz.org/blog/2010/10/every-day-i-learn-
something-...](http://www.jwz.org/blog/2010/10/every-day-i-learn-something-
new-and-stupid/) for that and more from him)

Given that constraint, JS is an amazing hack. But it's 2011 now. Why should we
have to use Javascript for client code in the browser--why shouldn't we be
able to use Ruby, or Python, or C#, or Clojure, or Haskell, or F#, or Lua, or
whatever else we want? It seems ridiculous to me that we're having a
resurgence of language innovation on the desktop and on the server, but not in
the browser.

~~~
tiles
We can. [http://mozakai.blogspot.com/2010/08/emscripten-llvm-to-
javas...](http://mozakai.blogspot.com/2010/08/emscripten-llvm-to-javascript-
compiler.html) or its kin will definitely see a place in the future of web
development.

x86 is a sucky architecture to target, but we target it because that's what
our software runs on. Path of least resistance. The same will happen to
JavaScript as a bytecode target, inevitably, and once it's abstracted away and
browsers speed up even further, we won't be bothered by the pain points; we'll
just use the implementation.
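As a rough illustration of JavaScript-as-compile-target (a hand-written sketch, not Emscripten's actual output format), a C function can be lowered to JS by modeling the C heap as one typed array and pointers as byte offsets into it:

```javascript
// Sketch of the kind of JS a C-to-JS compiler might emit. The C address
// space becomes a single typed array; a C `int *` becomes a byte offset.

const HEAP32 = new Int32Array(1024); // the "C heap" (4 KiB here)

// C source being modeled:
//   int sum(int *arr, int n) {
//     int s = 0;
//     for (int i = 0; i < n; i++) s += arr[i];
//     return s;
//   }
function sum(ptr, n) {
  let s = 0;
  for (let i = 0; i < n; i++) {
    s = (s + HEAP32[(ptr >> 2) + i]) | 0; // `| 0` keeps 32-bit int semantics
  }
  return s;
}
```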

~~~
EGreg
Can you tell me how this is different from, say, Java? The fact that you are
happy to run arbitrary binary code that was compiled for a specific platform?
So now we are supposed to "write once compile everywhere" like C++, and then
ship that to platforms? That is not in the spirit of the web AT ALL. If you
want Java, at least THAT is write once, RUN anywhere. And why isn't running
Java more seamless? I think answering THAT will be helpful to the original
discussion.

------
trotsky
Let's ignore the philosophical or design issues surrounding native client and
look at a practical one.

Why does the article take issue with Mozilla alone? Surely they aren't the
only browser vendor that won't be implementing native client. Mozilla is being
singled out here precisely because other major platforms are considered to be
lost causes.

Safari and Internet Explorer are unlikely to support NaCl for obvious
competitive reasons. Heavily curated platforms like the iPhone prevent even
third parties from supporting such a feature.

Once you realize that even with Mozilla support you'd still only be looking at
a ~60% penetration, you're going to be working around it anyway. Once you're
dictating platforms or plugins, or providing a fallback implementation, I'm not
sure if support in one specific browser is going to make or break anything. If
you're willing to target only half the web you're simply not that concerned
about ubiquity to start with.

~~~
chc
Once Mozilla and Google support it, that puts a lot of pressure on the others.
It might still not have happened, but Mozilla was the swing vote.

~~~
pornel
That doesn't guarantee success. Google's + Mozilla's pressure in video codecs
hasn't changed Microsoft's and Apple's mind.

------
pedrocr
The only way I can see PNaCl catching on is if Google deploys it in Chrome and
the Android Browser and creates an automatic fallback to a javascript PNaCl
interpreter for the browsers that don't have it. Then it becomes about "why is
Firefox/IE/iOS slower at running this webapp?"

But even then you'd need Google or someone else to deploy some interesting
PNaCl apps to make having it worthwhile.

It's a pity that this is such a long shot because JS isn't a particularly good
language and the stack we seem to be heading towards (and that Mozilla favors)
is something like CoffeeScript -> Javascript -> bytecode -> machine code.
Javascript doesn't seem like a very interesting compiler target or an easy
language to make fast[1]. Maybe the new ES5 strict mode, or some other subset
of javascript that is easier to run fast, can be agreed upon as a basis for
compilers. Then that could be the IL for the web.

[1] Best implementations are 3-5x slower than the JVM according to
<http://shootout.alioth.debian.org/>

~~~
BonsaiDen
Well, JS is _very_ dynamic, and people have long focused on optimizing
statically typed compiled languages instead. The JVM did awesome things in the
area of JITs, and V8's Crankshaft is already pushing the limits once again,
with up to 3x gains over the version currently in the Browser. There's still a
lot of potential in optimizing a language like JavaScript, but Rome wasn't
built overnight; give JS some more time.

~~~
pedrocr
I just find all that engineering effort such a waste when it is optimizing
Javascript of all things. I've seen a lot of arguments of why Javascript isn't
"that bad" but few about why it is actually good when compared to comparable
languages.

I see a lot of good coming out of building VMs for languages more dynamic than
Java, but there is enough movement away from writing pure Javascript
(CoffeeScript/GWT/etc.) that if you're going to build a JS VM, you might as
well define a strict subset of the language to be optimized, and let everyone
target that when building VMs and languages.

------
Hoff
There's no shortage of failed great technology, unfortunately.

You need a way to get this launched, which means you need widely-available
clients (PNaCl), and you need the content as otherwise nobody needs those
clients you don't yet have, and you need the tools for developing the content
which aren't available, and you need a way to interest enough folks in this
technology into adopting it, whether by pushing them (cash) or pulling them
(cool, useful, solves my problem, etc).

And you need to sort out and preferably prevent the security attacks, and
decide how you're planning on providing content protection (yes, you're going
to need some sort of copy protection to get more than token commercial
content), and deal with the
usual UI adoption issues for non-tech users (they're the big market, and not
the geeks), and with all the usual nasties that can derail or dissuade the
early adopters of any technology.

Getting to critical mass with these sorts of products is inherently
entertaining, and involves rather more thought and cash and effort than with
the technology itself. Have you looked at how all that'll happen here, rather
than looking (just) at the (admittedly cool) technology?

------
Dn_Ab
And so the pendulum has swung back. I believe Mozilla's action is rational
assuming my logic is not flawed. See if you can unravel it.

Premise: the more indistinguishable a browser gets from its underlying
operating system, the more of the properties of said system it must share.
This stretches the abstraction, making it shallower (reaching down toward the
machine) and increasing the probability of leaks in said abstraction.

Speciation will occur across system architecture peculiarities and cause
splintering of browsers. Destroying their main advantage - which is the strong
guarantees it makes on your deployed code being accessible across a vast range
of platforms. That is, physical constraints and combinatorial considerations
will make it very hard to write code that uses architecture specific
optimizations and assumptions while still falling back robustly across all
devices. And if it can work everywhere, then it must not be directly exposing
such a hardware layer, and then what is the point? The same can be achieved by
optimizing JavaScript JITing; the JITter can act as that optimal device-
specific code generator. Google wants to create an OS, Mozilla wants
to improve the browser.

In particular, as margins from speed, ui and features decrease; each vendor
will become incentivized to avoid commodification and distinguish themselves
from their competitors by moving faster than the glacial speeds of standard
bodies and introducing incompatibilities, while being slow to pick up those of
others. In essence, each browser would basically evolve into and become
indistinguishable from a current OS with all its pitfalls (isomorphic rather
than homomorphic as currently). And if we are targeting specific VMs then we
may as well factor out the browsers as they are no longer a vital component of
the equation. Completing the cycle. To be restarted with metabrowsers.

Seems to me that pushing for native into the browser without carefully
considering the tradeoffs is foolish. You cannot have uniformity without
sacrificing diversity. This seems like the original Java dream rebooted. But
it seems to me that wanting the same UI and code to work everywhere while
taking advantage of underlying hardware, while automatically adapting and
falling back on visuals and optimizations is a pipe dream. That is of course,
until OSes and programs become intelligent and partially alive. At least
microbe-level intelligence. And virus-like adaptability.

Aside: NaCl appears to have a decent amount in common with silverlight.
Particularly in terms of tradeoffs, weaknesses and gains.

~~~
chc
It's not clear from your comment that you have a good understanding of what
Native Client is. You're making vague, barely technical objections to the
browser getting "more indistinguishable … from its underlying operating
system" and losing the minuscule amount of cross-platform capability it has
now without explaining how a more open and potentially efficient platform for
code execution than "Whatever text-based JavaScript interpreter the browser
vendor decided to include" would necessarily do that.

~~~
Dn_Ab
My comment was an argument as to why Mozilla's choice not to introduce this
dependency was not a bad one: there is no technical reason why the
fast-execution and multiple-language advantages of NaCl cannot be architected
into a javascript VM, and Mozilla's path does more to keep things open longer
by using an already widely adopted, better understood technology. The enemy
you know is better than the enemy you don't know, type of thing. Who knows
what can of worms the concepts sandboxing relies on will contain? It must
contain flaws, as all creations of humans do.

And then my opinion that browsers will evolve into the platform and not be
separable from the OS. Stuff like NaCl simply accelerates that by introducing
a dependency on one company or creating a technology that invites splintering
on implementation due to its complexity and uniqueness.

------
JoachimSchipper
Considering NaCl in the context of papers like
<http://eprint.iacr.org/2010/594> (timing the CPU cache to break AES) is...
interesting. You'd hope that Google would have considered such issues, but a
quick search doesn't yield anything.

~~~
chc
Are you saying that AES in JavaScript is more secure?

~~~
JoachimSchipper
In the sense that Javascript crypto is horribly broken (see e.g.
[http://rdist.root.org/2010/11/29/final-post-on-javascript-
cr...](http://rdist.root.org/2010/11/29/final-post-on-javascript-crypto/)) but
can't really be used to attack other applications running on your computer,
yes. NaCl itself is probably fine-ish for implementing crypto protocols - it's
just that it looks like a perfect vehicle for attacking other crypto
implementations running on the same processor. (Well, except for the noise
from running Chrome, but I still wouldn't use an SSH session while running
NaCl.)

------
mycroftiv
If NativeClient is available as a plugin, can't Firefox users benefit from the
technology regardless of whatever Mozilla thinks? I like the idea of
lightweight x86 sandboxes like Native Client and vx32, but I understand why
Mozilla isn't very interested in them. As long as nothing is done to actively
hinder their development and use by those who are interested, is there really
a problem here?

~~~
chadaustin
The problem with plug-ins is that relying on them causes a huge drop-off in
usage, at least in my experience. Requiring a download of some kind, even if
it's just a plug-in, costs you about 30% of your possible users.

It sounds like Unity3D has similar numbers:
<http://forum.unity3d.com/threads/39362-Web-player-adoption>

------
endergen
Obscuring code isn't a bad thing; it's what I miss about C/C++. It was so much
easier to share a library without giving away source code.

The increased power that NaCl would add would enable more disruptive
applications than just JS apps: client-side custom video encoding,
browser-based distributed computing, and, yes, again a better economy for
buying and selling software libraries.

------
EGreg
Think of it this way:

Flash, Java, Javascript, et al are great for the web because they are "write
once, run anywhere". The same source code runs everywhere. That is what the
web is all about. HTML and CSS are not scripting languages and they are also
"write once, run anywhere". That is how the web delivers your programs.

Now, what will your Native Client do?

It will have COMPILED code for ONE platform. Like in C++ where you "write
once, compile everywhere". Except you probably won't be able to compile
everywhere. The point of the web is that any platform should be able to run
your app.

On the other hand, I can see Native Client as being useful for extension
libraries. You know, like PhoneGap plugins. The Javascript can test if the
object is there, and if it is, use some standard interface. You could build up
a standard library of these, as long as it is available on a wide enough range
of platforms. Certain APIs are already exposed by the browser, through HTML5,
that were originally in Google Gears.

Look, I agree that it's more of a philosophical thing, and indeed you are
welcome to make an extension for Mozilla and all the other browsers. But the
security risks alone will make this a tough sell to INCLUDE in a browser --
harming the spirit of the web. Not only that, but the web is totally against
"favoring one platform over another" ... it is BECAUSE of the web that the
platforms are able to work better together.

~~~
salmonsnide
Did you miss the PNaCl part? Check it out here:
<http://nativeclient.googlecode.com/svn/data/site/pnacl.pdf>

------
megaman821
People are going to want to target their C, Java, Python, etc. code to the
browser. The only question is which is the better target: Javascript as a VM,
or a sandboxed LLVM?

Honestly, the inclusion of NaCl wouldn't change that much. JavaScript would
still be the easiest path for most developers to choose. It would be the only
language for which the browser hosts the interpreter and which has no
compilation step.

~~~
pohl
JavaScript does have a just-in-time compilation step in most implementations
these days. A quick scan of llvm-bc by a back end would be equally invisible
to the user.

~~~
megaman821
I just meant the user has to compile their language of choice to the llvm byte
code, as I don't see browsers hosting a bunch of interpreters for different
languages.

~~~
pohl
Only the developer would need to do the step of compiling to llvm-bc, not each
user. The browser need only host the back end JIT compilation step.

------
itsFF
I agree that JavaScript may not be the optimal solution (our kids will hate us
for making it the language of the web).

But why LLVM? Why not something more standard like CIL or Java bytecode? I
would personally love to see CIL in all browsers. It compresses much better
than LLVM bytecode...

------
jorangreef
The web is starting to come into its own thanks to relentless efforts to
improve a limited set of tools: HTML, CSS, Javascript. What the web needs is
not more tools, but for the tools it already has to continue to be improved.

~~~
drdaeman
Yeah, so relentless efforts to improve any other tools than those are just
missing the web.

------
MikeW
Google don't even enable NaCl by default on their Cr-48 laptops!

------
mariusmg
NativeClient is a bunch of horse poo and shouldn't be in a browser anyway.

