Mozilla’s Rejection of NativeClient Hurts the Open Web (chadaustin.me)



Completely disagree.

I don't even think Google "adopted" NaCl, it's still more a private project of a few, compared to Chrome(OS) and Android.

NaCl is one of those technologies that I secretly hope fails. Pull the plug on x86 already, FFS.

Saying that the Open Web needs an app store to compete with the walled gardens is also a fallacy. Quite the contrary: the web competes with the walled gardens because it doesn't have such a master record. You will see this pattern repeat itself; the chaotic nature of the web beats any attempt to impose "order" upon it. Cf. the dismal failure of dmoz and other curated directories, compared to search engines.

If you want to use performance critical software that pushes the limits of your machine, download it, use Java or whatever.

I doubt the app-store model will survive long term; individual merchants and vendors tend to move faster than the infrastructure that houses their shops and supports their business (almost literally). The moment a few merchants diverge from the prescribed machinery of the market and exploit its inefficiencies, as they will, is the moment you will see this model come apart at the seams. That is, if the entire model isn't killed along with its platform by changes in technology.

The one good example I can give you is the buildings built throughout South East Asia to house street vendors. Especially in Cambodia and Vietnam, the governments built multi-story shopping centers, just piles of concrete, and invited vendors to push their carts in and congregate. The markets might have looked like proper malls at some point, but with time they're just shaded bazaars like any other street market, except this one isn't dried by the sun and has no drainage. Over time, the building has all the appeal of a street market, except you have to climb up slippery stairs tightly packed with pickpockets. Meanwhile, most higher-end merchants gather around the vertical bazaars and open proper walk-in shops with more products, better presentation, and all the amenities of private property (fan, seats, water cooler, bathroom, TV/radio, etc.)


I couldn't agree more if I were suddenly granted the magical ability to concur with the power of a thousand suns.

As far as NaCl goes, the absolute last thing the web needs is portability/endianness issues. I'm writing this comment on a phone (ARM), and I also browse on my desktop machine (x86-64) and my netbook (Atom, x86). (Not to even mention the PS3/Xbox 360/Wii/older Macs, all of which come with browsers.) Wanna make the web worse? Force developers to target several architectures for their apps, or greet users with "Your CPU isn't supported!" pages. Lots of webapp developers already have trouble with cross-browser issues!
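You can even watch the underlying architecture leak through from JavaScript today via typed arrays. A minimal sketch, assuming a browser that supports them:

    var buf = new ArrayBuffer(4);
    var bytes = new Uint8Array(buf);
    var words = new Uint32Array(buf);
    bytes[0] = 0x01; bytes[1] = 0x02; bytes[2] = 0x03; bytes[3] = 0x04;
    // On a little-endian CPU (x86) this logs "4030201"; on a
    // big-endian one (PS3, 360, older PowerPC Macs) it'd be "1020304".
    console.log(words[0].toString(16));

Now imagine that difference baked into every downloaded binary instead of hidden in one odd corner of the platform.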

If you look at Firefox plugins (native) versus extensions (Javascript), you can see fairly plainly that, even though it's slower, the ease of development in Javascript (even with all its quirks) outweighs the benefits of native code, even for heavyweight extensions.

The best solution, in my opinion, is to expose a (fast) bytecode VM to web developers. Javascript is okay for a lot of purposes, but it's a messy language, and can be pretty clunky. A bytecode VM (one that at least has a Javascript front-end) could solve most of these problems.

Off the top of my head, the JVM has Rhino for JS (as well as support for Ruby, Python, Clojure, and a host of others) and is easily restricted, if somewhat heavyweight. The Lua VM is faster than the JVM, is portable to a number of platforms (having a clean, portable, lightweight implementation), and I suspect that a Javascript front-end would not be hard, given the similarities between the languages (Lua could be described as a cleaner, faster Javascript). The language of choice would be up to the developer, as long as it could target the VM (and we'd not have to put together hacks like minifiers or $language-to-JS compilers like CoffeeScript and Objective-J). Getting Google, Microsoft, and Mozilla to agree on a VM is trickier, although I hear MS loves Lua.

I also agree with the app store assessment (1000 suns, etc.), but have little to add.


NaCl works on both ARM and x86 using the same downloadable bytecode. The long-term goal seems to be to translate the bytecode on the client, using a sandboxed LLVM, to target the local architecture.

See the following video from the recent LLVM Developers' Meeting

Portable Native Client, David Sehr, Google [mp4,269mb] http://www.llvm.org/devmtg/2010-11/videos/Sehr_NativeClient-...


As I understand it, if you use Native Client now, you get an architecture-specific binary. The LLVM bitcode based portable version is still a work in progress. So it's more accurate to say "long term goal" than "works".

The problems with Native Client go beyond its currently x86-specific nature though. The Web is based on open standards, and a requirement for most standards is having multiple independent implementations. Native Client is a complicated enough technology that it might be completely impractical to spec it in sufficient detail and independently reimplement.

You might think this is only a technical issue of standards process, but a standard with only one implementation ends up de facto controlled by a single entity, even if the implementation happens to be open source. In practice, you'll get support for the platforms and CPU architectures that Google cares about, in their priority order. You can see how that might not be so appealing for an entity that doesn't want Google to be setting the agenda quite that much.

In addition to this, the security architecture of the whole thing seems pretty dubious. It does have a better attempt at security design than ActiveX. But given the basic approach (binary-level validation of binary code), it has a lot of attack surface.

All that being said, it's a neat project with a lot of hack value. It just doesn't seem like a great fit for the Web. It is likely more driven by Chrome OS at this point.


So are you arguing against Native Client or pointing out a couple of niggles with the current implementation of Native Client? I don't think anyone was suggesting that the current experimental builds are perfect and infallible.

And if a single implementor is a bad thing, shouldn't you want Mozilla in on it?


The best solution, in my opinion, is to expose a (fast) bytecode VM to web developers.

The base Squeak/Pharo VM runs bit identically on over 50 platforms. The performance of the Lua VM and its suitability for ECMAScript is very attractive, though.


While you make fair points that I mostly agree with, I have a tiny nitpick: Lua isn't really a simpler Javascript, although I understand the point you were trying to make, so I won't argue over it. :-)


It's true; it's a bit of an over-simplification. I was trying not to get any more long-winded than I already was. :)


The Lua VM is not faster than the JVM or V8; you might be thinking of LuaJIT, which is not as portable as the standard Lua implementation and uses a different bytecode format.


I don't know which Lua you're used to, but as of 5.1, it's incredibly fast; I may be benchmarking the wrong things (the big one I tried was finding the first 2,000,000 primes by trial division, which should lend itself well to hotspot optimizations). Either way, though, I'd still argue that a VM (JVM and Lua were just a couple of examples of VMs that are fast, used in real applications, and are readily jailable) is the ideal solution.
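For reference, here's the shape of that benchmark in JavaScript terms (a sketch, not the exact code I ran):

    // Trial-division prime finding: a tight numeric loop, the kind
    // of thing that plays to a VM's (or JIT's) strengths.
    function nthPrime(howMany) {
      var count = 0, n = 1;
      while (count < howMany) {
        n++;
        var prime = true;
        for (var d = 2; d * d <= n; d++) {
          if (n % d === 0) { prime = false; break; }
        }
        if (prime) count++;
      }
      return n;  // the howMany-th prime
    }
    nthPrime(2000000);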



FYI, Firefox extensions can contain native code. Of course, you need to include a binary for every OS/arch/Mozilla version triple you wish to support.


I mentioned extensions; the first half of my previous job was spent on FF plugin development. Once in a while, one of us would get frustrated enough to try looking at how extensions are built for FF; never worth it. (Our IE guy was enviable in the sense that he only had to produce two DLLs, and got to work in his language of choice.)


This is kind of at odds with your earlier comment...


> "If you want to use performance critical software that pushes the limits of your machine, download it, use Java or whatever."

This is a bit too simple. The web is more and more becoming a platform for rich interactive applications. Nowadays we can use tons of applications without downloading and installing any software. Most people would agree this is a good thing.

However, the web as a platform is inherently broken. Engineering large scale web applications with the sheer combination of HTML, CSS and JavaScript is just a pain in the ass. Every level of abstraction you introduce is pushing the limits of the browser already. Having a sandboxed machine code interpreter would allow developers to spend their time on developing software using their own favorite abstractions. This can save time and frustration.

Maybe NaCl or even some bytecode alternative have their disadvantages, but they solve a very annoying problem for sure.


Comments like this demonstrate why web _applications_ will lose.

The inertia and individual investment in HTML and JavaScript is so incredibly high that web players (from the browser makers such as Mozilla to the individual web developers) absolutely steadfastly refuse to evolve with changing requirements.

The belief that JS+HTML is "just fine" and that no significant evolution is required to support application development is absolutely incomprehensible; what other application platform available today has seen iteration as glacially slow as the web? The browser makers have consistently stuck to their "HTML+JS is fine for everyone" guns, ECMAScript has almost entirely failed to evolve, and no common application development framework or platform has emerged.

HTML+JS libraries -- such as jquery -- are just fine for interactive document publishing, but are no replacement for Cocoa, Android, or Qt. The lack of common re-usable and extensible UI components is a travesty. Proponents like to advance the idea that this chaos is a benefit -- choose whatever tools you want -- but the rising popularity and the massive investment in application development on native app stores demonstrates a market desire for something simpler: A common well-defined application development platform is valuable, and the only barrier to adoption in the web world was a trustable distribution mechanism -- app stores.

The fact that you think App Stores will fade away is demonstrative of a failure to understand why the app store is so popular -- ease of distribution, easy to use common development platform, ease of payment processing, common UI (instant user familiarity), performance (!), open development platform (no JS sandbox, not all languages must be filtered through JS).

Ask yourself why Google has ChromeOS on one hand, Android on the other, and is working on technologies like NaCl that could bring many of the advantages of Android to Chrome.

Lastly, hopefully I can pre-empt the "JS is a common bytecode platform, just target that" crowd. Yes, you can emulate any Turing machine on top of another Turing machine, but that doesn't mean it will be fast, clean, easy to use, or easy to debug. It makes absolutely no sense to halt the evolution of the browser as a platform and instead tell everyone to use a high-level language like JavaScript as a common bytecode.

[Addendum]

I also should mention, anecdotally, that I've had some long-winded conversations with a member of the Chrome App Store team -- me from the perspective of a long-time native developer, them from the perspective of being tasked with coming up with ways to improve the web as an application development platform.

From my conversation, which may or may not reflect reality:

There seem to be two schools within the Google App Store team. The first (and seemingly most common) believes that there's nothing terminally broken about the DOM, HTML and JavaScript; they just need to find a way to make it easier to implement re-usable UI components, perhaps along with a common model for namespaces, objects, and other small improvements to JS.

The second seems to believe that a fundamental re-imagining is required -- perhaps dropping the DOM entirely in favor of rendering with canvas and a traditional view/event hierarchy, and opening the playing field to more languages (e.g., via NaCl).

I think the competing strategies within Google and the clear differences of popularity between them -- Android, Chrome/ChromeOS/Chrome AppStore -- provide an enlightening view as to the likely future of application development on the internet.


HTML+JS libraries -- such as jquery -- are just fine for interactive document publishing, but are no replacement for Cocoa, Android, or Qt. The lack of common re-usable and extensible UI components is a travesty.

Cappuccino http://cappuccino.org/

SproutCore http://www.sproutcore.com/

Ext JS http://www.sencha.com/products/js/


I don't want to dismiss these out-of-hand, but they're really not a replacement for a modern application development framework.

Beyond functionality (of which these can and do provide considerably less), part of what makes a platform like Cocoa/UIKit so valuable is that it is used by every application on the device. The applications will interoperate both with each other and with the OS (background services/tasks, quicklook, spotlight, etc, etc, etc). The interaction model will already be familiar to the user, visual cues will be understood, and the speed and inertia of animations (such as scrolling) will be expected. There is an enormous value to being able to leverage user expectation, and this is where "too many choices" falls down.


I have seen many apps (on all platforms -- iPhone, Android, OS X, Windows, etc.) that deviate from user expectations. Ultimately, meeting user expectations is the responsibility of the developer not the framework.

HTML has its own set of visual cues that you and millions of others easily interact with every single day. I would argue that the interaction model of HTML/JS apps may be as familiar or more familiar to users.

I don't disagree that HTML/JS apps can be difficult to develop, but I do not think they are going to "lose". (I don't think they are going to win either. It's not a win/lose situation.)

Since it seems that your background is in native applications, I just wanted to provide you with some references to frameworks that provide something a little more advanced than jQuery and interactive documents.

Obviously, each team needs to look at its project and goals and choose whether a native app, an HTML/JS app, or both is appropriate.


I'd like to point out a couple of things about these choices for examples of rich UI libraries: all three of them throw out most of the DOM and reimplement almost everything from scratch: structure, layout, and most behavior (outside of that provided by input elements that can't realistically be implemented by hand). I don't think you can ask for a stronger indictment of the DOM and related APIs with respect to implementing applications.

And that's the problem. It's not [p]NaCl vs. Javascript. It's whether the core APIs are designed for the task, which in the case of web applications they clearly aren't. The most egregious failings of web applications stem directly from bad APIs, and if you look at their counterparts (for example) in iOS and Android, there's absolutely no comparison. A better script engine (and yes, almost anything would be better than Javascript) would be nice, but would do very little to solve the API problem (don't forget that NaCl can't call anything not provided by the browser, so it can't serve as an "escape hatch").

There are those who suggest that the browser APIs are close to "good enough" and just need to be tweaked a bit to make it possible to create good, efficient web apps on par with modern native apps. I'd like that to be the case (my day job is working on browser application tooling and infrastructure), but so far it's not happening where it matters most -- on mobile devices. On the desktop you can usually get away with murder because the machines are so fast (and have so much memory), but on more limited mobile platforms (and by limited I mean ~1GHz ARM with ~0.5G of memory) web apps aren't even close to their native counterparts. There are some laudable attempts out there (e.g. Gmail mobile, the Sencha Touch demos), but if you poke on them a bit you start to notice obvious performance and memory issues (e.g., when Safari Mobile crashes on the iPad it's usually out of memory).

Does this mean native apps in app stores will "win", and the web will "lose"? The web "won" on the desktop (for most kinds of apps, anyway), in spite of its severe limitations, because it provided a reasonably stable platform and super-simple deployment. App stores solve the deployment part of this problem reasonably well, but fail badly at the cross-platform part. Can the web be made "good enough" in time to win again? I don't think anyone can accurately predict the outcome at this point.


Yeah, well, the web is not native, it's not meant to be, and it never will be, and I for one am glad that it's so different from native development.


Mozilla would be happy to replace Qt/Cocoa/... with HTML+CSS+JS (or XUL). So it's obvious they'll be highly reluctant to see the exact opposite happen.

Microsoft once tried this in Windows 98 (thanks to a tightly-integrated IE) - and largely succeeded. Since then, the attempts to "bring the Web to the desktop" have been fairly constant.


Yet another new one is the Qt Web Runtime: http://qt.nokia.com/developer/qt-roadmap (for lack of a better link, really)


Not disagreeing with you, but just one point:

> Saying that the Open Web needs an app-store to compete with the walled gardens is also a fallacy.

Chad doesn't say that the open web needs an app store, just that "it needs a realistic answer to native code." I assume he wants to avoid a future of fast app store apps vs. slow open web apps, not encourage the creation of a one-stop shop for web apps.


NaCl is one of those technologies that I secretly hope fails. Pull the plug on x86 already, FFS.

There's no reason why NaCl should be inherently limited to x86. x86 is currently the biggest platform, but there's no reason why NaCl couldn't switch to a "Universal" payload like OS X has, combining the two largest ISAs (x86 and ARM?).

EDIT: I've since read that NaCl is working on LLVM, which would be even better.


1. LLVM is not portable. PNaCl is trying to make it so, but it's a work in progress.

2. A 'universal' payload with ARM and x86 would work on just those two. What if in 5 years we have new architectures? Supporting only ARM and x86 would hold back innovation there.


1) Yes, it's a work in progress, just like NaCl.

2) You're posing a what-if for 5 years down the road? Your horizon is really that far? Sounds like grasping at straws to me.


Are you saying we shouldn't think 5 years ahead?

GP was right, including just x86 and ARM is not good enough. Heck it even ignores x86_64 right now, and won't support new SIMD extensions that come out from time to time. Not to mention be a barrier to anyone introducing a new arch, as GP said.

Thinking ahead here is vital.


Thinking ahead is good, but precluding a technology because a particular implementation would have to be changed in 5 years is just flat-out silly for tech. You couldn't ever buy a car with criteria like that. Grasping at straws.


The point is that if you have only x86 and ARM binaries, you can't run them directly on a new architecture. It isn't a matter of changing an implementation.

It's like trying to run a C64 binary today. The only practical way is emulation, which is slow - but thankfully fast enough in this case. In general though, it means new architectures will run more slowly than existing ones. That's not a good thing.


By then just recompile it for the next wave of tech/standards.


Do you think we still have the source code to all C64 binaries out there? We don't. And in 5 years, we won't have the source to all the stuff we are running now.


Fine, those things can just die or limp along.


I doubt the app-store model will survive long term

In general, yes, but there's a good chance more niche stores like Steam will survive for a fairly long while.

* Compared to, e.g., either of the Apple app stores, Steam is a very open platform: there are very few demands imposed on the publisher, and developers can release updates whenever they wish, for example. The only other platform that allows the sort of iterative development Steam provides is the Web.

* It provides a tremendous value-add: infinite game downloads, automated patching and save file syncing in the cloud. Of course, none of these are issues with a Web-based game, but at least some of the stuff available on Steam isn't going to be possible in the browser any time soon.

* Games are generally much less fungible than the average application, Valve's own games even more so.

* Valve's earned the trust of most PC gamers, including mine. It's probably (percentage wise) the most trusted app store vendor currently in business.


It's ironic to me how self-contradictory the title is. NativeClient is not a way to make the web more open; in fact, it's a way to make the web more binary/obscured.

Low-level memory access, pointers and the like are the 'horrors' Java/C#/<name your high-level language> programmers are running away from. The author fails to point out why anyone would want low-level memory access.

> Preemptive response: But NativeClient is x86! Basing the open web on a peculiar, old instruction set is a terrible idea! That’s why I point to LLVM and Portable NativeClient (PNaCl). It’s not a burden to target PNaCl by default, and cross-compile to x86 and ARM if they matter to you.

This seems to imply that the browser should have a compiler that compiles the low-level bytecode into real machine code. The author should realize that this would be almost identical to running an SWF or a Java plugin, which makes the whole idea pointless.


Hi, original author here,

Responding to your bytecode argument, modern JavaScript JITs are already compiling JavaScript to machine code. That means JavaScript is becoming the de facto bytecode of the web. Then the argument becomes "what's the most appealing bytecode for the web?" I'd argue that SWF isn't (closed), Java isn't (for the same memory layout and language translation issues I discussed), and JavaScript isn't (memory layout and language translation). Sandboxed LLVM makes a much better intermediate format in a world where web applications have the same capabilities as native applications.
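To make "de facto bytecode" concrete, this is roughly what machine-translated JavaScript tends to look like (a hypothetical, emscripten-style sketch; the names are illustrative): a typed array stands in for the C heap and "pointers" become integer indices.

    // Hypothetical sketch of C compiled to JavaScript: HEAP32 plays
    // the role of memory, and a pointer is just a byte offset into it.
    var HEAP32 = new Int32Array(1024 * 1024);
    function _sum(ptr, n) {
      var total = 0;
      for (var i = 0; i < n; i++) {
        total += HEAP32[(ptr >> 2) + i];  // ptr>>2: byte offset to int32 index
      }
      return total;
    }

If we're going to ship that anyway, a sandboxed, well-specified intermediate format is the honest version of the same idea.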

"The author fails to point out why would anyone want low level memory access."

Please read Tom Forsyth's postings that I linked at the top of mine. Basically, in the last 30 years, clock speeds have gone through the roof, while memory latencies have improved far more slowly, to the point where a single cache miss costs hundreds of cycles. Thus, memory is a primary concern in any application where low-level performance matters, like the ones I listed (games, simulations, video, DSP).
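To make the layout point concrete, compare the idiomatic JavaScript encoding of a hot loop with what contiguous storage buys you (a sketch; the entity arrays are hypothetical):

    // Idiomatic JS: an array of heap-allocated objects. Every property
    // access may chase a pointer to a different cache line.
    for (var i = 0; i < entities.length; i++) {
      entities[i].x += entities[i].vx;
      entities[i].y += entities[i].vy;
    }

    // What native code (or, to a degree, typed arrays) allows: hot
    // fields packed contiguously, so the hardware prefetcher can help.
    // xs, ys, vxs, vys are Float32Arrays of length n.
    for (var j = 0; j < n; j++) {
      xs[j] += vxs[j];
      ys[j] += vys[j];
    }

In JavaScript you get no say over the first layout; in C you choose.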


> JavaScript JITs are already compiling JavaScript to machine code

The details of how the interpreter works and of what it interprets are irrelevant to this discussion. It's JavaScript code that's being transported, not the compiled binary. While I agree tools like Closure somewhat obscure the resulting JavaScript, it's still source code that's being downloaded to my environment and it's my browser's job to decide how it should be executed.

I would have nothing against a site that sends me source code to be compiled within my computer so it could run inside a sandbox, but I won't like when Facebook starts pushing binaries I should trust won't break out of the sandbox they should respect. You can't easily do static analysis on binaries.

And I would love to be able to browse the web on my SPARC, POWER and MIPS boxes.

One good reason for JavaScript being slower than C is its typing. If we could write type-strict JavaScript code, it should not compile to less efficient binaries than the corresponding C. BTW, incremental compiling has been around since the '80s - code can be compiled as quickly as it can be transferred over HTTP.
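Roughly what I mean by type-strict (a sketch): a modern JIT can specialize the function below to raw machine arithmetic as long as it only ever sees numbers; one stray string forces it to throw the compiled code away.

    // Monomorphic: always called with numbers, so the JIT can emit
    // a machine-level add after a couple of warm-up calls.
    function add(a, b) { return a + b; }
    add(1, 2);
    add(3.5, 4.5);

    // This single call breaks the "numbers only" assumption: '+' must
    // now also mean string concatenation, and the specialized code is
    // deoptimized.
    add("1", 2);

A type-strict dialect would simply forbid that last call, and the compiler could keep the fast path forever.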


You can't easily do static analysis on binaries.

The binary format in question here is LLVM-BC, which is just a compact representation of LLVM-IR, a static single-assignment representation specifically designed for static analysis. SPARC, POWER and MIPS backends already exist, FWIW.


Well... This solves one problem.


It solves the only problem you brought up in the comment he was responding to, AFAIK. If you meant to point to others, I think you need to be more explicit or people will miss them.


So, you would be happy if a banner served through Facebook could push an obfuscated binary blob to be executed in your browser (good guys may well play by the rules, but bad guys will figure out in no time how to circumvent anything LLVM-BC brings to the table)? How much would you trust the LLVM-based sandbox?

Well... I wouldn't.

In fact, I can't understand why taking more or less the same shortcut to a dead end that Java took a decade and a half ago is suddenly a good idea, and why disagreeing with it means dooming the web to failure.


How much would you trust the JavaScript-based sandbox?

I don't see why you'd be more afraid of the blob because it's binary rather than obfuscated plaintext. I mean, heck, they could compile the LLVM bytecode to the exact same JavaScript bytecode they're currently using. If you're not afraid of JavaScript, I don't understand the objection you're raising here.

That's what I'm getting at here. Executing remotely downloaded code is scary, but we already do that.

And the problem with Java was that it had terrible performance. The idea of NaCl is that it will actually perform better than what we have now.


> And the problem with Java was that it had terrible performance.

It had, indeed, terrible performance in 1996.

> The idea of NaCl is that it will actually perform better than what we have now.

Again, I see no reason why an improved JavaScript language/runtime would not perform as well as this LLVM-based solution, with the added benefit of building upon 15 years of knowledge on how to secure (and not to secure) a JavaScript-based sandbox. This is a whole new can of worms we don't have to open. We have something that mostly works, that has been battle-tested for more than a decade, and instead of improving on it, some (very clever) people are pushing a whole new stack. Convince me this is the sane solution.


Again, I see no reason why an improved JavaScript language/runtime would not perform as well as this LLVM-based solution...

The original article here does an excellent job of explaining why. Did you read it? In particular, his reference to Tom Forsyth's article on Moore's Law versus Duck Typing is very informative. And his reference to the game Supreme Commander makes it pretty clear what level of performance he would like to see web-deployable pieces of code achieve.

with the added benefit of building upon 15 years of knowledge on how to secure (and not to secure) a JavaScript-based sandbox

Those years of experience can be brought to either solution, can't they?

I watched the video that junkbit posted a link to here, and they appear not to trust the llvm-bc. Once the bitcode is translated to a native executable, they run a verifier on the resulting binary, and if the verifier is unable to prove that the only instructions that can execute are those in the binary, they have a strict policy of not letting it run. In addition to that, the translator itself runs as a NaCl module, so that if a bug is found, it cannot be maliciously used to escalate privileges.

Their approach seems pretty reasonable to me.


While you're right that memory is the bottleneck in many CPU-intensive programs, the idea that you need low-level access to circumvent it is far from obvious. For example, several well-known people have argued that C makes it more difficult to use memory effectively, because pointers make the compiler's job so difficult (Fortran still beats C on most numerical benchmarks).

I think we are at a point where architectures are so different that even though controlling memory access patterns is, in theory, potentially more powerful, in practice it is impossible to do right except when you can spend an insane amount of time on it. The difference between the P4 and the Core Duo, for example, is enormous as far as organizing memory accesses goes. This is exactly like ASM vs C: you can still beat C with ASM, but doing so across all architectures is almost impossible by hand.


> like the ones I listed (games, simulations, video, DSP).

Those should not run in a browser; they should run on a real OS (or Emacs).

What's next, VMWare inside your browser? Then it's OS -> Browser -> VM -> OS -> Browser...

Sorry but just because things are possible, doesn't mean that they should be done...


Games should not run in a browser? That's a funny claim. Try telling that to:

1. Millions of users, who play them happily each day.

2. Sites like Kongregate, Armor Games and Newgrounds, whose business is to publish them.

3. Sites like FlashGameLicense, whose business is to help the business of developing and publishing games that run in browsers.

If something can be done, someone will probably try to make a business out of it. If it catches on, then people who say "just because something is possible, doesn't mean it should be done" are wasting their breath.


but you can't just right-click on a binary executable and "view source", bro! ~

Seriously, people managed to write successful, cross-platform software without expecting everyone with any kind of gadget or device to run it with one click.


When was this? Atlantis? I don't remember any point in history where this was simple and commonplace.


"Those should not run in a Browser, they should on a real OS (or Emacs)."

Why? It was not so long ago that people would have said the same thing about an email client.


> NativeClient is not a way to make the web more open, in fact it's a way to make the web more binary/obscured.

I'd disagree. Minified JS source is about as "open" as LLVM bytecode - you can't read either with your eyes. And they are both standardized, have FOSS implementations, etc.
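To illustrate, here's a made-up but typical bit of minifier output:

    // One (hypothetical) line of "view source" on a modern site:
    function e(t,n){for(var r=0,i=t.length;r<i;r++)n(t[r],r)}

Nobody reads that either; it's "source code" in name only.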

Openness vs obscurity is almost completely irrelevant to Javascript vs PNaCl. One can obfuscate code in any language. Openness is important, but it's a completely different matter.

> Low level memory access, pointers and the likes are the 'horrors'

It seems that everyone is missing the main point of PNaCl. PNaCl is NOT a tool to give programmers a headache with manual memory management. It IS a tool to give them the ability to write in Python, Ruby, Perl, Haskell, Go, C++ and so on - in any language that can be compiled to LLVM bytecode. And to mix them to their heart's content.

PNaCl is - as I see it - primarily an attempt to get one important thing right: to not misuse JavaScript as a weird sort of bytecode, and to give the web-as-platform some much-needed language diversity instead of The One Standard Language (JavaScript).

> This seems to imply that the browser should have a compiler that compiles the low-level bytecode

They have already been doing that for a long time; V8, TraceMonkey and Carakan are examples. The whole point of PNaCl is to give browsers a proper bytecode and to use JavaScript properly - as a programming language, not as universal assembly code for the web.

There are many obstacles, unsolved problems and distractions (like x86-only NaCl), but the overall direction is right.


> The author fails to point out why would anyone want low level memory access.

To get out of the sandbox and wreak havoc? No really, NativeClient is the __last__ thing the Web needs. In the end people will either port their old, bug-ridden and insecure C++ code to it, or they will write new platform-dependent code... or both at the same time. That's completely against the Open Web.


> To get out of the sandbox and wreak havoc?

NaCl, on the whole, is sound. There have only been a few attacks against it, and those were largely during the "come own us" phase. I'm sure there will be further attacks against it, but that is the least of the issues in play here. Cross-platform support and compatibility with existing standards are by far the more important ones, not to mention the benefits of code you can optimize as you see fit (unlike Javascript, Flash, and other high-level languages).

Worrying about sandbox escapes from NaCl is silly when you consider the insane attack surface that existing browsers expose to the JS engine.


I don't worry about NaCl itself; I worry about someone who's too stupid to implement it correctly. And those people are everywhere, especially in big companies. E.g. Nintendo's broken (self-built!) RSA on the Wii, or even better, Sony using the same "random" number for all their crypto on the PS3... yes, Fx and Chrome are open source, but MS and Opera are not.

Anyways, I'd rather spend a lot of time improving the JITs instead of writing "optimized" low-level code myself these days.


You have two choices: use Google's implementation, which is open source and licensed such that it can be used effectively anywhere, or implement it yourself. Implementing it yourself, as long as you follow the NaCl "spec" (a term I use very loosely here) is pretty simple, although it isn't without pitfalls; you should use the existing implementation unless there's a good reason to do otherwise, though.

Personally, I'm a huge fan of the everything-managed approach (hell, I started a pure-managed OS project for a reason), but I don't think that's a reason to avoid NaCl.


Most standards bodies require at least two independent implementations of a specification before labelling it a standard.


Apparently ISO is not one of them; OOXML required zero implementations.


I'm sure there will be further attacks against it, but that is the last issue in play here.

I'm sure this attitude will survive for many years to come, although it really shouldn't.


Forget all this crap... if you need to remote out the UI part of some complicated app, just write a Java front-end and deploy it using JNLP. Let web-browsers remain good at what they're good at, and what they were meant for: Browsing the web. Web browsers make great hypermedia navigation / browsing tools... but they're really not so great at being the universal standard remote client interface for complex applications. :-(


Yeah, that whole applications-on-the-web thing is totally just a fad. I'll get right on rewriting Google Maps using a Java front-end.


I don't think Google Maps does any heavy lifting and number-crunching in javascript in the browser; in fact, all it does in js is "remote out the UI part".


I never said it did, but was responding to:

Let web-browsers remain good at what they're good at, and what they were meant for: Browsing the web. Web browsers make great hypermedia navigation / browsing tools... but they're really not so great at being the universal standard remote client interface for complex applications. :-(


Which is exactly the one thing the grandparent said to use Java for.


For the most complicated apps you are right, but most applications are not that complicated, and web technology solves problems that Java has not even recognized as problems: things like advanced accessibility, semantics, device independence (SVG, CSS Media Queries), true open standards, and more. You can do much more with a web browser today than "browse the web"; browsers have made amazing progress over the last 5 years and will continue to over the next 5 as well.


I disagree with the article's sentiment. I'm glad the web is focusing on Javascript. Here's why: the web is a nice, text-based environment that is safe to execute on your computer. Each web resource may contain active scripts but they are pretty innocuous.

Native applications are binary and can do all sorts of nasty things. Sure, this sandbox is supposed to be safe, but what if it's not? When an application is delivered over the web, one should really make sure that it wasn't somehow changed or sabotaged. Right now this is impossible. At the very least, this proposal would have to be implemented so we can trust what is being downloaded: http://news.ycombinator.com/item?id=2024164

Here is what I suggest: native libraries should expose their objects to javascript, which should do the majority of the work. Kind of like PhoneGap does on the phones. These native libraries (like OpenGL, say) should be served from CDNs over httpc:// and the user agent can verify that they are safe after downloading them.
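The shape of it, from the script side (an entirely hypothetical API; loadVerifiedLibrary and the bundle URL are made up to show the idea):

    // Hypothetical: the browser fetches a native bundle over httpc://,
    // verifies it, and exposes its objects to script. Logic stays in JS.
    var gl = navigator.loadVerifiedLibrary("httpc://cdn.example.com/opengl.bundle");
    if (gl) {
      gl.clearColor(0, 0, 0, 1);
      gl.clear(gl.COLOR_BUFFER_BIT);
    } else {
      fallBackToCanvas();  // also hypothetical
    }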


According to Brendan Eich, the designer of Javascript:

    JS had to "look like Java" only less so, be Java's
    dumb kid brother or boy-hostage sidekick. Plus, I had
    to be done in ten days or something worse than JS would
    have happened.
(See comments from http://www.jwz.org/blog/2010/10/every-day-i-learn-something-... for that and more from him)

Given that constraint, JS is an amazing hack. But it's 2011 now. Why should we have to use Javascript for client code in the browser--why shouldn't we be able to use Ruby, or Python, or C#, or Clojure, or Haskell, or F#, or Lua, or whatever else we want? It seems ridiculous to me that we're having a resurgence of language innovation on the desktop and on the server, but not in the browser.


We can. http://mozakai.blogspot.com/2010/08/emscripten-llvm-to-javas... or its kin will definitely see a place in the future of web development.

x86 is a sucky architecture to target, but we target it because that's what everything runs on. Path of least resistance. The same will happen to JavaScript as a bytecode target, inevitably, and when it's abstracted away and browsers speed up even further, we won't be bothered by the pain points; we'll just use the implementation.


Can you tell me how this is different from, say, Java? The fact that you are happy to run arbitrary binary code that was compiled for a specific platform? So now we are supposed to "write once compile everywhere" like C++, and then ship that to platforms? That is not in the spirit of the web AT ALL. If you want Java, at least THAT is write once, RUN anywhere. And why isn't running Java more seamless? I think answering THAT will be helpful to the original discussion.


Because ECMAScript turned out to be pretty awesome, and is now the de facto standard on the web.

If you don't like Javascript, your best alternative right now is to make a language that compiles into it. Take whatever "nice" features you want. Here was my crack at it: http://news.ycombinator.com/item?id=2044752


Creating a language that compiles to JavaScript without essentially being an alternative syntax for JavaScript or being agonizingly slow is surprisingly difficult. I'm not aware of a single one. Do you know of any? Your "language that compiles into it" falls into the former category.


Languages like CoffeeScript do fall into the "alternative syntax" category. And things like emscripten (and some others I've seen) probably fall into the latter. If you try to abstract out a traditional VM on top of Javascript, you're likely to get poor results.

As another commenter suggests, though, we have a language like Haxe, which is much more than just an alternative syntax. And GWT does a very good job with Java -- with the exception of some corner cases, it's usually as fast as hand-written Javascript, and often faster because of a great deal of static optimization. The Google Closure compiler arguably defines a different language (if you turn on "advanced optimizations", it only accepts an effectively-statically-typed Javascript variant).
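For the unfamiliar, that Closure variant is ordinary JavaScript plus JSDoc type annotations that the compiler checks and optimizes against; a small sketch:

    /**
     * Under advanced optimizations, lying about these types gets your
     * code miscompiled, which is why it's fair to call this an
     * effectively statically-typed dialect.
     * @param {number} scale
     * @param {!Array.<number>} xs
     * @return {number}
     */
    function scaleSum(scale, xs) {
      var total = 0;
      for (var i = 0; i < xs.length; i++) total += xs[i];
      return scale * total;
    }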


Off the top of my head I know of haXe and the York Haskell Compiler. And WebSharper for F# allows you to do some cool stuff.


Let's ignore the philosophical or design issues surrounding native client and look at a practical one.

Why does the article take issue with Mozilla alone? Surely they aren't the only browser vendor that won't be implementing native client. Mozilla is being singled out here precisely because other major platforms are considered to be lost causes.

Safari and Internet Explorer are unlikely to support NaCl for obvious competitive issues. Heavily curated platforms like the iPhone prevent even third parties from supporting such a feature.

Once you realize that even with Mozilla support you'd still only be looking at a ~60% penetration, you're going to be working around it anyway. Once you're dictating platforms, plugins or providing a fall back implementation I'm not sure if support in one specific browser is going to make or break anything. If you're willing to target only half the web you're simply not that concerned about ubiquity to start with.


Once Mozilla and Google support it, that puts a lot of pressure on the others. It might still not have happened, but Mozilla was the swing vote.


That doesn't guarantee success. Google's and Mozilla's pressure on video codecs hasn't changed Microsoft's or Apple's minds.


The only way I can see PNaCl catching on is if Google deploys it in Chrome and the Android Browser and creates an automatic fallback to a javascript PNaCl interpreter for the browsers that don't have it. Then it becomes about "why is Firefox/IE/iOS slower at running this webapp?"

But even then you'd need Google or someone else to deploy some interesting PNaCl apps to make having it worthwhile.

It's a pity that this is such a long shot because JS isn't a particularly good language and the stack we seem to be heading towards (and that Mozilla favors) is something like CoffeeScript -> Javascript -> bytecode -> machine code. Javascript doesn't seem like a very interesting compiler target or an easy language to make fast[1]. Maybe the new ES5 strict mode or some other subset of javascript can be agreed upon as a basis for compilers, that is easier to run fast. Then that can be the IL for the web.

[1] Best implementations are 3-5x slower than the JVM according to http://shootout.alioth.debian.org/


Well, JS is very dynamic, and people have long focused on optimizing statically typed, compiled languages. The JVM did awesome things in the area of JITs, and V8's Crankshaft is already pushing the limits once again, with up to 3x gains over the current version in the browser. There's still a lot of potential in optimizing a language like JavaScript, but Rome wasn't built in a day; give JS some more time.


I just find all that engineering effort such a waste when it is optimizing Javascript of all things. I've seen a lot of arguments of why Javascript isn't "that bad" but few about why it is actually good when compared to comparable languages.

I see a lot of good coming out of building VMs for languages more dynamic than Java, but there is so much movement toward running away from writing pure Javascript (CoffeeScript/GWT/etc.) that if you're going to build a JS VM, you might as well define a strict subset of the language to be optimized and let everyone target that when building VMs and languages.


Don't hold your breath. Everyone seems to be predicting huge gains in Javascript performance based on two data points (it used to be really, really slow, and now it's only 5-10x slower than native). But an incredible amount of effort has been poured into Javascript optimization, and it's crawled way up the asymptote, so to speak.

The reason Javascript performance is likely to max out well below that of native code (and even below less "dynamic" dynamic languages like Lua) is that it's freakishly dynamic. JITs for dynamic languages resolve dispatches (this.that) efficiently by making assumptions based on the state at the time a function (or trace) was compiled. When those assumptions change, they have to fall back and either interpret or recompile. In Javascript, there are a lot of things that can change to break these assumptions (e.g., anything, anywhere, in the prototype chain) - much more so than in more straightforward dynamic languages.
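A concrete sketch of the kind of assumption-breaking that's always one statement away:

    function Point(x, y) { this.x = x; this.y = y; }
    Point.prototype.norm = function () {
      return Math.sqrt(this.x * this.x + this.y * this.y);
    };

    var p = new Point(3, 4);
    // After enough iterations the JIT specializes this call site.
    for (var i = 0; i < 1000000; i++) p.norm();

    // Any code, anywhere, can do this at any time, invalidating every
    // compiled assumption about norm() on every Point in the program:
    Point.prototype.norm = function () { return 0; };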

It's also not really true to suggest that optimizing dynamic languages is somehow "uncharted territory" and that we should expect huge gains as people explore the space of solutions. Most of the techniques used in VMs like V8 trace their history all the way back to Smalltalk (and Self, sometimes via HotSpot).


There's no shortage of failed great technology, unfortunately.

You need a way to get this launched, which means you need widely available clients (PNaCl), and you need the content, as otherwise nobody needs those clients you don't yet have, and you need the tools for developing that content, which aren't available, and you need a way to interest enough folks in this technology to adopt it, whether by pushing them (cash) or pulling them (cool, useful, solves my problem, etc).

And you need to sort out, and preferably prevent, the security attacks; decide how you're planning on providing content protection (yes, you're going to need some sort of copy protection to get more than token commercial content); handle the usual UI adoption issues for non-tech users (they're the big market, not the geeks); and deal with all the usual nasties that can derail or dissuade the early adopters of any technology.

Getting to critical mass with these sorts of products is entertaining in itself, and involves rather more thought and cash and effort than the technology does. Have you looked at how all that will happen here, rather than looking (just) at the (admittedly cool) technology?


And so the pendulum has swung back. I believe Mozilla's action is rational assuming my logic is not flawed. See if you can unravel it.

Premise: the more indistinguishable a browser gets from its underlying operating system, the more of that system's properties it must share. This stretches the abstraction thinner (all the way down to the machine), and hence increases the probability of leaks in said abstraction.

Speciation will occur across system-architecture peculiarities and cause splintering of browsers, destroying their main advantage: the strong guarantee that your deployed code is accessible across a vast range of platforms. That is, physical constraints and combinatorial considerations will make it very hard to write code that uses architecture-specific optimizations and assumptions while still falling back robustly across all devices. And if it can work everywhere, then it must not be directly exposing such a hardware layer, and then what is the point? The same can be achieved by optimizing JavaScript JITing; the JIT can take on the role of the device-specific optimized-code generator. Google wants to create an OS; Mozilla wants to improve the browser.

In particular, as the margins from speed, UI and features decrease, each vendor will become incentivized to avoid commodification and distinguish themselves from their competitors by moving faster than the glacial speeds of standards bodies and introducing incompatibilities, while being slow to pick up those of others. In essence, each browser would evolve into and become indistinguishable from a current OS, with all its pitfalls (isomorphic rather than homomorphic, as currently). And if we are targeting specific VMs, then we may as well factor out the browsers, as they are no longer a vital component of the equation. Completing the cycle, to be restarted with metabrowsers.

Seems to me that pushing native into the browser without carefully considering the tradeoffs is foolish. You cannot have uniformity without sacrificing diversity. This seems like the original Java dream rebooted. It seems to me that wanting the same UI and code to work everywhere, while taking advantage of the underlying hardware and automatically adapting and falling back on visuals and optimizations, is a pipe dream. That is, of course, until OSes and programs become intelligent and partially alive - at least microbe-level intelligence, and virus-like adaptability.

Aside: NaCl appears to have a decent amount in common with Silverlight, particularly in terms of tradeoffs, weaknesses and gains.


It's not clear from your comment that you have a good understanding of what Native Client is. You're making vague, barely technical objections to the browser getting "more indistinguishable … from its underlying operating system" and losing the minuscule amount of cross-platform capability it has now without explaining how a more open and potentially efficient platform for code execution than "Whatever text-based JavaScript interpreter the browser vendor decided to include" would necessarily do that.


My comment was an argument as to why Mozilla's choice not to introduce this dependency was not a bad one: there is no technical reason why the fast-execution, multiple-language advantages of NaCl cannot be architected into a JavaScript VM, and Mozilla's path does more to keep things open longer by using an already widely adopted, better-understood technology. The enemy you know is better than the enemy you don't, type of thing. Who knows what can of worms the concepts sandboxing relies on will contain? It must contain flaws, as all creations of humans do.

And then my opinion that browsers will evolve into the platform and not be separable from the OS. Stuff like NaCl simply accelerates that, by introducing a dependency on one company or creating a technology that invites splintering of implementations due to its complexity and uniqueness.


Considering NaCl in the context of papers like http://eprint.iacr.org/2010/594 (timing the CPU cache to break AES) is... interesting. You'd hope that Google would have considered such issues, but a quick search doesn't yield anything.


Are you saying that AES in JavaScript is more secure?


In the sense that Javascript crypto is horribly broken (see e.g. http://rdist.root.org/2010/11/29/final-post-on-javascript-cr...) but can't really be used to attack other applications running on your computer, yes. NaCl itself is probably fine-ish for implementing crypto protocols - it's just that it looks like a perfect vehicle for attacking other crypto implementations running on the same processor. (Well, except for the noise from running Chrome, but I still wouldn't use an SSH session while running NaCl.)


If NativeClient is available as a plugin, can't Firefox users benefit from the technology regardless of whatever Mozilla thinks? I like the idea of lightweight x86 sandboxes like Native Client and vx32, but I understand why Mozilla isn't very interested in them. As long as nothing is done to actively hinder their development and use by those who are interested, is there really a problem here?


The problem with plug-ins is that relying on them causes a huge drop-off in usage, at least in my experience. Requiring a download of some kind, even if it's just a plug-in, costs you about 30% of your possible users.

It sounds like Unity3D has similar numbers: http://forum.unity3d.com/threads/39362-Web-player-adoption


Obscuring code isn't a bad thing; it's what I miss about C/C++. It was so much easier to share a library without giving away the source code.

The increased power that NaCl would add would enable more disruptive applications than plain JS apps: client-side custom video encoding, browser-based distributed computing, and, yes, once again a better economy for buying and selling software libraries.


Think of it this way:

Flash, Java, Javascript, et al. are great for the web because they are "write once, run anywhere". The same source code runs everywhere. That is what the web is all about. HTML and CSS are not scripting languages, and they are also "write once, run anywhere". That is how the web delivers your programs.

Now, what will your Native Client do?

It will have COMPILED code for ONE platform. Like in C++ where you "write once, compile everywhere". Except you probably won't be able to compile everywhere. The point of the web is that any platform should be able to run your app.

On the other hand, I can see Native Client as being useful for extension libraries. You know, like PhoneGap plugins. The Javascript can test if the object is there, and if it is, use some standard interface (see the sketch below). You could build up a standard library of these, as long as it is available on a wide enough range of platforms. Certain APIs that were originally in Google Gears are already exposed by the browser through HTML5.
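Something like this, where "fastcodec" stands in for a hypothetical native-backed extension object:

    // Feature-test the native object; fall back to pure JS otherwise.
    var codec = window.fastcodec;  // hypothetical extension object
    if (codec && codec.encodeVideo) {
      codec.encodeVideo(frames, { format: "webm" });
    } else {
      encodeVideoInJs(frames);  // slower pure-JS fallback (hypothetical)
    }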

Look, I agree that it's more of a philosophical thing, and indeed you are welcome to make an extension for Mozilla and all the other browsers. But the security risks alone will make this a tough sell to INCLUDE in a browser -- harming the spirit of the web. Not only that, but the web is totally against "favoring one platform over another" ... it is BECAUSE of the web that the platforms are able to work better together.


Did you miss the PNaCl part? Check it out here: http://nativeclient.googlecode.com/svn/data/site/pnacl.pdf


People are going to want to target their C, Java, Python, etc. code to the browser. The only question is what is the better target, Javascript as a VM or a sandboxed LLVM?

Honestly, the inclusion of NaCl wouldn't change that much. JavaScript would still be the easiest path for most developers to choose. It would be the only language for which the browser hosts an interpreter and which requires no compilation step.


JavaScript does have a just-in-time compilation step in most implementations these days. A quick scan of llvm-bc by a back end would be equally invisible to the user.


I just meant the user has to compile their language of choice to the llvm byte code, as I don't see browsers hosting a bunch of interpreters for different languages.


Only the developer would need to do the step of compiling to llvm-bc, not each user. The browser need only host the back end JIT compilation step.


The ability to use a better language could very well make that the easier route even though it requires compilation. I mean, languages other than PHP and Perl are stupendously popular on servers these days despite the fact that they're superficially less easy to get up and running.


I agree that JavaScript may not be the optimal solution (our kids will hate us for making it the language of the web).

But why LLVM? Why not something more standard like CIL or Java bytecode? I would personally love to see CIL in all browsers. It compresses much better than LLVM bytecode...


The web is starting to come into its own thanks to relentless efforts to improve a limited set of tools: HTML, CSS, Javascript. What the web needs is not more tools, but for the tools it already has to continue to be improved.


Yeah, so relentless efforts to improve any other tools than those are just missing the web.


Google doesn't even enable NaCl by default on their Cr-48 laptops!


NativeClient is a bunch of horse poo and shouldn't be in a browser anyway.



