I don't even think Google "adopted" NaCl; it's still more of a private project of a few, compared to Chrome(OS) and Android.
NaCl is one of those technologies that I secretly hope fails. Pull the plug on x86 already, FFS.
Saying that the Open Web needs an app store to compete with the walled gardens is also a fallacy. Quite the contrary: the web competes with the walled gardens precisely because it doesn't have such a master record. You will see this pattern repeat itself; the chaotic nature of the web beats any attempt to impose "order" upon it. Cf. the dismal failure of dmoz and other curated directories compared to search engines.
If you want to use performance critical software that pushes the limits of your machine, download it, use Java or whatever.
I doubt the app-store model will survive long term; individual merchants and vendors tend to move faster than the infrastructure that houses their shops and supports their business (almost literally). The moment a few merchants diverge from the prescribed machinery of the market and exploit inefficiencies, as they will, is the moment you will see this model come apart at the seams. That is, if the entire model isn't killed along with its platform by changes in technology.
The one good example I can give you is the buildings built throughout Southeast Asia to house street vendors. Especially in Cambodia and Vietnam, governments built multi-story shopping centers, just piles of concrete, and invited vendors to push their carts in and congregate. The markets might have looked like proper malls at some point, but with time they become just shaded bazaars like any other street market, except this one isn't dried by the sun and has no drainage. Over time, the building has all the appeal of a street market, except you have to climb up some slippery stairs tightly packed with pickpockets. Meanwhile, most higher-end merchants gather around the vertical bazaars and open proper walk-in shops with more products, better presentation, and all the amenities of private property (fan, seats, water cooler, bathroom, TV/radio, etc.)
As far as NaCl goes, the absolute last thing the web needs is portability/endianness issues. I'm writing this comment on a phone (ARM), and I also browse on my desktop machine (x86-64) and my netbook (Atom, x86). (Not to mention the PS3, Xbox 360, Wii, and older Macs, all of which come with browsers.) Wanna make the web worse? Force developers to target several architectures for their apps, or greet users with "Your CPU isn't supported!" pages. Lots of webapp developers already have trouble with cross-browser issues!
I also agree with the app store assessment (1000 suns, etc.), but have little to add.
See the following video from the recent LLVM Developers' Meeting: "Portable Native Client," David Sehr, Google [mp4, 269 MB]
The problems with Native Client go beyond its currently x86-specific nature though. The Web is based on open standards, and a requirement for most standards is having multiple independent implementations. Native Client is a complicated enough technology that it might be completely impractical to spec it in sufficient detail and independently reimplement.
You might think this is only a technical issue of standards process, but a standard with only one implementation ends up de facto controlled by a single entity, even if the implementation happens to be open source. In practice, you'll get support for the platforms and CPU architectures that Google cares about, in their priority order. You can see how that might not be so appealing for an entity that doesn't want Google to be setting the agenda quite that much.
In addition to this, the security architecture of the whole thing seems pretty dubious. It does have a better attempt at security design than ActiveX. But given the basic approach (binary-level validation of binary code), it has a lot of attack surface.
All that being said, it's a neat project with a lot of hack value. It just doesn't seem like a great fit for the Web. It is likely more driven by Chrome OS at this point.
And if a single implementor is a bad thing, shouldn't you want Mozilla in on it?
The base Squeak/Pharo VM runs bit identically on over 50 platforms. The performance of the Lua VM and its suitability for ECMAScript is very attractive, though.
This is a bit too simple. The web is becoming a platform for rich interactive applications more and more. Nowadays we can use tons of applications without downloading and installing any software. Most people would agree this is a good thing.
Maybe NaCl or even some bytecode alternative have their disadvantages, but they solve a very annoying problem for sure.
The belief that JS+HTML is "just fine" and no significant evolution is required to support application development is absolutely incomprehensible; what other application platform available today has seen iteration as glacially slow as the web? The browser makers have consistently stuck to their "HTML+JS is fine for everyone" guns, ECMAScript has failed to evolve almost entirely, and no common application development framework or platform has emerged.
HTML+JS libraries -- such as jquery -- are just fine for interactive document publishing, but are no replacement for Cocoa, Android, or Qt. The lack of common re-usable and extensible UI components is a travesty. Proponents like to advance the idea that this chaos is a benefit -- choose whatever tools you want -- but the rising popularity and the massive investment in application development on native app stores demonstrates a market desire for something simpler: A common well-defined application development platform is valuable, and the only barrier to adoption in the web world was a trustable distribution mechanism -- app stores.
The fact that you think App Stores will fade away is demonstrative of a failure to understand why the app store is so popular -- ease of distribution, easy to use common development platform, ease of payment processing, common UI (instant user familiarity), performance (!), open development platform (no JS sandbox, not all languages must be filtered through JS).
Ask yourself why Google has ChromeOS on one hand, Android on the other, and is working on technologies like NaCL that could bring many of the advantages of Android to Chrome?
I also should mention, anecdotally, that I've had some long-winded conversations with a member of the Chrome App Store team -- me from the perspective of a long-time native developer, them from the perspective of being tasked with coming up with ways to improve the web as an application development platform.
From my conversation, which may or may not reflect reality:
The second seems to believe that a fundamental re-imagining is required -- perhaps dropping the DOM entirely in favor of rendering with canvas and a traditional view/event hierarchy. Consider opening the playing field for more languages (eg, via NaCL).
I think the competing strategies within Google and the clear differences of popularity between them -- Android, Chrome/ChromeOS/Chrome AppStore -- provide an enlightening view as to the likely future of application development on the internet.
Ext JS http://www.sencha.com/products/js/
Beyond functionality (of which these can and do provide considerably less), part of what makes a platform like Cocoa/UIKit so valuable is that it is used by every application on the device. The applications will interoperate both with each other and with the OS (background services/tasks, Quick Look, Spotlight, etc.). The interaction model will already be familiar to the user, visual cues will be understood, and the speed and inertia of animations (such as scrolling) will be expected. There is enormous value in being able to leverage user expectation, and this is where "too many choices" falls down.
HTML has its own set of visual cues that you and millions of others easily interact with every single day. I would argue that the interaction model of HTML/JS apps may be as familiar or more familiar to users.
I don't disagree that HTML/JS apps can be difficult to develop, but I do not think they are going to "lose". (I don't think they are going to win either. It's not a win/lose situation.)
Since it seems that your background is in native applications, I just wanted to provide you with some references to frameworks that provide something a little more advanced than jQuery and interactive documents.
Obviously, each team needs to look at its project and goals and choose whether a native app, an HTML/JS app, or both is appropriate.
There are those who suggest that the browser APIs are close to "good enough" and just need to be tweaked a bit to make it possible to create good, efficient web apps on par with modern native apps. I'd like that to be the case (my day job is working on browser application tooling and infrastructure), but so far it's not happening where it matters most -- on mobile devices. On the desktop you can usually get away with murder because the machines are so fast (and have so much memory), but on more limited mobile platforms (and by limited I mean ~1GHz ARM with ~0.5G of memory) web apps aren't even close to their native counterparts. There are some laudable attempts out there (e.g. Gmail mobile, the Sencha Touch demos), but if you poke on them a bit you start to notice obvious performance and memory issues (e.g., when Safari Mobile crashes on the iPad it's usually out of memory).
Does this mean native apps in app stores will "win", and the web will "lose"? The web "won" on the desktop (for most kinds of apps, anyway), in spite of its severe limitations, because it provided a reasonably stable platform and super-simple deployment. App stores solve the deployment part of this problem reasonably well, but fail badly at the cross-platform part. Can the web be made "good enough" in time to win again? I don't think anyone can accurately predict the outcome at this point.
Microsoft once tried this in Windows 98 (via tightly integrated IE) and largely succeeded. Since then, attempts to "bring the Web to the desktop" have been fairly constant.
> Saying that the Open Web needs an app-store to compete with the walled gardens is also a fallacy.
Chad doesn't say that the open web needs an app store, just that "it needs a realistic answer to native code." I assume he wants to avoid a future of fast app store apps vs. slow open web apps, not encourage the creation of a one-stop shop for web apps.
There's no reason why NaCl should be inherently limited to x86. x86 is currently the biggest platform, but there's no reason why NaCl couldn't switch to a "Universal" payload like OS X has, combining the two largest ISAs (x86 and ARM?).
EDIT: I've since read that NaCl is working on LLVM, which would be even better.
2. A 'universal' payload with ARM and x86 would just work on them. But what if in 5 years we have new architectures? Only supporting ARM and x86 would hold back innovation there.
2) You're posing a what-if for 5 years down the road? Your horizon is really that far? Sounds like grasping at straws to me.
GP was right: including just x86 and ARM is not good enough. Heck, it even ignores x86-64 right now, and it won't support new SIMD extensions that come out from time to time. Not to mention being a barrier to anyone introducing a new arch, as GP said.
Thinking ahead here is vital.
It's like trying to run a C64 binary today. The only practical way is emulation, which is slow - but thankfully fast enough in this case. In general though, it means new architectures will run more slowly than existing ones. That's not a good thing.
In general, yes, but there's a good chance more niche stores like Steam will survive for a fairly long while.
* Compared to, e.g., either of the Apple app stores, Steam is a very open platform: there are very few demands imposed on the publisher, and developers can release updates whenever they wish, for example. The only other platform that allows the sort of iterative development Steam enables is the Web.
* It provides a tremendous value-add: infinite game downloads, automated patching and save file syncing in the cloud. Of course, none of these are issues with a Web-based game, but at least some of the stuff available on Steam isn't going to be possible in the browser any time soon.
* Games are generally much less fungible than the average application, Valve's own games even more so.
* Valve's earned the trust of most PC gamers, including mine. It's probably (percentage wise) the most trusted app store vendor currently in business.
Low-level memory access, pointers, and the like are the 'horrors' Java/C#/<name your high-level language> programmers are running away from. The author fails to point out why anyone would want low-level memory access.
> Preemptive response: But NativeClient is x86! Basing the open web on a peculiar, old instruction set is a terrible idea! That’s why I point to LLVM and Portable NativeClient (PNaCl). It’s not a burden to target PNaCl by default, and cross-compile to x86 and ARM if they matter to you.
This seems to imply that the browser should have a compiler that compiles the low-level bytecode into real machine code. The author should realize that this would be almost identical to running an SWF or a Java plugin, which makes the whole idea pointless.
"The author fails to point out why would anyone want low level memory access."
Please read Tom Forsyth's postings that I linked at the top of mine. Basically, over the last 30 years clock speeds have gone through the roof, while memory speeds have improved by only a couple of orders of magnitude, so relative memory latency keeps getting worse. Thus, memory is a primary concern in any application where low-level performance matters, like the ones I listed (games, simulations, video, DSP).
I would have nothing against a site that sends me source code to be compiled on my computer so it can run inside a sandbox, but I won't like it when Facebook starts pushing binaries that I'm supposed to trust won't break out of the sandbox they should respect. You can't easily do static analysis on binaries.
And I would love to be able to browse the web on my SPARC, POWER and MIPS boxes.
The binary format in question here is LLVM-BC, which is just a compact representation of LLVM-IR, a static single assignment (SSA) representation specifically designed for static analysis. SPARC, POWER and MIPS backends already exist, FWIW.
Well... I wouldn't.
In fact, I can't understand why taking more or less the same shortcut to a dead end that Java took a decade and a half ago is suddenly a good idea, or why disagreeing with it means dooming the web to failure.
That's what I'm getting at here. Executing remotely downloaded code is scary, but we already do that.
And the problem with Java was that it had terrible performance. The idea of NaCL is that it will actually perform better than what we have now.
It had, indeed, terrible performance in 1996.
> The idea of NaCL is that it will actually perform better than what we have now.
The original article here does an excellent job of explaining why. Did you read it? In particular, his reference to Tom Forsyth's article on Moore's Law versus Duck Typing is very informative. And his reference to the game Supreme Commander makes it pretty clear what level of performance he would like to see web-deployable pieces of code achieve.
Those years of experience can be brought to either solution, can't they?
I watched the video that junkbit posted a link to here, and they appear not to trust the llvm-bc. Once the bitcode is translated to a native executable, they run a verifier on the resulting binary, and if the verifier is unable to prove that the only instructions that can execute are those in the binary, then they have a strict policy of not letting it run. In addition, the translator itself runs as a NaCl module, so that if a bug is found in it, it cannot be maliciously used to escalate privileges.
Their approach seems pretty reasonable to me.
I think we are at a point where architectures are so different that even though controlling memory access patterns is in theory more powerful, in practice it is impossible to do right unless you can spend an insane amount of time on it. The difference between the P4 and the Core Duo, for example, is enormous as far as organizing memory accesses goes. This is exactly like ASM vs. C: you can still beat C with ASM, but doing so across all architectures by hand is almost impossible.
Those should not run in a Browser, they should on a real OS (or Emacs).
What's next VMWare inside your Browser? Then it's OS -> Browser -> VM -> OS -> Browser...
Sorry but just because things are possible, doesn't mean that they should be done...
1. Millions of users, who play them happily each day.
2. Sites like Kongregate, Armor Games and Newgrounds, whose business is to publish them.
3. Sites like FlashGameLicense, whose business is to help the business of developing and publishing games that run in browsers.
If something can be done, someone will probably try to make a business out of it. If it catches on, then people who say "just because something is possible, doesn't mean it should be done" are wasting their breath.
Seriously, people managed to write successful, cross-platform software without expecting everyone with any kind of gadget or device to run it with one click.
Why? It was not so long ago that people would have said the same thing about an email client.
I'd disagree. Minified JS source is about as "open" as LLVM bytecode: you won't read either with the naked eye. And they are both standardized, have FOSS implementations, etc.
> Low level memory access, pointers and the likes are the 'horrors'
It seems that everyone is missing the main point of PNaCl. PNaCl is NOT a tool to give programmers a headache with manual memory management. It IS a tool to give them the ability to write in Python, Ruby, Perl, Haskell, Go, C++ and so on: in any language that can be compiled to LLVM bytecode. And to mix them to their heart's content.
> This seems to imply that the browser should have a compiler that complies the low level bytecode
There are many obstacles, unsolved problems and distractions (like x86-only NaCl), but the overall direction is right.
To get out of the sandbox and wreak havoc? No really, NativeClient is the __last__ thing the Web needs. In the end, people will either port their old, bug-ridden and insecure C++ code to that thing, or they will write new platform-dependent code... or both at the same time. That's completely against the Open Web.
Worrying about sandbox escapes from NaCl is silly when you consider the insane attack surface that existing browsers expose to the JS engine.
Anyway, I'd rather spend a lot of time improving the JITs than writing "optimized" low-level code myself these days.
Personally, I'm a huge fan of the everything-managed approach (hell, I started a pure-managed OS project for a reason), but I don't think that's a reason to avoid NaCl.
I'm sure this attitude will survive for many years to come, although it really shouldn't.
Let web-browsers remain good at what they're good at, and what they were meant for: Browsing the web. Web browsers make great hypermedia navigation / browsing tools... but they're really not so great at being the universal standard remote client interface for complex applications. :-(
Native applications are binary and can do all sorts of nasty things. Sure, this sandbox is supposed to be safe, but what if it's not? When an application is delivered over the web, one should really make sure that it wasn't somehow changed or sabotaged. Right now this is impossible. At the very least, this proposal would have to be implemented so we can trust what is being downloaded: http://news.ycombinator.com/item?id=2024164
JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.
Why does the article take issue with Mozilla alone? Surely they aren't the only browser vendor that won't be implementing native client. Mozilla is being singled out here precisely because other major platforms are considered to be lost causes.
Safari and Internet Explorer are unlikely to support NaCl for obvious competitive issues. Heavily curated platforms like the iPhone prevent even third parties from supporting such a feature.
Once you realize that even with Mozilla support you'd still be looking at only ~60% penetration, you're going to be working around it anyway. Once you're dictating platforms or plugins, or providing a fallback implementation, I'm not sure support in one specific browser is going to make or break anything. If you're willing to target only half the web, you're simply not that concerned about ubiquity to start with.
But even then you'd need Google or someone else to deploy some interesting PNaCl apps to make having it worthwhile.
Best implementations are 3-5x slower than the JVM, according to http://shootout.alioth.debian.org/
It's also not really true to suggest that optimizing dynamic languages is somehow "uncharted territory" and that we should expect huge gains as people explore the space of solutions. Most of the techniques used in VMs like V8 trace their history all the way back to Smalltalk (and Self, sometimes via HotSpot).
You need a way to get this launched, which means you need widely-available clients (PNaCl), and you need the content as otherwise nobody needs those clients you don't yet have, and you need the tools for developing the content which aren't available, and you need a way to interest enough folks in this technology into adopting it, whether by pushing them (cash) or pulling them (cool, useful, solves my problem, etc).
And you need to sort out, and preferably prevent, security attacks; figure out how you're planning to provide content protection (yes, you're going to need some sort of copy protection to get more than token commercial content); handle the usual UI adoption issues for non-tech users (they're the big market, not the geeks); and deal with all the usual nasties that can derail or dissuade the early adopters of any technology.
Getting to critical mass with these sorts of products is inherently entertaining, and involves rather more thought and cash and effort than with the technology itself. Have you looked at how all that'll happen here, rather than looking (just) at the (admittedly cool) technology?
Premise: the more indistinguishable a browser gets from its underlying operating system, the more of that system's properties it must share. This stretches the abstraction, making it shallower (reaching down to the machine), and hence increases the probability of leaks in the abstraction.
In particular, as margins from speed, UI and features decrease, each vendor will be incentivized to avoid commodification and distinguish itself from competitors by moving faster than the glacial speeds of standards bodies and introducing incompatibilities, while being slow to pick up those of others. In essence, each browser would evolve into something indistinguishable from a current OS, with all its pitfalls (isomorphic rather than homomorphic, as currently). And if we are targeting specific VMs, then we may as well factor out the browsers, as they are no longer a vital component of the equation. Completing the cycle, to be restarted with metabrowsers.
Seems to me that pushing native into the browser without carefully considering the tradeoffs is foolish. You cannot have uniformity without sacrificing diversity. This seems like the original Java dream rebooted. But wanting the same UI and code to work everywhere, while taking advantage of the underlying hardware and automatically adapting and falling back on visuals and optimizations, is a pipe dream. That is, of course, until OSes and programs become intelligent and partially alive. At least microbe-level intelligence. And virus-like adaptability.
Aside: NaCl appears to have a decent amount in common with Silverlight, particularly in terms of tradeoffs, weaknesses and gains.
And then my opinion that browsers will evolve into the platform and not be separable from the OS. Stuff like NaCl simply accelerates that by introducing a dependency on one company or creating a technology that invites splintering on implementation due to its complexity and uniqueness.
It sounds like Unity3D has similar numbers: http://forum.unity3d.com/threads/39362-Web-player-adoption
The increased power that NaCl would add would enable more disruptive applications than JS apps alone: client-side custom video encoding, browser-based distributed computing, and, yes, once again a better economy for buying and selling software libraries.
Flash, Java, JavaScript, et al. are great for the web because they are "write once, run anywhere". The same source code runs everywhere. That is what the web is all about. HTML and CSS are not scripting languages, but they too are "write once, run anywhere". That is how the web delivers your programs.
Now, what will your Native Client do?
It will have COMPILED code for ONE platform. Like C++, where you "write once, compile everywhere". Except you probably won't be able to compile everywhere. The point of the web is that any platform should be able to run your app.
Look, I agree that it's more of a philosophical thing, and indeed you are welcome to make an extension for Mozilla and all the other browsers. But the security risks alone will make this a tough sell to INCLUDE in a browser -- harming the spirit of the web. Not only that, but the web is totally against "favoring one platform over another" ... it is BECAUSE of the web that the platforms are able to work better together.
But why LLVM? Why not something more standard like CIL or Java bytecode? I would personally love to see CIL in all browsers. It compresses much better than LLVM bytecode...