
The only way I can see PNaCl catching on is if Google deploys it in Chrome and the Android Browser and creates an automatic fallback to a JavaScript PNaCl interpreter for browsers that don't have it. Then it becomes about "why is Firefox/IE/iOS slower at running this webapp?"

But even then you'd need Google or someone else to deploy some interesting PNaCl apps to make having it worthwhile.

It's a pity that this is such a long shot, because JS isn't a particularly good language, and the stack we seem to be heading towards (and that Mozilla favors) is something like CoffeeScript -> JavaScript -> bytecode -> machine code. JavaScript doesn't seem like a very interesting compiler target or an easy language to make fast[1]. Maybe the new ES5 strict mode, or some other subset of JavaScript that is easier to run fast, could be agreed upon as a basis for compilers. That could then serve as the IL for the web.

[1] Best implementations are 3-5x slower than the JVM according to http://shootout.alioth.debian.org/
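To make that stack concrete: today the first hop just produces more JavaScript source, which the browser then has to parse and JIT all over again. This is roughly (simplified; the real compiler adds a wrapper) what CoffeeScript emits for a one-line function:

    // CoffeeScript input (illustrative): square = (x) -> x * x
    // Roughly the JavaScript the CoffeeScript compiler emits:
    var square;
    square = function(x) {
      return x * x;
    };
    // The browser never sees the CoffeeScript; it parses, compiles and
    // JITs this generated source just like hand-written JavaScript.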

Well, JS is very dynamic, and for a long time people focused on optimizing statically typed compiled languages. The JVM did awesome things in the area of JITs, and V8's Crankshaft is already pushing the limits once again, with gains of up to 3x over the version currently shipping in the browser. There's still a lot of potential in optimizing a language like JavaScript, but Rome wasn't built overnight; give JS some more time.


I just find all that engineering effort such a waste when it's spent optimizing JavaScript, of all things. I've seen a lot of arguments for why JavaScript isn't "that bad", but few for why it is actually good compared to comparable languages.

I see a lot of good coming out of building VMs for languages more dynamic than Java, but there seems to be enough movement away from writing pure JavaScript (CoffeeScript, GWT, etc.) that if you're going to build a JS VM anyway, you might as well define a strict subset of the language that will be optimized, and let everyone target that when building compilers and languages.
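For illustration only, here is one shape such a subset could take; the function and the |0 integer-coercion convention are hypothetical examples, not any agreed-upon standard:

    "use strict";
    // Hypothetical restricted-subset style: no eval/with, and value types
    // are made obvious, so a VM (or a compiler for another language)
    // could treat code like this as a de facto typed IL.
    function sumOfSquares(arr, len) {
      var total = 0;
      for (var i = 0; i < len; i = (i + 1) | 0) {
        var x = arr[i] | 0;          // |0 pins the value to a 32-bit integer
        total = (total + x * x) | 0; // so the VM can keep it unboxed
      }
      return total;
    }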


Don't hold your breath. Everyone seems to be predicting huge gains in JavaScript performance based on two data points (it used to be really, really slow, and now it's only 5-10x slower than native). But an incredible amount of effort has been poured into JavaScript optimization, and it's crawled way up the asymptote, so to speak.

The reason JavaScript performance is likely to max out well below that of native code (and even below less "dynamic" dynamic languages like Lua) is that it's freakishly dynamic. JITs for dynamic languages resolve dispatches (this.that) efficiently by making assumptions based on the state of the program when a function (or trace) was compiled. When those assumptions are violated, they have to fall back and either interpret or recompile. In JavaScript, a lot of things can change in ways that break these assumptions (e.g., anything, anywhere, in the prototype chain), far more so than in more straightforward dynamic languages.
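A minimal sketch of that failure mode (the names Point and hot are made up, and the exact deoptimization behaviour varies by engine):

    function Point(x, y) { this.x = x; this.y = y; }
    Point.prototype.norm = function() { return this.x * this.x + this.y * this.y; };

    function hot(p) {
      // After enough calls, the JIT can cache or even inline the norm()
      // lookup, assuming Point.prototype (and everything above it) stays put.
      return p.norm();
    }

    for (var i = 0; i < 100000; i++) hot(new Point(i, i));

    // Any later change to the prototype chain invalidates that assumption,
    // so the optimized code gets thrown away and the VM falls back to a
    // generic lookup (and eventually recompiles).
    Point.prototype.norm = function() { return 0; };
    hot(new Point(1, 2)); // still correct, just slower until re-optimized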

It's also not really true to suggest that optimizing dynamic languages is somehow "uncharted territory" and that we should expect huge gains as people explore the space of solutions. Most of the techniques used in VMs like V8 trace their history all the way back to Smalltalk (and Self, sometimes via HotSpot).

