On Asm.js (acko.net)
240 points by scribu on Nov 27, 2013 | 182 comments



I feel like PNaCl is the technically superior approach - define a stable set of LLVM bytecode and build an interface to run it in the browser. But the uptake is a problem, no other browser maker wants to adopt a big chunk of code controlled by Google, tailored to run optimally in Chrome.

So asm.js took a beeline - leveraging existing Javascript machinery for security/JIT and shoehorning a way to run executable LLVM on top of it. The fact that it's Javascript is just a detail, a legacy of the days when one vendor (Netscape) was able to push through a standard for running code in the browser.

In today's fragmented browser landscape I find it hard to believe a consensus could be reached again. Most of the major browser makers (Google, Microsoft, Apple) have their own platform agenda to push - in that context asm.js's compatibility with existing Javascript engines gives it the best chance of adoption.

But I agree, the hoops that had to be jumped through are a damned shame.
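For readers unfamiliar with what "Javascript as a compilation target" looks like in practice, here is a minimal sketch of the asm.js shape (a hypothetical hand-written toy; real modules are emitted by compilers like emscripten). The `"use asm"` pragma plus the `|0` coercions are what let a validating engine type-check and compile the body ahead of time, while any ordinary JS engine just runs it as plain JavaScript:

```javascript
// A hand-written toy asm.js-style module (real ones are emitted by
// compilers). The |0 coercions annotate everything as 32-bit ints,
// so the whole function body is statically typable.
function AsmModule(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0;           // parameter type annotation: int
    b = b | 0;
    return (a + b) | 0;  // return type annotation: int
  }
  return { add: add };   // exported entry points
}

// Works in any JS engine; validating engines can compile it eagerly.
var asmAdd = AsmModule({ Math: Math }, {}, new ArrayBuffer(0x10000)).add;
asmAdd(2, 3); // 5
```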


> So asm.js took a beeline - leveraging existing Javascript machinery for security/JIT and shoehorning a way to run executable LLVM on top of it.

A big advantage of asm.js over PNaCl is that it's not LLVM.

On the other hand, PNaCl is very tightly coupled to LLVM. Any time I find myself checking whether some $feature is available in PNaCl, the discussion revolves around what LLVM can or can't do and how LLVM does it; see [1] for the most recent example.

I just can't see how PNaCl could be meaningfully reimplemented. Reimplementing it from scratch would require something bug-for-bug compatible with LLVM, which is a gargantuan task. At the other end of the spectrum, why not just use the original implementation? Then it would work very much like any other plugin; I don't think that's a good solution. The only remaining option seems to be implementing it on top of LLVM, but then you still share all the potential limitations (e.g. in portability) and all the security issues of every other LLVM-based implementation.

It could end up being not that bad (it means more work on LLVM itself, making it better), but I'm not sure that would offset the other problems. How many features that appeared first in Chrome or WebKit were then reimplemented elsewhere in a different, better way? Do we want to end up with only one browser? No: there are lots of things we wouldn't have if Firefox hadn't happened, and lots of things we wouldn't have if Chrome hadn't happened. Do we want to end up with only one PNaCl runtime?

[1]: https://code.google.com/p/nativeclient/issues/detail?id=3475


I think you're conflating frontend and backend. As they currently exist, both the asm.js and PNaCl frontends are LLVM, if only because it's a good solution for parsing and optimizing C/C++. The backend is indeed an LLVM subset as of today for PNaCl (and not for asm.js), but it's a very restricted subset: LLVM's complex bitcode definition was pruned a lot for PNaCl, both to distance it from LLVM churn and to make it implementable in another VM.

It is definitely possible to implement a non-LLVM backend for PNaCl.

The bug you quote is quite the opposite of what you state: Mark was explaining how complex and potentially fast-changing LLVM's representation of atomics is. Atomics need to express the full breadth of capabilities C11/C++11 has, but that has to be solved in a way that is close to the standard and no more: none of LLVM's quirks and legacy cruft (LLVM predates these standards; PNaCl as launched does not). The primitives also have to be packaged portably so that they can be expressed correctly on any hardware target (this happens to be tricky to get right, especially since Clang usually makes assumptions about the target). I think PNaCl got this one exactly right, but then again I'm the one who implemented this. :-)


> As they currently exist both asm.js and PNaCl frontends are LLVM (...)

I don't think it's fair to call LLVM a frontend for asm.js just because LLVM can compile to asm.js. By that logic LLVM is also a frontend for x86, which doesn't mean anything. Asm.js is a platform and LLVM is a compiler: it's expected that LLVM can target asm.js, and so can other compilers. Hopefully both asm.js and PNaCl will be targeted by multiple compilers not based on LLVM.

> It is definitely possible to implement a non LLVM back end for PNaCl.

Anything's possible. Reimplementing PNaCl from scratch (custom everything) will obviously take much more work than implementing it on top of LLVM. However, I honestly think there would be a large chasm between PNaCl on top of LLVM and PNaCl on top of any other virtual machine. And that is a problem.

EDIT:

I think I'm too focused on LLVM, sorry about that. What I should have written but didn't: PNaCl is hard to reimplement, and not only because it's a large project. For example, IIRC Chrome doesn't explicitly support the asm.js pragma and doesn't try to be compliant, but it already performs many of the optimizations from the asm.js spec, and the Chrome team appears receptive to changes that improve this sort of support. If things continue that way, soon it won't matter whether Chrome supports the pragma or not. I.e. right now Chrome supports (say) 10% of asm.js, and eventually it could support up to 90% with no problems. On the other hand, can Firefox implement PNaCl piecemeal? What would a 10% reimplementation of PNaCl look like? I think even 90%, or anything else short of a full implementation, would not be interoperable.


> I feel like PNaCl is the technically superior approach - define a stable set of LLVM bytecode and build an interface to run it in the browser.

I don't agree. I think that LLVM bitcode was not really designed for this either [1]. Google has not succeeded in eliminating all of the undefined behavior from LLVM [2]. At least asm.js has a fully defined semantics as specified by ECMA-262. For any hope of interoperability between browsers, that's critical. Things like order of iteration of properties were once left undefined by ECMA-262, but pages began to rely on Netscape's behavior and other browsers had to reverse engineer what Netscape did. Eventually they were added to the standard. Likewise, without all the undefined behavior removed from PNaCl, anyone who wanted a separate implementation of PNaCl would have to reverse engineer what Chrome did.

There is also the issue of compilation speed: asm.js was designed to compile quickly by a backend that doesn't do much optimization. LLVM, on the other hand, was optimized for -O2, which compiles much less quickly. PNaCl at -O0 is far worse than either V8 or OdinMonkey. On the Web, compilation speed matters a lot.

I think an actually technically superior approach, if backwards compatibility were not a concern, would involve:

1. A from-scratch, untyped, non-SSA bytecode, essentially isomorphic to asm.js. All operations would have fully defined semantics. This would yield fast compilation without the JavaScript syntax and would be amenable to multiple backends, such as LLVM or V8.

2. A mechanism for providing a C binding to WebIDL-based Web APIs. This would eliminate the necessity of Pepper and would mean that any new Web APIs would instantly be available to native code.

PNaCl, unfortunately, is neither of these. asm.js isn't either, but I think it's closer to this ideal than PNaCl is.

[1]: http://lists.cs.uiuc.edu/pipermail/llvmdev/2011-October/0437...

[2]: http://www.chromium.org/nativeclient/pnacl/stability-of-the-...


I think quoting Dan's email repeatedly is entirely unhelpful, and I'd appreciate it if you wouldn't do that. We've actually addressed every single point in Dan's email, talked to Dan repeatedly, and gone beyond what he originally wrote. Sure, it has a catchy title, but if you actually read it (and the replies it generated back then) and then compare it to what PNaCl does (including discussions on the LLVM mailing list), you'll see that I'm not making things up here. Our IR subset was designed to be portable and stable.

Undefined behavior: it's mostly gone by the time the pexe is created, and our intent is to remove what's left (e.g. shift by more than the bit width is trivial to fix). I think what remains is mostly non-issues that can't be easily fixed without breakage, so honestly I don't think undefined behavior is any kind of deal-breaker at this point. asm.js still doesn't have canonical NaNs, so it's not fully defined either ;-)

Note for other readers here: yes C/C++ have undefined behavior, but both PNaCl and asm.js settle on actual behavior before something gets shipped to the browser.
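The shift case is a concrete illustration of "settling on actual behavior": in C, shifting a 32-bit int by 33 is undefined, but JavaScript defines the shift count as taken mod 32, so anything compiled to asm.js inherits one fixed, portable answer. A sketch:

```javascript
// JS defines << to use only the low 5 bits of the count, so what is
// undefined behavior in C source becomes one fixed, portable result
// once the code is compiled to JS.
function shl(x, n) {
  return (x << n) | 0; // n is effectively n & 31
}
shl(1, 33); // 2, i.e. the same as 1 << (33 & 31)
```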

On compile speed: agreed, but I'll take the "it's getting better" approach here (hey, asm.js can do it for runtime!).

I'm not quite sure why having a non-SSA representation would be a requirement. I agree that there are advantages, but I think that's true of both approaches.

Extra bindings and non-Pepper: agreed.


> I think quoting Dan's email repeatedly is entirely unhelpful, and I'd appreciate it if you wouldn't do that. We've actually addressed every single point in Dan's email, talked to Dan repeatedly, and gone beyond what he originally wrote. Sure, it has a catchy title, but if you actually read it (and the replies it generated back then) and then compare it to what PNaCl does (including discussions on the LLVM mailing list), you'll see that I'm not making things up here. Our IR subset was designed to be portable and stable.

My point wasn't to imply that PNaCl has done a bad job moving mountains to make LLVM into something stable and portable—in fact, the project has done an incredible job with it. I'm just saying that LLVM is not really any more of a natural fit than JS is. JS starts with compatibility and defined semantics and needs to add a low-level type system to become a portable native runtime. LLVM starts with a low-level type system and needs to add compatibility and a defined semantics to become a portable native runtime. In neither case was the system designed for it, as Dan's email shows.

> Undefined behavior: it's mostly gone by the time the pexe is created, and our intent is to remove all that's left (e.g. shift by larger than bit width is trivial to fix). I think what's left is mostly non-issues that can't be easily fixed without breakage, so honestly I don't think undefined behavior is any kind of a deal breaker at this point.

I just wish PNaCl hadn't shipped with any known undefined behavior. (I don't believe NaN canonicalization problems were known at the time asm.js shipped.)

Does PNaCl still have undefined behavior if you access outside a valid allocated region, as that document states ("accessing undefined/freed memory")? It is defined in asm.js, as the application ships its own malloc and there is no type system to tell the runtime what kind of pointer something is. Supposedly Microsoft got into trouble with their libc's malloc because they couldn't change it without breaking apps that relied on accessing freed memory in certain ways. I can see that being true on the Web as well…
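To spell out why this is defined in asm.js (a self-contained sketch, not emscripten's actual runtime): the program's entire address space is one typed array, so a stale or wild pointer can only land inside that array or produce a defined out-of-range result, never in unrelated runtime memory.

```javascript
// The compiled program's entire "memory" is one typed array.
var HEAP32 = new Int32Array(new ArrayBuffer(0x10000));

function peek(ptr) {
  // In bounds: the stored value. Out of bounds: undefined | 0 === 0.
  return HEAP32[ptr >> 2] | 0;
}

HEAP32[4 >> 2] = 7;
peek(4);        // 7
peek(0x123456); // 0 -- defined, just not very useful
```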


Oh, I see your point about being a proper fit. It's not much of an argument then: you're just saying that making it work when it wasn't quite designed for that use is hard. But we did make it work. So I agree :-)

I also wish there weren't any UB, but we did try to make a thorough list, and I think we chose deliberately which issues to fix and which to punt on.

I see memory allocation (both for code and data) as well-defined randomness: saying that the allocated base addresses are random allows many interesting types of sandboxing which couldn't be used otherwise (more on that in the future). I think this will lead to interesting perf gains in some cases and interesting security gains in others. More pragmatically, it currently allows ASLR while still providing memory access without indirection (i.e. reg-reg addressing) on x86-32 and ARM.


Just out of curiosity, what is the rationale for a non-SSA bytecode? Is the idea that each implementation will have a platform-optimized SSA format?


SSA tends to result in a larger on-the-wire format since you have more values and phi nodes. Interpretation is slower because you end up with a lot of values unless you do some sort of register allocation, and you have to interpret phi nodes. If and when you need SSA, the dominance frontiers algorithm is fast in practice (as IonMonkey/Crankshaft show).
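A tiny example of where the extra values and phi nodes come from (ordinary JS source, with its SSA form sketched in comments; the SSA names are illustrative, not any engine's actual IR):

```javascript
// Non-SSA source: one mutable variable `x`, two assignments.
function f(c) {
  var x = 1;
  if (c) x = 2;
  return x + 1;
}
// The SSA form needs a fresh name per assignment plus a phi node
// merging the definitions at the join point:
//   x1 = 1
//   x2 = 2
//   x3 = phi(x1, x2)   // extra value + merge bookkeeping
//   return x3 + 1
```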


While you may be right about the file size (although most SSA representations are not built for small size, so there is quite some potential to reduce it, IMHO), you'll have to do register allocation/spilling/etc. in your VM/compiler anyway, and that is easier and faster on SSA [1, and more]. The dominance-frontiers algorithm is actually not that good, IMHO; there are better options [2], especially considering that one may not want to construct an unnecessary dominance tree in an interpreter/VM. That said, I have no experience with how much overhead SSA deconstruction adds when lowering to machine code.

[1] http://www.cdl.uni-saarland.de/projects/ssara/ [2] http://www.cdl.uni-saarland.de/projects/ssaconstr/

Yes, I'm associated with that group.


> the dominance frontiers algorithm

AFAIK V8 does not use a dominance-frontier-based algorithm for SSA construction. It just inserts phis eagerly at loop headers, and on forward edges it inserts phis only when a merge is needed.


From a developer's perspective, PNaCl and emscripten are almost the same to work with: the difference is just a couple hundred lines of wrapper code (less than the difference between the iOS and Android wrapper code), plus the need to find a higher abstraction layer for threaded code. And if the differences on the surface are so small, and the performance difference is acceptable as well (it is, IMHO), I don't really care how the goal is achieved under the hood. asm.js and PNaCl are not platform lock-ins like DirectX or Flash. It's just POSIX + OpenGL with a handful of platform-specific functions, and practically the same code compiles on any other POSIXish + OpenGLish platform.


Good enough for C++, but for JVM/CLR languages, running them directly in the browser would be more ideal.


You're only considering the game perspective, where you don't need anything but GL and sound. What about WebRTC? What about DOM manipulation? You can't do those from Emscripten very well.


It's actually really easy to build C interfaces to any JavaScript API or library. Since compiled code is just JS, you can directly call from C code into JS code and vice versa. Most of emscripten's CRT wrapper is actually implemented in hand-written JS, as is the OpenGL-to-WebGL wrapper. For an app that mainly needs to do DOM manipulation, it is probably better to write it mostly in traditional JS and only implement the performance-sensitive parts in a C library compiled to JS. Best of both worlds...
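The mechanics are simple because both sides see the same heap. A self-contained sketch (not emscripten's actual helpers) of how, say, a C string crosses the boundary as nothing more than an offset into the shared typed array:

```javascript
// One shared heap, visible to both "compiled" and hand-written JS.
var heap = new Uint8Array(16);

// Hand-written JS side: decode a NUL-terminated string from the heap.
function readCString(ptr) {
  var s = "";
  while (heap[ptr] !== 0) s += String.fromCharCode(heap[ptr++]);
  return s;
}

// The compiled-C side would have written the bytes and passed the
// pointer; simulate that here:
var msg = "hi";
for (var i = 0; i < msg.length; i++) heap[3 + i] = msg.charCodeAt(i);
readCString(3); // "hi"
```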


How is PNaCl not Chromium lock-in? Pepper is defined by the Chromium implementation.


Only if you look at the generated binary. But the source code which created that binary can be compiled to a myriad of other platforms with only minimal changes. You can compile (mostly) the same source code to JS+WebGL, iOS, Android, OSX, Linux (SteamOS), even Windows (but not Windows Store apps), ... The differences between these platforms are about 1% or 2% of the lines of code in a typical 3D game. Compared to porting a game to, say, the Xbox One or WiiU, this is "peanuts".


The other big advantage of PNaCl is that you have the option of distributing it as a signed application that the user downloads once. This is critical for doing in-browser encryption or manipulating sensitive data. You need there to be a trusted runtime that makes this possible, but this can't be done in pure Javascript (where it's too easy to insert malicious code that leaks your keys). It's not enough to simply use a crypto API in the browser either---you don't want the Javascript to touch the data at all (otherwise, it could leak the plaintext). Moreover, unlike Java and Flash, PNaCl code is compiled in such a way that the browser can do just-in-time static analysis to verify that the code cannot escape the runtime. These two features---signed code that's trust-on-first-use and JIT static analysis---make PNaCl a much more desirable runtime than asm.js in my opinion.


> PNaCl code is compiled in such a way that the browser can do just-in-time static analysis to verify that the code cannot escape the runtime.

You can do (and people do) similar things to sandbox JS, by running it in an iframe or a web worker, for example, plus some static analysis.

Also, the structure of asm.js ensures that the code inside it cannot access the outside except through a small number of statically analyzable entry points. You can likewise verify that no one modifies the asm.js data array by putting the entire codebase in a closure and doing a trivial static analysis to see that the array doesn't escape.
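A sketch of the closure trick being described (assumed shape, not a full verifier): the heap buffer never escapes the closure, so the exported functions are the only ways in.

```javascript
// The buffer is captured by the closure and never returned, so a
// static analysis only has to confirm it doesn't escape; outside
// code can reach the heap solely through the exported entry points.
var sandbox = (function () {
  var buffer = new ArrayBuffer(0x10000); // private to this scope
  var HEAP32 = new Int32Array(buffer);
  function store(i, v) { HEAP32[i | 0] = v | 0; }
  function load(i) { return HEAP32[i | 0] | 0; }
  return { store: store, load: load };   // the only entry points
})();
```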


How do I verify the integrity of the sandboxing JS, especially since I potentially have to download it each time I need it? Similarly, how do I verify the integrity of the asm.js code I'm receiving? Integrity is just as important as static analysis, if not more so---for example, a malicious JS crypto library could pass a static analysis test but intentionally generate weak keys.

There are ways to do integrity checks manually, of course, but as far as I know the browser does not perform these checks automatically (and it would be difficult to do so---you'd need to implement JS signing, and you'd need a PKI to get the right public keys to verify the JS signatures).


So your argument in favour of PNaCl is a completely incidental feature of it, one that could (probably quite easily) be added to regular JavaScript, and that is really just a workaround for the fact that the PKI trust model for SSL is totally broken?

That's a pretty odd reason to end up attached to a technology.


I need to verify that not only does the JS I received behave correctly (i.e. it will be appropriately sandboxed), but also that the JS I received from the server is the one that the developer intended to serve me. This is because even with perfectly secure transport, a compromised server can always give me JS that is valid and stays sandboxed, but does bad things (such as generating weak keys).

I need to perform an integrity check on the JS to satisfy the second requirement, and the integrity check should not depend on the server that served the JS (since a compromised server could lie about its hash, for example). Moreover, the check needs to be automatic, and trustworthy. One solution is to check the hash of the JS against known-good hashes from the developers (i.e. get the JS from the website, and get the hash from a CA), and then cache the JS locally until I determine that a new version of the JS exists (I want to avoid re-downloading it over and over--that only gives MITM and MITS attackers more chances to serve me bad JS, and it's slow). Not an easy problem; otherwise we'd be doing it already :)
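The core of such a check is small; what's missing is the trusted, automatic machinery around it. A sketch (`sha256hex` is a stand-in for whatever trusted digest routine the browser would provide, and getting `pinnedDigest` out of band is, as noted, the hard part):

```javascript
// Only evaluate fetched JS if it matches a digest obtained from a
// channel other than the server that served the script.
function loadPinned(source, pinnedDigest, sha256hex) {
  if (sha256hex(source) !== pinnedDigest) {
    throw new Error("script does not match pinned digest");
  }
  return new Function(source); // only verified code gets this far
}
```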

PNaCl offers the infrastructure to do this. I would use asm.js if it did so as well.


> asm.js's compatibility with existing Javascript engines gives it the best chance of adoption.

This is the key. With asm.js the worst case is slow, not totally broken.


Which is why Google should have a PNaCl javascript implementation, so that it degrades gracefully like asm.js but the fast case is faster.



Awesome. I had indeed seen this before and forgotten about it.


What I miss in all those discussions regarding JS, asm.js, and PNaCl is that the web is the best software distribution platform we have, by a huge margin; this is really its overwhelming killer feature. All the user needs is a URL. No "downloading", no installation, no special user permissions, no app shops, no gatekeepers, no walled gardens, and everything is automatically multi-OS and multi-CPU-architecture. The web as a runtime platform may not be ideal, but as a "software distribution platform" it rocks. For this I'm even happy to give up a few CPU cycles. And no platform should be tied to a specific language: the more choices there are, the better.


Have you actually shipped a complex codebase to the web? There are a TON of major issues which are just as bad:

Old problem: Downloading

New problem: Downloading. Large JS apps have to asynchronously load potentially very large codebases, every time the app is loaded

Old problem: Installation

New problem: Installation. What if the user has NoScript? AdBlock? Mobile Opera vs. Mobile Safari vs. Chromium vs. IE.

Old problem: Special User Permissions; gate keepers

New problem: Proxies. Websockets. Anti-virus software. On-machine firewalls. Corporate firewalls. "Special" toolbars and plugins that redirect the browser.

The web is not a "software distribution platform". It is not a "distribution platform", because code is retrieved on-demand and run, and there is no simple way to "cache" that locally and re-run it. (Chrome App Store does not count, because it is antithetical to the model you've described). It is not even software in the traditional sense that you mean, because everything is inherently client-server, so most of the app lives on the server or in the cloud.


None of the things you've listed are actually real problems, or they have well-known solutions. HTTP caching solves the downloading problem for most classes of application. You can also run apps locally with AppCache and local storage (and, in the future, IndexedDB), though I'll grant that this isn't absolutely perfect today.

I don't understand your "installation" issue. A user with NoScript knows how to turn it off for a web application they want to use, and generally does so; they just want to retain control over JS execution. AdBlock is not related to actual application delivery but to revenue generation, which is somewhat orthogonal.

Having to support cross-platform quirks is also something that isn't unique to the web. If you support multiple versions of iOS for your app, you will hit similar issues. Not to mention if you wanted to actually be cross-platform and work anywhere other than iOS with the same app.


http://www.flohofwoe.net/demos.html, these demos are about 150k to 250k lines of C++ code, and are about 700k to 1.5MB downloads, the same as zipped native executables. I'm also on the team behind a web-based MMO which has been running for 3 years now (not done with emscripten though), so I'm aware of the problems :) It's still miles better than closed platforms.


Having worked on both traditional and web projects, I disagree with “just as bad”. In every case, what you're talking about is easier to solve on the web than off-line.

Downloading: easily solved with browser caching. This is still faster than trying to, say, walk a very non-technical user through downloading and opening an installer.

Installation: not an issue for most users. NoScript/AdBlock are a small percentage of users for most sites and since most of the web breaks that way, many of them have learned to recognize when their local system broke something.

Permissions: the problem in both cases are the gatekeepers and every single “new problem” you listed was also an old problem of equal or greater impact. If the gatekeepers are determined and unaccountable, they'll block everything either way. The difference is that the web side has a better security model and is more likely to be open because you're following a known precedent – Google, Amazon, Facebook, etc. have created enough demand that even the most fascist IT departments allow HTTPS. This is not the case for the traditional desktop model where every local software install is considered independently.


> New problem: Downloading. Large JS apps have to asynchronously load potentially very large codebases, every time the app is loaded

Unless you use an old browser, this is not a problem anymore. http://www.w3.org/TR/html5/browsers.html#offline


This guy gets it.

The browser, html/css/js, is the most ubiquitous common runtime that has ever existed.

A lot of services have an iPhone app, an Android app, and sometimes even a desktop program. But if you have a website, you get all of those platforms at once.

Sure you have to do things differently, usually more slowly, but you can do whatever you do on any device with a browser.


It's not the case that a web app is capable of doing "whatever you do on any device." The browser exposes a tiny fraction of the actual device capabilities.

Browsers are themselves native apps, and therefore web apps will always be less capable than native apps, because web apps are subject to the one-two punch of the limitations imposed by native apps and the browser.


I'm not sure your contention that browsers will always be hamstrung is true; time will tell, but proposals for HTML-spec camera integration, GPS integration, etc. are alive and well.

On the point of paying the fee of an app within an app, you're right that there will always be a penalty. But "will browser apps ever run as fast as native" is not really the right question. The right question is "will browser apps ever be able to run fast enough."

Finally, a clarifying point: when I said "do whatever you do", I meant that a website does the same thing in every browser, not that websites can currently do the same things as native apps.


> All the user needs is an URL. No "downloading", no installation, no special user permissions, no app shops, no gate keepers, no walled gardens...

Exactly. The humble hyperlink is the most amazing thing about the web. It's simple yet devastatingly powerful.

I'm not a snob about the programming languages of the web. I don't care what language I need to learn to create content for the web, or what tools I need to use. The language and the tools will always be a means to an end — that is, the radical ability to provide free and instant access to any content for everyone on the Internet.

(from a blog post I wrote "why I create for the web": http://blog.neave.com/post/64669185529/why-i-create-for-the-... )


I noticed the shift from the parent's "software" to your "content." The distinction is critical. Most web pages are content, i.e. what's worthwhile in them is their media (text, images, video, etc.), not their code. What draws us to HN is its content, and we appreciate its minimalist aesthetic; if it used WebGL or something it could only be made worse.

As a content delivery platform, the web is unparalleled, ridiculously good. I have paid for content (for example, NSFWCorp), and will do so again in the future.

But as a software delivery platform, the web is unproven if we are to be kind, and crap if we are to be honest. Web software is fragile, limited, and subject to the whims of the site maintainers, who may modify it without warning, or even remove it entirely (e.g. Google Reader). I have spent hundreds of dollars on native productivity software, games, etc. but I have yet to spend a dime on a website for its JavaScript.

That may change in the future, but I doubt it: any program that's a sufficiently good web app can be rewritten as a desktop app with more capabilities. Ultimately the web may be the go-to place for trivial or gimmicky software, but the most powerful apps will be peers to the browser.


You're right, it's more about software and less about content (though the line between software and content is blurry anyway). But it is becoming harder and harder to actually distribute software to users' desktops or mobile devices, since these platforms have either been closed from the beginning (iOS, game consoles) or are quickly becoming closed (OSX, Windows 8). You can no longer simply send a download link to your users, or run your own download website with a 3rd-party payment provider. The user has to jump through a comical number of hoops to get the app running ("Do you really want to run this extremely dangerous software downloaded from the interwebs?", "Administration rights are needed to install this software!", etc. etc.; it's a travesty).

Everything has to go through the centralised, closely guarded app shops. You have to go through silly certification processes to get your app into the shop, and if the platform owners (or some minion working in certification) feel like it, they can just remove your app without warning.

Compare this to a web app. You deploy the stuff on a web server of your choice, the user clicks on a link. Done.


If by 'free and instant' you mean 'just pay for an ISP/phone carrier and wait for a page to load, or a web-app to spin and spin'.


Imagine there were a language that compiled to proper bytecode (not JS) and ran on a standardized platform present on almost every computer: mature, sandboxed, and actually pretty fast. Oh wait, that already exists. It's called Java.

Java has gotten a bad rap lately due to some high-profile drive-by-malware bugs. But if the Java codebase had gotten the same intensive care that the WebKit codebase got, this would no longer be an issue.

Many people remember Java being sloooow. When I first came into contact with it in school, that was certainly the case, but for years now it has had a modern JIT that can easily rival native code.

Java applets are ugly, sure, but that is largely due to the decades-old AWT and the poor font support it used to have. With SWT you can have native widgets (dunno if they work in applets, but they are nice on the desktop), and with antialiased drawing you can get the same results as with HTML5 canvas.

Java applets (and Flash, and Silverlight) died for marketing and political reasons; there were no insurmountable technical issues. The outcome is that we are stuck with "worse is better" for the foreseeable future: only 1% to at most 50% of the possible native performance, and a bunch of restrictions whose implications we are only slowly realizing (no sockets, no signed applications, no anonymous/serverless mashups, less hardware access than we used to have, suboptimal caching, suboptimal tooling such as languages, debuggers, and content creation tools; I haven't seen anything that can replace Flash for simple vector animations yet), and so on.


On the web, when one thing just works, albeit slowly, and another thing requires installation but runs quickly, the thing that just works is likely to win.

Unless Java gets upgraded to a first-class browser component, JavaScript will tend to win. As a product designer, every step I take my users through loses some of them. Installing Java is a big, scary step that I can almost always avoid.

The insurmountable issue is that it requires users to do installation work.


Well, until about a year ago, Java was on (I'm guessing) 70% of PCs, and Flash on > 95%. Now, with the shift away from plugins and the growing importance of mobile, that share is shrinking quickly.

But! In the brave new world of HTML5 and so on, you still can't assume that everybody has all these features. Either they are stuck on older browsers (at work, or on my old laptop that I rarely use), there are subtle implementation differences between browsers (although I have cutting-edge Android devices, the cool demonstrations often don't work nicely on them), or the browser is OK but the computer is too slow.

I only have one computer that can run all this newfangled WebGL stuff at decent speed, and it's my gaming PC at home.


WebGL is a small part of the rich internet applications currently being built. I agree that the web is still too early for advanced games.

On the point of older browsers: 1) the trend of % of people using older browsers is going down. Whereas the number of people using modern browsers without flash and/or java is going up (ie iOS mobile).

2) someone with an older browser expects certain parts of the web to be broken. Being broken in IE6 somewhat says "we're more modern than you, try again after you upgrade." I think people who see this are likely to come back at a future date, whereas someone who can't use it because of Flash is unlikely to think their problem will go away in the future.


But the cost of installing Java is spread across several apps. So the cost is negligible over time.

It is like saying, if I build a web-app, the user needs to first install the browser. True, but once installed, other web-apps have a zero cost of installation.


Not exactly. A web browser has to be installed before they start the process. Java has to be installed either at the beginning of the process or during the process.

I.e. the installation point of java is inside of my conversion funnel for some x > 0% of people I am targeting.


Well, plenty of money has gone into making the JVM fast, but it's slow to start. Sure, it's fast to run once warm, making it great for servers, but verifying and loading bytecode is still slow. I've seen benchmarks showing that compiling JS is faster than loading equivalent JVM bytecode.

Additionally, the JVM security model has a much larger surface area than JS, allowing for all kinds of things JS doesn't, which, I'm going to guess is why security has been worse.


Agreed with most points, largely it is timing (as with Flash's original stronghold and the re-up when it solved a huge video on the web problem with Flash video i.e. Youtube rush).

However Snap.svg (snapsvg.io) made by the Raphael creator and backed by Adobe seems to be the best vector animation replacement yet. It is really a mashup of Flash vector and Silverlight declarative style, almost an iteration (you still lose the compiled nature of it and very good compression by default though as it isn't swf -- just like Silverlight). Still nothing has all the features of flash except maybe Unity, minus the vector part, but plugins are looked at in a worse light now.

So we are in this transitional stage where the new stuff is better but it takes much more work to get it to do the same across all platforms. WebGL, hardware-accelerated <canvas> and <svg>, and libs that glue those nicely, like Three.js or 2D EaselJS and vector libs like Snap.svg, will see it through, or further iterations of those. Once WebGL takes hold across all browsers, and in a couple of years, we will be in new, more capable hardware-accelerated lands. Flash was a big software-rendering, CPU-hogging bummer at the end.


Snap.svg looks really nice! I meant mainly the Flash authoring tools (Macromedia Flash) are missing. Every idiot used to be able to click together a little animation. It even spawned a new (although questionable) art style of "Flash cartoons".


Let Java die in the browser. I'm an advocate for Java as a backend language: it's one of the most mature runtimes, it's fast, and it's great as a server language that doesn't require manual memory management. Hadoop and co. are built on top of the JVM for a reason. It's even great as an ecosystem for Scala, Clojure and all that, but the Java GUI tools (Swing, and I'd even argue FX) are horrible to code with. I won't comment on Android as it's a separate beast in that arena.

Java in the browser is an abomination though. As a client side language...forget it. I love it as a server runtime though.


Instead of the JVM-as-moral-equivalent-to-an-embedded-iframe architecture we ended up with, we certainly could have had <script type="text/java">. Or <script type="application/jvm" src="foo.jar">, for that matter, with a JVM "engine" sitting sibling to the position Javascript "engines" have in the browser, in terms of access to the DOM and sandboxing provided by the browser-as-platform. That would have made JVM bytecode a wonderful target.


There actually is a Java DOM API that could have been used in applets. It was never widely supported though, AFAIK.

http://www.w3.org/TR/DOM-Level-3-Core/java-binding.html


Asm.js is not Javascript. It's native code!

It's not a VM bytecode. It's assembly language, and the target machine is your native CPU. It is sandboxed by the browser. It is present on any computer with a browser and Javascript, which is more computers than have a browser and Java plugin, or any other plugin for that matter. Asm.js is less mature than Java, but it will grow up. It is currently supported by more browsers than Java, so even if it's crappier right now, it will eventually surpass and replace Java in the browser.

If you are thinking that asm.js is Javascript or is interpreted, you're thinking about it wrong. Think about it this way: Take an arbitrary executable. If you disassemble it, you have some sort of assembly code. There are several different assembly language syntaxes, but they're all basically the same. Asm.js is basically just another assembly language, with two key differences.

The first difference between asm.js and any other assembly languages is that the only existing asm.js assemblers happen to assemble executables that are sandboxed by your browser. You can't do syscalls to use sockets or files or the Windows registry. You can't fork() or run non-asm.js executables. Still, if you can live with the "syscalls" that the browser gives you, you can run native code in any browser. The browser is your operating system.

The second difference between asm.js and other assembly languages is the really clever part! It is not just assembly. It is both valid assembly language that can be assembled to native machine code and also valid Javascript that can be executed by a Javascript VM -- and both ways of looking at it are semantically equivalent!

It's like being able to write a novel where the text is both valid English and valid Spanish at the same time, and the plot is the same no matter which language you speak.

So I lied. It is Javascript. But you shouldn't think of it that way. You write Javascript. Your compiler writes assembly language. Asm.js is assembly. The Javascript aspect is just a clever backwards compatibility hack. It could have been specified to have a more traditional and readable assembly language syntax, but then you'd need plugins and nobody reads assembly anyway. As is, it's machine code that happens to be able to run on any Javascript VM. This means you can deploy native machine code to any browser without a plugin. In the event that your browser is old and doesn't contain an asm.js assembler, that's ok! Your unmodified native code will still run, albeit more slowly, in a plain old Javascript VM such as everybody has had for years. Except lynx users.
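To make the dual reading concrete, here is a minimal hand-written sketch of an asm.js module (real modules are compiler-generated and usually also take a foreign-function object and a heap ArrayBuffer, both omitted here):

```javascript
// Valid asm.js: a validating browser can compile this ahead of time to
// native code. Also valid plain JavaScript: any old JS VM runs it unchanged.
function MyModule(stdlib) {
  "use asm";                      // opts this function into asm.js validation
  var sqrt = stdlib.Math.sqrt;
  function hypot(x, y) {
    x = +x;                       // type annotation: x is a double
    y = +y;                       // type annotation: y is a double
    return +sqrt(x * x + y * y);  // result is annotated as a double
  }
  return { hypot: hypot };
}
```

Calling MyModule({ Math: Math }).hypot(3, 4) gives 5 whether the engine treats the module as assembly or as ordinary JavaScript.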

The browser as an operating system is currently worse than a Java runtime plugin in that the browser has less functionality. However, browsers are currently better in that they have more penetration and a higher velocity of improvement. Long term, the browser as a full-featured operating system makes more sense than as a mere scriptable document viewer slash plugin container.


It's assembly language the same way LLVM bitcode is assembly language, which means IT ISN'T. It is also represented in text, which means that it is several times larger than it needs to be.

Another problem with Asm.js in practice is that it's used with Emscripten, which doesn't define the syscalls you talk about. All the DOM/WebAudio/WebRTC/etc. APIs have to be redone in Emscripten's headers. GL was easy, because WebGL is so close to EGL and friends. But what about DOM manipulation? Emscripten can't really do this yet.

These two disadvantages happen to be problems solved in PNaCl, which has a compact bitcode (which doesn't pretend to be JavaScript) and a proper API (Pepper).


You are technically correct - the best kind of correct! While I was being a bit loose with my language, I don't think that changes the point.

So maybe it's not an assembly language or a bytecode, but that's a pretty unimportant distinction. Those are all intermediate representations (IR). A good browser will JIT your asm.js IR to machine code in much the same way that a good JVM will JIT your bytecode. An old browser will interpret it more slowly, but it will still work.

Asm.js may be crappy at the moment, but it will get better. It's an easily JITtable IR for the web that's already supported in all browsers, even the old ones. And the security is basically free since that's already being paid for. Yes, it's a bloated text encoding, but gzip is pretty effective. No, it doesn't have great compiler support yet, but Emscripten and equivalent will no doubt have lots of useful headers and emulation and translation libraries very soon. Asm.js is going to win for the same reason that Javascript won: everybody already has it.


Nitpick: asm.js is designed for AOT, not JIT, compilation.

A good browser will AOT asm.js, not JIT it.

(In other words: pre-compiled, not compiled on the fly. Consequence: immediately as fast as a JIT would have… eventually… made it, but initial pause while compiling.)


The fact that asm.js is represented in text does not make it unlike assembly language. It makes it unlike bit/bytecode or machine code.


Assembly is just mnemonics for the machine code - that is very important to remember. It's a different encoding, but one with such a strong and tightly coupled relationship to the machine code that disassemblers are a thing.

Sure, asm.js is encoded as text, but it's not encoding anything close to what assembly languages do.


I have another reply to that effect somewhere in this thread :)

Modern assembly languages are often at a slightly higher level of representation than 1-to-1 with the instruction set, so it's tough to draw a hard line, but yes, generally speaking, you should be able to translate the majority of the language to machine code with opcode tables. It's also interesting that machine code isn't even the lowest-level representation of machine instructions on many architectures, which internally use microcode to implement architectural instructions.

My point was that LLVM bitcode and asm.js are both intermediate targets, but not exactly the same.


You mean macro assembler languages? It's a pedantic distinction, but I wasn't intending to imply those... 'Pure' assembly languages are still not 1-to-1, but are never worse than many-to-one; e.g. nop is often a real, but useless, instruction with some other mnemonic.

Not sure what your point is about microcode... That is an implementation detail which, even in cases like x86 LEA where you think you are leveraging it, is not important or useful beyond trying to understand performance characteristics. What you get to work with is whole instructions.


We agree! I'm not arguing with you. I'm just trying to provide a bit more context for people who are trying to understand the purpose of all this. Judging by the vast majority of remarks on the article, there's a lot of misunderstanding of how this stuff works.


> It is also represented in text, which means that it is several times larger than it needs to be.

According to the Emscripten FAQ[1], gzipped Emscripten output (which uses asm.js) is about the same size as gzipped native code.

[1] https://github.com/kripken/emscripten/wiki/FAQ


Asm.js is not at all an assembly language. An assembly language has a very close, if not one-to-one relationship to actual machine instructions. Assembly language is by definition tied to a particular architecture and non-portable. I've never heard of an assembly language that doesn't let you directly address architectural registers and memory.

Asm.js is just a subset of JS that, when used in blocks, is precompiled. In that sense, it's much more like writing C than assembly (or at least C-like subroutines to be embedded in larger scripts).


Outstanding explanation, +1 and thanks for blowing my mind. I find myself sorely tempted to drop everything and throw myself into this exciting new world. One thing that always bothers me with these xyz.js technologies: is there an explicit or implicit implication that they are actually implemented wholly or partially in an actual file called xyz.js? In the particular rather than the general: is there actually an important Javascript source code file involved called asm.js?


No, asm.js is a subset of JavaScript. So it makes no sense to imagine a single file.


If asm.js is not VM bytecode, what is?

If asm.js is "native code," then what is an example of VM bytecode that is not "native"?


I think your point is "all VM bytecodes are not native".

Native means the 'bytecode' of your hardware... not some VM's?


From what I understand, asm.js is using JavaScript as a kind of portable assembly language which means there is no VM.


No VM? What do you call V8, SpiderMonkey, etc?


The conversion from asm.js to native code is much easier than the conversion from JavaScript to native code. The conversion from real assembly code to native code would be slightly easier than that, but not significantly, making asm.js close to an assembly language (with an infinite number of registers, if that's your concern).
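For illustration, a hand-written sketch (the module and function names are made up for this example) of why that conversion is easy: the |0 coercions pin every value to int32, so a naive compiler can map each statement almost directly onto machine instructions without type inference:

```javascript
function AddModule() {
  "use asm";
  function addInt(a, b) {
    a = a | 0;             // parameter annotation: a is an int32
    b = b | 0;             // parameter annotation: b is an int32
    return (a + b) | 0;    // int32 add with wraparound, like a native ADD
  }
  return { addInt: addInt };
}
```

The wraparound even matches native behavior: AddModule().addInt(2147483647, 1) yields -2147483648, exactly as a 32-bit ADD would.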


If "ease of converting to native code" is your criterion for "close to assembly language," then Brainf*ck is about the closest to assembly language you can get, because you can write a BF->x86 JIT in about 100 lines of C: http://blog.reverberate.org/2012/12/hello-jit-world-joy-of-s...

So if that's what we're going for, we should forget asm.js and PNaCl and just use BF and expect very close to native speed.

But this won't in fact give native code speed, despite how easy BF is to JIT. Why do you suppose that is?


Huh, I thought it was obvious from the context, but I'll restate my words here: it is easier to make native code from asm.js code than to make native code with similar performance characteristics from JavaScript code.

It is always easy to get things to just work, even if it means that printing "Hello, world!" would take minutes. But it is hard to get things working and fast, and there is a certain threshold past which it is much harder to optimize. Asm.js has a much higher threshold than JavaScript (and obviously BF); you can literally replace each line of the already-verified asm.js code with a direct translation using a simple register allocator, and it may occasionally (not always, of course) beat optimized JavaScript implementations. Yes, modern JavaScript implementations can optimize the same JavaScript code to that level once it becomes a hotspot! But having a higher threshold changes the problem; for example, time spent analyzing and annotating JavaScript code with hidden types and guards and assumptions can now be spent on other optimization passes (e.g. auto-vectorization).

And you are seriously wrong about the BF JIT; you at least have to merge runs of the same instruction. (Not to mention other common optimizations; c2bf has only some of them.) Of course it would only matter if you want BF to be fast...


Ah, see, your statements are getting weaker and more qualified, which is what I've been trying to point out.

The original claim in this thread was that asm.js "is native code." This annoyed me in the same way it would annoy an astronomer if you called an asteroid a planet through some chain of tortured logic.

Then OP and others started saying it's "basically the same as native code" because it's "very easy to JIT," but that's clearly not true either thanks to the BF comparison.

So now you're further qualifying by saying asm.js is close to native code because it's easy to JIT "with similar performance." But if I point out corner cases where native code is much faster because of vectorization or use of specialized instructions, you will have to qualify further.

All I am pointing out is that calling asm.js "native code" is inaccurate and misleading. Especially when this is used to imply that asm.js (or PNaCl for that matter) is the be-all end-all answer to giving native code performance. Sure it's a lot closer to native than JS is, but it's still pretty far from actually being native.


> This annoyed me in the same way it would annoy an astronomer if you called an asteroid a planet through some chain of tortured logic.

I think many people just say "stars" when they should say "planets", "asteroids", "stars", "galaxies" etc. instead, even when they are well aware that stars are not the same as other celestial objects. Well, I think I have made the required qualifications in the second comment as your point became clearer. (And I think many others have similar assumptions, but that's another story.)

> All I am pointing out is that calling asm.js "native code" is inaccurate and misleading.

You are totally right on that. Asm.js is just "a JavaScript subset that can act like a portable assembly language". No performance guarantee here. One can expect a performance boost under reasonable assumptions, but as far as I recall the V8 people don't think so. In fact, I'm not that interested in the performance of asm.js but in the potential use of asm.js in non-JS contexts.


Browsers are supposed to compile asm.js code directly into native code before running it. Running it through the VM is the fallback for browsers that don't support it directly.


I can compile BF directly to native code (and actually I did: http://blog.reverberate.org/2012/12/hello-jit-world-joy-of-s...) so does that make BF native code?

The point of these questions is that the GGP's statements are absurd. There is no universe in which asm.js is "native code".


No, it's not technically native code, nor is it exactly assembly or bytecode. It's an IR that browsers can JIT very easily. Which is basically the same thing.

It's interesting to note that x86 and amd64 code isn't even truly "native". They're just bytecode IRs that are interpreted by a CISC virtual machine emulated in microcode running on a RISC cpu that you can't program directly. Everything is an IR. Python is an IR for the thoughts in my head. It's turtles all the way down.


> No, it's not technically native code, nor is it exactly assembly or bytecode. It an IR that browsers can JIT very easily. Which is basically the same thing.

So BF is "basically" native code too, since it can be JITted very easily? In fact, it's much easier to JIT BF than asm.js, so according to your definition it is even more "native" than asm.js.

The way you are using "native" takes away all of its meaning.

> Everything is an IR. Python is an IR for the thoughts in my head. It's turtles all the way down.

It's really not.

Yes, every executable representation is a representation (though we don't usually call representations "intermediate" if they are designed to be executed directly).

But the hand-wavy idea that because two things are both representations they are "basically the same thing" is so far from true that it is the opposite of insight. The truth is that the differences between representations is one of the deepest concepts in compilers and VMs.

For example, the difference between asm.js and PNaCl mostly boils down to the differences between their two representations (a modified LLVM bitcode vs. a JavaScript subset). If these two things are "basically the same", then asm.js and PNaCl are "basically the same" too.


True but I'm replying to a comment stream that describes asm.js code as "portable assembly language" which is a pretty reasonable description.

It's certainly not native code since no CPU exists that can execute asm.js code directly.


Java's greatest sin on the web was not giving the JVM access to the DOM like JavaScript has.



The web's greatest sin was allowing programmatic access to the DOM, IMO.

Interactive sites suck, if you want that, use a sandbox so I can easily discard it.


I always have issue with the idea that a JIT or interpreter can rival native code.

In many cases it just reveals a lack of understanding of why C is powerful. It lets you choose your implementation details very heavily (not heavily enough imo [!]), to the point where few languages can do things it can't. Meaning that comparing like-for-like code is naive - you should compare a copy of the 'faster' implementation in C to the implementation in the language being benchmarked. Due to design limitations it's basically impossible for Java to seriously compete with C...

I do generally agree that Java gets a serious bad rap for nearly nothing though...


I can fully recognize that Java is a fast and powerful language, but it also makes me want to shove huge wooden slivers up my fingernails. That doesn't change despite the truth of your statements.


But what the poster is really thinking of is the JVM. We now have Scala, Clojure, and others that target the JVM as a platform.


The JVM doesn't run on Android or iOS, and never will. That's a huge chunk of users they can't touch, but the web can.


Aren't all Android apps running on the JVM?


They run Dalvik, which is Google's proprietary VM. It's similar to the JVM in purpose, but it is supposed to avoid Oracle's patents and has a few specializations for mobile.

AFAIK (I'm not a java developer), any language that can target the JVM can target Dalvik.


AFAIK, the only language that's officially supported by Dalvik is Java, and code in other languages tends to run into limitations in the bytecode-to-Dalvik translator.


Surely there is a Scala for it? Alternatively use Mono and C#/F#.


The issue is not any particular language but the platforms. Asm.js is an assembly-like language that gives browsers native-code performance that's theoretically on par with the JVM's JIT.

You can think of asm.js as a bytecode (like JVM bytecode) that all browsers can interpret. Some browsers are now getting JITs for this "bytecode".

You don't write asm.js in the same way that you don't write assembly or JVM bytecode by hand. You write Java or Clojure or Scala or C or Haskell. Your compiler then turns that into either JVM bytecodes or asm.js assembly. And your JVM or browser will JIT that bytecode into native code.

The real issue now is that the JVM is currently a more capable platform, but the browser has broader penetration and arguably better security. Raw performance is becoming a non-issue with asm.js.
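As a rough illustration of that pipeline, here is a hand-written sketch (hypothetical, not actual Emscripten output) of how a small C function might look once a compiler has turned it into asm.js for the browser to JIT or AOT-compile:

```javascript
// C source (what you write):
//   int sum(int n) { int s = 0; for (int i = 0; i < n; i++) s += i; return s; }
//
// asm.js (what a compiler might emit):
function SumModule() {
  "use asm";
  function sum(n) {
    n = n | 0;                    // parameter annotation: n is an int32
    var s = 0, i = 0;             // locals initialized as int32
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0)
      s = (s + i) | 0;            // int32 arithmetic throughout
    return s | 0;
  }
  return { sum: sum };
}
```

SumModule().sum(10) returns 45, in a validating browser and in a plain JS VM alike.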


Actually asm.js is possible to AOT compile, not just JIT.


On a related note, is there anything preventing a language like Clojure(Script), which already targets the browser, from targeting asm.js instead? What would the advantages and disadvantages be (if any)?


I'm pretty sure the same argument can be made, but more strongly, for C.

C is massively more cross-platform than Java - a JVM implementation often relies on C. C compilers are often the first things implemented for new platforms. C is very close to pure native code - to the point where many equate C/C++ with native. All of Java can be beaten or matched performance-wise by C, because of the power C gives you to leverage the hardware in precisely the same way the JVM can, but without (so many of) the overheads that are designed in.


This discussion is missing the simple point that the web is rapidly gaining all the flaws that Flash used to have, without the nice (for some) editing environment.

The web really isn't suited for app development at all, as the native mobile markets have demonstrated, while the viability of it as a document delivery platform diminishes every time the content gets hidden behind a massive layer of scripts.


> the web is rapidly gaining all the flaws that Flash used to have

False.

1. Modern JavaScript vms are sandboxed, not native plugins with unending security holes.

2. It's nearly all open standards and open source, not proprietary closed source controlled by one company. OpenGL, EcmaScript, W3, Mozilla, Chromium, blink, webkit.

3. It works on mobile devices, flash doesn't.

4. The tools are out there. Check out appcelerator or unity's tools.

5. Flash never had the native performance or hardware acceleration that modern JavaScript has.

6. Don't think Flash ever had the kind of momentum JavaScript has. Like has anyone ever made a derby.js for Flash? A sharejs? Compiled other languages to ActionScript? With source maps to help?

7. Flash avoids all kinds of privacy settings and plugins in your browser. Flash gives you less control.

8. Even Adobe has moved past Flash, these arguments are all done, why haven't you moved on?


I think the above poster is talking about UX...

> 3. It works on mobile devices, flash doesn't.

False. Many (120000+) Flash apps are actively running with Adobe AIR on App Store and Google Play. AIR is a technology to package SWF as a native app.

> 4. The tools are out there. Check out appcelerator or unity's tools.

Unity cannot export for HTML5 yet. Apparently they are working on it, but I don't think it's that easy (as for the original article, I DO MIND the initial load time; that will be a big problem for Unity as well).

> 5. Flash never had the native performance or hardware acceleration that modern JavaScript has.

False. Plain (non-asm.js) JavaScript is slow. Don't trust micro benchmarks. And at least ActionScript is faster than plain JavaScript (http://j15r.com/blog/2013/07/05/Box2d_Addendum). And Flash Player 11+ has Stage3D, which utilizes GPU acceleration just as WebGL does. For instance, please watch this Facebook game (http://www.youtube.com/watch?v=vBIJVt05jwc). Note that this is a commercial product, not an experiment. Moreover, the same "Epic Citadel" demo was released for Flash in March 2012, one year earlier than the HTML5 version (http://epicgames.com/news/epic-games-releases-epic-citadel-i...).

> Compiled other languages to ActionScript?

Adobe Alchemy...

Please don't denounce Flash without knowledge of Flash. I'm no longer a user of Flash myself and am currently developing with Unity and Cocos2d-x instead, but I feel I need to defend the Adobe guys from unfair bashing.


I think there are a couple of misconceptions here:

> 1. Modern JavaScript vms are sandboxed, not native plugins with unending security holes.

Flash was sandboxed (just as Java). You were strongly restricted in what you could access with ActionScript. Of course, the security was poorly executed. I blame it on the fact that it was developed in a different time, without the experience and tools we have today, and that it was closed-source. Still, native plugins don't have to be inherently insecure.

> 2. It's nearly all open standards and open source, not proprietary closed source controlled by one company. OpenGL, EcmaScript, W3, Mozilla, Chromium, blink, webkit.

True, but I'm starting to think "Open Standards" was a huge Trojan horse.

- It led to, or continued, a huge monopolization of platforms. Only a few large companies are able to maintain modern browsers (see the demise of Opera). This is bad, because they can push politically motivated restrictions on their users. (Firefox has a whitelist to only allow certain media codecs to play, although the system would support more. I could modify the source for myself - I did - but it's of no use, as the people visiting my sites won't be able to use it. I don't have the marketing power of Mozilla or Google.)

- The "open" "web" platform limits what kinds of apps you can write. You can't really write apps without an (accountable!) central server. Without sockets, you have no BitTorrent, no P2P, no Tor, no instant messaging...

- I think it also pacifies people who would otherwise be worried about today's locked-up platforms. "You can always write a web app if you don't get in the app store."

> 3. It works on mobile devices, flash doesn't.

Which was a political decision, not a technical one. There used to be thousands of free Flash games that would have run with minimal porting on mobile devices. But Apple couldn't control Flash, they wouldn't get their 30% cut from Flash apps, and they couldn't censor apps. So they forbade Flash on iOS, and crippled it on OS X, which was one of the main reasons for its demise.

> 5. Flash never had the native performance or hardware acceleration that modern JavaScript has.

Flash had support for native video playback before HTML. It had 2D acceleration for animations before there was canvas, and Shockwave Flash had 3D acceleration years before WebGL.


> True, but I'm starting to think "Open Standards" was a huge Trojan horse. - It lead to, or continued, a huge monopolization of platforms. Only a few large companies are able to maintain modern browsers (see the demise of Opera).

Free-slash-open-source programs are like banks in this way. In principle, a bank that fails can always be shut down rather than bailed out, and this is what justifies the existence of private-sector banks. In principle, an open-source program can always be forked if you can't persuade the maintainers to make the changes that you want, and it's always been agreed that this is a central, essential requirement for a program to be considered free-slash-open. But some programs are, in practise, TBTF - Too Big To Fork. A program can be "big" not only by having a large codebase but also through network effects, such as having vast amounts of client software tied to one of its interfaces. The big-boy Web browsers are TBTF in both these ways. So if, for example, you're insulted by Google's decision to knife MathML (as everyone should be), it's relatively easy to roll a Chromium with MathML inside, but you'd still effectively be just maintaining a branch, because you'd have no hope of maintaining "your" browser independently if Google took the whole Chromium codebase in a direction you didn't like - and more importantly, good luck getting users to use your browser or developers to create MathML webpages to support it.

A second example of the phenomenon is the Gnome/KDE mess - part of the reason that the Linux desktop sucks is that, even if you have a clear idea of how it could be better, it's still a whole lot of man-hours to spin up an alternative implementation, get apps customised for it, and so on. In general, an area is the domain of TBTF to the extent that you have to win a political persuasion battle or spend a truckload of your own money before you can produce a viable implementation of your alternative idea.

The solution, to the extent that there is a solution, to these problems is a technical one: find a way to shrink large programs and/or break them up into small, reasonably independent ones. (Of course all social/political problems are technical problems in disguise just as all technical problems are social/political problems in disguise. ;) ) In the case of the web, this is why the vertically-integrated Web browser must go away https://news.ycombinator.com/item?id=6720793 .


> Compiled other languages to ActionScript?

Haxe compiles to the same thing ActionScript compiles to, which seems like the appropriate comparison.

http://en.wikipedia.org/wiki/Haxe (And looking it up, apparently it does compile to AS, though I don't think it typically goes via AS when generating a swf. I might be wrong though.)


> Compiled other languages to ActionScript?

For the interest of historical accuracy, I must point out that Adobe Alchemy predates Emscripten by years.


Alchemy didn't compile to ActionScript, it compiled to the underlying VM bytecode and it relied on special bytecodes added to the runtime specifically for Alchemy.

A more direct comparison in this case is NaCl - Alchemy-compiled outputs would not work on an old ActionScript VM, while one of asm.js's benefits is that the generated code will work on any JavaScript runtime, because it's just JS.


> 1. Modern JavaScript vms are sandboxed, not native plugins with unending security holes.

Right, (and I'm also in part replying to my sibling replies here) can we stop talking about "sandbox"ing as though it's something concrete and real?

Something being "sandboxed" doesn't really mean anything - or rather it does, but only in an abstract way. Everyone who uses the word "sandboxed" means something slightly different, and every time in history someone has implemented a "sandboxing" system, their idea of "sandboxing" is slightly different.

One person's "sandboxed" means "disables language access to the functions that could affect the system" where another person's "sandboxed" means "uses clever os features to isolate all execution into a separate container".

This is why you can endlessly debate whether x or y is or isn't "sandboxed".


> 1. Modern JavaScript vms are sandboxed, not native plugins with unending security holes.

Flash actually was sandboxed. Poorly, yes -- but so were JS VMs until very recently. It was only a matter of time.

> 2. It's nearly all open standards and open source, not proprietary closed source controlled by one company. OpenGL, EcmaScript, W3, Mozilla, Chromium, blink, webkit.

While the Flash IDE itself was closed-source, the format itself was almost entirely open -- and third-party tools to compile SWFs have been available for a long time: http://en.wikipedia.org/wiki/Adobe_Flash#Open_Screen_Project

> 3. It works on mobile devices, flash doesn't.

Entirely a political argument.

> 6. Don't think Flash ever had the kind of momentum JavaScript has. Like has anyone ever made a derby.js for Flash? A sharejs? Compiled other languages to ActionScript with source maps to help?

This is a tautological argument. "Javascript is better because Javascript is better"


> The web really isn't suited for app development at all, as the native mobile markets have demonstrated

Mobile markets have also demonstrated tightly controlled walled gardens, where fast iteration, platform independence and openness have been replaced with strict rules and controls by the market controlling entities. Is that really a direction you want to move in?

The web as a platform is still lacking in terms of User Experience, but I'd argue it's catching up quickly and already far ahead in most other areas.


> The web as a platform is still lacking in terms of User Experience, but I'd argue it's catching up quickly and already far ahead in most other areas.

The web has had quite a long time to get there, yet it is still easier to write a decent application on NEXTSTEP, which spawned the web, than on the web. You would think we could at least make it as easy as HyperCard or Visual Basic 1.0, but we are still trying to push UI into a format that wasn't designed for it. I'm not sure what the ultimate solution is, but I get the feeling it will be whatever comes next and not more iterations of the same.


"openness" - I want to use my favourite language and familiar environment to produce high-performance apps with good UX. The only advice I have heard from "open web" guys so far is to use transpilers (thanks very much). Xamarin and Mono seem to be the only organization that tries to give developers what they are asking for. It's a shame that they have to charge, but it's a reality.


>The web really isn't suited for app development at all, as the native mobile markets have demonstrated

It may not be suited for app development, but people want to run applications in their browser. I'd rather use gmail than outlook, google docs than microsoft office. If I can have photoshop in the browser, I'd rather use it there than install it separately.

This is the new reality. Deal with it. HTML is no longer just a document markup language.


It's very simple - there is no magic here. asm.js is just a "pidgin instruction set architecture", to allow communication between an emerging set of VMs - the browser runtimes - and a compiler backend. (The front-ends are the LLVM front-ends.)

The article is exactly right in saying that it's a way to route around JS. Javascript fanboys should not be praising asm.js, because it's a way to route around them. (Which is fine by me; JS is an abortion of a language that cannot die fast enough.)

I see asm.js as the Revenge of Compiled Languages. Coupled with generic interfaces for accessing underlying graphics and audio hardware, we're just right back where we started with Java applets. Write your apps in whatever language; run in the browser.


pwang, there are some people who actually enjoy coding in JS; believe me, I am one of them.

One cool thing about JS is that you have runtimes for it in computers, tablets, phones, TVs and most current videogames, so you can experiment and build stuff for a variety of hardware that no other language can reach as easily (of course you can reach anything with C, but it is not as easy).

Remember all that could be done in JS two years ago versus all we can do now. Imagine what we'll be able to do by the end of next year.

I don't care if people are using LLVM to cross-compile other languages to the JS runtime, this kind of approach and research makes better runtimes and both camps benefit, the people who hate JS can use their fav language and those that enjoy JS end up with a better runtime.


I (not pwang) think it's an abortion of a language because the syntax, the scoping, and the language as a whole are a pain to use. Sure, it runs on pretty much everything, but that doesn't mean it's suddenly enjoyable to use. The only outstanding feature of JS is that it runs on lots of stuff, but it need not be the only one that does.

As you say getting other languages to run is a good thing because those who don't like JS don't have to use it (or indirectly use it with stuff like Coffeescript).


Just to be clear, I don't think the reason JS is enjoyable is that it runs everywhere. I do enjoy the prototypal inheritance; I like the scoping and the language in general. I don't like CoffeeScript, but not for a technical reason; it's about personal taste, and I'd rather work in JS. To each their own; the fact that we have a non-proprietary language that is available everywhere is a very good thing.

IMHO most people who do not enjoy JS approach it with the mindset of a classical OOP programmer. They try to treat it like Java, or approach it as a toy language for cooking up quick jQuery scripts, and then feel frustrated. I am not saying that JS is everyone's cup of tea, but some people like me enjoy it.


The ideas that CoffeeScript is an acceptable replacement for JS and that "the language as a whole is a pain to use" aren't compatible.


Some people actually enjoy pain...


>Javascript fanboys should not be praising asm.js, because it's a way to route around them.

Depends on what kind of JavaScript fanboy you are. If you love the language and want applications to be written, from the ground up, in JS, asm.js is not your friend. If you want to build applications that run on pretty much every platform + kitchen sink, this is for you.


Doesn't that just make you a browser+web fan, and not a JS fan?


This article neatly sums up my thoughts on asm.js. On the one hand, I really like the fact that Mozilla is doing a lot to promote Javascript (which I usually enjoy programming in). On the other, asm.js basically destroys any reason to write Javascript.

For someone interested in web-based gaming, it really discourages me from investing in Javascript-based tools, since the future _won't_ be hand-written Javascript. It's only a matter of time before a tool like Unity includes an asm.js export target.


That's a good thing.

We spend too much time developing languages instead of environments. Unity is a good environment for developing games. If your goal is to make a game you should use Unity (or whatever... GameMaker has an HTML5 export module; use that). Javascript has nothing to offer you to enhance your productivity and develop your game.


I disagree, I think the future will be all kinds of things, both on the web and elsewhere. On the web, it will be both handwritten JS as well as compiled JS from various languages like C/C++ (into asm.js), TypeScript, etc.

Each of those is good for some use cases, but none is good enough for everything. Just like we have many languages for native development and web servers and so forth.


Pick a language you like (on its merits), not the one that is ubiquitous.


I'm very impressed by the Epic Citadel demo. But how do you debug a C/C++ program that has been converted to asm.js? What's the current tooling like?


Usually I'm debugging a native "desktop build" of the code in Visual Studio or Xcode, since 99% of the bugs are not emscripten-specific. But having said that: browsers are becoming a really good debugging platform as well. emscripten can emit source maps, which kinda lets you directly debug the C++ code in the browser (not as fluent as in a native debugger yet, since only code lines are mapped, not variables, but the potential is clear). Some WebGL-related debugging tools are really good in the browser (e.g. Firefox's WebGL shader debugger or the WebGL inspector browser extension). Especially on OSX, which doesn't have very good debugging tools for desktop GL, the browser WebGL debugging tools are actually better than what's available as native tools (not as good as PIX or NVIDIA's NSight tool on Windows, though).


Thanks! I was thinking specifically of bugs that occur in the browser target but not the native build.

Given your experience that "99% of the bugs are not emscripten specific", I'm very impressed with how emscripten can retain the semantics of the native code across the conversion and optimisation process. Obviously that's what any compiler does, but in this case the target environment seems to me much more complex than a physical CPU: threads, memory management, etc.


> in this case the target environment seems to me to be much more complex than a physical cpu: threads, memory management, etc.

It compiles to a restricted subset of JavaScript. There are no threads (there are no threads in JavaScript itself in the first place), and it works around memory management (it sets up big typed arrays, and memory allocations are slices of that).
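That memory model can be sketched in a few lines. The `HEAP8`/`HEAP32` names below mirror emscripten's conventions, but the snippet is illustrative, not actual compiler output:

```javascript
// One big ArrayBuffer stands in for the C heap; a "pointer" is just a byte offset.
var buffer = new ArrayBuffer(16 * 1024 * 1024);
var HEAP8 = new Uint8Array(buffer);   // byte view of the heap
var HEAP32 = new Int32Array(buffer);  // 32-bit int view of the same memory

// a malloc(4) in the compiled code would hand back an offset like this:
var ptr = 1024;

// *(int*)ptr = 42;  (">> 2" turns a byte offset into an Int32Array index)
HEAP32[ptr >> 2] = 42;
console.log(HEAP32[ptr >> 2]); // 42
```

Because every allocation lives inside this one buffer, there is nothing for the garbage collector to trace, which is part of why asm.js code runs so predictably.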


asm.js is brilliant because of the current situation we are in. However, the current situation we are in is silly, and can correctly be called a "tragedy of the commons."

Why? Well, for example, is JavaScript truly the best language we'd be able to create for browser scripting, or did it win by historical accident? (And I'm actually a fan of Javascript programming!)

And beyond that, we'd all like something that gives us safe native-speed rendering control in the "sandbox". Silverlight was meant to, as was Flash/Flex. But those were proprietary, and we didn't want one company to have "control of the web." HTML 5 hasn't been what we hoped for.

So basically, we're one big, divided bureaucracy that is not making rational decisions (what big divided bureaucracy does?).

I guess what might happen is something new will eventually come along (who knows how long it will be) that actually displaces the web as we know it. It will be an adoption-tsunami, similar to what the web itself was, and therefore it will be able to ignore this series of historical accidents that we're chained to today.


I agree, an Unreal Engine compiled to javascript doesn't make sense. It would make much more sense if it were compiled to some kind of bytecode (like NaCl). That will ultimately give you better performance, less loading time, etc.

However, maybe there are some libraries which, compiled to asm.js, are small enough that they can still run in a browser without asm.js support, e.g. a compression or encryption algorithm compiled to JavaScript. Maybe that is the sweet spot for asm.js: you are still able to support Safari and IE through plain JavaScript, but it's faster on Chrome and Firefox.


> It would make much more sense if it were compiled to some kind of bytecode (like NaCl). That will ultimately give you better performance, less loading time, etc.

Bytecode doesn't mean smaller download or faster startup, or necessarily better performance. It might, but you need numbers to show that.

In practice, right now startup is faster on asm.js than PNaCl, and vice versa for execution speed, but in both cases the differences are not large. Compare for yourself:

http://www.flohofwoe.net/demos.html

http://trypepperjs.appspot.com/examples.html


"It would make much more sense if it were compiled to some kind of bytecode (like NaCl)."

asm.js is a bytecode. The ".js" is misleading; it's not wrong, but it's misleading. asm.js is actually a bytecode specification that has a syntax that overlaps (but is not a superset of) Javascript, and the semantics of the bytecode overlaps (but is not a superset of) Javascript, but it is not Javascript. It's a bytecode that happens to have a serialization to a Javascript-compatible syntax.

Do not be fooled by the surface appearance and the name. Look at what it does. It is a bytecode, just one with a bizarre serialization. (And as a result of that serialization, the most literal interpretation of "bytecode" admittedly doesn't quite work, but it's a bytecode in every way except for a literal fixed-length code of bytes.)


How many CPU cores can said "bytecode" utilize at a time? Does it make a difference between value and reference types for optimal performance?


asm.js code is just numbers and functions, no JS "objects". Passing a C++ object by reference resolves to passing a pointer, just like in native code (a pointer in emscripten-generated code is actually an index into the typed array representing the heap). A nice side effect of this is that asm.js code doesn't trigger the garbage collector.


asm.js doesn't have support for reference types. You can access threads via the HTML5 'Web Worker' feature.


So, all the C++ threading code will automatically be converted to 'Web Workers'?


No, but it's not a big deal to write a parallel execution subsystem in C/C++ which abstracts the differences between pthreads and Web Workers away. Using low-level pthreads primitives directly in high-level code is a bad idea anyway.


> I agree, an Unreal Engine compiled to javascript doesn't make sense.

That demo runs at 60 FPS in my browser, allowing me to casually spend more money on more video games. Your objections seem ideological, as opposed to perf benchmarks, distribution improvements or other quantifiable terms.

Can you expand on why it doesn't make sense?


Just make sure you don't refresh your browser cache after downloading a couple gigs of assets.


Well, that has nothing to do with JS or asm.js, only with how the game manages its assets. The Citadel demo needs to preload all data before it starts, which is very easy to implement but is of course a suboptimal approach to asset loading; a proper game would only load what's really needed to start (4-5 MBytes max?), and from then on stream everything on demand in the background. It's about how much new data can be presented to the player per second, not how big the game data is overall. Asset sharing/reuse, data compression, and procedural generation are all important topics for browser games in general, not just games running in asm.js.


The original Java bytecode was actually worse than compiling to some real language - all the value-lifetime and other optimization information was lost, and even conversion to native asm produced unoptimized code. Is NaCl any different? I imagine JavaScript that is JIT-compiled could create a pretty good binary?


NaCl is just native machine code with some additional restrictions, AFAIK. It can make use of existing compilers and optimizers.

I agree that the binary is probably relatively efficient, but a) only if your code deals with 32-bit floating point numbers, b) the binary (erm, js file) is probably going to be bigger (hence increased download times), and c) the compilation does not come for free.


For file size, see http://mozakai.blogspot.com/2011/11/code-size-when-compiling... Summary: it's comparable, not bigger.


>It means JavaScript has nothing to do with it, it's just the poison we ended up with, the bitter pill we supposedly have to swallow.

Yeah. That's right. If we could remake the web from the ground up, and had 100% buy-in from all the major parties (corps and devs) to implement and use the new common standard, JavaScript would not be the central language. But we can't. So welcome to reality.


On an unrelated note, I absolutely love that site's header animation (like what you see if you scroll up and/or click "play").


Is that why mobile Firefox kept crashing? Dammit, don't send that stuff to my phone if I didn't ask for it.


The Citadel demo, if anyone's looking for it: http://www.unrealengine.com/html5/

The article seems pretty doomy regarding asm.js - is it really going to take off and become an unwieldy/frozen standard? Or is it of interest only to people writing game engines in pure JS / vanilla browser technologies?


Honestly, I've got no idea what the OP was trying to say... I think they are implying that asm.js is cool, but they're annoyed that it's built on top of JS rather than from scratch.


Second this. I get what they are saying: asm.js is cool, but it's on JS, so eugh.

Personally I'd love for a really low-level language (statically typed, little if any magic, no GC, multi-threaded) to work across all browsers, but I don't think browser vendors can just sit down and say: here is the perfect cross-browser language that will make all of the net easy.

Perhaps the best strategy is to have a low-level strict language on top of which you build easier-to-use constructs - which IMO describes both asm.js and PNaCl.


As a writer of native compilers, I read the article as written by the kind of developer who has always worked in interpreted languages, and who sees asm.js as "wrong" because it's not the JavaScript he would write by hand.

And again, as a writer of native compilers, I'm absolutely pro asm.js. I believe it's the best direction that JavaScript optimization can take, solving many hard problems elegantly. Indeed, asm.js is a better representation than most bytecodes (or even all of them). Seriously.


Interesting. Why do you consider that asm.js is a better representation than most bytecodes?

Don't get me wrong, JS is awesome (and so is asm.js), but I think things like the lack of static typing and multithreading are JavaScript's Achilles' heel.


We've already tried native and bytecodes: ActiveX, Java, NaCl, Silverlight.


It's ignoring history. The author is saying that browser developers should get together and agree on something better than JavaScript. At first this seems quite possible, since browsers come out with new features all the time. But people have been trying to replace js for a long time and it has never ever worked.


So asm.js is a strict subset of JavaScript, and the author is saying that asm.js is only useful when you want to compile a C/C++/whatever library into JavaScript, and that it's not interesting for JavaScript developers themselves. Then I wonder: why? Can't you write asm.js code directly (instead of compiling to it)?


You can, but it's a bit like writing inline assembly code. The code doesn't look very nice, and you lose LLVM's optimizer passes, which kick in before the JS code generation.
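For a taste of what hand-written asm.js-style code looks like, here is a minimal sketch: the `|0` and unary `+` coercions are the type annotations (a strict asm.js validator may demand more boilerplate than this shows):

```javascript
function AsmMath(stdlib) {
  "use asm";
  function add(a, b) {
    a = a | 0;            // parameter annotation: a is an int
    b = b | 0;            // parameter annotation: b is an int
    return (a + b) | 0;   // return annotation: result is an int
  }
  function half(x) {
    x = +x;               // parameter annotation: x is a double
    return +(x / 2.0);    // return annotation: result is a double
  }
  return { add: add, half: half };
}

// Runs as ordinary JavaScript in any engine; engines that recognize
// "use asm" can compile it ahead of time instead.
var m = AsmMath(globalThis);
console.log(m.add(2, 3)); // 5
console.log(m.half(9.0)); // 4.5
```

Every variable must be typed through these coercions, which is exactly why writing it by hand feels like writing assembly.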


But you could take Brython or Pythonium and make a Python2asmjs compiler (as I suggested in their forum: https://groups.google.com/forum/#!topic/brython/I7VoZNCiphI)

This could be a chance to skip JS altogether.


That's the great hope. Instead of bringing Javascript to the server side to have a single-language workflow, you'll be able to bring the non-JS language of your choice to the client side.


Yes, asm.js shouldn't be making JS devs too excited, because you shouldn't write asm.js code directly. It's best to take a strongly typed language and then run it through an asm.js compiler. JS isn't strongly typed, and many of the optimizations are unavailable to JS devs. I am happy that V8 is continuing to focus on optimizations that serve the JS writers, and I hope the language survives because I really love writing it.


> "And this is really the biggest contradiction of them all. Tons of people have invested countless hours to build these VMs, these new languages, these compilers, these optimizations. Yet somehow, they all seem to agree that it is impossible for them to sit down and define the most basic glue that binds their platforms, and implement a shared baseline for their many innovations. We really should aim higher than a language frozen after 10 days, thawing slowly over 20 years."

It's kind of hard to redesign an airplane in flight, and because of the way the web works, that's a problem that applies to browsers a lot more than some other pieces of software.


> It's kind of hard to redesign an airplane in flight

It is true that IE6 was in flight for 5 years. However, that has changed. Google replaces the Chrome plane every 6 weeks.

JavaScript is the new IE6.

I really would like to see Dart in a Chrome stable release. Some time ago I thought that a pure VM, like the JVM, would be better. Now I think it is safer to leave scripts loaded as plain text, just like HTML and CSS. It is hard to guess how the whole web ecosystem would work with scripts loaded as bytecode.


hot swapping and migration are extremely well-solved problems in software... the reason you see them screwed up so much is that few people actually care about them at all.

i'm convinced that fear and lack of understanding are the genuine problems here...

browsers and the web stack are fantastically shoddy and poorly engineered, though, and that is a serious obstacle... but it doesn't make the problem intractable or hard. 'just' effort.


Could we combine asm.js and "native web libraries"? (see https://github.com/samsquire/ideas#51-native-web-libraries)

Compile some non-JavaScript code. Create a package containing the native code. Serve the asm.js and the native web library.

This package is then installed in the browser, and the browser switches to it when it detects code about to be used that matches the installed web library's signature.

If the browser supports native web libraries, it uses that. Otherwise it falls back to asm.js.

Either way, we gain performance, and we can compile code natively AND to asm.js.


Windows XP gives me the blue screen of death with Firefox 25.0.1.


XP reaches end of life in 5 months, so I would expect a lot more problems in the not-too-distant future.

http://www.microsoft.com/en-us/windows/enterprise/endofsuppo...


Like gigantic botnets due to unpatched zero-day exploits.


My first comment and I get down votes :(

Please don't get me wrong, I was just looking for confirmation of the same error. I love Firefox. I never even switch to Chrome for common office work. I know Microsoft will stop supporting XP, but hopefully Mozilla won't [1]. Unfortunately, in some third-world companies (like where I work), IT guys prefer not to change what works until it is absolutely necessary.

Sorry for my English. PS: I'm not that IT guy.

[1] http://www.neowin.net/news/mozilla-to-support-firefox-on-win...


After Microsoft stops providing security updates, it will effectively no longer work.

You really, really need to upgrade somehow.

It really is necessary.

If you can't upgrade Windows, you need to consider moving to a different operating system.


Is it possible to compile Blink and run it using asm.js in Firefox? Then you could run Firefox inside of the Blink renderer, also in asm.js, and ...


One problem is that you can't do runtime code generation with Emscripten. V8 has no interpreter, so you would need a different JavaScript engine.

JavaScript engines can be run using asm.js. https://github.com/jterrace/js.js is a SpiderMonkey port to Emscripten (with JIT disabled).


You can absolutely do runtime code generation; you'll just be generating JavaScript. If you generate asm.js-compatible JavaScript, you'll effectively be JITting native code on the fly (you're just jumping through some absurd hoops to do it).

There are some proof-of-concept projects out there that do runtime generation of JS on the fly for things like recompilation and such in the browser right now. I believe Shumway [1] does it (recompiling ActionScript VM bytecode to JS on the fly) and PyPy.js has an experimental JIT too [2]. The runtime library for my compiler JSIL also does a significant amount of runtime code generation to improve performance and reduce executable sizes [3].

[1] https://blog.mozilla.org/research/2012/11/12/introducing-the...

[2] http://www.rfk.id.au/blog/entry/pypy-js-poc-jit/

[3] http://jsil.org/ ; https://github.com/sq/JSIL/blob/master/Libraries/JSIL.Unsafe... and others
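The basic mechanism behind this kind of runtime code generation is just building source text and compiling it with the `Function` constructor; recompilers like the ones above do this at scale, emitting asm.js-flavored code so the engine can optimize it. A toy sketch:

```javascript
// Generate a function specialized to the constant n at runtime.
function makeAdder(n) {
  // The |0 coercion hints to the engine that the result is a 32-bit int,
  // the same idiom asm.js uses for type annotations.
  return new Function('x', 'return (x + ' + n + ') | 0;');
}

var add5 = makeAdder(5);
console.log(add5(10)); // 15
```

A bytecode-to-JS recompiler does the same thing, except the generated source is the translation of an entire method rather than a one-liner.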


I don't see the problem with asm.js. It appears that Chrome and Opera support it too. It can only be a good thing. https://blog.mozilla.org/futurereleases/2013/11/26/chrome-an...

It doesn't require a plugin install like NaCl does.


>I don't see the problem with asm.

That's because you didn't read the article.

Which, among other things, already points to that blog post.


The article already links to that post, come on! Don't people read stuff anymore?


The linked website has a horrible animated background picture. Reading the text is almost impossible with IE 11. Screenshot: http://postimg.org/image/46p2krx95/


I hate to be that guy, but you're using IE.

On a serious note, you can't really blame people for not supporting a browser that needs hacks for every single version.

Related: http://www.paulirish.com/2011/browser-market-pollution-iex-i...


Meh, delivering a bunch of text with some images and hyperlinks in between was pretty much a solved problem twenty years ago. It's nice that there's still progress being made towards doing more interesting things on the web, but if you have trouble pushing text and images to IE, the problem might not lie with IE.


Ask the jQuery devs: Chrome causes them more trouble than all other modern browsers combined (IE9+, FF, Safari, etc.).

I am well aware of the history, and I hate IE < 10 too. I use FF as my main development browser, and among others, of course, Chrome and IE 11. The trouble comes when people only test their website with Chrome. I am using IE 11 as I have to at my workplace, and it has by far the best UX (IMHO). FF still uses only one process and cannot handle my habit of using dozens or even hundreds of tabs in an optimal way.


IE11 is a very promising alpha release of the next "good" IE.


no.


I am surprised that Dart is not mentioned as a comparison in any of the comments. It is also heralded as a better and more performant JavaScript. Is it because the use case of asm.js is more limited? I would imagine that DOM integration in asm.js would be tricky, so that would set Dart apart.


asm.js is not "a better JavaScript". It's not "JavaScript with better performance". In most senses that people would use, it's not even JavaScript, really.

asm.js is not a faster jQuery. It's not for doing yellow-flash alerts, or for more quickly selecting and manipulating DOM elements.

asm.js is a way to ship traditionally desktop-type software, including software which was originally written in a language like C or C++ and compiled into asm.js from that source, to a web browser, to be run in the web browser.

asm.js is (a subset of) JavaScript's syntax, used as a transport/intermediate representation for that code. It will execute just fine as JavaScript, and well-optimized JavaScript engines can do quite well with it (hence it is fully backwards-compatible, if not as fast, in a browser which supports JavaScript but not asm.js), but the intent is primarily to treat it as an IR, with the browser performing the final step of compiling that IR to native code before executing it as native code, rather than executing it as JavaScript.


I definitely recommend poking around the console on this site!


UTF-8 is a legacy mechanism now?


What would it take to get a standards board to approve a common VM for browsers?

I don't see this ever happening. They would in effect be eliminating themselves. They would have to find new jobs or even careers.

Once the VM is standardized, what about HTML/JS/CSS? Well, who the hell wants to use those slow-moving legacy technologies?

So the standardization now becomes for Python, for C#, for Scala and Lisp, etc. (and their associated UI frameworks) - not controlled by the W3C at all, thus their extinction.

It's more than this, though. The W3C has an agenda, and it is not to advance technology; it is to slow it down. They want everything moving so slowly that standards can be followed across the board. They want JS/CSS/HTML to be the be-all and end-all, not just in the browser but everywhere. I think this should be pretty clear if you follow their trail going back 10-15 years.

It is like a socialist government in a way. The promise is to keep everything stable and let everyone be on equal footing (equal here because the technology moves so slowly that nobody can be left behind by it). They have to kill and silence many revolutionists who want freedom along the way, but consider themselves justified in doing so. Meanwhile, in a neighboring free country with limited government, people flourish. They have more ups and downs, true, and mistakes are made along the way, but after 10 years the free country is wealthy and flourishing, while the socialist one is stagnant and poor.

Think of the mere opportunity of innovation that would exist if a language creator could sit down and create a new language and UI framework universally for browsers in a well established and supported way. This lack of freedom is stagnating innovation.

Let the people decide. Make a standardized VM and your HTML/JS/CSS stack and let the people vote with their choice of options that appear.


You are overestimating the power that the standards board has over what browser vendors choose to include in their software. What you are asking of them is not really in their power. Besides, if VMs can make the W3C irrelevant, then they will do so regardless of W3C's approval of any particular standard.

For what it's worth, we're already very close to the point where "a language creator can sit down and create a new language and UI framework universally for browsers".


> You are overestimating the power that the standards board has over what browser vendors choose to include in their software.

Mozilla and Google already want a VM as evidenced by their work. It would not take any convincing to get them there.

> if VMs can make the W3C irrelevant, then they will do so regardless of W3C's approval of any particular standard.

Now you're underestimating the power of an established standard. For something to really flourish, enterprise has to adopt it. Without a standard, this will not happen. This is a bit of a chicken-and-egg scenario.



