Wayyyy back in 2007 I half-jokingly made a prediction about the future of computing. The hot topics at the time were AJAX and new multi-core processors targeted at consumers. Everyone was trying to envision which language would win out by making it easy to write massively parallel programs. There were lots of articles published about Haskell and software transactional memory. The demise of C and C++, and of their managed Java/C# spawn, was imminent. How could you write in a language that couldn't handle multiple cores for you?
The future is here, in 1/4 of the time I thought it would take. I guess I have to go write a new prediction.
So. . .here's my prediction, taking into account the pace
of hardware development and the history of software
development. In 20 years, cheap hardware will be
ridiculously fast, but it will still look very much like
Intel hardware today. We'll have many, many CPU cores to
work with, but nobody will use a parallel programming
language designed to take advantage of multiple cores.
Instead, virtualization (i.e. VMware, Xen) will be
integrated into the operating system, and each process
will run on its own virtual machine.
Each web browser will have been expanded into a full blown
widget toolkit and have merged in something that looks
like Flash, but there will still be multiple incompatible
browsers. The latest craze will be a browser compatibility
layer written in a programming language that compiles
applications but allows you to use them anywhere.
People will set about re-writing a version of Photoshop in
the new compatibility layer, and everyone will wonder why
they'd do that, when the current version of Photoshop runs
in Internet Explorer just fine.
Edit: The lesson here is that if you want to predict the future, try to do it by making a prediction so ridiculous it could only be a joke.
In what way is using LLVM bitcode (PNaCL) at all similar to ActiveX?
Yeah yeah I get it. I like asm.js as much as the next person; I just wish this whole PNaCl-is-ActiveX meme would seriously go away. It's willful ignorance.
FWIW, I think it's fantastic that asm.js is turning out great. I also think that PNaCl is a worthy technical achievement. Two ways of solving the same problem given different constraints. I think Mozilla has a better chance, and I agree with their reasoning re: browser compatibility and third-party adoption. Still doesn't mean I'm going to go around calling PNaCl ActiveX when it clearly isn't.
I think asm.js has technical advantages over LLVM bitcode. Neither were really designed for this purpose (for LLVM, you must strip the undefined behavior; for JS, you must strip the high-level semantics), but at least JS has a fully specified, multi-vendor standard specifying the execution semantics. Plus asm.js has a very minimal type system and is non-SSA (while retaining high-level loop constructs that can be easily used to construct SSA without dominance frontiers if needed), both of which I think are advantages for delivering bytecode over the Web.
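For anyone unfamiliar with what that "very minimal type system" looks like in practice, here is a rough sketch of an asm.js-style module. The function names are mine, purely illustrative, and not from any shipped codebase; the `|0` and `+` coercions annotate every value as either a 32-bit integer or a double:

```javascript
// Illustrative asm.js-style module. The "use asm" pragma plus the
// |0 (int32) and + (double) coercions are essentially the entire
// type system described above.
function AsmModule(stdlib, foreign, heap) {
  "use asm";

  function add(a, b) {
    a = a | 0;            // parameter annotation: 32-bit int
    b = b | 0;
    return (a + b) | 0;   // return annotation: 32-bit int
  }

  function avg(x, y) {
    x = +x;               // parameter annotation: double
    y = +y;
    return +((x + y) / 2.0);
  }

  return { add: add, avg: avg };
}

// Any JS engine runs this as plain JavaScript; an asm.js-aware engine
// validates the annotations and compiles the module ahead of time.
// (A real module would pass the global object as stdlib.)
var mod = AsmModule({}, {}, new ArrayBuffer(0x10000));
```

The heap argument is the module's linear memory; this toy example never touches it, but Emscripten-generated code keeps all of the compiled C program's memory there, accessed through typed-array views.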
That must be why asm.js v0.1 already introduced non-standard language extensions (imul) and has a roadmap full of more. Which browsers does the asm.js of today run on? And which browsers will the asm.js of the future run on?
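For context, `imul` exists because plain JavaScript multiplication goes through doubles and loses the low bits of a full 32-bit integer product. The extension (later standardized as `Math.imul` in ES6) degrades to a plain-JS fallback, roughly like this sketch (my own paraphrase, not any browser's shipped shim):

```javascript
// Rough sketch of a Math.imul fallback: C-style 32-bit integer
// multiply, done in 16-bit halves so no intermediate product
// exceeds the 2^53 range where doubles are exact.
function imulShim(a, b) {
  var ah = (a >>> 16) & 0xffff, al = a & 0xffff;
  var bh = (b >>> 16) & 0xffff, bl = b & 0xffff;
  // The high-half * high-half product overflows 32 bits entirely,
  // so only the low product and the cross terms (shifted up 16) survive.
  return ((al * bl) + (((ah * bl + al * bh) << 16) >>> 0)) | 0;
}

// Install only when the engine lacks the builtin, as the
// compatibility prologues of the era did:
Math.imul = Math.imul || imulShim;

console.log(imulShim(0xffffffff, 5)); // -5, wrapping like a C int32
```

On an engine without the extension, content using `Math.imul` runs correctly through the shim, just without the speed benefit, which is the fallback story the comment above is questioning.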
It works really well and I'm glad to see more of this happening. Being able to share games over the web that run at near-native performance levels without plugins has a lot of potential both for end-users and creators. Really cool stuff.
If IE is giving you half the performance of the nearest, free, easily downloaded competitor in your favourite online game, you might be tempted to switch. And this is for the few users that are on IE10!
Two years ago when the tables were turned and Google announced Dart, which has the same compatibility strategy as asm.js (fastest with explicit support, falls back to regular JS), Brendan Eich had these harsh words for Google's move (https://news.ycombinator.com/item?id=2982949):
So "Works best in Chrome" and even "Works only in Chrome"
are new norms promulgated intentionally by Google. We see
more of this fragmentation every day. As a user of Chrome
and Firefox (and Safari), I find it painful to experience,
never mind the political bad taste.
But now "runs best in Mozilla Firefox" is cool?
I actually like asm.js and think it's a totally reasonable approach. I mention this only to highlight that maybe the bad-faith assumptions directed at Google are not entirely deserved. Objectively these two are similar moves, and I think they are both in good faith.
Obviously I'm not Brendan or Chris Peterson, but I see there being two meanings of "works best in Firefox/Chrome": one of which is "Chrome/Firefox has a faster implementation of the standards" (which is what all browsers strive for—for the Web to "work best" in their browser), and the other of which is "Firefox/Chrome has a faster nonstandard implementation of X and there is a worse, standards-based fallback for browsers that don't support this vendor-specific extension", which is what I read Brendan's frustration as relevant to.
By "runs best in Firefox", I just meant that Firefox executed asm.js content faster than Chrome (though the same standards-based content is compatible with all browsers). Google's "works best in Chrome" demos usually depend on non-standard features or the phrase is just marketing text.
I can't speak for Brendan, but I think his comment linked above was concerned that Dart and NaCl promote proprietary content types.
The Dart language is an interesting but incremental improvement like CoffeeScript and TypeScript. The Dart VM is a dead end because Apple, Microsoft, and Mozilla will never embed Google's Dart VM. If Google merged Dartium's VM into Chrome, web servers would serve proprietary Dart content to Chrome and "separate but equal" JS content (generated from Dart) to other browsers. That approach would require extra tooling and testing from web developers and open the door to Google services serving exclusively Dart content. Imagine if YouTube only supported Chrome.
If Dart is "proprietary," then so are asm.js, Mozilla Persona, Rust, and many other cool things that Mozilla is doing.
For example, take Persona. Mozilla invented, on its own, a new identity protocol which is not based on any existing standards and which it solely controls, released some code that implements it, and is trying to convince people to adopt it. When Google does the same kind of thing, it is called out as "proprietary."
> and open the door to Google services serving exclusively Dart content. Imagine if YouTube only supported Chrome.
This is exactly the kind of bad faith presumption that troubles me. Google is trying to push the web forward but in your mind their end game is to break its web properties for every browser except Chrome? This is so far from reality that I don't know how to respond to that, except to lament that distrust of Google by people associated with Mozilla is so high.
> If Dart is "proprietary," then so is asm.js, Mozilla Persona, Rust, and many other cool things that Mozilla is doing.
I don't think asm.js, Rust, dart2js, or Go are "proprietary" because their implementations are open and their output can be executed by multiple platforms or clients. Dart source files are only executable by Google's Dart VM.
> This is exactly the kind of bad faith presumption that troubles me. Google is trying to push the web forward but in your mind their end game is to break its web properties for every browser except Chrome?
I do not think Google is plotting to create a walled garden (unlike Microsoft of the 1990s). But I think many people at Mozilla worry about future timelines where proprietary systems could be created, even if they are the inadvertent, cumulative result of different product teams' decisions.
Consider Google Hangouts. When Hangouts was launched in May of this year, it included an NPAPI plugin to support other browsers like Firefox. But now, only seven months later, the Hangouts service is very popular but is only accessible as a Chrome extension. The Firefox NPAPI plugin is no longer supported or available for download.
Or consider Chrome's announcement to drop NPAPI itself. NPAPI is the source of much woe and instability for Firefox, but NPAPI is a de facto standard for browser-independent content. Pepper, the proposed replacement, is only implemented by Chrome. The only version of the Adobe Flash Player that uses Pepper is the Chrome port maintained by Google engineers. (Disclosure: I work at Mozilla and I used to work on Adobe's Flash Player team.) A co-worker at Mozilla told me that he asked a friend on the Chrome team "How much of the Pepper API does Chrome's Flash Player use?" and the half-joking response was "150%" because Chrome's Flash Player relied on Chrome internals that were not exposed in the Pepper API.
Or consider Google's 2011 promise to drop support for H.264 from Chrome. Google had the leverage with YouTube to migrate content and clients from H.264 to WebM, but that never happened. Even today, many YouTube videos are not encoded as WebM and content creators have a strong incentive to publish H.264 videos because Google Ads only support YouTube's Flash player.
I'm not really sure these two are different. How is dart2js "worse" than the Dart VM, except for speed? And even in speed, dart2js is faster than hand-written JS in many cases (see https://www.dartlang.org/performance/)
With source maps Dart is debuggable from JS AFAIK. And dart.js allows deploying Dart content in a way that works seamlessly whether or not the Dart VM is present. I'm really not seeing the difference here.
Unfortunately the "enterprise" seems set on it. Yesterday I was working with a new content management platform and had to downgrade my Internet Explorer to 9 because it only works with IE 7, 8, and 9. Even that didn't work, because of some MSXML problem I couldn't fix, so I ended up having to use a VM with XP and IE 8 installed.
Un-be-lievable. I don't know who these Enterprisey IT managers are who are making these decisions but they seem to have a lot of power which they are using irresponsibly. And they seem to love IE.
A decade ago, the large company I worked for switched to a web-based timecard system. The only problem: it was IE-only, and the software team of 50+ were all using HP-UX or Solaris workstations. No IE. Oh, and they wanted timecards done daily.
The fix was to set up two terminals in the lab, with a Unix backend and a Windows NT virtual environment, so we could use IE to do timecards.
Is that really a bad thing? You have a known working VM image to access one particular system. It will never stop working. At worst we'll just run an x86 emulator on our 10GHz 60-core ARM-based phones.
asm.js really shines for use cases like this; the fact that they did the port in a week is truly remarkable. Graphics, audio, and multiplayer all work fine. Loading times are pretty slow on Chrome, but for a first version this seems great.
Edit: it seems pretty playable in both Firefox and Chrome; I don't notice much of a difference performance-wise. Now please make the reverse case for PNaCl and pepper.js and we can all get really excited :)
I feel like "compatible with all modern browsers" is a little dishonest as long as Apple keeps dragging their feet with WebGL in iOS. I know that this isn't an asm.js problem, but the implication is that you can make a game like this which runs in all modern browsers, and that's simply not true yet.
I would argue that a browser without WebGL is not a "modern browser". WebGL is part of what a browser is today, and it works (and is on by default) in recent versions of Internet Explorer, Chrome, Firefox and Opera.
I agree in some sense, but the honesty of a statement depends partly on how the audience can reasonably be expected to interpret it. And you can reasonably expect an audience to interpret "modern web browser" to include any major browser released within the last year.
I disagree. When IE8 and IE9 came out, and within a year of their release, did many people consider them "modern browsers"? I don't think so. (Only with IE10, and really with IE11, did Microsoft's browser become worthy of that designation.)
While I really see the potential of asm.js, a couple of things about the execution bother me (including this title):
1. Brings my browser to its knees (Firefox here)
2. Security. They seem to open themselves up to a whole lot of potential hacks (I hope their code deals with cookie hijacking, CSRF, sniffing, etc.). While this would be a problem with closed source too, it's made worse by running in the browser.
3. Cheaters. How would they deal with people injecting code and creating bots?
Major desktop titles, probably not, but major mobile titles, definitely yes.
WebGL + asm.js on desktop computers should be roughly equivalent to native iOS/Android applications on top-of-the-line mobile devices (with the possibility of better effects on desktop GPUs).
This, by the way, is what's going on with these very fast Emscripten ports: they can be done so quickly because the bulk of the hardest work was already done before, when engine and game developers made their code, and maybe even more importantly their assets, work well on mobile devices.
If you noticed, many of these fast Emscripten ports were not desktop ports but already had existing mobile versions: the Unreal Engine Citadel demo, the Unigine Crypt demo, and now Monster Madness.
Nevertheless it's still very impressive, big kudos to Mozilla folks.
For example, these iOS 3D games should be feasible on the web (runtime-performance-wise; big asset downloads may be a pain): The Walking Dead, The Wolf Among Us, Grand Theft Auto 3 / Vice City, the Infinity Blade series, Max Payne 1, Mirror's Edge, XCOM: Enemy Unknown, Real Racing, the Asphalt series, the mobile Need for Speed series, etc.
Plus of course anything 2D - Angry Birds, World of Goo, Plants vs Zombies, Osmos, Limbo, etc.
Minecraft in the browser has been done already, hasn't it? It'd be pretty trivial, as it's been ported to Java / C# / Objective-C already and has low system requirements.
Given that asm.js games tend to run at something like 15-30% of the speed of native code, I doubt we'll see Battlefield 4 or Starcraft 2 anytime soon.
Additionally, given that the WoW game client is many gigabytes in size, I doubt we'll see that either, even if it could run.
This is from a link in the article for more information. These Monster Madness fellows found Firefox gave them 33% of native performance, and Chrome 20%. Presumably IE would give you fractions of these figures.
I figured that, rather than citing figures from benchmarks, I'd go for the real world scenario that this release has given us.
Mozilla needs to tone down the disingenuous marketing, as it's going to bite the otherwise worthy asm.js in the long run.
The problem is that the video fails to explicitly point out that the only version relying on asm.js is the browser version, which is the least interesting one. While they're all excited about the different devices, those are all supported thanks to Unreal's cross-platform support and have nothing to do with the asm.js version.