

Asm.js: A Low Level, Highly Optimizable Subset of JavaScript for Compilers - Peteris
http://badassjs.com/post/43420901994/asm-js-a-low-level-highly-optimizable-subset-of

======
haberman
As someone who's been a vigorous proponent of (P)NaCl for a long time, I have
to say that I'm excited to see this. If it can truly deliver on its promise
(near-native-code speeds with no imposed GC overhead), it will be a truly
welcome advance indeed.

I hope that if it does succeed that it will open up even more possibilities,
like optional SSE/AVX intrinsics (for added speed in the most demanding
software) and threading support.

It's great and educational to see an alternative approach to the problem that
is obviously quite different than (P)NaCl. May the best technology win.

~~~
gsnedders
For much, NaCl has a 2x perf penalty (v. native C++), and PNaCl higher (its
compilation overhead delays startup). asm.js _already_, despite being
brand-new, for much has a 2x perf penalty: equal with NaCl, yet more portable.
(See zlib for an example of this.)

There are still cases where asm.js is slower (box2d, for example, though there
it falls short of the speed needed for 60fps), but I'd expect the difference
to do nothing but decrease. Unlike PNaCl, it's not mere research (after
several years PNaCl still hasn't shipped); it works cross-browser, and,
further unlike NaCl, cross-platform.

I expect someone will write some binary serialization for asm.js: you have all
the primitives for iadd, isub, etc.

~~~
haberman
> For much, NaCl has a 2x perf penalty (v. native C++)

Huh? From the NaCl paper: "The worst case performance overhead is crafty at
about 12%, with other benchmarks averaging about 5% overall."

How do you get from 5% to 2x (100%)?

> asm.js already, despite being brand-new, for much has a 2x perf penalty:
> equal with NaCl

Have you actually run the same benchmarks on both and found them equal?

According to this comment no such benchmarks have been performed yet (at least
by azakai): <http://news.ycombinator.com/item?id=5228737>

If not, declaring it "equal" seems a bit premature.

------
kibwen
Prior discussion (with comments from dherman, Luke Wagner, and various other
Mozilla employees): <http://news.ycombinator.com/item?id=5227274>

------
justin_vanw
Why would we go this route? Why not just define a bytecode spec and be done
with all of this nonsense? Mozilla is perfectly free to implement the bytecode
by compiling it to a restricted subset of JS (in fact, it might make a lot of
sense to do so). Defining a simple register- or stack-based machine and
producing bytecode for it would greatly simplify the job of implementers, it
would make defining the specification and determining compliance with the
standard very straightforward, and it would let us eventually do web
programming without being tied to a specific language.

~~~
azakai
Why would a bytecode spec be simpler for implementers? Is there some part of
the spec of asm.js that looks hard to implement to you?

And why would a bytecode spec be better for other languages than this approach
(which also supports that)?

~~~
justin_vanw
I don't know you, but I'm guessing you are somehow in love with javascript and
invested in seeing it succeed.

The world is largely divided into two camps, the people who are in love with
javascript and the people who think it is a sick joke. Those of us in the
'sick joke' camp would like to see javascript made optional to web
programming, and not a part of our toolchain at all. If javascript is part of
the toolchain at all, I will have to eventually debug javascript code, and
javascript was written under the 'principle of maximum surprise', making it a
very awful language to those of us who haven't based our careers on learning
every single edge case (and every case is an edge case in javascript!).

The other half of the world is the 'javascript is great' people, and they
usually learned to program with javascript or have spent a large portion of
their coding career inside of it. They don't see what the big deal is, it's a
perfectly good language, right? Well, it's actually a pretty gross language
that has one amazing feature, it is available on every computer without having
to install anything extra.

Please, let us have the OPTION of a completely javascript-free web. The
javascript fans can keep it, just give the rest of us a choice.

~~~
cpressey
As someone who is more in the second camp than the first (JS is something of a
joke disguised as a language), I actually see asm.js as a great way to
eventually kill off Javascript, whereas I see trying to specify -- and get all
browsers to implement -- a whole new VM as quixotic.

~~~
BrendanEich
You see clearly!

Although JS, warts and all, is no joke at this stage. It will die hard, but my
hope is that JS VMs become multi-lingual in as good a way as, or better than,
the JVM and the CLR. Certainly better in terms of diversity of implementation,
reach of the Web, and consequent interop testing in the large.

/be

------
mistercow
I get the point of Math.imul, but it seems at odds with the idea of asm.js
being backward compatible. I guess it's easy enough to provide a polyfill (MDN
even provides one), but that seems rather inelegant.

~~~
azakai
It's a pretty minor issue - asm.js is viable without it. For multiplications
of small numbers, like 5 times x where x is 32-bit, you don't need Math.imul,
normal multiply is fine. The only case where Math.imul is useful is x times y
where neither x nor y is known at compile-time, so in theory they could be big
enough to cause double-rounding in JS.

But even in that case, emscripten can emit code without Math.imul (there is a
compiler flag). The code will work, but is a little slower than with
Math.imul, that's all. In fact, in practice you don't even need the polyfill
on MDN (which is precise); you can do imprecise multiplication _with_
double-rounding, which works in 99% of cases in my experience, making
Math.imul even less crucial.

But it's nice to have Math.imul, just to say that even in the worst case (an
odd codebase with tons of integer multiplies where both operands are very
often very, very large), performance will be predictable and fast.
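To illustrate the issue: a 32-bit product can exceed 2^53, so a plain double
multiply rounds before the `|0` truncation and the low bits come out wrong. A
sketch (the polyfill follows the MDN approach; the sample operands are ours):

```javascript
// Plain JS multiply: 0x7fffffff * 0x7fffffff = 2^62 - 2^32 + 1, which is
// not exactly representable as a double. It rounds to 2^62 - 2^32, so the
// low 32 bits come out as 0 instead of 1.
var rounded = (0x7fffffff * 0x7fffffff) | 0; // 0

// MDN-style polyfill: split each operand into 16-bit halves so every
// intermediate product stays exactly representable, then combine modulo 2^32.
function imul(a, b) {
  var aHi = (a >>> 16) & 0xffff, aLo = a & 0xffff;
  var bHi = (b >>> 16) & 0xffff, bLo = b & 0xffff;
  return ((aLo * bLo) + (((aHi * bLo + aLo * bHi) << 16) >>> 0)) | 0;
}

var exact = imul(0x7fffffff, 0x7fffffff); // 1, matching native Math.imul
```

For small operands (the `5 * x` case above) both paths agree, which is why a
normal multiply suffices there.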

------
comex
Just to inject another viewpoint into this -

I don't care that asm.js is backward compatible with JavaScript. If my program
uses an appreciable fraction of the available CPU and it has to act in real
time in some fashion (anywhere from 60fps for a game to simply acting
responsive in a GUI app), execution several times slower by a standard JS VM
is no better than no execution. If I want to be cross-platform in the
hopefully _near_ future where asm.js is not yet widely supported, I would pick
Chrome and Firefox to support for now and use a NaCl plugin for the former.
From this
perspective, a "true" bytecode VM would be no worse - and it would hardly be
boiling the ocean, since the size of code required to parse a simple bytecode
is negligible. (If you want to compile it efficiently, either you bring in all
of LLVM or modify your JS engine, which is hardly negligible, but asm.js is
the same in that regard. The only difference is _parsing_.)

But I think it's nice that the "bytecode" is human-readable and pretty much
writable. It will be better to use a (possibly lightweight) tool to compile to
asm.js, but it's nice that small kernels can just be written without a
compiler.

~~~
Arelius
> execution several times slower by a standard JS VM is no better than no
> execution

Having worked in games for a while, I disagree. From a user perspective, I've
found that when games don't work at all, players often feel the developer is
to blame, while if a game runs but is just unplayably slow, they are much more
open to blaming the system they play it on.

And let's be honest, if this is important for any sort of app, games are
indeed one of them.

Additionally, I'd much rather all my features work, but have to scale back on
visual effects, rather than have to write both a high-performance, and a low-
performance version, _and still_ be required to scale back effects.

From my perspective in game development the asm.js approach seems to be a
significant win.

~~~
yareally
> Having worked in games for a while, I disagree. From a user perspective,
> I've found that when games don't work at all, players often feel the
> developer is to blame, while if a game runs but is just unplayably slow,
> they are much more open to blaming the system they play it on.

From my history of reading them, Steam comments on their forums seem to blame
developers regardless. There may be more argument about who is to blame (when
there are some systems it runs fine on), but plenty still blame the developer
when it runs slow and their system should be able to run it (such as claims
that similar game X runs fine, so Y should too [even if it's an
apples-to-oranges comparison, users think so anyway]).

------
javajosh
Can coffeescript generate asm.js compatible javascript? Or rather, what
changes to coffeescript would be required to support it, I wonder?

~~~
dangoor
CoffeeScript and JavaScript are semantically almost the same. CoffeeScript is
not going to magically become faster because of asm.js. As I understand it,
asm.js is really intended as a compiler target for languages with different
semantics from JS that can achieve better performance by hinting more directly
about what the machine should do.

~~~
javajosh
Ah, I see. That's why GWT would really benefit from asm.js but coffeescript
would not: the Java source already contains the extra type information (which
is currently being mostly ignored), whereas Coffeescript is just as dynamic as
native JavaScript.

Not sure why I got downvoted for asking though. Oh well! Cheers.

------
fijal
x | 0 as declaring int. +x as declaring float. Math.imul to multiply integers.
seriously???!!! It took me roughly half an hour to decide whether it's an
elaborate joke or an actual idea.

Also javascript (without introducing new concepts) is not low level enough to
write down everything you might need (have fun implementing 64bit integer
operations with overflow for example).
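For context, the coercions in question look like this in practice. A
hand-written sketch (the function and its names are ours, and the "use asm"
directive and module wrapper a real asm.js module needs are omitted): `|0`
marks a value as int, unary `+` marks it as double, and an asm.js-aware engine
reads the coercions as static type declarations while any other engine just
runs them as ordinary JS.

```javascript
// Sketch of the asm.js annotation style. This is plain JS and runs in any
// engine; an asm.js engine treats the coercions as types, not operations.
function sumSquares(n) {
  n = n | 0;              // parameter type: int
  var i = 0;              // int local
  var acc = 0.0;          // double local
  for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
    // +(i|0) converts int to double; the accumulation stays double
    acc = acc + +(i | 0) * +(i | 0);
  }
  return +acc;            // return type: double
}

sumSquares(4); // 0 + 1 + 4 + 9 = 14
```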

~~~
devongovett
Well luckily you don't have to write it. Use a compiler that can generate it
for you. It's not really designed for humans.

~~~
fijal
It's a very bad target for compilers too. These guys should just get their
stuff together and write a reasonable bytecode format for browsers that is
fast to read and verify. It would also save quite a bit of transfer bandwidth.

PS. If Intel told me to do such crap in their architecture manuals, I would be
the last user of PPC out there. I suggest reading them, or the JVM bytecode
spec, to see what a _good_ compiler target looks like.

~~~
adamnemecek
When I was making a very similar argument in the last thread, it seemed like
people for whatever reason just did not want to let JS go.

~~~
chc
You're mistaken. It is not that people "just did not want to let JS go," but
that "let JS go" is so immensely hard at this point in time that it is sitting
near "boil the ocean" on the practicality scale. We don't know how to do it,
so it's not a meaningful suggestion. In order to convince people that it's
even worth trying, you need to enunciate a) a practical plan for "letting JS
go," and b) a case for why the advantages of that plan _in the context of the
modern browser landscape_ outweigh the advantages of the plan the Mozilla devs
are going with. You didn't do that.

To put it another way: If we were implementing Web browser scripting
completely from scratch with no history to lean on, this is not how I would do
it and probably not how Dave Herman would do it either. But we _aren't in that
situation_. As it stands, there are lots of existing JavaScript
implementations that browser makers are heavily invested in — and they are not
going to throw that all out and implement a new, incompatible and much more
restrictive standard just because I wave my magic wand. The idea behind asm.js
is that it is the smallest change you can make to still get the desired
effect, because change is hard.

Even if you don't think asm.js is perfect, it's a huge step in the direction
you want to go. Complaining that it doesn't teleport us all the way there
seems like missing the point.

~~~
fijal
Someone has to break it at some point. Making it live longer is not serving us
any good. We would still be primarily using ALGOL, COBOL and FORTRAN 66 with
this attitude.

~~~
chc
I agree that someone has to break it someday, but that's similar to how
someday I'll buy a house — that doesn't mean I'm ready today! They aren't
_making_ JavaScript live longer — that will happen regardless. There simply
isn't the will among browser-makers to break it all at once, so all you're
doing by breaking it is making something that no one can use because it's
completely incompatible with everything on earth.

I feel like people are misunderstanding the challenge here. The challenge is
absolutely not creating a proposed standard or creating a virtual machine.
Those are relatively easy. The challenge is herding browser-makers to whatever
solution you propose. The nice thing about asm.js is that it automatically
works everywhere and it's easy for browser-makers to transition their existing
products to it. Because it takes the path of least resistance, it has a better
chance of living up to the real challenge — gaining acceptance — than any
other plan I've heard.

There are already lots of virtual machines out there. Throwing another one at
the wall isn't going to suddenly bring us into a post-JavaScript world. It
reminds me of the xkcd about standards:

[Situation: There are 14 competing standards.]

Guy: "14?! Ridiculous! We need to develop one universal standard that covers
everyone's use cases!"

[SOON:]

[Situation: There are 15 competing standards.]

