
Brendan Eich: WebAssembly is a game-changer - alex_hirner
http://www.infoworld.com/article/3042705/web-development/javascript-founder-brendan-eich-webassembly-is-a-game-changer.html
======
jacquesm
Personally, I think this is terrible (and it really is a game-changer, only
not the kind I'd be happy about). The further the web moves away from being a
content delivery vehicle and toward being a delivery vehicle for executables
that only run as long as you are on a page, the more we will lose the things
that made the web absolutely unique. Once, the content was the important part
and the reader was in control; universal accessibility was on the horizon,
and the peer-to-peer nature of the internet had half a chance of making the
web a permanent read/write medium.

It looks very much as if we're going to lose all of that to vertical data
silos that will ship you half-an-app that you can't use without the associated
service. We'll never really know what we lost.

It's sad that we don't seem to be able to have the one without losing the
other. Theoretically it should be possible, but for some reason the trend is
definitely in the direction of a permanent eradication of the 'simple' web,
where pages rather than programs were the norm.

Feel free to call me a digital Luddite, I just don't think this is what we had
in mind when we heralded the birth of the www.

~~~
Sir_Cmpwn
I agree. I'm really running dry on respect for Brendan Eich. None of the moves
he's making are for the benefit of user privacy - look at Brave, his new
browser project. It replaces ads on the web with his own ads, tracks you, and
puts money in his pocket instead of the publisher's pockets. I'm struggling to
remember why he was respectable in the first place - for making JavaScript, an
awful programming language we've spent 20 years trying to fix? I don't think
that his word on these issues is worth anything any longer.

~~~
drumdance
He created JavaScript in 10 days. It's still in use 20 years later. That's
very respectable.

~~~
na85
>He created JavaScript in 10 days.

It shows. JavaScript is fucking horrible, to put it mildly.

~~~
mcbutterbunz
It's got bad parts and it's got good parts. Once you learn how to ignore the
bad parts (and the bad examples), the language really shines, especially with
the newer versions.

~~~
stray
Stockholm Syndrome.

------
kibwen
On the Rust side, we're working on integrating Emscripten support into the
compiler so that we're ready for WebAssembly right out of the gate. Given that
the initial release of WebAssembly won't support managed languages, Rust is
one of the few languages that is capable of competing with C/C++ in this
specific space for the near future. And of course it helps that WebAssembly,
Emscripten, and Rust all have strong cross-pollination through Mozilla. :)

If anyone would like to get involved with helping us prepare, please see
[https://internals.rust-lang.org/t/need-help-with-emscripten-port/3154](https://internals.rust-lang.org/t/need-help-with-emscripten-port/3154)

EDIT: See also asajeffrey's wasm repo for Rust-native WebAssembly support that
will hopefully land in Servo someday:
[https://github.com/asajeffrey/wasm](https://github.com/asajeffrey/wasm)

------
s3th
As we get closer to having a WebAssembly demo ready in multiple browsers, the
group has added a small website on GitHub [0] that should provide a better
overview of the project than browsing the disparate repos (design, spec,
etc.).

Since the last time WebAssembly hit HN, we've made a lot of progress designing
the binary encoding [1] for WebAssembly.

(Disclaimer: I'm on the V8 team.)

[0]: [http://webassembly.github.io/](http://webassembly.github.io/)

[1]: [https://github.com/WebAssembly/design/blob/master/BinaryEncoding.md](https://github.com/WebAssembly/design/blob/master/BinaryEncoding.md)

~~~
KMag
About the binary encoding... It's a bit easy to armchair these things, and
it's too late for WebAsm now... but if you're on the V8 team, you have access
to Google's PrefixVarint implementation (originally by Doug Rhode, IIRC from
my time as a Google engineer). A 128-bit prefix varint is exactly as big as an
LEB128 int in all cases, but is dramatically faster to decode and encode. It's
closely related to the encoding used by UTF-8. Doug benchmarked PrefixVarints
and found both Protocol Buffer encoding and Protocol Buffer decoding would be
significantly faster if they had thought of using a UTF-8-like encoding.

LEB128 requires a mask operation and a branch on every single byte (maybe
skipping the final byte), so up to 19 mask operations and 19 branches for a
full 128-bit value. Using 32-bit or 64-bit native loads gets tricky, and I
suspect all of the bit twiddling necessary makes it slower than the naive
byte-at-a-time mask-and-branch.

    
    
        7 bits -> 0xxxxxxx
        14 bits -> 1xxxxxxx 0xxxxxxx
        ...
        35 bits -> 1xxxxxxx 1xxxxxxx 1xxxxxxx 1xxxxxxx 0xxxxxxx
        ...
        128 bits -> 1xxxxxxx 1xxxxxxx 1xxxxxxx ... xxxxxxxx
    
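The byte-at-a-time cost is easy to see in code. A minimal Python sketch of
unsigned LEB128 (an illustration only, not V8's implementation):

```python
def leb128_encode(n: int) -> bytes:
    """Unsigned LEB128: low 7 bits per byte, high bit = 'more bytes follow'."""
    out = bytearray()
    while True:
        byte = n & 0x7F          # mask off 7 data bits
        n >>= 7
        if n:                    # branch: more bytes needed?
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def leb128_decode(buf: bytes) -> tuple[int, int]:
    """Return (value, bytes consumed). One mask + one branch per byte."""
    value = shift = i = 0
    while True:
        b = buf[i]
        value |= (b & 0x7F) << shift   # mask on every byte
        shift += 7
        i += 1
        if not (b & 0x80):             # branch on every byte
            return value, i
```

For example, `leb128_encode(624485)` produces the classic three-byte sequence
`E5 8E 26`; note the decoder cannot know the total length without inspecting
every byte.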

Prefix varints just shift that unary length encoding to the front, so you have
at most 2 single-byte switch statements, for less branch misprediction, and
for larger sizes it's trivial to make use of the processor's native 32-bit and
64-bit load instructions (assuming a processor that supports unaligned loads).

    
    
        7 bits -> 0xxxxxxx
        14 bits -> 10xxxxxx xxxxxxxx
        ...
        35 bits -> 11110xxx xxxxxxxx xxxxxxxx xxxxxxxx xxxxxxxx
        ...
        128 bits -> 11111111 11111111 xxxxxxxx xxxxxxxx ... xxxxxxxx
    
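A toy Python version of that front-loaded scheme (my own sketch following the
diagram above, not Google's PrefixVarint code), showing that the total length
is known after loading the first byte:

```python
def prefix_encode(n: int) -> bytes:
    """The length is unary-coded in the leading bits of the FIRST byte,
    UTF-8 style, so a decoder learns the total size from one load."""
    for extra in range(8):                       # 0..7 continuation bytes
        if n < 1 << (7 * (extra + 1)):           # same capacity as LEB128
            ones = (0xFF << (8 - extra)) & 0xFF  # 'extra' leading 1s, then a 0
            first = ones | (n >> (8 * extra))    # top data bits share byte 0
            low = n & ((1 << (8 * extra)) - 1)
            return bytes([first]) + low.to_bytes(extra, "little")
    raise ValueError("value too large for this sketch")

def prefix_decode(buf: bytes) -> tuple[int, int]:
    first, extra = buf[0], 0
    while first & (0x80 >> extra):               # count leading ones: all the
        extra += 1                               # branching is on byte 0
    top = first & ((1 << (7 - extra)) - 1)
    low = int.from_bytes(buf[1:1 + extra], "little")
    return (top << (8 * extra)) | low, 1 + extra
```

Here the decoder's only data-dependent branching is over the first byte; the
continuation bytes are consumed with a single little-endian load.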

There's literally no advantage to LEB128, other than that more people have
heard of it. A PrefixVarint is always the same number of bytes as the
equivalent LEB128; it just puts the length-encoding bits all together so you
can more easily branch on them, and doesn't let them get in the way of native
loads for your data bits.

Also, zigzag encoding and decoding is faster than sign extension, for
variable-length integers. Protocol Buffers got that part right.
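For reference, a sketch of the zigzag mapping Protocol Buffers use (width of
64 bits assumed; illustration only):

```python
def zigzag_encode(n: int, bits: int = 64) -> int:
    """Map 0, -1, 1, -2, 2, ... to 0, 1, 2, 3, 4, ... so small magnitudes
    of either sign stay small when fed to a varint encoder."""
    return ((n << 1) ^ (n >> (bits - 1))) & ((1 << bits) - 1)

def zigzag_decode(z: int) -> int:
    # Shift the magnitude back down; flip all bits if the sign bit was set.
    return (z >> 1) ^ -(z & 1)
```

Small negative numbers map to small unsigned values, so they get short
encodings under either varint scheme, and the mapping needs no branches at
all.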

Note that if there are no non-canonical representations, there can't be
security bugs due to developers forgetting to check for non-canonical
representations. For this reason, you may want to use a bijective base 256 [0]
encoding, so that there aren't multiple encodings for a single integer. In the
UTF-8 world, there have been several security issues due to UTF-8 decoders not
properly checking for non-canonical encodings while programmers did slightly
silly checks against constant byte arrays. A bijective base 256 costs you less
than half a percent in space usage, and the cost at runtime is only one
subtraction at encoding time and one addition at decoding time.

[0]: [https://en.wikipedia.org/wiki/Bijective_numeration](https://en.wikipedia.org/wiki/Bijective_numeration)
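The offset trick can be sketched independently of any particular byte layout
(my illustration, not a shipping codec): give each encoded length a disjoint
value range, so every integer has exactly one legal length.

```python
def split_canonical(n: int) -> tuple[int, int]:
    """Return (length_in_bytes, biased_payload). Length-k codes cover
    [offset_k, offset_k + 2**(7*k)), so no value has two encodings."""
    offset = 0
    for length in range(1, 10):
        span = 1 << (7 * length)
        if n < offset + span:
            return length, n - offset   # one subtraction at encoding time
        offset += span
    raise ValueError("out of range for this sketch")

def join_canonical(length: int, payload: int) -> int:
    # Rebuild the cumulative offset for this length, then undo the bias.
    offset = sum(1 << (7 * k) for k in range(1, length))
    return offset + payload             # one addition at decoding time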

~~~
s3th
It's not too late! The wasm binary encoding is open to change up until the
browsers ship a stable MVP implementation (then the plan is to freeze the
encoding indefinitely at version 1).

The primary advantage of LEB128 is (as you mentioned) that it's a relatively
common encoding. PrefixVarint is not an open source encoding IIUC.

We'll do some experiments in terms of speed. If the gains are significant we
may be able to adopt something similar (this [0] looks like a related idea).

Thanks for the suggestion.

[0]: [http://www.dlugosz.com/ZIP2/VLI.html](http://www.dlugosz.com/ZIP2/VLI.html)

~~~
KMag
PrefixVarint isn't open-source, but the encoding is trivial.

PrefixVarints are a folk theorem of computer science, (re-)invented many times
in many places.

I actually coded it up once in Python and once in C before joining Google.
Later, I was chatting with an engineer, complaining about the Protocol Buffer
varint encoding, and the person I was complaining to said, "Yeah, Doug Rhode
did exactly that and called it PrefixVarint. He benchmarked it as much
faster."

------
machuidel
Since I started hearing about WebAssembly I cannot stop thinking about the
possibilities. For example: NPM compiling C dependencies together with
ECMAScript/JavaScript into a single WebAssembly package that can then run
inside the browser.

For people thinking this will close the web even more because the source will
not be human-readable: remember that JavaScript already gets minified, and is
already used as a compilation target (via Emscripten). The benefits I see
compared to what we have now:

\- Better sharing of code between different applications (desktop, mobile
apps, server, web etc.)

\- People can finally choose their own favorite language for web-development.

\- Closer to the way it will be executed, which will improve performance.

\- Code compiled from different languages can work / link together.

Then for the UI part there are those common languages / vocabularies we can
use to communicate with us humans: HTML, SVG, CSS etc.

I only hope this will improve the "running same code on client or server to
render user-interface" situation as well.

~~~
spb
More importantly, if we want to make "view source" more palatable in a
WebAssembly age, we need to have it support source maps from day 1.

~~~
machuidel
Yes, that would be good for development / debugging (like debug symbols) or as
an optional way to give people access to the source.

------
rl3
Considering how critical _SharedArrayBuffer_ is for achieving parallelism in
WebAssembly, I'm hoping we see major browsers clean up their _Worker_ API
implementations, or even just comply with spec in the first place.

Right now things are a mess in Web Worker land, and have been for quite some
time.

~~~
jvilk
Absolutely agreed. We should be able to debug Web Workers with development
tools (e.g. set breakpoints / examine state / etc), nest Web Workers, use
console.log from within a Worker, and construct Web Workers from Blob URLs.
It's infuriatingly difficult to work with Web Workers without these features,
which are missing from most browsers!

I think there's a chicken-and-egg problem with regards to Web Workers:
Developers do not use Web Workers because they are hard to use/debug/develop
with, and browser vendors do not improve their Web Worker implementations
because they have limited adoption. Someone needs to break the cycle.

~~~
rl3
There's also a host of nasty bugs and implementation deficiencies, depending
on the browser.

My favorite is in Chrome with simple DedicatedWorker instances communicating
with each other directly via the MessageChannel API. It works, except when the
UI thread is blocked, because messages are routed through the UI thread.
Firefox doesn't have this problem, but it has its own issues—namely the UI
thread for each tab runs in the same OS process, unlike Chrome where tabs are
isolated processes.

That said, Firefox and Chrome are far ahead of every other browser in terms of
how they implement workers. Other implementations are borderline destitute by
comparison (e.g. no DOMHighResTimeStamp available in worker context, no
Transferable support for important items).

> _Developers do not use Web Workers because they are hard to
> use/debug/develop with, and browser vendors do not improve their Web Worker
> implementations because they have limited adoption._

You hit the nail on the head. I think it's also a dislike for the API as a
whole, because really workers are just a convoluted way of enforcing thread
safety. Personally I'd prefer a far simpler, more traditional shared memory
model, with developers being afforded enough rope to hang themselves with if
they so desired.

The existing methods of transferring data to and from workers are frankly
crap, the only light at the end of the tunnel being SharedArrayBuffer and the
Atomics API designed around it. The problem is that both are essentially
designed for compiled applications, à la asm.js and WebAssembly. In a compiled
[browser] environment, the heap is seamlessly allocated on a
SharedArrayBuffer, so writing parallel code is nearly identical to the
traditional desktop experience from a developer's point of view. In plain old
JavaScript, however, you have to serialize and deserialize native types to and
from the buffer, which is expensive. It really makes JavaScript seem like a
second-class citizen with regards to parallelism.

~~~
13years
SharedArrayBuffer has been accepted as a stage 2 proposal in TC39, so there is
now a good chance you will have this from JavaScript as well.

[https://github.com/tc39/ecmascript_sharedmem](https://github.com/tc39/ecmascript_sharedmem)

~~~
rl3
Right. My point was that the way the API is built is definitely more friendly
towards compiled use cases. With JS, you have to manually serialize and
deserialize virtually everything that transits the buffer.

Still, it's nice to see the spec move forward.

------
stevenh
If anyone at infoworld.com reads these comments:

On the top of the page, there is a horizontal menu containing "App Dev • Cloud
• Data Center • Mobile ..."

When I position my cursor above this menu and then use the scroll wheel to
begin scrolling down the page, once this menu becomes aligned with my cursor,
the page immediately stops scrolling and the scroll wheel functionality is
hijacked and used to scroll this menu horizontally instead.

It took a few seconds to realize what was happening. At first I thought the
browser was lagging - why else would scrolling ever abruptly stop like that?

I closed the page without reading a single word.

------
eggy
I still think there is a lot of room for static pages with links, in the style
that people seem to be prematurely waxing melancholy about when forecasting
where WebAssembly _may_ lead the internet. I was always able to find sites of
interest that didn't include Flash, Java applets, and company when I just
wanted to read something. I find the scroll-hijacking and other JavaScript
goodies on modern pages to be either a distraction or non-functional on some
devices.

On the other hand, I am particularly happy about, and working with, Pollen in
Racket, a creation by Matthew Butterick. Pollen is a language created with
Racket for making digital books, books as code, and bringing some long-needed,
real-world publishing aesthetics back to the web [1,2]. I may even buy a font
of his to get going and support him at the same time!

    
    
       [1]  http://docs.racket-lang.org/pollen/
       [2]  http://practical.typography.com

------
petercooper
If you want to see Brendan's keynote from O'Reilly Fluent yesterday, a sample
went up at
[https://www.youtube.com/watch?v=9UYoKyuFXrM](https://www.youtube.com/watch?v=9UYoKyuFXrM),
with the full one at
[https://www.oreilly.com/ideas/brendan-eich-javascript-fluent-2016](https://www.oreilly.com/ideas/brendan-eich-javascript-fluent-2016)

~~~
smartbit
And Alex Russell's keynote (Google), "_Progressive web apps and what's next
for mobile_", can be found at
[https://www.oreilly.com/ideas/progressive-web-apps-and-whats-next-for-mobile](https://www.oreilly.com/ideas/progressive-web-apps-and-whats-next-for-mobile)

------
no1youknowz
I think the web may split into two.

1) 'Simple' web pages will stick with jQuery, React, Angular, etc. type code,
where you can still click view source and see what's going on, and where libs
are pulled from CDNs, etc.

2) 'Complex' SaaS web apps, where you need native functionality. This will be
a huge bonus. I'm in this space. I would love to see my own application as a
native app. The UI wins alone make it worth it!

~~~
ash_gti
What does 'native functionality' mean for a web app?

Do you mean skipping the DOM and making a Canvas for displaying content? Or do
you mean something else?

------
gsmethells
To me, it's more about choice of programming language than performance. Though
the latter is very important, I think the former is what will open up doors to
making the browser a platform of choice (pun intended). Currently, it feels
like JavaScript is the Comcast of the web. Everyone uses it, but that's only
because there aren't any other options available to them.

~~~
gbersac
Definitely agree! I really hope that wasm will kill JavaScript (and CSS, by
the way). I just hate this language.

------
mwilkison
Video of the talk?

EDIT: Here is the full-length one:
[https://www.oreilly.com/ideas/brendan-eich-javascript-fluent-2016](https://www.oreilly.com/ideas/brendan-eich-javascript-fluent-2016)

~~~
petercooper
Here's his Fluent keynote from yesterday:
[https://www.youtube.com/watch?v=9UYoKyuFXrM](https://www.youtube.com/watch?v=9UYoKyuFXrM)
.. full at
[https://www.oreilly.com/ideas/brendan-eich-javascript-fluent-2016](https://www.oreilly.com/ideas/brendan-eich-javascript-fluent-2016)
(click X on the popup window, you don't need to sign in)

------
hutzlibu
Sorry, but most of the discussion here is completely missing the point about
WebAssembly.

It is just a technology to make things delivered through the web faster. And
it is open. And no less secure than JS. So I think it's great.

Good technology does exactly what the creator wants. And if people don't like
some of the things that get created with it, then it is not a problem of the
technology itself.

So people can do good things or bad things with it. But on the web, we have
the freedom to choose where we go.

And if we don't like ads, for example, we should be aware that website
creators still want money for their work, so maybe we should focus on and
support a different funding model. I like the pay-what-you-want or donation
model the most; Wikipedia shows that this is possible on a large scale ...

------
vruiz
I want to agree with him; I'd like to see a future where WebAssembly closes
the gap between native apps and the web. For better or worse, browsers are the
new OSes, and I dream of a future where all vendors come up with the
equivalent of a POSIX standard, where any web application can access all (or a
wide common subset) of any device's capabilities, from the filesystem to
native UI elements.

~~~
serge2k
Hey, I've got an idea. How about we just implement this POSIX like standard at
the OS layer.

We can call it POSIX.

~~~
mindcrime
It's a shame you got downvoted for that, as you make a very good point. This
whole trend of making the web browser a poor man's OS is definitely a bit
hinky. I mean, how many layers of abstractions built on top of other
(redundant) layers of abstraction really make sense?

This stuff is one reason that, despite the advances associated with Moore's
Law, the advent of SSD's, and increasing RAM counts, computers don't feel any
faster than they did in 1995. It's ridiculous in a way.

Just to play Devil's Advocate: maybe web browsers should be good at, ya know,
_browsing_ and leave the other stuff for something else.

~~~
vectorpush
> _It's a shame you got downvoted for that, as you make a very good point._

I think this is because the comment comes off as flippant and snarky.

> _This stuff is one reason that, despite the advances associated with
> Moore's Law, the advent of SSDs, and increasing RAM counts, computers don't
> feel any faster than they did in 1995. It's ridiculous in a way._

That statement is ridiculous. I've never heard anyone claim that the computers
of today don't "feel" any faster than computers of 20 years ago, but if you
feel that way I just don't think you're living in the same universe as those
of us who walk around with quad core computers in our pockets.

> _maybe web browsers should be good at, ya know, browsing and leave the other
> stuff for something else._

Please define "other stuff" and where you draw the line between that and
simply "browsing"

~~~
mindcrime
I think it's a shame you got downvoted as well. Have an upvote on me.

As to the rest:

 _I've never heard anyone claim that the computers of today don't "feel" any
faster than computers of 20 years ago,_

Interesting, I find it to be a fairly common refrain. In fact, what I'm saying
is basically just a paraphrase of Wirth's Law:

[https://en.wikipedia.org/wiki/Wirth's_law](https://en.wikipedia.org/wiki/Wirth's_law)

 _but if you feel that way I just don't think you're living in the same
universe as those of us who walk around with quad core computers in our
pockets._

Well, I walk around with a quad core computer in my pocket as well, and I
still stand by that assertion.

 _Please define "other stuff" and where you draw the line between that and
simply "browsing"_

I'll allow that there's some subjectivity there, but when you're talking about
a "web application" like, say, Microsoft Outlook online or something, or a
programming editor or a CAD program or an image editing program, I can't help
but wonder if that stuff should really be done purely "in browser" as opposed
to being handed off to another program.

OTOH, I understand (some of) the arguments for doing it this way. Having a
uniform experience for all clients, the security holes associated with
plugins, avoiding the need to deploy software to individual machines, etc. I'd
just like to suggest that people spend some time considering if there are
other ways to achieve the same end(s) other than continuing to bloat the web
browser until it replicates all the functionality offered by the underlying
OS.

~~~
jblow
> Having a uniform experience for all clients

An experience that is uniformly slow and uniformly broken a different way on
every browser...

~~~
mindcrime
I largely agree, but the argument is "If we rely on plugins, some users will
have the plugin and some won't and since users don't install plugins, not
everybody will be able to use our $THING".

And it is a somewhat legitimate argument. Whether or not it justifies having
the browser subsume everything is, IMO, an open question.

~~~
jblow
I think if we decide heavily siloing / sandboxing is the right thing for
software generally, then what you want to do is build an operating system that
works that way (kind of like iOS, but with provisions to enable better data
sharing so that you can actually make things with that OS).

This would be TREMENDOUSLY better than trying to make the browser into an OS.

~~~
bobsgame
What do you think about a browser tab that loads a VM running Linux running
OpenJDK that runs a full Java application in its own sandboxed OS instead of
an applet, with some mechanism for file transfer to the host OS? You could
also support any other language, WINE, Mono, whatever. The point is having a
sandboxing mechanism that gives existing native code first class status in the
browser. Too hacky?

------
icedchai
WebAssembly... Wow, if we keep going, we'll re-invent what Sun achieved 20
years ago with Java. If only they hadn't f-ed it up...

~~~
protomyth
The JVM's problem was that it had applets and did not have the DOM integration
of JavaScript. I often wonder what would have happened if, instead of
JavaScript, in 1995 we had gotten WebAssembly and WebSockets.

~~~
icedchai
You could actually call into JavaScript from applets, using something called
LiveConnect. See
[https://docs.oracle.com/javase/tutorial/deployment/applet/invokingJavaScriptFromApplet.html](https://docs.oracle.com/javase/tutorial/deployment/applet/invokingJavaScriptFromApplet.html)

Was it simple/easy? No. But you probably wouldn't want to... The DOM is a
crappy way to build an application UI. Someday we might figure that out.

~~~
cheez
The concept of the DOM is so widely considered useful for declaring user
interfaces that multiple languages have copied it.

[http://fxexperience.com/wp-content/uploads/2011/08/Introducing-FXML.pdf](http://fxexperience.com/wp-content/uploads/2011/08/Introducing-FXML.pdf)

[https://en.wikipedia.org/wiki/Extensible_Application_Markup_...](https://en.wikipedia.org/wiki/Extensible_Application_Markup_Language)

The HTML DOM sucks, but what other DOM is widely available and already
installed on literally every machine on the planet?

------
nadam
A question for WebAssembly experts: how easy is it to use WebAssembly as a
sandboxed embedded scripting mechanism in my own native (C++) application? I
am writing a native real-time system (a distributed 3D engine for VR) in which
I send scripts over the wire between machines, and I need to call an update()
method of these sent scripts like 90 times a frame. I need complete
sandboxing, because my trust model is that what is trusted on machine A may be
absolutely untrusted on machine B: not only must the scripts be unable to call
any functions other than those I explicitly let them call, but I also need
hard limits on their memory usage and execution time. Preferably they should
execute in-process, so they can reach the memory I let them and be called from
the thread I want. Currently I go with Lua, but to have really good
performance I will need to research this topic more deeply later.

------
talles
Are those boxes in the picture Firefox OS phones?

Is this an old picture?

~~~
nacs
Good catch. The URL of that image [1] seems to indicate it's from April 2014
(or earlier).

It seems Brendan Eich resigned that same month/year [2].

[1]: [http://core0.staticworld.net/images/article/2014/04/brendan-eich-100259498-primary.idge.jpg](http://core0.staticworld.net/images/article/2014/04/brendan-eich-100259498-primary.idge.jpg)

[2]: [http://recode.net/2014/04/03/mozilla-co-founder-brendan-eich-resigns-as-ceo-and-also-from-foundation-board/](http://recode.net/2014/04/03/mozilla-co-founder-brendan-eich-resigns-as-ceo-and-also-from-foundation-board/)

------
n00b101
What is the upgrade path for Emscripten users? I understand that LLVM will
have a WebAssembly backend, but how will OpenGL-to-WebGL translation work, for
example?

~~~
azakai
Emscripten can already compile to both asm.js and WebAssembly; switching
between them is just a matter of flipping a flag.

All the JS library support code is unchanged, so Emscripten's OpenGL to WebGL
layer is used just like before, and the same for all the other libraries.

The WebAssembly backend in LLVM will eventually be used by Emscripten as
another way to emit WebAssembly (right now it translates asm.js to
WebAssembly), but the new backend is not ready yet.

See also
[https://github.com/kripken/emscripten/wiki/WebAssembly](https://github.com/kripken/emscripten/wiki/WebAssembly)

------
bcoates
If you think WebAssembly (or asm.js) is a good idea, I would very much like
you to do the thought experiment of what design decisions something like
WebAssembly would have made 15 or 25 years ago, and what consequences those
would have today.

Helpful research keywords: Itanium RISC Alpha WAP Power EPIC Java ARM Pentium4
X.25

~~~
smitherfield
I can't think of any software development API that ended up being perfect 15
or 25 years later. Javascript certainly isn't. Java applets, ActiveX controls
and Flash _very_ much weren't, but, at the time, they did things you couldn't
with the standard web stack.

And we're better off for learning the lessons of the failures, creating
improved technologies to replace them (HTML5, JIT Javascript engines, etc),
and building on the successes to continuously do more things that previously
couldn't be done in the browser.

Will WebAssembly be perfect? Of course not. Will there be unanticipated
problems? Of course. I would not at all be surprised if it becomes the next
Flash. But it's better to move forward and keep innovating with new web
technologies instead of letting the platform stagnate.

We've tried feature-freezing the web for a few years; it was called "Internet
Explorer 6" and it sucked.

------
Executor
I'm conflicted. On the one hand I support open data/raw documents, but that
precludes native-like, real-time applications. It also forces developers to
work in JavaScript, which is a terrible language.

On the other hand we have lock-in ecosystems, closed silos, that are
detrimental to the commons.

The only consolation I have is that if WebAssembly provides bytecode instead
of machine code, then we still have the ability to perform reverse
engineering.

In the end, we _ALL_ have to do the hard task of informing every single person
why Apple/FB/MS/Google are harmful to us and why we should boycott their
programs/services.

------
spitfire
I wonder if along with these bytecode engines we'll get fine-grained,
capability-based control systems too. Somehow I doubt it, though.

So in the future, when you visit a website they'll be able to, e.g., open
windows, pop up unblockable modals, use WebGL, load bytecode spam/ads, etc.
The end user's options will be to block everything, or live with it.

I do not like this bold new world we're entering.

~~~
esailija
This is a common confusion somehow. The programming language has _nothing_ to
do with the APIs provided by an environment. JavaScript can do all those
things now, as long as you run it in an environment that provides APIs to do
those things (node.js, electron etc). The browser is not that _environment_.
When you write a keylogger virus in C, you are relying on the APIs provided by
the environment to do it, they don't come from the C language.

~~~
spitfire
I wasn't confused at all. Technically there is a difference between a
language, an API and a platform.

That's all true in the case of NodeJS and the like, but not true for web
browsers. There the language the API and the platform come as one.

The result is, remote websites can execute code on a users computer. With no
control except for simple technical measures.

------
xaduha
WebAssembly shouldn't be for end users to use directly; it should be used for
implementations of other languages so they can access the same APIs JavaScript
can.

Add Lua to the browser, add Perl 6 to the browser, etc. There are plenty of
decade old W3C specifications that never made it to the browser properly, like
XSLT 2.0, XQuery 1.0, XForms, never mind the latest versions of the specs.

~~~
esailija
I don't see how it can be feasible to use it for implementations of other
languages that don't map directly to WebAssembly the way C does. You would
have to ship a runtime for the other language along with your application
code. The runtime will either be huge with a long startup time, or small but
too slow to be feasible.

~~~
xaduha
Whichever it is, it can't be worse than implementing other languages in
JavaScript directly or using stuff like GWT, and there are plenty of those
already (including an in-development implementation of Perl 6, btw).

Runtimes for a few selected implementations could very well be packaged or
installed along with the browser itself. Failing that, they should be cached.

~~~
esailija
> Runtimes for a few selected implementations could very well be packaged or
> installed along with the browser itself. Failing that, they should be
> cached.

WebAssembly isn't related to this. You could standardize on a new language VM
that browsers should ship, as was attempted with Dart. What WebAssembly
enables is a more efficient VM with better interop with Web APIs than is
currently possible, but the most prohibitive thing will not change: having to
ship a multi-megabyte VM along with your application code.

~~~
xaduha
Lots of things will change. If it makes sense to do that, then there will be
demand for it.

Here's something already
[https://news.ycombinator.com/item?id=11269736](https://news.ycombinator.com/item?id=11269736)

------
ak39
Is WebAssembly going to be served as URL-based resources from the host (like
current .js files are), or will it be part of some centralized global assembly
cache (GAC) solution where assemblies are only usable from a CDN type of
authority?

------
vbezhenar
What exactly will be better? One can compile a lot of languages to JavaScript
today. JavaScript is fast enough and size doesn't really matter for most use
cases. Is WebAssembly going to be much faster than JavaScript?

~~~
Mikeb85
WebAssembly is basically a portable assembly language (as in, lower level than
C) that then gets translated, instruction for instruction, into actual machine
instructions.

It's several layers 'below' JavaScript. It's basically cross-platform, native
code.

~~~
acqq
And moreover, the resulting native code is executed inside the same VM that
executes normal JavaScript, so it's not anything like starting some Flash
file; it's just that we can have native code speed, when needed, _inside_ of
the JavaScript. That already existed with asm.js. This step should now allow
less overhead in parsing such code, which matters when there is a lot of code,
as in games or big programs translated directly from a lower-level-language
code base. Less overhead means less battery drain and a faster start of the
program, for example.

------
chrstphrhrt
Has anyone tried NativeScript?
[https://www.nativescript.org](https://www.nativescript.org)

Heard about it on a podcast recently, haven't had a chance to try.

~~~
supernintendo
Just pointing this out to avoid future confusion: this comment is very off-
topic.

\- WebAssembly is a new low-level language for client-side scripting in web
browsers. Future web browsers will support WebAssembly in the same way they
currently support JavaScript. WebAssembly has a number of advantages over
JavaScript, including performance and an AST-like syntax that makes it more
suitable as a compilation target.

\- NativeScript is a framework for developing "cross-platform" mobile apps. It
achieves this through a JavaScript/TypeScript API and common UI components
that are implemented natively on both iOS and Android.

I have not tried NativeScript and I am skeptical of projects that aim to
"bridge the gap" in mobile development. iOS and Android are ever-evolving, so
you must rely on the frameworks that target them to stay up to date. Further,
these platforms have very different design goals, and the compromises that
frameworks like NativeScript make often come at the expense of user
experience.

NativeScript could be great! But please be aware of the shortcomings of
eschewing native development.

------
Pxtl
If we keep this up, the web will be almost as good of an application framework
as a '90s era desktop application. Yay, progress!

------
jimmcslim
I wish the browser vendors focused on CSS Grid module support as much as they
did WebAssembly.

------
madsravn
This looks AWESOME

------
travisty
Thanks for that update that no one asked for.

~~~
dang
Please don't be rude.

We detached this subthread from
[https://news.ycombinator.com/item?id=11262923](https://news.ycombinator.com/item?id=11262923)
and marked it off-topic.

------
icosta
WebAssembly = SWF with diff name. Come on!

------
ape4
The format of WebAssembly could be Java ByteCode.

~~~
rubber_duck
It's lower level than that.

------
opacityIsCool
Yeah, great. Transform everything into opaque binary blobs, as far as the eye
can see. Wonderful.

Thanks for nothing.

~~~
T-A
From [http://webassembly.github.io/](http://webassembly.github.io/) : "Open
and debuggable: WebAssembly is designed to be pretty-printed in a textual
format for debugging, testing, experimenting, optimizing, learning, teaching,
and writing programs by hand. The textual format will be used when viewing the
source of wasm modules on the web."

~~~
opacityIsCool
Yeah, nice. This and so many other formats, which people just throw up their
hands and give up on when confronted with the raw binaries they work with on a
daily basis, are simply open and wonderful all the time.

Except not.

Portable Executables. ELF binaries. Zip Files. Open Image Formats.

All of these are theoretically open, and perfectly accessible to all, in their
raw form.

And yet broadly inaccessible to like 90% of the world's lay people, since the
concept of an interpreter eludes them, and in some cases is explicitly denied
to them. The same will happen with this.

This puts things on a shelf, well out of reach to many more people. And a very
small group of people love that.

So, while encrypted smart phones and email "go dark" on mass surveillance, the
rest of everything else "goes dark" for ordinary people.

------
twsted
I don't know. I am not sure yet. What do the HN folks think about this?
