A Little on V8 and WebAssembly [pdf] (archive.org)
92 points by rspivak on June 30, 2016 | 23 comments



Does anybody know the status of Clang/llvm's webassembly backend? I haven't heard much of anything about its progress recently.


It's still not ready for general usage. On real-world codebases it has correctness issues that need to be sorted out, and there are missing features like longjmp and C++ exceptions.


Right, but it's worth mentioning that no part of WebAssembly is ready for general usage, as it is not yet released. :)

Just like other components of the technology preview, I think it would be useful to have interested developers try it out and give feedback in the form of bug reports -- understanding that things are incomplete and going to change.


Right now testing and giving feedback is still a bit too brittle, since too many moving parts are involved: my asm.js demos here (http://floooh.github.io/oryol/) and here (http://floooh.github.io/oryol-samples/) are all ready for wasm, and sometimes I put wasm builds up there next to asm.js and PNaCl, but it's hard to automate the build like for asm.js and PNaCl. At the moment, bleeding-edge builds of SpiderMonkey, Binaryen and Emscripten are required to work together. Depending on the current phase of the moon this works or doesn't (e.g. right now I'm having trouble compiling SpiderMonkey on macOS 10.12 beta with the Xcode 8 command line tools).

I'm sort of holding back at the moment until emscripten has everything integrated to build wasm binaries, even if it is not the final LLVM backend. But I can't wait to flip the switch on the wasm builds :)


Do you also test with V8?

BTW We're happy that we got some of your demos into the build-suite, but unfortunately we can't run them anymore. The binaries there are still version 0xA and the .wast files have some errors that prevent translating them to 0xB with the sexpr-wasm tool.


I haven't tested with V8 yet (only the final result in Chrome Canary). I'm happy to (try at least) provide a PR with fresh builds :)


I don't think it answered why ... If you need better performance, shouldn't you go closer to the metal anyway?

Isn't the purpose of JS that you would easily be able to view the source, that it's actually source code running, and not binary!?


Much easier distribution and an open platform. Microsoft and Apple are both slowly tightening the screws on their desktop platforms for software that doesn't go through their app stores, and the mobile platforms have always been walled gardens. The web is (still) open, and distribution is much simpler both for the developer and for the user (the developer just uploads to a web server, the user just clicks a URL).

As for view-source: on one hand, JS is already mostly minified; on the other, WebAssembly has a somewhat human-readable, Lisp-like ASCII representation which can be generated from the bytecode. But I think view-source hasn't been genuinely useful for many years.
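For reference, the Lisp-like text representation mentioned above looks roughly like this. This is a minimal sketch of the s-expression format as it existed around that time (note the era-appropriate `get_local`, which later tooling renamed to `local.get`); the exact syntax was still in flux in 2016:

```wast
;; A module exporting one function that adds two 32-bit integers.
(module
  (func $add (param $a i32) (param $b i32) (result i32)
    (i32.add (get_local $a) (get_local $b)))
  (export "add" (func $add)))
```

Tools like sexpr-wasm (mentioned elsewhere in this thread) translated between this text form and the binary encoding.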


JS was meant to be a simple scripting language for web pages. But now that we have huge applications we're compiling from C or C++, plaintext isn't the best format.

Binary formats are nothing new to the web; it has used them for images and the like for a long time now. In fact, the one plaintext raster image format (XBM) is no longer supported.
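As an aside, XBM is an odd duck worth seeing once: the file format is literally a C source fragment, which is what made it "plaintext". A minimal example (the names `demo_*` are just illustrative):

```c
/* An 8x2 monochrome image encoded as XBM: each byte holds one row
   of 8 pixels, least-significant bit first. This is valid C and can
   be #included directly into a program. */
#define demo_width 8
#define demo_height 2
static unsigned char demo_bits[] = { 0xff, 0x81 };
```

Browsers used to render files like this directly as images; today it survives mainly as an embeddable bitmap format in X11 toolkits.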


The attack surface of an image format is minuscule compared to actually running code. It's not like you worry about a JPG performing a cross-site injection.

SVGs are quite popular!


Font and image binary format parsing have been the source of plenty of zero-days over the years. :)


> Isn't the purpose of JS that you would easily be able to view the source

Who said that was the purpose of JS?


Probably same person who said Linux/GNU is more secure. But why?

Same reason why you don't let invisible aliens on-board your spaceship!

Programs become tremendously more secure when it's possible to read the source code and actually compile it yourself.


Portable eval() is a good thing not easily found elsewhere. Making portable eval faster and more useful (while still widely available) is different from switching to architecture-specific code generation.

That said, generating JavaScript is fast enough most of the time and more portable so it's probably the place to start.
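To make the "portable eval" point concrete, here is a small sketch (function names are mine, not from the talk): code generated as a string at runtime runs unchanged on every JS engine, and each engine's JIT is free to compile it to native code behind the scenes, with no architecture-specific output from the program itself:

```javascript
// Build a specialized function from a string at runtime.
// e.g. makePowerFn(3) generates the body "return x * x * x;"
function makePowerFn(n) {
  const body = "return " + Array(n).fill("x").join(" * ") + ";";
  return new Function("x", body); // portable eval: works on any engine
}

const cube = makePowerFn(3);
console.log(cube(2)); // 8
```

The same source runs on V8, SpiderMonkey, JavaScriptCore or Chakra; the engine, not the author, decides how to turn it into machine code.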


Any way to get the video of this talk?


We're working on it. Keep an eye on http://soft-dev.org/events/vmss16/ over the coming days.


> V8 was the first really fast JavaScript Virtual Machine

This is just barely technically correct, at best.

Chrome launched its first pre-release version on Sep 8 2008 [1], while WebKit's SFX JIT showed up in pre-release versions of WebKit just 10 days later [2] (or less, 10 days is when the blogpost came out).

Also, Firefox's TraceMonkey JIT landed on trunk Aug 8 2008 [3] which is before Chrome's pre-release (however, it was not yet turned on by default in nightlies at that time).

V8 is a wonderful VM and Google should be proud of it, but the PR line of "V8 was the first really fast JS VM" is debatable. Something like "one of the first really fast JS VMs" would be more honest.

[1] https://en.wikipedia.org/wiki/Google_Chrome

[2] https://webkit.org/blog/214/introducing-squirrelfish-extreme...

[3] http://arstechnica.com/information-technology/2008/08/firefo...


I must say it was a fantastic time to watch the web evolving over those months. As every browser but IE produced, over a short period, JavaScript engines that were not just slightly but radically faster, websites adapted; within months, sites would load in seconds everywhere but on IE, where it would take minutes. It introduced a binary incompatibility of sorts on the web, and users started abandoning IE because the sites they wanted to use were too slow to be useful.

To a lesser degree we are starting to see a similar pattern: Opera, Safari, and IE are working on reducing their energy usage, while oddly it is Chrome that is silent. Just like before, a product with a significantly superior feature has caused me to switch, but this time it was back to Safari from Chrome. Still, the Chrome team is not asleep at the wheel like Microsoft was, and I am expecting/hoping that within a year or two there will be a post about dramatic energy savings made by the Chrome team.


I want to preface this with a warning that this is 100% conjecture. Literally none of this has any proof...

I personally don't think Chrome has anything "up their sleeve". From the looks of what they have been focusing on recently, they really dug themselves a hole with the technical debt in the V8 engine.

It's too complicated, and it looks like their upgrade path has it getting worse before it gets better. They aren't planning on switching to "Ignition -> TurboFan" until mid-to-late 2017 (which is expected to reduce memory usage, but will also reduce speed slightly at first), so until then it will be "Full-codegen -> Crankshaft or TurboFan, and sometimes Ignition-only in some code paths". It's only getting more complicated as they try to phase out their Crankshaft compiler. IIRC Ignition can't interface with Crankshaft, while it can with TurboFan, but TurboFan is significantly slower than Crankshaft for many things right now, and Full-codegen is way too slow at parsing (and re-parsing, and re-re-parsing) code to compete with Chakra, which starts so much faster because of that.

On the non-JavaScript side, Servo is proving to be a massive amount of work, and I honestly don't think Blink is going to be easily retrofitted to get similar gains. And nobody has heard a peep about a "secret new web engine being developed at Google", so I really don't think they have one (or if they do, it's still a few years out at best).

I love Chrome, but it's not in a good position. Edge is consistently faster at JavaScript execution, Safari is significantly easier on the battery (even if it's missing a pretty large chunk of newer features), and when Firefox starts integrating Servo, they are going to be king of layout/painting/rendering speeds (which is by far one of the biggest bottlenecks in modern web apps).

I hope they prove me wrong, but my hopes aren't high at this time.


Yes, well if they are really running an eye tracker with your webcam to time GC pauses, then it's a case of robbing Peter to pay Paul.

But umm... they don't really do that, right?


I think that was tongue in cheek. There are related ideas though, like doing aggressive GC of tabs which are invisible. Browsers are doing better and better jobs of prioritizing work based on how it will affect user perceptions of performance.


> > V8 was the first really fast JavaScript Virtual Machine

> This is just barely technically correct, at best.

seriously, that's what you got out of these slides?


"Almost late" === "on time".



