
New JavaScript techniques for rapid page loads - tosh
http://blog.chromium.org/2015/03/new-javascript-techniques-for-rapid.html?m=1
======
Cymen
The mobile link is a little jarring on desktop (?m=1). Here it is without
mobile:

[http://blog.chromium.org/2015/03/new-javascript-
techniques-f...](http://blog.chromium.org/2015/03/new-javascript-techniques-
for-rapid.html)

~~~
nickysielicki
What it looks like on my vertical monitor:
[http://i.imgur.com/mzGr6Da.png](http://i.imgur.com/mzGr6Da.png)

ALL HAIL THE MIGHTY CHROMIUM.

~~~
tmd83
It was actually worse for me. The logo filled the entire vertical space, so I
thought the page was broken until I scrolled down :D

Kind of funny coming from the Chrome blog viewed in Chrome, or is it a little
sad?

Edit \---- I guess the mobile version causing problems on desktop kind of
makes sense.

------
jray319
To be precise, code caching has been in major browsers for a while, but only
as an in-memory cache. Chromium is the first to make it persistent with this
update. That said, what's cached contains no optimization results; it's more
like the output of basic parsing and baseline translation into machine code.

~~~
timdorr
It looks like a logical next step is to store the optimized and compiled code.
And after that, you can perform more aggressive optimizations on frequently-
accessed scripts.

~~~
diydsp
Yeah, if pages stored hashes of their scripts, a script could be pre-executed
before you even visit the page. Or common libs like jQuery could be pre-parsed
into binary structs in RAM.

~~~
seba_dos1
It needs to happen. It's so simple and effective approach that it's really
puzzling to see it not being implemented anywhere.

------
hashseed
V8-based projects can benefit from this as well:
[http://www.hashseed.net/2015/03/improving-v8s-performance-
us...](http://www.hashseed.net/2015/03/improving-v8s-performance-using.html)

~~~
azinman2
Omg so does that mean plv8 (v8 inside Postgres) isn't caching the jit'd js
functions that might be executed thousands of times a day??????!

~~~
hashseed
Not sure. If it spins up a new V8 instance every time, then no. But I don't
think that happens. It probably keeps a V8 instance around, so in-memory code
caching should already work.

Edit: looking at the plv8 code, it doesn't seem to create a new Isolate for
every request. So I think for the same instance of PostgreSQL the in-memory
cache should also hit.

------
untog
Interesting that async scripts are parsed on a different thread - I didn't
know that. So maybe best practice now is to add JS to the <head> async, so it
has the most time to parse?

~~~
paulirish
Best practice is to always add [async] or [defer] attributes. Document
location will always be a hint to the browser, but you shouldn't toss it in
the head if you don't need it till later.

~~~
mike-cardwell
If your scripts are going to be downloaded asynchronously because you've used
"async" or "defer", you should place them before any blocking resources, like
CSS for example, otherwise you're sat waiting for that blocking resource to
download before you even start downloading the scripts.

This is why my script tags end up in the head now, because I want to place
them before the CSS tags, and I want to fetch the CSS before the body starts
being displayed.

I tend to use defer rather than async because I want the scripts to execute in
the order I add them to the page, and only once the body has been fully
constructed.

That said, if you have to support old browsers, make sure you research which
ones support async/defer, and which ones have buggy implementations.
caniuse.com is your friend.

[*] If I'm wrong with any of the above, please correct me. Just checked your
profile and you seem well qualified to contradict me.
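A sketch of the <head> ordering described above (file names are placeholders):

```html
<head>
  <!-- deferred scripts first: their downloads start before the blocking CSS -->
  <script src="/js/vendor.js" defer></script>
  <script src="/js/app.js" defer></script>
  <!-- CSS still blocks rendering, but the script fetches are already in flight -->
  <link rel="stylesheet" href="/css/site.css">
</head>
```

With `defer`, the scripts execute in document order once the DOM is fully
constructed, matching the behavior described in the comment.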

~~~
paulirish
You don't have to place things in order to address blocking behavior better.
Browsers have already tackled that optimization for you.
[https://groups.google.com/a/chromium.org/forum/#!topic/loadi...](https://groups.google.com/a/chromium.org/forum/#!topic/loading-
dev/40mLlHGqRwc)

------
lugg
Code caching should have been at the top of that post. Sure, it's not as
exciting as streamed parsing, but my guess is that it will be far more
beneficial to the user.

Funnily enough I was toying around with cross domain local storage and ways to
cache javascript for this very purpose. Sadly I found the overhead was a bit
too high for only first load benefits and the extra complexity didn't really
help things either. In any case, nice to see something of this nature going
into the browser directly.
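The comment doesn't include code, but a localStorage-based script cache along
those lines might look like this (the storage and fetch parameters are
injectable here purely to keep the sketch self-contained and testable):

```javascript
// Hypothetical localStorage script cache: fetch a script once, keep its
// source in storage, and eval it from storage on later visits. Note this
// only skips the network, not the parse, which is part of the overhead
// mentioned above.
function loadCachedScript(url, storage = localStorage, fetchFn = fetch) {
  const key = 'jscache:' + url;
  const hit = storage.getItem(key);
  if (hit !== null) {
    (0, eval)(hit); // indirect eval: run the script in global scope
    return Promise.resolve('cache');
  }
  return fetchFn(url)
    .then(res => res.text())
    .then(src => {
      storage.setItem(key, src);
      (0, eval)(src);
      return 'network';
    });
}
```

Even with a cache hit, the browser still has to re-parse the source on every
eval, which is exactly what the built-in code cache avoids.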

~~~
tmd83
Exactly my thought on which would be more important.

Though one thing is even more surprising to me. I thought parsing was the
easy/cheap part of the compilation process. Why then would streaming parsing
be such a big deal, or am I missing something? Especially with the additional
complexity of parsing the HTML and doing layout/rendering, I would have
thought JS parsing would be a minuscule part. Can anyone enlighten me?

~~~
lugg
> pages loading as much as 10% faster.

In other words, it's not; that was kind of my point. However, it is cool.

I suspect the devs had to modularize / logically separate parsing from
compiling / executing somewhat before they could complete the next phase (for
Chrome 42), which is the script caching part: cached scripts won't need re-
parsing before compilation/execution.

------
tosh
This is especially important for large app startup time.

------
MagnusVonBlack
I've discovered an incredibly efficient and reliable JavaScript technique for
rapid page loading: Disable JavaScript. I use the NoScript plugin in Firefox
and just whitelist a few reliable Ajax sites like Google or whatever. My
experience has been that JavaScript and modern web design philosophies make
90% of websites slower, uglier, and more difficult to navigate.

~~~
bwy
I find it really interesting that NoScript users seem so vocal and adamant
about their plugin on this site, at least. Many sites that don't work with
NoScript have condescending comments from NoScript users, who don't hesitate
to proudly let the sites' makers know. They often make comments like this,
too. What is it about this plugin that seems to be correlated with these
comments? Is browsing the Internet with such a big chunk of functionality
missing really so much better? (Real questions, here.) I've heard HN doesn't
even work with NoScript, which makes this all the more interesting to me.

~~~
titanomachy
Not sure about NoScript but I just disabled JS using Chrome Developer Tools
and HN seems to work fine.

To answer your question, I think that NoScript users tend to appreciate (and
even expect) minimalism... consequently some get irritated when even a simple
blog can't load without JS. I get the appeal of the minimalist, standards-
adhering, machine-parseable website, but it's not a big source of emotion for
me :P

~~~
bwy
Thank you, very good answer! It makes a bit more sense to me. Also tested in
Chrome w/o Javascript, and it does indeed work. Upvoting's just a little more
inconvenient.

------
BinaryIdiot
I'm surprised they're not doing the streaming for synchronous code yet
considering it's just parsing and compiling and not yet executing. Perhaps in
a future update.

I'd still love to see some sort of byte code for the web, though I love these
advancements all the same.

~~~
martin-adams
like Java? :)

~~~
BinaryIdiot
Yup or .Net with its MSIL. Then they could change the JavaScript standard
every year if they wanted to and it wouldn't matter as it would compile to the
same byte code. Naturally byte code improvements would happen slowly but
that's not as big of a deal.

It just makes sense to me, but I haven't seen anyone working on it, just
occasionally talking about it. So maybe it doesn't make sense to the people
working on web browsers, I dunno.

~~~
dragonwriter
> Yup or .Net with its MSIL. Then they could change the JavaScript standard
> every year if they wanted to and it wouldn't matter as it would compile to
> the same byte code.

Not sure why you need byte code for that. You could just have a relatively
stable base JS standard as the common target, and then a more rapidly changing
set of advanced/experimental JS features, implemented through compilers that
translate the resulting enhanced JS-plus-extras languages back to base JS.
Heck, you could do the same thing for lots of other, non-JS languages that
compile to base JS. In fact, that's exactly what is done today.

Byte code isn't magic; it's just another language you have to write an
interpreter for -- just not a human-readable one.
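That model is what transpilers like Babel and the TypeScript compiler follow
today: newer syntax lowers to a stable baseline. A hand-written illustration
of the idea (not actual compiler output):

```javascript
// Enhanced-JS input:  const square = x => x * x;
// Lowered base-JS output targeting older engines:
var square = function (x) {
  return x * x;
};
console.log(square(4)); // 16
```

The engine only ever needs to understand the baseline; the feature churn
lives in the compiler.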

------
bcg1
I wonder how they persist the cached code, and if it is signed etc. Seems like
it could be an attack vector if it was altered on the filesystem or something
like that.

I'm sure they thought of that, so I'm just curious. I suppose I could look
through the Chromium source but I'm probably not actually smart enough to
figure it out.

~~~
ciupicri
If someone else has write access to your filesystem, you're already
screwed.

~~~
vidarh
Well, sort of, but it's not that simple. Let's say they write the cached data
to $somedir/jitcache, and they, say, expose some other API that indirectly
allows you to store data at $somedir/foobar. If they fumble checks in this
other API so that you can manage to write arbitrary data to $somedir/jitcache
instead of $somedir/foobar, an attacker suddenly has a powerful tool to escape
the JS sandbox, even if the flaw didn't allow them to write outside of
$somedir.

I'm not suggesting it's likely that this will be an issue, but it's a
legitimate question to ask, as there's a long history of people leveraging a
combination of flaws: some letting them write to a limited set of locations,
others letting them trick some tool into executing whatever they managed to
get written to disk.

~~~
seba_dos1
"so you can manage to write arbitrary data to $somedir/jitcache instead of
$somedir/foobar"

That's already escaping the sandbox, which would most likely cause much
bigger trouble. By that logic we could worry about absolutely anything ;)

------
amelius
This shows that being lazy can actually make you faster. I wish more languages
had good support for laziness.
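Laziness in the sense above (defer work until it's actually needed) can be
sketched in JS with a memoized thunk:

```javascript
// A memoized thunk: the computation runs at most once, and only on demand.
function lazy(compute) {
  let evaluated = false;
  let value;
  return function force() {
    if (!evaluated) {
      value = compute(); // pay the cost only on first use
      evaluated = true;
    }
    return value;
  };
}

const heavy = lazy(() => {
  // stand-in for an expensive step, e.g. parsing a large script
  return 40 + 2;
});
// Nothing has run yet; the first call computes, later calls just return.
console.log(heavy()); // 42
```

Streamed parsing and on-demand compilation are essentially engine-level
versions of this pattern.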

------
KamiCrit
Chromium?

~~~
ForHackernews
Chromium is the open-source version of Google Chrome.

~~~
shwetank
More than that, it's a project which a number of other browsers also rely on
and contribute to, for example Opera.

------
blumkvist
Ah, I clicked expecting developer-oriented techniques.

Does anyone have any resources they can recommend about writing faster, more
efficient JavaScript? I'm not a developer; I learned JavaScript to hack some
things together, and obviously I taught myself awful practices, but I can't
identify/change them.

~~~
dccoolgai
Follow Paul Irish and Addy Osmani for starters. Superherojs.com has a list of
stuff to read in their "how browsers work" section. Other than that, even the
experts admit a lot of it is black magic: there are some fundamental things
(DOM access, especially writes, is slow), but it's hard to nail down concrete
rules.

~~~
rylee
Superherojs seems to be down for me.

~~~
dccoolgai
Hmm, likewise. Too bad; that was one of my favorite references (though it
was getting a bit dated).

~~~
graedus
Not quite as convenient, but at least you can still find the links here:

[https://github.com/superherojs/superherojs/blob/gh-
pages/ind...](https://github.com/superherojs/superherojs/blob/gh-
pages/index.html)

