
Forgo JavaScript packaging? Not so fast - prezjordan
http://engineering.khanacademy.org/posts/js-packaging-http2.htm
======
discreditable
> It may not have escaped your notice that 662,754 bytes is a lot of bytes for
> JavaScript for a single HTML page, and 296 files is a lot of files. "What are
> you doing on that page?" you may well be wondering.

> We are wondering that too. The page in question is the Khan Academy homepage
> for logged in users, and it's acquired a lot of, um, functionality over the
> years.

> The end result of our analysis is that our page isn't going to load any
> faster until we reduce the amount of JavaScript we use on the page. Tricks
> like trying to load a bunch of stuff in parallel, or aggressive caching,
> might seem like appealing shortcuts, but nothing replaces just auditing the
> code and making it need less "stuff."

At the end of the day, de-bloating is the best way to make your pages faster.

------
chowes
Webpack's chunking ([https://github.com/webpack/docs/wiki/optimization#multi-
page...](https://github.com/webpack/docs/wiki/optimization#multi-page-app))
will help you get the "hybrid" approach easily. It lets you separate out
common modules into JS files for you to load when needed.
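For example, here's a minimal multi-page sketch - the entry names and paths are made up, and this uses the webpack 1.x-style CommonsChunkPlugin from those docs:

```javascript
// webpack.config.js -- a minimal multi-page sketch; entry names and
// paths are illustrative, not from the article.
var webpack = require("webpack");

module.exports = {
  entry: {
    home: "./src/home.js",      // bundle for the homepage
    profile: "./src/profile.js" // bundle for the profile page
  },
  output: {
    path: __dirname + "/dist",
    filename: "[name].bundle.js"
  },
  plugins: [
    // Moves modules shared by both entries into common.js, which each
    // page then loads alongside its own, now smaller, bundle.
    new webpack.optimize.CommonsChunkPlugin("common.js")
  ]
};
```

Each page then includes common.js plus its own bundle, instead of one giant file.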

~~~
zackify
That's what came to mind reading this. Why write a whole post on this when you
can literally just go look at the webpack docs and do this in 5 seconds?

~~~
vectorpush
I _love_ webpack and highly recommend it, but it is _not_ trivial to integrate
it into an existing project and pretty much nothing about webpack can be
described as "done in 5 seconds".

~~~
dustingetz
I did it in a big project. It took a couple of days and really wasn't that big
a deal; legacy JavaScript still goes through webpack and works unmodified. It
then took a couple of weeks to iron out the kinks in the CI scripts (but we
were still developing during that time).

~~~
vectorpush
A couple days sounds about right and jibes with my own experiences, but I feel
like "do this in 5 seconds" is pretty misleading, especially for someone who
may have never used webpack before. It's absolutely worth the effort, but I
wouldn't want someone to be discouraged had they expected a "5 second"
solution but then realized it's not quite that simple.

~~~
muzmath
This is a typical programmer attitude. Standard estimates go like this:

When trying to make themselves look good on online developer forums: 1/100th
real development time

When estimating time involved for co-worker: 1/10th real development time

When estimating time involved for themselves at work: 1/2 real development
time

------
spankalee
They need to use push to get the most out of http2.

If they prebuild a manifest, they can add X-Associated-Content to the response
headers and App Engine will take care of the rest. They will of course need to
make sure that their static file handling has very little overhead, since
doing push this way still fetches each resource via its own request.
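For reference, a sketch of what such a manifest-driven response header might look like - the URLs are made up, and the quoted, comma-separated format is what Google's http2push-gae sample appears to use:

    X-Associated-Content: "https://example.com/static/common.js", "https://example.com/static/home.js"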

~~~
spankalee
Someone deleted a reply I was trying to respond to, about App Engine and
HTTP/2 push. My reply:

App Engine has supported the X-Associated-Content header for a little while
now, and also supports the Link header, with rel=preload, like:

    Link: <https://example.com/other/styles.css>; rel=preload; as=style

See [http://w3c.github.io/preload/](http://w3c.github.io/preload/)

~~~
phleet
Haha, sorry, that was me. I was eating my words, because it looks like there
is support!

The experiments described in this post were conducted early this year, and it
looks like those developments are fairly new?

I just discovered this too: [https://github.com/GoogleChrome/http2push-
gae](https://github.com/GoogleChrome/http2push-gae)

Is this new-ish capability documented somewhere?

------
dmethvin
It's understandable that individual files would be slightly larger in
aggregate than a single combined file, due to the compression issue mentioned
in the article. However, since there is no more head-of-line blocking, the
overhead of a few extra bytes should be negligible compared to the elimination
of round-trip delays. Also, if you expect to serve HTTP/1 clients for a while,
it's a good idea to use a hybrid approach where you still combine files, just
less aggressively than you may have in the past.

------
nailer
> The traditional advice for web developers is to bundle the JavaScript files
> used by their webpages into one or (at most) a few "packages."

They're called bundles. Not 'packages', which already has a very distinct
meaning (eg, [https://www.npmjs.com/](https://www.npmjs.com/) or similar
systems) in the JavaScript community.

------
potench
Reading this article didn't inspire much hope for a bundle-free JS environment
- but if you want to play around with one, we've been working on such an
environment that provides two applications:

1. A Koa + HTTP/2 static asset (primarily JavaScript) server

2. A rendering application that loads a route-based JavaScript dependency tree
using JSPM + react-router in order to isomorphically render a web page.

[https://github.com/nfl/react-wildcat](https://github.com/nfl/react-wildcat)

------
RyanZAG
Isn't this whole article completely specific to the Google App Engine use
case? Google App Engine loads all of your static files onto shared servers in
use by many other users and builds up some kind of lookup table to serve
requests. It's not entirely clear how they've architected this, but it's
obviously unlikely that they store everything in RAM.

So the most likely cause of the random slowdowns is JavaScript files being
loaded from disk into RAM, or some similar process.

In most setups, you'd have your JavaScript files sitting in RAM in a dedicated
static file server cache such as Apache or nginx, and you would not experience
this kind of random slowdown. So more than likely this article does not apply
to anybody not using Google App Engine.

------
berdario
It seems weird to me that the first byte for index.htm in the many-files test
arrives at 0.175s,

while for the packaged-files test it arrives at 0.092s.

Since it's the first request, there are no concurrent requests in the view.

Maybe something else was running while they benchmarked? Still, if the
environment differed between runs, I'm not sure their timing numbers are truly
significant.

------
ilovecomputers
Since we are on the topic of effectively distributing JS code to clients...why
is there no push for delta updates on the web? There was a proposal for delta
caching back in 2002:
[https://tools.ietf.org/html/rfc3229](https://tools.ietf.org/html/rfc3229)

~~~
spankalee
Service Workers can help here:

1. Load the cached resource

2. Fire off a background request for any delta updates

3. Apply the delta and update the cache

4. On the next load, use the new resource
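A rough sketch of that flow in plain JavaScript. The delta format and the applyDelta helper here are hypothetical, just to illustrate the patching step; in a real Service Worker, step 1 would read from the Cache API, step 2 would fetch() the delta, and the last steps would write the patched resource back into the cache for the next load:

```javascript
// A delta is a list of edits against the cached text: keep `copy` chars
// from the old text, optionally `skip` some old chars, then splice in
// `insert`. (Hypothetical format, not from RFC 3229.)
function applyDelta(oldText, delta) {
  var out = "";
  var pos = 0;
  for (var i = 0; i < delta.length; i++) {
    var op = delta[i];
    out += oldText.slice(pos, pos + op.copy); // copy unchanged run
    pos += op.copy + (op.skip || 0);          // drop superseded old chars
    out += op.insert || "";                   // splice in new chars
  }
  out += oldText.slice(pos); // trailing unchanged tail
  return out;
}

// Patch a cached script: keep everything up to the value, replace "2"
// with "42".
var cached = "var a = 1; var b = 2;";
var delta = [{ copy: 19, skip: 1, insert: "42" }];
console.log(applyDelta(cached, delta)); // "var a = 1; var b = 42;"
```

The server only ships the small delta list instead of the whole file, which is the win RFC 3229 was after.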

------
ds_
ClojureScript can do this along with dead-code elimination / cross module code
motion: [http://swannodette.github.io/2015/02/23/hello-google-
closure...](http://swannodette.github.io/2015/02/23/hello-google-closure-
modules/)

~~~
lennelpennel
You mean the closure compiler can do this.

