

JavaScript on deviantART: DWait and Dependencies - kemayo
http://dt.deviantart.com/blog/36265987/

======
dmpatierno
They're certainly doing the right thing here. JavaScript optimization is a
constant battle between (1) reducing HTTP requests by grouping code together
into a single file, and (2) reducing the amount of unneeded code each page
downloads by splitting it into separate files.

You don't want to group too much because you risk downloading code you don't
need, but you don't want to split too much because you need to keep HTTP
requests low. Proper optimization requires intelligent grouping of related
code to create a balance between these two ideals, allowing the browser to
cache commonly used resources so subsequent page loads are fast.

Their blog page has 6 external JavaScript files and 4 external stylesheets.
Those numbers are pretty reasonable given the scope of their site.

~~~
bkrausz
There's no excuse for a blog to have 4 external stylesheets: these can easily
be concatenated. Worse yet, they're not even minified.

~~~
kemayo
There's no excuse for a dedicated blog to have 4 stylesheets, sure. But this
is a subpage of a big website. The stylesheets are based on logical
separation, with more sheets getting added in as you go to more specialized
pages. On the frontpage, which is really more of our focus, there's a single
CSS file.

CSS doesn't get anywhere near as much benefit from being minified as JS does.
There's much less scope for transforming the code to make it smaller, so if
you're serving your CSS gzipped (which we do), the difference is minimal.

~~~
bkrausz
Good point on the redownloading part: I'd have to look into the actual
numbers. If the homepage-primed connection count is still only 1, it's
definitely worth it.

However, I disagree about the low benefit of minifying gzipped CSS. I thought
the same thing too until I ran some tests. For example:

    34K v6core.css.gz
    27K v6core.min.css.gz

I ran your core CSS file through YUI Compressor and then gzip: 7K of savings,
over 20%, which definitely seems worth it.

~~~
kemayo
Interesting; that's a lot more than I had expected. It's only a few lines of
code to change to make it happen, so we may well do it. (Need to do some quick
checking to make sure that we don't have any old CSS hacks in there that might
play poorly with being minified.)

------
kemayo
The first third is a pretty standard introduction: minimize number of files,
yadda yadda.

The second third talks about how JS/CSS dependency resolution is handled using
metadata included in the files themselves.

The final third explains a client-side library used for enabling dependency
behavior on the frontend, including dynamic loading of files as-needed.
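The dependency-resolution half of that boils down to a topological ordering:
walk each file's declared dependencies and emit them before the file itself.
A minimal sketch (the metadata format, file names, and function names here are
hypothetical, not deviantART's actual library):

```javascript
// Hypothetical dependency metadata, of the kind that could be
// extracted from headers embedded in each file.
const deps = {
  "jquery.js": [],
  "widgets.js": ["jquery.js"],
  "gallery.js": ["widgets.js", "jquery.js"],
};

// Resolve a file's transitive dependencies into a load order:
// every dependency appears before the file that needs it.
function loadOrder(file, resolved = [], seen = new Set()) {
  if (seen.has(file)) return resolved; // already scheduled
  seen.add(file);
  for (const dep of deps[file] || []) {
    loadOrder(dep, resolved, seen);
  }
  resolved.push(file);
  return resolved;
}

console.log(loadOrder("gallery.js"));
// → ["jquery.js", "widgets.js", "gallery.js"]
// A real client-side loader would then inject <script> tags in this
// order, chaining onload handlers so each file only executes after
// its dependencies have.
```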

------
bkrausz
All modern browsers now have per-domain connection limits higher than 2
(Firefox's is 6, for example). The real problem is that JS blocks all other
content from loading and parsing, because its execution can change the rest of
the page.

~~~
kemayo
The article mentions that just after it talks about connection limits.

~~~
bkrausz
Yes, but mentioning a connection limit that's normally 2 (i.e. "This limit is
normally 2 connection to a domain.") is factually incorrect.

~~~
kemayo
It makes sense to consider the lowest number amongst browsers in fairly common
use when considering the limit. IE7 may not be a "modern browser", but it's
still one in widespread enough use that we have to support it. (It was only in
IE8 that MS bumped the limit to 6.)

------
petercooper
That's definitely the first time I've seen such a strong literary reference in
the first sentence of a programming article :-)

~~~
kemayo
I was expecting someone to comment on that a little sooner, I'll admit. :)

~~~
ihodes
Au contraire!

Check out my post on the front page of HN just a couple days ago (before this
was written) ;)

The exact same reference, but relating to Clojure.

<http://copperthoughts.com/p/clojure-io-p1/>

That's not to say I don't appreciate it, though :)

------
Aetius
Seeing $object->method() made me shed a little tear in remembrance of the days
I sweated bullets for PHP.

