
AJAX Libraries API: Speed up your Ajax apps with Google’s infrastructure - foemmel
http://ajaxian.com/archives/announcing-ajax-libraries-api-speed-up-your-ajax-apps-with-googles-infrastructure
======
nostrademons
This is really cool, but I'm not sure I'd use it because of the plugin issue.
Usually, I don't just use jQuery - I use jQuery, plus a few jQuery plugins,
plus jQuery UI, plus some custom JavaScript files built on top of them. To
deploy them, I concatenate them all together in dependency order, minify the
whole thing, and gzip it. That saves HTTP requests, and multiple files with
the same coding standards tend to compress _very_ well. (jQuery + Dimensions +
Dropshadow + Corner = 20504 bytes when compressed together, vs. 21919 bytes
when compressed separately.)
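That build step can be sketched in a few shell commands. The filenames and file contents below are placeholders, and the minify pass is left as a comment so the sketch stays self-contained; any JavaScript minifier would slot in there:

```shell
# Stand-ins for the real plugin files (contents are hypothetical).
printf 'function foo() { return 1; }\n'         > plugin-a.js
printf 'function bar() { return foo() + 1; }\n' > plugin-b.js

# Concatenate in dependency order. A real build would run a minifier
# over bundle.js here, before compressing.
cat plugin-a.js plugin-b.js > bundle.js
gzip -9 -c bundle.js > bundle.js.gz

# Compressing the files separately pays gzip's per-file overhead and
# loses the shared dictionary across files, so the single bundle
# typically beats the sum of the parts.
gzip -9 -c plugin-a.js > plugin-a.js.gz
gzip -9 -c plugin-b.js > plugin-b.js.gz
wc -c bundle.js.gz plugin-a.js.gz plugin-b.js.gz
```

The shared-dictionary effect is exactly why files written to the same coding standards compress better together.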

It looks like with this, I'd need to request jQuery from Google's
infrastructure and then all the other stuff from my own, so it's multiple
requests where one used to do, and you don't get the benefit of compressing
them together.

~~~
boucher
While I think you have some good points, the difference between 20504 and
21919 bytes is completely meaningless.

~~~
jauco
It brings it down to one request _and_ it reduces the file size. Requests are
relatively slow; you want as few as reasonably possible.

~~~
boucher
I tend to agree, and I didn't make any claims about the number of requests.
But data transfer really isn't all that slow, and 1415 bytes only takes about
10 milliseconds on a DSL connection (and about 100 ms on dial-up), which is
insignificant compared to the other factors involved.
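As a back-of-the-envelope check (assuming round-number nominal speeds of ~1 Mbit/s for DSL and ~56 kbit/s for dial-up; at a nominal 56 kbit/s the dial-up figure actually lands closer to 200 ms, though the order of magnitude is the same):

```shell
# Transfer time for the 1415-byte difference at assumed link speeds.
bytes=1415
bits=$((bytes * 8))                      # 11320 bits
echo "DSL:     $((bits / 1000)) ms"      # 1 Mbit/s = 1000 bits/ms -> 11 ms
echo "dial-up: $((bits / 56)) ms"        # 56 kbit/s = 56 bits/ms  -> 202 ms
```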

~~~
morbidkk
And you assume that everyone accessing your web application, from any number
of places around the world, has such fast internet access. I doubt it.

------
pdubroy
Related to this, people might want to take a look at Doug Crockford's
suggestion:

<http://blog.360.yahoo.com/blog-TBPekxc1dLNy5DOloPfzVvFIVOWMB0li?p=789>

Basically, he suggests that every script tag have an optional attribute called
'hash'. Whenever the browser downloads a script, it computes the hash and
caches the script. For any further requests that specify that hash, the
browser can use the cached copy instead of downloading a new one. The main
benefit here is that everyone can continue hosting their own scripts, yet
still take advantage of caching.
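As a sketch of the idea: no browser implements a `hash` attribute, so the markup below is purely illustrative, the filename is a stand-in, and the choice of SHA-256 as the digest algorithm is an assumption:

```shell
# Hypothetical markup under Crockford's proposal (not real HTML today):
#   <script src="/js/jquery.min.js" hash="sha256:..."></script>
# The attribute value would be a digest of the file contents, e.g.:
printf 'content' > jquery.min.js   # stand-in for the real library file
openssl dgst -sha256 -r jquery.min.js | cut -d' ' -f1
```

Any two sites serving byte-identical scripts would then produce the same digest, which is what lets the cache be shared without a shared host.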

Brendan Eich (creator of JS) proposes a different solution:

[http://weblogs.mozillazine.org/roadmap/archives/2008/04/popu...](http://weblogs.mozillazine.org/roadmap/archives/2008/04/popularity.html)

In your script tags, you would specify both a local version (using the src
attribute) and a canonical version (using a 'shared' attribute).

Brendan's concern about the hash solution is the poisoned message attack
(<http://th.informatik.uni-mannheim.de/People/lucks/HashCollisions/>).
However, I'm not sure that applies
here. I believe that you need to be able to generate both documents in order
to easily find a collision. Anyone else know if that's true?

------
imp
I still think I'd prefer to host them myself. When I used YUI style sheets
hosted on Yahoo's servers they always took longer to load than local files.
Also, Google ads are always the slowest item in a page to load. Until these
libraries are embedded in the browser I don't see much benefit.

------
aschobel
_edit_ Nevermind, nothing to see here _edit_

In the video they say they used many of Steve Souders' techniques to speed up
delivery of the files, since most web servers aren't properly optimized out of
the box.

Looks like Google missed the part in Souders' _High Performance Web Sites_
where he talked about using ETags to speed up sites. :P

    
    
      curl -I http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js
      HTTP/1.1 200 OK
      Last-Modified: Mon, 26 May 2008 18:45:05 GMT
      Content-Type: application/x-javascript
      Expires: Wed, 27 May 2009 17:39:22 GMT
      Date: Tue, 27 May 2008 17:39:22 GMT
      Cache-Control: public, max-age=31536000
      Content-Length: 55740
      Server: GFE/1.3

~~~
simonw
I don't think you need an ETag if you're serving a Last-Modified.

~~~
aschobel
Oops, you are right. For some reason I thought using the If-None-Match header
was the "right" way of doing this; it really just adds flexibility to how you
cache stuff.

<http://developer.yahoo.com/performance/rules.html#etags>

"If you're not taking advantage of the flexible validation model that ETags
provide, it's better to just remove the ETag altogether. The Last-Modified
header validates based on the component's timestamp. And removing the ETag
reduces the size of the HTTP headers in both the response and subsequent
requests."
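For reference, the Last-Modified model is just a conditional GET. A schematic exchange (not an actual capture; the timestamp is reused from the curl output earlier in the thread) looks like:

```
GET /ajax/libs/jquery/1.2.6/jquery.min.js HTTP/1.1
Host: ajax.googleapis.com
If-Modified-Since: Mon, 26 May 2008 18:45:05 GMT

HTTP/1.1 304 Not Modified
```

An ETag would do the same job via If-None-Match, which is why sending both is redundant overhead when the timestamp alone is a good enough validator.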

------
aggieben
It's a neat idea, but I can't imagine using it. Do you really, really trust
Google with your business? What happens when Google decides to "fix" something
in one of those frameworks that the upstream developers disagree with? Even
worse, what happens when your app is irreconcilably broken because of
something that happened at Google? Worse still, what happens when your app
somehow becomes reliant on Google's version of the framework(s), and you don't
realize it until it would be too expensive to unhook yourself from Google?

~~~
simonw
There's nothing new about depending on external parties - if your site hosts
ads (or uses an external stats service) you're already running code hosted
elsewhere, so you should probably be comfortable linking through to Google (I
trust them a lot more than most ad networks). The question is always "do I
trust this provider not to screw me over" - Google's developer network stuff
HAS to be trustworthy, or they'll lose overnight the hearts and minds they've
been cultivating.

As for your app becoming reliant on Google's version of the framework, you can
always download the JS file they've been serving and host it yourself. You
can't get locked in that way.

~~~
aggieben
_As for your app becoming reliant on Google's version of the framework, you
can always download the JS file they've been serving and host it yourself. You
can't get locked in that way._

This misses the point. I'm not concerned about _where_ the file comes from. If
breakage occurs with regard to where the download comes from, a few customers
get annoyed while you fix it (or even better, you implemented caching and
customers never know the difference). That's a normal and expected maintenance
issue that you plan for.

What I'm really talking about is the _content_ of the file. If Google fixes
bugs or adds features and you don't realize it until your app has become
dependent on Google-specific changes, it's not an easy thing to fix. This is a
different problem than getting screwed over by ad providers.

~~~
simonw
Aah I understand where you're coming from. I would be amazed if Google made
changes to the libraries they are serving up (since it would undermine the
entire concept of the hosting service).

~~~
aggieben
It would definitely seem to go against the motto, right? You need look no
further than Microsoft (and other software vendors whose entire strategy is
based on lock-in), and assess the probabilities for yourself.

------
dmose
Excellent. With pipelining browsers this will speed things up even on the
first hit.

------
flo
Thank you, this is useful to me.

