
Cdnjs - the missing cdn - ryankirkman
http://cdnjs.com/
======
jsdalton
What's the policy on previous versions of libraries?

I noticed, for example, that only the latest version (3.1) of the jQuery
plugin Nivo Slider is available. Previous versions (<= 3.0.1) are returning
404s.

Given that the previous version was released in May 2012, that's an _insanely_
fast deprecation policy for a CDN.

I have to guess that I've stumbled upon a bug or a minor oversight in this
case, because I can't see how removing old versions that quickly would be a
workable policy for 99% of use cases...

~~~
jonny_eh
Maybe they'll only support versions that were released after the date the
library was added to the CDN. In other words, hopefully once a library+version
is added to the CDN, it's supported for a long, long time.

~~~
jsdalton
Yes, I think you are correct.

Digging deeper, I see that libraries are added via pull request and that the
Nivo Slider library was added 11 days ago. Other libraries that have been
there longer (such as jQuery) continue to have old versions.

~~~
ryankirkman
We'll host any version of any library. It's just that no one typically adds
versions of libraries older than the ones we currently host.

------
ceejayoz
"This page (<http://cdnjs.com/>) is currently offline. However, because the
site uses CloudFlare's Always Online™ technology you can continue to surf a
snapshot of the site."

Well, that's hardly comforting.

~~~
Sidnicious
FWIW, the libraries they host still seem to be getting served normally, so
that’s a good thing.

~~~
ceejayoz
Sort of. I'm still not sure I want to use a CDN that can't keep its main site
up.

How do I know their .js files won't expire out of the CloudFlare cache before
the site comes up, for example?

~~~
ryankirkman
It doesn't work like that.

The CDN portion is _completely independent_ of the website. The website is
really just an index of the files on the CDN.

------
zhoutong
I always prefer compiling and gzipping the whole site's assets into just one
.js and one .css file and serving them through CloudFront. It's usually around
100-200 KB, and CloudFront's latency is very low in most places.

This saves a lot of requests and waiting time between page loads (i.e. the
first page is always slower, but subsequent page loads take almost no time
because there are very few requests to make, sometimes just one).
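Roughly, the end result in the page is something like this (the CloudFront
domain and file names here are just placeholders, not a real distribution):

    <!-- instead of a dozen separate tags for jQuery, plugins, app code, etc.,
         one minified, gzipped bundle per asset type, served from CloudFront -->
    <link rel="stylesheet" href="//d1234abcd.cloudfront.net/assets/site.min.css">
    <script src="//d1234abcd.cloudfront.net/assets/site.min.js"></script>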

------
jvehent
Who pays for this? How do I know you're not going to replace some of that
JavaScript with malicious content?

~~~
ryankirkman
Cloudflare pays for this. They've been sponsoring cdnjs for around a year now.

As for the security of our system, all JavaScript files are verified against
official sources before going on the CDN. Additionally, we have many library
maintainers submitting updates to their own libraries.

Beyond that, the only question remaining is our personal integrity. Like any
relationship with a third party, you're going to have to decide whether
trusting us is an acceptable level of risk. If past performance is any
indication, we have had no security incidents since we began in January 2011.

~~~
adambenayoun
I assume you are the creator of cdnjs - what is your relationship with
Cloudflare beyond the sponsorship? Are you working for them? Do you advise them?

I'm wondering what performance increase Cloudflare actually delivers. I've
heard many mixed opinions, and I'm at the point where I have to decide whether
to use them or someone else.

btw - kudos to Cloudflare for sponsoring this - seems like a great way to put
yourself in front of developers.

~~~
ryankirkman
We have a very good relationship with Cloudflare. See these releases by
Cloudflare:

<http://blog.cloudflare.com/cdnjs-community-moderated-javascript-librarie>

<https://www.cloudflare.com/apps/cdnjs>

As for Cloudflare's speed, you can see cdnjs's average ping time as measured
by pingdom here: <http://stats.pingdom.com/4jg86a2wqei0/362854>

~~~
halfasleep
> As for Cloudflare's speed, you can see cdnjs's average ping time as measured
> by pingdom here: <http://stats.pingdom.com/4jg86a2wqei0/362854>

If I'm guessing correctly what the names of your Pingdom checks mean, they
seem to show CloudFlare making your response time 10ms slower. I'm assuming
that cdnjs.cloudflare.com is the site with CF in front and cdnjs.com is
without, but I'm not sure if that's correct.

~~~
ryankirkman
<http://stats.pingdom.com/4jg86a2wqei0/362854/history>

------
mdlthree
What do providers get for access to these types of files? Can they capitalize
on the information gathered from file requests, adding to their knowledge of
traffic patterns, etc.?

~~~
bdcravens
As it's sponsored by CloudFlare, I suspect it makes for a good Case Study for
them.

------
mkoryak
and here is the other 'missing' cdn that does the same thing and has been
around longer:

<http://cachedcommons.org/>

:)

~~~
erwanl
cdnjs has been around for a while already.

~~~
mkoryak
cachedcommons claims to have been around since 2009. cdnjs claims to have been
around since 2011. I'm not saying one is better than the other, but I am
implying that cdnjs is not 'the missing cdn'.

------
ckluis
That's pretty awesome! I've always wondered why popular CSS grids weren't also
CDN'd. Bootstrap, 960, etc. - it makes great sense to have many of these
things cached once across 100s of sites instead of 100 times. Modify the CDN
version in a separate stylesheet afterwards.
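For example, something like this, with the heavy framework CSS coming from the
shared CDN and only the overrides hosted locally (the CDN URL below is
hypothetical, since the CSS side of cdnjs hasn't launched yet):

    <!-- shared, widely cached framework CSS (hypothetical URL) -->
    <link rel="stylesheet" href="//cdnjs.cloudflare.com/ajax/libs/bootstrap/2.1.0/bootstrap.min.css">
    <!-- small site-specific overrides served from your own domain -->
    <link rel="stylesheet" href="/css/overrides.css">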

~~~
ambirex
The same people who are doing cdnjs have been planning to do a CSS one as
well: <https://github.com/cdnjs/cdncss>

~~~
ryankirkman
Yep. We plan on launching the CSS component of our offering in the near
future.

~~~
alexchamberlain
I've just submitted a pull request with both versions of normalize.css.
<https://github.com/cdnjs/cdncss/pull/15>

------
BarnabasLAL
I suppose if you have a big trust issue letting Cdnjs host your libraries or
if you have a customized build of one, you could just do what they did and
sign up for CloudFlare and control the files yourself. [edit: CloudFlare, not
Cloudfront]

------
Nux
Ok, I'm not a developer and I'm probably missing the big picture or something;
my question: all this fuss is about hosting a few text files, none bigger than
several KB??

Who in this world can NOT afford to host a few small files nowadays?

~~~
dasony
The idea is that users wouldn't have to download the same jquery or whatever
script over and over for every site they visit. It makes less sense when it
comes to not-so-popular script files, but I guess there's a convenience factor
too.

~~~
rachelbythebay
We need to extend the baseline notion of what the web is. If some nontrivial
number of sites are using (say) jQuery, then it would be a good idea to have a
way to declare "SCRIPT SRC jQuery version x.y.z" and let the browser figure
out where it lives. Then you fetch it once, parse it once, and run it many
times, no matter what site you may be visiting.

Or at the very least, we need some way to say "get this script from this URL,
but only if it hashes to <this value>, since otherwise it's been compromised".
Why worry about CDNs when you can design the script-switcheroo attack right
out of the system in the first place?

I wrote about this in June: <http://rachelbythebay.com/w/2012/06/27/src/>
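In markup it might look something like this (the hash attribute is made-up
syntax and the digest is a placeholder, not a real checksum):

    <!-- made-up syntax: the browser refuses to run the script unless its
         contents hash to the declared value, wherever it was fetched from -->
    <script src="http://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery-1.8.0.min.js"
            hash="sha256:0000000000000000000000000000000000000000000000000000000000000000"></script>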

~~~
mikeash
It seems to me that the best way to handle this would be a content-addressable
system with a more traditional fallback. You'd declare that you want the file
with a SHA-256 (or whatever) hash of XYZ. If the browser has it, then you're
done. If the browser knows where to find XYZ, then it can go off and grab it
however it feels like. For compatibility, you'd also specify one or more
traditional URLs where you think that content XYZ can be found, and the
browser could use the hash to verify integrity.

The trouble with the current CDN setup is that you only get the maximum
benefit if everybody uses the same CDN, but people don't necessarily want to
trust Google or whoever to host code that their site relies on. With a
content-addressable system with fallbacks, you'd get all the benefits of the
CDNs with none of the drawbacks.
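As markup, maybe something along these lines (every attribute here is invented
for the sake of the example, and the digest is a placeholder):

    <!-- invented markup: ask for the content by hash; if the browser doesn't
         already have it and doesn't know another source, try the listed URLs
         in order and verify whatever comes back against the hash -->
    <script hash="sha256:0000000000000000000000000000000000000000000000000000000000000000"
            src="https://ajax.googleapis.com/ajax/libs/jquery/1.8.0/jquery.min.js"
            data-fallback="http://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery-1.8.0.min.js"></script>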

~~~
ay
I think we already have this system - it's the browser's cache. Consider the
following sequence of steps:

1) Add "SIG" values to the tags in the HTML page that have a SRC attribute, be
it scripts, or images, or iframes, or whatnot. So far, we've just bloated the
page a bit with no good effect for the user.

2) Update the code in the browser to calculate, for each resource in the
browser's cache, the values of a few "frequently used" signatures, and allow
signature-based access to the content [compressed trie?], in addition to
indexing the cache by URL. Now the "bloat markup" from step 1 starts to kick
in - you can reuse all sorts of resources web-wide: scripts, downloadable
fonts, artwork, whatever. At this point the user spends time only on the first
download of the file, even if that file comes from a very slow VPS.

This approach could dramatically decrease the load on the CDNs for the
frequently-repeated content, and get the latency down to near zero, so it
would be much better than even the ISP-hosted CDNs.

Is anyone reading this familiar with the FF/Chromium codebase who could
comment on the feasibility of such an approach?

~~~
mikeash
I'm a bit confused, how does this qualify as "we already have this system"
when your step #2 is "update the code in the browser"?

Aside from that, yes, this sounds pretty much like what I'm thinking of.

~~~
ay
I thought you had in mind a totally new content delivery mechanism to fetch
the data by hash from the network; relative to that, adding content
addressability to the browser cache is nearly trivial. Apologies if I
misunderstood.

~~~
mikeash
I was thinking that this could be added as well, but that it would be purely
optional, if anyone got around to adding it.

Basically, you need the hash and a fallback regular URL. The browser is free
to grab the content using the hash however it likes, whether it's grabbing it
from its cache, using the fallback URL, using another known URL for that
content, or using some new delivery mechanism.

------
mpd
What about old versions? Taking the jQuery URL and changing it to the version
previous to the newest release (1.7.2) 404s on me. This is a showstopper.

<http://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery-1.8.0.min.js> - ok

<http://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery-1.7.2.min.js> - 404

~~~
jsdalton
Ha, you and I noticed the same thing just moments apart. Hopefully someone
will reply on this thread to our question, because I can't imagine having to
upgrade all of my projects immediately whenever a new version of any library
was released.

EDIT: Looks like the URL pattern just changed slightly for jQuery. 2.72 is
still around:

[http://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.mi...](http://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js)

Looks like they host back to 1.6.1, probably when the library was added to the
project:

<https://github.com/cdnjs/cdnjs/tree/master/ajax/libs/jquery>

------
px1999
Looks really useful for serving the stuff that you can't find anywhere else
(and for whatever reason don't want to host yourself), but IMO it really needs
a TOS / license. I'm guessing that I can use it on personal sites, but can I
use it on commercial sites, on SaaS sites, on sites I sell, etc.?

------
acconrad
Front-end guy here, curious about CDN access on mobile - is it faster to serve
one minified CSS and one minified JS file through a private CDN (like
CloudFront), or to use something like cdnjs and make concurrent CDN calls for
many smaller files spread across a connection?

------
moe
Hint: If you want to brag about your uptime ("100%"), then you should monitor
it with an HTTP request and not an ICMP ping.

The latter is rather deceptive because it's not actually testing service
availability.

------
Maxious
"This website is offline No cached version is available"

definitely missing ;)

------
muxxa
MochiKit isn't listed: <http://mochi.github.com/mochikit/download.html>

------
fintler
I really thought this was going to be related to cjdns.

<https://github.com/cjdelisle/cjdns>

------
criswell
I like the simple new design. Everything that I need is there and nothing
more. Huge improvement over the last design.

------
jtokoph
<http://news.ycombinator.com/item?id=2828516>

------
debacle
Needs more wu.js

<http://fitzgen.github.com/wu.js/>

~~~
sabat
From what I've gathered, you can fork cdnjs, add it, and do a pull request.

------
rojotek
Great CDN for all the stuff that you want that isn't on Google or other CDNs.

------
josteink
For every one of these CDNs you use, the chances that people with script
blockers will use your site go down.

When I see that a site tries to resolve scripts from 50 domains, for what
should really just be static HTML, I generally leave.

So please. Don't use CDNs. If you want people to trust you, host your own damn
stuff on your own domain.

~~~
MatthewPhillips
How does using CDNs make you distrust a website? If someone is loading
jquery.min.js from Google shouldn't you trust it _more_?

~~~
josteink
When I see that a webpage has 20 or more scripts attached to it from domains
other than the one website I am visiting, I assume by default that they are
tracking scripts from advertisers or Facebook or similar ilk.

If a website _needs_ scripts, I expect the website to serve them from a domain
which belongs to it. For most sites I visit there are at least 30, sometimes
50 scripts from various sites and domains trying to track me, slowing down my
browsing experience, and sucking up my system's RAM.

Disabling those causes massive speed-ups. Plus it protects my privacy.

Install ScriptNo in Chrome (or similar for Firefox). You will be shocked by
the difference. And you will be shocked by the massive script-abuse currently
on the net.

And no, I will not wade through that long list for each and every page I
visit, to whitelist whatever CDN you have decided to put in the same
trustworthiness group as doubleclick.net.

If your site breaks with those scripts blocked, I leave.

~~~
freehunter
_Install ScriptNo in Chrome (or similar for Firefox). You will be shocked by
the difference._

Yes, I would be shocked to be browsing the web circa 1995. Which is why I use
a modern browser that supports modern technologies. I also use Ghostery with a
whitelist to block tracking. That way, sites work like they should and I can
browse this decade's Internet without having to go on forums and let everyone
know I'm important and I'm special and every web app should work on Lynx
because I said so.

~~~
josteink
My point was not that all scripts are evil, but if you want me to trust your
site, then don't require me to trust 50 domains I do not know for a minuscule
10ms potential performance improvement.

I find that a perfectly reasonable attitude.

