
JsDelivr – The advanced open source public CDN - rnyman
https://hacks.mozilla.org/2014/03/jsdelivr-the-advanced-open-source-public-cdn/
======
javajosh
When you include 3rd-party hosted content in your program, you give those 3rd-
parties:

    
    
       1. Access to your users' browser details (via request headers).
       2. Access to your users' habits (via the referrer header). 
       3. The ability to replace executable code. (This code can be used to gather more data from the state of the web page, e.g. keystroke logging, or to break out of the browser sandbox and compromise the whole system. Note: this applies only to JavaScript.)
       4. The ability to break your software, either intentionally or unintentionally, by not serving the resource at all. 
    

When you host and serve your own resources, these are non-issues.

I think there is a sense among good-hearted programmers that "But gosh, they
wouldn't do [1-3]! These CDNs are just a really nice public service by
companies that care about keeping the web community healthy and fast. And
besides, if they ever abused their access to user data, there would be hell to
pay!"

Unfortunately this blasé attitude about user privacy on the part of
programmers is why we have such a rough time with privacy in general these
days. I believe that if you're a programmer with any interest at all in
protecting your users from bad actors, you'll do your best to avoid using
3rd-party hosts, _especially_ for executable code.

~~~
majika
Thank you thank you thank you.

What's worse is when you see sites running JavaScript from multiple analytics
companies, social media companies, and CDNs - even on pages that should be
private (login forms, personal details, etc.). It's even worse when those
sites break without those scripts. This is a concern that far too many
developers overlook. I made this (harsh) comment [0] on such a site posted to
HN a few months ago.

CDNs are way overrated, from what I've seen. I think, for most use cases,
you're better off avoiding the multiple HTTP requests and just combining all
your JavaScript into a single compressed file. Serve that from your own server
and you'll save the client another domain-name resolution too.

Do yourself a favor and install NoScript. For one thing, it will show you how
easily Google can (or could) track your internet usage - and, as you
pointed out, so much more. In my experience, about 50% of sites serve scripts
from some Google domain.

[0]:
[https://news.ycombinator.com/item?id=7187593](https://news.ycombinator.com/item?id=7187593)

~~~
javajosh
You're welcome! And I agree that the upsides of CDNs are overrated - but in
fairness they are very easy to use and high-performance, and they ease
applications' bandwidth costs. The geographical proximity thing
is a nice feature, and expensive to reproduce. Of course, even these upsides
come with their own downside: front-end developers have gotten into the habit
of including every lib under the sun. Heck, it's just a script tag! This makes
the page heavier, which makes the CDN even more indispensable, and it leads to
a downward spiral of browser-hosted bloat-ware that _requires_ a CDN to load
in a reasonable amount of time. This is, of course, a win for the CDN
sponsoring companies.

What's remarkable to me is how ubiquitous and unquestioned the use of CDNs has
become.

BTW NoScript/Ghostery don't address the CDN issue. They don't actually block
CDNs, they just block _explicit_ trackers (mainly in the forms of iframes and
scripts linked to known-bad hosts).

~~~
majika
NoScript is configured by default to only allow scripts and "active content"
to run from domains you allow. You can temporarily allow certain domains, or
permanently allow them. So yes, it does address the CDN issue. If someone
includes a script from apis.google.com, that script won't run until I permit
content from that domain to run (and I only ever do that temporarily, and only
if the site needs it). Likewise for any other domain: CDN, analytics,
whatever.

I don't know much about Ghostery - NoScript addresses most of my privacy
concerns around web browsing.

------
AdrianRossouw
So this is something I wrote last week about how I've come to believe that
it's really not appropriate to rely on vendor code on public CDNs for
anything more complex than a JSFiddle.

[http://daemon.co.za/2014/03/from-trenches-js-cdn-point-of-
fa...](http://daemon.co.za/2014/03/from-trenches-js-cdn-point-of-failure)

Basically, if you need to include something like jQuery, you have to have code
that makes use of it. Since, by definition, you can't serve that code from a
public CDN, you are going to need to serve those static assets somehow.

I now believe that you should serve the vendor code from the same 'place' as
you serve your application's static assets, because by relying on external
resources, you are adding more moving parts that all have to be in perfect
working order for your app to actually be able to run.

This doesn't really have anything to do with how reliable the CDN itself is,
but rather with how reliable the client's connection is.

I did read the jsDelivr post, and it actually looks like a really
well-thought-out system. I just don't think I will use something like this for anything
where I have the choice not to.

IMO, the possible benefits of using a public CDN don't outweigh the fragility
that gets added. It just feels like it's trying to optimize best-case
performance when the worst case is far more important.

I'm not against CDNs as a concept though, I just think you should serve all
the code that is needed for normal operation from the same one.

~~~
moystard
I think your statement is a bit extreme. While relying exclusively on a CDN to
serve your static assets is risky, doing it with require.js and a local
fallback works very well. You still benefit from CDN caching, while preventing
your assets from becoming unavailable (even in China).
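For readers unfamiliar with the pattern, here is a minimal sketch of the RequireJS "paths fallback" configuration being described (the CDN URL and local path are illustrative):

```javascript
// List the CDN copy first and a local copy second; RequireJS retries with
// the next entry in the array when a script fails to load.
var config = {
  paths: {
    jquery: [
      'https://cdn.jsdelivr.net/jquery/1.10.2/jquery.min', // tried first
      'lib/jquery.min'                                     // local fallback
    ]
  }
};

// In the browser: requirejs.config(config);
// then: require(['jquery'], function ($) { /* app code */ });
```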

~~~
lerouxb
Really? Have you ever sat through the experience of waiting for the primary
source to time out so that the fallback can kick in?

~~~
moystard
Ultimately, the timeout issue will be problematic for countries that block
access to CDN servers (China, for example).

For the rest of the users, there are three cases:

- the user has accessed the application in the past (or any other webpage that
uses the CDN): a cached version of the library is served from her browser's
cache.
- the user has never accessed the page and the Internet connection has failed:
our application won't be available anyway.
- the user has never accessed the page and the CDN is indeed down: she has to
wait for the timeout before her browser fetches the local fallback.

The third case sounds extremely rare (look at the statistics of the major CDNs
out there; they often have better uptime and response times than your own
servers). And the advantages provided by the first case more than compensate,
in my opinion.

------
it_learnses
Can I work for Mozilla? I feel like they're one of the few organizations
doing cool things for the common good. Keep it up!

~~~
F30
They may be, but JsDelivr appears to have no deep affiliation with Mozilla:

> Who is behind jsDelivr?

> Basically just me, Dmitriy A. or @jimaek. But a lot of other people are
> involved, giving their advice and helping me out. I always refer to jsDelivr
> in the plural sense because of this.

It is sponsored, amongst others, by CloudFlare and MaxCDN, where @jimaek is
also employed. But the only connection to Mozilla seems to be that it's
featured on the Mozilla Hacks blog.

------
dalek2point3
I dont understand how this is paid for -- can someone explain?

~~~
lazyjones
> _I should also point out that MaxCDN, CloudFlare, Cedexis and the rest of
> the companies sponsor jsDelivr for free._

Personally, I don't trust such services any more than I trust cloud hosting,
which means not at all. I have no way to make sure that they serve these
files unmodified, that they aren't neglecting their servers to the point where
they're hacked and serve malware to my users, etc.

~~~
jimaek
I would be happy to address these issues. You have no reason to believe me but
all custom servers are secured and are regularly updated.

For CloudFlare and MaxCDN I have 2-step authentication enabled, plus a single
whitelisted IP address for MaxCDN.

Regarding unmodified files, I guess we could build an app to monitor them? If
you want, drop by
[https://github.com/jsdelivr/jsdelivr](https://github.com/jsdelivr/jsdelivr)
and we can discuss this further.

I would be happy to do anything possible to ease your concerns regarding
security.

~~~
bronson
Do you have any sort of agreement that they will serve these files bit for bit
unmodified?

I had a problem with cloudflare inserting some tracking cookies into my static
HTML files. They claimed it was ddos protection. To me (and the EU cookie
directive) it looked no different from the garbage other analytics sites use.

If jsdelivr providers are allowed to modify the files they serve, I won't use
it. Got any sort of guarantees?

Thanks.

~~~
saurik
FWIW, the only "CDN" I've ever heard of to regularly pull stunts like that is
CloudFlare; and really, that's their angle: it adds latency (which has been
demonstrated in various commentary on the service) with the goal of modifying
content to reduce the number of requests or improve client-side rendering
times. It is more of a "content optimization" service than a "content
delivery" service. If you want a CDN the tradeoffs (number of edge nodes,
latency, cache sizes) are much better with other providers.

Sometimes, the stuff they inject also has horrible bugs ;P. One time, for an
entire day, they were managing to lock up Safari entirely. Cydia is mostly a
web browser, and one of the companies I work with apparently used CloudFlare,
so Cydia suddenly stopped working that day in a way that was pretty
catastrophic. I did a writeup on the process of discovering the bug (which I
had to report to CloudFlare to get fixed: I don't even think they really had
the expertise in-house to figure out what happened).

[http://www.saurik.com/id/14](http://www.saurik.com/id/14)

~~~
jessaustin
Good stuff! If you write this sort of article every so often, please set up an
RSS or Atom feed.

------
lerouxb
The problem I have with this is the same one I have with all public JS CDNs:
your project doesn't just use public JS files. You are almost guaranteed to
also have custom JavaScript (which may or may not itself be open source;
irrelevant), CSS, possibly fonts, images, etc.

You can't put those static files on the public CDN. So either you have to
serve them up yourself or put them on your own CDN (either one you run
yourself or a managed one).

So you have an additional point of failure (because your static files are
coming from at least two places rather than one) which multiplies the chances
of something going wrong for little benefit in my opinion.

Deployments are likely to be much more complex too, etc.

And before someone mentions the limit of how many concurrent requests a
browser will make to a host: That's also irrelevant. I'm not arguing that you
should have loads of little files. You should still be combining things with
build tools. In fact if you host those "vendor" files on your own webserver or
cdn you actually have more flexibility around how to combine them. And you can
still have multiple subdomains and ip addresses or whatever pointing to the
same CDN to try and optimise around that if you want.

------
RaphiePS
This is really cool. Up to the section on Advanced Features, it seemed like a
clone of Cloudflare's cdnjs. But the multiple request feature is absolutely
terrific, and I can't wait to use it.

EDIT: I wonder if the multiple-request URLs mean that the individual assets
won't be cached, which is a big selling point of CDNs.

~~~
jimaek
Grouped URLs are cached as always but for less time. You can check the
headers.

~~~
eli
I think the parent's point is that people will be less likely to have a warm
cache, because it's less likely any other site has requested the exact same
group of files.

(Though there's debate as to how much that matters.)

~~~
jimaek
Personally I think the only file that is likely to be already cached is jQuery
from Google. So for the rest it doesn't matter. Just my opinion.

------
moystard
There is cdnjs.com, which offers the exact same features and has been
available for quite some time.

I regularly submit new libraries and update the current ones through their
GitHub repository.

The multiple-files-per-request feature is nice, but contributing to cdnjs
might have been a better way of proposing it.

~~~
xpose2000
But according to the benchmarks, cdnjs.com performs worse than Google and jsDelivr.
[http://www.cdnperf.com/#jsdelivr,google,cdnjs/http/90](http://www.cdnperf.com/#jsdelivr,google,cdnjs/http/90)

~~~
jimaek
Exactly - we use multiple CDNs, and we are the only project to do that. This
is why our performance and uptime will always be better.

~~~
Codhisattva
Better than Akamai? I think not.

~~~
jimaek
Akamai does not offer a public CDN. So they are not a competitor. Apples and
oranges

~~~
Codhisattva
If by "public" you mean free, correct.

Akamai is the Cadillac of the CDN world, so more like Apples and Organic
Gourmet Apples.

------
stcredzero
I'd like to see something like a CDN, but for completely dynamic content.
Something like a system that could host instances of a game server, with a
framework that ensures clients make connections with very low latency.

~~~
devicenull
Huh? The game server instances would still need to sync between themselves, so
this really wouldn't address latency.

Rather than having the latency between the client and the server, you'd
instead have latency between the two servers.

~~~
stcredzero
_Huh? The game server instances would still need to sync between themselves,
so this really wouldn't address latency._

No. If you design the game properly, they actually don't have to coordinate
very much at all.

~~~
devicenull
For what kind of game? For something like an FPS, this requires significant
coordination.

~~~
chc
How much coordination is there between, say, different Team Fortress 2
servers?

------
gtirloni
_There are no popularity restrictions and all kinds of files are allowed,
including JavaScript libraries, jQuery plugins, CSS frameworks, fonts and
more_

So what is 'Js' about it? Has the 'i' prefix become old-fashioned?

~~~
jimaek
jsDelivr is JavaScript-centered, but it allows all kinds of projects too. For
example, the combination feature only works with JS projects, and we have all
kinds of plugins for JS developers.

But we still want to help open-source projects take advantage of our network.
This is why we allow even Windows binaries and other stuff too.

------
SimHacker
I am worrid about wb sits with misspelld nams.

Mistakn usrs somtims correctd missd lettrs, so thy downloadd pags from
unrelatd servrs.

Howevr, the spc thy postd mentiond othr use cass, and presentd exampls of
anothr systm with enterprisy qualitis to delivr usrs to correctd wb sits with
misspelld nams:

A clevr browsr evn maks guesss, which enabls fils with unrequestd but correctd
nams to be downloadd in paralll ovr integratd distributd per to per
technologis, oftn receivd replis cachd, and latr presentd.

Ys, yt again, browsrs designd by committs enabld Opn Sourcs and Fre Softwars
to answr anothr complicatd Internt problm!

------
Dorian-Marie
Neat, but I couldn't get three libraries with minified JS:

    
    
        https://cdn.jsdelivr.net/g/jquery@1.10.2,bootstrap@3.0.3,angularjs@1.2.10
    

And I couldn't get Bootstrap CSS, even though they say they also host CSS.

Also, it's not clear whether you can mix CSS and JS in the query; it would be
neat to do that and then insert the CSS with JavaScript (or maybe that would
be a bad idea).

~~~
jimaek
jQuery was deployed from the official source, and it does load a map file,
suggesting it's minified. Not sure why it looks like this.

You can mix CSS and JS in one query, but it's up to you to make it work.

To load Bootstrap CSS you need to specify the exact file like so

[https://cdn.jsdelivr.net/g/bootstrap@3.1(css/bootstrap.min.c...](https://cdn.jsdelivr.net/g/bootstrap@3.1\(css/bootstrap.min.css\))

Because the main file for this project points to a JS file.

~~~
Dorian-Marie
Thanks, I didn't see that the files were in sub-directories for Bootstrap.

Here is my final Javascript query:
[https://cdn.jsdelivr.net/g/jquery@1.10.2(jquery.min.js),boot...](https://cdn.jsdelivr.net/g/jquery@1.10.2\(jquery.min.js\),bootstrap@3.0.3\(js/bootstrap.min.js\),angularjs@1.2.10\(angular.min.js\))

~~~
jimaek
You don't need to specify the exact files in this case. This code will load
the exact same thing:
[https://cdn.jsdelivr.net/g/jquery@1.10.2,bootstrap@3.0.3,ang...](https://cdn.jsdelivr.net/g/jquery@1.10.2,bootstrap@3.0.3,angularjs@1.2.10)

You need to specify the files if you are loading files other than the one
specified in the "mainfile" parameter, or if you are loading multiple files
together.
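For reference, such a grouped URL goes into the page as a single script tag:

```html
<script src="https://cdn.jsdelivr.net/g/jquery@1.10.2,bootstrap@3.0.3,angularjs@1.2.10"></script>
```

One request then covers all three libraries.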

------
gales
I currently use cdnjs, but I'm very interested in the single HTTP request and
simpler versioning that jsDelivr is offering, so I will start using it for my
current project.

Most of the libraries I use are already on there, and I like the fact that
at-a-glance information is shown for each library. I also noticed there's a
duplicate, misspelt typeahead library listed.

------
nawitus
I'd like something like this integrated with a module system. For example, I
could just say "var jquery = require("jquery")", and the script would be
automatically downloaded from the CDN.
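A rough sketch of how such a helper might look (the URL pattern and function names here are hypothetical, not an actual jsDelivr API; unlike a real require(), the library lands on the global object asynchronously):

```javascript
// Map a module name to a CDN-style URL (illustrative layout, not official).
function cdnUrl(name, version) {
  return 'https://cdn.jsdelivr.net/' + name + '/' + version + '/' + name + '.min.js';
}

// Inject a <script> tag and invoke the callback once the library has loaded.
function requireFromCdn(name, version, onload) {
  var script = document.createElement('script');
  script.src = cdnUrl(name, version);
  script.onload = onload; // e.g. window.jQuery is available once this fires
  document.head.appendChild(script);
}
```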

~~~
bebraw
I suppose someone could write something like this on top of the API,
[https://github.com/jsdelivr/api](https://github.com/jsdelivr/api) .

------
alecsmart1
I must say that the process for adding files is too complicated and time
consuming.

~~~
jimaek
I know. I hope the auto-update app will fix this.

We are also thinking about new ways to add files:
[https://github.com/jsdelivr/jsdelivr/issues/347](https://github.com/jsdelivr/jsdelivr/issues/347)

~~~
alecsmart1
Why not implement origin pull?

~~~
jimaek
Origin pull from Github?

We need to maintain our own structure to allow features like file grouping and
version aliasing.

------
dangayle
How is this different from cdnjs? I already use that, and it too is on
CloudFlare.

------
SimHacker
2004 calld. Thy want their missing "e" back.

~~~
jimaek
You can use this domain if you prefer
[http://www.jsdeliver.com/](http://www.jsdeliver.com/) :)

~~~
SimHacker
Ys, I prefr.

------
aw3c2
How about not leaking to third-parties? I value my privacy and I wish Mozilla
would too.

~~~
_puk
Sorry, I don't get the comment.

Leaking what? Your publicly available JS libraries? or is this a more general
Mozilla bashing comment?

The service itself looks great, and the versioning is exactly what I'm after.

~~~
aw3c2
Sorry, I thought it was obvious.

Requesting assets from third-party websites leaks that I have been visiting
your site to the third-party. Now imagine having a big CDN that many websites
use and you can spy on a lot of unsuspecting users.

~~~
SEMW
Surely the correct solution to that is to control the information _your_
browser will send to the CDN, not to try and stop web designers from using
CDNs for everyone. The way requesting assets leaks the page you're visiting is
through the referer header, so if you want to prevent that, use an add-on that
lets you control that header, like [https://addons.mozilla.org/en-
US/firefox/addon/referrer-cont...](https://addons.mozilla.org/en-
US/firefox/addon/referrer-control/) .

~~~
AdrianRossouw
I don't think that making the entire toolchain more complex, with extra edge
cases needing some kind of centrally managed whitelist, is more correct than
'just don't do that'.

~~~
chc
Just don't do _what_? Receive the information you are sending them?

If you don't want someone to have this information, the sensible thing is not
to offer it. I cannot conceive of any way in which sending the information and
then yelling at the recipient for receiving it is "more correct" than just not
telling people things you don't want them to know.

~~~
hackinthebochs
Why don't site operators take responsibility for not leaking their users'
information unnecessarily? We've taken it upon ourselves to chastise any web
property that doesn't properly hash its passwords - we could just as easily
say "if you don't want your passwords for other sites leaked, use different
passwords". But we recognize the unfair burden that would place on end users.
The case with CDNs and information leakage is similar.
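One site-side measure in this spirit (a sketch; browser support for this meta value was still uneven at the time of this thread) is to ask the browser not to send a Referer header at all with the page's outgoing requests:

```html
<!-- Suppress the Referer header for requests this page makes, so a
     third-party CDN learns less about which page embedded its assets. -->
<meta name="referrer" content="no-referrer">
```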

