

Find the most reliable and fast public CDNs - jimaek
http://www.cdnperf.com/#jsdelivr,cdnjs,google,yandex,microsoft,jquery_mt_,bootstrapcdn/http/90

======
cbr
The main question for a site like this is: how are you collecting your data?
CDNs have dozens of nodes and the performance a user sees will depend on what
the network looks like between them and the nearest one. For example, if all
your measurements were from an EC2 instance in us-east you'd only be measuring
a tiny and easily-gamed slice of CDN performance.

On their about page they write that they're "grateful to Pingdom for providing
access to their data" but I can't find anything more about which and how many
places they're measuring from or how they combine measurements from different
places.

Disclaimer: I work for Google on mod_pagespeed and ngx_pagespeed, and I sit
near the people who handle developers.google.com/speed/libraries/, but I don't
know anything about how they serve the hosted library traffic.

~~~
jimaek
Currently CDNperf uses Pingdom to gather its data. All CDNs are in the same
Cluster group and these are the locations it uses:

Chicago, IL
Copenhagen, Denmark
Washington, DC
Milan, Italy
San Jose, CA
Lisbon, Portugal
Toronto, Canada
Las Vegas 2, NV
Amsterdam 5, Netherlands
Strasbourg 2, France
Charlotte 2, NC

I understand that CDNperf data does not reflect real-life performance, but
with our limited resources this was the best we could do.

If you have suggestions on how to make it better please let us know.

~~~
alexgartrell
I'm an engineer on Facebook's CDN and Edge network, and, in my experience, the
hardest places to serve people quickly are South America, Africa, and Asia.
You should try to get some timing data for those places.

~~~
jimaek
Pingdom does not offer servers in those locations. Other services do, but we
can't afford them. Pingdom is actually sponsoring us with a free Pro account :)

If Facebook or Google were interested in sponsoring us then sure, we could add
more locations and do more awesome stuff.

------
bredman
I'm actually not sure where the data for this site is coming from (there's
some mention of Pingdom on the about page), but CDN performance can vary
_significantly_ based on user location. For that reason, the only CDN
comparisons I really trust are ones taken from the end-user computers of a
representative sample of the viewers you're trying to optimize for.

In this case, if this data is from Pingdom monitoring locations, it's
particularly bad for estimating performance for a JS library that's likely
loaded by end users who are not browsing the web from well-connected data
centers.

Agreed that the look is nice, and admittedly, in the absence of other data, I
would probably trust this data.

~~~
bebraw
Hi,

The data is based on Pingdom, yes. Recently we made sure each CDN is monitored
from the same group of servers.

It's probably not ideal. It definitely would be great if we could provide some
alternative metrics. Ideas are welcome. :)

~~~
mrweasel
One thing that would be interesting is "Latency from where". Yandex seems to
do pretty badly, but is that also true if you're in Russia?

~~~
bebraw
Nice point. We've been planning a map based visualization. That would
definitely help in this regard.

------
computer
Isn't actual real-world usage (market share) more important? If 99% of your
users already have the resource cached, no request will be fired whatsoever.

~~~
deelowe
Exactly, which is why using a big one, like Google's, is my preferred option.

~~~
michaelmior
I have no data to support this, but I wonder if that really matters. You have
other resources you need to serve from your site no matter what happens. It
seems possible that resources from your own site would end up being the
bottleneck, so you might as well just serve up your own libraries anyway.

------
mey
My security-conscious nature has always been distrustful of using a public
CDN. It represents a potential security and privacy concern for my clients.
It'd be an interesting attack vector to inject malicious JavaScript code into
a wide pool of sites at once. There's also the operational concern of whether
these systems will keep operating for the long term without service
interruptions.

~~~
Jakob
While true, it would not spread that easily. Google returns, e.g. for jQuery,
the response header "Expires: Fri, 21 Nov 2014 00:46:46 GMT".

As long as this file is in my cache, my browser won't request it again for a
year, independent of what happened to it in the meantime.
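The freshness decision a browser makes here can be sketched roughly like this
(a simplification for illustration, not how real HTTP caches are implemented):

```javascript
// Rough sketch of the Expires-based freshness check: while the Expires
// date lies in the future, the cached copy is served without any request
// to the CDN, so a later compromise of the hosted file would not reach
// this browser until the cache entry expires or is evicted.
function isFresh(expiresHeader, now = new Date()) {
  const expires = new Date(expiresHeader);
  return !Number.isNaN(expires.getTime()) && now < expires;
}
```

With the header above, `isFresh('Fri, 21 Nov 2014 00:46:46 GMT')` stays true
until late November 2014.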

> or not suffer service interruption

Preventing that is actually quite easy:

    
    
        <script src="//cdn/jquery.js"></script>
        <script>window.jQuery || document.write('<script src="local/jquery.js">\x3C/script>')</script>
    

(a slightly more complex solution would be needed if the CDN is timing out
instead of returning an error)
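That more complex variant could race the CDN load against a timer, so a
hanging CDN still triggers the local fallback (a sketch with a hypothetical
`injectScript` helper, not code from any of these sites):

```javascript
// Race a script load against a timer: if the CDN neither loads nor
// errors within `ms` milliseconds, reject so the caller can fall back.
// `load` is any function returning a promise; in a browser it would
// inject a <script> tag and resolve on its "load" event.
function loadWithTimeout(load, ms) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error('timeout')), ms);
    load().then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); }
    );
  });
}

// Hypothetical browser usage:
//   loadWithTimeout(() => injectScript('//cdn/jquery.js'), 2000)
//     .catch(() => injectScript('local/jquery.js'));
```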

~~~
wreegab
If it is in the cache, why is the CDN needed anyways?

~~~
wcdolphin
Using a centralized CDN would increase the chance of a cache hit.

~~~
wreegab
I'm completely skeptical about the benefit of CDNs to users.

I would like to see some hard data on the number of websites a typical user
visits to understand how this is a meaningful argument. As of now, I lean
toward thinking these CDNs are just yet another way to track users.

Anyways, I block them all by default, and my browsing works just fine.

~~~
hercynium
How do you "block" a CDN? Is that a mis-wording? Do you have some means to
automatically discover and detect the origin servers and connect directly to
them, bypassing the CDN?

And, to answer your question, the benefits are substantial, well-documented,
and provable on multiple levels.

First off, a CDN (when working properly) greatly improves the average latency
for browsing a site, and in some cases even the bandwidth usage. Additionally,
use of a CDN can increase the number of users a site can simultaneously serve.
The best CDNs can not only withstand but actively deflect various types of DOS
attacks. Some can even serve resources like images and video dynamically
optimized for the browsing software or device.

There are many more benefits, and believe it or not, a _huge_ percentage of
the Internet's web and media traffic flows through CDN services. Bypassing
all of them is near-impossible (unless you somehow don't use _any_ of the most
popular sites and services).

------
larrybolt
I'm curious whether we'll ever see the most popular JS/CSS
frameworks/libraries integrated into the browser itself, with a simple
attribute on the tag to load the internal version while still allowing
failover to the hosted one.

This could even prevent man-in-the-middle attacks on scripts that otherwise
would never expire anyway like described here: [http://thejh.net/written-
stuff/want-to-use-my-wifi](http://thejh.net/written-stuff/want-to-use-my-wifi)
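A related direction is pinning the script to a content hash at the tag level
rather than bundling libraries into the browser, in the style of Subresource
Integrity (a hypothetical sketch; the hash value is a placeholder):

```html
<!-- The integrity hash is a placeholder; the browser would refuse to
     execute the file if its contents don't match the given digest. -->
<script src="//cdn/jquery.js"
        integrity="sha384-…"
        crossorigin="anonymous"></script>
```

That would stop a man-in-the-middle from swapping the script even when the
cached copy never expires.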

~~~
untog
Seems like an administrative nightmare. Incorrectly loading JS libraries is
just one of many problems when your network connection is compromised.

~~~
larrybolt
Why would this be an administrative nightmare? I can see how deciding which
JS/CSS libraries should be included can cause dispute, but apart from that,
why not?

I've actually always wondered why the default CSS applied to elements isn't
standardised, so that pages without reset.css or normalize.css stylesheets
render differently, and why certain JavaScript methods differ from browser to
browser.

But I guess that is a rather different discussion.

~~~
untog
I think it's part of the same discussion. It makes total logical sense for
default CSS to be standardised. Why isn't it? Because coordination between
browser manufacturers is extremely patchy.

So, different browsers would have different libraries included depending on
who made them. Possibly different versions too. It would just be very messy,
with little reward.

------
kbar13
The problem with many of these publicly available "free" monitoring systems
that deal with ping / HTTP response time is that, a lot of the time, they end
up being monitoring systems for their own networks.

With that being said, I like the look!

~~~
Jakob
An interesting addition would be letting the client ping those CDNs, too.

~~~
jimaek
You mean publicly show the hostname of each CDN to allow users to ping them
themselves?

------
jread
You're better off evaluating CDNs using real-user/last-mile testing than a few
Pingdom servers in data centers. Cedexis has some decent analysis of this
type on their website: [http://www.cedexis.com/country-
reports/](http://www.cedexis.com/country-reports/)

------
nl
The current gold standard in CDN measurement is
[http://www.cedexis.com/country-reports/](http://www.cedexis.com/country-
reports/)

It uses instrumented JavaScript in real users' browsers to measure latency.

That seems a better method IMHO.

~~~
jimaek
It is, but they monitor enterprise CDNs, not public ones. Sure, you can find
CloudFlare/cdnjs there, but not Google, Yandex, jsDelivr, or jQuery.

Unfortunately this is hard to implement with our limited resources.

------
gcb1
yui official cdn?

