This comes in addition to the security issues caused when the CDN proxies requests to your main hostname rather than just your static files (which requires you to surrender your SSL keys to them).
While working on Bitrated, a Bitcoin service that deals with users' private keys and funds on the client side, this stood out as a very serious issue. We opted not to serve any content from third-party providers at all (including analytics services, which suffer from the very same issue), and configured a strict Content-Security-Policy that forbids anything other than hostnames controlled directly by us, over SSL.
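For reference, a strict policy along those lines looks something like the header below (the hostnames are placeholders, not Bitrated's actual configuration):

```
Content-Security-Policy: default-src https://www.example.com https://static.example.com; object-src 'none'
```

Any script, stylesheet, or image from a hostname not on that list is simply refused by the browser, which is what takes third-party CDNs out of the trust equation.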
Edit: I'm aware of SRI, but it is brand new and not yet supported by the majority of browsers in use, so it's not really a realistic solution just yet.
It's not in their interests to point out how the proliferation of CDNs results in centralisation of the internet and hence makes them tempting targets. Or how it gives a few companies the power to effectively shut out certain groups of users from large parts of the web (like CloudFlare did to Tor users a while back).
It's really disappointing to me to see the CDNs swallow the web whole. But at the same time I understand it: it's not really possible to protect against DDoS attacks, for example, without that kind of scale.
So we are on the lookout for a good CDN. I came across https://www.keycdn.com/, which checks a lot of boxes, but I can't find any reviews of it. Has anyone here used it?
I feel that the ultimate answer is a multi-CDN solution. If it's critical that your assets are always available, I think you are going to want to diversify your CDNs. The challenge comes in how to load balance and/or fail over across CDNs; whether that's easy or not depends on your use case. Doing so automatically at the time of trouble is a challenge too. Dyn just rolled out a product called Internet Intelligence that monitors the performance of all the CDNs (with end-user perf measurements) and can help you determine whether one of your CDNs is slow (and whether it's slow only in certain parts of the world). Theoretically, you can then combine geo-aware DNS with congestion-aware CDN selection and BAM, diversified, less-congestion-sensitive content delivery.
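The "congestion-aware CDN selection" part can be sketched very simply: probe each candidate, pick the fastest, treat failures as infinitely slow so they sort last. This is a toy illustration, not any vendor's product; the hostnames and `/ping` path are placeholders I made up.

```python
import time
import urllib.request

# Placeholder CDN base URLs -- substitute your real providers.
CDN_BASES = [
    "https://cdn-a.example.com",
    "https://cdn-b.example.com",
]

def probe(base, path="/ping", timeout=2.0):
    """Time a small probe request; return infinity if the CDN is down."""
    start = time.monotonic()
    try:
        urllib.request.urlopen(base + path, timeout=timeout).close()
    except OSError:
        return float("inf")  # a failed CDN sorts last
    return time.monotonic() - start

def pick_cdn(bases=CDN_BASES, probe_fn=probe):
    """Return the base URL with the lowest measured probe latency."""
    return min(bases, key=probe_fn)
```

In practice you'd run the probes from many vantage points (which is the hard part, and what products like Dyn's are selling) and feed the result into geo-aware DNS rather than into the page itself.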
Another multi-CDN solution is from Cedexis: http://www.cedexis.com/openmix/multi-cdn.html
Small players can't use Akamai directly, but there are providers (Netlify and probably some others) built on Akamai that are more accessible to the downmarket crowd.
cache hit ratios at different POPs and for different types of resources (don't lump my .js files in with my mp4s; let me give you patterns/routes to group by).
fine-grained latency numbers (95th, 99th, 99.9th percentiles) per region, per resource type.
link(href='//maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css', rel='stylesheet', integrity='sha256-MfvZlkHCEqatNoGiOXveE8FIwMzZg4W85qfrfIFBfYc= sha512-dTfge/zgoMYpP7QbHy4gWMEGsbsdZeCXz7irItjcC3sPUFtf0kuFbDz/ixG7ArTxmDjLXDmezHubeNikyKGVyQ==', crossorigin='anonymous')
script(src='//maxcdn.bootstrapcdn.com/bootstrap/3.3.5/js/bootstrap.min.js', integrity='sha256-Sk3nkD6mLTMOF0EOpNtsIry+s1CsaqQC1rVLTAy+0yc= sha512-K1qjQ+NcF2TYO/eI3M6v8EiNYZfA95pQumfvcVrTHtwQVDG+aHRqLi/ETn2uB+1JqwYqVG3LIvdm9lj6imS/pQ==', crossorigin='anonymous')
You can check a particular browser for SRI support using https://ejj.io/sri/
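If you're wondering where those integrity values come from: an SRI hash is just the base64-encoded digest of the file, prefixed with the algorithm name. A minimal sketch (the `sri_hash` helper name is mine, not part of any library):

```python
import base64
import hashlib

def sri_hash(data: bytes, alg: str = "sha256") -> str:
    """Compute an SRI integrity value; alg is a hashlib name
    (sha256, sha384, or sha512 for SRI)."""
    digest = hashlib.new(alg, data).digest()
    return alg + "-" + base64.b64encode(digest).decode("ascii")

# e.g. sri_hash(open("bootstrap.min.css", "rb").read(), "sha512")
print(sri_hash(b"hello"))  # sha256-LPJNul+wow4m6DsqxbninhsWHlwfp0JecwQzYpOLmCQ=
```

The same value can be produced from a shell with `openssl dgst -sha256 -binary FILE | openssl base64 -A`, so you can verify a CDN's published hashes against your own local copy of the file.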
If they didn't care about delivering the correct content, they would be pushing back against the specification instead of using it.
This is why I have pushed Bootstrap to start using integrity checking: to prevent this form of code injection. Granted, there's no requirement for you to use SRI with a CDN either, and while browser support isn't there yet it perhaps wouldn't be ideal for, say, a bank site (though that has always been the case unless you are in charge of the CDN content). What it does do is tell me that they won't be delivering variable content of any kind whatsoever, since SRI would break the page if they tried to load malicious content.
Justin, who has also replied to this message, works closely with developer outreach at MaxCDN and cares greatly about their product 'doing the right thing'™.
Let me know if you have any further questions about SRI, as I can probably answer them for you.
link(href='//maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css', rel='stylesheet', crossorigin='anonymous')
In the HTTP/2 world, it will likely be better to not do this.
The old trick of relying on the asset already being in the browser cache doesn't get around the fact that the connection to the third-party server is expensive, while HTTP/2 already gives you a nice open connection to your own server. I would bet that a site gets faster by taking as many third-party or other-domain assets as possible and putting them back on the same domain and same server.
There are other advantages to using a CDN beyond 'optimizing' browser cache usage: getting assets closer to your users reduces latency (and therefore increases throughput), which can be a noticeable benefit if you're hosting in San Francisco and serving Oceania/SE Asia.
HTTP/2 won't change that: it will just mean that distributing assets over multiple domains (to circumvent browsers' per-host connection limits) will make less sense, but putting one asset domain behind a CDN will still be beneficial.
The connection overhead of DNS + TCP + TLS is still significant... and with HTTP/2 you'll already have a connection to your server, the assets may already have been pushed to you, and at the very least you'll be able to fetch them over the same open connection.
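As an aside, the "assets may already have been pushed" part is commonly signalled with a standard preload Link header, which some HTTP/2 servers and CDNs treat as a push hint (the path below is just an example):

```
Link: </css/bootstrap.min.css>; rel=preload; as=style
```

Even where the server doesn't turn this into a push, the browser still gets an early hint to fetch the asset over the existing connection.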
Subresource Integrity lets you fix this; it shipped in Firefox 43 and Chrome 45.
The pitches I have heard regarding the lack of initiated prewarming, synchronization/transfer methods, or architecture optimizations are pathetic at best and fraudulent at worst (Highwinds CDN, shame on you). Akamai was inconsistent or failed to meet any sort of standard on every test run from 2005-2008 (the last time I was considering them), and Akamai continues to be opaque and to stonewall any attempt to assess why individual failures occur.

Whatever your particular needs are, there is absolutely NO REASON you should have to do any kind of workaround without a discount in price that lets you measure the tradeoff. As per my original comment, "a shitty CDN" is not an objective assessment. Being unhappy with your CDN because of an API is just plain laziness; I don't want to go to a CDN host's website, ever. The API should be robust and performant.

Storage, caching, availability by geo, pricing: these are solved problems. If you are using a CDN and are unhappy, go find a new one and move. If you can't do that, you have larger problems.