Note that they don't cache anything over ~50 MB, and it's a relatively small cache (~4 GB per node, with least-recently-used entries evicted first), on top of only revalidating every 5 minutes.
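Roughly, that policy boils down to something like this minimal Python sketch (not Coral's actual code; the class and function names are made up, and only the limits come from the numbers above):

    import time
    from collections import OrderedDict

    MAX_OBJECT_SIZE = 50 * 1024 * 1024   # ~50 MB: anything larger is never cached
    MAX_CACHE_SIZE = 4 * 1024 ** 3       # ~4 GB per node
    REVALIDATE_SECS = 300                # content is revalidated at most every 5 minutes

    class NodeCache:
        def __init__(self):
            self.entries = OrderedDict()  # url -> (body, fetched_at), oldest first
            self.used = 0

        def get(self, url, fetch):
            """Return a body for url, refetching only when the copy is stale."""
            if url in self.entries:
                body, fetched_at = self.entries[url]
                if time.time() - fetched_at < REVALIDATE_SECS:
                    self.entries.move_to_end(url)     # mark as recently used
                    return body
                self.used -= len(body)                # stale: drop and refetch
                del self.entries[url]
            body = fetch(url)
            if len(body) <= MAX_OBJECT_SIZE:
                self.entries[url] = (body, time.time())
                self.used += len(body)
                while self.used > MAX_CACHE_SIZE:     # evict least-recently-used first
                    _, (old, _) = self.entries.popitem(last=False)
                    self.used -= len(old)
            return body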
Why don't forums like Hacker News, Reddit, etc. build this into their submission process, so the article/page is cached upon submission, and readers are directed straight to the cached copy?
And don't forget: if you're outside the US, appending their CDN domain to yours silently brings your static content (and your users' browsing trails) into US jurisdiction. Or at least that's the DHS and FBI interpretation of the law with respect to .com, .org, and .net.
But content? Anyone can make a .net domain and mirror non-US content on it. Does that give DHS/FBI any legal power over said content? Not over the original, at least.
I thought everyone knew about this? I use it for mirrors when a site looks like it's going to go down. Just append .nyud.net to the domain and it gets cached.
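For anyone who hasn't tried it, the rewrite is literally just a hostname edit. A minimal sketch in Python using the standard library's urllib.parse (example.com is just a placeholder, and this assumes default-port URLs):

    from urllib.parse import urlsplit, urlunsplit

    def coralize(url):
        """Append .nyud.net to a URL's hostname so it is served via CoralCDN.
        (The old form additionally required port 8080.)"""
        p = urlsplit(url)
        return urlunsplit((p.scheme, p.hostname + ".nyud.net",
                           p.path, p.query, p.fragment))

    print(coralize("http://example.com/images/logo.png"))
    # http://example.com.nyud.net/images/logo.png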
Same here. CoralCDN has been around for ages; I've used it to auto-mirror images and other static content from sites to reduce bandwidth consumption. Works really well.
Isn't this very old? I'm sure I stumbled upon this while doing research on a CDN project I was interested in about five years ago.
I would assume that CDN technology has progressed significantly since then.
The impression I got at the time was that it wasn't maintained as well as I would have liked. I could be mistaken, but that was my impression a few years ago.
"20 Aug 2012: We're still here! While active development has been stopped for a while, we continue to operate CoralCDN as an open, free service. It's now been running continuously for more than 8 years (since March 2004), and continues to get a few million users per day at last check. Enjoy continuing to use the service!"
»CoralCDN has been continuously operated since March 2004, running on 300-400 servers on the PlanetLab testbed, spread worldwide. As of 2011, it receives 25-50 million requests per day from a few million unique clients.«
In the beginning you had to add .nyud.net:8080, but along the way it changed to the more convenient .nyud.net.
I wonder if this can be used in any interesting ways as a drop-in performance enhancement. Using it as an automatic domain sharding solution might be viable?
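If you wanted to experiment with that, the crudest version would alternate static asset URLs between the origin host and its Coral twin so the browser parallelizes fetches across two hostnames. A toy Python sketch (all names hypothetical; whether a cold Coral cache actually helps is untested):

    from urllib.parse import urlsplit, urlunsplit

    def shard(asset_urls):
        """Send every other asset through CoralCDN as a crude form of
        domain sharding; the rest stay on the origin host."""
        out = []
        for i, url in enumerate(asset_urls):
            if i % 2:
                p = urlsplit(url)
                url = urlunsplit((p.scheme, p.hostname + ".nyud.net",
                                  p.path, p.query, p.fragment))
            out.append(url)
        return out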
This requires JavaScript, though, and its operation is way more complicated. It depends on peer-to-peer delivery, which is pretty hard when most users are behind NATed routers and proxies. Furthermore, you need quite a few concurrent users requesting the same resources before it will even start using this CDN.
Finally, it's "currently in private beta", which means it's relatively untested in the real world, and not open like Coral at all.
It is unmaintained. To quote a mail from January 2012 (roughly the twentieth most recent mail on the -dev mailing list):
> CoralCDN is basically a "one-person job", and I never got around to
> building a great monitoring-and-notification system. In this particular
> case, I didn't have as great access to email as I usually do.
For some reason, the DNS servers my Swedish ISP provides seem to ignore or block all lookups for anything containing nyud.net. I've never encountered my ISP blocking anything before, so it seems a bit strange. Too bad if other ISPs are blocking CoralCDN as well. I tried Google's DNS server (8.8.8.8) and everything works fine with that.
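It's easy to check which resolver is at fault. A quick sketch using the third-party dnspython package (pip install dnspython; the ISP resolver address below is made up):

    import dns.resolver  # third-party: pip install dnspython

    def lookup(name, nameserver):
        """Resolve name against one specific nameserver and report the result."""
        r = dns.resolver.Resolver(configure=False)
        r.nameservers = [nameserver]
        try:
            return [a.address for a in r.resolve(name, "A")]
        except Exception as e:
            return "failed: %s" % e

    # Compare Google's public DNS against the ISP's resolver (address made up).
    for server in ["8.8.8.8", "192.168.1.1"]:
        print(server, lookup("example.com.nyud.net", server))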
Well, you can use it to trivially get around all other blocks. Points more to the futility of trying to block "bad" sites name by name than anything else.
Everybody pays. It's operated using PlanetLab, which is run by over 500 universities from all around the world. You can't contribute to the operation, other than maybe funding a PlanetLab node somewhere.
http://wiki.coralcdn.org/faq.html
So don't use it for big files or for anything that changes more often than every 5 minutes.