

Ask HN: Cache the frontpage links on HN? - JeroenRansijn

It really annoys me when the links on the frontpage of HN are clicked so much that they overload the site's server. What about a website/service that caches all the pages on the frontpage of HN, and throws them away some time after they disappear from the frontpage? If you think this is a good idea, my co-worker and I might create something like this.
======
edent
Or, before submitting their* sites, they could do the bare minimum: 1) Use
the free tier of Cloudflare or similar. 2) Get a web host with unlimited /
high bandwidth levels. 3) If using WordPress, use one of the many caching
plugins.

I've been hit several times by being on the front page. By my estimate, that's
worth around 700 requests per hour ([http://shkspr.mobi/blog/2012/11/whats-the-front-page-of-hack...](http://shkspr.mobi/blog/2012/11/whats-the-front-page-of-hackernews-worth/)).
I don't think that's excessive, and using the above, my bog-standard WordPress
install has never fallen over.

*I'm aware not all stories are submitted by their writer - but I consider the above to be best practice for any competent website owner.

------
petercooper
I think a lot of content providers would not be keen on this as a default mode
of operation. They'd lose stats, access to dynamic stuff on the page, and
more.

On the other hand, if HN could frequently check if a page is still responsive
and, if not, _then_ show a cached version until it's back up, that would be
awesome, but given the underlying software doesn't get many updates anyway, I
doubt we'd see anything like this soon.
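That check-then-fall-back behaviour is straightforward to sketch. Below is a minimal illustration using only the Python standard library; the `cached_copies` store and the `resolve` function are hypothetical names, not anything HN actually runs:

```python
import urllib.request
import urllib.error

# Hypothetical store of previously fetched snapshots, keyed by URL.
cached_copies = {}

def resolve(url, timeout=5):
    """Return the live page if it responds; otherwise a cached copy, if any."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, OSError):
        # Site is down or unreachable: fall back to the cached snapshot.
        return cached_copies.get(url)
```

A real version would refresh the cache on successful fetches and rate-limit the liveness checks, but the fallback logic is no more than this.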

------
dutchbrit
I think it's a bad idea, mainly because I'd like to see how much traffic my
site received. It occasionally happens that I want to read something here, but
the server is down. In that case, I always Google the URL. So far, in all
cases, Google has spidered and cached the URL I wanted to access.

People would be better off optimizing their site/server. It's a good lesson to
see what happens when your site gets a boost in traffic.

------
arb99
It wouldn't be hard to run every submitted link through
[http://www.coralcdn.org/](http://www.coralcdn.org/), then if the site goes
down add a comment to the thread linking to their .nyud.net address (eg
[http://google.com.nyud.net/](http://google.com.nyud.net/) )
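The rewrite itself is just a hostname suffix. A sketch in Python (Coral CDN proxies plain HTTP only, so HTTPS links are returned unchanged):

```python
from urllib.parse import urlparse, urlunparse

def coralize(url):
    """Rewrite a URL to its CoralCDN mirror by appending .nyud.net to the host."""
    parts = urlparse(url)
    # Coral only proxies plain HTTP on the default port.
    if parts.scheme != "http" or parts.port:
        return url
    return urlunparse(parts._replace(netloc=parts.netloc + ".nyud.net"))
```

So `coralize("http://google.com/")` yields `http://google.com.nyud.net/`, matching the example above.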

~~~
gojomo
I love Coral CDN... and could see it being a model for (or a supporting part
of) a generalized cache-popular-submissions solution. But it seems it doesn't
support HTTPS sites.

------
moreentropy
I think it's good when it hurts.

It's better to struggle with the hacker news treatment early on and learn to
cope with traffic spikes than have the same problems when your site gets
mainstream coverage.

------
haliphax
If they use Varnish on their end, the problem solves itself. :)
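For reference, a minimal Varnish VCL sketch along those lines; the backend address, the TTL, and the assumption that the site serves the same page to all anonymous visitors are all illustrative, not a production config:

```
vcl 4.0;

backend default {
    .host = "127.0.0.1";   # assumed: origin web server on localhost
    .port = "8080";
}

sub vcl_recv {
    # Assumption: pages are identical for anonymous visitors,
    # so cookies can be stripped to make requests cacheable.
    unset req.http.Cookie;
}

sub vcl_backend_response {
    set beresp.ttl = 5m;   # cache every response for five minutes
}
```

With a short TTL like this, a front-page traffic spike mostly hits Varnish's cache rather than the origin server.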

------
bencevans
Check this out [http://hncache.bensbit.co.uk/](http://hncache.bensbit.co.uk/)

~~~
JeroenRansijn
Fantastic, although the extension doesn't work for me? Does it work for the
latest Chrome?

------
Shish2k
if (!siteResponds(link.url)) { link.domainName += ".nyud.net"; }

