66% of server time, not 66% of traffic. I am not a web developer, but my guess at what's happening: active pages get cached, so 500,000 page views of the homepage use about as much server time as a single spider following a dead link from 2002, which has to be rendered from scratch. The Onion updates frequently, but not hundreds of times a day, and there are only a (relatively) small number of 'current' articles at any given time. It wouldn't take that many dead links to dwarf the server time spent on the live pages, as long as they were all distinct links, since each one is a cache miss.
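
To make the caching point concrete, here's a toy sketch (Python, purely illustrative, nothing to do with The Onion's actual stack): a hot page served almost entirely from cache versus a stream of distinct dead URLs that each miss the cache and force a full render.

    # Toy model: cached hot pages vs. distinct dead links that always miss.
    from functools import lru_cache

    renders = {"count": 0}  # number of expensive, uncached page renders

    @lru_cache(maxsize=1024)
    def render(url):
        # Pretend this hits the database and templates the page.
        renders["count"] += 1
        return "<html>" + url + "</html>"

    # 500,000 views of the homepage: one render, then cache hits.
    for _ in range(500_000):
        render("/")

    # 2,000 spiders each following a *different* dead link from 2002:
    # every URL is distinct, so every request misses the cache.
    for i in range(2_000):
        render("/articles/2002/story-%d" % i)

    print(renders["count"])  # 2,001 renders for 502,000 requests

The numbers are made up, but the shape is the point: traffic and server time come apart as soon as the long tail stops hitting the cache.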

As a web analyst, I can tell you that in general old, deprecated content does not get very many visits except from spiders. I would not be at all surprised if the marginal ad revenue barely breaks even against the extra server load.

Note also that they're not just throwing the traffic away. It's a decent 404. It's not the best I've seen, and there's room for improvement, but it directs people straight to the archive so they can look for the story they were linked to.
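
For what the "point them at the archive" part of a 404 like that can look like, here's a hedged sketch (Flask-style Python; the /archive/search endpoint and the slug-to-query trick are my own invention, not anything from theonion.com):

    # Hypothetical 404 handler: turn the broken path into an archive search.
    from flask import Flask, request, render_template_string

    app = Flask(__name__)

    NOT_FOUND_PAGE = """
    <h1>That story has moved, or never existed.</h1>
    <p>Try searching the archive for it:</p>
    <form action="/archive/search">
      <input name="q" value="{{ guess }}">
      <button>Search the archive</button>
    </form>
    """

    @app.errorhandler(404)
    def helpful_404(error):
        # Guess a search query from the slug:
        # "/articles/old-story-slug" -> "old story slug".
        guess = request.path.strip("/").split("/")[-1].replace("-", " ")
        return render_template_string(NOT_FOUND_PAGE, guess=guess), 404

The key design choice is just that the error page does some of the visitor's work for them instead of dead-ending the click.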
