
Why is it that "our algorithms think the site is dead" happens so soon, yet when I search for a keyword, I still get sites that are dead or no longer contain the given keyword? Bad algorithms, or is this "thinking dead" a time-delayed thing?

Normally there's a lag between when a site goes dead and when we crawl the site to see that it's dead.

It's also tricky because you don't want a single transient glitch to cause a site to be removed from Google, so normally our systems give a little wiggle room in case a site is just under unusual load or is down only temporarily.
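To make the "wiggle room" idea concrete, here's a minimal Python sketch of one way a crawler might apply a grace period before declaring a site dead. Everything here (the SiteHealth class, the thresholds, the week-long window) is a hypothetical illustration, not a description of Google's actual pipeline:

    import time
    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical thresholds -- the real values are not public.
    FAILURE_THRESHOLD = 3          # consecutive failed crawls required
    MIN_DEAD_WINDOW = 7 * 86400    # failures must also span at least a week

    @dataclass
    class SiteHealth:
        """Tracks crawl outcomes for a single site."""
        consecutive_failures: int = 0
        first_failure_at: Optional[float] = None

        def record_crawl(self, ok: bool, now: Optional[float] = None) -> None:
            now = now if now is not None else time.time()
            if ok:
                # One good fetch forgives everything: transient glitches reset the clock.
                self.consecutive_failures = 0
                self.first_failure_at = None
            else:
                if self.consecutive_failures == 0:
                    self.first_failure_at = now
                self.consecutive_failures += 1

        def looks_dead(self, now: Optional[float] = None) -> bool:
            """Only call a site dead after repeated failures spread over time."""
            now = now if now is not None else time.time()
            return (
                self.consecutive_failures >= FAILURE_THRESHOLD
                and self.first_failure_at is not None
                and now - self.first_failure_at >= MIN_DEAD_WINDOW
            )

    # Example: a brief outage is forgiven, so the site is never marked dead.
    h = SiteHealth()
    h.record_crawl(ok=False, now=0)
    h.record_crawl(ok=True, now=3600)   # recovered -> counters reset
    print(h.looks_dead(now=3600))       # False

The point of this design is that any single successful fetch resets the failure counter, so a load spike or short outage never accumulates toward removal; only sustained, repeated failures over a long window would.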

