
Hi

I work at Google helping webmasters.

It seems something has been blocking Googlebot from crawling HN, and so our algorithms think the site is dead. A very common cause is a firewall.
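If you want a quick sanity check from the outside, something like the sketch below (not an official tool; the URL and user-agent strings are just examples) shows whether the server answers differently when the request claims to be Googlebot. Note this only catches user-agent rules; an IP-level firewall block would have to be checked on the server itself.

    import urllib.error
    import urllib.request

    URL = "http://news.ycombinator.com/"
    AGENTS = {
        "browser": "Mozilla/5.0",
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    }

    # Fetch the same page with a normal and a Googlebot user agent and
    # compare; a 403, timeout, or empty body for the Googlebot request
    # points at a user-agent based block.
    for name, agent in AGENTS.items():
        req = urllib.request.Request(URL, headers={"User-Agent": agent})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                print(name, resp.status, len(resp.read()), "bytes")
        except urllib.error.URLError as err:
            print(name, "failed:", err)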

I realize that pg has been cracking down on crawlers recently. Maybe there was an unexpected configuration change? If Googlebot is crawling too fast, you can slow it down in Webmaster Tools.
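If the crackdown is aimed at fake crawlers, the documented way to tell genuine Googlebot traffic apart is a reverse DNS lookup on the requesting IP followed by a forward lookup to confirm it maps back to the same address. A rough sketch (the example IP in the comment is only an illustration):

    import socket

    def is_real_googlebot(ip: str) -> bool:
        # Reverse-then-forward DNS check for a claimed Googlebot IP.
        try:
            host, _, _ = socket.gethostbyaddr(ip)        # reverse lookup
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            _, _, addrs = socket.gethostbyname_ex(host)  # forward lookup
        except socket.gaierror:
            return False
        return ip in addrs                               # must resolve back to the same IP

    # Example usage: is_real_googlebot("66.249.66.1")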

I'm happy to answer any questions. This is a common issue.

Pierre




Why is it that "our algorithms think the site is dead" so soon, yet when I search for a keyword, I still get sites that are dead or no longer contain the given keyword? Bad algorithms, or is this "thinking dead" a time-delayed thing?

-----


Normally there's a lag between when a site goes dead and when we crawl the site to see that it's dead.

It's also tricky because you don't want a single transient glitch to cause a site to be removed from Google, so normally our systems allow a little wiggle room in case a site is just under unusual load or is down only temporarily.

-----


Not a webmaster, but would like to know: does/can the Google crawler system send a notification email to the registered webmaster of a website if something like this happens? This could be deemed unsolicited, but in the vast majority of cases it would be more than welcome!

-----


All these errors are reported in Webmaster Tools in the Diagnostics section. Verify your site and you'll have access to this and a wealth of additional data.

Also, we do send notifications in Webmaster Tools, and you can configure those to be delivered by email too. I'm not sure whether we send messages for these kinds of serious crawl errors, so I'll need to check. If not, that's an interesting idea I can ask the team to think about.

Thanks for the feedback :)

Pierre

-----



