I work at Google helping webmasters.
It seems something has been blocking Googlebot from crawling HN, and so our algorithms think the site is dead. A very common cause is a firewall.
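One quick way to see whether a firewall or server rule is treating Googlebot differently is to request a page with Googlebot's user agent and compare it against a normal browser request. A rough Python sketch (this only catches blocks keyed off the user agent; an IP-based firewall rule won't show up this way, since you'd be testing from your own address):

    import urllib.request
    import urllib.error

    # news.ycombinator.com is just the obvious example here; the
    # googlebot string below is the user agent Googlebot actually sends.
    URL = "http://news.ycombinator.com/"
    AGENTS = {
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                     "+http://www.google.com/bot.html)",
        "browser": "Mozilla/5.0",
    }

    for name, ua in AGENTS.items():
        req = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            resp = urllib.request.urlopen(req, timeout=10)
            print(name, resp.getcode())
        except urllib.error.HTTPError as e:
            print(name, "HTTP error", e.code)
        except urllib.error.URLError as e:
            print(name, "connection failed:", e.reason)

If the two requests come back with different status codes, something along the path is filtering on the user agent.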
I realize that pg has been cracking down on crawlers recently. Maybe there was an unexpected configuration change? If Googlebot is crawling too fast, you can slow it down in Webmaster Tools.
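By the way, if the crackdown involves blocking things that merely claim to be Googlebot, the verification we've recommended before is a reverse DNS lookup followed by a forward confirmation. A rough Python sketch (66.249.66.1 is just an example address; use one from your own access logs):

    import socket

    def is_real_googlebot(ip):
        # Step 1: reverse DNS. Genuine Googlebot IPs resolve to hosts
        # under googlebot.com (or google.com).
        try:
            host = socket.gethostbyaddr(ip)[0]
        except OSError:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Step 2: forward-confirm. The hostname must resolve back to
        # the same IP, or the PTR record could simply be forged.
        try:
            return ip in socket.gethostbyname_ex(host)[2]
        except OSError:
            return False

    print(is_real_googlebot("66.249.66.1"))

Anything that fails this check is safe to block without risking the real Googlebot.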
This is a common issue, and I'm happy to answer any questions.
It's also tricky because you don't want a single transient glitch to get a site removed from Google, so our systems normally allow a little wiggle room in case a site is just under unusual load or down only temporarily.
Also, we do send notifications in Webmaster Tools, and you can configure those to be delivered by email too. I'm not sure if we send messages for these kinds of serious crawl errors, so I'll need to check. If not, that's an interesting idea I can ask the team to think about.
Thanks for the feedback :)