
Could you use some sort of sitemap or other way to provide the data to Google that isn't so damaging to site performance? Or in Google Webmaster tools turn down the rate of crawling?

Just realized that this could be a problem for lots of sites, and I'm curious as to what the best solution is, since not everyone has Matt Cutts reading their site and helping out.
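For anyone unfamiliar with the sitemap suggestion above: a sitemap is an XML file you submit through Webmaster Tools that tells Google which URLs exist and how often they change, so the crawler doesn't have to discover everything by hammering the site. A minimal sketch (the URL is a placeholder, and `changefreq` is only a hint, not a guarantee crawl rate will drop):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; example.com is a stand-in -->
  <url>
    <loc>http://example.com/some-page</loc>
    <lastmod>2011-03-28</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```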




We do have a self-service tool in webmaster tools for people who prefer to be crawled slower.


Do you support/respect the "crawl-delay" directive?

http://en.wikipedia.org/wiki/Robots_exclusion_standard#Crawl...
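For context, "crawl-delay" is a non-standard robots.txt extension that some crawlers (e.g. Bing and Yandex) honor as a minimum wait between requests. A typical usage sketch (the 10-second value is just an illustration):

```
# Ask compliant crawlers to wait 10 seconds between fetches
User-agent: *
Crawl-delay: 10
```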


Nope. As mentioned above, apparently Google thinks people will "shoot themselves in the foot" with the crawl-delay directive, while they won't with Google's special interface (which requires registering and logging in).


I can't imagine that they are just guessing about this. I'm sure someone tried implementing it and was horrified at the actual results before they gave up on it.


What more can you expect from the World's Largest Adware Company?

Seriously, one thing about Google is that they seem to really like ensuring people are logged in, preferably at all times. Fortunately, recent changes to Google Apps (promoting Apps user accounts to full Google accounts) have made this more complex on my side and probably degraded the level of actionable info they can get out of it.


or faster: http://news.ycombinator.com/item?id=2382728



