
Effectively banning robots is the last thing you want to do.


It's better than having the site be completely inaccessible. You'd only deploy this when the site comes under heavy load. Additionally, many of the more important robots can be identified reliably (e.g., a legitimate Googlebot can be positively identified with a reverse-then-forward DNS check), and even if you get a false positive, a bot whose packets are all dropped is likely to treat it as a temporary failure and come back later.
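
The check Google documents works like this: reverse-resolve the claimed Googlebot IP, confirm the resulting hostname is under googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP (otherwise the PTR record could be spoofed). A minimal sketch in Python using only the standard socket module (the function name is my own):

    import socket

    def is_legit_googlebot(ip: str) -> bool:
        """Verify a claimed Googlebot IP via reverse-then-forward DNS."""
        try:
            # Reverse lookup: the PTR record must sit under Google's
            # crawler domains.
            hostname, _, _ = socket.gethostbyaddr(ip)
            if not hostname.endswith((".googlebot.com", ".google.com")):
                return False
            # Forward lookup: the hostname must resolve back to the
            # same IP, or the reverse record proves nothing.
            _, _, addresses = socket.gethostbyname_ex(hostname)
            return ip in addresses
        except (socket.herror, socket.gaierror):
            # Either lookup failed: treat the client as unverified.
            return False

In practice you'd cache the result per IP, since doing two DNS lookups per request would itself be a load problem.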



