My favorite, by far, is: http://www.google.com/#q=recursion
Edit: found the original interview:
Starting at 13:45, Sergey and Larry try to explain recursion and idempotence on Fresh Air.
Great example of whitelists vs blacklists. :)
Disallow: tells the robot that it should not visit those particular pages on a site. Or, in this case, that it should not terminate those particular individuals.
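For anyone curious about the mechanics, here's a minimal sketch using Python's standard urllib.robotparser, which implements exactly this User-Agent/Disallow matching. The rules below are the ones from Google's killer-robots.txt as I recall them; treat the exact contents as an assumption.

    from urllib import robotparser

    # Rules from Google's killer-robots.txt (quoted from memory):
    # two Terminator user agents, each barred from the founders' pages.
    KILLER_ROBOTS_TXT = """\
    User-Agent: T-1000
    User-Agent: T-800
    Disallow: /+LarryPage
    Disallow: /+SergeyBrin
    """

    rp = robotparser.RobotFileParser()
    rp.parse(KILLER_ROBOTS_TXT.splitlines())

    # The Terminator user agents are disallowed from those paths...
    print(rp.can_fetch("T-800", "/+LarryPage"))      # False
    print(rp.can_fetch("T-1000", "/+SergeyBrin"))    # False

    # ...while any other robot is unaffected (no wildcard group exists).
    print(rp.can_fetch("Googlebot", "/+LarryPage"))  # True

Note that robots.txt is purely advisory: a well-behaved crawler (or killer robot) checks can_fetch before acting, but nothing enforces it.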
I've personally worked with some brilliant embedded software developers and compiler writers who likely wouldn't know what a robots.txt file is, because they've focused their skills and talents elsewhere. Despite that, they're still far more technically capable than even the best web developers I've ever encountered.