Has the IA ever discussed why they apply robots.txt retroactively? I can see the rationale for respecting it at crawl time (though I don't necessarily think that's the best idea given the IA's goals), but applying it retroactively has always felt unnecessary to me.