Except this has been going on for months, has happened repeatedly, and violates standards like robots.txt that exist precisely to prevent it. I'd expect that from an individual or a startup, but not from a large company that's been running spiders for a long time.
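For what it's worth, honoring robots.txt takes almost no effort; Python even ships a parser in the standard library. A minimal sketch (the rules and URLs below are made up for illustration):

```python
# A well-behaved spider checks robots.txt before fetching a URL.
# urllib.robotparser is in the Python standard library.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; a real crawler would fetch
# https://example.com/robots.txt instead of hardcoding rules.
rules = """
User-agent: *
Disallow: /private/
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("ExampleBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("ExampleBot", "https://example.com/public/page"))   # True
print(rp.crawl_delay("ExampleBot"))                                    # 10
```

That's the whole compliance story: one check per URL plus respecting the crawl delay, which is why ignoring it reads as a choice rather than an oversight.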