* A canonical tag that shows up due to a bug. Dead.
* Industry terms that appear duplicative to the NLP parser. Dead.
* Too-diverse content. Lower rankings, but functionally dead.
There's tons more, but I'm not a practitioner of SEO.
People whining about how they're hurt by losing search position inevitably turn out to be spammers or other lowlifes who have been using "SEO" or some other shady practice to artificially inflate their search position.
So yeah, Google is bad for con-men. Darn.
I have a slang dictionary site that's currently being penalized by Google for showing citations of slang use gathered (by hand) from media.
With the penalty, Google sends a third less search traffic. And I whine about it. But I'm not a con man.
Sounds like you know an awful lot about SEO and consequences for not being a "practitioner."
The entire article seemed to be one giant raving contradiction of itself. Companies don't want to hire good people (or for some reason have ignored what those people have said), have chosen to keep doing things the way that keeps getting them negative results, and then moan about it. What happened to the voting website is the very thing that Bing would do to them (and probably has done to them).
Now admittedly, Google tends to promote its services above those of other companies, should they be in competing markets. It's unclear if this is by accident (Google engineers probably know the best Google SEO methods, after all) or by design. At worst, if it is done by design, then Google is no worse than any other large company. Remember when all the big supermarkets started producing their own products at cut-rate prices? Before, when you walked into a Publix, you had 10 ft of shelf space devoted to nationally branded ranch dressing. Now you have 5 ft devoted to Publix brand ranch dressing and 5 ft devoted to everything else. So if you are Kraft, you're seeing your [eye-ball] search traffic reduced by half, and your competitor now has a much lower price to boot.
Why is it that Kraft doesn't care about that? They don't care because they have spent the last half decade building their brand and their customer pool. They know that if their loyal customers go into a supermarket, they will first see the cheaper supermarket-branded ranch dressing; then they will see the high-priced Kraft and buy Kraft. I can easily extend this analogy to many of the situations presented in the article. When Google enters a market, they are the underdog, just like when Publix decides to copy another nationally branded product and sell it in its stores. Even if Publix devoted 9 ft of shelf space to Publix brand ranch and 1 ft to national brands, Publix still wouldn't capture 90% of the market.
Cutts clearly states that they are giving preferred treatment to a high-visibility site such as Gawker, while normal websites (i.e. internet peasants) would have been de-indexed in the same situation. Cutts' full-time job is communicating ranking-related issues, so I am quite certain this would have sounded a lot different if it had been implemented as a general rule for disaster-ridden IP ranges, etc.
As a genuine question, do you have any evidence for the de-indexing time of normal websites? (And evidence that they aren't restored as soon as the website itself is restored?)