That reminds me, have you removed the blocking function or does it only work in Chrome?
I tried to find a manual for a particular dishwasher today. Holy crap, were the search results useless. With the exception of one site that didn't load and one that was just a list of people asking if anybody else had the manual, all the other results on the first three pages were made-for-AdSense sites or sites designed to harvest email addresses.
Which would have been one thing, but not one of them actually had the damn manual (one tried to suggest the manual for a MacBook Air!?).
So it seems you still have a long way to go with Panda.
But the reason I ask is that I couldn't figure out how to ban those sites (I was using my iPad at the time).
For me, it works like this: you go to a crappy site, navigate back to the search results, and then a link "- Block all crappysite.com results" appears. It also mentions you need to be "signed in to search".
Why doesn't Google block sites manually (sites that are obviously bad)?
Off-topic, but I've always thought this is the main reason Google Blog Search isn't very good. In the quest to make an all-algorithm blog search, Google sacrificed having one at all. If they had simply restricted the search to blogspot.com, wordpress.com, and typepad.com, and applied the normal spam filters, it could have been pretty good!
This does not mean that every kind of discrimination is equally good. Demoting a site by algorithm (sensible disclaimers apply) looks much better in court than a decision made by a human. The latter is vulnerable to questioning based on (supposed) hidden motives.
>Both Stack Overflow and The Hyphenated Site (EE) sites are affected, EE in fact goes way down and Super User also notices a drop after reaching a record in the same time while Stack Overflow goes sharply down
As an indication, they're not so bad. They usually trail behind, but for big swings they seem to be good enough indicators. Both EE and SO went down after Panda and never recovered, as did tons of other sites, genuine or not. Meanwhile, a lot of spammy sites only grew larger: a ton of new (Russian) sites appeared in the index using subdomains (x.bla.com) instead of directories (bla.com/x), which now seems to work in fooling Google that they're 'quality', and sites with no content of their own (Markosweb is a nice example) kept growing as well.
So Google is a long way from achieving its goal, while good sites get punished.