
One of the signals that we've said that we use in the Panda algorithm that launched in April is how many users blocked a particular site.

A new launch last week is that you can now import blocked sites from Chrome into Google.com. That way your blocked sites will work wherever you sign in. More info: http://insidesearch.blogspot.com/2011/10/export-your-sites-b...

That reminds me, have you removed the blocking function or does it only work in Chrome?

I tried to find a manual for a particular dishwasher today. Holy crap, were the search results useless. With the exception of one site that didn't load and one that was a list of people asking if anybody else had the manual, all other results on the first three pages were made-for-AdSense sites or sites designed to harvest email addresses.

Which would have been one thing, but not one of them had the damn manual (one tried to suggest the manual for a MacBook Air!?).

So it seems you still have a long way to go with Panda.

But the reason I ask is that I couldn't figure out how to ban those sites (I was using my iPad at the time).

For me, it works such that you go to a crappy site, navigate backwards to the search results and then a link "- Block all crappysite.com results" appears. It also mentions you need to be "signed in to search".

It would be nice if it also worked when you open a site in a tab too.

You can directly block sites here: http://www.google.com/reviews/t

Which dishwasher manual were you searching for? I'd like to do a comparative study of the various search engines and see what I can get.


My search terms were:

    WQP8 9001 user guide

    WQP8 9001 manual

Why doesn't Google block sites manually (sites that are obviously bad)?

Off-topic, but I've always thought this is the main reason Google Blog Search isn't very good. In the quest to make an all-algorithm blog search, Google sacrificed having one at all. If they had simply restricted the search to blogspot.com, wordpress.com, and typepad.com, and applied the normal spam filters, it could have been pretty good!

Maybe you have never heard of the antitrust investigation into Google, in which various sites have suggested that Google already manipulates its index.

A search engine's function is site discrimination.

This does not mean that every kind of discrimination is equally good. Demoting a site by an algorithm (sensible disclaimers apply) looks much better in court than a decision made by a human. The latter is vulnerable to questioning based on (supposed) hidden motives.

May I draw your attention to this Stack Exchange question, about Google Chrome blocking (as a potential phishing site) a question on Security Stack Exchange?


If Google is blocking a site would that affect their ranking?

Is this the only reason why Stack Exchange has been hit or are there other big factors too?

It'd be interesting to know.

Experts Exchange, not Stack Exchange

sharp drop in alexa rankings in 2q 2011 - (http://meta.stackoverflow.com/q/110788/167162)

>Both Stack Overflow and The Hyphenated Site (EE) sites are affected, EE in fact goes way down and Super User also notices a drop after reaching a record in the same time while Stack Overflow goes sharply down

Obviously, Alexa sucks, and all web stats suck.

As an indication they are not so bad. They usually trail behind, but for big swings they seem to be good enough indicators. Both EE and SO went down after Panda and never recovered, as did tons of other sites, genuine or not. Meanwhile a lot of spammy sites only grew larger: a ton of new (Russian) sites appeared in the index using subdomains (x.bla.com) instead of directories (bla.com/x), which now seems to work in fooling Google into seeing 'quality', and sites with no content of their own (Markosweb is a nice example) kept growing.
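The subdomain-versus-directory distinction mentioned above can be made concrete. As a minimal sketch (the domain bla.com is just the placeholder from the comment, and the classifier is a hypothetical illustration, not anything Google is known to use), here is how one might tell the two hosting styles apart with Python's standard urllib.parse:

```python
from urllib.parse import urlparse

def hosting_style(url: str, base_domain: str) -> str:
    """Classify whether a URL's content lives on a subdomain of
    base_domain (x.bla.com) or under a path on the base domain
    itself (bla.com/x)."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if host == base_domain:
        # Same host: content distinguished only by its path.
        return "directory" if parsed.path.strip("/") else "root"
    if host.endswith("." + base_domain):
        # Different host under the same registered domain.
        return "subdomain"
    return "external"

print(hosting_style("http://x.bla.com/", "bla.com"))  # subdomain
print(hosting_style("http://bla.com/x", "bla.com"))   # directory
```

The commenter's point is that a ranking system treating each subdomain as an independent site can be gamed by spreading thin content across many subdomains, whereas directories are more easily judged as part of one site.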

So Google is a long way from achieving their goal while good sites get punished.

Yes, sorry about that. I meant Experts Exchange.
