
Dear Google: please let me ban sites from results - nervechannel
Given the current high-ranking thread about spammy sites in Google results, it strikes me that a very simple solution would be to let logged-in users blacklist sites.

Bam, no more wareseeker or efreedom.

This would solve a lot of people's complaints in one fell swoop.

There are greasemonkey etc. scripts to do this, but they're tied to a single browser on a single machine. A global filter (like in gmail) would be so much more useful.

Would this be particularly hard to do?
======
AndrewO
I see a lot of people asking what happens when a group of people downvote a
site just to ruin its ranking. Sure that's a problem, but there's an easy
solution on Google's end: your blacklist only affects you. Yes, that means all
of us have to hide efreedom ourselves. Doesn't seem like a problem to me...

Plus, we are talking about a company whose core business demands that it can
identify groups of bad-faith voters. Given time, they may find a way to
incorporate this data safely into the ranking data (if anyone could, it would
be Google).

And I know there are extensions to do this (mine mysteriously stopped working
recently), but doing this on the client-side in a way that's bound to a single
browser install just seems wrong to me, especially for Google.

~~~
prawn
"Yes, that means all of us have to hide efreedom ourselves. Doesn't seem like
a problem to me..."

efreedom is monetised by Google ads. Might seem like a problem to Google.

Let's say it starts with personal blacklists. Then trusted lists that you can
subscribe to (AdBlock-style). Then word spreads and enough people are using it
such that AdSense revenue drops 20-30% or more?

(IME, CTR on ads is much higher on these content-light sites than it is on
more reputable sites.)

~~~
danudey
To be honest, I think this is the reason Google doesn't have this feature. The
sites everyone wants to blacklist are the spammers that game Google search and
show Google ads. If they don't get traffic, they don't show ads. If they don't
show ads, Google doesn't get that money either.

It's to Google's benefit that people end up on these pages, see a ton of ads,
and then click on one out of confusion or desperation.

~~~
ericd
That's strange, though, because it should be clear to them that the long term
effect of this is pretty dangerous for them - if their search results were
always the best, then they would be largely unassailable in the search market,
but as it is, they're decreasing satisfaction with their product. It's hard to
improve everyone's perception of your product once that's happened, and it
opens them up to usurpers.

~~~
prawn
It does, but Microsoft's thrown loads of money at the problem and not made
huge inroads, so maybe they figure it's worth the risk?

------
SimonPStevens
No, it's not particularly hard, but it would make the problem worse.

Why?

99% of users are non-tech oriented.

Those users will not really be aware of the specific problems with the search
results, they won't understand the concept of a good vs bad result and they
certainly won't bother to tweak/ban/filter their results.

The 1% that do care and are currently being vocal about it will start
filtering their results and they will perceive that the problem is solved.
They will stop making a fuss.

So now, the complaints have gone away, but 99% of users are still using the
broken system, so the good sites that create good original content are still
ranking below the scrapers and spam results for 99% of the users.

The problem must be solved for all (or at least the majority) of users.

(And you can't take the 1%s filtering and apply it to all users in some kind
of social search because the spammers will just join the 1% and game the
system)

~~~
nervechannel
Do you think 99% of users are too stupid to click 'report spam' when they get
a spam email?

~~~
SimonPStevens
Not "too stupid" no.

I think 99% of email users have not been adequately trained in why or how they
should report spam, and even if they were I think most of them would still not
care enough to actually do it with any regularity.

When pushed many may acknowledge that they know it exists, they will probably
even be able to find the button when asked if given a chance. But they won't
remember to do it when they see spam, they'll just ignore it and move on to
the messages from people they know.

~~~
seabee
Do you think this could be improved by alternate wording? Instead of reporting
'spam', ask the user 'was this useful?' or 'did you want this?'.

After all, the real goal is giving people a better, more relevant experience,
detecting and removing spam is just one facet of that. Whether it's email or
search.

~~~
SimonPStevens
In general yes, I think improved UI can guide user behaviour to more desirable
outcomes.

However I don't think it's as simple as changing the button text. Even a
process-driven UI, like a wizard-style interface where you have to click Next
to progress through each step, might work at first, but users very quickly
become immune to dialogs. They don't read them; they just evolve the actions
that get them to their goal the fastest, and the user's goal does not include
reporting their spam.

~~~
patd
Maybe something more automatic: a user makes a search and clicks on a link;
if it is spam, he'll probably hit the back button within a couple of seconds.
The average time spent by a user on a result before going back to Google
could be used as a metric of quality.
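A back-of-the-envelope sketch of that metric in Python (the function name and the "short click" cutoff are my own assumptions, not anything Google has published):

```python
from statistics import median

def dwell_quality(dwell_times_sec, short_click_cutoff=5.0):
    """Summarize observed dwell times for one result URL.

    A "short click" is a visit where the user bounced back to the
    results page within a couple of seconds; a high share of short
    clicks suggests the page didn't satisfy the query.
    """
    if not dwell_times_sec:
        return None  # no data; leave the ranking untouched
    short = sum(1 for t in dwell_times_sec if t < short_click_cutoff)
    return {
        "median_dwell": median(dwell_times_sec),
        "short_click_rate": short / len(dwell_times_sec),
    }
```

A real system would also have to control for result position and query type, since users bounce quickly off even good pages for navigational queries.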

~~~
mthoms
I'd be very surprised if Google does not already implement this in some form.

------
al_james
Yes that would be good. They could then look at the number of people blocking
certain domains and de-weight them in the global results.

Traditionally Google seems to be against human-powered editing (which this
would be), but as the black-hat SEOs run rings around them, I think it's
needed _badly_.

~~~
eli
An extremely easy way for a bunch of people to get together and destroy
someone's ranking? That doesn't sound like such a good idea.

~~~
csomar
Given that Google gets hundreds of millions of searches and visitors, you'd
need hundreds of thousands of down-votes to get a site blacklisted. No
black-hatter can really do that (create 100K accounts/IPs to stay under
Google's radar and down-vote the website).

~~~
bromley
Hundreds of millions of searches and visitors in aggregate: yes.

Hundreds of millions of searches and visitors in any one keyword niche: not so
often.

Many websites live off a handful of visitors a day coming from a few core
keywords and associated long-tail traffic. For a keyword that only gets 100
searches a day, it wouldn't take many down-votes to affect the rankings of the
relevant sites.

~~~
moe
_For a keyword that only gets 100 searches a day, it wouldn't take many down-
votes to affect the rankings of the relevant sites._

Why do you assume a flawed implementation?

Naturally there would be thresholds. There is no reason to devalue a site
that's only displayed in 100 search-results/day at all.

The sites we want to hit are orders of magnitude worse at polluting the
results. We're talking about the Mahalos and expert-sexchanges of the world.
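The threshold idea could be as simple as this sketch (all numbers and names are invented for illustration):

```python
def should_demote(daily_impressions, distinct_blockers,
                  min_impressions=10_000, min_blockers=1_000):
    """Only consider demoting a domain once it has real reach.

    Sites shown in only a handful of results per day are never touched,
    so a few malicious votes can't sink a niche keyword's results.
    """
    if daily_impressions < min_impressions:
        return False
    return distinct_blockers >= min_blockers
```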

~~~
CWIZO
Is experts-exchange really a spam site? I always thought they had real users
who ask and answer questions, but that they're just a crap site that hides
this behind a paywall (I'm aware that the actual content is at the bottom of
the page).

~~~
roc
That sounds like trying to excuse junkmail and spam because someone out there
finds it useful and orders the products being offered.

~~~
CWIZO
I'm not trying to do that. I'm just arguing that classifying it as spam is not
fair. I remember (before SO) finding some useful info there from time to time.
Whereas junk mail trying to sell me a bride from Russia for 50€ is definitely
a hoax :)

------
radley
Google does provide this service: it's called Google Custom Search. You can
prioritize or blacklist sites and it's pretty easy to add it to your browser
searchbar. I don't always use it, but I'll switch to it when I encounter a
spammy topic, usually dev-related searches.

<http://radleymarx.com/blog/better-search-results/>

~~~
dejb
I can't believe this isn't the most popular comment. It kinda makes HN look a
bit like a knowledge vacuum, with all these recent discussions of how to ban
results when a Google service that's existed for nearly 5 years can do the
job. The format of the results isn't quite as nice as the normal Google
search, though.

~~~
nervechannel
Err, I mentioned this well before the comment above:

<http://news.ycombinator.com/item?id=2075437>

It's not so much that it's a knowledge vacuum, just that someone didn't read
the whole thread before replying.

------
Pewpewarrows
Gmail already does this: the global system uses an algorithm that looks at
reported spam in order to automatically move future emails from that sender
to the spam folder, not just for the person who reported it, but for
everyone.

If they're not looking into integrating that nicely into the existing search
results page (not a separate form that the average user will never find or
use), especially after all the internet chatter about it recently, then they
definitely should make that a top priority in 2011. I definitely don't want
them to do a rush job on it though. I don't want competitors to start
reporting each other as spam in search results to try and game the system even
further. I'm assuming they have anti-gaming measures in place for Gmail, so
they won't be completely starting that from scratch...

~~~
mtkd
I don't see how you could anti-game this; the SEOs would just use Mechanical
Turk to hire 100s of people (with valid Google accounts) to do the reporting.

At best G could use the information as a list of potential spammers and filter
domains manually, but I really can't see this being automated without giving
the SEOs another weapon.

~~~
moe
I don't think anyone wants to filter sites that could be gamed with "100s of
votes". We want to filter sites that will require tens of thousands of votes
to get rid of.

~~~
roc
And even then it should consider the votes themselves to be suspect and watch
for blocks of users who only vote in unison (qualitatively and temporally).
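One crude way to watch for bad-faith blocks of voters is to compare users' blacklists pairwise; this sketch flags near-identical vote sets (a real system would also compare timestamps, IPs, account age, and so on):

```python
from itertools import combinations

def suspicious_pairs(votes_by_user, min_votes=5, overlap_threshold=0.9):
    """Flag user pairs whose blacklists overlap almost completely.

    votes_by_user maps a user id to the set of domains they blocked.
    Pairs with Jaccard similarity above the threshold are suspect.
    """
    flagged = []
    for a, b in combinations(sorted(votes_by_user), 2):
        va, vb = votes_by_user[a], votes_by_user[b]
        if len(va) < min_votes or len(vb) < min_votes:
            continue
        if len(va & vb) / len(va | vb) >= overlap_threshold:
            flagged.append((a, b))
    return flagged
```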

------
pixelbeat
Google were experimenting with voting on results:
<http://techcrunch.com/2007/11/28/straight-out-of-left-field-google-experimenting-with-digg-style-voting-on-search-results/>

Also there is this form for reporting spam sites:
<https://www.google.com/webmasters/tools/spamreport>

Integrating the above into standard search results would be difficult unless
it was restricted to users with good "karma". That might be possible in our
increasingly socially networked world.

~~~
CWuestefeld
The thing is, the SO scrapers like efreedom aren't spam, strictly speaking.
It's just that they clone existing content without adding value, and as such
are just noise in the results.

Perhaps we need to frame the discussion differently, considering what the
searcher wants, rather than "spam-free hits".

~~~
nervechannel
That was my point really. _I_ don't want to see eFreedom hits, _I_ consider
them spammy, so I'd like to be able to click-ban them from _my_ results.

If Google use that information to gradually adjust their ranking overall, then
fair enough -- won't affect me, I can't see them anyway.

 _EDIT:_ Even if they _don't_ let that affect everyone else's results (because
of gaming), then I still don't care, I still don't see the crap in my results
ever again.

------
Luc
Also, I would like '[any widget] review' to take me to an actual review, not
pages upon pages of spam. I usually end up looking at comments on a few
trusted sites (e.g. Amazon). This seems broken...

~~~
CWuestefeld
Yes, most of the results for this query wind up pointing to pages saying "be
the first to review [any widget]".

As a workaround, try searching for "[any widget] sucks" and "[any widget]
good".

EDIT: tying this to other discussions on the topic, it's a symptom of
Patio11's observation that natural language search doesn't work very well. If
you want to find something, you need to paint a picture of what it looks like,
rather than asking a question about it.

~~~
tokenadult
I've noticed for years that "[product] hosed" brings up good results on how to
work around bugs in various software products.

------
djhworld
I think the worst culprits are the ones that skim StackOverflow questions and
rehash them into their own supposed original "question and answer" site

~~~
ergo98
What do you think most StackOverflow answers are? It's a karma-paid labor pool
where you can post questions and a lot of under-employed people will rush out
and do the necessary Google searches, collating and slightly rewriting the
results to yield the most votes.

Everyone is ripping off someone's content.

And just to be accurate here, SO content is creative commons (created by the
community). Are those just cheap words?

~~~
nervechannel
_What do you think most StackOverflow answers are? It's a karma-paid labor
pool_ ... said the comment on Hacker News, earning the poster 3 points so far.

~~~
ergo98
2 points. I started at one. I desperately hope for more, though, as this is
going to give me my big break. I'm trying to earn six more trophies, a ribbon,
and link this profile on my resume.

------
coffeedrinker
As programmers, our typical complaints are for sites that bog us down in
common (expert's exchange, stackoverflow scrapers, etc.).

What I found interesting: I was doing a search on something I normally have no
interest in (a sewing machine manual for my wife) and I was _amazed_ by the
level of spam I was encountering.

We have no idea how bad the problem is for others whose topics we do not
usually see. The web is far more full of spam than we even realize.

------
pragmatic
Proof that true AI is a long way off?

If the best and brightest (arguably) on the planet can't figure out how to
filter out search with algorithms, what makes us think we can mimic true human
intelligence any time soon. (I think it will happen, just not as soon as some
claim)

~~~
tel
Or maybe it just means that the day we can algorithmically emulate human
intelligence is the day spam becomes useful (or the day we get impossibly good
at fooling ourselves).

------
dawgr
That will never happen, if they ever did that it would be an admission that
there is something inherently wrong with their algorithm. They won't do it.

~~~
nervechannel
Nah. They let people 'report spam' in gmail -- they're happy to admit that
algorithm doesn't always guess right. Also there's already a "give us
feedback" mechanism for reporting bad search results, it's just too slow and
manual.

~~~
wahnfrieden
People are used to spam buttons, and would realize that the lack of "report
spam" would mean that Google might be marking too many things as spam if they
think the user doesn't need a "Mark as spam" button.

People are also a lot less tolerant of spam in their inbox than they are of
irrelevant search results.

------
andrewljohnson
I'd definitely make use of this feature. Some ancillary features might
include:

a) Google could warn you if it thinks the sites you have blacklisted seem to
have regained credibility.

b) Google could suggest additional sites you may wish to blacklist, based on
other user blacklists.

c) Google could allow outside parties to curate blacklists.

d) Google could list the most commonly black-listed sites publicly. For the
webmasters that find themselves listed who want to run an actual honest
business, this is a good sign they should change their tactics. For the folks
that aim to spam and profit... well screw those guys.

------
shimonamit
Maybe this could be implemented in the way of sticky search operators?

So for example, I could define -site:efreedom.com as an operator to be applied
silently for every search I make.
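Client-side, sticky operators could be as little as a query rewrite; a sketch (function name and defaults are my own invention):

```python
def apply_sticky_operators(query, sticky=("-site:efreedom.com",)):
    """Silently append the user's saved operators to every search,
    skipping any the user already typed themselves."""
    parts = query.split()
    extras = [op for op in sticky if op not in parts]
    return " ".join(parts + extras)
```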

------
hessenwolf
How many gmail accounts do we need to band together to lower the rank of stack
overflow against our super-duper question-and-answer site
QandAdsWithMe.annoying.com?

~~~
Sandman
An excellent point. Blacklisting works both ways, and there's nothing stopping
spammers from creating hundreds, maybe thousands, or even more, of throwaway
Google accounts just to blacklist the original site. Sure, the logged in
Google users wouldn't see the spam site (if they blacklisted it), but it would
still appear, and outrank the original site, in standard search results for
those that aren't logged in.

~~~
nervechannel
So don't use the blacklisting stats for re-ranking everyone else's then. I
didn't consider that when I put the post up. An instant way to filter sites
just for yourself would solve 90% of people's complaints straight away.

------
twir
Looks like a lot of people are assuming a solution would be some sort of
voting system, like Stack Exchange's.

Why not allow individual users to hide sites from their own search results and
save the info in their google account? For example, provide a "hide this site
from my results" link next to each result. Each person decides which site they
don't want to see and SEO and global results remain unaffected.
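The filtering step itself is trivial; a sketch of what a per-account hide list would do to a results page (all names here are illustrative):

```python
from urllib.parse import urlparse

def filter_results(results, blocked_domains):
    """Drop results whose host (or any parent domain) is on the user's
    personal blocklist; everyone else's ranking is untouched."""
    def blocked(url):
        host = urlparse(url).hostname or ""
        parts = host.split(".")
        # match "efreedom.com" against "www.efreedom.com" etc.
        return any(".".join(parts[i:]) in blocked_domains
                   for i in range(len(parts)))
    return [r for r in results if not blocked(r["url"])]
```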

~~~
teach
I feel like I'm taking crazy pills. Am I the only one that remembers this
EXACT feature on Google about a year ago? You had to be logged in to iGoogle,
and each search result had a small [X] to the right of it that would appear on
hover.

If you clicked it, that result wouldn't appear for you again. I used it all
the time.

Then, lately it's gone. Maybe I was part of a small, randomly-selected test
group?

~~~
mthoms
That was part of Google's SearchWiki experiment.
<http://googleblog.blogspot.com/2008/11/searchwiki-make-search-your-own.html>

That experiment was replaced by Google "Stars" in March 2010 because,
according to Google:

> In our testing, we learned that people really liked the idea of marking a
> website for future reference, but they didn't like changing the order of
> Google's organic search results.

<http://googleblog.blogspot.com/2010/03/stars-make-search-more-personal.html>

I personally think there is much more going on here than Google admits.

------
aquilax
Wasn't this a problem Google Search Wiki tried to solve?

<http://googleblog.blogspot.com/2008/11/searchwiki-make-search-your-own.html>

~~~
joeyo
Yeah. I miss searchwiki. :-(

------
davidk0101
I'm not sure how this would be implemented. Where would the blacklist be held
and how would it influence the search results? I know that they already do a
lot of search customization but most of it is just aggregate statistical
computations. It's not that they return results specifically tailored to you
but more like results tailored to a very fuzzy average version of you. A
blacklist seems way too specific to each user to be susceptible to meaningful
aggregate statistical operations like spam filtering which is one of the
reasons that spam filtering in google is so good. Each user contributes
something and everyone benefits. I don't see that happening with blacklists. I
think to make it worthwhile they would need to figure out how to feed the
information from blacklists into providing more meaningful results for
everyone.

~~~
nervechannel
That's the point, what I want is _exactly_ a user-specific blacklist.

I can even do that already with Google's Custom Search, all that's missing is
a little 'block this site' button. Instead I have to go and configure Custom
Search manually for each URL mask.

~~~
jimmyswimmy
You could write a little GreaseMonkey script or extension to do that,
shouldn't be too hard.

------
balakk
How about decentralizing the search page? Hear me out for a bit.

My theory is that these complaints are coming from specific interest groups,
not the general public. For example, spammy-content is created and targeted at
a developer/programmer audience, and that is the source of some of these
complaints.

So my suggestion is Google should platformize their search; and give out
dedicated search instances to specific communities. The community should have
enough levers to govern/influence what is spam or not. In addition, the
community can promote certain high-value resources, which are otherwise
unfairly listed in search results. Invite some high-profile communities for a
test-run, and let the communities make their own choices.

The public Google can still handle the general public. This can also bring in
some transparency in the way spam is determined.

------
charlesju
Here is a conspiracy theory for you guys.

1. How does Google make money? Search Ads.

2. How do people click on search ads? Bad real search results.

------
iwwr
In the interim, you can do your searches by adding -wareseeker -efreedom to
the search string.

~~~
nervechannel
I've discovered you can also set up a Custom Search Engine, with no included
sites (default to everything), and specifically exclude the sites you don't
want. Then do all your searches through this.

<http://www.google.com/cse/>

Usability-wise, though, it's not nearly as much use as a 'ban' button next to
each result would be. But it shows Google already have the infrastructure and
code that would allow this -- they just need to make it instant to use.

 _EDIT:_ The other downside of this is you lose a load of bells & whistles,
e.g. previews, "pages from the UK" (without typing), icons for
images/news/etc. Time will tell if I miss those.

~~~
srean
But it's not free, right?

~~~
JoachimSchipper
From the signup page:

"Standard edition - ads are required on search pages"

There's an ad-free premium version as well, but you definitely can get it for
free.

------
Tichy
Didn't Google have downvotes for results - shouldn't they be sufficient to
achieve the result you want? Presumably Google would learn that you
consistently downvote wareseeker and exclude it from results in the future.

I haven't used it because I don't want Google to remember my search history.
But if you are willing to stay logged into Google (which would be required for
your proposal), it would not be an issue.

------
joshrule
It seems that it might be more helpful to whitelist sites. The web grows too
quickly, and the mass of spam sites overwhelmingly so. If I had some way to
blacklist sites, I'd end up spending a lot of time doing so. In fact, it could
quickly take up most of my search time.

If, though, we could whitelist sites, it seems that results would get cleaner
faster. I don't care about how many bad sites are out there, as long as
helpful sites make it to the top. Plus, I typically use just a few sites to
access reliable information anyway (the number's about 7, right?), so if I can
whitelist results from those sites, I'll probably find my desired content more
quickly.

What about the case when there are 30 spam sites listed before 1 good site?
That hasn't happened too often for me. Instead, the results I'm looking for
are usually just 4 or 5 spots down the front page, and very occasionally on
the second page.

Whitelisting seems like it would still be faster and easier for now.

------
ScottWhigham
For those wanting Google to put a penalty on the sites who are banned/removed
from the user's view, what's to stop someone from gaming that system via Mech.
Turk (or some other way)? Just pay people $0.12 to open gmail accounts and ban
a competitor or whatever.

That's the only negative I can think of - other than that, I say bring it!

------
krschultz
I'd ban eHow.

~~~
slig
And mahalo.

~~~
bradly
And ExpertsExchange and About.com

~~~
CWuestefeld
I don't get the hating on About.com. There are sub-sites there that I actually
read intentionally, e.g., <http://heavymetal.about.com/>

This sure looks to me like real, original content.

EDIT: how about the courtesy of an explanation for the downvote?

~~~
j_baker
About.com's content is decent sometimes. The problem is finding it amongst all
the ads.

------
Sukotto
I want a search results page similar to the "Priority Inbox" we got recently
in gmail. Set sane defaults and let me override them with "Important / Not
important" buttons (or thumbs up/down or whatever) next to results.

Let it learn what I think is a good result for my needs.

If you make it a little bit social, make sure you weight other people's
opinions by how much they agree with my own in other areas (making it harder
for sockpuppets to muddy the waters)
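Weighting other people's opinions by how much they agree with yours could start from something as simple as this sketch (real collaborative filtering would be fancier; the names are invented):

```python
def agreement_weight(my_votes, their_votes):
    """How much another user's opinions should count toward my results.

    Votes map domain -> +1 (good) or -1 (spam). Users who have never
    rated anything I've rated get zero influence, which is what makes
    sockpuppet accounts cheap to ignore.
    """
    common = set(my_votes) & set(their_votes)
    if not common:
        return 0.0
    agree = sum(1 for d in common if my_votes[d] == their_votes[d])
    return agree / len(common)
```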

------
thinkbohemian
Does anyone remember when google had this feature?

Well, sort of: you could block individual results from coming up under a
specific search term.

There was a little x by each result if you were signed into google and it said
"never show this result again"

Not enough people used the feature for it to stick around...

I would love this ability but google please, good UI and consumer education. I
love your features but don't love when they get taken away because users don't
know they exist.

------
jeffg1
It doesn't seem like it would be hard, but if the rankings aren't driven by
money, then there will be attempts to game the system. The problem, I feel,
is money. As long as everyone has to compete for it (meaning money doesn't
work for the people, people work for money, in a system owned by the few),
we'll have shady marketers, shady products, spammers, etc., so I think it
will remain a cat-and-mouse game.

------
Rhapso
It seems like an obvious answer, but why not just use "-site:annoyingpage.com"
in your search? In fact "-TotallyUnRelated" has helped me narrow down searches
effectively too. You are asking for a feature that only a small subset of
users will benefit from and use; it makes more sense for Google to find a way
to rank sites better than it does to build an additional filter on top of the
current system.

~~~
jonhendry
Because I'd want to add a list of excluded sites for pretty much every single
query I do.

Would you want to type out a string of 20 or 30 excluded sites every time you
search Google?

Ranking clearly isn't going to be good enough, because algorithms can be
worked around and gamed.

------
serveboy
I use a Chrome extension called Google Search Filter which solves this exact
problem:
<https://chrome.google.com/extensions/detail/eidhkmnbiahhgbgpjpiimdogfidfikgf>

It lets me sync my config across multiple machines.

Has a nice hacker-ish config: basically a text file you can share with
others. This is my current config:

    # Make these domains stand out in results
    +en.wikipedia.org
    +stackoverflow.com
    +github.com
    +api.rubyonrails.org
    +apple.com
    +ruby-doc.org
    +codex.wordpress.org
    +imdb.com
    +alternativeto.net

    # SPAM - never show these results
    experts-exchange.com
    ezinearticles
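For illustration, parsing that config format is only a few lines (the semantics are inferred from the example above, not from the extension's source):

```python
def parse_filter_config(text):
    """Split an extension-style config into promoted and hidden domains:
    '+domain' promotes, a bare domain hides, '#' starts a comment."""
    promote, hide = set(), set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("+"):
            promote.add(line[1:])
        else:
            hide.add(line)
    return promote, hide
```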

------
cygwin98
Sounds to me like web search is not yet a solved problem. With hardware
(storage and memory) getting cheaper and cheaper, and enabling technologies
such as cloud computing emerging, building your own search engine may no
longer sound impossible. I wonder how feasible it would be to apply the
anti-spam algorithms that work well on email to web pages.
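As a toy illustration of carrying email anti-spam over to pages, here is a bag-of-words naive Bayes sketch (equal class priors assumed; every name here is invented):

```python
import math
from collections import Counter

def train(docs_by_label):
    """Fit per-class word log-probabilities with add-one smoothing.
    docs_by_label: e.g. {"spam": [...texts...], "ham": [...texts...]}"""
    counts = {label: Counter(w for doc in docs for w in doc.lower().split())
              for label, docs in docs_by_label.items()}
    vocab = set().union(*counts.values())
    model = {}
    for label, c in counts.items():
        total = sum(c.values()) + len(vocab)
        model[label] = {w: math.log((c[w] + 1) / total) for w in vocab}
        model[label]["__unk__"] = math.log(1 / total)
    return model

def classify(model, text):
    """Pick the class with the highest summed log-likelihood
    (equal priors, so they cancel out)."""
    scores = {label: sum(probs.get(w, probs["__unk__"])
                         for w in text.lower().split())
              for label, probs in model.items()}
    return max(scores, key=scores.get)
```

Against web spam this naive version would fall over quickly, since scrapers copy legitimate text wholesale; that is exactly why link- and behavior-based signals matter more here than they do for email.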

------
michaelhart
Google Domain Blocker: (userscript/greasemonkey), for those interested.

<http://userscripts.org/scripts/show/33156>

You can also sync them for Firefox across multiple machines using Dropbox, as
the preferences are stored in your profile (IIRC, in a javascript file).

------
diegob
Wouldn't implementing this feature be a tacit admission that there's a problem
with search results?

------
coffee
_"This would solve a lot of people's complaints in one fell swoop."_

And doing this would _spawn_ a lot of people's complaints in one fell swoop.

If you owned a site, and created enemies, they could band together and flag
your site as spam.

~~~
emef
I don't think you understood his suggestion. "Banning" a site would be local
to your signed-in google account, not a global ban from results (which would
indeed suffer the backfire you mentioned.)

~~~
coffee
doh!

You are totally correct. I completely missed that. Maybe I should drink a bit
of coffee and wake up ;)

Although... I have a suspicion that at some point it would affect non-logged-
in users. Many logged-in users banning a site is a signal that may affect the
global results. Maybe in the same way as marking spam in Gmail does...

------
pilom
Startup idea: Create a service around Google Custom Search. Select the
"Search the entire web but emphasize the selected sites" option, then create
a GUI to allow people to prioritize or ban their search results.

------
scotty79
In the old days we had killfile. Why can't we PLONK content sources like
authors or sites by handles like nicks or domain names? There should be some
standard protocol for that. Httplonk.

------
RP_Joe
So what we are talking about is censorship. You are suggesting a non-
traditional type where a government does not do the censoring, but a few
people do. How many votes would it take to put a website on a blacklist? 50,
100?

Who decides if a site is spam?

So is free speech dead under your proposal? What if I build a site that
criticizes the Governor of your state, or a federal agency? What would
prevent my site from being blacklisted in your proposal? Even if I had great
content (your argument is about poor-quality content), my site could be voted
into a black hole in a few hours. Let's think about this carefully. Is that
the price we are willing to pay to get rid of EE?

~~~
nervechannel
Did I say _anything_ in the OP about my blacklist affecting other users?
Please read before ranting.

------
ajayjapan
My question is why stackoverflow hasn't banned efreedom yet?

~~~
kqueue
SO provides their database content for free under creative commons license.
efreedom is not doing anything illegal.

------
alexobenauer
It is sad, though, that it speaks volumes about how fed up we are with all
the garbage in so many of our search queries.

I do hope those working on _the algorithm_ are taking note.

------
pilooch
You can do it with Seeks... <http://www.seeks-project.info/>
<http://www.seeks.fr/>

On your local machine and/or a remote server... and it's free software.

Blekko? Try this query: <http://blekko.com/ws/?q=debian>

------
richbradshaw
Just use Google SearchWiki.

Oh, yeah – they pulled it.

------
hoofish
The problem I have with this is that black-hat people could do this to any
site they feel they're competing with. What would prevent someone from
blacklisting a legitimate blog or website just because they didn't like the
content?

~~~
nervechannel
So what? All that would mean is that the site wouldn't show up in _their_
results.

I never said anything about it affecting other people's results...

------
byron8
I like this idea; many of the top results are spam, and the results that
actually help me appear two or three pages later. I'd really appreciate being
able to ban results that are irrelevant or that I consider spam, thanks.

------
forkrulassail
YES. Like the useless chromeextensions.org

This would be an awesome feature.

------
stretchwithme
Great idea. Let this be the first question asked at any Google event.

In fact, let there be a sea of hands all gesticulating wildly to present it.

------
eliben
Can't this be done with a browser plugin?

~~~
nervechannel
Like I said in the OP: "There are greasemonkey etc. scripts to do this, but
they're tied to a single browser on a single machine. A global filter (like
in gmail) would be so much more useful."

------
podperson
Simply add -site:foo.com to your search request.

And no, this doesn't solve the problem.

------
AussieChris
blekko.com is doing this and much more.

------
foljs
And no more bloody experts-exchange...

~~~
shrikant
I honestly don't mind that site - about half the time I search for an issue, I
find it's been resolved by someone over at Experts Exchange.

Do people realise that if Google is your referrer, you can scroll all the way
down and see the solutions to the question?

~~~
jawns
It seems like every few months, EE tries to cloak its site in a new way ...
then Google catches them and they revert to merely misleading people (by
putting the answers several page-scrolls down) into thinking that they need to
pay to see the answers.

Actually, I sort of sympathize with the predicament EE faces. They want to
show up high on Google search results, because that's how they get new
customers ... but they don't want to give away their content for free.

Here's an opportunity for a search engine start-up: Allow users to search (by
default) for only free content -- but also allow them to search, if they so
choose, for content that's behind a paywall. Pay-for-access sites would love
something like this.

~~~
nervechannel
_Actually, I sort of sympathize with the predicament EE faces. They want to
show up high on Google search results, because that's how they get new
customers ... but they don't want to give away their content for free._

I have no sympathy _at all_ for them. All the answers are community-generated
for free, aren't they? So they're trying to charge for other people's
generosity. And plenty of sites on the web give away better content than
theirs for free, with no attempt to trick people into paying.

Their business model is broken and I'm surprised they've lasted this long.

~~~
Sandman
I thought that the experts that answered the questions got paid? Guess I was
wrong...

------
alnayyir
<https://chrome.google.com/extensions/detail/ddgjlkmkllmpdhegaliddgplookikmjf>

Is there something I'm missing here?

It's not in Google's financial interest to provide this feature, but it
already exists rather trivially.

~~~
nervechannel
Still tied to a specific browser though. Which isn't available on all
platforms.

~~~
alnayyir
Which platforms are those? I'm using Chrome on Windows, Mac and Linux. It can
be run on FreeBSD if you're willing to deal with the bridge troll running the
show there.

<http://i.imgur.com/1Amu3.png>

Do you really think this feature doesn't exist for Firefox?

Further, it'll even sync with your google account making it global if you give
it access.

~~~
nervechannel
Well iOS for one.

Also much as I like Chrome, I can't run it on my Linux box because the font
rendering is _terrible_ and it lets sites' font choices supersede the user's.
That means I can't overrule their painful font choices with ones that look
good, like I can in Firefox.

But that's another rant...

~~~
alnayyir
>iOS for one

 _whistles and pops finger_

> it lets sites' font choices supersede the user's

Googled it, found this in 5 seconds. I'm sure there are at least 3 other ways
to do it, one of them while rubbing your tummy and patting your head.

<http://www.google.com/support/forum/p/Chrome/thread?tid=21218cb9950ed044&hl=en>

------
svlla
I'd like to see an option for searching only ad-free sites, or perhaps just
sites that don't use AdSense, as well. Surely Google would have no problem
with that.

------
GrandMasterBirt
Use duckduckgo.com. It's pretty good at excluding spam, and with a new
service there is an indicator of how spammy a site is.

