
How Google uses blacklists, algorithm tweaks and contractors for search results - tysone
https://www.msn.com/en-us/money/companies/how-google-interferes-with-its-search-algorithms-and-changes-your-results/ar-BBWOCm2
======
williamDafoe
Bullet #3: Yes, they keep blacklists; I worked on the web crawler for many
years. But the article does not understand or differentiate between blacklists
for content farms, spam domains, link farms, infinite spaces (that aren't
calendars), etc. Blacklists are low-level URL regexps. In ~2015 some spammer in
China overnight created 100 million websites to boost priority, and each page
had 1000 links. Google saved them all! This would literally crush the web
crawler, slowing it by 20,000x or more, like a snake swallowing an elephant. Or
it would crash the crawler, and we'd have to write manual code to cleanse the
search logs and add blacklists right away to keep it from happening again.
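
To make "low-level URL regexps" concrete, a crawler-side blacklist pre-filter might look something like this. This is a toy Python sketch: the patterns and domain names are invented for illustration, not actual blacklist entries.

```python
import re

# Hypothetical examples of the three categories mentioned above.
BLACKLIST_PATTERNS = [
    re.compile(r"^https?://[^/]*\.spam-farm-example\.com/"),  # spam domain
    re.compile(r"/calendar\?month=\d+&year=\d{5,}"),          # infinite space: far-future calendar pages
    re.compile(r"^https?://link-farm-\d+\.example\.net/"),    # machine-generated link-farm hosts
]

def is_blacklisted(url: str) -> bool:
    """Cheap regexp check applied before a URL ever enters the crawl frontier."""
    return any(p.search(url) for p in BLACKLIST_PATTERNS)

def filter_frontier(urls):
    """Drop blacklisted URLs so they never reach the fetcher."""
    return [u for u in urls if not is_blacklisted(u)]
```

Because the check runs before fetching, a hundred million junk URLs cost only a regexp match each instead of a network round trip.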

Blacklisted domains are blacklisted forever, because they crash the crawler.
This happens one to three times per year. Changes here are all tracked, and
only owners of the first stage of search (who have no connection to the
ranking algorithm) have change rights.

Bullet #5: search quality is assessed by thousands of contractors worldwide,
and you can become part of the crawl quality team, although it doesn't pay
that well in the USA. The book they follow is 180 pages and has been available
on the Google website for many years. It has guidelines for how to determine
sexual, offensive, or illegal (or child porn) content in ALL countries. It has
guidelines on how to rank news source reputation and credibility.

~~~
baybal2
> Bullet #5: search quality is assessed by thousands of contractors worldwide
> and you can become a part of the crawl quality team, although it doesn't pay
> that well in the USA. The book they follow is 180 pages and available on the
> Google website for many years. It has guidelines for how to determine
> sexual, offensive, or illegal (or child porn) content in ALL countries. It
> has guidelines on how to rank news source reputation and credibility.

Funny factoid for you: Google hires those guys in Russia through recruiters
and innocuous-sounding shell companies in Saint Petersburg.

The guy can work for the shell for years without realising that he works for
Google.

~~~
jhanschoo
> Funny factoid for you: Google hires those guys in Russia through recruiters
> and innocuous-sounding shell companies in Saint Petersburg.

> The guy can work for the shell for years without realising that he works for
> Google.

Your comment makes this practice sound shady. I don't see how this practice is
even plausibly shady, if the general thrust of what you say is true.

First, you have to set up a different company for each country where you're
doing business. Second, it's not a shell company if it hires employees or
contractors and has clients. Finally, if the subsidiary doesn't engage in your
primary business activity, you don't give it a name similar to the parent
company's.

It's a very common corporation structure.

~~~
baybal2
Well, at one point a recruiting company acting on their behalf tried
recruiting me for an ops position. After seeing a metres-long NDA in perfect
English and Russian, and them being extremely tight-lipped about whom the job
was for and what it involved, it was very clear to me that it was Google,
given that I knew the recruiter was one of the few with whom Google works in
Russia.

I asked, "Google?" That raised their eyebrows, but they said they couldn't
answer.

~~~
austhrow743
That's common recruiter behaviour. A lot of them are very afraid of being cut
out of the deal.

~~~
baybal2
The point is, Google doesn't hire directly there. Officially, they closed
their Saint Petersburg office years ago, and it is "just 3rd-party contractor
companies" doing things for them there.

------
creaghpatr
Here are the findings of the investigation, according to the article:

>More than 100 interviews and the Journal’s own testing of Google’s search
results reveal:

• Google made algorithmic changes to its search results that favor big
businesses over smaller ones, and in at least one case made changes on behalf
of a major advertiser, eBay Inc., contrary to its public position that it
never takes that type of action. The company also boosts some major websites,
such as Amazon.com Inc. and Facebook Inc., according to people familiar with
the matter.

• Google engineers regularly make behind-the-scenes adjustments to other
information the company is increasingly layering on top of its basic search
results. These features include auto-complete suggestions, boxes called
“knowledge panels” and “featured snippets,” and news results, which aren’t
subject to the same company policies limiting what engineers can remove or
change.

• Despite publicly denying doing so, Google keeps blacklists to remove certain
sites or prevent others from surfacing in certain types of results. These
moves are separate from those that block sites as required by U.S. or foreign
law, such as those featuring child abuse or with copyright infringement, and
from changes designed to demote spam sites, which attempt to game the system
to appear higher in results.

• In auto-complete, the feature that predicts search terms as the user types a
query, Google’s engineers have created algorithms and blacklists to weed out
more-incendiary suggestions for controversial subjects, such as abortion or
immigration, in effect filtering out inflammatory results on high-profile
topics.

• Google employees and executives, including co-founders Larry Page and Sergey
Brin, have disagreed on how much to intervene on search results and to what
extent. Employees can push for revisions in specific search results, including
on topics such as vaccinations and autism.

• To evaluate its search results, Google employs thousands of low-paid
contractors whose purpose the company says is to assess the quality of the
algorithms’ rankings. Even so, contractors said Google gave feedback to these
workers to convey what it considered to be the correct ranking of results, and
they revised their assessments accordingly, according to contractors
interviewed by the Journal. The contractors’ collective evaluations are then
used to adjust algorithms.

~~~
WalterGR
_Despite publicly denying doing so, Google keeps blacklists to remove certain
sites or prevent others from surfacing in certain types of results. These
moves are separate from those that block sites as required by U.S. or foreign
law, such as those featuring child abuse or with copyright infringement, and
from changes designed to demote spam sites, which attempt to game the system
to appear higher in results._

Google has a permanent demotion applied to a site I've run since 1996:
[http://onlineslangdictionary.com/](http://onlineslangdictionary.com/) . I
estimate that my traffic would be 2.5x - 3x what it is now, were the demotion
not in place.

These demotions are hidden, permanent, and cannot be appealed. Moreover, these
demotions can be performed _by hand_ internally within Google - for whatever
reason they choose, or for no reason at all. That is to say, some demotions
are manual and not automated.

I have never been _officially_ notified that the demotion exists, in any of
Google's available tools or any other way. However, a Google employee checked
the internal status of my website and there is, indeed, a permanent demotion
in place.

There is no reason for my site to be demoted. This demotion was put in place
while Matt Cutts was the head of the web spam team. I asked him about it here
on HN, and he lied about it. (I know he lied because of my communication with
the Google employee.) You can read my thread with Matt here:
[https://news.ycombinator.com/item?id=5408087](https://news.ycombinator.com/item?id=5408087)
.

I'd like to make the email chain between the Google employee and me public.
But I don't want to ruin someone's career / life just because they did the
right thing and told me about the penalty.

So... I don't know what to do. Thoughts?

~~~
zepto
That thread with Matt Cutts looks terrible for Google.

In it he admits that the site is being penalized for having prominent ads
above the fold.

90% or more of Google’s revenue comes from presenting prominent ads above the
fold.

How is demoting the organic results of sites that use the same business model
that it does not the epitome of anti-competitive behavior?

~~~
im3w1l
Putting ads above the fold makes a site worse for users. This doesn't mean
it's immoral, and it doesn't mean it's the wrong thing to do. You are trading
off the amount of value you provide against your ability to capture a portion
of that value. Capturing value can be essential to the continued existence of
the site, and to the ability to create more content.

However, Google search is acting on behalf of the users, trying to find them
the result that brings them the most value. And all else being equal, that is
the one without ads above the fold.

~~~
zepto
The argument that Google is acting on behalf of users is contradicted by the
fact that they put their own ads above the fold.

------
Nasrudith
Google can't interfere with its own search algorithms by definition. It is
their design and theirs alone. The accusation isn't merely wrong but
generically impossible.

The whole goddamn point of a search engine is to privilege certain results
over others. The claims of armies of contractors reek of the zombie lies of
their persecution complex.

~~~
ineedasername
Sure, you can define "algorithm" as "whatever Google does" in this way, but
that misses the point: the point isn't that they violate the algorithm; that
is just a semantic convenience in presenting the article. The point is that
they privilege and editorialize the results in ways they have claimed they do
not.

------
chadmeister
We should rather be fighting for competitive alternatives and looking more at
if/where Google uses its market dominance to stymie competition. That would
be a more fruitful discussion, IMHO.

~~~
jammygit
What alternatives do you all use? I like DuckDuckGo, but it still depends on
existing engines behind the scenes. There are also Qwant and SearX, but I
haven't used either much.

Surely there is some FOSS project that has been promising, somewhere?

~~~
charlesism
I don’t understand why Google’s competitors don’t form an independent search
engine. If I were Microsoft, I’d talk to Apple and others to see if they would
help fund a spun off Bing.

The internet badly needs a big alternative search engine that isn’t beholden
to advertisers or dependent on a single corporate owner.

The benefit of such a search engine (whose main incentive is to just be a good
search engine) is obvious for the public, but would also give companies who
rely on their own OS leverage against Google.

~~~
Nasrudith
Conflation of want and need aside, how exactly would this work? Let's assume
they have said niche. How are they going to scale funding to actually provide
for it once they're big?

Paid-per-search? That discourages curiosity, or discourages using the
alternative at all.

A deep-pocketed sponsor? Then the sponsor has the control.

User donations are the biggest "maybe" I can see, which would be no worse,
but depends upon charity and campaigning to some degree.

~~~
charlesism
If you mean “conflation of what customers want and what companies need,” that
is another way to express “customer focus.”

If companies like Apple and Microsoft care about providing a great user
experience, Google search is risky. I think users would prefer not to see ads
when they search, or worry about Google harvesting their data. If this is so,
it might be worth it to fund some sort of independent “search foundation.”

I reckon a simple text-only search engine – like Google before it jumped the
shark – would actually be quite cheap to develop and operate.

~~~
v7p1Qbt1im
I can tell you from experience that everyday users don't care that much about
seeing ads. Some even like them. They care even less about the possibility of
manipulated results. It's also a really big topic to explain.

Then there's the question of funding for this hypothetical search engine. It
being "fair and objective," or even completely transparent about its ranking,
would mean it'd be SEO'd into oblivion by everything from click/content farms
to trolls to more nefarious actors. As long as many on the internet want to
make money or manipulate people somehow, it's not really doable in my opinion.

~~~
charlesism
Am I doing your comment an injustice if I paraphrase it as follows?:

- Ads do not adversely affect customer satisfaction.

- The combined forces of Microsoft, Apple and others could not create a
serviceable search engine (despite Microsoft alone having already made one).

If not, let’s just agree to disagree :)

~~~
v7p1Qbt1im
Ads are looked at as either a nuisance and necessary evil, or just part of
how the "free web" pays for itself.

I'm sure they could. And, as you say, MS does already, supported with ads as
well unless I'm mistaken. Are you suggesting they offer a search engine and
subsidize it? That they offer it as part of their ecosystem benefits, sort of?

~~~
charlesism
Yes, that’s what I was thinking.

I can’t disagree that users tolerate ads, since advertising is the model of
plenty of successful websites. It’s just that, like bundled OS crapware, the
user experience is better without it.

------
buboard
Is there research on how to create a distributed search engine ?

~~~
smashingfiasco
I’d love to see search decentralized and have been toying with this idea for a
while.

The last concept I hacked together was a custom search plugin for Grav and a
command line util to use for querying.

It goes like this.

Use the command line util to search a term. The util runs that term against
the search engine _inside_ the website's CMS itself. You essentially have a
list of sites related to a topic that you chose to execute the query against.

I got this working against some sites and the proof is there. But it’s
obviously highly inefficient and I haven’t figured that out yet. :-/
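
The fan-out-and-merge step of that idea can be sketched independently of any particular CMS. Here the per-site search calls are stand-in Python callables; in a real tool they would be HTTP requests to each site's own search endpoint (the function name and merge policy are my own, not from Grav):

```python
def federated_search(site_searchers, query, limit=10):
    """Run `query` against every site's own search and interleave the results.

    `site_searchers` maps a site name to a callable that returns that site's
    ranked result list for the query (in practice, an HTTP call to the CMS).
    """
    per_site = {site: list(fn(query)) for site, fn in site_searchers.items()}
    merged = []
    # Round-robin across sites so no single site dominates the top of the list.
    while len(merged) < limit and any(per_site.values()):
        for site, results in per_site.items():
            if results:
                merged.append((site, results.pop(0)))
    return merged[:limit]
```

The inefficiency noted above shows up here: every query fans out to every site in the list, so latency is bounded by the slowest site unless the calls are made concurrently and cached.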

~~~
mgreenleaf
One alternative is to run a webcrawler that stores the index in a series of
SQLite database files, either by topic or by site, or any other criteria. Then
users could download sets of those SQLite databases and run queries on them.
Not really completely distributed, but it hides some information in the noise
of "search sets" and mirrors, and individual queries run locally. You could
mirror the main repository and just run searches on your own server or
machine. You could also swap the database files over P2P, etc.

------
creaghpatr
For those who can't see the article:

>The practice of creating blacklists for certain types of sites or searches
has fueled cries of political bias from some Google engineers and right-wing
publications that said they have viewed portions of the blacklists. Some of
the websites Google appears to have targeted in Google News were conservative
sites and blogs, according to documents reviewed by the Journal. In one
partial blacklist reviewed by the Journal, some conservative and right-wing
websites, including The Gateway Pundit and The United West, were included on a
list of hundreds of websites that wouldn’t appear in news or featured
products, although they could appear in organic search results.

Gateway is trash, and I'm not sure what United West is, but they can't say
they aren't blacklisting political sites at this point. It's a pretty big
challenge for them ahead of the bipartisan AG investigations and the 2020
elections.

~~~
happytoexplain
Your assertion is a bit literal. Yes, they can't say they aren't blacklisting
political sites. But they can say they aren't blacklisting sites _for being
political_.

------
t0ughcritic
Google has destroyed small businesses with its monopoly. The first search
page is all ads on mobile.

------
retrovm
If you actually read the article, most of the statements of fact are about the
omnibox autocomplete system, and then it uses innuendo to imply some things
about search-engine ranking. But these are two completely separate systems,
and it makes sense that a system that is literally telling you what to type is
more sensitive than search-result ranking. It is not a flaw of Google that it
won't suggest "is hillary clinton still controlled by the jews" when you type
"is hillary clinton". If it were just a big trie of what everyone typed, it
would be completely dominated by 4chan troll bots.
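
The filtering being described, suggestions drawn from what users typed minus a blocklist, can be modeled in a few lines. This is a toy prefix-matcher over query counts rather than a real trie, and the class and blocklist are my own invention for illustration:

```python
from collections import Counter

class AutocompleteModel:
    """Toy model: suggestions come from what users typed, minus blocked phrases.

    Illustrates why raw typed-query data needs filtering before being
    suggested back, since anything recorded often enough would surface.
    """
    def __init__(self, blocked_substrings=()):
        self.counts = Counter()              # full query -> times typed
        self.blocked = tuple(blocked_substrings)

    def record(self, query):
        self.counts[query.lower()] += 1

    def suggest(self, prefix, k=3):
        prefix = prefix.lower()
        candidates = [
            (n, q) for q, n in self.counts.items()
            if q.startswith(prefix) and not any(b in q for b in self.blocked)
        ]
        # Most-typed first; ties broken alphabetically.
        return [q for n, q in sorted(candidates, key=lambda t: (-t[0], t[1]))][:k]
```

Without the blocklist, whoever types a phrase the most times controls the suggestion, which is exactly the troll-bot failure mode described above.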

~~~
dunkelheit
First, I don't agree with your assessment of the article at all. There are
multiple concrete assertions regarding manipulations of the search rankings
themselves. I also don't agree that it is somehow "more okay" to manipulate
autocomplete results than the search results proper.

Second, I think a big chunk of the problem here is the lack of transparency.
Google has traditionally been very secretive about its algorithms to avoid
tipping off spammers. So if you ask them directly they will hem and haw, when
in fact they ban spammers and also, as the article reports, moderate
inflammatory content and manually boost rankings of specific websites. The
question is: what is the exact scope of these activities? Where is the red
line that they will not cross? I think the public deserves to know.

~~~
erikpukinskis
What’s a site they’ve manually boosted?

~~~
dunkelheit
> Google made algorithmic changes to its search results that favor big
> businesses over smaller ones, and in at least one case made changes on
> behalf of a major advertiser, eBay Inc., contrary to its public position
> that it never takes that type of action. The company also boosts some major
> websites, such as Amazon.com Inc. and Facebook Inc., according to people
> familiar with the matter.

Of course the exact nature of the changes and boosts remains unknown, but
that just underlines the need for transparency.

~~~
erikpukinskis
"algorithmic changes" implies the boost is not manual.

~~~
soraminazuki
An "algorithm" still implies human intent. Heck, even a blacklisting system is
still a form of "algorithm." Even if each change Google has made to its
algorithms in the past may be justified, the public can't make an informed
decision about it if Google isn't transparent about what it actually does.

------
stevenicr
imbo - this is fraud on a global scale [1]; it negatively affects millions of
users, and thousands of webmasters and content producers that do not publish
in walled gardens.

I've watched the changes since there was a 'googleguy' and the 'big update was
charted with moon phases'.

Since about the time Page started pushing things around, it's gone slowly
downhill, with more and more censoring and less and less transparency.

Google is still benefiting from public trust that was earned years ago, when
it was truly doing lots of good around the world.

The lack of transparency about the censoring is a terrible thing for the
knowledge of the planet - an entire generation of people are learning truth
about life (and the afterlife) trusting google, and being filled with info
from youtube. Even the censoring spam filters in gmail are affecting people's
lives in the real world today.

I mentioned some of the issues with users not getting transparent info about
their searches being censored in a comment here recently:
[https://news.ycombinator.com/item?id=21487318](https://news.ycombinator.com/item?id=21487318)
(and how I think more web sites need to put notices about the increased
censorship of big G)

It's fraud for all the webmasters as well.

"Make a good site, we'll find it and display it. Don't do any SEO, that's evil
- just make good content" - well, many of us have spent hundreds of hours
making good content while watching others who have less content rank higher.

Is there a blacklist from the time period Matt Cutts was on the spam team and
around the time he left?

I heard a rumor that if your site was on one of these 'we caught you'
blacklists with a certain googler's name on it, you can't get out of de-rank
jail unless that specific person lets you out: a shadowban, plus a public
notice to do more work to fix it, the whole while knowing nothing will fix the
ranking for that site.

Maybe not, regardless - but telling webmasters to make disavow lists and spend
that stupid amount of time putting them together, and still not putting their
sites back into the top 10 (knowing you've crafted the blacklists / shadowbans
and tweaked the algo to push them back as well) - that's fraudulent, isn't it?

You've made people spend tons of time trying to fix things for google -
knowing they were wasting their time and losing money.

I'm guessing the goal was to destroy lives and knock the spirit out of those
people who would 'game the google system' - those evil SEO people should be
destroyed.

I'm sure there's a valid excuse - the algo changed, we added more manual
reviews, we have this stay-in-your-lane thing, this 'your money or your life'
thing, you have to have all this other info to be legit -

Is that really what the end user needs when they are looking for entertainment
sites? No - it's a sneaky way to put down a bunch of sites to raise up the
others.

Then a video comes out - some googlers say, well it is okay to hire an SEO
company now - don't hire a bad one or it'll penalize you - you should only
hire one that says it will take 6 months for you to rank.

So google keeps saying one thing and doing another - then saying another thing
and not doing that.

Webmasters should be able to get details about anyone who has 'manually
scored' their site.

In some cases it's not just whether they're looking for what's in the manual -
their location in the world, their religion, and other factors could influence
how they feel about a site, and its being downranked by someone in the
Philippines could have drastic consequences for a webmaster in the States, and
for all the users who might enjoy that site from Europe.

So the one big G statement said that NOT being transparent is the best thing
so bad actors don't take advantage of knowledge of the system - well I believe
you are hurting more good actors and more users by hiding everything.

There's even some recent evidence that telling users why things have been
moderated actually leads to fewer problems:
[https://news.ycombinator.com/item?id=21513871](https://news.ycombinator.com/item?id=21513871)

[1] InMyBiasedOpinion - and not a lawyer, doctor ymmv yada yada

~~~
stevenicr
Will the downvoters please comment on what part of these statements you find
wrong? I am trying to offer an honest assessment of how things look from the
other side of the glass, er, bubble.

I put in my comment that my opinion is biased, and I think it is obvious which
side of the issue that bias comes from. I will add that for a couple of years
I had a site in the number one position or top 3 of some cool search results,
and it was a site that gave the searchers what they were looking for.

For a long time google was good. I even took my love of google and content
creation to other businesses in town, got them to make better web sites, and
even partnered with them to spend more than $100k on adwords over a couple of
years.

When things are good, they are great, but then cracks start to show: the
algorithm changes to favor national publishers, there is little help when you
discover click fraud, and customer support is a 'volunteer top poster, not a
google employee' kind of thing.

As was said at a hearing recently: "Small businesses cannot survive on the
internet if they cannot be found." -
[https://www.marketwatch.com/story/tech-giants-google-amazon-and-facebook-accused-of-posing-threat-to-small-business-2019-11-15](https://www.marketwatch.com/story/tech-giants-google-amazon-and-facebook-accused-of-posing-threat-to-small-business-2019-11-15)

Knocking people out of business with hand-wavey 'make good stuff, don't do
seo' - knowing that they will be screwed forever, and that they will most
likely never know about the secret manual that some know about or how it
actually plays out - it's worse than mean.

imho

~~~
philipkglass
I have done a fair bit of work labeling and classifying quality of user
submitted URLs for a public facing platform (not Google). That includes many
hours spent manually inspecting content and deciding whether a site looks
"spammy" overall.

[http://weblog.globaladvancedmedia.com/2010/research-before-buying-web-software-script-security-is-important/](http://weblog.globaladvancedmedia.com/2010/research-before-buying-web-software-script-security-is-important/)

If I were judging by the first few paragraphs of this entry from your site, I
would lean toward blacklisting it. Reasons: typos, grammatical errors, and a
general lack of polish and punch in the writing. It looks at least
superficially similar to thousands of keyword-stuffed, semantically
impoverished blogs that I have encountered before. The page source contains
another red flag:

<!-- This site is optimized with the Yoast SEO plugin v9.2.1 -
[https://yoast.com/wordpress/plugins/seo/](https://yoast.com/wordpress/plugins/seo/)
-->

The co-occurrence of the terms "SEO" and "optimized" is almost a good enough
signal to blacklist it on that basis alone.
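
Heuristics like these can be tallied into a crude score. The following is purely my own illustrative sketch of the kinds of signals described above (the weights and threshold are invented), not any platform's actual classifier:

```python
def spam_score(page_text, has_ads, has_commercial_links):
    """Add up a few red flags and counter-signals into a rough spam score."""
    score = 0
    text = page_text.lower()
    if "seo" in text and "optimized" in text:  # the co-occurrence red flag
        score += 2
    if has_ads:                                # monetization signal
        score += 1
    if has_commercial_links:                   # another monetization signal
        score += 1
    return score

def looks_spammy(page_text, has_ads, has_commercial_links, threshold=3):
    """Lean toward blacklisting only when several signals stack up."""
    return spam_score(page_text, has_ads, has_commercial_links) >= threshold
```

Note how the two counter-signals mentioned above (no ads, no commercial links) keep an unpolished-but-harmless page under the threshold even when the text itself trips a flag.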

I am not saying that _you_ are personally trying to exploit search or
recommendation systems to trick people into visiting your page. There are also
two big counter-signals that show me this entry _isn't_ part of a content
farm:

- You don't link to commercial sites.

- You don't show ads on the page.

The problem is that there are armies of people churning out "SEO blogging make
money fast" content, incorporating ads or commercial links, and spreading it
across a multitude of domains. For every blog entry like this one --
unpolished but harmless -- there are many that look textually similar and are
purely mercenary.

_So the one big G statement said that NOT being transparent is the best thing
so bad actors don't take advantage of knowledge of the system - well I believe
you are hurting more good actors and more users by hiding everything._

This is where I disagree most strongly. Google already struggles to keep junk
out of search results. Process transparency would enable content farmers to
evolve more quickly. Thousands of brilliant engineers are not a match for
millions of people who pollute the web as a full time job. Some good actors
will be hurt, granted. I think that you are badly underestimating the number
of bad actors when you say that opacity hurts "more good actors" and users
than bad actors.

~~~
stevenicr
This - "typos, grammatical errors, and a general lack of polish and punch in
the writing" - reminded me of something I considered some years ago...

There was a time when I read that google was ranking edu-type sites higher and
blogs lower... I noticed more news sites in the top results, and mayo clinic
types.

It dawned on me that it would be a convenient truth to point to a bunch of
'high brow signals' to justify sanitizing results a bit - and that this would
be a slippery slope into censoring lots of adult stories and other
entertainment, while also favoring the bigger companies that can afford to
spend the adwords money.

Could be good reasons for this (less public pressure to remove the porn and
such) - could be nefarious: censor the web for users, cater to those who can
afford to pay the big bucks, fewer companies to contend with over content
questions - all while not being transparent to the users and content creators.

This allows big money to influence the results via ads easier, and limits
choice - those publishers who spend a lot of time creating content are cast
aside, even though one side of the big G keeps saying 'create good content you
will soar to the top'.

I think we, er they, big G especially, crossed a threshold of being able to
determine intent more often than not, and so searches like 'how to have sex'
and 'watch free sex', for example, are different and can be, and should be,
handled differently.

I know they are handled differently to a degree; I feel it's important to
point out that these two searches, with different intents, should show results
even more different than they currently do.

The first one would likely benefit from ranking higher sites that meet a lot
of the points on the pdf manual checkers document and other factors for trust
rank and what have you. However I think the other kind of sexual entertainment
searches would actually benefit from not using many of those factors in the
ranking process.

I believe you will find that many professional sex workers do not advertise
their address on every page of their site, and many do not use real names, in
order to make it harder for bad things to happen to them, for example.

I also think the need for perfect grammar and such is much lower when people
are looking for erotic entertainment. Millions of Penthouse stories magazines
sold month after month over the pre-internet years (before you could get that
stuff free via searches from content indexers), and I am pretty sure that if
every story had had perfect grammar, like it was written for a college thesis,
they would not have sold as well for years.

If you combine this with the kind of grammar and spelling a majority of people
use in textual communication - look at Insta, fbk, snap... people expect,
engage with, react to, and continue to pursue content that is not grammar- and
spelling-perfect.

I'd go so far as to say that a majority of people, in the US at least (?), are
actually mostly trying to find cruder discussions and writing styles, and that
it's a much smaller number of people searching daily for PhD-level high-brow
perfection.

Of course this is different for electrical engineering searches, and even
searches about putting together prefab furniture - those are definitely
searches where you want things to be accurate, with no fluff and no extra
personality needed.

Given that I believe this to be obvious to most, and that the search giants
are not running the computer systems of 1991, I believe they know they could
surface tons more content that browser behavior reporting would show people
enjoy and are looking for - yet they choose to use some of these trust-rank
things to censor bigger portions of the net for various reasons.

Hey, I'm a big believer in private companies doing what they want - I just
think transparency is seriously lacking with big G. Why not be honest about
how many semi-good sites are not being shown because google is employing new
content filters?

We used to see those chilling-effects notices regularly, and some results show
that X number of pages or sites are not being shown due to DMCA requests...
but as for being honest about how many sex chat sites google used to show in
the results, and how they have pushed many good ones down and many more
straight out of the index... we don't see any posts about that.

Sadly, for many people the internet is whatever google shows it is. I
understand there are many in the world who think whatever is on Facebook is
the entire internet. Well, if things are being removed from these platforms
and it's not being understood, then it's a huge disservice to humanity, imho.
It's getting closer to a world where people learning with today's tools may
never find Mark Twain and others, for they are not perfect in the eyes of the
elite.

------
allovernow
Between SEO gamification, ad spam in top search results, neutered advanced
search capabilities, auto-correction of search terms based on NN models that
are regressing to the layman's mean, and increasing evidence of manual
manipulation of search results/autocomplete, Google search is rapidly
degenerating into a pile of garbage. Unfortunately, their market cap combined
with their status as the de facto portal to the internet makes them hard to
unseat, and DDG isn't quite as good yet.

Give me back the Google of 5-10 years ago, and the rest of the internet from
that time, not this ad- and blogspam-dominated AOL 2.0 joke of a net that
we're quickly centralizing into. It's sad to see where ad-based economics are
driving the net.

------
privacywall
Good old msn

[https://www.msn.com/en-us/money/companies/how-google-
interfe...](https://www.msn.com/en-us/money/companies/how-google-interferes-
with-its-search-algorithms-and-changes-your-results/ar-BBWOCm2)

~~~
dang
Thanks! Url changed from [https://www.wsj.com/articles/how-google-interferes-
with-its-...](https://www.wsj.com/articles/how-google-interferes-with-its-
search-algorithms-and-changes-your-results-11573823753?mod=rsswn).

------
helpPeople
When it gets bad, I'll stop going.

But right now I am still getting decent results.

I don't use Facebook anymore, yet we had some hysteria about that being
manipulated.

~~~
jbay808
I don't know about you, but when I search for "python string replace" I'd
expect the first result to be the Python 3 documentation for 'string'.

Instead, I get (in order):

1\. GeeksforGeeks

2\. Tutorialspoint

3\. W3Schools

4\. Programiz

5\. Stack overflow

6\. And then, finally, the official documentation... For python 2.7.

How does this happen? Are these sites just paying Google a bunch for the
rankings?

~~~
jonas21
Honestly, this is because the organization of the Python 3 docs is terrible.

The documentation for str.replace is located halfway down an enormous page
that describes every single built-in type in the language [1].

And then, once you manage to find the entry for str.replace, what does it tell
you?

 _Return a copy of the string with all occurrences of substring old replaced
by new. If the optional argument count is given, only the first count
occurrences are replaced._

That's it. No examples, no link to re.sub or other functions you might want to
use for replacement. Stack Overflow or even W3Schools (gasp!) gives much
better results for this.

[1]
[https://docs.python.org/3/library/stdtypes.html](https://docs.python.org/3/library/stdtypes.html)
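For what it's worth, the behavior the quoted docs describe is easy to demonstrate interactively (the optional count argument limits how many occurrences get replaced, and the original string is never modified):

```python
# str.replace returns a new string; strings are immutable.
s = "spam spam spam"

print(s.replace("spam", "eggs"))     # every occurrence replaced
print(s.replace("spam", "eggs", 2))  # only the first two occurrences
print(s)                             # original string is unchanged
```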

~~~
saalweachter
Incidentally, if this is the level of information you are looking for, you can
get the same thing by typing help(str.replace) in the Python interpreter.

------
7loopscom
What is the purpose of sharing articles that are behind a paywall?

~~~
cameronbrown
Do you prefer ads or paywalls?

Nobody's ever happy..

~~~
buboard
Ads. The social contract of the web was: link aggregators bring traffic to
websites, which need views to ad-support themselves. WSJ doesn't need it.
(This article is not available on archive.is or anywhere else. You actually
need to pay to see it.)

~~~
nradov
There is no such social contract.

~~~
buboard
That's why RSS and those like/share buttons existed.

------
deweller
Try this:

[https://duckduckgo.com/?ratb=c&q=autism+vaccine+link&t=h_&ia...](https://duckduckgo.com/?ratb=c&q=autism+vaccine+link&t=h_&ia=web)

[https://www.google.com/search?q=autism+vaccine+link](https://www.google.com/search?q=autism+vaccine+link)

The 3rd result on duckduckgo.com is learntherisk.org. That result is not
presented in the first 10 pages of results on Google.

It seems quite likely to me that one of these sites is manipulating search
results, because the organic search results for these two different search
engines should not be that far apart.

Edit: Why the downvotes? Here is an even more egregious example:

Try searching for "learn the risk autism".

[https://duckduckgo.com/?ratb=c&q=learn+the+risk+autism&t=h_&...](https://duckduckgo.com/?ratb=c&q=learn+the+risk+autism&t=h_&ia=web)

[https://www.google.com/search?q=learn+the+risk+autism](https://www.google.com/search?q=learn+the+risk+autism)

Note that I am not promoting an agenda here. This seems like an example of
manual manipulation. The article cites "vaccinations and autism" as an
example.

~~~
realmod
Wait, why would DuckDuckGo be more accurate just because a random site called
learntherisk.org is present? This is just confirmation bias.

------
sk84life
Google sucks in many ways. Search results are heavily curated and filtered to
match their agenda.

Advertising? How is it profitable for advertisers to have multiple copies of
the same advert on a page?

Truth about Bill Gates and Epstein is filtered away... just like HN filters
and deletes comments :)

~~~
harvestofzorros
Daa

------
gtirloni
I can't read the article, but assuming it's NOT just about Google promoting
their own services above others, let me play devil's advocate and ask:

Are we elevating Google's search engine to public utility status?

~~~
holstvoogd
Same here, but working with that assumption:

Yes. And we might have to, I'd say. Given the de facto monopoly they have, I
think it is reasonable to agree that they fulfill a public utility function
and thus need to be held to a higher standard. This should, however, be done
through regulation, because that is why we have governments.

~~~
enitihas
But other search engines do exist, and it is trivial to switch from Google to
Bing or DuckDuckGo. Unlike a social network, you can switch your search engine
in isolation without waiting for others to switch.

------
codingslave
The American people aren't this stupid. It's obvious to everyone that the
search results are being messed with. It's incredibly arrogant for Google to
think they can get away with mass-scale information manipulation. Capitalism
doesn't work this way: Google search is an inferior product, and every day
the opportunity for a competitor to move in gets just a little bit bigger.
But the time is not quite here yet.

~~~
po1nter
> Google search is an inferior product

Inferior compared to what, exactly? I have yet to find a good replacement.

~~~
codingslave
I worded that poorly - inferior to what could exist. It's clearly too
expensive at this point to show up and compete, but I think the time is
coming.

~~~
reroute1
Isn't literally everything inferior to what COULD exist?

~~~
codingslave
No, I think Google search is inferior to what google could produce. By that I
mean they tuned their own knobs in such a way that search results are worse
than they could be. They are hurting the search results in the name of revenue
and bias (political and otherwise)

