

Sheep Or How Google is destroying the internet - bdfh42
http://42topics.com/blog/2008/05/parable-of-the-single-sheep-or-how-google-is-destroying-the-internet-and-nobody-seems-to-know/

======
mechanical_fish
Two points:

First, no article complaining about Google cutting off the traffic to Site X
or Site Y is complete unless it _fills in X and Y_ with specific examples. I
want to know exactly what we're talking about. Otherwise I tend to surmise,
perhaps unfairly, that articles like this are astroturf: They're written by
spammers or spammer apologists who are angry that their highly profitable link
farms have been effectively cut off. After all, if I find myself reading a
random article on this topic, it's probably because a) it's ubiquitous or b)
my subconscious has been lured to it by a well-crafted linkbaity headline. And
who is good at making posts ubiquitous and tempting? Spammers!

So knock it off with abstractions like "sheep" and show me some real examples,
so I can empathize properly.

Second, the reason why Google doesn't take a "rule of law" approach is easy to
see, isn't it? Nobody on earth is better at gaming complex sets of rules than
programmers. Create a clear specification that defines "spam" and the spammers
will promptly craft a ton of elegant and technically legal ads and then flood
Google with them.

Why doesn't this happen with the "rule of law" in a legal setting? Well, it
does: People play fast and loose with laws all the time. But the real secret
to the rule of law is that the ultimate arbiter of that law is... groups of
actual humans called "juries" and "courts", who are empowered to use common
sense to throw the book at those who get too creative with the edge cases.
Real-world law has intentionally fuzzy edges. (Of course, when the fuzz
spreads into the center, you've got a problem.)

Proposing to turn Google's ranking system into something resembling a legal
system -- which is perhaps equivalent to taking our existing legal system and
applying more of it to Google, i.e. legislating certain elements of Google's
design -- is an interesting idea, and it may eventually be tried, but it's not
obvious that it will improve anything, or even change anything. And it's going
to be hard to discuss how well it might work unless we use... real examples!

~~~
justindz
I had a similar thought to your first point when reading this. The author is
claiming that webmasters have to build two sites rather than one, thinning
resources and reducing quality. This is because if Google determines one is
spam, they have to fall back to the other site.

But if the sites are similar and created for safe redundancy, I would assume
the second site will soon get nailed as well. I mean, the guy says to build
two of the same site in different places so you have a backup. You're building
two mediocre targets, either hoping that the other one isn't noticed by Google
(which probably means users aren't finding it either) or that they're somehow
significantly different.

I agree. Given its lack of detail, it sounds like someone doing something
sketchy and complaining that it doesn't work. I'm not aware of any cases in
which non-specious websites have gotten nailed and had no recourse. The author
didn't provide any. Has anyone seen this in the wild?

~~~
shabda
For starters, [http://www.forbes.com/2007/06/28/negative-search-google-tech-ebiz-cx_ag_0628seo.html](http://www.forbes.com/2007/06/28/negative-search-google-tech-ebiz-cx_ag_0628seo.html)

I am sort of lazy right now about finding replies to [citation needed], but
read threadwatch.org; I am sure you can find a few.

~~~
mechanical_fish
Well, that's very interesting. Although this article is merely a pro-
journalism example of the very same "consider the case of sites X and Y, where
X and Y cannot be named" phenomenon, so it still isn't clear to me how large
the risk is.

And I'm with justindz, who doesn't understand the proposed defense. If a
competitor decides to cross the ethical line and take your site down by
"generously" having spambots put up links to your site, thereby causing Google
to conclude that you're a spammer and shut down your pagerank... can't they do
this to all of your sites at once, not just one? My understanding -- correct
me if I'm wrong -- is that the cost or supply of spambots is not the limiting
factor here.

------
greyman
I think this article is a bad joke. It's laughable how some webmasters think
that Google simply has to include their spammy sites in its rankings, and that
if it doesn't, then Google is bad and the internet is bad, and they now just
have to build 100 other spammy sites and try to push them into search engines.

------
bigtoga
It's so cutesy with the parable about the people of Foobr!

