
Too Much SEO? Google’s Working On An “Over-Optimization” Penalty For That - ilamont
http://searchengineland.com/too-much-seo-google%E2%80%99s-working-on-an-%E2%80%9Cover-optimization%E2%80%9D-penalty-for-that-115627
======
mtkd
This can't come soon enough.

Too many sites are ranking by stuffing keywords and buying links.

Next, Google needs to find a way to rank a page based on its role in a broader
user experience - rather than measuring it in isolation.

Sites (especially in retail) are forced to optimise a page in a specific way
to get indexed successfully. The result is everyone putting too much content
on a single page - detracting from usability and creating homogeneity for
users - like cars all optimised in wind tunnels.

~~~
palish
"buying links"? I'm completely new to this; would you mind explaining what the
benefit is of buying links?

~~~
xSwag
>buying links

People who have a high-PR website may sell a link (usually in the footer, on
all pages) to websites in a similar niche to help them rank better. However,
for some people that's not blackhat enough. Spammers nowadays tend to use
automated software like the following:

-xrumer - Developed by a few Russian people and primarily used by Russian pharmacies for fraud, this is the most powerful "spammer" tool out there. You can literally spend $1k, get 1 million backlinks within a matter of days, and have your site in the #1 position for whatever keyword you're aiming for.

-scrapebox - Similar to xrumer, but while xrumer targets forums (vBulletin etc.), scrapebox specializes in harvesting large lists of WordPress blogs (through Google + proxies/botnets) and then posting comments on them (usually all those generic comments you see on WordPress blogs, such as "nice post" or "good advice, bookmarked").

I could literally go on for hours and hours about how people are exploiting
search engines. I think it's high time for Google to develop an AI for this
rather than just relying on algorithms. A better alternative to PageRank
needs to be implemented.

>benefits

In the most basic terms, for spammers, more links = better search rankings =
more traffic = more $$$

~~~
aswanson
The most obvious means to counter this strategy is to watch the rank delta of
any given site and dampen it. If a site goes from 12 incoming links to
12,000,000 in a few weeks, penalize proportionally pending human review.
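
Sketched as a toy function (the 10x-per-window cap is a made-up number, and
the returned flag just stands in for whatever human-review queue a real
engine would use - this is illustrative, not anyone's actual algorithm):

```python
def damped_rank_boost(old_links, new_links, max_growth=10.0):
    """Cap the ranking credit from a sudden explosion of inbound links.

    old_links:  inbound link count at the start of the window
    new_links:  inbound link count now
    max_growth: growth factor beyond which extra links earn no credit
    Returns (credited_links, needs_human_review).
    """
    if old_links == 0:
        old_links = 1  # avoid division by zero for brand-new sites
    growth = new_links / old_links
    if growth <= max_growth:
        return new_links, False  # normal growth: full credit, no review
    # Explosive growth: credit only the allowed multiple, flag the rest.
    credited = int(old_links * max_growth)
    return credited, True

# The 12 -> 12,000,000 example: almost none of the new links earn credit
# until a human has looked at the site.
credited, needs_review = damped_rank_boost(old_links=12, new_links=12_000_000)
```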

~~~
bira
What if the site just went viral?

Should the human reviewer check every link by hand?

Think of codecademy.com, now first for "learn to code". I mean, is it the best
place to learn how to code? At the moment, I'd say it's not. Is it the third result
"codeyear.com" (run by codecademy as well)? I'm sure there are better sites
that could take that spot and provide more value to the user than that landing
page pointing to codecademy (Nothing personal against the codecademy guys of
course, big kudos for their venture so far and best wishes for the future).

The problem with today's algorithm is that it can't really tell which site is
really good and which one is just popular, mainstream, consolidated, with a
high PR. Think lifehacker.com second for "learn to code" (on my side of the
world, on google.com and chrome incognito mode on).

Right now, if every single newspaper in the world covers your site about how
to build a homemade nuclear device, actually writes something about the matter
in the article (as would be natural), uses "how to build an atomic bomb" as
anchor text and links to your site, your site will skyrocket to the top for
that kw (and many similar ones). Even if it's just a school project or a joke
that made it to the news.

Where's the quality check?

Anyway I'm sure that smart people are currently working on these details and
that a comment reply can't really do justice to the complexity of the issue.

~~~
aswanson
I'm sure humans can quickly ascertain whether a site suddenly went viral or is
being gamed, as the number of such sites is fairly small. I never said a
comment reply is the panacea or can possibly do justice to the totality of
search quality, just that an explosive delta in page rank should not be a
major issue for a company with the resources of a Google.

------
jrockway
Disclaimer: I work for Google, but not on search. These opinions are my own
and based only on the linked article.

I took a beating on the SEO blogs for calling SEO a bug a few months ago, but
I'm glad the rest of the world is finally realizing that I'm right :)

Ultimately, Google is trying to rank you highly for providing the best
_content_ ; you shouldn't be spending your time trying to figure out how to
game Google by making superficial changes to the presentation of your content.
Want to rank better? Write better!

The whole problem is rooted in the fact that Google is a leaky abstraction. It
tries to be an omniscient Sherpa, guiding the wary Web traveler with its
infinite understanding of the Web and the individual user's needs. The
reality, though, is that Google is actually just a computer program. So there
is a gap between the user's mental model of Google and what Google actually
does, and it's this gap that SEO exploits for its own profit.

An infinite amount of exploitation would mean that Google would just return
results randomly, and so it makes a lot of sense to detect signs of SEO and
penalize the behavior before it further broadens the perfection/reality gap.
Gaming the system is currently profitable, since the worst thing that can
happen to you is nothing, but the best thing is that you get more traffic. A
penalty aligns the risk/reward spectrum to favor "write better content" rather
than "spam a bunch of wikis".

~~~
john_flintstone
>Google is trying to rank you highly for providing the best content...Want to
rank better? Write better!

The real world doesn't always work like that. I do a lot of work for an
eCommerce site that sells wholesale. Their customers have little to no
interest in reading text, all they want are pictures. Which means, the
catalogue pages, which are optimised for actual human visitors and not Google
robots, contain little to no text, only pictures. As a result, the catalogue
pages (most important part of the site) do not rank anywhere with Google, and
never will. Google is unable to handle websites that are - quite correctly -
all about the pictures. Competitor sites that outrank this particular site
design their catalogue pages for Google, not for humans, and rank well because
of it.

------
sixQuarks
I gave up trying to optimize for SEO a long time ago. The future is good
content, period. Technology will eventually figure out a way to get the good
content in front of users, whether it's through a Google-like search engine or
something totally different.

I'd rather focus my energy on driving traffic in other ways, such as
traditional media, list building, PR - and focusing on creating great content.

I just do the bare minimum stuff when it comes to SEO (title, h1 tags, some
anchor text here and there). Other than that, I don't pay attention to SEO in
the actual content of the pages.

~~~
dannyr
I'm with you.

Create content & do minimum SEO.

Eventually, other sites will link to your content & Google will figure it out.

~~~
AznHisoka
I think this is wishful thinking. Google has been trying to make search
smarter for many, many years. The truth is that links are still a good signal,
and you should try to consciously build links. There are many, many articles
on the internet that are well-written and informative, but no one can find
them because they're not in the top 10 in Google.

~~~
sixQuarks
Creating good content and promoting it through PR should result in natural
links - the exact type that Google looks for.

If you try to stay on top of the latest SEO tactics, you end up hurting
yourself in the long term. For example, a while back it was recommended to
create huge directories of content (something like: Travel deals in Ohio,
Travel deals in NY, etc., etc.). With the Panda update, you are now penalized
if you have too much similar content. Why deal with all that headache? Just
create good content.

~~~
AznHisoka
Yep, that's what I said. Creating good content ALONE is not enough and never
will be. The internet, like life, is not a meritocracy. Just because you write
great content doesn't mean Google will do you a favor and feature it. You need
to promote it, which is what linkbuilding post-Panda is all about. Not blog
commenting, forum spamming, or submitting to directories.

But telling someone to "just write good content" is like telling a programmer
to "just create a good product". The world doesn't work like that. You need to
hustle, shout and say "Look at how awesome this site/app/article is! Link to
me!" I don't like it that much, but that's how the game is, and if you don't
play it accordingly, you'll end up with awesome content and no visitors.

~~~
sixQuarks
True, I take it for granted that everyone knows marketing is the most
important thing you need to know, no matter what product or service you
provide.

------
user24
The problem is that Google actually aren't good enough at search yet.

They still rely on a dumb word-based approach to document retrieval. For
example, if my page is about how to manage your time, and it's a really
fantastic resource about that, but I don't actually mention the phrase "how to
manage your time" anywhere, I won't rank for people searching for that phrase.
I should, but I won't.
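
You can see the failure mode with a toy word-overlap scorer (obviously
nothing like Google's actual algorithm - just naive term matching, which is
the behavior I'm complaining about):

```python
def keyword_score(query, page_text):
    """Naive word-overlap scoring: count query words present in the page."""
    page_words = set(page_text.lower().split())
    return sum(word in page_words for word in query.lower().split())

query = "how to manage your time"
great_page = "Stop multitasking. Batch similar tasks and guard your calendar."
thin_page = "How to manage your time: manage your time with our time tips."

# The genuinely useful page never uses the exact words, so it loses.
assert keyword_score(query, great_page) < keyword_score(query, thin_page)
```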

So I _have_ to write my content for two audiences - humans and Google.

I don't want to do this. I'd be much happier just building great content for
my human readers, and if I happen not to mention the exact keyword phrase a
searcher might use, it doesn't matter, Google still knows it should put my
site at the top for that phrase, because it understands that my site is about
that phrase, even though I don't mention it exactly.

But Google just isn't good enough yet. Someone can set up a page which has a
bunch of headers and URLs and variations of the keywords and beat me to #1,
even if their content is utter rubbish for human consumption.

That's _why_ SEO still exists. It's symptomatic of a bug in Google.

This penalty will help, hopefully, but we're still going to be in a state
where we have to compromise our content to serve two masters.

~~~
jcampbell1
Google is actually pretty good at getting the semantic meaning, but the SERPs
show snippets of the content, and users only click on results that match the
query.

Google could stop bolding the keyword matches in the SERPs, but that would be
a disaster because they are useful for picking the best result.

My point is that this is not purely a Google bug, but also a bug in human
nature. People like exact keyword matches to their queries.

~~~
user24
> the SERPs show snippets of the content, and users only click on results that
> match the query

I'd love to see a source to back that up. The only data I've seen on this is
various eyetracking and/or click tracking studies which suggest higher up on
SERP=more clicks.

~~~
davemel37
Just run a PPC test with dynamic keyword insertion and look at your
click-through rates.

This is absolutely correct!

------
nostromo
I wonder if Google is getting trapped near a local peak.

I watched the video they released on improving search quality yesterday
([http://insidesearch.blogspot.com/2012/03/video-search-quality-meeting-uncut.html](http://insidesearch.blogspot.com/2012/03/video-search-quality-meeting-uncut.html))
and was initially impressed that they gave so much thought to slightly
improving a small subset of 0.1% of queries. That meeting must have had 40
people attending!

But afterwards it left me with the feeling that Google is becoming so big and
entrenched in old ways of doing things, that they may not be focusing enough
on the next big improvement in search. Penalties and heuristics can only go so
far -- eventually they'll need something approaching AI -- and if any company
can do it it's Google.

~~~
ma2rten
I don't know if you have a concrete idea in mind for how Google should
approach AI, but Google does fund a lot of research in Information Retrieval
and other subfields of AI (machine translation, machine learning, ...).

If you were thinking more along the lines of: "you ask a question and it
understands your question and answers it like a human would", then I can tell
you that this would require a scientific breakthrough first. That is not
really something that you can plan for. Also Google mainly excels in
engineering, less so in research. I actually think that Yahoo and Microsoft
have stronger research divisions.

------
DanielBMarkham
I am all for continuing to clean up search.

I'd also like to note that for people creating content on the web, especially
programming-type people who deal in logic and certainty, the SEO system
resembles something like black magic. It's pretty clear that unless you
understand SEO and apply it, you're never going to be seen no matter how good
your content is. Now it appears that if you understand it too well that's also
a bad thing.

The goal here is to let the users themselves inform the search engine as to
what content is good -- hence the plus-ones, social search and all of that
stuff. But all of this is still indirect evidence. Unless you could plug a
computer inside the head of a person and watch their every thought, the only
real data you have for input is server logs, click-throughs, and all kinds of
other things that computers do, not people.

I just don't see this being solved any time soon. But I do see it getting so
complex and unwieldy that it continues to frustrate searchers and content
producers alike. Meanwhile the bad actors will continue to have a heyday.

Wish I could be more optimistic about it.

~~~
dholowiski
Don't you think the obvious solution is to build an algorithm that thinks like
the human brain? I suspect Google isn't that far away from that right now.

~~~
stfu
Fortunately this is somewhat more than just a wicked problem, something that
is going to keep life interesting for a few more years to come.

------
lkozma
While they can certainly penalize sites that optimize today with respect to
the criteria of yesterday, that effectively just means changing the ranking
criteria. Nothing stops site owners from optimizing with respect to the new
criteria tomorrow. The full implications of the proposition are reminiscent of
Russell's paradox, or worthy of inclusion in the next edition of Hofstadter's
Gödel, Escher, Bach.

EDIT: small clarification

~~~
kami8845
Well no, it's clearly stated that the goal is to even the playing field, not
change the rules. If SEOs build too many backlinks (and for big keywords you
need them in the range of hundreds of thousands) they will get punished. What
are they gonna do, build fewer backlinks? This is of course over-simplified,
but it does come across as a sound solution.

~~~
theseanstewart
The solution would be to keep building more backlinks but vary the anchor text
significantly.
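
The flip side is that anchor dominance is exactly what a detector would look
for. A naive sketch of such a check (the 60% threshold is an invented number,
and `anchor_looks_artificial` is purely illustrative):

```python
from collections import Counter

def anchor_looks_artificial(anchors, dominance_threshold=0.6):
    """Flag a backlink profile where one anchor phrase dominates.

    Natural links use varied, messy anchors ("click here", the site name,
    a bare URL); profiles where most anchors repeat one money keyword
    tend to be manufactured.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    top_share = counts.most_common(1)[0][1] / len(anchors)
    return top_share >= dominance_threshold

spammy = ["cheap flights"] * 8 + ["Example.com", "click here"]
natural = ["Example.com", "this post", "click here", "example.com/blog",
           "cheap flights", "a good read", "source", "here",
           "www.example.com", "blog"]
```

Varying the anchor text significantly is precisely what pushes `top_share`
back under whatever threshold the engine uses.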

------
mark_l_watson
A little off topic but: the idea of SEO has always bugged me: like a store
owner investing in having really clean windows, but crappy products inside the
store.

Admittedly, that is easy for me to say because I don't need high traffic at my
site. I need my web site to have a few high value visitors: people who want to
work with me or communicate because we are into the same technology. The way I
"optimize" my site is writing about what most interests me, and that attracts
people with my interests. Seems pretty straightforward to me.

~~~
davemel37
Actually, the most successful retailers invest more money in their window
displays than in any other marketing strategy.

Regardless, you are taking an overly simplistic view of SEO...

SEO works best with quality content backing it up.

------
danmaz74
I hope they'll be able to do it right. But, as soon as they change their
algorithms, SEOs will reverse engineer them again and change the
optimizations. There can be no definitive solutions, at least not with current
technologies, because SEOs are very smart and very fast at adapting.

As an aside, the first over-optimization I would target if I were Google is
keyword-based domain names. Keywords in the domain name are given WAY too
much weight; how often do you find that the top three results for the search
"XXXX YYY ZZZZZ" are very shallow but keyword-rich websites with the domain
names www.XXXXYYYZZZZZ.com, www.XXXXYYYZZZZZ.org, www.XXXXYYYZZZZZ.net?

------
stfu
I have relatively little hope that it will take more than a few days or weeks
before a wave of SEO-deoptimization services hits the market. The incentives
for beating "the system" are just too high.

~~~
jeremydavid
I would guess Google already keeps a record of a website's history and could
easily detect if it suddenly became "SEO-deoptimized".

EDIT: But I completely agree. There are far too many SEOs out there for this
to beat everything.

------
tocomment
Google has gotten so bad. I'm now not finding worthwhile results until the
second or third page.

Here is my search process

If it is from ehow ignore

If it has my search term in the URL ignore

If it is from about.com ignore

~~~
DanBC
ehow should have been hit with the Panda update? Do you have a "block all
example.com results" link on your Google results page? (You might have to
visit the link and then hit back in your browser to get it to appear.)

Do you have any examples of search terms that don't return useful results
until page two or three?

~~~
dholowiski
Let me tell you, when searching for typical tech support issues, ehow is
always on the first page and the content is clearly spam - usually only
tangentially related to the query. Try googling "exchange server specify IP
address" and try to find something useful.

------
sheff
Unless the penalties from Google are long term I can't see this as being very
effective, as you would assume competent SEOs will just tweak their processes
in response.

From what I've read over the years, basic SEO mainly boils down to :

\- Have good, relevant content

\- Choose your page titles and URLs carefully

\- Get lots of links to your pages, from as many different domains as
possible, and try to get them from high-Pagerank domains

\- Where possible try to get anchor text relevant to the terms you want to
rank for - but don't go overboard with a large percentage being the same
phrase, as it appears artificial

\- An older site can benefit you, as will exact match domains for the main
TLDs

Except for site age, they are all easily changeable by the SEO.

Maybe Pagerank is too easily gameable, and what is old will become new again,
and some of the approaches tried in the 90s and replaced by Pagerank will
return with a new twist.

~~~
polyfractal
Pagerank is a very small part of Google's ranking algorithm nowadays.

------
blacksmythe
Sounds like the equivalent of the AMT (Alternative Minimum Tax, designed to
increase the US income tax of those deemed to have too many deductions).

------
theseanstewart
I think Google needs to be really careful with this. It would make sense that
the first place they should look for over optimization is anchor text
manipulation. The problem is that this could open up the door for "Negative
SEO" services where you can blast a competitor with high traffic keyword
anchor texts. The safest place to look will be on page factors and if they
decide to go after anchor text manipulation there should be some type of
authority metric that protects established sites.

~~~
Isofarro
Google is probably already doing that, which is why the current
traffic-generation advice is along the lines of varying anchor text.

This is done mostly by identifying related / tangential / complementary /
supplementary keyword terms for the resource linked to. Non-whitehat SEOers
are wising up to the idea of a site targeting niches, and that considerably
widens the list of appropriate keywords.

By tackling long-tail keywords en masse - these are less competitive, easier
to rank for, and better correlated with buying intent - they gradually make
significant inroads toward ranking strongly for the big-head keyword in a
manner that looks more natural.

------
Ron_wayne
Google and Matt Cutts, in general, have a larger problem. Cutts never
acknowledges that Google works primarily on language, which is one of the
most complex things the human race has ever come up with. And though language
is always at stake in Google search, I don't see any serious semiotic study in
whatever they make.

And, apart from that, Google totally sucks on languages that are different
than English. They think of a new thing for the English based search and then
they'll just propagate it everywhere with no substantial modifications
whatsoever.

I've got many sites under me, all writing original content. They're not gonna
win the Pulitzer, of course, but at least they're edited and carefully
maintained by teams of human authors who won't even copy press releases, as a
measure to prevent non-original content from appearing on the sites. Despite
all of this, we got seriously pandalized.

The results where we once used to sit at the top are in many cases filled by
scraper sites that steal our content and rank 2 or 3, sometimes up to 5,
positions above us in the SERPs. I can't even count the times I've filled in
the anti-scraping form anymore. That's not enough, because many times popular
sites rank higher than us just because they're popular, even though their
content is horribly written, short, and totally uninformative.

That happens all the time. You know where we still go pretty strong? On Google
News, where the human intervention sometimes really applies.

Google should see what's going on everywhere, and their insistence on having
Matt Cutts as their only public voice on this huge issue is becoming really
frustrating.

He ends up looking like a fake good guy, perpetuating the hypocrisy of a
corporation too big and too convinced it can solve all the hyper-complex
search problems - laden with human-generated unpredictability and the natural
human tendency towards deception - with just pretty algorithms.

------
yaix
For years these penalties have often been mentioned on SEO forums. This is
nothing new, just another filter they will add. Or maybe it's just a little
FUD to keep people nervous?

------
itsprofitbaron
This is completely the _wrong_ thing to do, and in my opinion it means that
Google has essentially just admitted it's _bad_ at search.

Firstly, ecommerce sites and the like are _forced_ to SEO themselves due to
the lack of content - especially unique content (because there are only so
many ways you can say something is X length). This is even more apparent as
Google is placing its "Google Product Search" within the results as well.
Sure, you could add yourself to Google Product Search, but that's not the
point - Google _should_ be adding ecommerce sites to it automatically to make
it _a level playing field_. You can argue that this is hard to do, and that
might be the case, but there is proof in the marketplace that it is possible
- look at what TheFind etc. are doing in this space. Additionally, I think
there is a lot more that can be done in the Shopping Search space, as well as
in other areas of search, which I will cover below.

Having covered those, I will also highlight another problem which is
_hindering_ Google: their search engine is based around the PageRank
algorithm, which, despite evolving, is actually _hurting_ Google's attempts
to solve the problem of SEO.

I believe this is the case because of PageRank and the general Google Search
algorithms in place - their search at its core is based on 'citations', like
those in academic papers (yes, it has evolved over time), but it is still
loosely based on and developed upon this ranking system. Hence the problem of
paid links - although you can report them [1], this reporting method isn't
effective and doesn't work. Additionally, there are tons of ways to get links
really easily that appear natural, won't look overly SEO'd, and are
_extremely_ easy to game as well.

Google seems to be taking its search toward semantic and social approaches,
which will probably solve some of these problems, but everyone is working out
(and many have already worked out) how to game social search. Still, it is at
least a step in a decent direction.

Currently, Google just returns the links and documents it believes are the
most useful based on its PageRank/keyword approach, and even if it can
improve search to a much greater degree, the issue is also tied to
AdSense/AdWords.

This is because these two print money for Google. For instance, AdWords is
really used for Google Search (yes, it feeds into AdSense as well, but
advertisers really want their ads on Google Search results). The last
statistic I heard about Google Search is from around 2007, when Google was
making $0.12 on average per search - you could probably calculate how much
Google is making now by looking at their income and dividing it by publicly
available search numbers, but I'm going to say it's more than that, because
in 2004 they were making $0.09 per search. This is actually an issue for the
user, because I believe Google really just wants its ad results to be perfect
- they don't really want you to have to click on the organic result, because
then Google isn't making any money - and as long as their results are good
enough, they're OK with that. If you don't believe me, take a look at [2]:
that's a whole lot of ads above the fold. Oh, and if you try to do the same
yourself, be prepared to be punished [3].

Now AdSense: sure, it doesn't make that much of Google's income, but it is
still highly profitable - if it weren't, they wouldn't be doing it. If Google
really wants to fix search, then it needs to review every single AdSense
site, and I mean every single one, since Google only reviews the site you
apply with. This leads to a huge problem, as MFAs (Made For AdSense sites)
slip through the net afterwards; if you look at those sites, they generally
have poor content and are optimised for the search engine, with the user
expected to leave either by clicking an ad or the back button. Sure, Google
will lose some AdSense income on its balance sheet, but if it wants to fix
search, this is also a good place to start.

Sure, they're trying to solve this issue and get to grips with it, but I
don't think they will any time in the near future. You can argue this fix
they have announced will solve some SEO problems but, as I said in my opening
sentence, I believe this is the _wrong_ approach, and the reasons above
highlight several issues already, with tons more besides. In fact, I actually
hope you disagree with me, because not only do I value your opinion, I'm also
interested in the other aspects you might add.

[1] <https://www.google.com/webmasters/tools/paidlinks?pli=1>

[2] [http://searchengineland.com/figz/wp-content/seloads/2012/01/6728099139_d5444820cf_z.jpg](http://searchengineland.com/figz/wp-content/seloads/2012/01/6728099139_d5444820cf_z.jpg)

[3] [http://searchengineland.com/too-many-ads-above-the-fold-now-penalized-by-googles-page-layout-algo-108613](http://searchengineland.com/too-many-ads-above-the-fold-now-penalized-by-googles-page-layout-algo-108613)

~~~
atesti
I don't need to see any ecommerce site in the search results anyway, so good
luck making them visible via SEO. It's annoying when searching for
information about a product. If I want to buy it, I go directly to Amazon,
not to Google search.

~~~
itsprofitbaron
You've actually just highlighted another problem with Google there as well ;)

Google doesn't really know what you're searching for; it doesn't understand
the _context_, which is why you see shopping results when you're just looking
for information. Yahoo actually tried to solve this with a slider offering
more shopping or more info results, but it didn't catch on, because users
don't want to click anything; they just want to type words into the box and
get the perfect answer - be it an ad or an actual result. If you make a user
do anything else, you've already lost the game.

This is another reason why I don't think Google will fix its SEO problem any
time soon, and I definitely don't think this current 'fix' will solve it
either.

------
janesvilleseo
I think this is going to be very interesting. I can understand the benefit of
it, but as an SEO, I also feel that because we are working in the dark, it is
as much an art as a science. To me, this just means I have to keep
fine-tuning what it takes. Also, I think it will be harder for those who play
on the wrong side of the tracks to rank well, which in turn may make it
easier for those who don't to rank well.

------
projektx
I am probably misconstruing things, as I often do and being simple-minded on
top of that, but this seems like a circle-jerk and makes me wonder what
happened to Tim Berners-Lee's musings about the Semantic Web, and why search
hasn't progressed in that direction in some exponentially powerful and
revolutionary way.

~~~
im3w1l
I believe the reason that we see so little of the semantic web is that
advertisement is such a dominant business model on the internet. Since it is
hard to embed ads in semantic web content, and the will to pay for it isn't
there, the sad result is that it is not worth producing.

~~~
Isofarro
Various bits of the lowercase semantic web (and its related uppercase
Semantic Web cousin) show through in Google results - mostly Google Rich
Snippets, such as reviews metadata. These mostly require human intervention
to enable on a site-by-site basis (sometimes by the site owner, sometimes by
humans at Google itself).

The deeper problem with the semantic web is that it's metadata, which tends
not to be visible to visitors. Because of that, it's easy to forget,
overlook, or leave incomplete, and it nefariously allows, invisibly, the SEO
over-optimisation that non-whitehat SEOers otherwise do in full view of the
human audience.

I wish it weren't true, but Cory Doctorow's portmanteau "metacrap" is
unfortunately still accurate ( <http://www.well.com/~doctorow/metacrap.htm> ).
We have made several small metadata improvements over the years, but not much
progress in substantially overturning Doctorow's original concerns.

Human beings are not great sources of accurate and up-to-date metadata, so we
rely on scripts and services to fill in the blanks we don't (publish times
being recorded in WordPress, for example).

Converting human content into metadata ends up being an automated,
hands-off affair, which means those automation tools need to be able to parse
and extract information and meaning from human prose. Very much what Google
has been doing since inception.

It's flawed because it relies on automated interpretation of prose. But there
isn't a viable non-flawed method of getting the same information without
imposing academic-like constraints to the Web.

~~~
projektx
Thanks for the Doctorow metacrap link, excellent reference.

------
Drbble
Does anyone else find that announcement weird? "Over-SEO" is just a kind of
low-quality content, so it has been a Google target for years. This
announcement seems like it's just trying to scare off SEO fanatics, or an
admission that Google has been using ridiculously naive linear scoring models
for keywords.

------
jakeonthemove
Well, that's pretty good news... if it works. There's a lot of garbage on the
first page just because the page/site has a lot of backlinks and has the whole
site SEO'd by the book (titles, keyword density, headers, etc.).

------
ZanderEarth32
As an SEO I am interested to see how this update goes. Google updates usually
have a 'shotgun' result: they might kill the intended targets, but the spray
also does a lot of damage to sites where it might be unwarranted.

~~~
andybak
Define 'unwarranted'. For every site with a good ranking there are probably a
dozen that rank worse but provide equally good content. Maybe just shaking
things up a bit regularly isn't a bad idea in itself.

~~~
ZanderEarth32
By 'unwarranted' I am referring to websites that might get dinged in this
update but are not overdoing the SEO, or possibly not doing any SEO at all. It
happened with the Panda update, so I can't help but think that it will happen
again.

~~~
Kiro
If you're doing SEO at all you should be punished, imo. Let Google do its job
without interference and attempts at manipulation, please. And no, I don't
believe there is such a thing as "whitehat SEO". Whitehat SEO is just common
sense (write good content and don't screw up your markup).

~~~
AznHisoka
That's like saying trying to sell your product is a sin.

Mind telling me how a consumer web startup is supposed to get people who are
interested in X to visit its site about X? This is the basis of SEO: matching
searchers to pages. It doesn't happen magically through wishful thinking and
writing good content, unfortunately.

~~~
Kiro
No, it's like saying that trying to lure people into buying something worse
instead of something better is a sin. Like a product manufacturer secretly
sneaking around in stores, hiding competing products and putting their own at
the front.

~~~
AznHisoka
In this world, there will always be people who will attempt to sell you a
shoddy product, and snake oil salesmen.

But because that's the case, and you've got a quality product, you're saying
to sit tight, don't get out of the building, don't try to sell... and hope
people will eventually choose you?

So how exactly is a consumer startup going to get people who want to know X to
visit your site about X?

------
AznHisoka
The huge elephant in the room is that Google can kill so many startups and
small businesses practically overnight with an algorithm change.

People who say "create quality content, don't worry, you can always file a
reconsideration request if it's an accident" don't realize how Kafkaesque
Google really is. If an algorithm penalizes your site by accident and your
traffic drops by 99%, good luck trying to talk to an actual human being at
Google to get your problem fixed!

------
espinet
I wonder how Wikipedia will be ranked with this change, since there are few
to no outbound links.

~~~
Drbble
?? All the references are outbound links. A Wikipedia page with no outbound
links should likely be deleted for non-encyclopedic content.

------
sad_panda
Is there an SEO industry around Bing? What does Bing SEO look like?

~~~
jordhy
Many factors make Bing SEO different. Since Bing doesn't crawl as many pages
as Google, they are harsher on low-quality sites and on links coming from
"content farms". Specifically, Bing is:

\- Less likely to find results in forums

\- Harsher on results from “content mills” (even after Panda)

\- Less likely to give you ranking for a keyword if it is not found verbatim
in the page

\- Also provides way fewer results for phone number queries

------
benologist
I hope this destroys all the scabby blogs that link everything to more of
their own garbage - Mashable, Engadget, The Verge, etc. It'd be nice to see
them focus on not being shit instead of being rewarded for it.

------
Kiro
This is great news. I've always been surprised that the SEO industry can have
such legitimacy when it's basically as legit as false advertising.

~~~
jasonlotito
Do not confuse black hat SEO with white hat SEO. One is not the other.

Creating good, worthwhile content is a key part of SEO. If you write, or
create, good, worthwhile content, you are taking part in SEO. If you take the
time to write a headline that best reflects your article, and mark it up
correctly, that's part of SEO. If your site uses easy-to-read names, that's
SEO. If people link to your content and write about it, that's part of SEO. If
you make the description below your page title in Google's search results
actually worthwhile rather than some random snippet, that's part of SEO.

------
huntaub
This is the Laffer Curve for search.

------
fellowniusmonk
Thought 1: This may be overly cynical, but it is hard to believe that this
isn't just another way to dump more people into their adwords system.

People who "over optimize" on a large scale are probably the same people
making money off of well ranked SERP's, penalizing those people pushes them to
buy adwords.

They are in essence pre-qualified customers for adwords since they have
already demonstrated a willingness to invest real time/money in ranking. Sure
this will just trigger yet another race to optimize optimization, but until
things are refigured out more money will be dumped into adwords.

Thought 2: This SERP "market volatility" is a great way for Google and SEOs
to make a little more money.

This is also a good way of reminding some people how much their revenue
depends on Google: when you see an overnight dive in revenue, it captures the
attention/mind share of higher-ups who may have been taking their well-oiled
SERP machine for granted.

It's like a one night only "Google Dance" reunion tour, and gets people
obsessing over them again.

~~~
andybak
That sounds like a marvelous future. Everyone who can justify throwing money
at the problem can use paid search and everyone left in organic search can
wish them all good riddance and get on with producing content.

~~~
AznHisoka
Except this makes sites like Stack Overflow, Huffington Post, Mashable, Yelp,
and most sites that depend on advertising economically unsustainable. Before
you say "No..", remember that even Yelp, a great word-of-mouth resource, gets
over 70% of its visits through search engines. As does Stack Overflow.

What will be left will be personal bloggers, charities, and hobbyist sites. Be
careful what you wish for.

~~~
Drbble
HuffPo is a spam site that Panda should have killed. Stack Overflow is a
unique content site with practically no human-unfriendly SEO (except maybe the
huge list of related questions that tends to fool Google, where the search
term is split over two questions and neither is actually relevant).

------
shingen
The Google guys are so smart, with enough time they'll get all their
algorithms so optimized they'll eventually exclude every site from ranking on
the first page of results. It'll just be a blank white page, the ultimate in
minimalism and speed. It'll be an engineering feat not rivaled in the 21st
century.

~~~
sek
You are probably the same guy who complains that Google gets too spammy.

~~~
shingen
Nope, not that guy at all.

To clarify for your benefit, I wasn't complaining about Google's optimization
or lack thereof (spammy vs. not spammy). I think it's hilarious watching the
endless dance Google is going through because their fundamental approach to
search is broken and they're trying to drag that broken approach into the
future. It's like watching Microsoft with each iteration of Windows & Office,
trying to figure out how they can cheat death and drag 1980s software into the
future.

On a business level, I don't care about Google's survival or their
optimizations. My product doesn't benefit from SEO, nor from their search
engine. I'm indifferent to them. If they make a good product, great; if they
don't, someone else will eat their lunch eventually.

~~~
sek
There is some truth to that; the approach was never really good. There is just
an extreme need for it, and they are still the best one around.

I wonder what happens 20 years from now. Google can't win over SEO in the
long term; there are just too many people on the other side. As long as they
don't develop some AI, the results will get worse. I find myself using "site:"
more and more often to get what I want.

------
ktizo
The trouble with SEO as a business is that you are always second-guessing the
whims of people whose plans and actions are unknown to you, other than the
fact that they are almost certainly working against you, and who will usually
have much better resources available.

------
klbarry
Hard to believe that they don't already do this...?

