
Pages With Too Many Ads Above The Fold Now Penalized By Google - coupdegrace
http://searchengineland.com/too-many-ads-above-the-fold-now-penalized-by-googles-page-layout-algo-108613
======
ryanwaggoner
Yeah, too many ads above the fold sucks:
[https://img.skitch.com/20120121-rwyw8bm7mhu86ptrjuq1mpfqg.jp...](https://img.skitch.com/20120121-rwyw8bm7mhu86ptrjuq1mpfqg.jpg)

~~~
resnamen
I count three ads above the fold. I believe the results below those top ads
are still search results, just presented in different modalities.

~~~
ryanwaggoner
Regardless of whether the product search stuff counts as an ad or not, the six
items on the right definitely do.

~~~
Skroob
The three in the center are in a box that says "Ads". I think they count. And
all six down the right side count too. Targeted ads are still ads, even if
they're relevant.

------
joejohnson
Yep, only Google's search results page gets to be top-heavy with ads.

~~~
staunch
Completely true.

[http://www.google.com/search?btnG=1&pws=0&q=breast+r...](http://www.google.com/search?btnG=1&pws=0&q=breast+reduction)

There are 11 paid links and 10 organic. At least 75% of the above-the-fold
content is ads.

~~~
epochwolf
I don't see any ads actually.

<http://dl.dropbox.com/u/361483/archive/google_search.png>

~~~
blutonium
This is how it looks without AdBlock.

<http://imgur.com/3pOyK>

~~~
wildmXranat
To be fair, I only see the three ads above and none on the right. Still, is
this what Google refers to as an 'above the fold' location?

~~~
chc
Yep, that is what they're referring to. The term comes from newspapers, which
are shipped folded across the middle, so that only the half of the front page
that's "above the fold" is immediately visible. The web design community
adopted the term with essentially the same meaning: the part of the page
that's visible when you first get there. So yes, those ads are decidedly above
the fold.

------
robk
This seems to contradict the historic AdSense guidance to run up to three ad
units and place them generally toward the top and left side of the page (per
the various pieces of heatmap research often cited). Seems like this could
also be pretty easily gamed, depending on how much of a CSS wizard you are,
since I suppose the algorithms are somewhat limited in analyzing CSS or
JavaScript versus plain HTML layout.
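To make the gaming concern concrete, a hypothetical sketch (the `ad-slot` class, the markup, and the function are all invented for illustration): a crawler that only inspects raw HTML sees a clean page, while one that executes JavaScript would see the script-injected ads.

```javascript
// A naive, HTML-only "layout analyzer": it counts ad markers in the raw
// source, so ads injected by script after page load are invisible to it.
function countStaticAds(html) {
  return (html.match(/class="ad-slot"/g) || []).length;
}

// The raw markup ships with no ad containers; a script adds them on load.
const rawHtml =
  '<body><p>Actual content</p>' +
  '<script>/* document.body.appendChild(adSlotDiv) on load */</script></body>';

console.log(countStaticAds(rawHtml)); // 0 — the page looks ad-free to this analyzer
```

A crawler with a full rendering engine defeats this, since it sees the DOM after the script has run.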

~~~
AdamTReineke
There was an article a while back about how Googlebot is likely a version of
Chrome that fully renders a page to process it. [1] You're right: without a
full browser engine, you could pull a whole host of tricks to make the page
appear clean and then insert ads later. Since Googlebot is probably a
full-featured browser, it becomes a lot harder to trick.

[1] <http://ipullrank.com/googlebot-is-chrome/>

~~~
HNatWORK
There are definitely at least two different, shall we say, classes of
crawlers: Googlebot and the Google Web Preview crawler. I don't know the
extent of Googlebot's JavaScript parsing, but the Web Preview crawler appears
to parse JavaScript like plain WebKit.

Here's a link where they show the UA for the Google Web Preview crawler:
[http://www.webmasterworld.com/search_engine_spiders/4353651....](http://www.webmasterworld.com/search_engine_spiders/4353651.htm)

I believe this crawler renders your pages for the preview snippets you get in
search results when you hover over the arrow that appears on the right side of
a result.

The previews I've seen would only look that way if JavaScript was being
rendered and allowed to run for ~10-20s, by my estimation, based on the
progress of an animation that was previewed.
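For anyone who wants to log these hits separately, a minimal sketch of telling the two apart by user-agent substring (the matching tokens come from UA strings reported in threads like the one above; treat the exact logic as an assumption, not a spec):

```javascript
// Rough server-side classification of Google's crawlers by UA substring.
// "Google Web Preview" and "Googlebot" are the tokens these crawlers report;
// any UA can be spoofed, so this is a heuristic at best.
function classifyCrawler(userAgent) {
  if (userAgent.includes('Google Web Preview')) return 'web-preview';
  if (userAgent.includes('Googlebot')) return 'googlebot';
  return 'other';
}

console.log(classifyCrawler(
  'Mozilla/5.0 (en-us) AppleWebKit/534.14 ' +
  '(KHTML, like Gecko; Google Web Preview) Chrome/9.0.597 Safari/534.14'
)); // "web-preview"
```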

------
TylerE
Rather ironic to see this on a site so crowded with obnoxious social-media
cruft that it actually took me about 5 seconds to find the article text...

~~~
RuggeroAltair
Well, I guess they got penalized and got upset. But it's honestly good that
such pages are penalized. Time is precious nowadays, and regularly wasting it
just to look past the advertisements is very annoying.

------
zone411
This change also needs to penalize pages with full-screen ads that are only
shown some of the time (the NY Times, among others, does this). Can anyone at
Google confirm or deny whether these types of pages are being penalized?

~~~
j_col
Since when did Google become the Internet police? Seriously, I don't think
it's their role to penalize anything you find annoying online.

~~~
kc5tja
It's Google's service, it's Google's web-spiders, and it's Google's attempt to
deliver useful content to their own customers. How is it _NOT_ in Google's
interest to police the links they present to visitors of Google's own
properties?

If you don't like it, use Yahoo!.

~~~
j_col
But where does it stop? Is it OK, for example, for them to downgrade a site
based on political content? Or religious content? It's not much of a jump from
"we don't want you to visit this site because it has too many ads" to "we
don't want you to visit this site because we don't believe in its views".

> If you don't like it, use Yahoo!.

Thanks for the tip, but I prefer DuckDuckGo.

------
coob
Advertising company penalises others who sell advertising, news at 11.

------
studentrob
Cool, I didn't know about the browser sizing tool:

<http://browsersize.googlelabs.com/>

------
chc
I wonder if this isn't at least partly a sneaky way to penalize pirate sites
without needing to wade into the intractable question of copyright. There
seems to be a strong correlation between a site's sketchiness and the
likelihood that it will plaster obnoxious ads all over the top of every page,
and the timing is weirdly coincidental if not.

(I really hope this doesn't derail one of the few non-SOPA threads. But that's
the most relevant motivation I can think of for this change.)

~~~
lukeholder
Any serious 'pirate site' has absolutely no ads. The only 'pirate sites' that
have ads are the ones open to the public, and they are not so serious, but
rather unreliable, with no community. I am guessing you are not a member of a
private tracker.

~~~
chc
I said _pirate_, not _private_. Private trackers may be where the most
hardcore pirates hang out, but they're still only a fraction of all pirate
sites. What you're saying here is precisely cognate to the No True Scotsman
fallacy.

------
re_format
So many issues with the internet seem to devolve into the same thing: a fight
over who gets to show ads to the naive user.

Eventually Google itself will be showing "too many ads above the fold". Does
anyone doubt it?

Gaming the search engine to be numero uno on the SERP is one thing. But
proclaiming a penalty for websites that have "too many ads"? That seems like
something for users to decide, not Google. Not to mention hypocritical. Can we
penalise Google for "too many ads"?

~~~
magicalist
Huh? That's why you go to a search engine -- so that it decides what good
content is and the user doesn't have to (or at least they only have to do so
in a dramatically-reduced-dimensional space).

When Google sends me to the worst of these kinds of sites, I become extremely
annoyed...at Google. So yes, we can and should "penalize" Google, but on
metrics like quality of results (which includes the ads being shown). In an
ideal world I'm replacing having to separate the wheat from the chaff of the
entire internet with having to separate the wheat from the chaff of the search
engine market, and I'm going to favor a search engine that does a better job.

In other words, as a user, "since Google is a website that uses ads, and
they're going to favor websites that use fewer ads, aren't they hypocritical?"
is not a question I care about even a little.

What I do care about, among other things, is having a search engine that
doesn't show me useless crap.

~~~
re_format
The fact that they have to make changes to their system in order to not have
useless crap appear at the top of the results tells us something: either
people are searching for crap or the portion of the web Googlebot is crawling
is full of crap.

Neither is something the search engine can fix for you.

With respect to the latter idea, the search engine may in fact be contributing
to it by encouraging more crap to be created, because it easily percolates to
the top of their "intelligent" results and users blindly click on result #1.
And no doubt many users see these results as equivalent to "the web". Whatever
Google returns, to them, that's "the web".

You can think about the web through the lens of "search engine results" and
evaluate the web based on whatever is returned from your search engine
queries.

Or you can think of the web as a huge mess of websites some of which are
useful, most of which are crap and many of which an aggressive search engine
might index.

Are you evaluating search results, or websites?

I'm evaluating websites, individually. Because that is what the web is. To me,
Google is not the web. Google might give me some clues about some sites. They
do an enormous amount of grunt work crawling them.

But it's up to me to do the final evaluation. To decide whether a site is
useful or whether it is crap.

And there are other ways to discover websites besides using Google. How do you
think Google learns about existing and new websites? Voluntary disclosure by
the webmasters?

It sounds like you want someone to evaluate websites for you. I doubt you are
alone in that regard.

This is not a new problem.

However, unlike you, I do not see Google as providing any viable solution.

~~~
icebraining
_The fact that they have to make changes to their system in order to not have
useless crap appear at the top of the results tells us something: either
people are searching for crap or the portion of the web Googlebot is crawling
is full of crap._

No, it means the ranking algorithm is evaluating the results wrongly. Which is
what they're trying to fix.

 _With respect to the latter idea, the search engine may in fact be
contributing to it by encouraging more crap to be created, because it easily
percolates to the top of their "intelligent" results and users blindly click
on result #1. And no doubt many users see these results as equivalent to "the
web". Whatever Google returns, to them, that's "the web"._

But that's the point, isn't it? It _shouldn't_ easily percolate to the top.
That's what their algorithms are for. If it does, they need to be fixed.

 _Are you evaluating search results, or websites?

I'm evaluating websites, individually. Because that is what the web is. To me,
Google is not the web. Google might give me some clues about some sites. They
do an enormous amount of grunt work crawling them.

But it's up to me to do the final evaluation. To decide whether a site is
useful or whether it is crap._

I don't get what you mean by "Google being the web". Of course the final
evaluation is up to the user. But if Google can rank the results more like you
would, you're wasting less time clicking through the crap to get what you
want.

 _And there are other ways to discover websites besides using Google. How do
you think Google learns about existing and new websites? Voluntary disclosure
by the webmasters?_

Actually, they do that too. But mostly by painstakingly loading every link
recursively, something which is obviously impossible for a person to do unless
they want to be limited to 0.0...01% of the web.

------
PaulHoule
i'm pretty amazed that they didn't do this a long time ago... sites with no
content above the fold have been a problem for a long time...

~~~
RuggeroAltair
I think they have been trying for a long time, but while it's possible to
conceive of the idea in a second, it's harder to find the right algorithms to
do it effectively. After all, they make changes every day and study the
response of users, randomly distributing new versions of the algorithm to a
small group of users who are effectively beta testers without even knowing it.
It's a known problem, but it can be tricky. For example, it's massively clear
for webpages with keywords like "iphone" and "jailbreak" that have pretty much
just ads, or want to sell you something, without really any content on the
page. But if you're trying to find, say, something about a scientific paper on
blogs, you don't want to penalize the page of a good blogger who has a few ads
in favor of a crazy crackpot who has their own idea about the universe and
knows nothing about real science, just because their pages have no ads.

~~~
PaulHoule
well, i think also Google has a motivation to make the search results as bad
as they can get away with.

a crappy page with nothing but ads is going to get clicks, and since Google is
the #1 ad network, it means more money for Google.

if Google organic search results were perfect, people would never click on the
ads. the worse the results are, the more likely the ads are better and you get
trained to click... and ker-ching!

Google needs to be good enough to (i) discourage mass defection to Bing and
DuckDuckGo, (ii) not produce public outrage, as happened when A-list bloggers
were getting outranked by duplicate content, and (iii) not get in trouble for
antitrust. (Hint: if you want to run a spam farm, buy a second- or third-tier
search engine.)

------
X-Istence
> "(Side note, that yellow color around the ads in the screenshot? It’s much
> darker in the screenshot than what I see with my eyes. In reality, the color
> is so washed-out that it might as well be invisible. That’s something some
> have felt has been deliberately engineered by Google to make ads less
> noticeable as ads)."

Sounds like the author needs to go through the process of correctly setting up
his monitors using the Apple-provided tool in System Preferences. I have
absolutely no problem seeing the yellow box in Google's search results.

~~~
Samuel_Michon
My monitor is calibrated properly, so I checked.

The color yellow in that screenshot is very faint. It's HTML color #fef5e4,
which is 10% yellow and little else.

Then I compared it with the ad background on a live SERP, which was rendered
as #fff, aka 100% white.

Sounds like the author was right.
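In case anyone wants to verify the "10% yellow" figure, it matches the HSV saturation of that hex value. A quick sketch (the helper function is mine; the colors are from this thread):

```javascript
// HSV saturation of a hex color: (max - min) / max over the RGB channels.
function saturation(hex) {
  const n = parseInt(hex.slice(1), 16);
  const r = n >> 16, g = (n >> 8) & 0xff, b = n & 0xff;
  const max = Math.max(r, g, b), min = Math.min(r, g, b);
  return max === 0 ? 0 : (max - min) / max;
}

console.log(saturation('#FEF5E4').toFixed(3)); // "0.102" — the screenshot's ~10% yellow
console.log(saturation('#FFFF00').toFixed(3)); // "1.000" — pure yellow, for comparison
```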

~~~
X-Istence
This is what I am seeing when Googling "trash cans":

<http://i.imgur.com/gJjCK.png>

To me that yellow stands out extremely well; I have no trouble seeing it.

The background in Google's CSS for that div with id "tads" is: #FDF6E5

This is for the search "credit cards":

<http://i.imgur.com/bPnkJ.png>

The background colour is still the same. I just asked around the office
whether anyone was misled regarding these ads, and no one else seems to have
any trouble seeing the yellow.

~~~
Samuel_Michon
Ah yes, that SERP does have an ad block with background color #FDF6E5.
However, that color is only 10% yellow. Objectively, it should present as a
very faint yellow.

If you're seeing a gold-ish yellow background, then your monitor is set to low
brightness or Gamma 2.2. That way, you won't see colors as they are.

~~~
X-Istence
My monitor is set for accurate colour representation. Gamma 2.2 is the norm
these days and is what websites, photography software, and video software all
expect.

Mac OS X used to be the lone holdout at 1.8 gamma, but those days are long
gone. (Snow Leopard changed the default to 2.2 to be in line with the rest of
the industry.) Windows has always been at 2.2 gamma.

------
badclient
Let's closely monitor YouTube's rankings in that case.

------
coreyspitzer
I skimmed the article, but I didn't see where Google draws the arbitrary line
it considers "the fold".

~~~
chauzer
I thought "the fold" was always the line below which you have to scroll to see
anything.

~~~
arantius
Yes, but for what combination of hardware and software?
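The check itself is trivial; the unknown is the viewport height, which differs per device. A sketch (pixel values are illustrative):

```javascript
// Whether an element starts above the fold is just a comparison against
// the viewport height, and that height varies wildly across devices.
function isAboveFold(elementTopPx, viewportHeightPx) {
  return elementTopPx < viewportHeightPx;
}

// The same ad block at y = 600px:
console.log(isAboveFold(600, 768)); // true  on a 1024x768 screen
console.log(isAboveFold(600, 600)); // false on a 600px-tall netbook viewport
```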

------
chauzer
Why do people still browse the web without an ad blocker?

~~~
j_col
Seriously, if you were a web publisher whose livelihood depended on revenue
from advertising, you'd be kinda glad that most people didn't.

------
re_format
Best way to find the content: don't use javascript.

~~~
resnamen
Then I can't see 80% of the "Show HN" stuff that enters the newsfeed!

~~~
re_format
Exactly. You get content only. Do you want to read, or do you want to click on
stuff? Most ads (stuff you click on) are JavaScript-driven. Most content (e.g.
text, like what you're reading now) is not. Articles on news sites like HN or
other similar sites render just fine without JavaScript. Converting them to
nicely formatted text is easy once the effects of JavaScript are eliminated.
Let the downvotes begin!

