

It’s Official: Google Now Counts Site Speed As A Ranking Factor - ilamont
http://searchengineland.com/google-now-counts-site-speed-as-ranking-factor-39708

======
tokenadult
This is a very user-friendly policy.

P.S. There is something to think about in the fact that the first time I tried
to post this comment, there was a problem loading the page. HN is actually one
of the slowest sites I put up with -- great quality is worth the wait, but most
websites don't have that kind of quality.

~~~
ShardPhoenix
It usually seems pretty fast to me, but maybe that's because I mostly browse
outside of US peak hours.

------
swombat
Interestingly, they appear to count loading times of third-party stuff (which
doesn't affect the display of the main body of content, e.g. the blog post
body) as part of the site load time.

This means that if you use Disqus and a few badges from Reddit and the like,
even if you set them up so that they don't slow down your main content, Google
will hold it against you. That's a little... not great.

~~~
snprbob86
How do you know that? Are you going by the Webmaster Tools performance page?
I'd imagine that view is a very simplified perspective on the actual metrics
that search uses for ranking.

~~~
CWIZO
Webmaster Tools reports a 10-20x greater response time than the actual one for
one of the pages I maintain (I haven't checked the others). So I would also
conclude that it measures the time until every last CSS, JS, and iframe is
loaded, and that is just not right. OK, I understand that they take CSS into
account, but my pages work without JS, and I only use iframes to display ads.
The point is that my page is readable long before the JS files and ads are loaded.

I'd also like to know whether they take caching of CSS, JS, and image files
into account.

~~~
landyman
Even though I couldn't find out from the Webmaster Tools report whether they
do, the Page Speed analyzer (a Firefox add-on from Google) does take it into
account, as does YSlow, another speed-testing add-on from Yahoo. I would think
Google takes it into account, because checking whether something is cacheable
is easy, and I don't think Google is stupid enough to _not_ check that.
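
A check like the one those add-ons do is roughly this (a rough TypeScript
sketch, assuming a runtime with a global `fetch`; the example URL is just a
placeholder):

    // Does an asset advertise itself as cacheable? This is roughly what
    // Page Speed / YSlow look at: Cache-Control max-age, an Expires date,
    // or at least a validator (ETag / Last-Modified) for cheap 304s.
    async function isCacheable(url: string): Promise<boolean> {
      const res = await fetch(url, { method: "HEAD" });
      const cacheControl = res.headers.get("cache-control") ?? "";
      if (cacheControl.includes("no-store")) return false;

      const maxAge = /max-age=(\d+)/.exec(cacheControl);
      if (maxAge && Number(maxAge[1]) > 0) return true;
      if (res.headers.get("expires")) return true;

      return res.headers.has("etag") || res.headers.has("last-modified");
    }

    // Hypothetical usage:
    // isCacheable("http://example.com/static/site.css").then(console.log);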

------
vaksel
I have two concerns here (well, three, but swombat covered that one).

1\. If you are going to penalize sites for being "slow", then how about telling
us what slow is? Is a site that loads in 10 seconds "slow"? Is a site with a
5 MB index page that loads in 15 seconds "slow"? How about some metrics so that
we can optimize properly?

And what will this do to a lot of content-rich sites? If you have a lot of
images/Flash/JavaScript, it sounds like you are going to get screwed for trying
to make a better-looking user experience.

2\. Of course slowing down Google's own search results would affect user
satisfaction: you are looking for results, and you want to do a lot of
searches. BUT when you are clicking through to the result you like, I think
most people would be willing to wait an extra 2 seconds to load the more
relevant information.

Sounds like this is yet another attempt at boosting big sites, where large
sites like eHow and Mahalo get preference in the results just because they can
afford faster servers.

~~~
nostrademons
Some of the very worst offenders are big sites; e.g. cnn.com takes about 2.5
seconds to load close to 100 external resources. Mahalo takes about 2 seconds
for about 40 external resources.

As for metrics: as a user, I can say that 10 seconds is too slow. So is a 5 MB
index page that loads in 15 seconds (what the hell do you need 5 megs of data
on your index page for? That's like a 5-minute YouTube video). I want to see
results within 1-2 seconds of clicking on a page; otherwise, you've broken my
train of thought and I need to mentally context-switch each time I visit a
page.

------
brandnewlow
Not pleased. It can cost a lot of money to have a fast site.

If you're a neighborhood blog trying to make a go of it, you can't afford a
developer to optimize your site and cache the crap out of it. Meanwhile, the
local newspaper site, running a tag archive page for your neighborhood powered
by Outside.in or some other McLocal scraper app, can do that. You lose every
time on the speed front, despite having original content.

~~~
eli
Not disagreeing, but I think it's often a matter of setting up or tweaking
your caching strategy, not throwing hardware at the problem.
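
For instance (a minimal sketch in TypeScript on Node's built-in modules; the
directory and max-age are made-up values, and it skips the path-safety checks a
real server needs), far-future headers on static assets mean repeat visitors
don't re-request them at all:

    import * as fs from "fs";
    import * as http from "http";
    import * as path from "path";

    const STATIC_DIR = "./public";        // hypothetical asset directory
    const ONE_YEAR = 60 * 60 * 24 * 365;  // seconds

    http.createServer((req, res) => {
      const filePath = path.join(STATIC_DIR, path.normalize(req.url ?? "/"));
      fs.readFile(filePath, (err, data) => {
        if (err) {
          res.writeHead(404);
          res.end("not found");
          return;
        }
        // Far-future expiry: the browser won't ask again for a year, so
        // cache-busting has to come from versioned filenames (site.v2.css).
        res.writeHead(200, { "Cache-Control": `public, max-age=${ONE_YEAR}` });
        res.end(data);
      });
    }).listen(8080);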

~~~
brandnewlow
Not too many writers can "set up and tweak their caching strategy."

~~~
easp
Sounds like a business opportunity to me. If only there were a site where...

~~~
brandnewlow
Exactly. Now it's more expensive to show up in search.

------
ComputerGuru
Google is saying my site takes 7 seconds to load... Well, it doesn't.

<http://neosmart.net/dl.php?id=1> is one of the slowest pages... according to
them.

Browse it and see for yourself. It's super fast.

TribalFusion and PubMatic take some time, as does the user-tracking JS, but (a)
it's not 7 seconds, and (b) they don't affect the actual content.

~~~
dkubb
FWIW, I just visited your link, and with a cold cache it took ~5 seconds for
the status bar to report that every asset was downloaded. Definitely not a
scientific measurement by any means, but I thought you might like to know.

I also ran the site through <http://www.webpageanalyzer.com/> (one of many
such services), and it said that on a T1 it would take approximately 6.5
seconds to load. It also suggests a number of improvements to cut down the page
size and improve rendering speed.

------
lwhi
You run a popular site, with little cash.

How can you afford to keep your access time down?

Host advertising, perhaps?

~~~
pavs
iframe ads.

Most popular sites use iframes to either asynchronously load JavaScript ads or
load them on a separate page, so that they don't affect your initial site
speed. Most popular ad platforms also offer iframe-specific ad codes; you just
have to ask for them (I know Adify does). If they don't offer iframe codes, ask
whether it's against their policy to load their code in an iframe; they might
make exceptions for high-traffic sites (Ars Technica loads all ads in iframes).
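
For what a friendly-iframe setup looks like, here's a rough browser-side
TypeScript sketch (the container id and ad tag URL are placeholders, not any
particular network's real tag):

    // Load an ad into its own iframe after the host page has finished
    // loading, so the ad's JS can't delay the visible content.
    function loadAdInIframe(containerId: string, adScriptUrl: string): void {
      const container = document.getElementById(containerId);
      if (!container) return;

      const iframe = document.createElement("iframe");
      iframe.width = "300";
      iframe.height = "250";
      iframe.frameBorder = "0";
      iframe.scrolling = "no";
      container.appendChild(iframe);

      const doc = iframe.contentWindow!.document;
      doc.open();
      // document.write here only blocks the iframe, not the parent page.
      doc.write(`<body><script src="${adScriptUrl}"><\/script></body>`);
      doc.close();
    }

    window.addEventListener("load", () => {
      // Hypothetical container id and ad tag URL.
      loadAdInIframe("sidebar-ad", "http://ads.example.com/tag.js");
    });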

For general optimization, Yahoo has an excellent resource page:
<http://developer.yahoo.com/performance/rules.html>. Using it, I was able to
bring my site's loading time from ~8-9s down to ~2-3s, running on a
not-too-powerful server.

Three optimizations that worked great for me:

\- A CDN for static files (MaxCDN has a cheap introductory offer of 1 TB for
$9.99 and supports origin pull).

\- Minify and gzip CSS and JS files, then serve them from the CDN (see the
sketch at the end of this comment).

\- A PHP opcode cache (APC, eAccelerator, or XCache).

I am trying to get under the 2-second mark now.
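
The minify/gzip step, as a rough sketch (TypeScript on Node; the file names are
made up, and the crude whitespace stripping stands in for a real minifier):

    import * as fs from "fs";
    import * as zlib from "zlib";

    // Concatenate the stylesheets, crudely strip comments and whitespace,
    // and pre-gzip the result so the CDN / web server can send it with
    // "Content-Encoding: gzip".
    const inputs = ["css/reset.css", "css/layout.css", "css/theme.css"]; // hypothetical

    const combined = inputs.map((f) => fs.readFileSync(f, "utf8")).join("\n");
    const minified = combined
      .replace(/\/\*[\s\S]*?\*\//g, "") // drop comments
      .replace(/\s+/g, " ");            // collapse whitespace (very rough "minify")
    const gzipped = zlib.gzipSync(minified);

    fs.writeFileSync("build/site.min.css", minified);
    fs.writeFileSync("build/site.min.css.gz", gzipped);

    console.log(`${combined.length} -> ${minified.length} bytes, ${gzipped.length} gzipped`);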

~~~
nostrademons
The absolute best thing you can do to reduce page loading time is to cut the
number of requests the browser has to make. Each request requires a round trip
to the server, with all the latency of a cross-country or cross-continent
trip. It incurs overhead for TCP/IP and HTTP headers. And most browsers limit
concurrent open connections to 2-6, so the bulk of those requests are
serialized and block further loading of the page.

Sprite all your images, so that it takes one request to get the whole chrome
rather than one per image. If you have multiple JavaScript files, concatenate
them. Same with your CSS. Obviously, gzip them and push them to a CDN if
possible, and cache them aggressively. And consider using data: URLs to inline
images directly into the page: the time saved on requests more than makes up
for the added bytes from base64-encoding the data.
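
A quick sketch of the data: URL inlining (TypeScript on Node; the icon path is
made up). Each inlined image is one less request, at the cost of roughly a
third more bytes from the base64 encoding:

    import * as fs from "fs";

    // Turn a small image into a data: URL that can be dropped straight into
    // CSS or an <img src>, avoiding a separate HTTP request.
    function toDataUrl(filePath: string, mimeType: string): string {
      const base64 = fs.readFileSync(filePath).toString("base64");
      return `data:${mimeType};base64,${base64}`;
    }

    // Hypothetical usage: emit a CSS rule with the icon inlined.
    const icon = toDataUrl("images/search-icon.png", "image/png");
    console.log(`.search-button { background-image: url("${icon}"); }`);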

~~~
pavs
Thanks for the tips. I'm already aware of most of these. You are very much
right about reducing the number of requests to a bare minimum. I was able to
cut down from 28 to 12, but the last few are tricky because some of the
connections are external files (ads, analytics, Disqus). Also, I don't think
it's the number of connections so much as the number of DNS lookups that causes
the delay, so 12 files spread across 3 different hosts is better than 5 files
spread across 5 different hosts.

Concatenation is also a good idea, but in some cases it breaks some of my
JavaScript.

Personally, I am happy with a ~2s average load time; I'm just trying to see how
far I can push that number.

------
kadavy
“Quality should still be the first and foremost concern [for site owners],”
Cutts says. “This change affects outliers; we estimate that fewer than 1% of
queries will be impacted. If you’re the best resource, you’ll probably still
come up.”

------
kpanghmc
What if the page you're looking for just happens to contain a lot of content /
images / etc.?

Adding speed to Google's ranking algorithms is only useful for searches with
several equally good results (in which case the fastest would be the one you
want). But when you're actually searching for a lot of information, having fast
(but less informative) sites rise to the top would be detrimental.

~~~
donaldc
Speed is only one factor in their ranking. If a site is notably better than
other sites for a query, they said in the article that it will still rank
first for that query.

------
jbyers
If it is true that non-HTML resources are a factor based on Google Toolbar
reporting, that's scary. On high-confidence sites -- more than 1,000 data
points, in Google Webmaster terms -- average speed seems stable and correlated
with other performance measures. In my experience that is not true of
medium-confidence sites, with 100 to 1,000 Toolbar data points.

I'm hopeful that Googlebot is the primary signal.

------
Raphael
IIRC, this has been a factor on AdWords for some time.

------
nfriedly
I like this move. I hate it when I click a link and it ends up taking 20+
seconds to load.

------
spokey
Just BTW, there's some related HN discussion from an earlier post at
<http://news.ycombinator.com/item?id=1253528>.

------
luckyland
Yet another standard of measurement that is not directly related to the number
one reason anyone uses Google: finding the most appropriate content.

------
metamemetics
How would Googlebot measure page-load speed accurately? Wouldn't it ignore a
slow site's cruft-filled JavaScript and stylesheets?

~~~
eli
The Google toolbar

~~~
metamemetics
Who uses the Google Toolbar besides old IE users who need popup blockers? It
seems like very high-variance data, depending on the end user's connection;
perhaps they get info from Chrome usage statistics?

~~~
eli
Dunno, but according to this article that's what they're doing:
[http://searchengineland.com/google-now-counts-site-speed-
as-...](http://searchengineland.com/google-now-counts-site-speed-as-ranking-
factor-39708)

If every site is being judged by the same pool of toolbar users, then it's
fair, right? It's only the relative speed that matters.

