Using site speed in web search ranking (googlewebmastercentral.blogspot.com)
56 points by mattyb on Apr 9, 2010 | 15 comments



The funny thing is that since Analytics isn't gzipped nor served locally, it's usually a major contributor to the overall load time.


There's an asynchronous GA script available:

http://code.google.com/apis/analytics/docs/tracking/asyncTra...
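
For reference, the snippet on that page looks roughly like this (from memory, so treat it as a sketch; the UA-XXXXX-X account ID is a placeholder). It goes near the top of the page and loads ga.js without blocking rendering:

    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXX-X']);   // placeholder account ID
    _gaq.push(['_trackPageview']);

    (function() {
      // Inject ga.js with the async attribute so it doesn't block page rendering.
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
               '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();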


I hope the Google bot doesn't come along and measure my site's speed by hitting one page when cache is empty.


If the Webmaster Tools site performance chart is any indication, that's exactly what they'll do. Let's just hope they'll ensure a large enough sample before using this as a factor in search ranking.

In my experience the Webmaster Tools performance indicator is terribly inaccurate for small sample sizes (relatively unpopular sites).

For one site I have intimate details of, Googlebot visits roughly one page every three minutes. I have every indication (knowledge of what the site is doing, local and distributed load testing, analysis of server logs, etc.) that the site performs consistently and well, but the Site Performance chart on Google's Webmaster Tools is all over the place: anywhere from 0.4 seconds to load at the 99th percentile (which is where it happens to be now, and is probably slightly overestimating the site's performance) to more than 9 seconds to load, oscillating pretty wildly. I'm fairly confident this isn't accurate (unless GA, the only external component on the site, is itself to blame).

On other, more heavily trafficked sites the Site Performance chart seems much closer to reality, in line with our load test results and anecdotal experience, so whatever the issue is, it seems to wash out when they take a large enough sample size.

Edit: I see in other comments here that Google is using end-user data from the Toolbar to track this. That's helpful information and may explain the variability I and apparently others have seen. Again, this is a good reason to hope they'll only take this into account for large sample sizes. Otherwise, hope that your customers aren't using dialup connections.


From what I understand they are using measurements from visitors that are nice (or ignorant) enough to report the time it took them to get the page to display in their browser.

That will include a whole lot of empty caches. But then it will be representative as well.


This post encouraged me to take a look at Google Webmaster Tools -> Labs -> Site Performance for my site. It says 4.3 seconds average load time, "slower than 63% of all sites", and the graph since November shows tremendous variability, varying between 2-6 seconds. I don't understand where 4.3 seconds comes from (even with an empty cache, it's maybe 1.6 seconds to onLoad), unless they're using some old ISDN line or something. And I don't know where the variability over time is coming from, with a lightly loaded server that hasn't changed much in that time.

Is anyone else having similar weirdness with their Site Performance graphs?


Do you have a lot of visitors? I have my Webmaster Tools interface in Swedish, but it is telling me that my 1.5 sec average (faster than 84% of all sites) is of low accuracy ("låg tillförlitlighet") and has fewer than 100 data points.

The worst offenders for my site's speed are Google AdSense and Google Analytics, both of which I run on it. I find that ironic.

From what I understand the time is sampled from visitors, and includes the time from the page request until the page is fully rendered.
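
If you want to see roughly what that measurement looks like from the page's side, here's a rough sketch (purely illustrative, not how the Toolbar actually does it): stash a timestamp in an inline script at the top of the page, then report the delta at onload, e.g. as a GA event. The category/label names below are made up:

    // As early as possible in the page, before subresources start loading:
    var pageStart = new Date().getTime();

    window.onload = function() {
      // Rough "fully rendered" time as this visitor experienced it.
      var loadMs = new Date().getTime() - pageStart;
      // Report it back as a GA event (category/action/label names are illustrative).
      _gaq.push(['_trackEvent', 'Performance', 'onload', location.pathname, loadMs]);
    };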

I'd love to know what people who get their times well under a second are doing.


Ah, I see now... I didn't realize that this was end-user data via Google Toolbar opt-in. I've got "medium accuracy" 100-1000 data points, and I'm assuming that means total over the full 5 months. In that case, it doesn't surprise me that some combinations of visitors and connections might have long load times. But yes, the fact that non-visible elements seem to make a difference is not great.


There is a fair amount of variability in the data, just because it depends on who happens to be browsing your site (data is collected from people who have the Google Toolbar installed and the PageRank feature enabled). So you kind of have to look at the basic trend over multiple weeks.

They don't say, though, whether this is the same data being used for the ranking factor.


If it is the same data they use for the ranking factor, doesn't this give me a way to tank my competitors' rankings? All you'd need to do is install the Toolbar in browsers on some low-bandwidth or throttled internet connection and browse the sites that rank above you.


All of my sites on Finnish servers have a load time under 1.5 seconds. Everything in Canada or the States fluctuates between 1.5 and 7 seconds.


"Fewer than 1% of search queries are affected by the site speed signal in our implementation"

It's pretty clear they're just using this to weed out broken sites that take forever to load. If your visitors aren't already bouncing because of long load times, you probably have nothing to worry about.


I don't know why you're being downvoted. While I think the article suggests that they are giving a slight boost to sites that give a better user experience, I think your point is a much more likely reality.


What happens if my site is targeted at one country only, and the speed from the rest of the world is slow?


I was tempted to upvote this one just because it wasn't about the changes to the iPhone SDK agreement.



