Hacker News

One would assume that they are clever enough to have built in safeguards to prevent anything from running too long or using too much processing power.



Not just that, I would also assume that PageRank will penalise your site if your JavaScript takes too long to execute.


Looking at Google Webmaster Tools, I see a significant decline in my reported site performance starting in September, even though by my own measures my site's speed has improved significantly since then. Assuming this is due to our 'next' feature, which pulls content in via AJAX, I'm going to disallow the 'next' URLs in robots.txt and cross my fingers.
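For what it's worth, a disallow rule along these lines might look like the following; the `/next/` path is only a placeholder, since the actual URL pattern isn't given:

```
# Hypothetical rule: keep Googlebot away from the AJAX 'next' pages
User-agent: Googlebot
Disallow: /next/
```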


If you see significant positive results, write them up and let us know.


The Google Webmaster Tools site performance number is measured from multiple data points, which can include:

- People on dial-up (yes, these still exist)
- People in other countries; if you have not taken care of caching or a CDN, your website may load slower for far-away visitors

It represents an average across all the data points being measured.

The data is captured through:

- Google Toolbar
- Google Chrome
- Google DNS services

Your own observations of speed and improvement will not always match the aggregated data Google has access to.

I'm not sure what you are trying to accomplish by disallowing the 'next' URLs in robots.txt. Can you explain your hypothesis for this test, and how you would measure success?


I guess disallowing won't work if what you say is correct, i.e. Google doesn't measure site performance with Googlebot. That wouldn't explain the slowdown since September, though, unless the services you mentioned also started counting JS execution time toward site performance around then.

Perhaps a solution would be to trigger the AJAX on mouseover, but that seems kludgy. In my case, I need to make the AJAXed content part of the initial page load anyway, for the sake of user experience. But I can see cases where Google should not count AJAX as part of the page load time. God forbid somebody is using long polling, for example. Maybe Google handles this in a smart way, looking at what changes after the AJAX completes and deciding whether it should count toward page load.
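The mouseover idea can be sketched as a one-shot guard around the loader, so the AJAX call fires at most once no matter how many times the pointer passes over the link. The `fetch` URL and the element wiring below are hypothetical, just to show the shape:

```javascript
// Sketch: wrap a loader so it runs at most once, suitable for
// firing an AJAX prefetch on the first mouseover only.
function once(loader) {
  let called = false;
  let result;
  return function (...args) {
    if (!called) {
      called = true;
      result = loader(...args); // first trigger: actually load
    }
    return result; // later triggers: reuse the cached result
  };
}

// Hypothetical browser wiring (commented out so the sketch runs anywhere):
// const prefetchNext = once(() => fetch("/next?page=2").then((r) => r.text()));
// document.querySelector("a.next").addEventListener("mouseover", prefetchNext);
```

Note that `addEventListener` also accepts a `{ once: true }` option in modern browsers, but caching the result as above means repeated hovers reuse the same in-flight request.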


Whether you need to worry about what Google reports for page load time really depends on what you are trying to accomplish. In general, it's always good to pay attention to page load times, regardless of what Google makes of them!

You can experiment with asynchronous calls, or with lazy-loading jQuery scripts that kick off after the headers and HTML framework have already been loaded.
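That "kick off after the initial HTML" pattern can be sketched as a small scheduler: run non-critical work when the browser is idle, falling back to a zero-delay timeout where `requestIdleCallback` is unavailable. The `task` is a stand-in for whatever script setup you choose to defer:

```javascript
// Sketch: run a non-critical task after the initial page work has finished.
// Uses requestIdleCallback when the environment provides it, otherwise a
// zero-delay timeout, so the task never blocks the first render.
function deferUntilIdle(task) {
  const schedule = (typeof requestIdleCallback === "function")
    ? requestIdleCallback
    : (fn) => setTimeout(fn, 0);
  return new Promise((resolve) => schedule(() => resolve(task())));
}

// Usage (hypothetical widget): deferUntilIdle(() => initCommentsWidget());
```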

Overall, I would not worry too much about the reports in Google Webmaster Tools; just try to get faster in general.

If you are serious about delivering an ultra-speedy service online, there are services that can test your site or application from multiple locations, on different OSes and connection speeds, or with different browsers. But these services are pricey, trust me on that one!


Better to say "that Google will penalize your site". PageRank is a calculation on the link graph of the web, and nothing else.
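To underline the point: PageRank is a fixed point of a computation over the link graph alone; there is no page-speed term anywhere in it. A minimal power-iteration sketch over a toy three-page graph (the 0.85 damping factor is from the original paper; the graph itself is made up):

```javascript
// Minimal PageRank power iteration over an adjacency list.
// links[i] lists the pages that page i links to.
function pageRank(links, damping = 0.85, iterations = 50) {
  const n = links.length;
  let ranks = new Array(n).fill(1 / n);
  for (let it = 0; it < iterations; it++) {
    const next = new Array(n).fill((1 - damping) / n);
    for (let i = 0; i < n; i++) {
      if (links[i].length === 0) continue; // dangling page: drop its rank (simplification)
      for (const j of links[i]) {
        next[j] += damping * ranks[i] / links[i].length; // share i's rank among its outlinks
      }
    }
    ranks = next;
  }
  return ranks;
}

// Toy graph: 0 -> 1, 1 -> 2, 2 -> 0 (a cycle, so all ranks converge to 1/3)
// pageRank([[1], [2], [0]])
```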




