

Google PageSpeed Service Reviewed - xpose2000
http://x-pose.org/2012/12/google-pagespeed-service-review/

======
cbr
With defer_javascript on, browser-reported page load times tell only part of
the story. The problem is that what browsers report back to your analytics
service is the amount of time that passes before the onload event, while
defer_javascript [1] postpones javascript execution until the onload event.
This means that with defer_javascript off you were counting javascript
execution time, but when you turned it on you stopped counting it.
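
The measurement gap described above can be sketched with a small helper
(hypothetical function name; the Navigation Timing fields are real browser
properties, in milliseconds since epoch):

```javascript
// Sketch: the load time an analytics beacon typically reports.
// Deferred scripts that execute after onload are not included here.
function reportedLoadMs(timing) {
  return timing.loadEventStart - timing.navigationStart;
}

// In a browser you would call this from a load handler, e.g.:
//   window.addEventListener('load', function () {
//     var ms = reportedLoadMs(window.performance.timing);
//     // ...send ms to your analytics service...
//   });
```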

We're trying to optimize something like "time until the page is visually
complete and usable", and there's not currently a good metric for that. Speed
index [2] does visual completeness well, but I don't know of any algorithmic
way to measure time-until-usable.

(I work on the PageSpeed team at Google, mostly on ngx_pagespeed.)

[1]
[https://developers.google.com/speed/docs/mod_pagespeed/filte...](https://developers.google.com/speed/docs/mod_pagespeed/filter-js-defer)

[2]
[https://sites.google.com/a/webpagetest.org/docs/using-webpag...](https://sites.google.com/a/webpagetest.org/docs/using-webpagetest/metrics/speed-index)

~~~
cbr
Checking with some people here, it's actually more complicated than this: what
I described is true for IE9 and below, but for Firefox, Chrome, and IE10
measured page load time will include execution of these deferred scripts.

~~~
xpose2000
I appreciate the responses and updated the article with links to your
comments. I've been very impressed by what I've seen so far!

I was going to include Google Analytics site speed data, but it seemed less
accurate than NewRelic's Real User Monitoring.

~~~
cbr
What seemed less accurate? They should be doing almost exactly the same thing.

~~~
briandoll
Google Analytics site speed data is based on only 1%[1] of your traffic, while
New Relic's Real User Monitoring aims to track all of it. I'm sure that can
lead to differences in accuracy.

(Disclosure: I used to work for New Relic)

[1]
[https://developers.google.com/analytics/devguides/collection...](https://developers.google.com/analytics/devguides/collection/gajs/gaTrackingTiming#sampleRate)

~~~
mh-
Provided you have enough traffic, sampling shouldn't affect the results in any
meaningful way.
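
mh-'s point can be sketched with synthetic data (all numbers here are
illustrative, not taken from the article): a 1% sample's average load time
tracks the full population closely once the traffic volume is large.

```javascript
// Average of an array of numbers.
function mean(xs) {
  return xs.reduce(function (a, b) { return a + b; }, 0) / xs.length;
}

// Synthetic "load times": a deterministic repeating pattern, 200-1200 ms.
var loadTimes = [];
for (var i = 0; i < 100000; i++) {
  loadTimes.push(200 + (i * 37) % 1001);
}

// A 1% sample: every 100th measurement.
var sample = loadTimes.filter(function (_, i) { return i % 100 === 0; });

var fullMean = mean(loadTimes);
var sampleMean = mean(sample);
// With 100,000 "pageviews", the sampled mean lands very close to the
// full-traffic mean; the gap only matters at low traffic volumes.
```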

------
cbr
> Not to be confused with mod_pagespeed, which is an Apache module

PageSpeed Service proxies your site, but aside from that the optimizations it
makes are very similar to mod_pagespeed (and ngx_pagespeed). Which makes
sense: they're closely related Google projects.

(I work on all three, but mostly ngx_pagespeed these days.)

------
dubcanada
I'm not sure that "cloudflare has been getting worse and worse", which links
to a single page written by you referencing a single twitter post, really
helps the article any.

However, it was mostly a terrible read.

~~~
druiid
Yeah, what's with this 'Cloudflare is terrible and will make your pages slower
!!11!!' stuff around HN lately?

I pass at least a few million req/day through their service, and only once in
a good while are there hiccups.

~~~
xpose2000
I linked to what GoogleBot thinks of Cloudflare:
[http://x-pose.org/wp-content/uploads/2012/12/topiama-googleb...](http://x-pose.org/wp-content/uploads/2012/12/topiama-googlebot.png)

Cloudflare response times to the far left and PageSpeed response times to the
far right. I'm not sure what else to say.

~~~
druiid
No offense, as I know little about your website, but you're one site out of
thousands, if not tens of thousands, on Cloudflare at this point. Your
results, without substantial research and supporting evidence that this is a
trending problem with other sites, are not enough to say (per your article)
'As I mentioned earlier, avoid Cloudflare at all costs'.

First, have you contacted Cloudflare support regarding your purported page
slowdowns? If not, they are actually very good about both fixing things and
getting back to you, even on the piss-ant level plans.

Second, what optimization features exactly do you have enabled? Do you set
proper cache headers? Those are especially important to Cloudflare and without
them being properly set (especially on JS and CSS objects) I imagine your page
speeds will be pure crap.

That isn't to say that Cloudflare is without issues. We've had them; they
exist. But if you're going to present an issue and blame it on the service, do
your homework first and present full evidence; otherwise don't spread it,
because as it stands it's just an unproven claim.

~~~
xpose2000
Yes, I have contacted Cloudflare about slowdown issues. I tried to use
Cloudflare across a few sites with many weeks of testing. I agree that they
are extremely helpful and responsive. One specific support ticket was about
the time it took for ajax requests to complete: Cloudflare was adding up to
one FULL second of latency. I also mentioned response times reported by
GoogleBot. They had no real answers, though they did acknowledge some Ajax
slowdown that was supposedly fixed.

Some days are better than others, but overall Cloudflare did not speed
anything up. Perhaps free users don't get as good a performance boost?

I had similar features enabled with Cloudflare as I do with Google PageSpeed.
The sentence right before the one you quoted says: "The response times to the
far left are a result of Cloudflare's Full Caching features with Rocket Loader
enabled."

Essentially that graph shows what happens when all of Cloudflare's "CDN + Full
Optimizations" features are enabled compared to most of Google's PageSpeed
Service. It's the closest comparison that I can make.

~~~
druiid
Hard for me to say what happens for 'free' accounts. We only use pro or higher
(A good number higher than pro). I would say not counting the additional
performance features you get at the pro level, if they're purposefully slowing
things down for free accounts that would be pretty crummy.

That still doesn't answer what you're doing with your cache headers. I took a
basic look at your fantasysp site, and I wonder if there aren't some changes
you could make to the cache-control settings. You seem to be setting no-cache
quite a bit.

~~~
xpose2000
Caching is specified as:

    <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf|mp3|mp4)$">
      Header set Cache-Control "public"
      Header set Expires "Thu, 15 Apr 2016 20:00:00 GMT"
      Header unset Last-Modified
    </FilesMatch>

    <FilesMatch "\.(html|htm|xml|txt|xsl)$">
      Header set Cache-Control "max-age=7200, must-revalidate"
    </FilesMatch>

    <FilesMatch "\.(js|css)$">
      Header set Cache-Control "public"
      Header set Expires "Thu, 15 Apr 2016 20:00:00 GMT"
      Header unset Last-Modified
    </FilesMatch>

~~~
huxley
You may want to avoid setting Expires to greater than 1 year, RFC 2616 (HTTP),
section 14.21:

"To mark a response as 'never expires,' an origin server sends an Expires date
approximately one year from the time the response is sent. HTTP/1.1 servers
SHOULD NOT send Expires dates more than one year in the future."

If an agent is precisely following the RFC, anything set to more than 1 year
in the future is an invalid date and:

"HTTP/1.1 clients and caches MUST treat other invalid date formats, especially
including the value '0', as in the past (i.e., 'already expired')."
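
One way to stay within that limit (a minimal, untested sketch assuming
mod_expires is loaded) is to let Apache compute a rolling one-year Expires
from the access time instead of hard-coding a far-future date:

```apache
# Sketch: rolling one-year expiry via mod_expires, which avoids
# ever emitting an Expires date more than a year out.
<IfModule mod_expires.c>
  ExpiresActive On
  <FilesMatch "\.(js|css|ico|pdf|jpg|jpeg|png|gif)$">
    ExpiresDefault "access plus 1 year"
  </FilesMatch>
</IfModule>
```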

~~~
xpose2000
Very good tip! I'll have to change those to one year. :)

------
joshfraser
It would be interesting to know what the before/after load time histograms
look like. The averages that New Relic gives can hide a lot of the details of
what's actually happening.

Rant on why averages are bad for looking at performance data:
[http://highscalability.com/blog/2012/5/23/averages-web-
perfo...](http://highscalability.com/blog/2012/5/23/averages-web-performance-
data-and-how-your-analytics-product.html)

------
chris_mahan
Getting rid of javascript altogether and sticking with validating html5 worked
for me.

------
isalmon
It's a great article, thanks a lot for sharing this. The most amazing part of
all of it is that Google offers it for free while many other companies charge
a lot of money for the same (or even worse) functionality.

~~~
detst

      PageSpeed Service is currently offered free of charge to a 
      limited set of webmasters during this trial period. Pricing
      will be competitive and details will be made available later.
    

You can get an idea from what they are charging for PageSpeed on App Engine.
It may be competitive, but it's not cheap. Perhaps they've learned that price
increases aren't taken well and have priced it high, with the possibility of
reducing it when it's widely available.

------
zapt02
This was a poorly written article: random numbers all over the place, and no
understanding from the author of how the services actually work. Awful.

~~~
xpose2000
Sorry if it's a bit scatter-brained and disorganized. I threw it together late
last night and am posting it to help show others what to expect.

"no understanding from author about how the services actually work". I gave my
interpretation of the data I was presented with. I am no expert in deferred
javascript.

If you'd like you can ignore the entire article's text and just look at the
pretty graphs.

