With defer_javascript on, browser-reported page load times tell only part of the story. The problem is that what browsers report back to your analytics service is the time that passes before the onload event, while defer_javascript [1] postpones JavaScript execution until the onload event. This means that with defer_javascript off you were counting JavaScript execution time, but when you turned it on you stopped counting it.
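For concreteness, here's a minimal sketch (hypothetical TypeScript, not the beacon any particular analytics service actually sends) of the onload-based number in question, computed with the Navigation Timing API; whether deferred script execution lands inside this window is exactly the subtlety discussed in the follow-up below:

    // Hypothetical sketch of an onload-based "page load time" measurement,
    // using the Navigation Timing API (performance.timing).
    window.addEventListener("load", () => {
      // Wait one tick so loadEventEnd has been populated.
      setTimeout(() => {
        const t = performance.timing;
        const pageLoadMs = t.loadEventEnd - t.navigationStart;
        // A real RUM script would beacon this value to the analytics service.
        console.log(`onload-based page load time: ${pageLoadMs} ms`);
      }, 0);
    });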

We're trying to optimize something like "time until the page is visually complete and usable", and there's currently no good metric for that. Speed Index [2] captures visual completeness well, but I don't know of any algorithmic way to measure time-until-usable.

(I work on the PageSpeed team at Google, mostly on ngx_pagespeed.)

[1] https://developers.google.com/speed/docs/mod_pagespeed/filte...

[2] https://sites.google.com/a/webpagetest.org/docs/using-webpag...




After checking with some people here: it's actually more complicated than that. What I described is true for IE9 and below, but in Firefox, Chrome, and IE10 the measured page load time will include execution of these deferred scripts.


I appreciate the responses and have updated the article with links to your comments. I've been very impressed by what I've seen so far!

I was going to include Google Analytics site speed data, but it seemed less accurate than New Relic's Real User Monitoring.


What seemed less accurate? They should be doing almost exactly the same thing.


Google Analytics site speed data is based on only 1%[1] of your traffic by default, while New Relic's Real User Monitoring aims to track all of your traffic. I'm sure that can lead to differences in accuracy. (A sketch of raising that sample rate is below.)

(Disclosure: I used to work for New Relic)

[1] https://developers.google.com/analytics/devguides/collection...
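As a hedged sketch (using analytics.js's documented siteSpeedSampleRate field; the property ID is a placeholder), the sample rate can be raised when the tracker is created:

    // Hedged sketch: raise Google Analytics' site speed sampling above the
    // 1% default via analytics.js's siteSpeedSampleRate create-time field.
    declare const ga: (...args: unknown[]) => void; // analytics.js global

    ga("create", "UA-XXXXX-Y", { siteSpeedSampleRate: 50 }); // placeholder ID; sample 50% of visits
    ga("send", "pageview");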


Provided you have enough traffic, sampling shouldn't affect the results in any meaningful way.



