Google Page Speed Online (googlelabs.com)
165 points by abraham on March 31, 2011 | 38 comments



Nice tool, even if it makes some funny suggestions:

Minifying the following JavaScript resources could reduce their size by 1.1KiB (0% reduction). Minifying http://ajax.googleapis.com/.../jquery-ui.min.js could save 641B (0% reduction). Minifying http://ajax.googleapis.com/.../jquery.min.js could save 516B (0% reduction).


Rats, they got me:

The following cacheable resources have a short freshness lifetime. Specify an expiration at least one week in the future for the following resources: http://www.google-analytics.com/ga.js (1 day)
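
You can double-check what it's flagging with a quick header lookup; a minimal sketch (my own, using nothing beyond the URL above):

    # Rough sketch (not Page Speed's code): look at the caching headers that
    # determine the resource's freshness lifetime.
    import urllib.request

    URL = "http://www.google-analytics.com/ga.js"  # the resource flagged above

    req = urllib.request.Request(URL, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print("Cache-Control:", resp.headers.get("Cache-Control"))  # e.g. max-age=86400 (one day)
        print("Expires:", resp.headers.get("Expires"))

    # A max-age of at least one week (604800 seconds) would satisfy the suggestion.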


Heh...

The following publicly cacheable, compressible resources should have a "Vary: Accept-Encoding" header: http://partner.googleadservices.com/.../google_service.js
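
Easy to check for yourself, too; a rough sketch (the URL is hypothetical, since the real path is elided above):

    # Sketch (my assumption): request a resource with gzip allowed and check
    # whether the response carries "Vary: Accept-Encoding".
    import urllib.request

    URL = "http://example.com/static/script.js"  # hypothetical; use the flagged resource

    req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        vary = resp.headers.get("Vary", "")

    if "accept-encoding" in vary.lower():
        print("Vary: Accept-Encoding is present")
    else:
        print("Missing Vary: Accept-Encoding (got: %r)" % vary)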


This is nice, but I find it much less usable than GTMetrix (http://www.gtmetrix.com), which runs both Google Page Speed and Yahoo YSlow.

As an added bonus, GTMetrix also shows you the resource loading timeline.


I only got 97/100 for https://grepular.com/ because:

1.) I have 306 bytes of inline JavaScript.

2.) Minifying https://grepular.com/ more than it already is could save 794 bytes, i.e. less than 1% of the page size.

3.) They want me to defer javascript until after page load. It's the last thing in the body anyway...

None of those are valid complaints. Where's my 100%, dammit?


Google.com has 100. Facebook has a 99.

The best I've gotten to (on a real site) is a 98, but I'm quite pleased that I've got a few sites faster than pagespeed.googlelabs.com (95)


Try entering the URL of the service itself[0] for a small easter-egg.

[0] http://pagespeed.googlelabs.com/


  The page Hacker News got an overall Page Speed Score of 86 (out of 100).


I actually don't mind HN taking a while to load. It's not that we have to fight for every new user by aggressively optimizing page load time. Btw, the recent changes to the HN backend have already improved the average speed a lot.


Page Speed and YSlow say Hacker News should:

* Enable gzip compression.
* Specify img dimensions.
* Leverage browser caching: add Expires headers; configure entity tags (ETags).
* Minify JavaScript.
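
For the gzip item, a quick back-of-the-envelope estimate (my own sketch, not something either tool outputs) is to compress the fetched HTML locally and compare sizes:

    # Sketch (assumption): estimate what "Enable gzip compression" would save
    # by gzipping the fetched front page locally.
    import gzip
    import urllib.request

    with urllib.request.urlopen("https://news.ycombinator.com/") as resp:
        html = resp.read()

    compressed = gzip.compress(html)
    saving = 100.0 * (1 - len(compressed) / len(html))
    print("raw: %d bytes, gzipped: %d bytes (%.0f%% smaller)"
          % (len(html), len(compressed), saving))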


This suggestion from Google looks especially fun:

The following publicly cacheable, compressible resources should have a "Vary: Accept-Encoding" header:

    * http://www.google.com/buzz/api/button.js

However, it's a useful tool; it gives me helpful suggestions for improvement.

Does anybody know a simple tool to minify JavaScript?


I often use the Closure Compiler at http://closure-compiler.appspot.com/ (online) and http://code.google.com/closure/compiler/ (offline)


Many thanks! I've just found a simple tutorial on how to use it from a Python script: http://code.google.com/closure/compiler/docs/api-tutorial1.h...

So I've written my own script, which minified seven JS files totalling 84.8 KB into a single 26.7 KB file. Add gzip and it will be very small.
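
In rough outline (simplified, with hypothetical file names, and relying on the service parameters described in that tutorial), the script does something like this:

    # Simplified sketch: POST concatenated source to the Closure Compiler
    # web service and write the minified result to one file.
    import urllib.parse
    import urllib.request

    def minify(source):
        params = urllib.parse.urlencode({
            "js_code": source,
            "compilation_level": "SIMPLE_OPTIMIZATIONS",
            "output_format": "text",
            "output_info": "compiled_code",
        }).encode("utf-8")
        req = urllib.request.Request(
            "https://closure-compiler.appspot.com/compile", data=params)
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8")

    files = ["one.js", "two.js", "three.js"]  # hypothetical names; really seven files
    source = "\n".join(open(name).read() for name in files)
    with open("combined.min.js", "w") as out:
        out.write(minify(source))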

Thanks again.


There is also UglifyJS (https://github.com/mishoo/UglifyJS), which is built on Node.


Here's an ugly GUI for Uglify: http://alexsexton.com/uglifui/


I think I've been using the Google Page Speed plugin (Firefox only, though) to minify JS and CSS and to shrink PNGs.



I am surprised it doesn't catch or say anything about Flash


"The page Google got an overall Page Speed Score of 100 (out of 100)." ... so they eat their own dog food, or was the homepage the pinnacle of excellence for the building this tool?


I like how Google.com scores a 97, while Bing.com scores a 96 (ha!)


Surely it would be in their best interest not to continually flag their own services as needing improvement.

Running http://www.beseku.com primarily indicates that they don't set up their Analytics or jQuery CDN properly.


Is there a way to analyze which server (CDN) delivers the same content faster?
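
The crudest approach would be to time repeated fetches of the same file from each host yourself; a sketch with hypothetical URLs:

    # Crude sketch (assumption): compare median download times of the same
    # file from two candidate hosts.
    import statistics
    import time
    import urllib.request

    CANDIDATES = [  # hypothetical URLs for the same resource
        "https://cdn-a.example.com/jquery.min.js",
        "https://cdn-b.example.com/jquery.min.js",
    ]

    for url in CANDIDATES:
        timings = []
        for _ in range(5):
            start = time.time()
            urllib.request.urlopen(url).read()
            timings.append(time.time() - start)
        print("%s: median %.3fs" % (url, statistics.median(timings)))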


I wonder what would happen to my 15% conversion rates if my shopping cart wasn't so crappy... I'm superstitious about switching though because of a fear that it might mess up our organic search traffic.


Hmmm... Contrary to a lot of recommendations I see here on HN (patio11 mainly), they seem to recommend enabling keep-alive.


That's because patio11 is telling you how to keep your site alive under load, while that page is telling you how to decrease your page load times. They're contrary goals in this case. Keep-Alive makes things load faster, but it puts a cap on how many clients can connect to your server before it curls up and dies. It would be ideal if Apache would let you set a high-water limit for Keep-Alive connections after which it turns the feature off, but I don't know any way to do that. You can set how long they're kept alive, and you can set how many requests are allowed per Keep-Alive session, but not how many sessions are kept alive.
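
To see the client-side win Keep-Alive buys, here's a small sketch (host and request count are my own choices) comparing one reused connection against a fresh connection per request:

    # Sketch (assumption): several requests over one persistent connection
    # vs. a new connection (and TCP handshake) for every request.
    import http.client
    import time

    HOST, PATH, N = "news.ycombinator.com", "/", 5

    start = time.time()
    conn = http.client.HTTPSConnection(HOST)  # reused, keep-alive connection
    for _ in range(N):
        conn.request("GET", PATH)
        conn.getresponse().read()  # drain the body before reusing the connection
    conn.close()
    print("reused connection: %.2fs" % (time.time() - start))

    start = time.time()
    for _ in range(N):
        c = http.client.HTTPSConnection(HOST)  # fresh connection each time
        c.request("GET", PATH)
        c.getresponse().read()
        c.close()
    print("connection per request: %.2fs" % (time.time() - start))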


This is only true for servers that use a separate thread/process for each request... It doesn't apply to event-driven servers (nginx, etc.).

I'd even say that keep-alive is always your friend, and the longer you can keep a connection open, the better... Of course there are always OS-level limits (open file descriptors, etc.), so you should use an LRU queue of idle keep-alive connections to make sure that you won't run out of resources...


The best way to handle keepalives is to proxy everything through nginx, which can handle a huge number of connections with very little memory. Turn keepalives on in nginx, turn them off on your app server.


Umm, the Chrome extension states it can access "all data on your computer and the websites you visit".

Say what?


It's very common. It's a limitation/feature of the Chrome API. If you want to make an extension that can be run on any page, this warning will appear.


Nice!!



Nice to see google cheating:

http://i.imgur.com/JxFKP.png


How is Google cheating?


Because on http://pagespeed.googlelabs.com/ it reports a perfect score.


I just noticed that your screenshot isn't even of Page Speed. It is of the Audits tab in Chrome Developers Tools.


It is not mine but gautaml's, and it uses the built-in Chrome tool that performs the same tests. But when using the web service, it shows different results.


My bad. I wasn't paying attention to the usernames.

Are you sure they perform the same tests though? Why would Google build a Page Speed Chrome extension if the exact same algorithms were already built in? I suspect they are working towards the same goal but are approaching it differently.


It is likely a different analysis system, but the web version appears to be very much modeled after the Audits tab.

I just found it funny that the web version gave google.ca 100/100.



