
How to Get 100/100 Google Page Speed Score with Middleman and Nginx - elliotec
https://elliotec.com/how-to-get-100-google-page-speed-score/
======
partiallypro
Google Page Speed scores sometimes give bad advice that can actually slow down
page load. I just focus on load time and the experience of the page loading. I
sort of wish Google would kill the tool off or give it more context. Instead
they just have a disclaimer at the bottom saying it's a recommendation and
doesn't take the experience into account... but try explaining that to a
client. Any static site is going to load fast, though; it's when you have to
make database and API calls that things get hairy.

~~~
elliotec
Any examples of the bad advice that would slow down the page?

~~~
partiallypro
A very basic example: on a large site, you can raise your Google Page Speed
score by inlining all CSS and JS, but that would be stupid because it doesn't
take the next page load and caching into account. I've seen some
render-blocking optimizations slow the overall page too. Getting rid of query
strings is nice, but sometimes you need them for cache busting and other
things, and keeping them penalizes your score. Even Google's own tag
manager/analytics scripts lower your page speed score when they are on the
site.

~~~
badestrand
>> Even Google's own tag manager/analytics scripts lower your page speed
score when they are on the site.

I never understood why Google doesn't fix this. I want to use Google Analytics
but their own tool penalizes me for it because it gets improperly cached on
Google's own servers? That's just stupid.

~~~
manigandham
It's just a tool with an arbitrary score - why does it matter? Most of the
recommendations are outdated by now. Focus on the overall experience; there
are plenty of RUM monitoring scripts you can use (which take advantage of the
very fine-grained performance timings available in browsers today).

------
adorable
Some of those optimizations become irrelevant (or downright bad) with HTTP/2.
Inlining CSS and JS is no longer best practice when you can server-push those
files instead and avoid sending the extra bytes once the browser has cached
them.
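
For example, with nginx (a sketch, assuming a build new enough to have the
`http2_push` directive; the file paths are placeholders):

    server {
        listen 443 ssl http2;

        location = /index.html {
            # Push the stylesheet alongside the HTML instead of inlining it,
            # so repeat visits can still hit the browser cache.
            http2_push /css/site.css;
        }
    }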

Some optimizations can backfire. Adding async to external JS is a great
tip... unless you have dependencies between your scripts (i.e. you need one
to be loaded before the other gets executed).

~~~
Xoros
That's something that really astounds me about this async thing.

Why on earth, when they introduced it, didn't they add a way to tell the
browser the order of execution? Or maybe it exists and I haven't heard about
it?

~~~
ko27
There is a way: ES6 modules. If you can account for the order of execution
(with modules or custom loaders) async is, in almost all cases, the best way
to load your scripts. Defer (or body-end) scripts also specify the execution
order, but only in a strictly linear fashion which usually has a performance
hit.
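
A minimal sketch of the module approach (file names are made up): the import
graph pins the execution order while the downloads still happen in parallel:

    <!-- index.html: doesn't block parsing, runs once its graph is ready -->
    <script type="module" src="/js/main.js"></script>

    // /js/main.js: the import guarantees dep.js executes before this code,
    // no matter which file finishes downloading first.
    import { init } from './dep.js';
    init();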

------
amdixon
We love Middleman for static sites; curious why you chose to go with a server
instead of S3+Cloudfront or Netlify. One of the biggest reasons we use static
site generators is the ability to CDN the whole thing, making it dirt cheap
and blazing fast.

~~~
kuschku
> making it dirt cheap and blazing fast.

If you’re using S3+CloudFront, it’s not dirt cheap.

Traffic at AWS is so expensive that at any scale it's cheaper to roll your
own. Significantly.

(That's for traffic in the EU; in other regions AWS traffic tends to be even
more expensive.)

~~~
kaishiro
For what you get out of the box with this setup (or Google Cloud Storage +
Cloud Load Balancing/CDN) it actually can be "dirt cheap", depending on your
business goals/traffic. Just like anything, it depends on what you value/what
your time is worth. The pricing difference would need to be several (several)
orders of magnitude larger than it is for me to worry about the few dollars a
month I'm spending on each static site sitting in S3/Cloud Storage.

Also, if there is an issue with either of these it's far easier for me to
justify downtime to a client by saying "Amazon us-east is down, here's
Amazon's status page" than by trying to explain who OVH is.

~~~
codedokode
You still need to spend time setting up your site, whether it's on Amazon or
on a cheap DO instance; I don't see what you save here. You can have your own
virtual server with root privileges, iptables, and custom software, whereas
with Amazon you will have to pay for every new feature.

~~~
kaishiro
As I said, it really just comes down to business goals. Setting up a shared
DO box with iptables isn't worth my time when it's still just a single DO box
at the end of the day.

S3 + CloudFront + ad hoc Lambdas has a far, far better cost-benefit ratio
given what my time is worth. To each their own :)

~~~
oelmekki
I'm under the impression that by "it's not worth my time", you actually mean
"I don't like sysadmin" :)

~~~
kaishiro
Ha. Well, it's more that there are things I can justify charging for and
things I cannot. Standing up a subpar solution for a larger investment of
time doesn't make a ton of fiscal sense for me or my clients.

~~~
oelmekki
Yeah, indeed; for contract customers it's better to go with something you're
confident everybody will already know, since other people will have to manage
it when you're no longer around.

In a previous startup, where I was founder and CTO, I ran Gentoo and managed
everything possible myself. That was not especially costly in time because I
was already used to it and efficient, and that way we kept our infrastructure
cost at 70€/mo for 3 years. This has its advantages too :)

------
mosselman
It is a shame that you don't go into the technical details. As it stands, the
article is basically the same as reading the recommendations from Google Page
Speed itself.

~~~
elliotec
What level of detail are you looking for? I'm happy to expand on it.

~~~
mosselman
Thanks for getting back to my comment. Reading it back, I might have been a
bit harsh; I am sorry about that. The compression and caching bits were very
detailed, thanks for that.

What I was most interested in technically is the part about 'Prioritize
visible content'. You talk about "loading third party scripts later, and
generally keeping the above-the-fold content small and fast". From this I make
out that you do not load images below the fold, etc. I was wondering what you
do technically to make this happen. Also, how would you load fonts after the
first paint, etc. Also, from what I remember, when you inline your CSS you
make caching the CSS separately from the HTML impossible. What would you do
when you want to have a quick first paint, but also want to leverage having
CSS in its own file?

------
gingerlime
One odd thing I bumped into: the Google PageSpeed suggestion about leveraging
browser caching reports Google Analytics itself as an offender :-/

        https://www.google-analytics.com/plugins/ua/linkid.js (60 minutes)
        https://www.google-analytics.com/analytics.js (2 hours)

~~~
zepolen

        # Proxy the GA script through your own host so you control the cache
        # headers; nginx re-serves it with a one-day lifetime:
        location /analytics.js {
            proxy_pass https://www.google-analytics.com;
            expires 1d;
        }

~~~
codedokode
You might serve an outdated file and lose a day's worth of analytics records
as a result.

~~~
zepolen
Never seen that happen before - but you have a point, assuming Google ships a
drastic change.

------
r1ch
Has anyone else had trouble using pagespeed insights recently? I think they
changed something about their API, every site I try to test gives me "The
referrer [https://developers.google.com/](https://developers.google.com/) does
not match the referrer restrictions configured on your API key. Please use the
API Console to update your key restrictions."

Note I'm not actually trying to use the API, I just want to run the tool from
[https://developers.google.com/speed/pagespeed/insights/](https://developers.google.com/speed/pagespeed/insights/)

------
hunvreus
For Jekyll users, you can do just as well with CloudFlare and GitHub Pages
(all for $0):
[https://wiredcraft.com/blog/make-jekyll-fast/](https://wiredcraft.com/blog/make-jekyll-fast/)

------
jwilliams
Note that you have to use a workaround to get 100/100 - one that excludes
Google Analytics while the analysis is running.

I always get pinged for GA and for Google Fonts. Not sure this makes any
sense. Kind of detracts from the tool in general, actually.

~~~
Y7ZCQtNo39
But loading external fonts does take additional time. So saying "oh yeah, I
have GA and fonts and I'm still 100" sounds disingenuous, because the page
could still be faster.

Context is key. Nobody should be shooting for a perfect 100 in any situation
where a barebones static HTML document won't suffice. So if you have to make
DB calls to create the HTML document, or are reasonably delivering a nice user
experience (e.g., a custom font), then you should not be scoring 100.

That said, if you're delivering non-static content with fonts and can get to
at least 80/100, you're still doing great for what you're delivering.

~~~
jwilliams
Maybe so, but PageSpeed still gives you bad advice on it.

------
nitramm
I have recently built a tool to help me monitor Google Page Speed Score daily
and notify me when it changes -
[https://twitter.com/SocialMindAI/status/920979239226302464](https://twitter.com/SocialMindAI/status/920979239226302464)

The interesting thing is that the score fluctuates even when I haven't
changed anything.

------
zeep
Plain [https://www.google.com](https://www.google.com) gets 79/100 but when
you perform a search, it gets 98/100
([https://encrypted.google.com/search?hl=en&q=test](https://encrypted.google.com/search?hl=en&q=test))
...

------
sumoboy
Who really cares about scoring 100/100? Is there a top-1000 site that's even
in the low 90s? I love optimizing pages for speed, but hitting 100/100 isn't
going to produce some magical jump in organic rankings just because page load
speed dropped a little.

~~~
dazc
People care when they have paid you to optimize their site but it only scores
86/100.

Trying to explain that the page speed tool is just a 'rough guide' and that a
website always has to be some kind of compromise between speed, functionality
and design falls on deaf ears, because 'google' is, well, 'google'.

~~~
sumoboy
I agree, but as with everything in optimization and refactoring, the cost of
pushing towards 100 offers diminishing ROI in the end. I think if you're at
90 that's solid; I have no desire to be in the 100 club, nor would I tell a
client they need to be there.

~~~
TheSleeprAwakns
90 for mobile, desktop, both?

------
tbarbugli
Did you try using Imgix for images? We did that this week and so far we like
it a lot

~~~
amdixon
We just started using Imgix on our most recent Middleman site as well and it's
been great.

------
calibas
Does anybody know how much the actual load time factors into the "PageSpeed"?
In my experience, you can have a site that loads quickly but gets a mediocre
score because it violates Google's guidelines.

~~~
ThrustVectoring
It really depends on the specific site. The gold standard is to open up the
network tab of your browser's debugger and dig into the runPagespeed API call
that gets used behind the scenes. Specifically, dig down to
.formattedResults.ruleResults and look at the ruleImpact of each category.
It's not an exact correspondence to points taken off a score out of 100, but
it's roughly proportional. That is, if one rule has a ruleImpact of 20 and
another a ruleImpact of 10, fixing the former will have twice the positive
effect on your overall score.
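
If you'd rather script it than click around, a sketch (assuming the v2
endpoint and the response shape above; the target URL is a placeholder):

    const endpoint = 'https://www.googleapis.com/pagespeedonline/v2/runPagespeed?url=';
    fetch(endpoint + encodeURIComponent('https://example.com/'))
      .then(res => res.json())
      .then(data => {
        // Rank the rules by how much they drag the score down.
        const rules = data.formattedResults.ruleResults;
        Object.keys(rules)
          .map(name => ({ name, impact: rules[name].ruleImpact }))
          .sort((a, b) => b.impact - a.impact)
          .forEach(r => console.log(r.impact, r.name));
      });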

------
stabbles
I tried gzipping my static site up front at the maximum compression level.
Google the `gzip_static` directive.

It doesn't make a noticeable difference, but it's fun.
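
The gist, assuming nginx was built with ngx_http_gzip_static_module (distro
packages usually include it); the find command is just illustrative:

    # At build time, pre-compress at maximum level, keeping the originals:
    #   find build -type f \( -name '*.html' -o -name '*.css' -o -name '*.js' \) \
    #       -exec gzip -k -9 {} +
    #
    # nginx then serves foo.css.gz in place of foo.css whenever the client
    # accepts gzip, with no per-request compression cost:
    location / {
        gzip_static on;
    }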

~~~
erik_p
What about Zopfli/Brotli?

Gzip's compression gains taper off after the "7" setting, IIRC. Brotli at
higher settings compresses better, but you'd want to do it statically rather
than on the fly (at the higher/highest settings).
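
The static variant looks much like gzip_static, but needs the third-party
ngx_brotli module (directive names are from that module; `brotli -q 11` is
the CLI's maximum setting):

    # Pre-compress at build time: `brotli -q 11 foo.css` writes foo.css.br
    location / {
        brotli_static on;  # served when Accept-Encoding includes br
        gzip_static on;    # gzip fallback for everyone else
    }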

~~~
stabbles
I've tried that as well, but you need a brotli module for nginx. No problem if
you're using Docker, but I just used the mainline distribution of nginx on
Ubuntu 16.

I'm curious if brotli's decompression is still fast on high compression
levels. This is the case for gzip IIRC.

~~~
tomatsu
> _I'm curious if brotli's decompression is still fast on high compression
> levels. This is the case for gzip IIRC._

Yes, it's optimized for fast decompression. Compression, on the other hand, is
comparatively expensive. This trade-off makes Brotli very interesting for
static assets.

[https://quixdb.github.io/squash-benchmark/](https://quixdb.github.io/squash-benchmark/)

------
steveharman
404 for me on "Here’s the link to my full Middleman config.rb"

------
_salmon
The link to your Middleman config is broken FYI

~~~
thelittleone
I'm also getting a 404 for the linked github repository.

OP thanks for the great article.

------
codedokode
Google Page Speed is an awful tool. For example, it often advises compressing
images and even offers "optimally" compressed versions for download. Those
images have awful quality, with visible compression artifacts. The person who
chose that quality level is probably blind.

It also often gives crazy advice, for example something like "embed the CSS
needed to display the top of the page right into the page". Not only is this
difficult to implement (what exactly is "the top of the page"? how do you
work out which part of the code is necessary? what if the user scrolls down
while the page is loading?), it makes pages larger and prevents the browser
from caching the embedded CSS. So you have to invest a lot of time and your
page will probably load slower as a result.

But at the same time the tool misses obvious things. For example, it never
advises removing or replacing web fonts, although web fonts are heavy, can
cause problems with rendering text (they can lack glyphs, for example
Cyrillic or CJK characters), and __web fonts block text rendering__. And they
are really unnecessary in most cases, because built-in system fonts usually
look better at small sizes (at least on Windows).

It also never advises removing ads, although ads often cause high CPU and
memory consumption. I wonder why that is.

The tool always demands that you move JS scripts to the bottom of the page.
But is that always the best idea? I think in most cases it is not. For
example, if you have a button with an onclick handler, it will throw a JS
error if the button is clicked before the scripts at the bottom of the page
have loaded (the problem of dead JS controls). If you put a small JS script
at the top of the page instead, there is no error. And what if you have an
SPA? In that case, the earlier you start loading the scripts, the sooner the
user will be able to see the data.
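
To illustrate the dead controls problem (file name made up):

    <!-- A click that happens before the bottom-of-page script has loaded
         throws "save is not defined" and is simply lost: -->
    <button onclick="save()">Save</button>
    <!-- ...rest of the page... -->
    <script src="/js/app.js"></script>  <!-- defines save() -->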

It also always recommends using a CDN, which is not always a good idea. For
example, if you have a site in Russia or China and move your static files to
a CDN, the CDN may be farther from the user than your own server, and even
worse, it can get blocked, so your site won't load at all. Adding a CDN often
doesn't improve performance, but it does reduce reliability and increase your
hosting bill.

So Google's tool might help you identify problems with your site, but you
should never blindly trust its recommendations. Please leave client-side
optimization to people who have expertise in this area.

By the way, I had to lower the image quality to get a better score (and
follow other useless or even harmful recommendations) because our management
believes that the score given by this tool (it is Google's official tool,
after all) affects SEO rankings, and obviously those are more important than
image quality.

~~~
codedokode
It also recommends that you "leverage browser caching", i.e. set
unconditional caching of every static resource for a week. Remember that this
means you will have to implement a way to reset the cache whenever you deploy
new versions of static files, which can be quite costly to do correctly on a
legacy site. By the way, you can trick Google by enabling caching for a week
and then appending the current time to every link
([http://example.com/file.css?t=12:00:00](http://example.com/file.css?t=12:00:00)).
That way you get a good score AND your users always get the latest versions
of the files.
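
Concretely, what the tool is asking for looks something like this in nginx (a
sketch; the extension list is arbitrary):

    location ~* \.(css|js|png|jpg|svg|woff2?)$ {
        # Unconditional caching: browsers won't even revalidate for a week,
        # which is why you then need cache busting on every deploy.
        expires 7d;
    }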

The article says "Then, inline your CSS and JS instead of making external
resource calls to them." and that is exactly what makes your page heavier.

> Finally, I was having a lot of trouble getting the Google Analytics script
> on my site to not be render-blocking

It is easy; I don't understand what the trouble was.
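
For anyone wondering, Google's documented alternative async snippet handles
it: the ga() command queue is created inline, so calls buffer until the
script arrives, and `async` keeps the download off the critical path
(UA-XXXXX-Y is a placeholder tracking ID):

    <script>
    window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
    ga.l = +new Date();
    ga('create', 'UA-XXXXX-Y', 'auto');
    ga('send', 'pageview');
    </script>
    <script async src="https://www.google-analytics.com/analytics.js"></script>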

