
Web Performance 101 - iamakulov
https://3perf.com/talks/web-perf-101/
======
userbinator
IMHO instead of going for compressing/minifying/whatever else, it is far
better to just _remove the useless cruft in the first place_ --- and then you
can still apply such techniques to whatever is left to squeeze out a bit more
improvement. The best way to make your site ultra-responsive is to cut out all
the bloat.

Related item:
[https://news.ycombinator.com/item?id=10820445](https://news.ycombinator.com/item?id=10820445)

~~~
Matrixik
This part is almost always missing from such presentations. The first point
should always be: remove as much as possible.

~~~
diafygi
You're totally right, but unfortunately, I think that's because the people
writing these types of presentations don't have control over the content or
design of the website, so removing stuff is often impossible, or way more work
than these tips.

The core holdback for slow websites is usually political, not technical or
lack of skill. These presentations usually just try to make the best of those
team dynamics issues.

------
iamakulov
So, this summer I gave an introductory talk on web performance. This is its
textual version :)

The talk includes 94 slides with text about:

    
    
      — why web performance matters
      — how to optimize:
        — JS (async/defer, code splitting)
        — CSS (the critical CSS approach & tools for it)
        — HTTP/connection stuff (Gzip/Brotli, preloading, CDN)
        — Images (compressing images, webp, progressive vs baseline images)
        — Fonts (the `font-display` trick)
      — and what tools help to understand your app’s performance
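For instance, here's roughly how two of the bits from that list (`defer` and
`font-display`) look in markup; the file names are just placeholders:

    <!-- `defer` downloads the script in parallel with parsing and runs it
         only after the document is parsed, so it doesn't block rendering -->
    <script src="/app.bundle.js" defer></script>

    <style>
      /* `font-display: swap` shows text in a fallback font right away
         and swaps in the web font once it finishes loading */
      @font-face {
        font-family: 'Body Font';
        src: url('/fonts/body-font.woff2') format('woff2');
        font-display: swap;
      }
    </style>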
    

Would love to hear your feedback :)

~~~
denormalfloat
I'm saddened not to see Closure listed there for JS minification. It might be
worth mentioning that there are more advanced minification tools.

Also, there doesn't seem to be anything on JSON minification, even though JSON
makes up a sizable portion of responses. There are techniques for transposing
JSON objects so that they gzip better.
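A minimal sketch of that transposition idea (field names are made up): turning
an array of objects into a column-oriented object avoids repeating every key
per record and groups similar values together, which tends to gzip better.

    const rows = [
      { id: 1, name: 'ada', score: 97 },
      { id: 2, name: 'bob', score: 64 },
    ];

    // Transpose an array of records into columns:
    // { id: [1, 2], name: ['ada', 'bob'], score: [97, 64] }
    function transpose(records) {
      const columns = {};
      for (const record of records) {
        for (const [key, value] of Object.entries(record)) {
          (columns[key] = columns[key] || []).push(value);
        }
      }
      return columns;
    }

    console.log(JSON.stringify(transpose(rows)));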

~~~
KingMob
Google Closure is still best in class at DCE and cross-module motion, but it's
never caught on with the larger web community, partly because it applied
certain constraints to your JS code that weren't always met. This has changed
a bit with modern Closure better able to consume npm modules, but AFAIK, the
only heavy non-Google user is still ClojureScript.

------
vladdanilov
> Compress your JPG images with the compression level of 70‑80.

In practice some images can get noticeable artifacts even at around quality 90.
Most JPEG compressors always apply chroma subsampling, which is often
destructive on its own [1]. On the other hand, many HiDPI images can be
compressed at around 50.

> Use Progressive JPEG… Thanks to this, a visitor can roughly see what’s in
> the image way earlier.

That's not the point of using progressive JPEGs nowadays. The trade-off is a
10-200% decompression slowdown in exchange for a 5-15% size reduction.

> Use Interlaced PNG.

Don't. Interlaced PNGs can easily be 1/3 bigger. There are better ways to show
images while they load, and the website already uses one.

> webpack has image-webpack-loader which runs on every build and does pretty
> much every optimization from above. Its default settings are OK

> For [when] you need to optimize an image once and forever, there're apps like
> ImageOptim and sites like TinyPNG.

These tools are no good for _automatic_ lossy image compression [1]. The
default is mostly JPEG 4:2:0 at quality 75, PNG quantized with pngquant at
settings as low as conscience allows, missing out on many PNG reductions and
on optimal deflate, with no distinction between lossy and lossless WebP, if
WebP is handled at all, etc.

As a result, the images on the website can still be losslessly optimized by
about 13-24%.
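(To make that concrete, here's roughly how those loader defaults get overridden
in a webpack config; the values are purely illustrative, not recommendations,
and the exact option format varies across versions of the underlying imagemin
plugins.)

    // webpack.config.js
    module.exports = {
      module: {
        rules: [
          {
            test: /\.(png|jpe?g|gif)$/i,
            use: [
              'file-loader',
              {
                loader: 'image-webpack-loader',
                options: {
                  // explicit settings instead of the stock defaults
                  mozjpeg: { quality: 85, progressive: false },
                  pngquant: { quality: [0.8, 0.9], speed: 1 },
                },
              },
            ],
          },
        ],
      },
    };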

[1] [https://getoptimage.com/benchmark](https://getoptimage.com/benchmark)

~~~
CharlesW
> _These tools are no good for automatic lossy image compression_

The self-promotion didn't bother me until this claim, because you posted some
great advice along with it.

ImageOptim is great. If you choose "lossy minification" it does automatic
lossy image compression, preserving perceptual image quality while making huge
reductions to file sizes. Users can even adjust how aggressive it is.

I'll take your word for it that I could get 13-24% smaller file sizes with
your Optimage product on top of the 80% (or whatever) that I can get with
ImageOptim. But I'd prefer that you didn't claim that other choices are "no
good".

~~~
vladdanilov
I specifically meant _automatic_ lossy compression with predictable visual
quality. If ImageOptim could actually achieve it (automatically), that would
save me and others an awful lot of time. But as it turns out it is not that
easy.

Some very smart people at Google went to the trouble of creating projects like
Guetzli. I personally have spent months on this, and it gets me every time
someone claims "just use that one tool" without any evidence. I presented my
evidence, and it's reproducible.

ImageOptim is a great tool otherwise.

------
runako
Based on my first scan, I've bookmarked this for the next time I have a perf
issue on a site.

One thing I didn't see covered is that one of the biggest speedups is removing
_junk_ from the pages. That could be too many JS trackers, user-hostile
videos, or whatever. IMHO it's an underrated skill for Web developers to be
able to push back with cogent arguments when asked to ruin the performance of
the sites they work on.

~~~
taf2
Why? Most trackers load async and don't block the page from loading or
rendering. I get that it's popular to hate on them, but technically they load
without blocking the page from rendering. A good read is
[https://sites.google.com/a/webpagetest.org/docs/using-webpag...](https://sites.google.com/a/webpagetest.org/docs/using-webpagetest/metrics/speed-index)

~~~
pmichalina
Even just one poorly designed library can cause serious memory issues and
trigger a ton of events. Usually it's the marketing and ad people who throw
around this "but they are async loaded" line. That's true, but it doesn't
change the fact that 40 trackers, and the dependencies that come with them,
slow things down and infringe on users' privacy. Let's be real.

~~~
taf2
Can you share an example site with 40 trackers? I believe you, but it just
doesn't seem commonplace.

~~~
runako
There has been a lot written about this, but here's a test I just ran on a
random page on Huffington Post:

[https://www.webpagetest.org/result/181031_TT_443f9d1e666d08f...](https://www.webpagetest.org/result/181031_TT_443f9d1e666d08f708e1f78c5811ee5f/1/details/#waterfall_view_step1)

The article is probably <200 words, but the page is 3.2 MB and makes > 200
requests. There are 55 Javascript requests in there.

Right, HuffPo is egregious. How about a site HN readers might frequent,
something lightweight like Reddit?

[https://www.webpagetest.org/result/181031_H6_eaa2e64c9969515...](https://www.webpagetest.org/result/181031_H6_eaa2e64c99695156cd97c10867680cfb/)

Random Reddit page, content: medium-sized image. 164 requests, ~12 seconds to
render the page on the test rig. 60 Javascript requests. 1.5 MB of JS
downloaded to display the post, a 46kb image. There's a 30:1 ratio of JS to
post content (what the user wanted to see) here. And there's other bloat
beyond the JS.

I just picked these two sites at random. You can do this all day with random
websites. I would guess that most sites behind a .com (or .co.uk, etc.) will
look similar.

I don't mean to pick on these sites, just wanted to point out that these
practices are widespread and even reputable developers engage in them. Which
will make it more difficult to undo the rot.

~~~
taf2
I see there is a lot of JavaScript, but are you certain it's JavaScript whose
sole purpose is tracking, rather than application code? I'm not gonna say these
sites couldn't be implemented better, but the claim was that these are all
tracking pixels. Is that true?

~~~
runako
You're parsing this too narrowly. I'm not making a claim about the purpose of
each request. I'm looking at the totality and saying that maybe making 200
requests to display 200 words of content is overkill.

The larger claim was that the pages are loaded with junk that affects the
performance of the pages. The signal:noise ratio on modern sites is broken,
and optimizing the junk can only accomplish so much. Developers need to advise
stakeholders of the downside costs, performance among them, of loading sites
with bloat.

Here's an article I read a while back on the impact of JavaScript
specifically:

[https://medium.com/@addyosmani/the-cost-of-javascript-in-201...](https://medium.com/@addyosmani/the-cost-of-javascript-in-2018-7d8950fbb5d4)

~~~
taf2
I agree, but the parent response claimed 40+ tracking scripts. I just don't see
40 tracking scripts; I see a website using probably too many scripts to
implement its functionality. I can't claim to know whether they could do it
with fewer scripts, I can only assume, and that is exactly what the document
posted here talks about how to optimize... hence my comment that async trackers
don't block rendering was correct.

------
firasd
Good document! I wanted to mention that something I've found to be significant
in practical testing, but often not considered by web developers, is reducing
the total number of HTTP requests. Fetching 5 CSS files of 5 KB each can slow
things down significantly compared to combining them into one 25 KB CSS file.

~~~
tyingq
HTTP/2 makes this less important. Perhaps still worth it for CSS, but
higher-effort things like image spriting might not be as attractive now.

~~~
chrisweekly
Yes -- in fact, some webperf techniques are HTTP/1.1-specific and are actually
_anti-patterns_ in HTTP/2. Spriting is one such example.

~~~
chrisweekly
Domain sharding is another.

------
Klathmon
I know I'm kinda self-promoting here, but since you mention webpack loaders
like responsive-loader, and you recommend image-webpack-loader multiple times,
I figured I could mention my plugin, imagemin-webpack-plugin [0].

The problem with image-webpack-loader is that it only works on images which
are `require`d or `import`ed. responsive-loader adds those images to webpack
in a way that the loader cannot compress them.

Plus there are a bunch of other fancy features that many helpful users have
added, like caching (no need to re-compress every image every time you run
it!), the ability to minify images not in the webpack pipeline, and more.
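A rough usage sketch (the option values are illustrative, not defaults):

    // webpack.config.js
    const ImageminPlugin = require('imagemin-webpack-plugin').default;

    module.exports = {
      // ...the rest of your webpack config
      plugins: [
        new ImageminPlugin({
          test: /\.(jpe?g|png|gif|svg)$/i,
          pngquant: { quality: '80-90' },
          // cacheFolder: './.imagemin-cache', // reuse results between builds
        }),
      ],
    };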

[0] [https://github.com/Klathmon/imagemin-webpack-plugin](https://github.com/Klathmon/imagemin-webpack-plugin)

------
Mr_Manager
I've found the server PageSpeed module (in my case for Nginx) does a lot of
good things "on the fly".

~~~
Theodores
This is the correct way to do a lot of it. For instance, images. Yes, you could
optimise things in Photoshop, but really you want your artworkers to be focused
on doing the images right, so they look good and tell the story. Optimising the
images is something that should be abstracted away, so on your dev version of
the site everything is in 100% glorious, maybe even phone-camera-megapixel,
resolution.

Then, with PageSpeed handling that abstraction, you can deliver WebP when you
need to, and also the srcset images, so every device gets right-sized images
that update themselves automagically if people zoom in.

Same with minification: why have complex build tools when you can just have
PageSpeed do it properly?

You can also have beautiful HTML for view source by putting on the right
PageSpeed filters.

The list goes on. Apart from the results, it also does the abstraction bit, so
artworkers can do their Photoshop stuff unencumbered, and the same goes for
frontend and backend devs.
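As a rough sketch of what that looks like with ngx_pagespeed (the cache path
and the exact filter list here are just examples):

    pagespeed on;
    pagespeed FileCachePath /var/cache/ngx_pagespeed;

    # recompress images and serve WebP to browsers that support it
    pagespeed EnableFilters rewrite_images,convert_jpeg_to_webp;

    # minify without a separate build step
    pagespeed EnableFilters combine_css,rewrite_css,rewrite_javascript,collapse_whitespace;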

The thing about it, though, is that you need to understand a mix of different
things that are nowadays split up into different job roles of ever-increasing
specialisation. A Photoshop person isn't going to go all command line on the
server for the perfect PageSpeed Nginx setup, and neither is a CSS person, a UX
expert, a JS expert or a backend expert. Not even the guy who keeps the site
online is typically going to step up to using PageSpeed for the benefit of the
team. PageSpeed just doesn't fit into one of these niche jobs, so it is more
likely to be found on smaller one-person efforts where there aren't
organisational hurdles in the way.

~~~
C1sc0cat
No, you need your artworkers to do it right in the flipping first place and not
place excessive load on your live environment, or make the live environment
more complex than it needs to be.

------
commandlinefan
Careful with GZip and SSL, though - your payloads become vulnerable to the
BREACH attack.

~~~
virmundi
I've seen that. My understanding is that you can compress CSS, images, pretty
much anything that does not require a secret. So if you're sending a secret to
get an image, I think you're doing something wrong. If you keep the secret in a
cookie, or transfer it in a header rather than the body, you should be clear to
use Gzip and HTTP compression.
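To sketch that rule with the Express `compression` middleware (the header name
here is made up; a real app would decide based on what the response actually
contains):

    const express = require('express');
    const compression = require('compression');

    const app = express();

    app.use(compression({
      filter: (req, res) => {
        // leave responses that echo per-user secrets uncompressed (BREACH)
        if (res.getHeader('X-Contains-Secret')) {
          return false;
        }
        // otherwise fall back to the middleware's default content-type check
        return compression.filter(req, res);
      },
    }));

    app.listen(3000);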

------
timmytwotime
I would have liked to see more on HTTP/2.

