Related item: https://news.ycombinator.com/item?id=10820445
And if the site doesn't do it, you can often decrease page load time by turning on your browser's built-in tracking protection.
It's a bit sad that blocking trackers can cut page load time in half but that is unfortunately the web we have.
Designers have become "coders" but aren't versed in computer science or networking. Thus we have been given WordPress, Wix, and Squarespace, where you, too, can become an internet web site developer!
The core thing holding websites back is usually political, not technical or a lack of skill. These presentations usually just try to make the best of those team-dynamics issues.
Yes, like "compressing/minifying/whatever else", which is the subject of this presentation.
What's next? Really, before minifying, you have to get a server to serve anything in the first place. And register a domain name. And, hmmm, well first you need to buy a computer and plug it in...
The talk includes 94 slides with text about:
— why web performance matters
— how to optimize:
— JS (async/defer, code splitting)
— CSS (the critical CSS approach & tools for it)
— HTTP/connection stuff (Gzip/Brotli, preloading, CDN)
— Images (compressing images, webp, progressive vs baseline images)
— Fonts (the `font-display` trick)
— and what tools help to understand your app’s performance
Meta note: I find this presentation style great. You can scroll up and down at speed, you can text-search the whole page, there's a nice mixture of imagery and text, it's clean and accessible, and there's even a table of contents. How did you author this?
HTTP/2 is a doddle to implement. Do it.
A PWA with a service worker makes it possible to work fully offline.
With CSS, it's best to chuck it all out, including those reset files someone wrote a decade ago, and rewrite the whole lot using CSS Grid and custom properties.
Inline SVGs into the CSS as custom properties.
Use HTML5 properly, with no lip service. Get rid of JS form validation and rely on HTML5's built-in validation instead.
Use PageSpeed to sort out the images and turn them into source sets.
The goal of a lot of the above is to strip out convoluted build tools and have actual neat HTML that can be maintained. No more 'add only' CSS to hand on to the next guy; instead, have something with comments in the code and sensible names that target HTML5 elements like 'main', 'aside', or 'nav' rather than made-up class names.
A final thought: a good starting point can be to build a green website, i.e. one that doesn't cause too much cruft to be downloaded. This is the same thing as 'minimizing/cutting out bloat', but I find that setting out to build a website that sets an example of being green is a better mindset than 'must do those hacky things to make the website faster'.
I also found them useful recently for small (28x28px) thumbnail images on my personal website. On average, saved as a PNG the thumbnails were 20kb, as a JPEG 9kb, and as an optimized GIF about 1-2kb. With about 100 thumbnails on one of the pages, the savings are pretty significant. (At least, this seemed to be the best approach; if anyone with more knowledge of image compression has a better suggestion, please let me know).
Also, there doesn't seem to be anything on JSON minification, which is a sizable portion of responses. There are techniques to transpose JSON objects to be easier to gzip compress.
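As an illustration of that transposition idea, here is a minimal sketch (the function name is mine, not from any particular library): turning a row-oriented array of objects into a column-oriented object means each key appears once instead of once per row, and similar values sit next to each other, both of which help gzip.

```javascript
// Sketch: transposing row-oriented JSON into column-oriented JSON.
// Repeated object keys ("id", "name", ...) then appear only once,
// and similar values end up adjacent, which compresses better.
function transpose(rows) {
  const cols = {};
  for (const row of rows) {
    for (const [key, value] of Object.entries(row)) {
      if (!cols[key]) cols[key] = [];
      cols[key].push(value);
    }
  }
  return cols;
}

const rows = [
  { id: 1, name: 'alpha', price: 9.99 },
  { id: 2, name: 'beta', price: 4.5 },
];

console.log(JSON.stringify(transpose(rows)));
// {"id":[1,2],"name":["alpha","beta"],"price":[9.99,4.5]}
```

The trade-off is that the client has to re-assemble rows before use, so this only pays off for large, uniform payloads.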
In practice some images can get noticeable artifacts even at around quality 90. Most JPEG compressors always apply chroma subsampling, which is often destructive on its own. Conversely, many hi-DPI images can be compressed at around quality 50.
> Use Progressive JPEG… Thanks to this, a visitor can roughly see what’s in the image way earlier.
That's not the point of using progressive JPEGs nowadays. The trade is a 10-200% decompression slowdown for a 5-15% size reduction.
> Use Interlaced PNG.
Don't. Interlaced PNGs can easily be 1/3 bigger. There are better ways to show loading images, and it's already used on the website.
> webpack has image-webpack-loader which runs on every build and does pretty much every optimization from above. Its default settings are OK
> For you need to optimize an image once and forever, there’re apps like ImageOptim and sites like TinyPNG.
These tools are no good for automatic lossy image compression. The default is mostly JPEG 4:2:0 at quality 75, PNG quantized with pngquant at settings as low as conscience allows, missing out on many PNG reductions and optimal deflate, with no separation between lossy and lossless WebP, if WebP is supported at all, etc.
As a result, the images on the website have about 13-24% more to optimize losslessly.
The self-promotion didn't bother me until this claim, because you posted some great advice along with it.
ImageOptim is great. If you choose "lossy minification" it does automatic lossy image compression, preserving perceptual image quality while making huge reductions to file sizes. Users can even adjust how aggressive it is.
I'll take your word for it that I could get 13-24% smaller file sizes with your Optimage product on top of the 80% (or whatever) that I can get with ImageOptim. But I'd prefer that you didn't claim that other choices are "no good".
Some very smart people at Google go to the trouble of creating projects like Guetzli. I personally have spent months on this, and it gets me every time someone claims "just use that one tool" without any evidence. I presented mine and it's reproducible.
ImageOptim is a great tool otherwise.
A score of 24/55 for TinyPNG and then 55/55 for their own service makes it look as if this article is an advertisement. Especially since TinyPNG gets better/very similar file size while staying visually lossless up to a point (images where it's nothing but a bunch of rainbow gradients are its weakness).
Remember that TinyPNG is optimized for web use, where artifacts are tolerated, and it was configured with that in mind. Testing for visually identical images is a bar you won't clear with any image optimizer made for web usage.
Users only spend a few seconds looking at images on web pages, and the artifacts from optimizers are very minor. See: https://3perf.com/talks/web-perf-101/#images-compress-jpg-si...
One thing I didn't see mentioned: one of the biggest speedups is removing junk from the pages. That could be too many JS trackers, user-hostile videos, or whatever. IMHO it's an underrated skill for web developers to be able to push back with cogent arguments when asked to ruin the performance of the sites they work on.
Right, HuffPo is egregious. How about a site HN readers might frequent, something lightweight like Reddit?
I just picked these two sites at random. You can do this all day with random websites. I would guess that most sites behind a .com (or .co.uk, etc.) will look similar.
I don't mean to pick on these sites, just wanted to point out that these practices are widespread and even reputable developers engage in them. Which will make it more difficult to undo the rot.
The larger claim was that the pages are loaded with junk that affects the performance of the pages. The signal:noise ratio on modern sites is broken, and optimizing the junk can only accomplish so much. Developers need to advise stakeholders of the downside costs, performance among them, of loading sites with bloat.
Generally, avoid splitting CSS. Even if you don't use all your CSS on every page, a cached 100kb CSS file will outperform a bunch of unique-per-page 25kb CSS files every time (especially since it's 0 requests for the second page). Except for dial-up, probably.
It may be beneficial to split CSS files somewhere above the 300kb mark, but I wouldn't know. My one-page-app is only about 500kb over the wire, including CSS, JS, Fonts and HTML. ~30KB of that is CSS.
I've been optimizing that for years though.
The problem with image-webpack-loader is that it only works on images which are `require`d or `import`ed. responsive-loader adds those images to webpack in a way that the loader cannot compress them.
Plus there are a bunch of other fancy features that many helpful users have added, like caching (no need to re-compress every image every time you run it!), the ability to minify images outside the webpack pipeline, and more.
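For reference, here is a hedged sketch of wiring image-webpack-loader into a webpack config. The option groups (mozjpeg, pngquant) follow the loader's documented shape, but treat the exact values as assumptions to tune per project; and as noted above, only images that pass through the module graph via `require`/`import` are seen by this rule.

```javascript
// webpack.config.js (sketch): compress imported images at build time
// with image-webpack-loader. Values here are illustrative defaults.
module.exports = {
  module: {
    rules: [
      {
        test: /\.(png|jpe?g|gif|svg)$/i,
        use: [
          'file-loader', // emits the file and returns its URL
          {
            loader: 'image-webpack-loader',
            options: {
              mozjpeg: { quality: 75 },           // lossy JPEG recompression
              pngquant: { quality: [0.65, 0.9] }, // lossy PNG quantization
            },
          },
        ],
      },
    ],
  },
};
```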
Then, with PageSpeed abstracting it all out, you can deliver WebP when appropriate, along with source-set images, so every device gets the right-sized images, which update themselves automagically if people zoom in.
Same with minification, why have complex build tools when you can just have PageSpeed do it properly?
You can also have beautiful HTML for view source by putting on the right PageSpeed filters.
The list goes on, apart from the results it also does the abstraction bit, so artworkers can do their Photoshop stuff unencumbered, same for frontend and backend devs.
The thing about it though is that you need to understand a mix of different things that are nowadays split up into different job roles of ever increasing specialisation. A Photoshop person isn't going to go all command line on the server for the perfect PageSpeed Nginx setup, neither is a CSS person, a UX expert, a JS expert or a backend expert. Not even the guy who keeps the site online is going to typically step up to using Pagespeed for the benefit of the team. Pagespeed just doesn't fit into one of these niche-jobs so it is more likely to be found on smaller one-person efforts where there aren't the organisational hurdles in the way.
Not saying that PageSpeed is wrong, but it is a niche tool that depends on the server-side implementation. Some developers prefer to abstract "the server" away from their architecture...