> Around the same time we switched from an (outdated) manually created critical CSS file to an automated system that generated critical CSS for every template — homepage, article, product page, event, job board, and so on — and inlined it at build time. Yet we didn’t really realize how much heavier the automatically generated critical CSS was.
Is reducing the total amount of CSS per page so you don't have to calculate the critical CSS at all an option?
To throw my own page into the ring, here's a non-trivial product website of mine where the homepage is 0.3MB total over the wire and renders in 0.4 seconds for me (including custom fonts, payments, analytics, a large screenshot and a real-time chat widget):
https://www.checkbot.io/
The major tricks I'm using are keeping the website CSS small (CSS for the homepage + rest of the site gzips to less than 8KB), inlining all CSS, rendering default fonts until the custom font has loaded, using SVG for all images (this saves a ton), and not using JavaScript for content (which blocks rendering).
The screenshot in the header is in SVG format and inlined directly into the page along with the CSS, so the moment the HTML arrives the browser can display all above the fold content. Logos are another good one for the SVG + inline treatment.
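A minimal sketch of the idea — class names, the SVG contents, and the styles are all illustrative, not the actual site's markup:

```html
<head>
  <style>
    /* entire site CSS inlined; gzips to a few KB */
    .hero { display: flex; align-items: center; }
  </style>
</head>
<body>
  <header class="hero">
    <!-- screenshot inlined as SVG markup: no extra request,
         so it paints as soon as the HTML arrives -->
    <svg viewBox="0 0 800 500" role="img" aria-label="Product screenshot">
      <!-- vector paths here -->
    </svg>
  </header>
</body>
```

Everything above the fold ships in the first HTML response, so there's no round trip before first paint.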
The problem with inlining all CSS is serving all the global styles all over again with every page load. You're gaining a great first impression at the cost of a poorer experience for every subsequent page load.
Have you considered inlining CSS in the head (as you've done), but then serving it again as a linked CSS file just before </body>?
Then, with subsequent page loads (of the current or other pages), you don't have to inline any CSS anymore.
Of course this requires that you serve two versions of all your pages, one for if that page is a first hit, and another to be served to users who already have your CSS cached.
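A sketch of that two-version scheme — the file path and the detection mechanism (a cookie set with the first response) are made up for illustration:

```html
<!-- Version 1: first visit. Styles inlined for a fast first paint. -->
<head>
  <style>/* full site CSS inlined here */</style>
</head>
<body>
  <!-- ...page content... -->
  <!-- Same CSS referenced again so the browser caches it as a file -->
  <link rel="stylesheet" href="/assets/site.css">
</body>

<!-- Version 2: repeat visit (detected e.g. via a cookie).
     Serve only the <link> in the head; it's now a cache hit. -->
```

The tricky part, as noted below, is keeping caches and search engines pointed at a single canonical URL while serving two different response bodies.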
> The problem with inlining all CSS is serving all the global styles all over again with every page load.
> You're gaining a great first impression at the cost of a poorer experience for every subsequent page load.
Yep, it's unfortunate we still have to make tradeoffs like this. At least in this case, it's less than 8KB added per page vs something more complicated that might break.
HTTP/2 server push was getting close to offering an alternative like what you're suggesting (the server pushes the CSS file to the client in parallel with the initial HTML page, and the client can say if it already has the CSS file), but it's being deprecated now.
> But with 8kb zipped (I assume that following pages aren't that much worse) why should one optimize with a solution that adds that much complexity?
If that's the argument, why not just let them take the (tiny) initial 8kb hit as an external css file, and make the rest of the experience even "zippier"?
Unfortunately this means it’s difficult to handle HTTP caching without having something in the URL, and at the same time you want to make sure search engines index the version without the indicator in the URL.
> The problem with inlining all CSS is serving all the global styles all over again with every page load. You're gaining a great first impression at the cost of a poorer experience for every subsequent page load.
The vast majority of users will visit one and only one page.
On the extreme end, consider someone on a spotty and slow mobile connection going through Google results to find what they want. They're only going to give you a few seconds before they hit "back" on your site and try another. The round trip for the browser to fetch the CSS file after the HTML file arrives can be enough to cause that delay.
Because for every second of delay you add, you lose a large share of your potential audience.
You optimise for first impressions, even though you know that a major proportion will click away anyway.
You'd need the rel="preload" trick (preload the stylesheet, then flip rel to "stylesheet" once it loads), or the browser will block rendering until the CSS file has been downloaded, parsed, and added to the CSSOM. (rel="prefetch" wouldn't work here — it fetches at low priority for future navigations and never applies the styles.)
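The non-blocking stylesheet pattern usually looks something like this (the file path is illustrative):

```html
<!-- Fetch the stylesheet at high priority without blocking render,
     then apply it once it has finished loading -->
<link rel="preload" href="/assets/site.css" as="style"
      onload="this.onload=null; this.rel='stylesheet'">
<!-- Fallback for browsers with JavaScript disabled -->
<noscript><link rel="stylesheet" href="/assets/site.css"></noscript>
```

This is the widely used "loadCSS" approach: the preload fetch doesn't block rendering, and the onload handler swaps the rel so the sheet is applied as soon as it arrives.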
Thanks! I wouldn't mind seeing a list of fast loading pages like this that have images + functionality that aren't mostly text. There's only so much you can do if your design requires large photos or a video in the header though.
The big wins are: don't require JS for any content in the page header, use minimal CSS + inline it, strongly prefer SVG images over bitmaps + inline them, use "font-display: swap" + go easy with how many fonts you need.
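For the fonts point, `font-display: swap` goes in the `@font-face` rule — the font name and path below are illustrative:

```css
@font-face {
  font-family: "BrandSans";                       /* illustrative name */
  src: url("/fonts/brand-sans.woff2") format("woff2");
  /* Show a fallback system font immediately, then swap in the
     web font whenever it finishes loading */
  font-display: swap;
}
```

With `swap`, text is readable from the first paint instead of being invisible while the font downloads.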
The main trick is to view your web app in a browser, print to PDF, and then use Inkscape to auto-convert from PDF -> SVG (with some caveats). Worst case, you could recreate the screenshot from scratch in a vector editor.
It's still way too slow. It's a big page of text. It should load in an instant.
In my browser, I see 1.75 MB sent over the wire and a 2.5 second load time. My big pages of text [1] need 105 kB and load in 0.4 seconds. Their compressed critical CSS is the same size as my entire uncompressed CSS file. They send more CSS bytes than I send bytes in total.
If you want to make a content website fast, it's quite simple: send just the content.
It also loads very fast (close to instantly) for me on a pretty old mobile (moto g6). Most websites are slow to load for me on this device. This one isn't.
Nice to see another company covering all these steps and validating the work we've done at my company. Unfortunately we weren't as successful, or at least, our results were not as fruitful. A good score for our site (tracker.gg) is 70 on mobile. Turns out it's pretty hard to optimize the bootstrapping of an application that can render 20 different websites! Mobile devices spend 1200ms on the main thread. It will be interesting to see how these changes impact our page rank when Google starts incorporating Core Web Vitals into its algorithm this year.
I'm pretty sure most of them would double as writers, or illustrators, etc. It's not a normal magazine per se, since the articles are all technical ones. It also mentions many of the 12 are part-time and/or wear other hats.
It's not a static site. It has a site-wide search. It lets users post comments on articles. It also has its own store selling books. And it serves ads too.
Chunking JS just leads to massive latency issues as the client is forced to download dozens (or hundreds) of "efficiently" chunked JS files.
The e-commerce platform Magento 2 is packed with this kind of bullshit and is part of the reason my colleagues and I abandoned it for our clients' large e-commerce websites:
It seems though that in your example, all of the files being loaded are necessary on first load, and the total quantity is huge. Would the problems still be present if less JavaScript were required on first load?
Only because developers absolutely insist on building things that way, for some reason I will never comprehend.
What's the first thing most people do when starting a new project? They ask "what framework should I build this on?" and wind up building their tiny portfolio site on a massively overpowered suite of enterprise-level software with a million features they'll never, ever need.
Then they might drop in ten or fifteen separate external libraries because using vanilla HTML, CSS, and JavaScript is just "so 1995".
Then they start thinking about AJAX and microservices because rendering an entire page on the server side is utterly unthinkable in 2021.
Then they cram the site full of third-party services (because your tiny home-brew website will definitely benefit from Newrelic monitoring).
Then they might start on unit testing.
Finally they package everything up using at least seven different dependency managers, because a GitHub project without 100 useless ancillary files (grunt.js, app.yaml, .travis.yml, composer.json, .gitignore, etc. etc.) is obviously not acceptable.
I remember reading a blog post about a fake conversation between two developers about web development, one of them telling the other how "easy" it is and going through a 5-step process to build their one-page website. That was written at least 5 years ago and things have only gotten worse since then.
Quite the opposite. It needs to be hammered into people. Your app isn't Notion or Google Docs. Your app is a form with a login menu and a couple of small logic elements and a date picker.
So what?
Just because your application is simple does not mean you need to use vanilla tools. Do you get mad at a carpenter for using a nail gun because he 'could more easily swing a hammer'? It is the application of tools, not the tools themselves, that is the problem.
I really can't understand why so many people rag on frontend development as if we (as frontend developers) are just magpies who gather shiny things and don't care about performance or simplicity.
The problem is, most of those complaining have never tried to understand the complexity (and why it exists) because they've never built a highly complex frontend application! They're degrees removed and just throw stones from their glass houses.
Guess what, I'm not picking Spring to make a simple REST API when it takes 5 lines in Express.js. Why? Spring is complete overkill. So along those lines, should you use React/Redux/Redux-{Thunk,Promise,Observable}/Navigation/etc. if you need a simple 2-page site? NO! Jesus Christ, you pick the right tool for the right job.
There's a reason, though: when you have a highly complex app with insane business rules (which could very well be hidden from the armchair HN crowd), you need that complexity. And guess what, if you're highly versed in these "complex" frontend tools (spoiler: they're not complex), then it makes building sites, generally, very easy.
So go ahead, believe what you want and continue to parrot "the frontend ecosystem is hopeless, needlessly complex and godless" while the rest of us continue developing with these "complex" tools because they make our lives as developers easier.
I'll sleep happy while you wait that one more second for the page to load.
I have a slow (apparently) work laptop and the web is horribly slow. Some pages feel like surfin' the pre-2000 web, with all that lag and slow loading every time anything moves.
It certainly feels that way sometimes. But web is rather unique. There aren't any other platforms that demand you deliver an application for "a device" (specifications unknown!) in under 1 second.
That's why there's all kinds of helpful features in HTML/CSS like media queries. You can serve up raw content and let the device make decisions on what resources to use to display the content. You can tell the device the most optimal resources for its particular constraints.
For instance, your style tag can just include a bunch of @import statements. Thankfully, @import statements support media queries [0]. So you get the utility of external style sheets but can load one optimized for the client device. Unlike media queries on link tags, the browser doesn't download style sheets (well, it's not supposed to) that don't match the media queries.
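In practice that looks something like this (the file names are illustrative):

```html
<style>
  /* Only the sheet whose media query matches should be downloaded
     (per the CSS spec — as noted, browsers are not supposed to
     fetch sheets whose @import media query doesn't match) */
  @import url("mobile.css") screen and (max-width: 600px);
  @import url("desktop.css") screen and (min-width: 601px);
</style>
```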
It's also trivial to include a tiny bit of base styling with every page in a style tag. It doesn't blow any size budgets and makes sure a page is readable even with no external assets.
There's such a thing as an unreasonable, unrealistic, or downright stupid demand as well. What you often see is a house of cards hacked together to try and support such demands on the web.
Sending reasonably well visually formatted text is definitely a reasonable demand though.
There is a ridiculous number of moving parts, stacked on top of each other, interacting in unforeseeable ways to allow you to shoot yourself in the foot, and absolutely nothing is straightforward.
OK, that basically describes any kind of software development, but web seems so much worse than anything else. And I say that as someone doing mainly Java backend development who's learned to live with the AbstractProxyFactoryManagerFactory jokes.
Snarky TL;DR: half the JS load time was ad scripts. IMO, most of the performance increase came from specifying image heights, using facades for third-party embeds, and optimizing around ad and analytics scripts.
Nice write-up, but not a big surprise to anyone who blocks analytics tracking, ads, and third-party embeds.