Hacker News
How we improved our website's performance (smashingmagazine.com)
123 points by kkm on Jan 24, 2021 | 70 comments



> Around the same time we switched from an (outdated) manually created critical CSS file to an automated system that was generating critical CSS for every template — homepage, article, product page, event, job board, and so on — and inline critical CSS during the build time. Yet we didn’t really realize how much heavier the automatically generated critical CSS was.

Is reducing the total amount of CSS per page so you don't have to calculate the critical CSS at all an option?

To throw my own page into the ring, here's a non-trivial product website of mine where the homepage is 0.3MB total over the wire and renders in 0.4 seconds for me (includes custom fonts, payments, analytics, large screenshot and real-time chat widget):

https://www.checkbot.io/

The major tricks I'm using are: keeping the website CSS small (CSS for the homepage + rest of site gzips to less than 8KB), inlining all CSS, rendering default fonts before the custom font has loaded, using SVG for all images (this saves a ton), and not using JavaScript for content (which would block rendering).

The screenshot in the header is in SVG format and inlined directly into the page along with the CSS, so the moment the HTML arrives the browser can display all above the fold content. Logos are another good one for the SVG + inline treatment.
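
Roughly what that looks like in the markup (a simplified sketch, not the actual page):

    <head>
      <style>
        /* entire site CSS inlined here, <8KB gzipped */
      </style>
    </head>
    <body>
      <header>
        <!-- screenshot inlined as vector paths: no extra request needed -->
        <svg viewBox="0 0 1280 800" role="img" aria-label="Product screenshot">
          ...
        </svg>
      </header>
    </body>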


The problem with inlining all CSS is serving all the global styles all over again with every page load. You're gaining a great first impression at the cost of a poorer experience for every subsequent page load.

Have you considered inlining CSS in the head (as you've done), but then serving it again as a linked CSS file just before </body>?

Then, with subsequent page loads (of the current or other pages), you don't have to inline any CSS anymore.

Of course this requires that you serve two versions of all your pages, one for if that page is a first hit, and another to be served to users who already have your CSS cached.
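
A sketch of the two variants (hypothetical file name; whether a visitor already has the CSS could be tracked with a cookie):

    <!-- variant 1: first visit, nothing cached -->
    <head>
      <style>/* full site CSS inlined */</style>
    </head>

    <!-- variant 2: returning visit, CSS already cached -->
    <head>
      <link rel="stylesheet" href="/css/site.9f2c1.css">
    </head>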


> The problem with inlining all CSS is serving all the global styles all over again with every page load.

> You're gaining a great first impression at the cost of a poorer experience for every subsequent page load.

Yep, it's unfortunate we still have to make tradeoffs like this. At least in this case, it's less than 8KB added per page vs something more complicated that might break.

HTTP/2 server push was getting close to offering an alternative along the lines you're suggesting (the server pushes the CSS file to the client in parallel with the initial HTML page, and the client can decline it if the file is already cached), but it's being deprecated now.
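
For reference, push could be triggered with a one-line server config, e.g. in nginx 1.13.9+ (the stylesheet path is made up):

    location / {
        http2_push /css/site.css;
    }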


But at 8KB gzipped (I assume the following pages aren't that much worse), why should one optimize with a solution that adds that much complexity?

I believe that shaving a few additional kb from this already low number isn't worth the proposed complexity.

But this is a tradeoff that everybody has to decide for themselves.


> But at 8KB gzipped (I assume the following pages aren't that much worse), why should one optimize with a solution that adds that much complexity?

If that's the argument, why not just let them take the (tiny) initial 8kb hit as an external css file, and make the rest of the experience even "zippier"?


> why not just let them take the (tiny) initial 8kb hit as an external css file,

One reason is Google will likely ding your Core Web Vitals score for it, which is becoming a ranking signal in May this year (https://developers.google.com/search/blog/2020/11/timing-for...) so you have less of a choice here for what you prioritise.


I would guess that fetching 8KB via an additional request would harm performance more than inlining it.

But I haven't tested that yet.


It would most certainly marginally harm performance a single time, but every additional page load would be much improved.


Unfortunately this means it’s difficult to handle HTTP caching without having something in the URL, and at the same time you want to make sure search engines index the version without the indicator in the URL.
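
For illustration: if returning visitors were served a variant URL, a canonical link would keep search engines on the clean one (URLs made up):

    <!-- served at https://example.com/article?cssCached=1 -->
    <link rel="canonical" href="https://example.com/article">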


Inlining is a measured risk. If your CSS is sufficiently small, and the number of pages per visit is low, it's still faster than cached CSS.


> The problem with inlining all CSS is serving all the global styles all over again with every page load. You're gaining a great first impression at the cost of a poorer experience for every subsequent page load.

The vast majority of users will visit one and only one page.


why would you optimize your website for someone who only visits the one page?


On the extreme end, consider someone on a spotty and slow mobile connection going through Google results to find what they want. They're only going to give you a few seconds before they hit "back" on your site and try another. The round trip for the browser to fetch the CSS file after the HTML file arrives can be enough to cause that delay.


Because for every second of delay you add, you reduce your potential audience by a large degree. You optimise for first impressions, even though you know that a major proportion will click away anyway.


Because most days you don't add more than one piece of content.


You'd need to make that link non-blocking (e.g. with rel="preload"), or the browser will block rendering until the CSS file has been downloaded, parsed, and added to the CSSOM.
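
That pattern is commonly written like this (hypothetical file name; the noscript fallback covers JS-disabled visitors):

    <link rel="preload" href="/css/site.css" as="style"
          onload="this.onload=null; this.rel='stylesheet'">
    <noscript><link rel="stylesheet" href="/css/site.css"></noscript>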


true, but if the 1st impression is great and the 2nd isn't worse – what else do we want?


That's one of the fastest loading modern pages I've ever been to. Kudos!


Thanks! I wouldn't mind seeing a list of fast loading pages like this that have images + functionality that aren't mostly text. There's only so much you can do if your design requires large photos or a video in the header though.


Whoa. I've never seen a page load this fast. How did you do it? Is there a blog post somewhere?


I need to write it up but the SVG screenshot is documented on https://www.checkbot.io/article/web-page-screenshots-with-sv... and I'm using all the tricks from https://www.checkbot.io/guide/speed/ that the Chrome extension checks for.

The big wins are: don't require JS for any content in the page header, use minimal CSS + inline it, strongly prefer SVG images over bitmaps + inline them, use "font-display: swap" + go easy with how many fonts you need.
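
The font part in CSS looks roughly like this (font name and path are made up):

    @font-face {
      font-family: "BrandFont";
      src: url("/fonts/brandfont.woff2") format("woff2");
      /* render fallback text immediately, swap in the web font when it loads */
      font-display: swap;
    }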


Just wanted to say that I’ve been using Checkbot for years and it’s great


Thanks, that made my day. :)


Fantastic. Design is not just how something looks, it is how it works (as Steve Jobs said).

Do you folks know the entertaining talk http://idlewords.com/talks/website_obesity.htm?


How did you create the screenshot SVG and other SVGs?


SVG screenshots are documented here: https://www.checkbot.io/article/web-page-screenshots-with-sv...

The main trick is to view your web app in a browser, print to PDF and then use Inkscape to auto convert from PDF -> SVG (with some caveats). Worst case you could recreate the screenshot from scratch in a vector editor.
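
With Inkscape 1.x the conversion step can be scripted, something like this (exact flags depend on your version):

    inkscape screenshot.pdf --export-type=svg --export-filename=screenshot.svg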

The SVG icons are licensed from https://icons8.com/ (recommended!).


Thank you. You have been tremendously helpful


It's still way too slow. It's a big page of text. It should load in an instant.

In my browser, I see 1.75 MB sent over the wire and a 2.5 second load time. My big pages of text [1] need 105 kB and load in 0.4 seconds. Their compressed critical CSS is the same size as my entire uncompressed CSS file. They send more CSS bytes than I send bytes in total.

If you want to make a content website fast, it's quite simple: send just the content.

[1] https://allaboutberlin.com/guides/german-health-insurance


> It should load in an instant.

FWIW, it does load in an instant for me. (Lighthouse Performance = 99, Speed Index = 0.4s.)


Me too, but I'm loading it on a $3000 computer with 150Mbps/<10ms internet. Generally it's a good idea to only focus on the mobile score. :)


It also loads very fast (close to instantly) for me on a pretty old mobile (moto g6). Most websites are slow to load for me on this device. This one isn't.


Your website was way faster on mobile data, kudos.


I was thinking of the person reading about something on the U-Bahn on their way home.


Nice to see another company covering all these steps and validating the work we've done at my company. Unfortunately we weren't as successful, or at least, our results were not as fruitful. A good score for our site (tracker.gg) is 70 on mobile. Turns out it's pretty hard to optimize the bootstrapping of an application that can render 20 different websites! Mobile devices spend 1200ms on the main thread. It will be interesting to see how these changes impact our page rank when Google starts incorporating Core Web Vitals into its algorithm this year.


12 people for a static site seems like a huge team to me.


I'm pretty sure most of them would double as writers, or illustrators, etc. It's not a normal magazine per se, since the articles are all technical ones. It also mentions many of the 12 are part-time and/or wear other hats.


It's not a static site. It has a site-wide search. It lets users post comments on articles. It also has its own store selling books. And it serves ads too.


I’m not entirely sure you quite get the phrase, unless you feel that the definition of a “static site” is too permissive.



How could this be improved?



This site is atrocious. About 4.5% of the world's population has some level of color vision deficiency.

How do people come up with this shit?


key takeaways:

- fonts can be unnecessarily huge

- monolithic JS = slow. Chunk at build time.

- content-visibility: auto for lazy rendering
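
A minimal sketch of the last one (class name made up):

    .below-fold-section {
      content-visibility: auto;        /* skip rendering work until near the viewport */
      contain-intrinsic-size: 0 500px; /* estimated height so the scrollbar doesn't jump */
    }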


Chunking JS just leads to massive latency issues as the client is forced to download dozens (or hundreds) of "efficiently" chunked JS files.

The e-commerce platform Magento 2 is packed with this kind of bullshit and is part of the reason my colleagues and I abandoned it for our clients' large e-commerce websites:

https://magento.stackexchange.com/questions/104583/magento-2...

https://old.reddit.com/r/Magento/comments/bli7vz/seriously_w...

https://magento.stackexchange.com/questions/277544/page-load...

https://magento.stackexchange.com/questions/270553/is-it-pos...


It seems though that in your example, all of the files being loaded are necessary on first load, and the total quantity is huge. Would the problems still be present if less JavaScript were depended upon on first load?


Magento 2 sucks, but clients are stupid and think that 10+ page loads are ok.


Page Speed Insights shows this site scoring a poor 39/100.


"Modern" web development is apparently a game of Jenga.


Only because developers absolutely insist on building things that way, for some reason I will never comprehend.

What's the first thing most people do when starting a new project? They ask "what framework should I build this on?" and start their tiny portfolio site built upon a massively overpowered suite of enterprise-level software with a million features they'll never, ever need.

Then they might drop in ten or fifteen separate external libraries because using vanilla HTML, CSS, and JavaScript is just "so 1995".

Then they start thinking about AJAX and microservices because rendering an entire page on the server side is utterly unthinkable in 2021.

Then they cram the site full of third-party services (because your tiny home-brew website will definitely benefit from Newrelic monitoring).

Then they might start on unit testing.

Finally they package everything up using at least seven different dependency managers, because a GitHub project without 100 useless ancillary files (grunt.js, app.yaml, travis.yml, composer.json, .gitignore, etc. etc.) is obviously not acceptable.

And this is why I hate modern web development.


I remember reading a blog post about a fake conversation between two developers about web development, with one of them telling the other how "easy" it is and going through a 5-step process to build their 1-page website. That was written at least 5 years ago and things have only gotten worse since then.


This is such a tired take. Logic has been moved from the backend to the frontend (again) which has justified the complexity.


> This is such a tired take

Quite the opposite. It needs to be hammered into people. Your app isn't Notion or Google Docs. Your app is a form with a login menu and a couple of small logic elements and a date picker.


So what? Just because your application is simple, does not mean you need to use vanilla tools. Do you get mad at a carpenter for using a nail gun because he 'can more easily swing a hammer'? It is the application of tools not the tools themselves that is the problem.

I really can't understand why so many people rag on frontend development as if we (as frontend developers) are just magpies who gather shiny things and don't care about performance or simplicity.


> I really can't understand why so many people rag on frontend development

May be you should try to listen to them more carefully and make an attempt to understand their woes.


The problem is, most of those complaining have never tried to understand the complexity (and why it exists) because they've never built a highly complex frontend application! They're degrees removed and just throw stones from their glass houses.

Guess what, I'm not picking Spring to make a simple REST api when it takes 5 lines in Expressjs. Why? Spring is complete overkill. So along those lines, should you use React/Redux/Redux-{Thunk,Promise,Observable}/Navigation/etc if you need a simple 2 page site? NO! Jesus christ, you pick the right tool for the right job.

There's a reason, though, when you have a highly complex app with insane business rules (which could very well be hidden from the armchair HN crowd): you need that complexity. And guess what, if you're highly versed in these "complex" frontend tools (spoiler: they're not complex), then it makes building sites, generally, very easy.

So go ahead, believe what you want and continue to parrot the "the frontend ecosystem is hopeless, needlessly complex and godless" while the rest of us continue developing with these "complex" tools because they make our lives as developers easier.

I'll sleep happy while you wait that one more second for the page to load.


I agree with what you're saying.

I think what most people are frustrated about with the frontend ecosystem (And many other areas of software) is actually misuse of tools.


I have a slow (apparently) work laptop and the web is horribly slow. Some pages feel like surfin' the pre-2000 web, with all that lag and slow loading every time anything moves.

For myself, I would appreciate more such takes.


It certainly feels that way sometimes. But web is rather unique. There aren't any other platforms that demand you deliver an application for "a device" (specifications unknown!) in under 1 second.


That's why there are all kinds of helpful features in HTML/CSS like media queries. You can serve up raw content and let the device decide which resources to use to display it, telling the device the optimal resources for its particular constraints.

For instance, your style tag can just include a bunch of @import statements. Thankfully, @import statements support media queries [0], so you get the utility of external style sheets but can load one optimized for the client device. Unlike with media queries on link tags, the browser doesn't load style sheets that don't match the media queries (well, it's not supposed to).

It's also trivial to include a tiny bit of base styling with every page in a style tag. It doesn't blow any size budgets and makes sure a page is readable even with no external assets.

[0] https://developer.mozilla.org/en-US/docs/Web/CSS/@import
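
A sketch of the pattern (file names made up; note that @import rules must precede other rules in the sheet):

    <style>
      @import url("mobile.css") screen and (max-width: 40em);
      @import url("desktop.css") screen and (min-width: 40.01em);
      /* tiny base styling so the page is readable even if imports fail */
      body { font-family: sans-serif; line-height: 1.5; }
    </style>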


Web is indeed unique. No one else would consider a magazine article "an application".


There's such a thing as an unreasonable, unrealistic, or downright stupid demand as well. What you often see is a house of cards hacked together to try and support such demands on the web.

Sending reasonably well visually formatted text is definitely a reasonable demand though.


I don't understand the point you're making. Mind elaborating?


There is a ridiculous number of moving parts, stacked on top of each other, interacting in unforeseeable ways to allow you to shoot yourself in the foot, and absolutely nothing is straightforward.

OK, that basically describes any kind of software development, but web seems so much worse than anything else. And I say that as someone doing mainly Java backend development who's learned to live with the AbstractProxyFactoryManagerFactory jokes.


Thanks! I appreciate the explanation.


And designing to please Google.


The layout jumps


Snarky TLDR: half the JS load time was ad scripts. IMO most of the performance increase was from specifying image heights, using facades for third-party embeds, and optimizing around ad and analytics scripts.
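
For reference, reserving image space so the layout doesn't shift is as simple as (dimensions illustrative):

    <img src="hero.jpg" width="1200" height="630" alt="Product screenshot">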

Nice write up but not a big surprise to anyone that blocks analytics tracking, ads, and third party embeds.


Such an odd design for the site.


[flagged]


Well, if they "discover" it, then they did "learn" something.


[flagged]


[flagged]


Reddit v3.



