If you're interested in this sort of optimization then Zach Leatherman's "Speedlify" is a good tool for doing continuous monitoring - https://www.zachleat.com/web/speedlify/
Also you always have to remember that users won't necessarily hit the homepage first. If they click a link from a search or a social media post they might end up on any page in the site. Everything that you do to optimize the homepage should be done for every page. You end up with cacheable assets that are shared between pages and everything else is only downloaded when the user hits a page that actually needs it. That's the way to a fast site.
With HTTP/2 request multiplexing, having lots of small files is not a bad thing. You can also add preload hints for big assets like fonts. There are lots of ways to make sites load better.
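For fonts specifically, the hint usually takes the form of a preload link (or the equivalent `Link` response header); a sketch, with an illustrative font path:

```html
<!-- Tell the browser to fetch the font early, instead of waiting until
     the CSS that references it has been downloaded and parsed.
     /fonts/site.woff2 is an illustrative path. -->
<link rel="preload" href="/fonts/site.woff2" as="font" type="font/woff2" crossorigin>
```

Note the `crossorigin` attribute is required for fonts even on the same origin, because font requests are made in anonymous CORS mode.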
Also you're not just gaining for the second page, you're gaining for all pages visited on the site going forward.
The Opportunities and Diagnostics sections don't contribute to your Performance score. Your overall Performance score is a weighted average of the Performance metric scores. The Opportunities and Diagnostics sections are just potential ideas for changes that may help you. It's up to you to decide what's best for your site.
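As an illustration of the weighted-average scoring (the metric names, scores, and weights below are made up for the example, not Lighthouse's actual weights, which vary by version):

```javascript
// Illustrative sketch only: the overall score is a weighted average of
// per-metric scores in [0, 1]. The weights here are invented for the
// example; they are NOT Lighthouse's real weights.
function performanceScore(metricScores, weights) {
  let total = 0;
  for (const [metric, weight] of Object.entries(weights)) {
    total += weight * metricScores[metric];
  }
  return Math.round(total * 100);
}

// 0.4*0.9 + 0.4*0.7 + 0.2*1.0 = 0.84 → a score of 84
console.log(performanceScore(
  { lcp: 0.9, tbt: 0.7, cls: 1.0 },
  { lcp: 0.4, tbt: 0.4, cls: 0.2 },
));
```

This is why a single red Opportunity doesn't directly move the number: only the metric scores feed the average.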
Disclosure: I work on Lighthouse and web.dev
I use a Lighthouse score for troubleshooting, but at the end of the article there is a chart of the change in Core Web Vitals, which tracks performance of _all_ pages indexed by Google over time.
> "…nearly everything it suggests you do will make that second page load slower…"
This is just not true for _any_ of the suggestions in the article. I guess you could make this argument for breaking one CSS bundle into four, but their contents were picked based on analytics of where people go most often. We make one extra request but load much less cruft, and it applies to any page of the website.
A common practice is to use Lighthouse CI to test a set of representative pages across your site (such as your homepage, product search pages, and product detail pages if you're an e-commerce site). Every time that someone submits a pull request, Lighthouse CI runs Lighthouse against all of the representative pages to help you gain confidence that you have not introduced weird regressions on the rest of your site. https://web.dev/lighthouse-ci
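For reference, a minimal Lighthouse CI config along those lines might look like this (the URLs and run count are illustrative, made-up examples of "representative pages"):

```javascript
// lighthouserc.js — illustrative Lighthouse CI config sketch
module.exports = {
  ci: {
    collect: {
      url: [
        'https://staging.example.com/',
        'https://staging.example.com/search?q=shoes',
        'https://staging.example.com/product/12345',
      ],
      numberOfRuns: 3, // median out run-to-run variance
    },
    assert: {
      preset: 'lighthouse:recommended',
    },
  },
};
```

Running `lhci autorun` in the pull-request pipeline then collects results and asserts against every listed page.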
> nearly everything it suggests you do will make that second page load slower
Which audits exactly are you referring to? To be frank, I highly doubt that claim.
1. Have your initial landing page load and render as fast as possible, and cut stuff out to make this happen.
2. That initial page will usually take the user at least a couple seconds to read. While they are doing that, you can load stuff in the background that is needed for other parts of your site, like larger JS bundles. There are a number of ways to do this.
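The simplest form of step 2 is a prefetch hint on the landing page (a sketch; the bundle path is illustrative):

```html
<!-- Hypothetical: while the user reads the landing page, ask the browser
     to fetch the heavier bundle for the rest of the site at low priority.
     It lands in the cache, so the next navigation doesn't pay for it. -->
<link rel="prefetch" href="/assets/app-rest.js" as="script">
```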
My main pet peeve with Lighthouse is the "Eliminate render-blocking resources" audit. It suggests inlining critical CSS and deferring the rest. However, in my experience with a limited testing group, there is no perceivable difference at all.
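For context, what that audit suggests is roughly this pattern (a sketch of the common loadCSS-style approach; file names are illustrative):

```html
<head>
  <!-- Critical, above-the-fold rules inlined so first paint doesn't
       wait on a stylesheet request -->
  <style>/* critical rules here */</style>

  <!-- The full stylesheet is fetched without blocking render, then
       applied once it arrives -->
  <link rel="preload" href="/css/site.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/site.css"></noscript>
</head>
```

Whether this is worth the build complexity depends on the stylesheet's size and the connection speed; on fast connections the difference can indeed be imperceptible.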
I also have way too many websites where the origin summary and field data show all green, but the lab data is almost always in either yellow or red.
There are many, many research and case studies correlating improved website performance with improved business metrics (such as conversions).
Google Search has also signaled that the Core Web Vitals will become a ranking signal.
It helps push general performance and best practice forward. And it's communicated in a way non-techies understand.
Almost too easily - we've had some 'consultants' run 'reports' that are just Lighthouse re-badged, coupled with no technical understanding of what the action points mean, or why they may not be suitable for the platform or workflow in use.
But for the most part it's a positive tool, thanks!
It's not as important as your page rendering correctly, or finding the cure for cancer. It is, however, a nice performance goal to strive for, encapsulated in an easy-to-use scoring tool.
Its results might be arbitrary, but for some industries, when Google says "jump", you say "how high?".
Suggestions/feedback on the Lighthouse guides is also welcome (the documentation that you see after clicking those "Learn more" links).
LCP. Oh boy, it's hard to track down what will make a difference. Is it the font, the CSS sizing of the element, the images in it...? Help on diagnosing what's causing the LCP time would be great.
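One way to at least identify which element is being counted is a PerformanceObserver snippet pasted into the browser console (browser-only, so it's only meaningful during a real page load):

```javascript
// Browser-only: logs each LCP candidate as the page loads, so you can
// see which element (image, heading, etc.) sets the final LCP time.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // entry.element is the DOM node; entry.startTime is its render time
    console.log('LCP candidate:', entry.element, entry.startTime);
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });
```

The last candidate logged before any user input is the one the metric reports.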
But by far the least helpful metric is when I'm told that analytics.js could be optimised further. A (I assume) Google-sponsored tool is telling me I should improve the result from another Google tool. I appreciate why it happens, but it's frustrating :)
* Only to learn earlier this week from Paul Irish that some sites can't reach 100 due to the way the scoring curves are set.
No problem, I should have led off with that because it's usually the most actionable feedback. Thanks for the feedback.
When we used to measure TTFB, DomContentLoaded and page fully loaded, I understood what those meant and could relate them to what's going on on-screen.
The Core Web Vitals are a bit abstract to reason about - and so are the links between them. FCP vs. LCP, how they interplay...
I'd like a video where a page loads and PING! a vital happens, the page pauses and someone talks about that vital, what's led up to it, what's affected it and what would change it. Then the page unpauses until the next vital is reached. Something to make it relatable.
Images aren’t appearing as well.
Our mobile score is lower (86 according to PageSpeed), but I mention that in a blog post.
Edit: seems to be back up but slow
A quick examination of the sites on https://www.11ty.dev/speedlify/ suggests there's a distinct look/set of trade-offs made to get this score.
Google Analytics offers many site speed metrics that are tied to real-world visits and can be correlated with other metrics, behaviors, and conversions on the site itself. These numbers also give you visibility on the entire site and not just a single page that you've run through the Lighthouse tool.
I don't want to discourage people from making the web faster but Lighthouse scores are about as helpful as domain authority and Alexa rank when you can take a detailed look at your users through Google Analytics and get more granular performance analysis from WebPageTest.org (which also provides Lighthouse scores, can be run privately, simulates various devices and locations, and much more).
I have seen improvements from speeding up pages, particularly on mobile. Postmark is unusual in having so many desktop viewers.