This is powered by Google Lighthouse, with the benefit of being run from a web UI instead of a DevTools audit, which is both good and bad.
Good because Lighthouse has some reasonable best practices to follow and a few good performance timings, so lowering the barrier to entry is nice.
Bad because many of Lighthouse's best practices aren't always applicable (our major media customers constantly say "stop telling me I need a #$%ing Service Worker!"). And while Speed Index and Start Render are great, Time to Interactive, First CPU Idle, and Estimated Input Latency are still fairly fluid/poorly defined, and of differing value.
This also overlooks the value that something like the browser's User Timing API provides (stop trying to figure out what a "contentful" or "meaningful" paint is, and let me just use performance.mark to tell you "my hero image finished and the CTA click handler registered at X"), which Lighthouse doesn't surface up.
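For anyone unfamiliar, a minimal sketch of what that looks like (the element IDs and handler name here are hypothetical):

    // Mark when the hero image has finished loading.
    document.querySelector('#hero').addEventListener('load', () => {
      performance.mark('hero-image-loaded');
    });

    // Mark once the CTA click handler is registered.
    function handleCtaClick() { /* ... */ }
    document.querySelector('#cta').addEventListener('click', handleCtaClick);
    performance.mark('cta-handler-registered');

    // A RUM/monitoring script can read the marks back later:
    for (const entry of performance.getEntriesByType('mark')) {
      console.log(entry.name, entry.startTime);
    }

The point being: the page author declares what "ready" means, instead of a tool guessing at it.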
What is interesting is the monitoring side. WebPageTest, Lighthouse, PageSpeed Insights, YSlow, etc. are just point-in-time assessments, which are largely commoditized. Tracking this stuff over time and extracting meaningful data is valuable, so that's pretty cool.
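The homegrown version of that usually looks something like scripting the Lighthouse node module on a schedule and storing the scores. A rough sketch, assuming the lighthouse and chrome-launcher npm packages (older CommonJS style):

    const lighthouse = require('lighthouse');
    const chromeLauncher = require('chrome-launcher');

    async function collectScore(url) {
      const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
      const result = await lighthouse(url, {output: 'json', port: chrome.port});
      await chrome.kill();
      // result.lhr is the full Lighthouse report object; persist what you
      // need, e.g. the overall performance score, keyed by timestamp.
      return {
        time: Date.now(),
        performance: result.lhr.categories.performance.score,
      };
    }

    // Run this from cron/CI and append to a time series to get trends.
    collectScore('https://example.com').then(console.log);

Building and babysitting that pipeline (consistent hardware, throttling, storage, alerting) is the part that isn't commoditized.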
Disclaimer: I work in the web performance space. People replace homegrown Lighthouse, Puppeteer, or WPT instances with our commercial software, so I'm biased. However, I do like a lot of the awareness-raising and trailblazing Google is doing around what performance/UX means.
But I'm getting the impression that you want Lighthouse to surface up this information in a different way. Please feel free to elaborate.
Disclaimer: I write the docs for Lighthouse. I'm speaking from my general knowledge of the project but haven't vetted these comments with my team, so consider all comments my own.