When Google announced the Chrome User Experience Report (CrUX), I was immediately excited and started digging in. It's a 30TB BigQuery dataset packed with insights into how real users experience the web.
But working with the raw data is hard and expensive ($5 per TB scanned). So I built Treo Site Speed (https://treo.sh/sitespeed). It caches CrUX data and makes it accessible to everyone, for free. No signup: just enter a URL and explore Core Web Vitals, server response times, and other speed metrics for the top 8 million websites. You can also build custom metrics using percentiles and intervals (including p99).
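To give a sense of why querying the raw dataset gets expensive, here's roughly the kind of query you'd run against the public CrUX tables to pull one origin's first-contentful-paint histogram. This is just an illustrative sketch (the month in the table name and the printed fields are my assumptions, not Treo's actual code), and each run scans a large chunk of the monthly table:

    from google.cloud import bigquery

    # Requires Google Cloud credentials; BigQuery bills per TB scanned.
    client = bigquery.Client()

    query = """
        SELECT bin.start, SUM(bin.density) AS density
        FROM `chrome-ux-report.all.202001`,
             UNNEST(first_contentful_paint.histogram.bin) AS bin
        WHERE origin = 'https://news.ycombinator.com'
        GROUP BY bin.start
        ORDER BY bin.start
    """

    # Each row is one histogram bin: (start in ms, share of page loads).
    for row in client.query(query).result():
        print(row.start, row.density)

Treo caches results like this so you can explore them without paying for the scans.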
I'd love to hear what the HN community thinks about it. For a quick taste, check out the HN vs Reddit comparison: https://treo.sh/sitespeed/news.ycombinator.com/vs/www.reddit...