And yet, compared to the time you're waiting for that masthead JPEG to load, plus an even bigger React app bundle, it's also completely irrelevant.
HTTP/3 makes a meaningful difference for machines that need to work with HTTP endpoints, which is what Google needed it for: it will save them (and any other web based system similar to theirs) tons of time and bandwidth, which at their scale directly translates to dollars saved. But it makes no overall difference to individual humans who are loading a web page or web app.
There's a good argument to be made about wasting round trips and HTTP/3 adoption fixing that, but it's not grounded in the human experience, because the human experience isn't going to notice it and go "...did something change? everything feels so much faster now".
Deploying QUIC led to substantial p95 and p99 latency improvements when I did it (admittedly a long time ago) in some widely used mobile apps. At first we had to correct our analysis for connection success rate because so many previously failing connections now succeeded slowly.
It's a material benefit over networks with packet loss and/or high latency. An individual human trying to accomplish something in an elevator, parking garage, or crowded venue will care about a connection being faster with a greater likelihood of success.
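The round-trip argument is easy to make concrete. The figures below are illustrative back-of-envelope numbers, not measurements: a fresh TCP + TLS 1.3 connection spends two round trips on handshakes before the first request byte, while QUIC folds transport and crypto setup into one.

```python
# Hedged sketch: time-to-first-response-byte for a fresh connection,
# ignoring packet loss and server processing. RTT values are
# illustrative (fibre, mobile, crowded venue), not measurements.

def ttfb_ms(rtt_ms, handshake_rtts):
    # handshake round trips, plus one round trip for request/response
    return rtt_ms * (handshake_rtts + 1)

TCP_TLS13 = 2  # 1 RTT TCP handshake + 1 RTT TLS 1.3 handshake
QUIC = 1       # combined transport + crypto handshake

for rtt in (20, 100, 400):
    saved = ttfb_ms(rtt, TCP_TLS13) - ttfb_ms(rtt, QUIC)
    print(f"RTT {rtt:3d} ms: saves {saved} ms per fresh connection")
```

At a 400 ms RTT (the elevator / parking garage case), that's 400 ms saved per fresh connection before loss recovery is even considered, which is where QUIC's per-stream loss handling helps further.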
Almost every optimization is irrelevant if we apply the same reasoning to everything. Add all savings together and it does make a difference to real people using the web in the real world.
Google operates at such a scale that tiny performance increases can fund an entire team of engineers and still save money on the bottom line.
For example: Google hires 10 engineers, they deploy HTTP/3, it saves 0.5% CPU usage, Google saves a million dollars, and that covers the salaries of those 10 engineers.
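The arithmetic behind that example, with a fleet cost filled in by assumption (chosen purely so that 0.5% works out to $1M; Google's actual compute spend isn't public in this form):

```python
# Hypothetical figures to illustrate the scale argument above.
fleet_cost = 200_000_000  # assumed annual compute spend, USD
cpu_saving = 0.005        # 0.5% CPU saved by deploying HTTP/3
engineers = 10

saving = fleet_cost * cpu_saving
print(f"${saving:,.0f} saved, ${saving / engineers:,.0f} per engineer")
```

The point isn't the specific numbers; it's that at fleet scale a fraction of a percent pays real salaries, while for a small deployment the same fraction of a percent is noise.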
For the vast majority of society, the savings don't matter. Perhaps even deploying it is a net-negative with a ROI of decades. Or, the incentives can be misaligned leading to exploitation of personal information. For example, see chrome manifest v3.
It absolutely matters. Machines are orders of magnitude faster than they were 20 years ago; most software isn't doing much more than software did 20 years ago. And no, collaborative editing isn't the be-all and end-all, nor does it explain where all that performance is lost.
Many optimizations have bad ROI because users' lives are an externality for the industry. It's Good and Professional to save some people-weeks in development, at the cost of burning people-centuries of your users' life in aggregate. And like with pollution, you usually can't pin the problem on anyone, as it's just a sum of great many parties each doing tiny damage.
>most software isn't doing much more than software did 20 years ago
This isn't exactly true, but some of the potential reasons are pretty bad. Software turning into an ad platform, or otherwise spying on users, is probably one of the most common: it has made numerous corporations wealthier than the gods, at the expense of the user.
What a bizarre thing to say: not every optimization is imperceptible to humans (JPEG, gzip, brotli, JS and CSS payload bundling and minification, etc.), and not all sums of optimizations add up to "something significant in terms of human perception".
HTTP/3 is a good optimization, and you can't sell it based on "it improves things for humans" because it doesn't. It improves things for machines, and given that essentially all internet traffic these days is handled by large scale machine systems, that's a perfectly sufficient reason for adoption.
For a long time all my internet connections were bad (slow, unstable or both). Compressing HTML/CSS/JS, avoiding JS unless absolutely needed, being careful with image sizes and formats, etc, helped a lot... so I guess this makes me biased.
Today I have fibre at home, but mobile networks are still patchy. I'm talking sub-1Mbps and high ping/jitter sometimes. So you can see why "irrelevant" optimisations that remove 300ms from a page reload, brotli/zstd vs no compression, AVIF vs JPEG, etc., are important for me, a human.
It's important to keep in mind that many users out there don't have fast, low-latency connections, at least not all the time. What takes 300ms to complete on our fast machine and fast WiFi at the office might take 1s on someone else's device and connection. It's harder to understand this if we only ever use fast connections and hardware.
That was my point: 300ms sounds like a lot until, like me, you're on a slow connection, and then those 300ms out of an entire 10-second page load are utterly irrelevant. You were already expecting a several-second load time, so that 300ms doesn't even register: the HTTP negotiation on a modern page is _not_ what you're noticing on a slow connection. You're noticing literally everything else taking forever instead.
3% speedup is still pretty good. (especially because with some of the awfulness, it's possible to get bottle-necked by multiple of these in which case it could be 6 or 9%)
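One nit on "6 or 9%": serial speedups compound multiplicatively, so three independent 3% wins land slightly under the naive sum. A quick sketch:

```python
# Three serial stages each getting 3% faster: naive addition vs
# the actual compounded speedup (1 - product of the remaining fractions).
per_stage = 0.03
stages = 3

naive = stages * per_stage
compounded = 1 - (1 - per_stage) ** stages
print(f"naive: {naive:.1%}, compounded: {compounded:.2%}")
```

Close enough that the original point stands: stacking several such wins is how the web actually gets faster.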
omfg: YES! YES IT IS! But you won't notice it, and so the argument that it improves the experience is nonsense, because you as a human WON'T NOTICE THOSE 3 OR EVEN 6%.
It's good because it speeds up the overall response by a measurable degree, not because it makes the experience better. That only happens in conjunction with tons of other improvements, the big ones of which are completely unrelated to the protocol itself and are instead related to how the page is programmed.
How is everyone this bad at understanding that if someone claims A cannot be justified because of B, that does not mean that A cannot be justified? It's near-trivially justified in this case. This crowd really should know better.
> But it makes no overall difference to individual humans who are loading a web page or web app.
Navigating from my phone at 4g and my fiber connection has drastic differences.
Especially noticeable when on vacation or in places with poor connections: TLS handshakes can take many, many seconds. After the handshake, with an established connection, it's very different.
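You can see the handshake cost for yourself with nothing but the standard library. This is a sketch, with "example.com" as a placeholder host; on a lossy mobile link the second number is the one that balloons.

```python
# Times the TCP connect and the TLS handshake separately for a fresh
# connection. Both are one-or-more round trips, so on a high-latency or
# lossy link each can stretch into seconds.
import socket
import ssl
import time

def handshake_times(host, port=443, timeout=10):
    """Return (tcp_ms, tls_ms) for a fresh connection to host."""
    ctx = ssl.create_default_context()
    t0 = time.monotonic()
    raw = socket.create_connection((host, port), timeout=timeout)
    t1 = time.monotonic()
    tls = ctx.wrap_socket(raw, server_hostname=host)  # TLS handshake happens here
    t2 = time.monotonic()
    tls.close()
    return (t1 - t0) * 1000, (t2 - t1) * 1000

try:
    tcp_ms, tls_ms = handshake_times("example.com")
    print(f"TCP connect: {tcp_ms:.0f} ms, TLS handshake: {tls_ms:.0f} ms")
except OSError as e:
    print(f"no network available: {e}")
```

QUIC collapses those two phases into one handshake, which is exactly the win being described here.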