The team at gov.uk is doing an excellent job regarding web performance, credit where credit is due. In many ways, it's role model behavior.
Still, the results are such a stretch as to not have much meaning. They have to descend all the way to 2G to see any meaningful difference, and I'm assuming these are cold visits (typical in lab-based testing).
For those exceptional users, this creates a difference from very poor (12 seconds) to still poor (9 seconds). Probably less, because they'd normally have a warmed-up cache.
Is it empathetic to improve performance for those users? Very much yes; do it as far as your budget allows. But when it comes to jQuery specifically, the conclusion is that its negative impact is negligible.
12 seconds down to 9 seconds is not negligible. What is negligible is 12 down to 11.999 seconds because you prematurely optimized.
And even for "normal" users, shaving milliseconds matters. The general idea is that a UI should respond within 100ms to feel instantaneous. More than 1s and you interrupt the flow of thought. According to Google research, increasing page load time from 1s to 3s increases bounce rate by 32%.
Performance matters, a lot. I don't live in the UK, but if I did, I would be happy to pay taxes that make government websites 25% faster.
And even without the performance benefits, I consider removing dependencies a good thing. jQuery was great in the IE6 days, but now that even 10 year old browsers have decent JS, it is not as important as it once was. Modern versions of jQuery don't even support IE6 anymore.
That is not what the person you were replying to claimed. Their claim was that the overall effect of removing jQuery was negligible, and they made a persuasive argument to support it, while also acknowledging that making modest improvements to the experience of a minority of users was an admirable achievement.
> What is negligible is 12 down to 11.999 seconds because you prematurely optimized
Who could disagree with that verdict on an imagined scenario? I would even go so far as to say it could be considered negligible whether or not the optimization was premature.
> And even for "normal" users, shaving milliseconds matters. The general idea is that a UI should respond within 100ms to feel instantaneous.
9 seconds is pretty fucking far from 100ms. Far enough that I feel like shaving milliseconds off does not in fact matter to the majority of “normal” users.
> According to Google research, increasing page load time from 1s to 3s increases bounce rate by 32%.
Fascinating. What does the research say about increasing the load time from 9 to 12 seconds? Those are after all the figures that are relevant to this discussion.
> Performance matters, a lot. I don't live in the UK, but if I did, I would be happy to pay taxes that make government websites 25% faster.
I’m sure the UK government is thrilled that you approve of how they spend their tax revenue. But just to reiterate: We’re not talking about a 25% improvement of websites across the board. We’re talking about a 25% improvement of a specific page for a small subset of users.
> And even without the performance benefits, I consider removing dependencies a good thing. jQuery was great in the IE6 days, but now that even 10 year old browsers have decent JS, it is not as important as it once was. Modern versions of jQuery don't even support IE6 anymore.
I doubt anyone disagrees with this as a general statement. Certainly it would probably be a mistake to take on jQuery as a dependency today. However, it does not cost nothing to remove jQuery from a code base that has depended on it for a long time. And the benefit has to be weighed against the risk and the cost, not least of which is the opportunity cost: what could the programmers tasked with removing jQuery have done with their time instead?
Developers really don't care. They will claim otherwise with great conviction, but it's all posturing and bullshit. If developers did care, they would measure everything and challenge popular assumptions. Instead, most developers want to randomly guess at what works and talk about tools. That's why most pages will never come close to being vaguely fast. 9 seconds is still really slow.
My hamster mobile can achieve 0 to 60mph in under 8 seconds while a Corvette C8 takes about 2.6 seconds. Nobody gives a shit what socket wrench they used to put the tires on.
I think you misunderstood the comment you replied to. Their point wasn't that 12s to 9s is a negligible improvement generally. The point was that the improvement only applied to a tiny fraction of their users and it was still a bad experience, precisely because it was so far from 1s.
Has anyone been able to reproduce this "Google Research" that is so often quoted? They have numbers related to conversion too. I know I'm sounding doubtful, but bringing response time down by a second or so isn't a magic potion for more money pouring in the door. Not that anyone should ignore performance; it's irresponsible to ignore perf. I just know from experience: when buying something on Black Friday at Amazon, if the page doesn't respond in 2 seconds... you probably are going to wait a bit longer.
Google's results consistently don't hold up for high-intent visitors, yeah. A .gov site isn't (overly) concerned about bounce rate: people have no other location to bounce to, so they will wait out the latency to do what they need to do.
It does still help of course, just at tiny fractions of the impact that Google saw.
IBM researched this in the late '70s and reached broadly the same conclusion. They were talking about "productivity" instead of "bounce rate" back then, but the basic biology of attention span and working memory is the same.
He and Richard P. Kelisky, Director of Computing Systems for IBM's Research Division, wrote about their observations in 1979, "...each second of system response degradation leads to a similar degradation added to the user's time for the following [command]. This phenomenon seems to be related to an individual's attention span. The traditional model of a person thinking after each system response appears to be inaccurate. Instead, people seem to have a sequence of actions in mind, contained in a short-term mental memory buffer. Increases in SRT [system response time] seem to disrupt the thought processes, and this may result in having to rethink the sequence of actions to be continued."
The Google Research that is often quoted might have been well intended, but is grossly overstated and misused.
The thing with bounce rate is that you don't know WHY the user bounced, nor do you know their intent. Typically analytics aren't even loaded yet. Sure, I believe a correlation forms as you dramatically increase page response time, but you still cannot attribute a high bounce rate to performance alone, which the often-quoted line does imply.
Case in point, most of Google's properties don't even come close to their own performance guidelines.
At the time that those metrics were being thrown around, Google was knee-deep in pushing AMP. They visited us at Overstock several times to cajole us into using it with cries of, "will you please consider your users in less fortunate countries on less than stellar networks". That's fair I guess. We reduced our bundle size to 185k. They still tried to convince us to use AMP. We travelled out to Sunnyvale for a hackathon at the Googleplex, they were pushing AMP all over the place. AirBnB were the only folks willing to stand up and flat out say, "because AMP is terrible". (side story - I realize)
I need no convincing that performance matters; I'm a performance engineer by profession. You're kind of contradicting yourself. The numbers you use (100ms, 1s, 3s, etc.) are well-researched performance perception thresholds linked to our biology; in a way, they are timeless guidelines.
That scale of attention span places both 9s and 12s in the exact same box: you've completely lost your user's train of thought.
Is 9s better than 12s? Obviously, yes. But that wasn't the point. My point was that there is such a thing as diminishing returns. If I were to follow your train of thought (performance absolutism), we should next optimize for WAP, if you're old enough to know what it is.
> They have to descend all the way to 2G to see any meaningful difference
Somewhat shockingly, according to https://www.cambridgewireless.co.uk/news/cw-journal/why-2g-w... 12% of mobile network connections in the UK are still on 2G? That's surprisingly high usage, and would definitely be worth optimizing for. Although usage that high would also explain why the 2G shutdown is still 10 years(!) away for the UK.
Phones still on 2G also seem likely to be phones that don't have the best web browsers, so cold visits may be more representative than one might expect.
I wouldn’t read too much into 2G figures meaning 2G-only phones (or 3G). In my experience once you get out of populated areas connections often drop to 2G, even on my 5G capable iPhone. It’s mostly a coverage issue not a handset issue.
That's somewhat surprising. The US, which has plenty of rural areas with cell coverage, phased out 2G basically entirely over the past few years (only T-Mobile has a 2G network at all in the US now, and it's due to be sunset at the end of the year). Heck, the US's 3G networks are all shut down already too, except for Verizon's, which is due to be turned off at the end of the year. Canada, also no stranger to rural areas, has completed its 2G shutdown as well. What's the struggle with the UK getting 4G/LTE coverage out to those areas? Just no investment?
> The US, which has plenty of rural areas with cell coverage, phased out 2G basically entirely over the past few years (only T-Mobile has a 2G network at all in the US now,
There are still big gaps. My parents live in the middle of nowhere. Their service isn't classified as 2G, but the 5G service is so spotty and the signal so weak that it might as well be. The US has a documented history of producing data to avoid serving these areas; the same goes for internet service.
It varies wildly depending on which network you are on. For example, on EE I get 4G coverage almost everywhere - even across remote parts of the Scottish Highlands and Islands. OTOH, my missus is with Three, and she rarely gets 4G at all outside of towns and cities.
Exactly. It was cute for a few years, but these jQuery-bashing think pieces are so worn out now. As React and the long tail of frameworks slide into their twilight years, it's clear in retrospect that jQuery was just a much more responsible and sound architecture (mostly because it cleaved to standards) than any of the things that spawned the 1000 "why I'm leaving jQuery for _this_" blog posts every day.
I think you are conflating two different things: jQuery -> vanilla JS and jQuery -> framework (React/Vue/Angular/etc). Though I'm not sure where you're getting the idea that JS/TS frontend frameworks are sliding into their "twilight years".
You can improve website performance by replacing jQuery with vanilla JS. That's a statement of fact, not a "jQuery bashing thoughtpiece".
Several years ago, jQuery was practically a requirement, but now vanilla JS offers many of the same features. There are a huge number of older websites that relied upon jQuery but don't really need it anymore.
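For illustration only, a rough sketch of how a few common jQuery idioms map onto plain DOM APIs in modern browsers; the selectors, class name, and endpoint below are made up for the example:

    // jQuery: $('.banner').hide();
    document.querySelectorAll('.banner')            // hypothetical selector
      .forEach(el => { el.style.display = 'none'; });

    // jQuery: $('#signup').addClass('active');
    document.querySelector('#signup')               // hypothetical element
      ?.classList.add('active');

    // jQuery: $('.banner').on('click', handler);
    const handler = () => console.log('clicked');
    document.querySelectorAll('.banner')
      .forEach(el => el.addEventListener('click', handler));

    // jQuery: $.getJSON('/api/status', data => console.log(data));
    fetch('/api/status')                            // made-up endpoint
      .then(res => res.json())
      .then(data => console.log(data));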
Even as someone who's largely ripped out jQuery for vanilla JS, I still totally understand why people use the API. Vanilla JS is verbose compared to jQuery's shorthand syntax, especially for simple event bindings.
Sure, I think jQuery these days is largely legacy developer ergonomics... I admit there are parts of those ergonomics that I miss. Especially when comparing it to modern tool-heavy transpiled/compiled/whatever JS development which is way more opinionated than using a simple utility library.
But only if you (and any libs you use) plan on adding a single click handler to an element. That's why I always opt for the addEventListener/.on() style rather than setting the onclick/ondblclick/etc. handlers.
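A minimal sketch of the difference, with a hypothetical button element and handlers: assigning to onclick silently replaces whatever another script already attached, while addEventListener stacks handlers.

    const button = document.querySelector('#save'); // hypothetical element

    // Property style: the second assignment overwrites the first,
    // so only 'submit form' ever logs on click.
    button.onclick = () => console.log('validate form');
    button.onclick = () => console.log('submit form');

    // Listener style: both handlers run, in the order they were added.
    button.addEventListener('click', () => console.log('validate form'));
    button.addEventListener('click', () => console.log('submit form'));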
I am also excited about gov.uk tech blog posts. My point: they are always "looking in the mirror," asking "How can we become faster and more accessible?" Making digital gov't websites as accessible as possible is an incredibly important social priority for modern governments in the 21st century.
> Making digital gov't websites as accessible as possible is an incredibly important
Indeed - websites are the most convenient way to access government services.
So it's important that government websites don't rely on technology that isn't universally available. That's why I wish gov.uk sites would stop insisting on a smartphone (or SMS) for 2FA. Phones have these problems:
* They cost a lot.
* It's impossible for the average user to check the 2FA app is secure.
> So it's important that government websites don't rely on technology that isn't universally available.
IMO this should also mean they have offices and phone lines with minimal hold time. Not everyone can use the web fluently, and a website is a cheap replacement for human guidance.
Governments have an obligation to serve all of their citizens, even those on 2G or similarly slow connections. They can't just say "eh, well, we don't want them as customers anyhow".