The impact of removing jQuery on our web performance (insidegovuk.blog.gov.uk)
315 points by kevinak on Aug 16, 2022 | 264 comments



The team at gov.uk is doing an excellent job regarding web performance, credit where credit is due. In many ways role model behavior.

Still, the results are such a stretch as to not have that much meaning. They have to descend all the way to 2G to see any meaningful difference, and I'm assuming they are cold visits (typical in lab-based testing).

For those exceptional users, this creates a difference from very poor (12 seconds) to still poor (9 seconds). Probably less because they'd normally have a warmed up cache.

Is it empathetic to improve performance for those users? Very much yes, do it as far as your budget allows. But as it comes to jQuery specifically, the conclusion is that its negative impact is negligible.


12 seconds down to 9 seconds is not negligible. What is negligible is 12 down to 11.999 seconds because you prematurely optimized.

And even for "normal" users, shaving milliseconds matters. The general idea is that a UI should respond within 100ms to feel instantaneous. More than 1s and you interrupt the flow of thought. According to Google research, increasing page load time from 1s to 3s increases bounce rate by 32%.

Performance matters, a lot. I am not living in the UK, but if I was, I would be happy to pay taxes that make government websites 25% faster.

And even without the performance benefits, I consider removing dependencies a good thing. jQuery was great in the IE6 days, but now that even 10 year old browsers have decent JS, it is not as important as it once was. Modern versions of jQuery don't even support IE6 anymore.


12 seconds down to 9 seconds is not negligible

That is not what the person you were replying to claimed. Their claim was that the overall effect of removing jQuery was negligible, and made a persuasive argument to support it, while also acknowledging that making modest improvements to the experience of a minority of users was an admirable achievement.

What is negligible is 12 down to 11.999 seconds because you prematurely optimized

Who could disagree with that verdict over an imagined scenario? I would even go so far as to say it could be considered negligible whether or not the optimization was premature.

And even for "normal" users, shaving milliseconds matters. The general idea is that a UI should respond within 100ms to feel instantaneous.

9 seconds is pretty fucking far from 100ms. Far enough that I feel like shaving milliseconds off does not in fact matter to the majority of “normal” users.

According to Google research, increasing page load time from 1s to 3s increases bounce rate by 32%.

Fascinating. What does the research say about increasing the load time from 9 to 12 seconds? Those are after all the figures that are relevant to this discussion.

Performance matters, a lot. I am not living in the UK, but if I was, I would be happy to pay taxes that make government websites 25% faster.

I’m sure the UK government is thrilled that you approve of how they spend their tax revenue. But just to reiterate: We’re not talking about a 25% improvement of websites across the board. We’re talking about a 25% improvement of a specific page for a small subset of users.

And even without the performance benefits, I consider removing dependencies a good thing. jQuery was great in the IE6 days, but now that even 10 year old browsers have decent JS, it is not as important as it once was. Modern versions of jQuery don't even support IE6 anymore.

I doubt anyone disagrees with this as a general statement. Certainly it would probably be a mistake to take on jQuery as a dependency today. However, it does not cost nothing to remove jQuery from a code base that has depended on it for a long time. And the benefit has to be weighed against the risk and the cost, not least of which is the opportunity cost: What could the programmers tasked with removing jQuery have done with their time instead?


> Performance matters, a lot.

Developers really don't care. They will claim otherwise with great conviction, but it's all posturing and bullshit. If developers did care they would measure everything and challenge popular assumptions. Instead, most developers want to randomly guess at what works and talk about tools. That's why most pages will never come close to being vaguely fast. 9 seconds is still really slow.

My hamster mobile can achieve 0 to 60mph in under 8 seconds while a Corvette C8 takes about 2.6 seconds. Nobody gives a shit what socket wrench they used to put the tires on.


The folks who pay developers get what they pay for. Management sets priorities, not developers.


I don’t buy that at all. There is a world of difference between poor performance due to low prioritization and lying about it.


“I don’t buy that at all.”

That’s my point.


I think you misunderstood the comment you replied to. Their point wasn't that 12s to 9s is a negligible improvement generally. The point was that the improvement only applied to a tiny fraction of their users and it was still a bad experience, precisely because it was so far from 1s.


Has anyone been able to reproduce this "Google Research" that is so often quoted? They have numbers related to conversion too. I know I'm sounding doubtful but bringing response time down a second or so isn't a magic potion for more money pouring in the door. Not that anyone should ignore performance. It's irresponsible to ignore perf. I just know from experience, when buying something on Black Friday at Amazon, if the page doesn't respond in 2 seconds... you probably are going to wait a bit longer.


Google's results consistently don't hold up for high-intent visitors, yeah. A .gov site isn't (overly) concerned about bounce rate - people have no other location to bounce to, they will consistently wait out latency to do what they need to do.

It does still help of course, just at tiny fractions of the impact that Google saw.


IBM researched this in the late ’70s and reached broadly the same conclusion. They were talking about “productivity” instead of “bounce rate” back then, but the basic biology of attention span and working memory is the same.

He and Richard P. Kelisky, Director of Computing Systems for IBM's Research Division, wrote about their observations in 1979, "...each second of system response degradation leads to a similar degradation added to the user's time for the following [command]. This phenomenon seems to be related to an individual's attention span. The traditional model of a person thinking after each system response appears to be inaccurate. Instead, people seem to have a sequence of actions in mind, contained in a short-term mental memory buffer. Increases in SRT [system response time] seem to disrupt the thought processes, and this may result in having to rethink the sequence of actions to be continued."

“The Economic Value of Rapid Response Time”

https://jlelliotton.blogspot.com/p/the-economic-value-of-rap...


The Google Research that is often quoted might have been well intended, but is grossly overstated and misused.

The thing with bounce rate is that you don't know WHY the user bounced, nor do you know their intent. Typically analytics aren't even loaded yet. Sure enough, I believe there to be a correlation as you dramatically increase page response time, but you still cannot attribute a high bounce rate to performance alone, which the often quoted line does imply.

Case in point, most of Google's properties don't even come close to their own performance guidelines.


At the time that those metrics were being thrown around, Google was knee-deep in pushing AMP. They visited us at Overstock several times to cajole us into using it with cries of, "will you please consider your users in less fortunate countries on less than stellar networks". That's fair I guess. We reduced our bundle size to 185k. They still tried to convince us to use AMP. We travelled out to Sunnyvale for a hackathon at the Googleplex, they were pushing AMP all over the place. AirBnB were the only folks willing to stand up and flat out say, "because AMP is terrible". (side story - I realize)


I need no convincing that performance matters, I'm a performance engineer by profession. You're kind of contradicting yourself. The numbers you use (100ms, 1s, 3s, etc) are well researched performance perceptions linked to our biology, in a way they are timeless guidelines.

That scale of attention span places both 9s and 12s in the exact same box: you've completely lost your user's train of thought.

Is 9s better than 12s? Obviously, yes. But that wasn't the point. My point was that there is such a thing as diminishing returns. If I were to follow your train of thought (performance absolutism), we should next optimize for WAP, if you're old enough to know what it is.


Are you able to share what you referenced as google research? I would love to dig into that.


> They have to descend all the way to 2G to see any meaningful difference

Somewhat shockingly, according to https://www.cambridgewireless.co.uk/news/cw-journal/why-2g-w... 12% of mobile network connections in the UK are still on 2G. That's surprisingly high usage, and would definitely be worth optimizing for. Usage that high would also explain why the 2G shutdown is still 10 years(!) away for the UK.

Phones still on 2G also seem likely to be phones that don't have the best of web browsers, so cold visits may still be more representative than it seems like they should be.


I wouldn’t read too much into 2G figures meaning 2G-only phones (or 3G). In my experience once you get out of populated areas connections often drop to 2G, even on my 5G capable iPhone. It’s mostly a coverage issue not a handset issue.


That's somewhat surprising. The US, which has plenty of rural areas with cell coverage, phased out 2G basically entirely over the past few years (only T-Mobile has a 2G network at all in the US now, and it's due to be sunset at the end of the year). Heck, the US's 3G networks are all shut down already too, except for Verizon's, which is due to be turned off at the end of the year. Canada, also no stranger to rural areas, has completed its 2G shutdown as well. What's the struggle with the UK getting 4G/LTE coverage out to those areas? Just no investment?


> The US, which has plenty of rural areas with cell coverage, phased out 2G basically entirely over the past few years (only T-Mobile has a 2G network at all in the US now,

There are still big gaps. My parents live in the middle of nowhere. Their service isn't classified as 2G, but the 5G service is so spotty and the signal is so weak it might as well be. The US has a documented history of producing data to avoid serving these areas, same thing with internet.


It varies wildly depending on which network you are on. For example, on EE I get 4G coverage almost everywhere - even across remote parts of the Scottish Highlands and Islands. OTOH, my missus is with Three, and she rarely gets 4G at all outside of towns and cities.


Yes good point. Also the phone signal can be deceptive if the backhaul is terrible (which I found with EE quite often).


Maybe it's home alarms


Exactly. It was cute for a few years, but these jQuery bashing thoughtpieces are so worn out now. As React and the long tail of frameworks slide into their twilight years, it's clear in retrospect that jQuery was just a much more responsible and sound architecture (mostly because it cleaved to standards) than any of the things that spawned the 1000 "why I'm leaving jQuery for _this_" blogposts every day.


I think you are conflating 2 different things: jQuery -> vanilla JS and jQuery -> framework (React/Vue/Angular/etc). Though I'm not sure where you're getting the idea that JS/TS frontend frameworks are sliding into their "twilight years".


You can improve website performance by replacing jQuery with vanilla JS. That's a statement of fact, not a "jQuery bashing thoughtpiece".

Several years ago, jQuery was practically a requirement, but now vanilla JS offers many of the same features. There's a huge amount of older websites that relied upon jQuery but don't really need it anymore.


Even as someone who's largely ripped out jQuery for vanilla JS I still totally understand why people use the API. Vanilla JS is terse comparative to jQuery short-hand syntax, especially for simple event bindings.

Sure, I think jQuery these days is largely legacy developer ergonomics... I admit there are parts of those ergonomics that I miss. Especially when comparing it to modern tool-heavy transpiled/compiled/whatever JS development which is way more opinionated than using a simple utility library.


> Vanilla JS is terse comparative to jQuery short-hand syntax, especially for simple event bindings.

You mean "verbose" here, probably?


Ah yea you're right - I misspoke. Vanilla JS feels verbose, and jQuery feels more terse in regards to general syntax.

---

document.querySelector('#sel').addEventListener('click', (e) => {})

Feels way more verbose than the following:

$('#sel').on('click', (e) => {})

My bad - apparently the coffee hadn't sunk in yet ha.


It's even more verbose in a lot of common cases, actually. (Although the ? operator has made things better).

---

1. document.querySelectorAll(".sel").forEach((e) => e.addEventListener("click", (e) => {}))

$('.sel').on('click', (e) => {})

2. document.querySelector(".sel")?.addEventListener("click", (e) => {})

$('.sel').on('click', (e) => {})

3. const e = document.createElement("div")
   e.className = "cls"
   e.setAttribute("title", "Title")
   e.textContent = "<Content>"

const e = $('<div>').addClass("cls").attr("title", "Title").text("<Content>")


OT note: sel.onclick=e=>{} works just fine. https://jsfiddle.net/39h5ynga/


But only if you (and any libs you use) only plan on adding 1 click handler to an element. That's why I always opt for the addEventListener/on style vs setting the onclick/ondblclick/etc handlers.
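A quick sketch of the difference (assuming an element with id "btn" exists on the page):

    const btn = document.querySelector('#btn');

    // Assigning onclick twice: the second handler silently replaces the first
    btn.onclick = () => console.log('handler A');
    btn.onclick = () => console.log('handler B'); // only B ever runs

    // addEventListener: both handlers run, so independent scripts can't clobber each other
    btn.addEventListener('click', () => console.log('handler C'));
    btn.addEventListener('click', () => console.log('handler D'));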


true but I would hope the libs use the event. (one onclick is still allowed in combination)

This approach seems the funniest: https://jsfiddle.net/eu60Lwv9/


I am also excited about gov.uk tech blog posts. My point: They are always "looking in the mirror": "How can we become faster and more accessible?" Making digital gov't websites as accessible as possible is an incredibly important social priority for modern governments in the 21st century.


> Making digital gov't websites as accessible as possible is an incredibly important

Indeed - websites are the most convenient way to access government services.

So it's important that government websites don't rely on technology that isn't universally available. So I wish gov.uk sites would stop insisting on a smartphone (or SMS) for 2FA. Phones have these problems:

* They cost a lot.

* It's impossible for the average user to check the 2FA app is secure.

* They run on batteries that go flat.

* They are not owned by their owners.

Please, always offer email as a 2FA choice.


> So it's important that government websites don't rely on technology that isn't universally available.

IMO this should also mean they have offices and phone lines with minimal hold time. Not everyone can use the web fluently, and a website is a cheap replacement for human guidance.


Governments have an obligation to serve all of their citizens, even those on 2G or similarly slow connections. They can't just say "eh, well, we don't want them as customers anyhow".


Where did I say otherwise?


Has anyone noticed how all UK government websites like https://www.gov.uk/, https://www.nhs.uk/, https://tfl.gov.uk/, https://coronavirus.data.gov.uk/ have the same look-and-feel and the same UX? Are there more such examples of countries that have uniform UX for their government websites?


Yes, this is an example of government design systems, which are becoming popular among developed countries:

- The UK: https://design-system.service.gov.uk/

- The US: https://designsystem.digital.gov/

- Canada: https://www.canada.ca/en/government/about/design-system.html

- Argentina: https://argob.github.io/poncho/

- Italy: https://designers.italia.it/

- Singapore: https://www.designsystem.tech.gov.sg/

- Estonia: https://brand.estonia.ee/


Australia: https://designsystemau.org/

It was the official design system until it got defunded; now I believe it's maintained as an open source project.


Beautifully simple and incredibly fast to load. Well done Australia (hope the idiots who defunded it got kicked out of power after the last election and this will be fixed).


The New South Wales state design system is still fairly consistently used: https://www.digital.nsw.gov.au/delivery/digital-service-tool...

It's open source on GitHub: https://digitalnsw.github.io/nsw-design-system/


Wow, that might be the fastest website I've seen.



Will have to inform the French national services about it ("Ameli", "Impôts", "URSSAF"), apparently they are not aware of it!


It's one of the objectives in the long run but it does require a lot of work to get everyone on the same page.

I am currently working on implementing the "Système de Design de l'État" for a French government service but unfortunately it is still pretty chaotic right now, since there are many modifications of components with breaking changes / no backwards compatibility with almost every release.


Don't worry, I know it's not that easy, and good for you for sticking to the design system rules! My sarcasm was mostly directed at the chaos introduced in state services by giant consulting firms doing their own thing quick and dirty...


Yeah, in theory the French government offers a ton of super-useful APIs to its services (to automatically fetch paperwork for a procedure, for instance) but the big services use almost none of them.



OMG. This is HN! Can we please get a showcase of the best e-gov't website globally? Right away, when I saw that Italian gov't has a web design studio, I was curious! Do they have some beautiful, but highly functional websites that any Italians can share? (Apologies for positive stereotyping!)



I'm afraid there won't be anything like that in Germany any time soon.


Wait, the Germans, of all people, don’t have a standard government design system? Next you’re going to say the Swiss also don’t have one, and I’ll have to throw out my entire set of national stereotypes!


Of course we do.

https://styleguide.bundesregierung.de/sg-de/

Edit: under “Hilfsmittel” there‘s a “Web Component Library”, hidden behind a login.


We Germans probably have a standard design system for the documents we send by Fax...


Or print out hundreds of pages of documents nobody reads and awkwardly wait while someone painstakingly scans each one to “have it digital”.


1. Our population is quite old, thus many people in key positions are quite old and missed the transition to the digital world.

2. So. Much. Bureaucracy.

3. Another factor that doesn't help is that the government pays quite poorly compared to private companies. Any semi-talented dev will find a better position than working for the government.


The bureaucracy can be a good thing, because it also makes it harder for the government to get information on its citizens, and I think this is deliberate after WW2. Making it hard to exchange information is annoying (I've heard fax machines were still in use during the corona pandemic) but serves as an additional layer of protection from hostile governments. Time will tell if digitizing everything in for instance Denmark (my country) was a good idea.

Number 3 is the same in Denmark. You can almost double your salary if you work for the private sector.


As someone from Germany I'm sometimes surprised what stereotypes exist outside Germany about it.

I can assure you standardizing things is not something Germany is good at.


Public sector and the old people are not good at modern solutions, but German officials and companies in general are still very driven in organizing and standardizing the soul out of everything.


The beautiful thing about standards is that there are so many of them.


> Next you’re going to say the Swiss also don’t have one

Well... at a federal level there is, but it's a confederacy so most of the time you're interacting with a local canton which has its own set of systems.


It's not entirely surprising. Germany is a federal republic, somewhat similar to the US, with some degree of autonomy for its member states. Switzerland is a confederacy.


This needs correcting. Germany and Switzerland are both federal states, they occupy the same spot on the notional spectrum. <https://upload.wikimedia.org/wikipedia/commons/d/da/The_path...> The comment would have a point if either state was unitary, but this is not the case.

Switzerland was a confederation prior to 1848, the now inaccurate name stuck for I assume sentimental reasons. The issue of republics is orthogonal to the topic under discussion.


My point is that they're both not the most centralised form of government. Even if Switzerland is officially a federation instead of the confederacy its name suggests, the point still stands.


It actually exists in the form of this CMS: https://produkt.gsb.bund.de/DE/Home/home_node.html

However, I believe it's mostly the federal ministries that use it.


It exists but I think it's undeniable that Germany is behind in many digital aspects, even in advertising. I think part of that is cultural (credit card acceptance and advertising being good examples) and another part existing bureaucracy.




Is this a parody? Knockoff? Why is a Russian government site on an English domain name under some random tld?


Ironic that it's http


In Soviet Russia, specification website learns about you.



New Zealand is launching one soon that is heavily based on the UK system. https://design-system-alpha.digital.govt.nz/



Of all the examples in this thread, this is the only one that isn't a plain HTML page, but rather a pile of javascript that takes several seconds to load and shows nothing but an "enable javascript" banner without JS. Despicable, but also absolutely typical.

Edit: alright, CZ, UA and RU further down the thread are the same, I just hadn't gotten to those yet. Still, absolutely unacceptable for a government design standard.


This isn't even a government design standard (or government website for that matter). If you look into the imprint you will find that it is run by a Verein (association) that promotes tourism. To quote:

> Promotion of Austria as a holiday destination.

There is a corporate design guide available [0] but from what I can tell it's more of a guide for "everything" not just websites.

[0]: https://www.bundeskanzleramt.gv.at/service/corporate-design....



It's nice, but it also gets boring and restrictive quite quickly in the governmental organisations I've worked for.


> Boring

I feel like that's a great quality. As for restrictive, that's a valid concern.


Wait till you work with the team, and they decide to make all highlighted colors look like an extreme warning sign because of "accessibility", plus 10 other dramatic choices, and they prefer to be dominant debaters ("I decide and you listen to me because you are not part of the team") instead of giving any reason.


If only every government website I visited was boring and predictable.


Perhaps. But user expectations are otherwise. Especially when lack of guardrails too often leads to wacky and wonky experiences.


<strike>wacky and wonky experiences</strike> inaccessible FTFY


Needs more <marquee>


They all use this design system: https://design-system.service.gov.uk. It's much more than just a CSS framework, it's full specifications on how to implement easy to follow and sensible UX.

I wish there was more buy in by local councils of this system, there are some incredibly poor local government sites.


Every local council has a different payment system for collecting council tax, and all of them fail in novel and unpredictable ways.

Last time they collected my name and address over the phone and got both of them wrong.

Then they were sending letters to a person that doesn't exist, at an address that doesn't exist, never used my email or phone number, didn't respond to my email, but still told me it's my fault!


I had the fun experience of my local (UK, major city) council sending debt collectors after me for non-payment of council tax while simultaneously (literally same week) asking what bank account to send the several thousand pounds they owed me for overpayment - because they failed to acknowledge that despite having moved address, I was still the same person, and their systems treated each address as an account not each person. They still told me it was my fault.


This is the difference between “council tax”, a tax on property, and “poll tax”, a tax on individual people, and that went quite badly when it was tried... So it makes sense for the primary accounts to be per property. However you are quite right they should have a better understanding of a second layer of accounts being individual people.

The IT infrastructure around tax/HMRC is ridiculously complex. The “making tax digital” initiative is trying to fix that. The aim for business taxes, for example, is that each individual business will have a single tax account eventually, rather than separate once for each tax type as they currently do.


In the world of councils, an address is an account.

They just need to make sure that every address is paid for every day of the year - that's the way they maximize their revenue. They don't really care who pays, as long as someone does. Hence the account system keyed on address.


You pay council tax on your address, not on an individual basis. So if you moved house and didn't tell them, they quite reasonably would have counted the payments towards your old address, not your new one.


I did tell them, the person I told said I could keep my automatic payments going, which I did for a year before they said I hadn't paid a year on the new address and had overpaid a year on the old address.

They then denied knowing that I had moved address, but didn't have an answer to "then why have you sent me a letter (to my new address) saying the payments for the old address should be refunded if you think I still live there?" (they literally denied knowing I had moved after sending the letters).

I appreciate that involved human incompetence not just a poor IT system, but a better system would have made the updating of me as a person paying council tax from one address to another seamless, then it would have flagged the problem sooner than a year in, and then it could have crossed my "debt" with my "overpayment" rather than passing it (in what I assume was another human mistake, but again allowed by the system) to a debt collection agency.


> I wish there was more buy in by local councils of this system, there are some incredibly poor local government sites.

You may be interested in Local Gov Drupal. Unfortunately they've defined themselves around a specific tool, rather than a design library, but maybe that was what it took to get them to come together. Seven councils are using the solution, and around 30 councils are involved in the initiative, but there is plenty of scope for the 300+ others to leverage the work and join the initiative.

https://localgovdrupal.org/


> You may be interested in Local Gov Drupal

Not much!

/me former Drupal developer. Drupal makes super-heavyweight pages, and is a real pain to learn. There is no backwards compatibility between major releases; to upgrade, you have to rewrite your site.

I burned 8 years working on Drupal sites. I made a living, but I ended up hating Drupal with a passion.


Yikes. Drupal is complex enough to be a government in its own right. Stay far far far far FAR away from it and any company or agency that demands its use. It's an overengineered nightmare of opaque idiosyncrasies.


I imagine a buy-in by local councils is just that - a cost. And most councils are either poorly funded, poorly managed financially or both!


I can only speak for my council, which complains it has no money but has hundreds of millions in slush fund "for a rainy day" and keeps building new council offices or rental spaces that aren't even being used or getting the rent. There's a huge one they built that needs like 30k+ a month in rent, no takers for 18+ months now (as you'd expect in such a crappy area and a certain pandemic).

They also have terrible web based services, akin to Frontpage '97 on a good day.


I'm not so sure, I think there is a real NIH syndrome in local government IT services, they quite often want to build something themselves - but then as you said don't have the budget to do it properly. Using the design system, and sharing platforms using it, would be a significant cost saving.

Plus the (at times) antagonistic relationship between Whitehall and local government I think means they don't want to necessarily use something they associate with the incumbent administration.


Not really.

Local gov doesn't have anything like the budget of central so they don't typically have the choice to build their own. Instead they're left with cobbling together behaviours from tens, maybe even hundreds, of disparate third-party systems, each independently solving the full stack of a given department's function. E.g. there's a dedicated system for planning approvals, another for administering blue badges, etc., etc.

These third-party systems are typically very cheaply made, with a very low maintenance budget, and don't provide anything like enough customisation to adhere to someone else's design system. If you're lucky you can add your own header/footer and set some main css colours to match your council's brand.


> They all use this design system

Does that system require absurd cookie banners? Gov.uk should be leading the charge against ridiculous cookie banners.


Coming from Denmark I'm impressed that they all use a uk domain, and they even have a system in place to distinguish official government websites from others.

In Denmark we are like: Need a domain for our common ID and login system? How about nemid.nu, where .nu is the TLD of Niue, but "nu" means "now" in Danish, so lol, let's use that. On top of like 7 other domains that users are randomly redirected between during use.


The UK. domain is one of the few that used to reproduce and enforce (almost) the gTLD hierarchy under itself, thus CO.UK., ORG.UK., GOV.UK., etc. (They gave up several years ago I think.)

(Unlike with almost every other ccTLD, UK is also not the ISO code for the country—GB is—and is instead “reserved”, which is why I must struggle to remember each time that the BCP 47 for Ukrainian-in-Ukraine is uk-UA.)


.au used to be third-level only too (or fourth-level in places, e.g. *.vic.edu.au for the state of Victoria), with careful rules about each, including things like requiring that the name match your business/whatever, trademark, or something your business sells or does (which eliminates the vast majority of squatting in theory without the additional “no squatting” rule, and in practice there wasn’t much, and also ensures surprisingly high quality of domain name meaningfulness and relevance), but this year they started doing second-level registrations with almost no rules, just about a free-for-all, after years of “public consultation” concluding with carefully-worded reasoning that I will uncharitably simplify to “why not?”, “we’ll make more money that way” and “everyone else is doing it”.

The sad fact of the matter is that structured and controlled ccTLDs where you knew what you were dealing with from the name alone are increasingly being supplanted by second-level domains.


A bit of internet trivia is the oz.au domain. In the late 1980s, the Australian universities were connected in a network with names ending .oz.

When the TLD .au arrived, .oz was renamed .oz.au. So my email address at Elec Engineering at Melbourne University was <name>@ee.mu.oz.au.

And then a few years later, the third-level structure got formalised. Melbourne Uni, for instance, switched from mu.oz.au to unimelb.edu.au.

I for one missed the typographic economy of the ee.mu.oz.au domain.

None of this really matters, of course, except that a sensibly designed structure with a clear underlying rationale, and historical context for a small number of exceptions, makes everything easier to understand :)


Yeah, those old *.mu.oz.au subdomains seem to have steadily disappeared. When my eldest brother was there somewhere around 2005–2010 Computer Science were still actively using cs.mu.oz.au (I think all their email and web stuff was still there), but even that seems to have lost its A records now, though it’s still got a third, interesting NS record (compared to ee.mu.oz.au which has only two boring NS records, edns-*.unimelb.net.au), mulga.cs.mu.oz.au, which is still talking A.

Another bit of historical trivia that now becomes mundane: csiro.au.


Of course, NemID isn't actually a Government website, despite being used by the Government. Similar situation as Sweden's BankID.


And then when users fall for phishing: "users are so stupid and easy to trick… it's impossible to make things safe when people are so gullible"


They're built using the gov.uk design system[0], which is not just a CSS framework! It also has a lot of guidelines and requirements to make sure that the pages you're building are accessible to everyone

https://design-system.service.gov.uk/


It's actually a pretty cool thing to browse through - they have some good advice on how to make things accessible and not confuse users. Each pattern tends to have supporting research as to why things should be done like that.


The fun part is the design system backlog[0]. There is an absolute wealth of information in there, along with plenty of opinions for designers in various departments, feedback from rounds of user research, alternative approaches to common problems. I spend a lot of time in there.

This thread on notification banners is an excellent example[1].

I think if this was started today they'd fit into GitHub's discussions better than issues, as that's what they're used for. Eventually, with enough support, these things might make it into the design system.

[0] https://github.com/alphagov/govuk-design-system-backlog/issu...

[1] https://github.com/alphagov/govuk-design-system-backlog/issu...


I think the UK Gov digital service is pretty well regarded for their clear and accessible UI/UX, not to mention their technical writing. They've got a decent amount of open APIs/data too.

Bonus: The ONS has some pretty smashing data visuals https://www.ons.gov.uk/peoplepopulationandcommunity/healthan...


This is the design system that the government(1) of The Netherlands uses: https://www.nldesignsystem.nl

(1) https://www.government.nl


I'm deeply confused that pretty much all Dutch government websites look alike, yet they look in no way like the "nldesignsystem" site


I'm not involved, but I think that is because the nldesignsystem project is fairly recent.


I wish more governments would do this to be honest - instead of having hundreds and hundreds of different agencies/localities all hiring different web designers all expressing their artistic preferences of how a website should work.

I applaud the UK for this - the US and its various federal agencies/states/cities/towns should consider the same, not only as a cost saving measure, but making the experience predictable across many different sites.


Slightly off-topic. Ok, off-topic.

I find it exceptionally galling that the so-called Inflation Reduction Act sends $80 billion to the IRS, of which $15 million is earmarked “to fund a task force that would study the cost and feasibility of creating a free direct e-file program.” Spoiler alert fellas - it’s feasible, and it should have been done 20 years ago.

https://crsreports.congress.gov/product/pdf/IN/IN11977


I'd love to know what the potential reasons are for it being infeasible. Intuit doesn't want us to?


I feel like the quickest, easiest, and most politically palatable approach would be for the government to just buy Intuit.

The market cap is $130B, three quarters of which could probably be recovered by reselling the half of the business that deals with foreign/international accounting and large business accounting.

Just keep the stuff for individuals and companies with under 250 employees in the USA. Then make all that software available for free.

Over time, integrate that software deeper into IRS processes - for example, the software could autofill fields with stuff the IRS already knows. The web based stuff could go onto a .gov website and become the official way to file accounts and taxes.



Edit: TLDR Nice clean graphics, shit UX for the actual thing you are trying to do, here’s an example:

Nice that the Universal credit page is 17 seconds faster, but has anyone else noticed how they use dark patterns to screw you out of money you are due by making it really hard to submit accurate year end accounts if you have submitted an estimate at Tax credits renewal time because you didn’t have your year end accounts yet. The only way is to phone and go on hold for 2-4 hours or to write a letter to an address that is not listed anywhere on the website. I am having to pay back £75 / month of tax credits that I was legally due because I didn’t manage to submit this information to the correct place by the deadline. What is even more annoying is they have all this information as tax returns are submitted to the same government department and have all the information they need. The two departments even share the same bank account number for tax payements and tax credit repayments and use the same letterhead. They used to sync this information automatically but they changed the system at some point around 10-12 years ago. I can’t imagine why they have done this apart from it is designed just to screw money out of small business owners and sole traders that don’t make much money e.g. cleaners, hairdressers, window cleaners, dog walkers etc. It’s not setting a very good example if you want people to be honest and declare all their cash income!


Related and maybe a little OT, but why does Transport for London have to include a big Google Ads banner at the top of the page?


TFL is a "Statutory Corporation" which is funded both by grants from local government and commercial income, the majority from rail+bus fares but also from activities such as advertising. There is a significant quantity of advertising placements provided by TFL, in the stations, on buses, bus stops, and as it seems on their website.

TFL was almost bankrupted by Covid, I for one will give them a pass on the ugly advertising on their site.

However, looking at [0] and [1], advertising only covers about 1% of their budget. That banner ad will be a tiny fraction of that.

0: https://tfl.gov.uk/corporate/about-tfl/how-we-work/how-we-ar...

1: https://www.statista.com/statistics/898278/transport-for-lon...


Absolutely crazy that public transport even has the possibility of being "bankrupted".


I think it's mostly a political move.

Things like "we have so little money we had to replace half our homepage with ads" is a good sob story, and will help them get billions in government funding, even though the ads probably only bring in a few thousand pounds.


A lot of things are starting to look the same, owing to the availability of ready-to-use templates by Mantine, antdesign, Material UI, etc.

For small projects, you almost don't need frontend chops anymore to deliver nifty looking goods.

I was kind of stunned to learn that even Apple uses MUI.


It does have a big downside - some departments like Highways England and Land Registry had their own, more sophisticated sites.

The new standardised GOV.UK sites are missing features and content.

There's a sense of dumbing down to fit the design standards.


Yes, the GDS has a design system: https://design-system.service.gov.uk/


Honestly, I used to think they were the same website but they all have different logins. Super cool and uniform UI/UX.


Greece https://gov.gr explicitly mimics the UK style.

On that note, I expect in the coming years some UK startup will be selling "government as a service".


That's nice but when you load the actual universal credit login page it needs nearly 400kb of reactjs - not sure about after that, don't have an account..

Some (most?) of the other gov uk sites have megabytes of js on them.. eg. https://coronavirus.data.gov.uk/details/testing?areaType=nat...


Yea, I was shocked they talk about stripping jQuery as if they have these super fast, snappy pages and then ... it's React. WHAT? React is a lot of things to a lot of people, but fast loading (on an underpowered device with a bad connection, which are the conditions they're talking about) isn't one of them. When I imagine a grandma trying to log into a gov site and loading a meg of React-related garbage which then runs like ass anyways, I want to punch a Zucc in his cold blooded alien dick.


Honestly, if they'd just used jQuery instead of React all the way through they'd be loading a lot less JS and performance would be vastly better on those slow old devices. After all, jQuery was made for the slow old devices, back when they were not slow or old.

The real problem is that jQuery is not new and shiny, and nobody wants to be doing jQuery in 2022.


In addition to which, jQuery was actually useful - React is just pure overhead with no real functionality.


I don’t understand how you can claim that React is pure overhead. Sure it does have overhead, but it’s also a really useful abstraction. How is a composable abstraction for UI components that enables local reasoning not functionality?

jQuery is just pure overhead with no real functionality compared to vanilla js.


React is vastly overused. You don't need every single element in your page to be a separate component when 95% of your site is pure text and you have an occasional form or a button here and there.

If you're making an actual application (think Discord/Spotify) then yes, it becomes a necessary abstraction. Your personal website or news website (or many others) using React is a waste of my bandwidth and CPU cycles.


Agreed it doesn’t make sense for a lot of things that could just be static sites. But you also wouldn’t need jQuery on such sites so I’m not sure why that matters.


Don’t forget about server side rendering. If you only need to serve static content, you don’t really need any JS, anyway.


React's core functionality is to manage the relationship between state and the dom.

jQuery lets you imperatively manipulate the dom.

They're different use cases; calling one useful and the other not isn't a very accurate representation.
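To illustrate the distinction, a minimal sketch (assuming both libraries are loaded; the element ids are hypothetical and the React rendering setup is omitted):

    // jQuery, imperative: you update the DOM yourself whenever state changes
    let count = 0;
    $('#counter-btn').on('click', () => {
      count += 1;
      $('#counter-label').text(`Clicked ${count} times`);
    });

    // React, declarative: you describe the UI for a given state,
    // and React updates the DOM when that state changes
    function Counter() {
      const [count, setCount] = React.useState(0);
      return React.createElement(
        'button',
        { onClick: () => setCount(count + 1) },
        `Clicked ${count} times`
      );
    }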


No, React’s strength is to componentize your webapp, allowing several teams to work on objects that can be reused by each other.


Sure, it does that too, but that is not its runtime purpose.

Many, many approaches solve organizing your code; React is in the middle of a large pack here.


> Now I know what you may be thinking, that doesn’t sound like a lot of data, especially compared to images which can be multiple megabytes in size. But when it’s on every page, from a web performance perspective, it equates to a lot of data.

Er... the utility of jQuery in 2022 aside, I would say: if it equates to a lot of data when it's on every page (of a single website), you're doing something wrong?! I mean, caching content between different domains is not a thing anymore, but on the same domain jQuery should only be loaded once?


Yeah but for what?

I have been writing vanilla js for years now and aside from the sometimes slightly more convenient syntax I haven't missed a thing where using jQuery would have helped me over not using it.

I don't say anything against loading useful javascript libraries, but too many websites load 1 MB of libraries just to get a result that they could have achieved with a few lines of handcrafted vanilla js.

This is not just about performance, but every piece of code you add increases the attack surface and the number of moving parts that can fail. Not that every webpage has to be minimalist and barebones, but there are tradeoffs to be made and too many people just import the whole world, their neighbour and their dog without a second thought.


jQuery is more convenient to use than vanilla JS. Vanilla JS is too verbose. If only jQuery was modular and didn't require to load 300 or 400 Kb of code.

Also it is difficult to find a non-jQuery library of good quality. For example, I was looking for a small library to send and receive JSON via HTTP, and ended up writing my own wrapper around XMLHttpRequest (fetch() is a poor choice, it is not supported well in older browsers and it has no advantages, and fetch() requests cannot be aborted) because there was no good library for this. Other libraries either do not support all necessary browsers or are overengineered and too large.
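Not the actual wrapper mentioned above, but a minimal sketch of what such an XMLHttpRequest JSON helper might look like (the names are made up):

    // Sends an optional JSON body, parses a JSON response, supports abort via the returned xhr
    // (ES5 syntax on purpose, since the whole point is old-browser support)
    function requestJSON(method, url, body, onSuccess, onError) {
      var xhr = new XMLHttpRequest();
      xhr.open(method, url);
      xhr.setRequestHeader('Content-Type', 'application/json');
      xhr.onload = function () {
        if (xhr.status >= 200 && xhr.status < 300) {
          try { onSuccess(JSON.parse(xhr.responseText)); } catch (e) { onError(e); }
        } else {
          onError(new Error('HTTP ' + xhr.status));
        }
      };
      xhr.onerror = function () { onError(new Error('Network error')); };
      xhr.send(body == null ? null : JSON.stringify(body));
      return xhr; // caller can call xhr.abort()
    }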


>If only jQuery was modular and didn't require to load 300 or 400 Kb of code.

JQuery's own site says the library is 30 to 40 Kb, not 300 or 400. The minified JS file I downloaded of the latest version is 87 Kb, so I assume the rest would be taken care of with zip headers.

Are people routinely serving the uncompressed version of the library?


It seems that you are right, the 300Kb number is for the uncompressed version. Still, it is very large and includes a lot of code I don't need.


Agree. Plus one can use a cdn to increase the chances of the library being already downloaded.


Won’t work

Browser caches are partitioned by origin, so if a.com and b.com both use jQuery from c.com then two copies of jQuery will be cached: one with a key of a.com/c.com and another with b.com/c.com.


Have browsers always done this?

I thought and heard that using a JS library from CDN is beneficial when users already have that same cached JS library from visiting other sites. But you're saying that's not a thing? So what's the benefit of using a CDN for jquery then?


No, they have not always done this. But the supposed caching benefits of CDNs have always been overstated.


Browsers have started doing this because it's a security issue. If some libs load fast, the app can fingerprint or guess which website you were on before.


Safari has done it for a long time, Chrome and the others more recently.

Even back in 2012 people were raising the question of whether shared caching of libraries from the public JS CDNs was effective — due to too many different versions in use, and content being evicted from the browser cache sooner than most expected


Ok thanks for explaining. I wasn't aware of the changes. I don't mind at all, it just means I'll avoid using CDN hosted JS libraries from now on.


fetch() requests cannot be aborted

Does {signal} not work yet? https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API#a...


fetch() can definitely be aborted, and what "older browsers" don't support fetch? You must be talking maybe about Internet Explorer 11, the only older browser that is still somewhat used (in Japan and public/old infra) that doesn't support it?

https://caniuse.com/fetch


I work on a site that gets a small percentage (less than 0.1%) views from iOS 10 devices, which doesn’t support fetch. This still amounts to a few dozen views.


Since the gov.uk page is talking about 2G users, I'd imagine that "old browsers" here means "whatever Nokia shipped over a decade ago"


2G users can include “users on the latest iPhone or Samsung who are in Scotland”. Try https://www.o2.co.uk/coveragechecker and look at Kinlochewe for an example.


From that page, it appears that older versions of Android Chrome don't support fetch at all.

I've seen stats that indicate that 5% of the U.S. population (to say nothing of other parts of the world) are still on Android Oreo (version 8), which wouldn't surprise me since older phones are basically locked in time.


Unlike on iOS, I don't think the browser version is tied to the operating system anymore? Besides the AOSP browser that no one uses


No, you're correct, but you're missing the point; what version of Chrome is on Android 8? Because it's definitely not Chrome 104, which apparently is the earliest version that supports Fetch, according to CanIUse.


I'm with you on fetch() requests not being able to be aborted, but I didn't think there were any other reasons to use XMLHttpRequest over fetch in 2022. I've been fetch() only for a few years at this point - what older browsers are you targeting that don't support fetch()?


fetch doesn't support upload progress

https://javascript.info/fetch-progress

> Please note: there’s currently no way for fetch to track upload progress. For that purpose, please use XMLHttpRequest, we’ll cover it later.

coming soon (2020) https://ilikekillnerds.com/2020/09/file-upload-progress-with...
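For reference, a minimal sketch of upload progress with XMLHttpRequest ('/upload' is a placeholder endpoint):

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/upload');
    // fetch() has no equivalent of xhr.upload, which is what reports bytes sent
    xhr.upload.addEventListener('progress', function (event) {
      if (event.lengthComputable) {
        console.log('Uploaded ' + Math.round((event.loaded / event.total) * 100) + '%');
      }
    });
    xhr.addEventListener('load', function () { console.log('Done:', xhr.status); });
    xhr.send(new Blob(['example payload'])); // any Blob/FormData/string body works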


Support for Internet Explorer and older browsers is important too. You must support browsers released within the last 10 years, and fetch is too new and experimental for this. I don't want to use something just because it is new and has no clear advantages over older technology.


Microsoft dropped support for IE in June 2022. https://caniuse.com/fetch shows that fetch() is now at 96.99% support for global users.

If you still need to support old IE you can use a library like https://github.com/github/fetch


And even then, there are probably fetch() implementations around xmlhttprequest


> If only jQuery was modular and didn't require to load 300 or 400 Kb of code.

It is? You can use grunt to make your own version, excluding the parts you don't need.


This is outdated, all modern browsers support fetch cancellation, look up `init.signal`. it wasn't available from the start, but at this point it's supported everywhere.
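A minimal sketch of what that looks like ('/api/data' is a placeholder URL):

    const controller = new AbortController();

    fetch('/api/data', { signal: controller.signal })
      .then((res) => res.json())
      .then((data) => console.log(data))
      .catch((err) => {
        if (err.name === 'AbortError') {
          console.log('request was aborted');
        } else {
          throw err;
        }
      });

    // abort if it takes longer than 5 seconds (or on navigation, user action, etc.)
    setTimeout(() => controller.abort(), 5000);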


"Fetch cancellation" is where the browser tells itself "Stop trying to make 'fetch' happen," right?


> Browsers have started to add experimental support for the AbortController and AbortSignal interfaces (aka The Abort API), which allow operations like Fetch and XHR to be aborted if they have not already completed. See the interface pages for more details.

https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API#a...


That's why I wrote "the utility of jQuery in 2022 aside" at the beginning. Of course, if you don't need it anymore, you don't have to think twice about removing it, but their statement is overinflating the cost of having jQuery (or any other dependency) on your site.


jQuery was actually worth it about 15 years or so ago, when browsers were far less compatible and Javascript compliant than they are now. The juice isn't worth the squeeze now, though.


The article goes on to explain that they are optimising for extremely low spec devices used by financially and socially disadvantaged users. The majority of their users load pages in less than 1s (likely due to caching and connection). Some users take 25s or more to load a page. It is for these edge cases they were testing.


If they're concerned about financially and socially disadvantaged users perhaps the place to start helping those people would be on the universal credit login page which currently delivers 100+ kb of javascript for a couple of form fields.


They're testing against a hypothetical user on a 2G cellular connection. I can't see many situations where that would happen. Being poor doesn't mean you can't afford 3G, because every data plan in the UK includes 4G as a minimum. Often 5G is included too. You have to be extremely rural to drop down to 2G, and rural people are often better off. Poorer people tend to be in urban areas that have better digital infrastructure.


You're conflating having a 5G connection with a 5G connection running fast. Now, I live in the US, so maybe the UK has much better infrastructure, but cell phone internet service can be slow anywhere. I've had that 5G link with full bars in the middle of a city and could barely pull anything. The difference with me having money to spend is I can easily do any number of things about it, such as having a different internet service at home, or paying to use an internet cafe while 'working from home'.


Sounds like you live in a big city? In the rest of the UK, there are many many places, even in large towns of 100k+ people, where the mobile signal is absolutely terrible. Whether it's 4G or not doesn't matter if the signal is too weak to sustain a reasonable speed.


I think it is a great mindset to try to be as inclusive as possible. It might actually improve everyone’s experience.


They're testing against actual RUM data collected from the site and linked in the article.


[flagged]


I can't downvote you, and being as dismissive of you as you are of these efforts would simply create more rudeness.

Instead, consider why a governmental agency might want to be as inclusive of as many users of its services as possible, and how your minuscule slice of the world, advantaged as it is, might blind you to that motivation.


If you want to help the disadvantaged, don’t kid yourself that removing jquery from your website is doing it.


The article is talking more about the execution time to read, parse, and execute the library -- that isn't cached at all.


Seems like the writer hasn't heard about the trick of putting JS at the bottom of the page, or the HTML5 async attribute. Looks like they had jQuery in the head section on every page. That's not really jQuery or JavaScript's fault.


The writers are Matt Hobbs and Andy Sellick, respectively Head of Frontend Development and Senior Frontend Developer at GDS. They may have considered alternative options.

Here's a previous article explaining why they did what they did:

https://insidegovuk.blog.gov.uk/2022/08/11/how-and-why-we-re...


That article doesn't address the point raised: their placement of jquery in the head of page was likely the cause of the performance issues more than the presence of jquery.


I was thinking the same. However from the article it seems the issue is actually less with the downloading, but more with the parsing and execution, which is blocking, and has a material impact for users on slow devices.


Loaded from the Internet once.

So it's cached, but the browser still needs to access and load the file! :)


Maybe they have a bundle per page and it's in each one: https://www.npmjs.com/package/jquery


What I miss in this otherwise excellent article is more insight into methodology. There is just this murky statement:

> We run tests every day on GOV.UK pages using specific simulated devices and connection speeds. Because we can repeat these tests every day, it allows us to monitor what the changes we are making to the site are doing for real users visiting GOV.UK.

with no mention whether those were all cold start loads or simulated browsing scenarios.

Not wanting to diminish the effort or challenge the conclusions — I know that jQuery does quite a lot of checks and feature detecting stuff when it initializes, even when it is not used (so removing it must have significant impact) — but if those measurements were done with cold-start, empty-cache single loads each time, it might count some "extra" time (download, parse, tokenize, build AST) that usually precedes only the initial load and "first run" of a static library and is not present in subsequent interactions.


From the post

> For example, for a simulated user visiting the Universal Credit start page on a low specification device and 2G mobile connection, we can clearly see from the graph where the jQuery change was made.

Even if the cost of jQuery can be amortised over multiple page views in a session, someone's still got to have a successful first page view before they can view any more

For a service like Universal Credit that's aimed at the less well off, cheap (i.e. slow) devices and mobile connectivity mean that the service has to be as easy to access as possible.


Yes, but still no notion if it was really just that "Universal Credit *start page*" [1] load measured or if it was some more realistic scenario, like, you know, navigating to some page that this guidepost links to. Loading just the start guidepost does not make much sense from the user's perspective.

Again, I really do not want to attack their conclusions, I'm just curious how much (if at all (!)) would browser cache be beneficial here.

(I'm not even sure browser generally caches once parsed chunks, but I assume they probably do.)

[1] https://www.gov.uk/universal-credit (presumably)


My guess would be that the vast majority of people who visit government websites are doing it very infrequently and are highly likely to arrive with a cold cache.


Does any browser cache the parsing, tokenizing and built AST of Javascript libraries between website visits?

Also, I don't visit UK government websites that often -- so I'd suspect I am "cold cache" whenever I do.


> Does any browser cache the parsing, tokenizing and built AST of Javascript libraries between website visits?

From a quick googling, yes, at least some of them. (TIL):

https://stackoverflow.com/questions/1096907/do-browsers-pars...

>> The third time (i.e. a hot run), Chrome takes both the file and the file’s metadata from the cache, and hands both to V8. V8 deserializes the metadata and can skip compilation.

https://v8.dev/blog/code-caching-for-devs

> Also, I don't visit UK government websites that often -- so I'd suspect I am "cold cache" whenever I do.

For a first visit with a given browser and profile (or in incognito mode), sure.

But if you visit more than a single page during the same session, you probably have more than a "cold cache". The measured page mentioned in the article seems to be a guidepost with only the most important excerpt of the content, so it seems plausible you would "click around a bit" from there.


Scripts cached by a Chrome Service Worker will be compiled to byte code and cached too

https://v8.dev/blog/code-caching-for-devs#use-service-worker...
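
For the curious, a cache-first service worker is only a few lines; per the V8 post above, scripts served from the Cache API this way are eligible for eager bytecode caching. A minimal sketch (cache name and asset list are made up):

  // sw.js
  const ASSETS = ['/assets/app.js'];  // hypothetical bundle

  self.addEventListener('install', (event) => {
    // Pre-cache the listed assets during install
    event.waitUntil(caches.open('static-v1').then((c) => c.addAll(ASSETS)));
  });

  self.addEventListener('fetch', (event) => {
    // Serve from cache when possible, fall back to the network
    event.respondWith(
      caches.match(event.request).then((hit) => hit || fetch(event.request))
    );
  });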


When I did web development, I specifically loaded JS lazily so first you got a pure HTML page with just a few lines of code. Those lines of code are normal (not jQuery) JS that loads the rest of the JS lazily in a progressive enhancement. This means the page can start being rendered before the jQuery code loads.

But in fact, jQuery is largely not needed since Internet Explorer lost market share and there's less of a need to have lots of workarounds for it.
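
Roughly this kind of bootstrap, I'd guess (a sketch of the pattern described, not the parent's actual code; the bundle path is hypothetical):

  // A few lines of plain JS inlined in the HTML: render first, enhance later.
  window.addEventListener('DOMContentLoaded', function () {
    var s = document.createElement('script');
    s.src = '/assets/enhancements.js';  // the heavier JS, loaded after first render
    document.body.appendChild(s);
  });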


If done properly this can be nice, but a common grief is when interactive elements (input fields/buttons/links, including in flowing text) move around after first becoming visible. The worst is input fields that get cleared or grab focus as part of this process after the user has already started inputting data (this is way too common; it's not great to have to do a password reset because you typed half a password into an auto-complete search field).


this is so infuriating when it happens and I click the wrong thing because between the time I saw the button and pushed the button, the button moved away and another button moved under my finger/cursor.

I would massively prefer to have a blank page, or a page with no buttons, and wait a few seconds for the buttons to appear in their final locations, than this.


it also negatively affects anchor links, e.g. example.com/#contact will not always jump to the contact section with lazy loading


You have to really do some extra work to make this happen "nicely". Swapping out parts of the page without jerking the user around is a difficult problem, especially if you aren't working with fixed sizes of elements. And creating a "skeleton" system of placeholders during loading is a whole other set of state you have to build out and maintain.

I find most sites using this kind of progressive loading are completely useless during the loading phase... What's the point of showing the user a page full of skeleton/loading placeholders? You might as well have just server-rendered the whole thing and made the user wait an extra second or two.


I remember how it was good practice to assign width and height to <img> tags.

The idea being the browser would know how to render the things around the picture without needing to download the picture first, preventing the page from jumping around as the quicker and lighter HTML was downloaded and rendered first.
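
Something like this (the values are made up); the browser can reserve a correctly proportioned box before the image arrives:

  <img src="/photos/team.jpg" width="800" height="600" alt="Team photo">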


That still is a best practice.

That "jumping around" is now referred to as "CLS" -- Cumulative Layout Shift -- one of the "Core Web Vitals" metrics used to quantify performance-related UX.


I sometimes like CLS; it makes it look like things are happening instead of feeling like wait time. I recently used a simple templating mechanism:

elem = jquery.clone() ... elem.slideDown()

And this renders bits of the page, moving them as it goes; slideDown() is in jQuery core. It used to be popular before that biz of rendering grey squares first got invented.


Layout shift feels fairly terrible when you're navigating to an element that is at the bottom of the page (via #id) or simply refreshing the page. There is literally nothing more annoying than clicking the wrong link because the layout shifted under the cursor at just the wrong moment. CLS is also known as "jank" for a reason.


I use aspect ratio for this now that it's supported in CSS. It might be my favorite CSS feature since flex.
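
A minimal sketch of that (the class name is made up):

  /* Reserve vertical space before the image has loaded */
  img.thumb {
    width: 100%;
    aspect-ratio: 16 / 9;
    object-fit: cover;
  }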


GP was talking about loading JS. Adding JS to a webpage doesn't affect its rendering (unless the loaded JS manipulates the DOM), does it?


Yes, it does by default.

You can document.write() in JS, so by default the browser has to wait for the JS to download and execute before it can continue rendering.

The "async" attribute stops that default behaviour, but the person writing the blog does not seem to know that.


> jQuery is largely not needed since Internet Explorer lost market share and there's less of a need to have lots of workarounds for it.

As someone who's late to the webdev party, I feel that jQuery is pretty awesome (all I ever hear is people poo-pooing it). The vanilla DOM manipulation API is just plainly awful and borderline nonsensical, so I'd say jQuery is still very much needed.


If you are mainly using jquery for its DOM manipulation¹ rather than for browser compatibility² or things that didn't exist consistently in older browsers³ then there are much smaller libraries that do that job which may be worth looking into. https://github.com/fabiospampinato/cash or https://github.com/franciscop/umbrella to give a couple of examples. Some explicitly support IE11 so you are not dropping as much support for legacy browsers as you might otherwise.

Though if jQuery works for you and isn't a performance issue, then by all means keep with it. It may not be ideal, but good enough and does the job. Let the naysayers spend their time debating whether you should or not, and just get on with making things!

---

[1] selection engine, chained selections, chained modifications, …

[2] not the issue it once was, if you can abandon IE and old Android browsers from your supported UAs or can deal with any issues that crop up individually

[3] again, if you can afford to drop support for legacy UAs


I'm a big jQuery fan. Some new frameworks that do DOM binding are just too "magic". Sometimes there is close to zero JS code for a page, which, while cool, means it's hard to figure out what to do to add new features when the default behaviour is not what you want.

With jQuery, the balance between writing code to make the page interactive and not writing too much code seems about right.

I guess I prefer reading code than docs.


Yeah, for what little front-end dev I do these days I prefer to keep the abstraction a bit more at the jQuery level.

Quite a few of the mini-jQuery libraries, like the ones I linked to, are doing just that and allowing the sort of DOM manipulation it does but without all the other stuff.


Thank you for the recommendations - cash looks awesome and more than enough for most of my projects!


This is probably the biggest hole in vanilla js. In particular building a large DOM structure requires so many method calls.

jQuery lets one use an innerHTML-ish style, but it is supposedly guarding against injection attacks in some way. I don't like the hand-wavy way it claims to guard against injections, as basically it has no way to tell what part of a string was meant to be text, and what was meant to be elements.

So I ended up coding my own library. No conversion of strings to elements, so naturally no injections. Very small and simple. But saves a ton of typing when generating DOM structures in JavaScript: https://github.com/NoHatCoder/DOM_Maker
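
To illustrate the verbosity point with plain vanilla JS (this is generic code, not the linked library's API):

  // Building <ul class="menu"><li><a href="/home">Home</a></li></ul> by hand:
  var ul = document.createElement('ul');
  ul.className = 'menu';
  var li = document.createElement('li');
  var a = document.createElement('a');
  a.href = '/home';
  a.textContent = 'Home';   // textContent treats the string as text, so no injection
  li.appendChild(a);
  ul.appendChild(li);
  document.body.appendChild(ul);

Ten statements for one list item, versus a single innerHTML string, which is exactly the trade-off between verbosity and injection safety being discussed.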


I don’t do front end work any longer but JQuery was the glue that held webpages together in the time of IE. Before the rise of Big Tech JS Frameworks it was the work horse getting things done in the browser while trying to have cross browser compatibility.


jQuery and the many plug-ins people wrote for it were amazing.

What was not amazing was the manner in which many projects utilized them.


However the page uses jQuery?

<script type='text/javascript' src='https://insidegovuk.blog.gov.uk/wp-includes/js/jquery/jquery...' id='jquery-core-js'></script>


The blog is run by a separate agency inside the UK government for various historical reasons.


insidegovuk.blog.gov.uk is not gov.uk.


Honestly I feel like jQuery is my "secret weapon" similar to how Lisp is Paul Graham's in Beating the Averages. It's so much more productive than vanilla JS with negligible impact on performance, especially with caching.


What are you doing with jQuery that you can't easily do in vanilla JS? I understand the argument before we had querySelector, but not now.


JQuery is terse, which helps with productivity.

document.getElementById('id').style.display = 'none' vs $('#id').hide();
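
A few more typical pairs, for illustration (rough equivalents only; jQuery calls apply to every matched element and silently ignore empty matches):

  // toggle a class on every match
  document.querySelectorAll('.item')
    .forEach(el => el.classList.toggle('active'));        // vanilla
  $('.item').toggleClass('active');                       // jQuery

  // delegated click handler
  document.addEventListener('click', e => {
    if (e.target.closest('.btn')) { /* ... */ }           // vanilla
  });
  $(document).on('click', '.btn', function () { /* ... */ });  // jQuery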


There are better alternatives to jQuery, like Alpine.js or htmx, which are much lighter and easy to use.


> especially compared to images which can be multiple megabytes in size

Please don't put images of multi-megabytes size on web-pages. My boss put a 10-megabyte image on the homepage of one of our clients; he fancied himself a photographer, and he took the photo. It gave a 10s page-load time from a desktop PC. I offered to shrink it for him, which would have taken 2 minutes; he wasn't having it.

I don't think it was pride in his photo; I think the client wasn't paying their bills, and he wanted to punish them.

But please, shrink your images to fit on the screen of your most-capable target device. There's no need for super-hi-res posters on websites.


Adding to that: please use the img tag's sizes and srcset attributes to maximize performance.

And please stop using desktop-sized images across all viewports when using background images.
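
For example (filenames and breakpoints are made up), letting the browser pick the smallest adequate file:

  <img src="/img/hero-800.jpg"
       srcset="/img/hero-400.jpg 400w, /img/hero-800.jpg 800w, /img/hero-1600.jpg 1600w"
       sizes="(max-width: 600px) 100vw, 50vw"
       width="800" height="450" alt="Hero image">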


> But when it’s on every page, from a web performance perspective, it equates to a lot of data.

How does browser caching come into play here? Doesn't it make a difference?


Browsers don't share cached assets between different origin domains any more. So the browser cache would only come in to play when you're browsing between pages on the same government website.

But I'm not sure that downloading 32kb on each page load is their biggest problem. It sounds like a lot of the cost of jquery is parsing that javascript on each page load - especially on low specced devices. Javascript parsing traces (as far as I know) aren't reused between page loads at all.


This is an interesting point. Is there no way for cdn-hosted JavaScript to have their JIT results cached? I suppose the dynamic nature of JS might make that pretty difficult. (Lack of hermetic environment around modules, so global variables could change compilation results.)


Yes, it makes a difference in network requests.

It only needs to get the file from the Internet once, but the browser still has to load the file and work with it on each page load! :-)

How that is handled exactly... I don't know LOL :-)


Maybe. jQuery reduces the number of characters you need to write compared to vanilla JS, so if their websites use some JS, it was probably a benefit. Now tell me about the megabytes of JavaScript that their new fancy frameworks download, uncached.


""" Since these users have the slowest devices, they’ll need the most help to make sure their visit to GOV.UK is as fast, efficient and as cheap as possible """

If only web developers cared about *all* users the web would be much less crappy.


> But when it’s on every page, from a web performance perspective, it equates to a lot of data.

Author of this article is apparently unaware of browser caches.

> JavaScript is known as a “render-blocking” resource.

Yeah, if only there was, like, an async attribute or something.

> The graph below shows time to interactivity dropped from 11.34 seconds to 9.43 seconds

So, jQuery is way too heavy for these people, but interaction tracking analytics (these packages usually start around 200kB) is perfectly fine?

> total page load time dropped from 19.44 seconds to 17.75 seconds

Burying the lede here, if my team celebrated a 17 second page load, we'd be fired on the spot. Going out on a limb here to suggest jQuery is the least of their problems.


Why is jQuery "render-blocking"? Wouldn't loading it at the end of the HTML or with the `async` attribute fix that?


I’m still a huge fan of jQuery. When I start a new web project, the first thing I do is add jQuery to it. It's pretty much the only dependency I ever use in my web projects. Nothing can beat the short “$” function and effects such as fadeIn, fadeOut, and others.


"But when it’s on every page, from a web performance perspective, it equates to a lot of data."

Caching? Set cache headers such that the jquery.js rarely/almost never expires and requires a re-fetch?
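
For example, a long-lived, immutable policy on a fingerprinted URL is the usual pattern (not necessarily what GOV.UK does):

  Cache-Control: public, max-age=31536000, immutable

Though as others note in the thread, that only saves the repeat network fetch; the parse-and-execute cost the article focuses on is still paid on every page load.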


Couple of relevant previous discussions around this subject:

Gov.uk drops jQuery from their front end (90 days ago|437 comments) https://news.ycombinator.com/item?id=31434434

How and why we removed jQuery from Gov.uk (5 days ago|43 comments) https://news.ycombinator.com/item?id=32423422


jQuery is still wonderful.


Try new stuff every once in a while. You might be surprised


Try old stuff every once in a while. They still work. You might be surprised.


I've worked with a lot of technologies over the years. Some I still consider to be good, others I am happy to leave behind. jQuery is in the latter camp.

I have worked with these technologies professionally:

- Vanilla js targeting ie5+

- jQuery

- MooTools

- Angular

- KnockoutJS

- virtual-dom

- React

- Typescript

I've also tried out a handful more in my spare time, like Vue, Preact, and Elm.

If your list does not include any modern stuff, then your opinion is worth zero to me, because your reference for what's good is horribly outdated. If your list matches or exceeds mine, and you still prefer jQuery, then I'll be bewildered but at least your opinion is interesting.


There's a JS library called New Stuff?

For me, I use Jquery because it works. Light and low impact. Syntax is good for readability.


jQuery has its own syntax?



That's the API. Syntax is something completely different: https://en.wikipedia.org/wiki/Syntax_%28programming_language...


We can safely use the word syntax in this context without offending programming languages. You seem overly concerned with latching "syntax" to a tightly controlled definition. Perhaps you should google "jquery syntax" then contact each website informing them of your objections.


Syntax does not mean API, so it's simply the wrong word.


It's the right word in the context I used it.

"API" refers to the whole thing. API does not refer specifically to how statements are written or how accessible or readable its custom patterns and expression are to read and write.

I like the jQuery syntax. This means I like how the expressions tie together the way they do, using the custom syntax it employs for representing objects, methods, etc. I'd give examples of expressions whose syntax I like, but it sounds like you're not interested.

Maybe you're offended because I highlighted your less-than-helpful advice to "try new stuff", which was a response to someone expressing their liking of jQuery. They weren't looking for advice or revealing a problem. You turned their liking of jQuery into a problem, and now you're turning my use of a word into another problem.


> But when it’s on every page, from a web performance perspective, it equates to a lot of data

It's not cached?


It’s so nice to know that someone still cares about web performance, in a world where it's quite normal to see a website with static information and 1-5 MB of compressed JS, which takes a long time to load and makes your device hot.


jQuery was nice in a pre-transpiled-JS world, but those days are long gone.


I very intentionally avoid that world when I can because the tooling overhead is just too much, and the general state of the tooling does not leave me impressed. I don't think I'm the only one, and there are plenty of older sites/apps just churning along. Don't confuse popularity with ubiquity.


No, they are not gone at all.


Deleted.


Tracking and spyware can be loaded after page load.

And with HTTP/2 it doesn't really matter if you have more than 8 resources.

I doubt query selectors will have any meaningful impact on page load.


At the end they mention a certain cohort of users, but how many users were actually in that cohort? jQuery is an older technology. I find it hard to believe something made to work on older hardware and internet connections is affecting performance that much. We have 5G and stupidly fast chips in phones now. I'm skeptical. Performance issues are usually database queries or code compiling / "waking up" some server function.


There is a massive difference between using a library like jQuery and using vanilla JS for similar purposes. I would not be surprised if a lot of frontend devs aren't fully aware that a lot of functionality is now available in modern, vanilla JS vs. using a library.

The website is a bit of a parody, but the numbers are real: http://vanilla-js.com/

For similar functionality, modern, native JS is orders of magnitude faster than almost all JS libraries. The libraries obviously do more, but they also account for older browsers and just by their nature have a lot more overhead.

The point about edge-case devices also should not be missed – the assumption that "we have 5G" should be the target is really quite surprising. There are always cases for a more optimized website: when traveling with a weak signal, for people on pay-per-MB plans, and more – all kinds of people can benefit from a slimmer and (compute-wise) lighter website. The stupidly fast chips are not used by everyone; in fact, not even by the majority.


> We have 5G and stupid fast chips in phones now. I’m skeptical.

I'm not. Rural areas often have unbelievably bad internet connections. In my home town we have around 2 MBit/s for broadband connections and 5G is not there either.

You lose nothing by optimizing your sites for the worst possible case and not for what you'd like everybody to have.


"Performance issues are usually database queries or code compiling / “waking up” some server function."

That's not true. The overwhelming majority of user-perceived latency for mobile devices accessing typical websites occurs after the initial HTML response has been received. See eg https://wpostats.com or https://httparchive.org for data.


Can I point you to this blog post from edent, on the importance of making government websites work for everyone: https://shkspr.mobi/blog/2021/01/the-unreasonable-effectiven...


This quote really hammers the author's point home; I remembered this article for this alone:

> But the GOV.UK pages are written in simple HTML. They are designed to be lightweight and will work even on rubbish browsers. They have to. This is for everyone.


gov.uk serves all of the UK - including the person who lives in the middle of the mobile dead spot, and the person who can't even get consistently working ADSL. 4G/5G doesn't cover the UK by a very long way, and there's plenty of the UK who cannot afford anything but the cheapest smartphone.

It's much less common in the UK than the US to buy a smartphone on credit - cheap phones with cheap pre-paid plans are normal - so low-end models with very slow chips are more common. The sort of thing where even scrolling through your contact menu will induce lag.


> We have 5G and stupid fast chips in phones now.

The people who need government services the most won't have this


Do we have hard data on this?

Just want to make sure we aren’t just making assumptions.

I would think most folks, at any income level, would replace their phones at least every 5 years, simply because the batteries would have died. And I'd think any phone from the past 5 years would be reasonably decent, even the ultra-cheap ones.

Plus, on the network side, cell carriers are generally incentivized to get folks off 3G and onto LTE, since it's cheaper for them not to have to maintain N systems.


IMO the key concept is that the government does not serve most folks. It serves all folks, so it is good to be slightly conservative in terms of bandwidth.

For data maybe Android share in Europe could be a start, or people with prepaid plans. In my experience both are much more prevalent in Europe than in the US.


GDS have it (or at least proxies for some of it) as part of their RUM data

Device memory is a good proxy for how powerful the phone is

Effective connection type will give some data on network performance
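
Both of those are real (Chromium-only) APIs; a rough sketch of collecting them client-side, with a made-up /rum endpoint:

  const rum = {
    deviceMemory: navigator.deviceMemory,          // bucketed GB value, e.g. 0.5, 1, 2, 4, 8
    effectiveType: navigator.connection
      ? navigator.connection.effectiveType         // 'slow-2g' | '2g' | '3g' | '4g'
      : undefined,                                 // guard: not available in all browsers
  };
  navigator.sendBeacon('/rum', JSON.stringify(rum));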


You have all those things, I have some of them, many people do not.



