> "The site example.com uses 5 MB of scripting. Allow it?"
Oh God, no. Please don't do this.
1) We know that users presented with these kinds of dialogs don't read them. They just blindly click "OK" to get the dialog to go away. So now you're just annoying people to no good purpose.
3) If the problem we're trying to solve here is that code makes sites load slowly, making people click through a dialog isn't going to make the site load any faster. In fact it will almost certainly take the user more time to find and click the "OK" button than it would to just load the code.
4) It's measuring the wrong thing. 1 kilobyte of badly written code can do as much or more harm to the user experience than 1 megabyte of well written code will.
 With the possible exception of people on strictly capped data plans, but now we're solving a different problem.
Which they can do by finding a way to load a lot of JS that doesn't trigger the popup (someone below mentions running eval on a PNG). This will likely make the site even slower for users, but there won't be a popup.
I've found it best to never assume that intelligent creative people will choose the specific approach I want to solve a problem.
I've seen it happen in sales: head office focuses on add-on warranties; at store level they convince the customer to buy a cheaper product and sell the warranty (which, if the margin on the warranty is high enough to offset the difference, might be a smart move) or, in one memorable instance, give everyone a discount equal to the cost of the warranty.
We did wonder how that branch was selling more warranties than the next three best performing.
Judging it by KB of data just doesn't make sense, unless you apply it to everything. And then it would show on pretty much every site.
I would say you're annoying some people so others have the ability to make that choice, and the site developers will definitely see the friction of that dialog as something to be avoided. It incentivizes good behavior.
> It's measuring the wrong thing. 1 kilobyte of badly written code can do as much or more harm to the user experience than 1 megabyte of well written code will.
That depends entirely on your connection, keeping in mind your connection might be from your phone, when you have very poor reception.
The fact that people are immediately assuming it only matters for execution time after loading illustrates part of the problem, which is that developers often forget about entire classes of access and thus don't consider the problems they create for them.
"The bits for this website traveled through a router in Europe. Do you want to continue?"
"This website uses some special behavior that isn't part of the w3c spec and works slightly differently in your browser than in other browsers. Do you want to continue?"
"This website was developed by a company that offers its employees below market rate health insurance. Do you want to continue?"
There exist people who'd like to know about each of these things. It would be insane to make them dialog boxes.
That is justification for opt-in instead of opt-out for this feature. But you lose that incentive you want. Want to make your users happy and have on-by-default protection to incentivize better resource management? Gotta get out of the users' way.
Yes but how many places actually make this effort? Seems to me that most places are very content to make some monstrous multi-MB React/Angular/etc app, ship it and wash their hands. The number of websites that have actually gone to the effort of optimising their sites is extremely small, so I'm doubtful when someone says "oh but web apps can do this too, if you only just progressively load/build it properly/etc" because that implies effort that so far, seemingly no one has gone to the trouble of actually doing.
"They" and "them" are - as far as I am aware (and I work in a highly charged (some would say overly so) diversity-aware company) - the currently preferred terms. The sand may well have moved under my feet (this happens every month or so), but as far as I understand, this is the politically correct term for referring to generic "people" without referencing gender one way or another.
For better or worse, I think we can all agree that only ever saying "him"/"he" or "she"/"her" is currently considered unacceptable in English-speaking companies.
I do wonder, though, how this plays out in companies where grammatical gender is a major part of the dominant language. How can you eliminate her/him in a language where even a concept as simple as a table has a gender (der Tisch, la mesa, la table, etc.)?
Hi, I'm the person who wrote the (apparently offending) comment.
I am sensitive to and aware of these issues. However, using they/them is not necessarily an obvious slam dunk in these contexts either, as that's technically not grammatically correct when used in the singular. There are those who argue that the singular they is useful enough in our modern gender-fluid context that we should just start using it and let the grammar rules catch up when they will, but this is definitely not a settled currently-considered-the-only-acceptable-choice-everywhere kind of thing.
Grammarly tackled this question on their blog last year (here: https://www.grammarly.com/blog/use-the-singular-they/), and while I think that piece makes some good arguments, note that it approaches the subject as "while non-standard, this is a good thing that people should be doing anyway" rather than "everyone knows this is the only way to approach this." Similarly, Merriam-Webster (see: https://www.merriam-webster.com/words-at-play/singular-nonbi...) judges this a matter of still-evolving usage rather than long-settled grammatical law.
The last thing I wanted to be when I grew up was a grammatical pedant, so please don't take this as an invitation to a war over which usage is the right one. I can see merit in both arguments (though I tend to think the singular they will eventually reach consensus acceptance, just because it's so useful). I just mention it to illustrate that failure to reach for the singular they doesn't automatically mark a writer out as oblivious or sexist.
(And note that for some singular-they advocates "he/she" and "her/him" aren't really any better either, as they omit trans folk and other non-binary people as much as using "he" in one place and "she" in the next does.)
This is a place where the language is rapidly evolving, is all I'm saying, and therefore maybe we should lean towards reading people charitably until such time as a consensus has been reached.
I don't agree with sacking people over this but we 100% absolutely need to confront sexism (in every direction, and implicit & explicit) in our industry.
Even if that is a sackable offence where you work (which I'm highly sceptical about), that would still be breaking UK employment laws and thus the terminated employee would have grounds for unfair dismissal. This is even taking into account how sensitive to discrimination our employment laws are.
Usually what happens when employees are fired over seemingly trivial things is that they already have a string of infractions on file, and really the trivial occurrence is the "final strike" (to use a baseball metaphor).
We don't need that in our industry.
Linguistic gender and whatever "gender" is supposed to mean for people are entirely orthogonal.
It's not "la table" because it has traits related to the female sex.
> Counterargument is contradiction plus reasoning and/or evidence. When aimed squarely at the original argument, it can be convincing. But unfortunately it's common for counterarguments to be aimed at something slightly different. More often than not, two people arguing passionately about something are actually arguing about two different things. Sometimes they even agree with one another, but are so caught up in their squabble they don't realize it.
Whereas you are arguing about what is politically correct, I argued that the "politically correct" usage of they/them has a problem. I actually agree that they/them is politically-correct, and think we should change that.
So your argument does not, in fact, disagree with mine.
My understanding is that using "they"/"them" for purposes of gender equality/neutrality is well understood by many people.
EDIT: if that was sarcasm: sure, we can't have perfection, but not excluding ~50% of people still seems better...
Please - sexism has no place in our industry.
But on the other hand, I love the idea of resource-limiting tabs by default, along all of:
- CPU usage (allow an initial burst, but after a few seconds dial down to max ~0.5% of CPU, with additional bursts allowed after any user interaction like click or keyboard)
- Number of HTTP requests (again, initial bursts allowed and in response to user interaction, but radically delay/queue requests for the sites that try to load a new ad every second even after the page has been loaded for 10 minutes)
- Memory usage (probably the hardest one to get right though)
If the browser had soft resource limits that just gradually slowed down code execution and HTTP requests to extreme levels, with a discreet pop-up warning in the corner "this site is attempting to use higher than normal resources, click here to allow temporarily/permanently" for the times you're opening a WebGL demo...
...it seems like it would be a big win for users.
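The "initial burst, then dial down, topped up by user interaction" behavior described above is basically a token bucket. A minimal sketch, with every constant a made-up placeholder rather than a proposed browser policy:

```javascript
// Token bucket soft limiter: starts full (initial burst allowed),
// refills at a slow steady rate, and gets a bonus top-up on user
// interaction. A real browser would queue/delay work rather than
// reject it; this only shows the accounting.
class SoftLimiter {
  constructor({ burst = 50, refillPerSec = 1 } = {}) {
    this.capacity = burst;
    this.tokens = burst; // start full: initial burst allowed
    this.refillPerSec = refillPerSec;
    this.last = Date.now();
  }
  _refill(now = Date.now()) {
    const elapsed = (now - this.last) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerSec);
    this.last = now;
  }
  // Called on click/keypress: grant another burst.
  onUserInteraction(bonus = 20) {
    this.tokens = Math.min(this.capacity, this.tokens + bonus);
  }
  // True if the action (an HTTP request, a slice of CPU time) may
  // proceed now.
  tryAcquire(cost = 1) {
    this._refill();
    if (this.tokens >= cost) {
      this.tokens -= cost;
      return true;
    }
    return false;
  }
}
```

The same bucket shape works for all three limits in the list: cost = one request, or one millisecond of script time, or one allocation unit.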
What's more interesting to me is the broader economic question. Do users end up flocking to those browsers because it makes the browser more snappy for the other web sites and contexts? Or do they ditch them because they are necessarily slower than browsers that don't? Is this sort of decision, in other words, a sort of suicidal option for browsers?
Granted, those technologies are not as widely used and that limit, if absent, would not inconvenience the user as quickly and easily as CPU/memory hogging does.
Another thing could be to allow the site to read those limits and consider what to load. Kinda like mediaquery.
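Something along these lines half-exists already via the Network Information API (`navigator.connection`) and `navigator.deviceMemory`, both Chromium-only at the time of writing, so feature-detect. A sketch of a site consulting them; the tier names and thresholds are made up:

```javascript
// Decide how much to load from hints the browser already exposes.
// Tier names ('minimal'/'light'/'full') and the cutoffs are invented
// for illustration.
function chooseAssetTier({ effectiveType, saveData, deviceMemory } = {}) {
  if (saveData || effectiveType === 'slow-2g' || effectiveType === '2g') {
    return 'minimal'; // text + critical CSS only
  }
  if (effectiveType === '3g' || (deviceMemory && deviceMemory < 2)) {
    return 'light';   // defer non-essential scripts and images
  }
  return 'full';
}

// In a browser you would wire it up roughly like this:
// const conn = navigator.connection || {};
// const tier = chooseAssetTier({
//   effectiveType: conn.effectiveType,
//   saveData: conn.saveData,
//   deviceMemory: navigator.deviceMemory,
// });
```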
Funny how Mozilla spent so long trying to make Firefox multi-process, and right after they got there it was decided that we had to throttle those processes anyway.
I run an adblocker primarily because I don't want to see ads and get tracked by sketchy sources. Resource usage is a minor bonus. If that hurts people's pockets, then so be it; most ads are out of hand and need to be tempered. Most people who run adblockers aren't going to click on the ads anyway.
With that said, I do hope we're able to figure out how to treat web "sites" and web "apps" differently - for the former, I want as little JS as possible since that just gets in the way of content, but for the latter, the JS is necessary to get the app running, and I don't mind if it's a few megabytes in size.
It's also not very effective as an ad-blocker or anti-tracking tool.
Setting some arbitrary hard-limit is counter-productive and shortsighted. That quote is infamous for exactly the same reasons that this claim is flawed.
I can easily imagine a 5 MB+ web application. Sure, you might not like that, but that doesn't make it a bad situation.
It's impolite for an installer to not tell you it's copying 50GB of data to your hard drive before it starts.
"640K ought to be enough for everyone" is shortsighted, but "Web applications should not assume the user wants to dedicate all system resources to them without knowledge/permission" is not.
It won't matter much longer. Websites are giving way to phone apps. Where no visibility into behavior or resource consumption is normal, standard, and expected.
You don't know how many resources it's going to use until you've already visited it.
Content blockers try their best to block and remove specific parts of the web page, surgically identifying them. You can't just randomly block stuff, break an insane amount of websites, and call it a day.
What about web apps? Should there be a «this application wants to exceed the x MB quota» confirmation message? Oh please no.
I'm not sure I particularly agree with this proposal, either, for a lot of reasons outlined in comments here (most broadly that this is putting too much of a burden on users, and that it's just likely to create a new round of cat-and-mouse games with bad actors to find ways around it), but the author certainly isn't inexperienced in web and application matters.
However, you can't expect actual web applications to do this.
- https://www.audiotool.com/app/ (sign in required; demo https://www.youtube.com/watch?v=gvIu_R8bmdA)
- https://human.biodigital.com/signin.html (sign in required; demo https://www.youtube.com/watch?v=dW4JSMlBhWQ)
Other more commonly known examples:
- Most of Google Drive offerings
- Google Analytics control panel
I see this confusion a lot when this topic is discussed on HackerNews.
Example: Build a usable decent image editor that allows you to preview your changes without the page refreshing or redownload an entirely new image (therefore using even more data).
Example 2: Make a playable 3D game
Example 3: Make a spreadsheet web app
I do it every day, and there's absolutely no reason that progressive enhancement should be ignored today, except that front-end devs have forgotten how.
You flat out rejected the idea.
> How do you think webapps were written before React came along?
jQuery / Prototype / ExtJS, and others were written in Flash or Java.
It would be fascinating if the entire internet functioned as workers extending the graph that is the internet: web applications adding to the graph while web browsers walk through it. There is a lot to be imagined. I think there is room for both applications and data to exist together.
This everything-is-a-website mentality is a disaster for privacy, security, and the future of decentralized general-purpose computing.
- Convenience of just going to a site and not installing anything.
- Cross platform, if you have a browser most web apps will work.
- Being relatively certain that by just visiting a web app you're not going to end up with malware or other unwanted software
- Being able to access your app from other locations without admin access.
- Updates are ready and available instantly
There are of course many many more benefits.
I get that not everything has to be a web app but it's disingenuous to imply that they have no use cases.
We still don't have a reasonably easy way to write cross-platform applications (Mac/Win/Lin/Phone) in a native-first language. The reason JS has taken off is because it filled a rather HUGE hole.
If there is a chance that is changing, let us know, but right now, the hole isn't being filled with anything else.
The point is to remove the need to install things.
An image editor that runs entirely in the browser and only has sandboxed access to open and save dialogs is more secure by miles than running a random .msi installer.
2) What you propose is completely different from the proposal in the link. I'd be ok with a JS-off-by-default world (or a cookies-off-by-default one too!!), but having a limit on file size is just… myopic.
Your first point misses the argument. It’s not that you should support everything with JS disabled; you should, however, support a subset of your interaction as a fallback. Yeah, users don’t get offline mode or whizzy ajax updates, but you can absolutely find a way to work, in nearly all apps. And it isn’t that hard.
The fact that so many people believe this is impossible is an indication of how low the front-end development bar has fallen. This is why we have webpages that take 10MB of JS to render 300k of text.
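For what it's worth, the fallback pattern is small. A sketch, where the form markup and the `/subscribe` endpoint are hypothetical: the plain form works with JS disabled via a normal POST and full-page reload, and a few lines of script upgrade it to an in-page request when JS is available.

```javascript
// The plain HTML form works with JS disabled (normal POST, full-page
// reload). This enhancement intercepts the submit when JS *is*
// available. Hypothetical markup:
//
//   <form id="subscribe" method="post" action="/subscribe">
//     <input name="email" type="email" required>
//     <button>Subscribe</button>
//   </form>

function enhanceForm(form, submitFn) {
  if (!form || typeof form.addEventListener !== 'function') {
    return false; // nothing to enhance; the plain form still works
  }
  form.addEventListener('submit', (event) => {
    event.preventDefault();             // skip the full-page navigation
    submitFn(form.action, form.method); // e.g. wrap fetch() here
  });
  return true;
}

// In a browser:
// enhanceForm(document.getElementById('subscribe'),
//             (url, method) => fetch(url, { method }));
```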
Who is asking you to do this? I have not had a single request -- ever -- in 15 years of web development. Hey, more power to you if you want to disable JS, but the world will be leaving you behind.
I also ask my ISP to support ipv6, but I imagine the same thing happens there. Engineers never get the memo because it’s deemed unimportant by people who make decisions.
The number of people with JS turned off are so small they don't matter, and on top of that, they're not monetizable, so I really don't care if the site works for them.
There is virtually never any business rationale for making that extra effort.
Give people the option to easily turn off the crap, and you’ll find out how much it matters.
Not sure how I feel about a statement this broad. I work mostly with front end code and I try neat things and tricks to reduce resource use all the time. I also always push for the lightest way of doing things in PRs. I’m certain I’m not the only engineer who thinks like this.
Maybe in your company. My experience browsing modern front-end SPA design however begs to differ. They pride themselves in loading megabytes of uncompressed images, layers of invisible background images, frameworks to load frameworks, auto-download/auto-playing 4k videos...
Giving devs 4-year-old laptops would help, but when a shiny startup is offering new MacBooks, how can you compete?
Just have them test under processing and bandwidth limits?
This is a flawed assumption, I think. Sure, maybe _some_ people are blocking ads because all that JS and network IO eats up battery, but I bet more people block ads either because they dislike advertisements generally or because they dislike advertisements based on surveillance. Whether you manage to cram your ad-serving code into 1 MB or 1 KB is irrelevant for the latter two groups of people.
Would you like to subscribe to our newsletter?
Please give us access to your location!
This site is way better if you download our app!
Those pixels do not execute themselves (or cookies, or headers, or whatever we are still allowing). They still need an interpreter. Interpreters are indeed security flaws. Pushing people away from scannable code and towards hiding their code in other filetypes will cause a proliferation of interpreters, and a proliferation of security flaws.
To process such a file, you DO need an interpreter that follows the "rules" of the encoding, but the interpreter can be small, concealed, and varied, resulting in a cat and mouse game if you are trying to block such implementations.
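To make that concrete: the "interpreter" needed to pull code back out of pixel data is a few lines. Here a Uint8Array stands in for bytes read back from a canvas; nothing about the trick is specific to images, which is exactly why byte-counting the `<script>` tags misses it.

```javascript
// A trivial "interpreter": turn raw bytes (pretend they came from an
// image's pixel data) back into JS source and evaluate it. The payload
// here is just the expression "6*7".
function decodePayload(pixelBytes) {
  return String.fromCharCode(...pixelBytes);
}

// Smuggle the source into a byte array, as an image channel could.
const pixels = Uint8Array.from('6*7', (c) => c.charCodeAt(0));
const source = decodePayload(pixels);              // "6*7"
const result = new Function('return ' + source)(); // evaluates to 42
```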
What I really want to see happen is WHATWG start a standard lib project. In my mind it would look something like this:
* Every browser agrees to preload / cache all recent versions of the standard library (or most recent version for each major release assuming semver). This allows all programmers to load the version they need without concern for the performance hit.
* User loads the standard library via a script tag with src something like: https://ecmascript.org/stdlib/v1/lib.js. Doesn't really matter what it is, just some recognized URL the browser knows about that encodes the version.
* Each browser can provide a native implementation of anything in the stdlib so long as it passes the spec. Browsers could even optimize which pieces of the stdlib it parses / loads based on this. e.g. If the browser has a native Promise implementation then it doesn't need to load the Promise code from the stdlib.
* Be reasonable but aggressive about adding to the stdlib. Its scope should be wide and cover common use cases. e.g. I shouldn't have to write a URL-parsing class or a throttle function every time I start a new project (which is where we are today). There are plenty of projects to look at for learning what people need (lodash, etc).
Obviously this would not cover everything; it's not going to add `await` to browsers that don't support it. But I think we're at the point where what we are most in need of is not language features (and mostly these are transpilable anyway), but stdlib functionality.
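The "native implementation wins" step could be as simple as feature-detecting before injecting the script tag. A sketch; the feature names and the `?only=` URL parameter are invented for illustration:

```javascript
// Return the subset of wanted stdlib features the environment does
// not already provide natively.
function missingFeatures(wanted, globalObj) {
  return wanted.filter((name) => typeof globalObj[name] === 'undefined');
}

// In a browser, you would only inject a <script> for what's missing:
// const missing = missingFeatures(['Promise', 'URLSearchParams'], window);
// if (missing.length) {
//   const s = document.createElement('script');
//   s.src = 'https://ecmascript.org/stdlib/v1/lib.js?only=' + missing.join(',');
//   document.head.appendChild(s);
// }
```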
Perhaps we just let quality rise to the top? If your website is full of ads and slow, then someone else with better, less annoying execution will eventually win more users.
What is going to be the limit?
Who is going to be setting it?
How is it going to be calculated? Is it the amount of loaded files? What if the page is small but then loads more dynamically later?
How do I, as a developer, make sure my website works everywhere?
What about sites that use a lot of code but still work fast? The amount of code does not directly translate to slowness. You can have little code and a slow website, or a lot of code and a fast website. As an example, single-page applications tend to load all of their code up front but still manage to be responsive.
> What is going to be the limit?
Whatever the user wants it to be.
> Who is going to be setting it?
A default from the browser, which can be overridden by the user.
> How is it going to be calculated? Is it the amount of loaded files? What if the page is small but then loads more dynamically later?
Preferably by a total per-site and per page (with page being a higher amount than site). When a site/page hits its limit, halt running with a popup asking for a temporary larger limit (and a checkbox to remember the limit for this site).
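The accounting half of that is straightforward. A sketch of one page's budget (a per-site counter would work the same way; the 1 MB default is an arbitrary placeholder, not a recommendation):

```javascript
// Byte-budget accounting for one page: count script bytes as they
// load, signal a halt-and-prompt when the limit is exceeded, and let
// the user grant a larger allowance.
class PageBudget {
  constructor(limitBytes = 1_000_000) {
    this.limit = limitBytes;
    this.loaded = 0;
  }
  record(bytes) {
    this.loaded += bytes;
    return this.loaded <= this.limit ? 'ok' : 'prompt-user';
  }
  allowMore(extraBytes) { // user clicked "allow a larger limit"
    this.limit += extraBytes;
  }
}
```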
> How do I, as a developer, make sure my website works everywhere?
> What about sites that use a lot of code but still work fast? The amount of code does not directly translate to slowness.
Depending on your connection, it really does. Bad mobile connections (even if only in certain locations, like supermarkets that are constructed like Faraday cages) mean that an increase in the size of the JS loaded required to use the page directly translates into extremely slow websites, and in a lot of contexts, the first page is the only page visited, so the idea that a SPA will be quick after that first load (to stave off that future point) doesn't really solve the problem.
He suggests 1 MB; personally I still think it is too much.
1 MB for a single page isn't enough?
> What about sites that use a lot of code but still work fast?
It's fast if the resources are near your device (CDN), most websites don't do that at all.
Whether a website wants to be inefficient or has a good reason for having a lot of resources, it impacts the speed with which the site loads. The slower the site loads, the more users are going to abandon it. Stats say that by 2 seconds most users have abandoned a slow-loading site. Caveat: if the user feels the site is really worth the wait, they will wait longer.
So seems everything in this proposal is already addressed...
What about single-page apps where the whole app is "a single page". With this proposal, even lazily loading the extra bits would hit the limit.
"But there's a downside to this content blocking: it's hurting many smaller sites that rely on advertising to keep the lights on. More and more of these sites are pleading to disable content blockers"
The solution proposed is unworkable for all the reasons others propose. But there really is a baby/bathwater situation with ad blockers that attempt to kill all advertisements.
I don't know what the solution is, but I think it is an important problem and deserving of a more sophisticated solution than ad blockers or ones like the solution proposed. People here are smart, I hope people can do more than just state reasons why we can't do better than what we are doing.
I don't think "get a different business model" is a reasonable option for many web sites, unless the whole point of the site is to support a product (for instance Adobe's Photoshop web page, or whatever).
Really, we block ads because they are intrusive, disruptive, invasive and bloated pieces of malware. Simple as that.
I've got another way to solve this problem.
Content blockers should operate on a blacklist instead of a whitelist. Advertisements appear by default, but if a given site annoys you enough, you can go into your ad blocker's preferences and add it to your blacklist, and then you'll never see any ads on that site again.
Why is this not even an option in any of the major adblockers? I know it's possible in Ublock Origin via tweaking advanced settings, but it should be a built-in, user friendly feature.
That's literally what ABP has. Ads which meet the acceptable-ads criteria appear by default, and the rest are blocked. Users can, however, go and disable that setting so that all ads are hidden by default.
Right now, most people I know who use adblockers will "whitelist" websites they want to support, and/or who display ads they find nonintrusive.
I really think the default should be switched, so websites can display ads by default but a user can blacklist domains they dislike. At minimum, it should be possible to switch to this behavior.
Remember those popups in IE, "Do you want to continue running scripts on this page?" Nobody knew what they were, where they came from, or what they were talking about. Even worse was the fact that nobody knew what the Yes/No buttons were for, and if you clicked "Yes" sometimes it would just keep popping up over and over again.
They got rid of that for a good reason (although it's still alive and well in MSHTA.exe). It sucked and offered nothing of value to most users.
On the other hand, the New York Times's website has 5 external scripts that total over 1 MB of just text. They've got their CMS, 4 DIFFERENT ad agencies, analytics, and some other crap the site probably never "needed" before 2010. That's disgusting. And that's what you get from a $300m+ company that RELIES on technology to stay in business. Missing a feature? Fuck it, just have the client pull down another 400 KB of JS from a sixth CDN. Never mind that the actual CONTENT that attracted the user in the first place is measured in bytes.
Almost every page of my WP website has just 8 internal resources and a page size of <310kb. There's no reason NYT can't stay in that ball park with their deep pockets. Megabytes of tracking and bullshit for basically a white page with text is disingenuous.
"This page will cost $0.02 to retrieve. Pay by running 20FLOPs of code?"
"This page will cost $0.05 to retrieve. Pay by running 500FLOPs of code?"
"No, pay via $fiat_transfer_method"
* A number of seconds on page load,
* a fraction of a second in response to network activity (eg. image loads),
* a small number of frames in response to significant user interaction (mouse clicks, typing),
* one frame after less-significant user interaction (mouse movement), limited only to local operations (no network activity).
The vast majority of sites should need nothing more than this, so opt-in to unlimited usage should be fine. Sites where the user has enabled extra permissions like notifications should be allowed extra time for those.
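Those budgets could be expressed as a simple trigger-to-milliseconds table; every number below is an illustrative guess, not a proposed standard:

```javascript
// Script execution budget per trigger, mirroring the list above.
// All values are placeholder guesses.
const FRAME_MS = 1000 / 60;

function scriptBudgetMs(trigger) {
  switch (trigger) {
    case 'page-load':        return 3000;         // a number of seconds
    case 'network-activity': return 250;          // a fraction of a second
    case 'user-interaction': return 3 * FRAME_MS; // clicks, typing
    case 'mouse-move':       return FRAME_MS;     // local work only
    default:                 return 0;            // idle scripts get nothing
  }
}
```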
I think this is a great idea. It puts pressure on developers and makes experiences better for users. The average American Internet speed is sub-100 Mbps, but average LTE speeds are closer to 12 Mbps, with websites usually opting to use responsive layouts over separate mobile sites. This means you're downloading the full resources of a desktop site, and the mobile device is adjusting to media queries.
5 MB / 12 Mbps is over 3 seconds. That's bullshit. Put pressure on developers, make a better web.
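The arithmetic, for anyone checking: convert megabytes to megabits, then divide by the link speed.

```javascript
// Transfer time in seconds = size in megabits / link speed in Mbps.
function transferSeconds(sizeMB, mbps) {
  return (sizeMB * 8) / mbps; // 1 byte = 8 bits
}

// transferSeconds(5, 12)  -> ~3.33 s on average LTE
// transferSeconds(5, 100) -> 0.4 s on a 100 Mbps landline
```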
Base it on the content of the page, not the person visiting. I've never clicked on a retargeted ad for sneakers that follows me to a tech site, but an ad for (for example) DataGrip on an article about SQL tooling might actually interest me!
Given that, I don't see any point trying anything that somehow keeps ads around. Intrusive online advertising doesn't really need to exist.
A lot of people use their phones to access the Internet and have hard data caps, and webpages don't need to load several MB of JS to display 55k of text. There are certainly use cases for JS that justify that amount of script, but shoving in ads, trackers, and widgets isn't one of them.
It's aimed at limiting resource usage by third parties (ads), and at pages voluntarily limiting their own usage, but presumably browser extensions could add the limits too.
Smaller sites could avoid farming their content out to 2 or 20 locations on the cloud.
The user can already add limits. Try using uBlock Origin with scripts open to inline and first-party scripts and images everywhere ... and nothing else.
If the page comes up blank, bye-bye. Don't visit it any more. They made their choices, let them live with it.
> The situation I'm envisioning is that a site can show me any advertising they want as long as they keep the overall size under a fixed amount, say one megabyte per page.
With minification/compression, I don't see how 1MB could work...
A build tool that scans your JS code and includes only the jQuery functions it has found you to be using; or the equivalent library, etc.
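A toy version of what such a tool does, assuming a crude regex scan (real tree shakers like those in modern bundlers work on the parsed module graph, not regexes; this only shows the idea):

```javascript
// Given application source and a map of library functions, keep only
// the functions the source actually calls. Purely illustrative.
function usedLibraryFunctions(source, library) {
  return Object.keys(library).filter((name) =>
    new RegExp('\\b' + name + '\\s*\\(').test(source)
  );
}
```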
Whoever wrote that comment was a liar and a peddler of nonsense.
That’s a lot less than 1MB
But the developer stuff in the walled garden not so much.
The App Store Connect login/start page has 2 MB of JS, and the App page has >3 MB.
Overall the whole Apple Dev experience is sluggish.