Add limits to amount of JavaScript that can be loaded by a website (webkit.org)
114 points by ingve 20 days ago | 137 comments



> A simple dialog would put the user in control of the situation:

> "The site example.com uses 5 MB of scripting. Allow it?"

Oh God, no. Please don't do this.

1) We know that users presented with these kinds of dialogs don't read them. They just blindly click "OK" to get the dialog to go away. So now you're just annoying people to no good purpose.

2) Even if the user does read this dialog, how is she supposed to know how much JavaScript is too much? There's no context to tell her whether 5MB is a lot, or how it compares to payloads delivered by similar sites. ("The site example.com wants to load more code than 99.99% of other sites you visit" would be slightly more helpful, but only slightly.) It just expects her to have a strong opinion on a subject that nobody who isn't a coder[1] themselves would have an opinion about.

3) If the problem we're trying to solve here is that code makes sites load slowly, making people click through a dialog isn't going to make the site load any faster. In fact it will almost certainly take the user more time to find and click the "OK" button than it would to just load the code.

4) It's measuring the wrong thing. 1 kilobyte of badly written code can do as much or more harm to the user experience than 1 megabyte of well written code will.

[1] With the possible exception of people on strictly capped data plans, but now we're solving a different problem.


You're thinking about it wrong. The overall effect is not that users will take control over their data, or whatever. The effect is that users will complain that the site is broken. And then the pointy-haired boss will demand that the pop-up is fixed. Now instead of "page weight" being "blah blah nerd bullshit," it is something the pointy-haired boss will demand from ad network partners, and the effects will spread across the web ecosystem.


>And then the pointy-haired boss will demand that the pop-up is fixed.

Which they can do by finding a way to load a lot of JS that doesn't trigger the popup (someone below mentions running eval on a png). This will likely make the site even slower to users but there won't be a popup.

I've found it best to never assume that intelligent creative people will choose the specific approach I want to solve a problem.


It's always wise to be careful how you pick your targets, because people optimise for them in ways you don't expect; pick the wrong ones and you get unwanted outcomes.

I've seen it happen in sales: you focus on add-on warranties at head office, and at store level they convince the customer to buy a cheaper product and sell the warranty on top (now, if the margin on the warranty is high enough to offset that, it might be a smart move), or, in one memorable instance, give everyone a discount equal to the cost of the warranty.

We did wonder how that branch was selling more warranties than the next three best performing.


Yup. I work in healthcare and one story I hear is about readmission rates to hospitals. Basically, you don't want people to have to go back to the hospital after a procedure due to complications. So it makes sense for Medicare to reward hospitals and surgeons with low readmission rates. All good. Except for one little problem: there's one kind of patient that never ever gets readmitted. A dead one. And studies have shown that decreases in readmission rates are correlated with increases in mortality rates (at least for certain procedures).


I don’t see why the runtime/interpreter couldn’t keep accurate count of the amount of code it has loaded at any time. We trust the implementers to handle much harder things already.


Is that really meaningful when a single photo will have far more data than most JS payloads?

Judging it by KB of data just doesn't make sense, unless you apply it to everything. And then it would show on pretty much every site.


JavaScript also has the overhead of parsing the source code in addition to network transfer, which can easily take multiple seconds with a multi-megabyte bundle, and that's a penalty even when the JS is served from the browser cache.


True, but JS with a really inefficient loop can do the same. That's the problem with judging by KB: it isn't really very conclusive. Browsers already alert on long-running scripts; maybe the threshold on that should be brought down instead.


Given the choice between a person with a particular hairstyle and a person who minimizes the value of others by mocking their appearance, I would choose to work with the former.



"Pointy-hair" is not a hairstyle that even exists in reality. It's a Dilbert reference.



I doubt the singer from Prodigy has ascended to any position of corporate authority.


Nevertheless, he does have the hair for it.


> We know that users presented with these kinds of dialogs don't read them. They just blindly click "OK" to get the dialog to go away. So now you're just annoying people to no good purpose.

I would say you're annoying some people so others have the ability to make that choice, and the site developers will definitely see the friction of that dialog as something to be avoided. It incentivizes good behavior.

> It's measuring the wrong thing. 1 kilobyte of badly written code can do as much or more harm to the user experience than 1 megabyte of well written code will.

That depends entirely on your connection, keeping in mind your connection might be from your phone, when you have very poor reception.

The fact that people are immediately assuming it only matters for execution time after loading illustrates part of the problem, which is that developers often forget about entire classes of access and thus don't consider the problems they create for them.


"Your connection is using RSA signatures instead of ECSDA. Do you want to continue?"

"The bits for this website traveled through a router in Europe. Do you want to continue?"

"This website uses some special behavior that isn't part of the w3c spec and works slightly differently in your browser than in other browsers. Do you want to continue?"

"This website was developed by a company that offers its employees below market rate health insurance. Do you want to continue?"

There exist people who'd like to know about each of these things. It would be insane to make them dialog boxes.


> I would say you're annoying some people so others have the ability to make that choice

That is justification for opt-in instead of opt-out for this feature. But you lose that incentive you want. Want to make your users happy and have on-by-default protection to incentivize better resource management? Gotta get out of the users' way.


> That depends entirely on your connection, keeping in mind your connection might be from your phone, when you have very poor reception.

Even large payloads on slow connections can be worked around if you know what you are doing. If your site is well engineered to progressively enhance without disruption, it does not matter if JavaScript is being loaded in the background.


> If your site is well engineered to progressively enhance without disruption, it does not matter if JavaScript is being loaded in the background.

Yes but how many places actually make this effort? Seems to me that most places are very content to make some monstrous multi-MB React/Angular/etc app, ship it and wash their hands. The number of websites that have actually gone to the effort of optimising their sites is extremely small, so I'm doubtful when someone says "oh but web apps can do this too, if you only just progressively load/build it properly/etc" because that implies effort that so far, seemingly no one has gone to the trouble of actually doing.


It's not common enough, but I think the situation is getting a bit better. Node and React made it much easier to render your pages on the server side, and I've seen a lot more sites do it.


You’re right that saying 5MB lacks context. Instead they could use bandwidth and latency to offer a "congestion" index the user could gauge before deciding whether or not to load the JS.


Just load the site. If it's too slow for your connection that's a subjective call. Close the window if you're unhappy. Deal with it.

Having to support people who refuse to load more than arbitrary amount X of JavaScript would be ridiculous.


It's not so ridiculous to expect progressive enhancement to work. It takes designers willing to use <a> and <img> tags for that to happen.
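
For what it's worth, the baseline version of that is tiny. A minimal sketch, assuming made-up URL and element IDs: a plain link that works with JavaScript disabled, plus an optional enhancement when it isn't.

    <!-- Works with no JavaScript at all: a normal navigation. -->
    <a id="next-page" href="/articles?page=2">Next page</a>
    <main id="content">...</main>

    <script>
      // Optional enhancement: fetch the next page in place instead of doing
      // a full page load. If this script never runs, the link still works.
      document.getElementById('next-page').addEventListener('click', async (e) => {
        e.preventDefault();
        const link = e.currentTarget;
        const res = await fetch(link.href);
        document.getElementById('content').innerHTML = await res.text();
      });
    </script>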


It is ridiculous when there's so many gradients and things to accommodate.


[flagged]


[flagged]


I disagree.

"They" and "them" are - as far as I am aware (and I work in a highly-charged (some out say overly-so) diversity-aware company) - the currently preferred terms. The sand may well have moved under my feet (this happens every month or so) but as far as I understand this is the politically-correct term for referring to generic "people" without referencing gender one way or another.

For better or worse, I think we can all agree that only ever saying "him"/"he" or "she"/"her" is currently considered unacceptable in English-speaking companies.

I do wonder though how this plays out in companies where gender is a major part of the dominating language. How can you eliminate her/him in a language where even such simple concepts as a table have a gender stereotype (der Tisch, la mesa, la table, etc.)?


> For better or worse, I think we can all agree that only ever saying "him"/"he" or "she"/"her" is currently considered unacceptable in English-speaking companies.

Hi, I'm the person who wrote the (apparently offending) comment.

I am sensitive to and aware of these issues. However, using they/them is not necessarily an obvious slam dunk in these contexts either, as that's technically not grammatically correct when used in the singular. There are those who argue that the singular they is useful enough in our modern gender-fluid context that we should just start using it and let the grammar rules catch up when they will, but this is definitely not a settled currently-considered-the-only-acceptable-choice-everywhere kind of thing.

Grammarly tackled this question on their blog last year (here: https://www.grammarly.com/blog/use-the-singular-they/), and while I think that piece makes some good arguments, note that it approaches the subject as "while non-standard, this is a good thing that people should be doing anyway" rather than "everyone knows this is the only way to approach this." Similarly, Merriam-Webster (see: https://www.merriam-webster.com/words-at-play/singular-nonbi...) judges this a matter of still-evolving usage rather than long-settled grammatical law.

The last thing I wanted to be when I grew up was a grammatical pedant, so please don't take this as an invitation to a war over which usage is the right one. I can see merit in both arguments (though I tend to think the singular they will eventually reach consensus acceptance, just because it's so useful). I just mention it to illustrate that it's not necessarily the case that failure to reach for the singular they automatically marks a writer out as oblivious or sexist.


[flagged]


Why do you assume from a single comment that I "only ever us[e] one gender"? If I were a writer who is sensitive to these issues, and who tries to address them by taking care to mix up using he/his and she/hers when writing, would it not be the case that you might see one comment from me using he/his and then another one later using she/hers?

(And note that for some singular-they advocates "he/she" and "her/him" aren't really any better either, as they omit trans folk and other non-binary people as much as using "he" in one place and "she" in the next does.)

This is a place where the language is rapidly evolving, is all I'm saying, and therefore maybe we should lean towards reading people charitably until such time as a consensus has been reached.


IMHO you're really overreacting here. It's not unusual or bad style to pick a random set of pronouns for a specific imagined person in such a few sentences of scenario.


It is a sackable-offense where I am; using He/Him and She/Her is absolutely forbidden.

I don't agree with sacking people over this but we 100% absolutely need to confront sexism (in every direction, and implicit & explicit) in our industry.


I notice you're a fellow Londoner, so here's my take on your comment from the perspective of someone who is a hiring manager and thus has had the displeasure of dismissing staff:

Even if that is a sackable offence where you work (which I'm highly sceptical about), that would still be breaking UK employment laws and thus the terminated employee would have grounds for unfair dismissal. This is even taking into account how sensitive to discrimination our employment laws are.

Usually what happens when employees are fired over seemingly trivial things is that they already have a string of misdemeanours on file, and really the trivial occurrence is the "final strike" (to use a baseball metaphor).


Using he/him/she/her has absolutely nothing to do with sexism. It doesn't sound like you are in a healthy environment if you can get sacked for that.

We don't need that in our industry.


>where even such simple concepts as a table have a gender stereotype (der Tisch, la mesa, la table etc) //

Linguistic gender and whatever "gender" is supposed to mean for people are entirely orthogonal.

It's not "la table" because it has traits related to the female sex.


This counterargument fits into Paul Graham's hierarchy of debate at level DH4: http://www.paulgraham.com/disagree.html

> Counterargument is contradiction plus reasoning and/or evidence. When aimed squarely at the original argument, it can be convincing. But unfortunately it's common for counterarguments to be aimed at something slightly different. More often than not, two people arguing passionately about something are actually arguing about two different things. Sometimes they even agree with one another, but are so caught up in their squabble they don't realize it.

Whereas you are arguing about what is politically correct, I argued that the "politically correct" usage of they/them has a problem. I actually agree that they/them is politically-correct, and think we should change that.

So your argument does not, in fact, disagree with mine.


Does it?

My understanding is that using "they"/"them" for purposes of gender equality/neutrality is well understood by many people.

EDIT: if that was sarcasm: we can't have perfection, but not excluding ~50% of people seems like an improvement...


Sigh - I am getting heavily downvoted on this.

Please - sexism has no place in our industry.


You're getting downvoted because you're interjecting with an off topic comment that (most people think) adds no value.


As an industry we need to call people out on sexism whenever and wherever we see it.


In addition to sexism, what other exceptions do you believe HN should add to this page?

https://news.ycombinator.com/newsguidelines.html


Perfect is the enemy of good.


This particular solution seems wrong for all the obvious reasons listed in other comments...

But on the other hand, I love the idea of resource-limiting tabs by default, along all of:

- CPU usage (allow an initial burst, but after a few seconds dial down to max ~0.5% of CPU, with additional bursts allowed after any user interaction like click or keyboard)

- Number of HTTP requests (again, initial bursts allowed and in response to user interaction, but radically delay/queue requests for the sites that try to load a new ad every second even after the page has been loaded for 10 minutes)

- Memory usage (probably the hardest one to get right though)

I mean, every so often I've caught things like a Gmail tab in Chrome stuck at using 100% CPU indefinitely because it must be stuck in some infinite JavaScript loop due to a bug, as well as the occasional other random blog that uses constant 50% CPU for no discernable reason... and it would be nicer if this didn't suck up two hours of my laptop's remaining battery time before I discovered it.

If the browser had soft resource limits that just gradually slowed down code execution and HTTP requests to extreme levels, with a discreet pop-up warning in the corner "this site is attempting to use higher than normal resources, click here to allow temporarily/permanently" for the times you're opening a WebGL demo...

...it seems like it would be a big win for users.
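
A rough sketch of the kind of soft throttling described above, written as a page-level shim just to show the shape of it (the budget numbers are arbitrary; a real browser would do this inside the engine, not in JavaScript):

    // Track how much time timer callbacks have spent running; once an initial
    // "burst" budget is used up, stretch out all further timer delays.
    const BURST_MS = 3000;   // arbitrary: callback time allowed at full speed
    const SLOWDOWN = 20;     // arbitrary: delay multiplier after the budget is spent
    let usedMs = 0;

    const realSetTimeout = window.setTimeout.bind(window);
    window.setTimeout = (fn, delay = 0, ...args) => {
      const penalty = usedMs > BURST_MS ? SLOWDOWN : 1;
      return realSetTimeout(() => {
        const start = performance.now();
        fn(...args);
        usedMs += performance.now() - start;
      }, delay * penalty);
    };

    // Give some budget back after real user interaction, as suggested above.
    ['click', 'keydown'].forEach((type) =>
      window.addEventListener(type, () => { usedMs = Math.max(0, usedMs - 1000); })
    );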


Right, you definitely want something that is somewhat more adaptive if you want to solve this problem. Something to the effect of "as you use more resources you slow down those resources."

What's more interesting to me is the broader economic question. Do users end up flocking to those browsers because it makes the browser more snappy for the other web sites and contexts? Or do they ditch them because they are necessarily slower than browsers that don't? Is this sort of decision, in other words, a sort of suicidal option for browsers?


The need for the user-friendly resource limitations in browsers is real.

E.g.

https://www.coindesk.com/firefox-announces-move-to-block-cry...

I know that the solution presented in the article didn't work (in either Firefox or Opera), and I know that from a person close to me who is not "technical" but who wanted to access some sites that slowed down their computer (the sites indeed managed to use exactly 100% of the CPU, that is, all cores at once, and to regularly contact other sites with "coin" in the name). Then I showed the person that JavaScript can be blocked and that the computer is completely responsive that way... and you know the result: even non-technical people do what's rational, once they know that they have a choice.


Well, the browser publisher can simply make a setting or a fork (because that change looks large), and see what wins.


As a power user I’d appreciate having those individual granular controls. Maybe for non power users, have a reasonable default threshold determined by the browser for each of those parameters, and give them a single off control to let a particular tab “run wild”.

Unlike most of the commenters here, I agree this is a problem that needs to be solved by browsers. JavaScript blockers work but are unnecessarily big hammers.


You could democratise the config -- allow users to choose the median setting made by those who manually intervene? Power Users get granular control, other users get sane (?) defaults?


Note that we already have examples of a certain resource limit. WebSQL and IndexedDB databases are capped to a maximum size per domain, and different browsers have different caps and different UX for configuring/bypassing them.

Granted, those technologies are not as widely used and that limit, if absent, would not inconvenience the user as quickly and easily as CPU/memory hogging does.
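
For reference, pages can already ask the browser about that particular quota; a quick check for the current origin (support varies by browser):

    // Ask the browser how much storage this origin is using and is allowed.
    if (navigator.storage && navigator.storage.estimate) {
      navigator.storage.estimate().then(({ usage, quota }) => {
        console.log(`Using ${(usage / 1e6).toFixed(1)} MB of a ${(quota / 1e6).toFixed(1)} MB quota`);
      });
    }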


As a person who has dozens of tabs open all the time to do cross-comparisons, I would gladly opt in via settings to enable and tweak such limit values. Exceeding them would then trigger an alert if I truly am interested in allowing an exception to the rule.

Another thing could be to allow the site to read those limits and consider what to load. Kinda like media queries.


> CPU usage (allow an initial burst, but after a few seconds dial down to max ~0.5% of CPU, with additional bursts allowed after any user interaction like click or keyboard)

Funny how Mozilla spent so long trying to make Firefox multi-process, and right after they got there it was decided that we had to throttle those processes anyway.


The author's premise includes a lot of personal opinion and simple gripes against JS. Not a fan of this. What's the limit going to be? 1MB? 5? Why is it the browser's job to police that?

I run adblock primarily because I don't want to see ads and get tracked by sketchy sources. Resource usage is a minor bonus. If that hurts people's pockets then so be it; most ads are out of hand and need to be tempered. Most people who run adblock aren't going to click on the ads anyway.

With that said, I do hope we're able to figure out how to treat web "sites" and web "apps" differently - for the former, I want as little JS as possible since that just gets in the way of content, but for the latter, the JS is necessary to get the app running, and I don't mind if it's a few megabytes in size.


While I'm not a huge fan of JavaScript-heavy sites, I do think it's a little disingenuous to drag content blocking in as part of the argument. Ads have little to do with his basic argument; they're just there to make the sales pitch go down a little easier.

It's also not very effective as an ad-blocker or anti-tracking tool.


"640k ought to be enough for everyone."

Setting some arbitrary hard-limit is counter-productive and shortsighted. That quote is infamous for exactly the same reasons that this claim is flawed.

Lots of unanswered questions here. E.g. what happens if a 500-byte JavaScript file then does a load of XHR requests that download more JS? Does the connection just get closed as soon as you pass 5MB? That will break the internet for lots of users. Plus, what happens if I download a couple of kilobytes of JavaScript that ends up allocating many gigabytes of memory? Is that OK because it was under 5MB of over-the-wire bytes?

I can easily imagine a 5MB+ web application. Sure, you might not like that, but that does not make it a bad situation.

If you don't like JavaScript, then the answer is simple: disable it. If you want to use JavaScript but feel like a particular website is using too many resources, then the answer is simple: don't visit it. You have a choice.


I can imagine a 5mb+ web application too, but there should be some way of communicating something needs a lot of resources before it is casually downloaded.

It's impolite for an installer to not tell you it's copying 50GB of data to your hard drive before it starts.

"640K ought to be enough for everyone" is shortsighted, but "Web applications should not assume the user wants to dedicate all system resources to them without knowledge/permission" is not.

It won't matter much longer. Websites are giving way to phone apps, where no visibility into behavior or resource consumption is normal, standard, and expected.


> If you want to use JavaScript but feel like a particular website is using too many resources, then the answer is simple: don't visit it. You have a choice.

You don't know how many resources it's going to use until you've already visited it.


I don’t know who the author of this proposal is, but I believe this should not be pursued and that his idea is uninformed.

Content blockers try their best to block and remove specific parts of the web page, surgically identifying them. You can’t just randomly block stuff, thereby breaking an insane number of websites, and call it a day.

What about web apps? Should there be a «this application wants to exceed the xMB quota» confirmation message? Oh please no.


The author's name is on the proposal, Craig Hockenberry. To avoid making folks Google for him, he's been running the Iconfactory, a design and programming consultancy that does, yes, icons, as well as web sites and applications. They're most famous for Twitterrific, the very first Twitter client (and the one that gave us the word "tweet" for Twitter posts as well as the blue bird icon). They've been in business over twenty years. Hockenberry is himself both an app developer and a long-time web developer.

I'm not sure I particularly agree with this proposal, either, for a lot of reasons outlined in comments here (most broadly that this is putting too much of a burden on users, and that it's just likely to create a new round of cat-and-mouse games with bad actors to find ways around it), but the author certainly isn't inexperienced in web and application matters.


Also worth noting that he just launched a competing ad network: https://blog.iconfactory.com/2019/01/advertising-with-ollie/


As near as I can tell the Twitterrific ad network literally only serves in-app ads to users of Twitterrific.


Exactly what I feared. It looked suspiciously well written to have come from someone not versed in the topic. I stand by my opinion: this proposal is antithetical to the web itself. (That doesn’t change the fact that I’ve always admired their work, now that I know who he is.)


You should be making your pages work without javascript. Progressive enhancement has been a best-practice for years.

I routinely use the fast JS switcher extension to turn off javascript on any site that does any number of irritating things (popups, autoplay videos, etc.) If those sites don’t work any more, then I just never visit them again. Their loss.


> What about web apps?

I think it's important to make this distinction. A lot of websites don't need the JavaScript they ship with to present the material the users want. I agree those websites should work without JavaScript.

However, you can't expect actual web applications to do this.

Examples

- https://www.draw.io/

- https://www.audiotool.com/app/ (sign in required; demo https://www.youtube.com/watch?v=gvIu_R8bmdA)

- https://human.biodigital.com/signin.html (sign in required; demo https://www.youtube.com/watch?v=dW4JSMlBhWQ)

- https://pixlr.com/x/

Other more commonly known examples

- Most of Google Drive offerings

- Google Analytics control panel

- Twitch

I see this confusion a lot when this topic is discussed on HackerNews.


Webapps can be implemented just fine without Javascript; I’ve been doing it for years. React is a luxury, not a requirement, and progressive enhancement is a thing in all but the most demanding apps.

The point is, I currently exist in a world where I randomly shut off all javascript on websites, and it’s a happy world. Personally, I’d be tickled pink if it was a built-in browser feature (and if it forced web devs to be more responsible? Bonus!)


No, not in every case can they be built without JavaScript. That's an insane statement to make.

Example: Build a usable decent image editor that allows you to preview your changes without the page refreshing or redownload an entirely new image (therefore using even more data).

Example 2: Make a playable 3D game

Example 3: Make a spreadsheet web app

The list goes on. Some things can of course be done without JavaScript but tons of ideas would not only be infeasible to develop, they'd be a usability nightmare.


Nobody said anything about “every case”, but I’m here to tell you that most webapps can be written without it. How do you think webapps were written before React came along?

I do it every day, and there’s absolutely no reason that progressive enhancement should be ignored today, except that front-end devs have forgotten how.


You implied "every case" when not addressing the list of web apps, while also rejecting my premise that there is a distinction between "websites" and "web apps" by stating:

> Webapps can be implemented just fine without Javascript; I’ve been doing it for years.

You never addressed any of the examples, so I don't see why it's wrong to assume you believe that the listed applications can be implemented, in the browser, without JavaScript.

You flat out rejected the idea.

> How do you think webapps were written before React came along?

jQuery / Prototype / ExtJS; and others were written in Flash or Java.

I think it's fine to want parts of the internet to work without JavaScript or without any programming language involved. I highly suggest you check out IPLD[0]. I've always thought that a graph browser may have been the original intent of the www when Tim Berners-Lee wrote the original web browser.

It would be fascinating if the entire internet functioned as workers extending the graph that is the internet: web applications adding to the graph, while web browsers walked through it. There is a lot to be imagined. I think there is room for both applications and data to exist together.

[0] https://ipld.io/


Why do you want any of those things to run inside a browser? Those all sound like great candidates for being actual installed software. Do we even remember how to write that?

This everything-is-a-website mentality is a disaster for privacy, security, and the future of decentralized general-purpose computing.


Many reasons why:

- Convenience of just going to a site and not installing anything.

- Cross platform, if you have a browser most web apps will work.

- Being relatively certain that by just visiting a web app you're not going to end up with malware or other unwanted software

- Being able to access your app from other locations without admin access.

- Updates are ready and available instantly

There are of course many many more benefits.

I get that not everything has to be a web app but it's disingenuous to imply that they have no use cases.


Both Java and C# tried to break the web (silverlight and Java FX) and both ended up being considerable failures.

We still don't have a reasonably easy way to write cross-platform applications (Mac/Win/Lin/Phone) in a native-first language. The reason JS has taken off is because it filled a rather HUGE hole.

If there is a chance that is changing, let us know, but right now, the hole isn't being filled with anything else.


JavaFX is a GUI toolkit. It does allow the use of CSS for design, but it's nothing like Silverlight, Flash, etc. It's not meant to run in a browser at all. You probably meant Java applets?


> Why do you want any of those things to run inside a browser? Those all sound like great candidates for being actual installed software.

The point is to remove the need to install things.

> security

An image editor that runs entirely in the browser and only has sandboxed access to open and save dialogs is more secure by miles than running a random .msi installer.


Curious how to implement https://github.com/jgraph/drawio without JS?


1) The applications you develop/design (I infer) have a very limited kind of interaction, or have no need to support offline conditions and/or real-time data. This is not the case for the rest of the world.

2) What you propose is completely different from the proposal in the link. I’d be ok with a JS-off-by-default world (or a cookie-off-by-default one too!!) but having a limit on file size is just… myopic.


You’re right on the second point: I think it’s somewhat silly to set an arbitrary size limit (an easy javascript off button would be fantastic, however.)

Your first point misses the argument. It’s not that you should support everything with JS disabled; you should, however, support a subset of your interaction as a fallback. Yeah, users don’t get offline mode or whizzy ajax updates, but you can absolutely find a way to work, in nearly all apps. And it isn’t that hard.

The fact that so many people believe this is impossible is an indication of how low the front-end development bar has fallen. This is why we have webpages that take 10MB of JS to render 300k of text.


> I’ve been doing it for years.

Who is asking you to do this? I have not had a single request -- ever -- in 15 years of web development. Hey, more power to you if you want to disable JS, but the world will be leaving you behind.


I’m one of the people requesting this kind of thing. Just because your helpdesk or sales is not forwarding my requests to you doesn’t mean I haven’t made them.

I also ask my ISP to support ipv6, but I imagine the same thing happens there. Engineers never get the memo because it’s deemed unimportant by people who make decisions.


Things that work without JS are typically much faster, more responsive and more consistent in terms of user experience. A competitive advantage over what young frontend developers produce these days.


Speaking as a web developer myself, I 100% abandoned supporting users without javascript a decade ago

The number of people with JS turned off are so small they don't matter, and on top of that, they're not monetizable, so I really don't care if the site works for them.

There is virtually never any business rationale for making that extra effort.


All the more reason to put the button in the browser.

Give people the option to easily turn off the crap, and you’ll find out how much it matters.


> Today's JavaScript developer is acting like they have a 100 GHz CPU with terabytes of memory. And being lazy and uninspired as a result.

Not sure how I feel about a statement this broad. I work mostly with front end code and I try neat things and tricks to reduce resource use all the time. I also always push for the lightest way of doing things in PRs. I’m certain I’m not the only engineer who thinks like this.


I agree. I don't do frontend stuff, but I sit next to the team that does. They take a lot of pride in making fast-loading pages that render quickly and respond instantly. That's not compatible with writing giant, bloated, inefficient code.


>They take a lot of pride in making fast-loading pages that render quickly and respond instantly.

Maybe in your company. My experience browsing modern front-end SPA design however begs to differ. They pride themselves in loading megabytes of uncompressed images, layers of invisible background images, frameworks to load frameworks, auto-download/auto-playing 4k videos...


No one prides themselves on any of those things.


And yet they still do it.


I agree with you, but you have to keep in mind that pages which load fast on a fiber connection and render and respond quickly on the latest MacBook don't guarantee the same for the average user.

Giving devs 4-year-old laptops would help, but when the shiny startup is offering new MacBooks, how can you compete?


Giving devs 4-year-old laptops when they only need them for testing seems, erm, strange and limiting.

Just have them test under processing and bandwidth limits?


Definitely can't generalize, but the average macbook pro sporting web dev often forgets what their users deal with. I had conversations at work where I pointed out the data we had on our users (and how they weren't even close to having that kind of hardware and internet connection), and no one would believe me until they saw the data themselves.


> I think it's time we start looking at the problem differently. It's resource abuse that's the root cause, so why aren't there limits on those resources?

This is a flawed assumption, I think. Sure, maybe _some_ people are blocking ads because all that JS and network IO eats up battery, but I bet more people block ads either because they dislike advertisements generally or because they dislike advertisements based on surveillance. Whether you manage to cram your ad-serving code into 1MB or 1KB is irrelevant for the latter two groups of people.


Please disable your adblocker!

Do you accept our cookies, tracking and privacy policy?

Would you like to subscribe to our newsletter?

Please give us access to your location!

This site is way better if you download our app!

Please allow us to load another 30MB of JavaScript!


What's stopping a site from loading JavaScript using a side-channel (encoded inside a PNG, for example), and then eval-ing the payload? I guess you can prevent this by limiting the number of characters you can send to eval, but I'd imagine it'd break a lot of sites. Plus you can probably bypass the eval limit by rolling your own eval (basically writing a JavaScript interpreter in JavaScript).
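
For illustration, that side-channel is only a few lines. A rough sketch, assuming the script text was packed one byte per pixel into the red channel of a same-origin PNG (the file name is made up):

    // Recover script text hidden in a PNG's pixel data and execute it.
    const img = new Image();
    img.src = '/innocent-looking.png';          // hypothetical image carrying code
    img.onload = () => {
      const canvas = document.createElement('canvas');
      canvas.width = img.width;
      canvas.height = img.height;
      const ctx = canvas.getContext('2d');
      ctx.drawImage(img, 0, 0);
      const { data } = ctx.getImageData(0, 0, img.width, img.height);
      let code = '';
      for (let i = 0; i < data.length; i += 4) { // RGBA: every 4th byte is red
        if (data[i] === 0) break;                // NUL terminator
        code += String.fromCharCode(data[i]);
      }
      eval(code); // none of this ever shows up as a script download
    };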


Simple: stop loading those PNGs or strip them of all not-image content. The fact that you can inject code into an image sounds like a security flaw waiting to happen.


Any medium can be encoded and decoded. You transpile your code into pixels, CSS, whatever medium is considered "ok." Images just happen to be large and so can be used to hold a lot of information. You can't just inherently know what is and isn't encoded information and strip out "the bad stuff."

Those pixels do not execute themselves (or cookies, or headers, or whatever we are still allowing). They still need an interpreter. Interpreters are indeed security flaws. Pushing people away from scannable code and towards hiding their code in other filetypes will cause a proliferation of interpreters, and a proliferation of security flaws.


Image files should only be for images, and what little metadata needed to display them properly. Anything else should be discarded.


What things should and should not be does not affect what they can be. The point is that the image data can be encoded. It's not "extra," it's the actual image itself being used to relay additional information covertly. How do you differentiate between which pixels are "good" and which ones are "bad"? There is no way to know.

To process such a file, you DO need an interpreter that follows the "rules" of the encoding, but the interpreter can be small, concealed, and varied, resulting in a cat and mouse game if you are trying to block such implementations.


Can you define "not-image content"? Steganography begs to differ.


Simple: Any content that isn't required to generate the image.


In this case, the JavaScript is the image. It's all just bytes; they can be decoded as pixels or as code.


IMO the critical thing needed to stop JS bloat is a decent standard library. The _vast_ majority of code on an average site tends to be the same thing implemented 45 times because the stdlib in Javascript is anemic. The current process of adding a function at a time to the language spec is rather pointless at solving this issue since it's often years after the function is added before anyone can realistically rely on it existing. In the meantime we have to keep loading polyfills just in case. Also at the current rate of functionality additions it'll be decades before the stdlib is decent.

What I really want to see happen is WHATWG start a standard lib project. In my mind it would look something like this:

* Open source project that implements the standard library in Javascript only and holds the canonical test suite. Something akin to core-js but the 'official' version of it and with features outside the language spec.

* Every browser agrees to preload / cache all recent versions of the standard library (or most recent version for each major release assuming semver). This allows all programmers to load the version they need without concern for the performance hit.

* User loads the standard library via a script tag with src something like: https://ecmascript.org/stdlib/v1/lib.js. Doesn't really matter what it is, just some recognized URL the browser knows about that encodes the version.

* Each browser can provide a native implementation of anything in the stdlib so long as it passes the spec. Browsers could even optimize which pieces of the stdlib it parses / loads based on this. e.g. If the browser has a native Promise implementation then it doesn't need to load the Promise code from the stdlib.

* Be reasonable but aggressive about adding to the stdlib. Its scope should be wide and cover common use cases. e.g. I shouldn't have to write a URL parsing class or a throttle function every time I start a new project (which is where we are today). There are plenty of projects to look at for learning what people need (lodash, etc).

Obviously this would not cover everything; it's not going to add `await` to browsers that don't support it. But I think we're at the point where what we are most in need of is not language features (and mostly these are transpilable anyway), but stdlib functionality.
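
Concretely, the loading side of the idea could look like this (the URL and the property accessed are hypothetical, per the proposal above):

    <!-- A browser that recognises this URL serves its own cached or native copy;
         any other browser just downloads the file once and caches it normally. -->
    <script src="https://ecmascript.org/stdlib/v1/lib.js"></script>
    <script>
      // Pages could then rely on the library being present, e.g.:
      console.log(stdlib.version); // hypothetical global and property
    </script>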


Hmm, not sure I like this idea. I'm a very pro-PWA/webapp sort of person, so anything that restricts flexibility on the webapp side annoys me. Especially given the number of native apps that are equally bloated and virtually never share or reuse code between apps (think of the 10MB native note-taking apps out there).

Perhaps we just let quality rise to the top? If your website is full of ads and slow, then someone else with better, less annoying execution will eventually win more users.


I use Super Stop and I'm in a habit of hitting shift+escape on basically every website I visit. It kills all AJAX calls and content loading. 9/10 websites load perfectly in the first second and stopping them after that just prevents popups, tracking, etc. It would be nice to have a plugin that automatically stops websites after some preset interval.


Uh... this is wrong on so many levels.

What is going to be the limit?

Who is going to be setting it?

How is it going to be calculated? Is it the amount of loaded files? What if the page is small but then loads more dynamically later?

How do I, as a developer, make sure my website works everywhere?

What about sites that use a lot of code but still work fast? The amount of code does not directly translate to slowness. You can have little code and a slow website, or a lot of code and a fast website. As an example, single-page applications tend to load all of their code but still manage to be responsive.


If I was responsible for setting the agenda...

> What is going to be the limit?

Whatever the user wants it to be.

> Who is going to be setting it?

A default from the browser, which can be overridden by the user.

> How is it going to be calculated? Is it the amount of loaded files? What if the page is small but then loads more dynamically later?

Preferably by a total per-site and per page (with page being a higher amount than site). When a site/page hits its limit, halt running with a popup asking for a temporary larger limit (and a checkbox to remember the limit for this site).

> How do I, as a developer, make sure my website works everywhere?

The same way you, as a developer, make sure it works everywhere with Javascript features. You find a sane maximum by the lowest of all the browsers that support it. If you figure out what IE, Chrome, Firefox and Safari (and their mobile equivalents) have as limits, you've basically solved the problem.

> What about sites that use a lot of code but still work fast? The amount of code does not directly translate to slowness.

Depending on your connection, it really does. Bad mobile connections (even if only in certain locations, like supermarkets that are constructed like Faraday cages) mean that an increase in the size of the JS required to use the page directly translates into an extremely slow website. And in a lot of contexts the first page is the only page visited, so the idea that an SPA will be quick after that first load (to head off that counterpoint in advance) doesn't really solve the problem.


> What is going to be the limit?

He suggests 1MB; personally I still think that's too much.

> How do I, as a developer, make sure my website works everywhere?

1MB for a single page isn't enough?

> What about sites that use a lot of code but still work fast?

It's fast if the resources are near your device (CDN); most websites don't do that at all.


I think we already do this. It's built into the fabric of the web. Not sure why we are proposing arbitrary limits and additional user opt-in for this.

Whether a website wants to be inefficient or has a good reason for having a lot of resources, it impacts the speed with which the site loads. The slower the site loads, the more users are going to abandon it. Stats say that by 2 seconds most users have abandoned a slow-loading site. The caveat is that if the user feels the site is really worth the wait, they will wait longer.

So seems everything in this proposal is already addressed...


> 1mb for a single page isn't enough?

What about single-page apps where the whole app is "a single page"? With this proposal, even lazily loading the extra bits would hit the limit.


I strongly suspect the true purpose of this would be to make javascript unusable for all but the most trivial cases, and to train end users to fear it like a virus, or find its presence annoying, making it an anti-feature. Killing SPAs would probably be a feature in that case.


This author has got to have some personal hatred of JavaScript. Otherwise, it just does not make sense.


I 100% agree with this:

"But there's a downside to this content blocking: it's hurting many smaller sites that rely on advertising to keep the lights on. More and more of these sites are pleading to disable content blockers"

The solution proposed is unworkable for all the reasons others propose. But there really is a baby/bathwater situation with ad blockers that attempt to kill all advertisements.

I don't know what the solution is, but I think it is an important problem and deserving of a more sophisticated solution than ad blockers or ones like the solution proposed. People here are smart; I hope we can do more than just state reasons why we can't do better than what we are doing.


Most people block ads because they don't want to see ads, not because they want less javascript. The solution is to find a better business model, not make smaller ads.


I think it is a small subset of people that see things that black and white. I agree it is not simply "wanting less javascript", but I think a large number of people would not mind small ads that have minimal bandwidth requirements, minimal cpu usage, minimal slowing down the page, that don't pop up in your face, are not visually distracting, and are relevant to the material on the site.

I don't think "get a different business model" is a reasonable option for many web sites, unless the whole point of the site is to support a product (for instance Adobe's Photoshop web page, or whatever).


A long, long time ago I wrote down my set of rules for acceptable ads: https://news.ycombinator.com/item?id=10521930

Really, we block ads because they are intrusive, disruptive, invasive and bloated pieces of malware. Simple as that.


> Content blockers [...] prevent abuse by ad networks and many people are realizing the benefits of that with increased performance and better battery life. But there's a downside to this content blocking: it's hurting many smaller sites that rely on advertising to keep the lights on. [...] In effect, these smaller sites are collateral damage in a larger battle. And that's a big problem for the long-term health of independent content on the web.

I've got another way to solve this problem.

Content blockers should operate on a blacklist instead of a whitelist. Advertisements appear by default, but if a given site annoys you enough, you can go into your ad blocker's preferences and add it to your blacklist, and then you'll never see any ads on that site again.

Why is this not even an option in any of the major adblockers? I know it's possible in Ublock Origin via tweaking advanced settings[1], but it should be a built-in, user friendly feature.

[1] https://github.com/gorhill/uBlock/wiki/Dynamic-filtering:-tu...


>Advertisements appear by default, but if a given site annoys you enough, you can go into your ad blocker's preferences and add it to your blacklist, and then you'll never see any ads on that site again.

That's literally what ABP has. Ads which meet the acceptable-ads criteria appear by default, and the rest are blocked by default. Users can however go and disable that setting so that all ads are hidden by default.


A built in whitelist ≠ a user created blacklist


Easylist would be a user-created blacklist, right? Every ad blocker I know of uses that.


I mean per site that displays ads, not the ads themselves.

Right now, most people I know who use adblockers will "whitelist" websites they want to support, and/or who display ads they find nonintrusive.

I really think the default should be switched, so websites can display ads by default but a user can blacklist domains they dislike. At minimum, it should be possible to switch to this functionality.


Just thinking about this problem and the proposed solution is making me flustered.

Remember those popups in IE, "Do you want to continue running scripts on this page?" Nobody knew what they were, where they came from, or what they were talking about. Even worse than that was the fact that nobody knew what the Yes/No button was for, and if you clicked "Yes" sometimes it would just keep popping up over and over again.

They got rid of that for a good reason (although its still alive and well in MSHTA.exe). It sucked and offered nothing of value to most users.

On the other hand, the New York Times's website has 5 external scripts that total over 1MB of just text. They've got their CMS, 4 DIFFERENT ad agencies, analytics, and some other crap the site probably never "needed" before 2010. That's disgusting. And that's what you get from a $300m+ company that RELIES on technology to stay in business. Missing a feature? Fuck it, just have the client pull down another 400KB of JS from a sixth CDN. Never mind that the actual CONTENT that attracted the user in the first place is measured in bytes.

So while I think this is horribly misguided, I do agree that something needs to be done to disincentivize lazy JavaScript programmers from pasting all their problems away. Perhaps the lock icon in the address bar should also turn yellow or red to reflect external script resources? Or maybe whenever multiple frameworks are combined or multiple ad networks used?

Almost every page of my WP website has just 8 internal resources and a page size of <310kb. There's no reason NYT can't stay in that ball park with their deep pockets. Megabytes of tracking and bullshit for basically a white page with text is disingenuous.


We're in a kind of interesting economic situation when it comes to consumer compute and bandwidth resources. There's no "billing" for compute time on the server side nor on the client side. Bandwidth costs are also negligible on most connections. This means there are no pressures to efficiently utilize them beyond user annoyance (which usually isn't a strong enough motivating factor to modify behavior). Is the solution some sort of bidirectional billing scheme?

"This page will cost $0.02 to retrieve. Pay by running 20FLOPs of code?"

"OK"

"This page will cost $0.05 to retrieve. Pay by running 500FLOPs of code?"

"No, pay via $fiat_transfer_method"


One thing I would love personally is the ability to restrict JS usage to a CPU limit, which increases in response to certain events and gradually reduces to 0 if unused. This could be:

* A number of seconds on page load,

* a fraction of a second in response to network activity (eg. image loads),

* a small number of frames in response to significant user interaction (mouse clicks, typing),

* one frame after less-significant user interaction (mouse movement), limited only to local operations (no network activity).

The vast majority of sites should need nothing more than this, so opt-in to unlimited usage should be fine. Sites where the user has enabled extra permissions like notifications should be allowed extra time for those.
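
Browsers can already measure the relevant signal; a rough sketch using the Long Tasks API of how a page or extension could notice when scripting has blown past some arbitrary budget:

    // Count time spent in "long tasks" (blocks of >50 ms of main-thread work)
    // and flag the page once it exceeds an arbitrary budget.
    let longTaskMs = 0;
    const BUDGET_MS = 5000; // arbitrary per-page allowance

    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        longTaskMs += entry.duration;
      }
      if (longTaskMs > BUDGET_MS) {
        console.warn('Scripting budget exceeded; this is where throttling would kick in.');
      }
    }).observe({ type: 'longtask', buffered: true });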


So it's totally okay when we block advertisers but when developers are the ones under scrutiny, suddenly the user-centric argument is out the window.

I think this is a great idea. It puts pressure on developers and makes experiences better for users. The average American Internet speed is sub-100 Mbps, but average LTE speeds are closer to 12 Mbps, with websites usually opting for responsive layouts over separate mobile sites. This means you're downloading the full resources of a desktop site, and the mobile device is adjusting to media queries.

5 MB / 12 Mbps is over 3 seconds. That's bullshit. Put pressure on developers, make a better web.
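
For reference, the arithmetic behind that number:

    5 MB × 8 bits/byte = 40 megabits
    40 megabits ÷ 12 Mbps ≈ 3.3 seconds of transfer time, before any parse/execute cost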


I really hope interest based advertising dies. I'm not convinced it works and every time it's brought up, someone links to some study suggesting it's not particularly effective. I actually don't mind ads all that much, if they're static, don't track me and are rendered server side (no javascript).

Base it on the content of the page, not the person visiting. I've never clicked on a retargeted ad for sneakers that follows me to a tech site, but an ad for (for example) DataGrip on an article about SQL tooling might actually interest me!


This whole thing seems like adding an arbitrary knob for arbitrary reasoning, considering that it would impact sites that have nothing to do with the author's problems with js.


I think it's incorrect to assume that small websites rely on advertising to keep the lights on. They are the ones who definitely can't make enough on ads to sustain themselves, and are always funded primarily some other way. But at the same time, by running ads they let advertising companies profit from all of them in aggregate.

Given that, I don't see any point trying anything that somehow keeps ads around. Intrusive online advertising doesn't really need to exist.


I came to these comments to see if it was mostly webshit programmers complaining about the horror that would be unleashed if JS had limits, and I was not disappointed.

A lot of people use their phones to access the Internet and have hard data caps, and webpages don't need to load several MB of JS to display 55k of text. There are certainly use cases for JS that justify loading that amount of script, but shoving in ads, trackers, and widgets isn't one of them.


There exists a less blunt proposal to add resource limits to web pages:

https://www.igvita.com/2016/03/01/control-groups-cgroups-for...

It's aimed at limiting resource usage of 3rd parties (ads), and at pages voluntarily limiting their own usage, but presumably browser extensions could add the limits too.


Doing more with fewer bytes is a virtue and a great show of skill among software developers. Outside that infinitesimally small bubble, however... users could barely give a damn.

Also I fear that a lot of this is JavaScript phobia formed by the mindset that the web is supposed to be documents, not full-blown applications. After all, why complain about 2 MB of JS when the page has a 10 MB banner image and an autoplaying 40 MB HD video?


> smaller sites are collateral damage in a larger battle.

Smaller sites could avoid farming their content out to 2 or 20 locations on the cloud.

The user can already add limits. Try using uBlock Origin with scripts open to inline and first-party scripts and images everywhere ... and nothing else.

If the page comes up blank, bye-bye. Don't visit it any more. They made their choices, let them live with it.


Surf (a suckless WebKit2 frontend) can be configured to disable JS by default; if you really need it, hit ctrl-alt-s and voilà, the junk gets executed. Browsing without JS is wonderful. https://surf.suckless.org/


I am not sure the proposal is solving the right problem:

> The situation I'm envisioning is that a site can show me any advertising they want as long as they keep the overall size under a fixed amount, say one megabyte per page.

With minification/compression, I don't see how 1MB could work...


I believe a huge saving across the web could come from something much simpler, considering the ubiquity of jQuery.

A build tool that scans your JS code and includes only the jQuery functions it finds you actually using; or the equivalent for whatever library, etc.


This exists already. Modern JS build tools are actually pretty sophisticated.
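
For example, with ES modules, bundlers like webpack and Rollup tree-shake unused exports, so importing one helper doesn't drag in the whole library:

    // Pulls in only debounce (and its internal dependencies), not all of lodash.
    import { debounce } from 'lodash-es';

    window.addEventListener('resize', debounce(() => {
      console.log('resized');
    }, 200));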


Example.com does not use any scripting, let alone 5mb worth. In fact, it only transfers 1.24kb of plain CSS/HTML.

Whoever wrote that comment was a liar and a peddler of nonsense.


lol, good luck getting this through in Safari with the current state of Apple's websites.


So. I figured you were probably right, so I went and checked Apple's websites for their largest products to see how bad it was.

The largest sum of JavaScript was on the Mac Pro (2013) page with just under 200kb.

That’s a lot less than 1MB


Sure, their sales pages are in shape.

But the developer stuff in the walled garden not so much.

The App Store Connect login/start page has 2MB JS and the App page has >3MB.

Overall the whole Apple Dev experience is sluggish.


Love it, I believe Chrome mobile already completely kills JS on GPRS connections.


The ability to restrict CPU and memory consumption per page would make more sense.


Disable Javascript by default and make turning it on the exception. The web would be a much better place.



