
Limiting JavaScript? - happy-go-lucky
https://timkadlec.com/remembers/2019-01-31-putting-some-limits-on-javascript/
======
saurik
> Sizes are being used as a fuzzy proxy here which makes sense—putting a cap
> on CPU usage and memory is a lot harder to pull off. Is focusing on size
> ideal? Probably not. But not that far off base either.

No: this _doesn't_ make sense :/. The core problem is that stuff sits around
executing some tiny input handler or animation in a loop, burning CPU. When I
have tracked the worst-performing tabs down to the code causing the problem,
it is never a large amount of code: it is some stupid mechanism that polls
the position of something (like the cursor or the scrollbar), or is trying to
push some analytics to a server.

This really has _nothing_ to do with the amount of code being downloaded. I
realize some people complain about how much stuff they have to download, but
that just isn't what is actually causing most people problems. Sure, tracking
CPU is sort of annoying, but it absolutely isn't hard. Chrome is already
running these things in separate processes (for security), and the operating
system is tracking the time used by each thread: you can just ask it and
enforce some kind of limit if that is what you care about.
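To make that concrete, here's a minimal sketch (my own illustration, not
anything a browser actually ships; the 50 ms budget is arbitrary) of reading
the CPU accounting the OS already keeps and tripping a limit, via Node's
`process.cpuUsage()`:

```javascript
// Sketch: a watchdog that reads the OS's per-process CPU accounting and
// trips once the process exceeds an (arbitrary) CPU-time budget.
const BUDGET_MS = 50; // illustrative budget, not a real browser policy

function cpuMillis() {
  // user + system CPU time consumed since process start (microseconds)
  const { user, system } = process.cpuUsage();
  return (user + system) / 1000;
}

function overBudget() {
  return cpuMillis() > BUDGET_MS;
}

// Burn CPU until the watchdog check trips, to demonstrate the idea.
let sink = 0;
while (!overBudget()) {
  for (let i = 0; i < 1e5; i++) sink += Math.sqrt(i);
}
console.log(`budget exceeded after ${cpuMillis().toFixed(1)} ms of CPU`);
```

A real enforcement point would live in the browser's process manager, but the
accounting itself really is this cheap to query.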

I mean, in this article I see ideas for size limits for images, which is at
least consistent... but that is going way way too far: 1MB just isn't good
enough for a reasonable image. If you care so much about bandwidth, make a
bandwidth cap for the page and if it exceeds it--across all media--figure out
some way of blocking or punishing the site.

What most of us care about is that there seems to be no limit on the CPU usage
of any given page. This is easy to fix--it is a virtual machine, after all!--
by just doing the same trick Erlang uses: compile code into preemptible
fibers, then limit their execution time slices.
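That Erlang trick (reduction counting) can be sketched as a toy model, with
generators standing in for fibers; the slice size of 3 reductions is an
arbitrary choice for the demo:

```javascript
// Sketch: Erlang-style preemptive scheduling via reduction counting.
// Each task yields after every unit of work; the scheduler forcibly
// rotates tasks once a slice's worth of reductions is used up.
const REDUCTIONS_PER_SLICE = 3; // arbitrary slice size

function* task(name, work) {
  for (let i = 0; i < work; i++) yield `${name}${i}`;
}

function schedule(tasks) {
  const trace = [];          // the order in which work units actually ran
  const queue = [...tasks];  // run queue
  while (queue.length) {
    const t = queue.shift();
    let finished = false;
    for (let r = 0; r < REDUCTIONS_PER_SLICE; r++) {
      const step = t.next();
      if (step.done) { finished = true; break; }
      trace.push(step.value);
    }
    if (!finished) queue.push(t); // preempted: back of the run queue
  }
  return trace;
}

// Two 5-unit tasks interleave instead of one monopolizing the scheduler:
console.log(schedule([task("a", 5), task("b", 5)]));
// → ["a0","a1","a2","b0","b1","b2","a3","a4","b3","b4"]
```

No task can run longer than one slice before the scheduler takes control
back, which is exactly the property a runaway page script lacks today.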

What I know I care a lot about is when a tab I haven't looked at in three days
is suddenly using CPU time _at all_. Just make it so background tabs get
severely limited in their ability to do background execution and eventually
get stopped entirely, and the problem is essentially solved.

(Chrome, which is apparently already big on these size limits, doesn't do
this, and I swear it is because it is against Google's interests to do it as
it mostly makes it more difficult to do stuff like tracking and advertising
:/.)

~~~
jefftk
> What I know I care a lot about is when a tab I haven't looked at in three
> days is suddenly using CPU time at all. Just make it so background tabs get
> severely limited in their ability to do background execution and eventually
> get stopped entirely, and the problem is essentially solved.

Chrome does throttle background tabs:
[https://developers.google.com/web/updates/2017/03/background...](https://developers.google.com/web/updates/2017/03/background_tabs)

It doesn't throttle them all the way to zero, though. If they did that they'd
break things like sites that change their favicon to signal "unread message".
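For reference, the throttling in that link is budget-based: a background
page's timer budget regenerates at roughly 1% of wall-clock time, and a timer
callback only fires while the budget is non-negative. A toy simulation of
that policy (my own model, not Chrome's code; integer milliseconds keep the
arithmetic exact):

```javascript
// Toy model of budget-based background timer throttling: the budget
// regenerates by REGEN_MS per one-second tick (~1%), a callback may only
// fire while the budget is non-negative, and its cost is then deducted.
const REGEN_MS = 10; // ~1% of a 1000 ms tick

function simulate(callbackCostMs, ticks) {
  let budget = 0;
  let fired = 0;
  for (let t = 0; t < ticks; t++) {
    budget += REGEN_MS;  // budget regenerates each tick
    if (budget >= 0) {   // may only fire while not in debt
      budget -= callbackCostMs;
      fired++;
    }
  }
  return fired;
}

// A 100 ms callback scheduled every second ends up running roughly once
// every ten seconds in a background tab:
console.log(simulate(100, 100)); // → 11
```

So heavy callbacks get squeezed hard while cheap ones (like a favicon swap)
still get through.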

(Disclosure: I work at Google, though not on Chrome)

~~~
tokyodude
Yes. Without background processing, every web-based service that has live
updates would break. Email (Gmail/outlook.com/...), chat
(Slack/Messenger/WhatsApp/Discord), SNS (FB/Twitter): even Stack Overflow and
GitHub have various forms of live update.

~~~
int_19h
On mobile platforms, apps don't use polling loops for that sort of stuff,
precisely so that power usage can be optimized. Time for something similar on
the web?

~~~
omnimus
Aren't notifications the same thing, with the only difference being that they
go through Apple's/Google's centralized server, so your phone doesn't have to
listen to multiple servers but just one?

------
est31
Modern computers are so fast, they can process gigabytes upon gigabytes of
data. And often, the content we browse on the web is text based, or text with
some images. Still, web sites manage to suck up the remaining capacity. Why?
Web browsers are faster than ever before, but web sites are bloatier than ever
before, eating up all the hardware and software capabilities.

I don't think that writing lots of lean web sites and hoping for people to
switch to them is the right approach. The approach chosen here, using the
power of the user agent, seems the right one.

~~~
kllrnohj
> Web browsers are faster than ever before, but web sites are bloatier than
> ever before, eating up all the hardware and software capabilities.

Are they? Have web browsers actually gotten any faster at all over the last,
say, 5 years? 10?

JS engines got a bit faster, but what about CSS & HTML parsing? 2D rendering
performance? Layout engine performance? DOM performance? Mozilla made a bit of
noise about this a year or two ago with their whole Project Quantum push - but
had you ever heard a peep about this stuff prior to that? Or since? Nobody
benchmarks this stuff, and yet it's _insanely critical_ to interactive
performance. But since it's harder to measure than JS performance, the only
thing ever measured is JS performance. And occasionally, rarely, page load
speeds.

Open up a 10MB plain text file in Chrome and it completely falls over. Zero
JS. Zero CSS. Zero HTML. Just plain text. Are modern browsers really fast?

And for what it's worth, modern computers are _wide_ - 4 cores with SMT is
damn near low end these days. Yet the web is still incredibly stuck in the
single-thread mode of operation. Both the browser internally and the platform
itself
(WebWorkers are far too slow, heavy, and restricted to meaningfully be used to
offload interactive work). And there's almost no work being done to address
this. WASM's threads are the only sliver of light here on the platform side.
Is it really surprising that people throw RAM at the problem as a result?
Throwing more caches at things is the natural response to being heavily
starved for CPU on the single thread you can use.

~~~
webmobdev
Agree 100% with you. All we have to do is look at the Opera browser with its
Presto rendering engine to realise the truth of this - it really rattled
Internet Explorer and Microsoft with its super-fast rendering speed and small
size, which made it easy to download on slow connections. It made Internet
Explorer and Firefox look like clunky, slow bloats of software. Opera was so
good that they were able to charge for it, and despite the free browsers
available, many people bought it.

~~~
zeroname
I'm not sure that's a fair comparison though, Presto didn't support HTML as
thoroughly and it has since dropped out of the race entirely.

We can't be piling more and more high-level crap onto the standard and expect
Moore's law to keep up with it.

~~~
webmobdev
As far as I remember, Opera with its own rendering engine used to be one of
the most web standards compliant browsers.

That said, my point stands - I was pointing out how browsers like Firefox and
Chrome are still bloated software compared to early Opera (pre-Blink, Presto
versions). Their "dropping out of the race" is irrelevant to that aspect.

------
underwater
People like Alex Russell have a weird and intense hatred of how people
actually use JavaScript. Of course it gets dressed up as caring about user
experience, but - surprise - every single time the solution to a problem
turns out to be "don't write React/Angular", which is not a very pragmatic
stance.

This wouldn't be such a problem, except that he has huge sway over the
language via TC39 and over the web via his work at Google, and he keeps
trying to foist over-designed, complex solutions like Web Components and PWAs
onto developers.

------
mariopt
> Per-script max size: 50kB

> Total script budget: 500kB

This limit would break nearly all modern SAP apps. Bootstrap 4's minified JS
bundle is 49kB. Also, the other limits are far from reasonable.

It would be wiser if Chrome/Firefox/etc. targeted ad networks. It would be
nice to have optimised ads that were a tiny fraction of the website. Some ads
download several megabytes just to show video GIFs. These are the bastards
that waste a good chunk of my 4G data plan.

Websites need to start being reasonable about the amount of ads per page.
When I pause my ad blocker, I get scared by the websites I usually visit: so
many goddamn ads everywhere, it makes me wonder why ad-blocker users are not
99% instead of the current 30%.

For website owners: if your website/app is slow, users will stop using it and
other websites will replace it. It's your own problem, not the community's
problem.

I don't think it's fair to limit JavaScript/browsers because some morons
write garbage code without caring about it. It is more than possible to write
fast JavaScript apps.

~~~
dagoat
> This limit would break, nearly, all modern SAP apps

I presume you mean SPA, and you’re probably correct. What limits do you think
are reasonable?

To me, if a SPA is > 500kb of JS my assumption is it’s unnecessarily bloated.

Gzipped, aren’t most modern frameworks/libraries < 150kB, with many being
considerably less?

Perhaps the issue isn’t the proposed limits, but instead the state of most
modern SPAs.

~~~
markmark
Personally, I think we should just leave it to users to vote with their feet.
If I think a site is too slow to load or too slow to use, I'll stop using it.
If I'm happy to wait for 10MB of JS to run a web app, then let me.

~~~
super-serial
YES - I hate Google, their AMP team, or anyone else trying to tell us how to
"fix" the web.

It used to be innocuous when Google had 20% browser market share, but now they
act like they own how users should experience the web. Like the uBlock origin
guy said... if Chromium keeps heading down this path it should no longer be
called a "user-agent" because it's no longer acting on behalf of users.

------
martin_drapeau
The browser, thanks to JavaScript, has become a platform. It has displaced
native desktop development. As a developer, it would suck to have limits
imposed on me. Another thing to manage.

Personally, I don't see a problem to be solved here. Bloated sites with ads
will always exist. The solution is quite simple - don't visit them. They will
eventually die or be replaced by something leaner. A great example is GitHub
which has replaced Sourceforge.

~~~
zeroname
Statists will downvote this post, they believe regulation can solve the issue.

Libertarians will upvote this post, they believe the market can solve the
issue.

~~~
krapp
This article is the programmers' equivalent of Grover Norquist saying "I don't
want to abolish government. I simply want to reduce it to the size where I can
drag it into the bathroom and drown it in the bathtub."

Only with Javascript.

Also, libertarians are statists. Weak statists, but statists nonetheless.

------
z3t4
I used JavaScript white-listing for a few years, but now I only use an ad
blocker - which is way less work, and solves most of the issues. Monetization
from ads was nice while it lasted; it got destroyed by click farms and spam
sites. You can no longer earn money from ads unless you are big enough to
talk directly to advertisers. For small-time content providers I instead
recommend something like Patreon. I think we need to invest some thought into
micro-transactions though, as right now 10% or more of _every_ transaction
goes to middle-men.

------
EFruit
I'm all for more tools for users to limit resource abuse, but I don't see it
changing a thing. The precedents are already set.

If the user doesn't give the developer what he/she wants, the developer will:

- block them until the user allows however much resource abuse the developer
desires (ad-block-blocking scripts), or nag them to change their settings

- try to evade the blocking using any and every unblocked mechanism (ads via
websockets to get around request filters, etc.)

- do absolutely nothing and let the site stay broken

The problem isn't that there are resources to abuse. The problem is the
significant motivation to abuse them.

------
mattnewport
Finally all those hours I wasted on code golf might prove useful after all.

------
the8472
From the performance-perspective it would be useful to also limit javascript
execution time per some time-slice. Parse/compile time should be included in
that of course.

From a privacy perspective javascript should also be limited to the same
origin. That way you can run local trackers when a user visits the site but
not do tracking on 3rd party sites just by loading some cruft into the page.

Ads can be served in iframes, without javascript or cookies (cf. sandbox
attribute).

~~~
wolco
You realize that companies could just create a backend endpoint that loads
the external tracking.

Ads in iframes are not smart enough to show anything relevant, as there is no
content.

~~~
the8472
If they add their own backend stuff they might as well use self-hosted
tracking.

As for the frames, that's on the embedder to provide the necessary information
instead of doing user tracking.

------
ksec
I don't like web apps, I like web pages. As much as I want to get rid of
JavaScript, there are simply no alternatives.

We should limit JavaScript, but not by its code size. We should include more
native functions across browser vendors so we can reduce the use of
JavaScript. Have a minimal JavaScript standard library.

------
tedunangst
If the server side developers gave a s#!t, they'd just optimize for all
clients. No need for clients to request it.

Also, are any of these limits going to apply to XHR? Or can I just use a
loader and eval to get unlimited JS? And if the limits do apply, I assume that
means gmail and maps simply stop working at some point?

~~~
dooglius
Doing this is how you get server side developers to give a shit.

~~~
pdimitar
I dislike the assumption that the backenders don't care.

I personally do. But I'm never given the choice to make a site lean. The
priority is always "make it work" which is of course the right thing to do
_first_ , but after that nobody gives you time to make it smaller and faster.
There's always the next ticket in the backlog and you cannot argue.

So please, don't bash backenders. Many of us care and do our best with the
very limited time budget we manage to _STEAL_ to do optimizations. But proper
optimizations require dedicated time and effort with focused sessions -- and
we are never given those.

------
ringaroll
Personally, if it ever happens, I'm going to tell users to download the
native app. Those who want to download it will do it. Those who won't, no
worries. I'm not going to pander to every browser's arbitrary requirements.
All the hate JS is getting is beyond stupid.

~~~
tokyodude
I will not be installing your app, and I will advise all friends, family, and
acquaintances to do the same. Native apps are 1000x worse than the web in the
risks they pose to users. Even if I trust you, I also have to trust the
authors of every library your app uses not to pwn my computer, use my camera
or mic, scan my network, capture my screen, read the clipboard constantly,
read and/or upload all my files, etc...

~~~
krapp
>Even if I trust you, I also have to trust the authors of every library your
app uses not to pwn my computer, use my camera or mic, scan my network,
capture my screen, read the clipboard constantly, read and/or upload all my
files, etc...

You have to do that with all of the software you run, and the operating system
you run it on, anyway.

------
crooked-v
A 500kb-per-page limit would be a big middle finger to my company's web apps
and most of my company's users, simply because those web apps each do a
substantial amount of stuff (document generation, complex visual editors of
business domain data, elaborate searching and editing functionality in huge
tables of data), and the overwhelming majority of our users value constant
iteration on new features (of which there is still a huge to-do list) over
optimization.

------
pdimitar
> _...is that they aren’t going to roll something out to the broader web that
> is going to break a ton of sites. If they did, developers would riot and
> users would quickly move to another browser._

Yeah, like Firefox, Safari and... which other browser exactly?

With the near-monopoly state of the browser ecosystem, _that_ particular
argument I quoted above isn't very relevant these days.

------
dreamcompiler
Google itself could do a lot to solve this problem if it wanted to: Just
deprioritize the search rankings of sites that pull in a lot of external
javascript, or that do unnecessary client-side rendering (where the definition
of "unnecessary" would obviously need some careful consideration).

~~~
millstone
Do we really want Google to editorialize in this way?

------
aboutruby
I would rather see limits self-imposed by websites, similar to CSP.
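Purely hypothetical illustration (no such header exists today): a
self-imposed budget could ride on a CSP-style header, say
`JS-Budget: total-size 500kb; per-script 50kb`, which the user agent would
parse and enforce:

```javascript
// Hypothetical: parse a CSP-style, site-declared script budget header.
// The header name and its directives are invented for illustration only.
function parseJsBudget(header) {
  const budget = {};
  for (const directive of header.split(";")) {
    const [key, ...rest] = directive.trim().split(/\s+/);
    if (key) budget[key] = rest.join(" ");
  }
  return budget;
}

console.log(parseJsBudget("total-size 500kb; per-script 50kb"));
// → { "total-size": "500kb", "per-script": "50kb" }
```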

~~~
ChrisSD
I think CSP shows self regulation doesn't work in practice. At least not
without incentives which don't currently exist. Therefore it's more practical
to put control in the hands of the user agent.

------
anfilt
Sounds good to me.

Although, that does not do much about small hot spots/loops.

------
pdimitar
Take a wild guess what would happen if this gets accepted.

That's right: the ad networks will hyper-optimize their script sizes and
runtime footprint. They'll just become much better.

------
spricket
Google does an endless dance around how they should handle the ad blocking
problem. Don't be fooled by this, the solution has been in their face for
years.

Most websites are reached from Google search. If Google de-ranked slow, ad
filled, and paywalled sites, the internet would fix itself overnight.

But they will never do this, because Google cares about protecting its own
ads above everything else.

------
hilbert42
I disable JavaScript in browsers and have done so for many years. If it's
hard or awkward to disable it in a specific browser, then that browser
instantly gets the flick and I substitute one that's more amenable to having
its JS switched to 'off' mode.

Why do I bother going to this trouble when, these days, most of the web
considers JavaScript 'on' mode the 'essential' default? Well, I've multiple
reasons, the first of which is usability. Whenever I have to use a browser
where I am unable to disable JavaScript (i.e. on machines that I don't own or
control), I feel both frustrated and no longer in control of my browsing
experience. Here are a few of my reasons:

1. Speed. Fundamentally, I find that with JavaScript 'on' the browser's
usability suffers enormously; its response speed drops to the point where
it's damn difficult or painful to use (essentially, the browser's ergonomics
have taken an unacceptable nosedive). If you've ever browsed the web without
JavaScript for any length of time, then you'll greatly appreciate the truly
enormous increase in rendering (display) speed of web pages when JS is turned
'off'. Moreover, the browser not only renders pages much more quickly, but
the rendering is also much smoother—gone are the pauses and jerky
page-loading operation that so often plague JavaScript's operation.

1.1 Why users actually put up with such unacceptable response times I can
only attribute to the fact that most have never used a browser with
JavaScript disabled. This alone is an indictment of the web-development
industry: when websites downgrade users' browsing experience for their own
explicit benefit (and/or pecuniary interest), they are effectively exploiting
users. Essentially, users do not really benefit from the use of JavaScript,
but websites do, and they do so mightily!

2. Security and privacy are so much easier to enforce when a browser's
JavaScript is disabled. Right, that's a sweeping statement, but it's easy to
test. Using your browser's default settings and without additional add-ons or
plugins (with the exception of, say, a JavaScript on/off toggle add-on), go
to security/vulnerability-testing sites such as Steve Gibson's (GRC's)
ShieldsUP!! or the EFF's Panopticlick and check your browser's privacy and
security with and without JS. You'll be surprised. Moreover, many of the
privacy-invading techniques used by websites to steal your personal info are
killed stone-dead if JavaScript is disabled.

3. Neutering JavaScript works wonderfully as a first line of defence against
ads and ad/user tracking. Even without AdBlock or similar ad-blocking
software, ads are essentially a thing of the past when JavaScript is
disabled! Adding ad-blockers, etc. later only improves one's blocking
experience. Make no mistake, JavaScript's main web function is to make it
dead easy for websites, advertisers and Tech Giants to track you wherever you
go across the web, as well as to supply you with targeted advertising,
etc.—all of JavaScript's other features are only of ancillary benefit (and
prior to JS's introduction, the web had other alternatives).

Nowadays, I'm essentially out of touch with the latest ads as I never see any.
…And what a truly wonderful condition that is.

4. Websites that require JavaScript. When I encounter a website that
absolutely requires JavaScript to function, so conditioned are my reflexes
that I find I've backed out of it without even realizing it. I can whiz
through dozens and dozens of news items on Hacker News and easily bypass any
sites that will not function without JS. I've never needed to worry, as on
the web there are always thousands of equivalent or alternative websites that
are more 'cooperative' from which to choose.

5. In the very rare instances when I must visit a site that requires
JavaScript to function, I've a browser add-on with an icon on the navigation
toolbar that lets me simply toggle JS on and off whenever required.
Accidentally leaving JS on is almost impossible, as the icon changes from
green to red when off. Similar methodologies apply on my rooted smartphone:
along with the absolute prerequisite of completely removing (deleting)
Google's GApps, the 3rd-party browser I use has a feature to turn JavaScript
off quickly.

I'm a heavy web user and have been for decades; I often literally peruse
thousands of web pages per day without any need for JavaScript whatsoever. I
will only add that I feel sorry for the many thousands of you who are welded
to addictive sites where JavaScript is necessary.

Tragically, JavaScript's unfortunate arrival on the web several decades ago
was the beginning of the end of the old fast web as we once knew it, and if we
are to ever reclaim the web for users—claw power back from the Tech Giants
like Google, Facebook et al—then we will have to begin by severely curtailing
JavaScript's power.

Limiting JavaScript as outlined in the article isn't anywhere near a
satisfactory solution. For starters, can you imagine the fights and
disagreements over how these various, essentially arbitrary limits would be
set?

Keep in mind that it is JavaScript that fuels the Tech Giants' presence on
the web, and thus they're the ones who are its 'true' pushers. Like drug
peddlers, they've forced this horrible, unnecessary, pernicious JavaScript
scripting 'kludge' onto us users so as to maximize their business models and
profits, and they've done so at our expense. In a much more user-centric web
environment, none of us would ever need this JavaScript 'junk'.

~~~
leftyted
The analogy of JavaScript to addictive drugs should be comical but instead I'm
starting to find this "JavaScript derangement syndrome" stuff just goddamn
tiresome.

If you're not visiting websites that rely on JS, you're not using the same
internet as the vast majority of people who use the internet. Good for you, I
guess. Good luck looking at a map in a web browser without JS, although I'm
sure you'd never sully your computer by visiting a Google website.

~~~
zzzcpan
You can argue the same about blocking ads. Yes, not having ads and javascript
is very different user experience. Very much not the same internet. But why
would you want it any other way, why go back to that ad-riddled annoying slow
insecure manipulative web that the vast majority apparently use?

