
Add limits to the amount of JavaScript that can be loaded by a website - ingve
https://bugs.webkit.org/show_bug.cgi?id=194028
======
smacktoward
_> A simple dialog would put the user in control of the situation:_

 _> "The site example.com uses 5 MB of scripting. Allow it?"_

Oh God, no. Please don't do this.

1) We _know_ that users presented with these kinds of dialogs don't read them.
They just blindly click "OK" to get the dialog to go away. So now you're just
annoying people to no good purpose.

2) Even if the user does read this dialog, how is she supposed to know how
much JavaScript is too much? There's no context to tell her whether 5 MB is a
lot, or how it compares to the payloads delivered by similar sites. ("The site
example.com wants to load more code than 99.99% of other sites you visit"
would be slightly more helpful, but only slightly.) It just expects her to
have a strong opinion on a subject that nobody who isn't a coder[1] would have
an opinion about.

3) If the problem we're trying to solve here is that code makes sites load
slowly, making people click through a dialog isn't going to make the site load
any faster. In fact it will almost certainly take the user more time to find
and click the "OK" button than it would to just load the code.

4) It's measuring the wrong thing. 1 kilobyte of badly written code can do as
much harm to the user experience as 1 megabyte of well-written code, or more.

[1] With the possible exception of people on strictly capped data plans, but
now we're solving a different problem.

~~~
lmkg
You're thinking about it wrong. The overall effect is not that users will take
control over their data, or whatever. The effect is that users will complain
that the site is broken. And then the pointy-haired boss will demand that the
pop-up be fixed. Now instead of "page weight" being "blah blah nerd bullshit,"
it is something the pointy-haired boss will demand from ad network partners,
and the effects will spread across the web ecosystem.

~~~
porphyrogene
Given the choice between a person with a particular hairstyle and a person who
minimizes the value of others by mocking their appearance, I would choose to
work with the former.

~~~
barbecue_sauce
"Pointy-hair" is not a hairstyle that even exists in reality. It's a Dilbert
reference.

~~~
krapp
[https://i2-prod.mirror.co.uk/incoming/article7110907.ece/ALT...](https://i2-prod.mirror.co.uk/incoming/article7110907.ece/ALTERNATES/s615b/Keith-Flint.jpg)

~~~
barbecue_sauce
I doubt the singer from Prodigy has ascended to any position of corporate
authority.

~~~
krapp
Nevertheless, he does have the hair for it.

------
crazygringo
This particular solution seems wrong for all the obvious reasons listed in
other comments...

But on the other hand, I _love_ the idea of resource-limiting tabs by default,
along all of:

\- CPU usage (allow an initial burst, but after a few seconds dial down to max
~0.5% of CPU, with additional bursts allowed after any user interaction like
click or keyboard)

\- Number of HTTP requests (again, initial bursts allowed and in response to
user interaction, but radically delay/queue requests for the sites that try to
load a new ad every second even after the page has been loaded for 10 minutes)

\- Memory usage (probably the hardest one to get right though)

I mean, every so often I've caught things like a Gmail tab in Chrome pegged at
100% CPU indefinitely, presumably trapped in some infinite JavaScript loop due
to a bug, as well as the occasional random blog that uses a constant 50% CPU
for no discernible reason... and it would be nicer if this didn't suck up two
hours of my laptop's remaining battery time before I discovered it.

If the browser had soft resource limits that just gradually slowed down code
execution and HTTP requests to extreme levels, with a discreet pop-up warning
in the corner "this site is attempting to use higher than normal resources,
click here to allow temporarily/permanently" for the times you're opening a
WebGL demo...

...it seems like it would be a big win for users.
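
As a rough sketch, such a soft limit could be modeled as a per-tab budget that
refills on user interaction and throttles script scheduling once exhausted
(all numbers here are invented for illustration, not from any real browser):

```js
// Illustrative per-tab CPU governor: a time budget that refills on
// user interaction; once exhausted, script slices shrink to a trickle.
class TabCpuGovernor {
  constructor() {
    this.budgetMs = 3000; // generous initial burst for page load
  }
  onUserInteraction() { // clicks and keystrokes grant another burst
    this.budgetMs = Math.min(this.budgetMs + 500, 3000);
  }
  nextSliceMs() { // scheduler asks how long the next JS slice may run
    return this.budgetMs > 0 ? 50 : 0.5; // ~0.5% trickle when drained
  }
  onSliceDone(elapsedMs) {
    this.budgetMs = Math.max(this.budgetMs - elapsedMs, 0);
  }
}
```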

~~~
crdrost
Right, you definitely want something somewhat more adaptive if you want to
solve this problem; something to the effect of "the more resources you use,
the more those resources get slowed down."

What's more interesting to me is the broader economic question. Do users end
up flocking to such browsers because the limits make them snappier on all the
other web sites and contexts? Or do they ditch them because they are
necessarily slower than browsers that don't impose limits? Is this sort of
decision, in other words, a kind of suicidal option for a browser?

~~~
acqq
The need for user-friendly resource limits in browsers is real.

E.g.

[https://www.coindesk.com/firefox-announces-move-to-block-cry...](https://www.coindesk.com/firefox-announces-move-to-block-cryptomining-scripts)

I know the solution presented in the article didn't work (neither in Firefox
nor in Opera) from a person close to me who is not "technical" but wants to
access some sites that slowed down the person's computer (the sites managed to
use exactly 100% of the CPU, that is, all cores at once, while regularly
contacting other sites with "coin" in their names). Then I showed the person
that JavaScript can be blocked and that the computer is completely responsive
that way... and you know the result: even non-technical people do what's
rational, once they know they have a choice.

------
headcanon
The author's premise includes a lot of personal opinion and a simple gripe
against JS. Not a fan of this. What's the limit going to be? 1 MB? 5? Why is
it the browser's job to police that?

I run adblock primarily because I don't want to see ads and get tracked by
sketchy sources. Resource usage is a minor bonus. If that hurts people's
pockets, then so be it; most ads are out of hand and need to be tempered. Most
people who run adblock aren't going to click on the ads anyway.

With that said, I do hope we're able to figure out how to treat web "sites"
and web "apps" differently - for the former, I want as little JS as possible,
since it just gets in the way of content, but for the latter, the JS is
necessary to get the app running, and I don't mind if it's a few megabytes in
size.

~~~
mrweasel
While I'm not a huge fan of JavaScript-heavy sites, I do think it's a little
disingenuous to drag content blocking in as part of the argument. Ads have
little to do with his basic argument; it's just there to make the sales pitch
go down a little easier.

It's also not very effective as an ad-blocker or anti-tracking tool.

------
mattlondon
"640k ought to be enough for everyone."

Setting some arbitrary hard limit is counter-productive and shortsighted. That
quote is infamous for exactly the reasons this proposal is flawed.

Lots of unanswered questions here, e.g. what happens if a 500-byte JavaScript
file then makes a load of XHR requests that download more JS? Does the
connection just get closed as soon as you pass 5 MB? That will break the
internet for lots of users. Plus, what happens if I download a couple of
kilobytes of JavaScript that ends up allocating many gigabytes of memory? Is
that OK because it was under 5 MB of over-the-wire bytes?
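
To make that first loophole concrete, a loader far smaller than any plausible
cap can pull in unlimited code at runtime (a sketch; the URL is a
placeholder):

```js
// Well under 500 bytes on the wire, yet it can fetch arbitrarily
// more JavaScript after the page has "passed" a size check.
fetch("https://example.com/more.js")
  .then((resp) => resp.text())
  .then((src) => (0, eval)(src)); // indirect eval: runs in global scope
```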

I can _easily_ imagine a 5 MB+ web application. Sure, you might not _like_
that, but that does not make it a _bad_ situation.

If you don't like JavaScript, then the answer is simple: disable it. If you
want to use JavaScript but feel that a particular website is using too many
resources, then the answer is simple: don't visit it. You have a choice.

~~~
tenebrisalietum
I can imagine a 5 MB+ web application too, but there should be some way of
communicating that something needs a lot of resources before it is casually
downloaded.

It's impolite for an installer not to tell you it's copying 50 GB of data to
your hard drive before it starts.

"640K ought to be enough for everyone" is shortsighted, but "Web applications
should not assume the user wants to dedicate all system resources to them
without knowledge/permission" is not.

It won't matter much longer, though. Websites are giving way to phone apps,
where zero visibility into behavior or resource consumption is normal,
standard, and expected.

------
yuchi
I don’t know who the author of this proposal is, but I believe this should not
be pursued and that his idea is uninformed.

Content blockers try their best to block and remove specific parts of the web
page, identifying them surgically. You can't just randomly block stuff,
breaking an insane number of websites in the process, and call it a day.

What about web apps? Should there be a «this application wants to exceed the
xMB quota» confirmation message? Oh please no.

~~~
chipotle_coyote
The author's name is on the proposal, Craig Hockenberry. To avoid making folks
Google for him, he's been running the Iconfactory, a design and programming
consultancy that does, yes, icons, as well as web sites and applications.
They're most famous for Twitterrific, the very first Twitter client (and the
one that gave us the word "tweet" for Twitter posts as well as the blue bird
icon). They've been in business over twenty years. Hockenberry is himself both
an app developer and a long-time web developer.

I'm not sure I particularly agree with this proposal, either, for a lot of
reasons outlined in comments here (most broadly that this is putting too much
of a burden on users, and that it's just likely to create a new round of cat-
and-mouse games with bad actors to find ways around it), but the author
certainly isn't inexperienced in web and application matters.

~~~
ghiculescu
Also worth noting that he just launched a competing ad network:
[https://blog.iconfactory.com/2019/01/advertising-with-ollie/](https://blog.iconfactory.com/2019/01/advertising-with-ollie/)

~~~
chipotle_coyote
As near as I can tell the Twitterrific ad network literally only serves in-app
ads to users of Twitterrific.

------
anontechworker
> Today's JavaScript developer is acting like they have a 100 GHz CPU with
> terabytes of memory. And being lazy and uninspired as a result.

Not sure how I feel about a statement this broad. I work mostly with front end
code and I try neat things and tricks to reduce resource use all the time. I
also always push for the lightest way of doing things in PRs. I’m certain I’m
not the only engineer who thinks like this.

~~~
kstrauser
I agree. I don't do frontend stuff, but I sit next to the team that does. They
take a lot of pride in making fast-loading pages that render quickly and
respond instantly. That's not compatible with writing giant, bloated,
inefficient code.

~~~
vezycash
>They take a lot of pride in making fast-loading pages that render quickly and
respond instantly.

Maybe in your company. My experience browsing modern front-end SPAs, however,
begs to differ. They pride themselves on loading megabytes of uncompressed
images, layers of invisible background images, frameworks that load
frameworks, auto-downloading/auto-playing 4K videos...

~~~
seattle_spring
No one prides themselves on any of those things.

~~~
chris_wot
And yet they still do it.

------
troutwine
> I think it's time we start looking at the problem differently. It's resource
> abuse that's the root cause, so why aren't there limits on those resources?

This is a flawed assumption, I think. Sure, maybe _some_ people are blocking
ads because all that JS and network IO eats up battery, but I bet more people
block ads either because they dislike advertisements generally or because they
dislike advertisements based on surveillance. Whether you manage to cram your
ad-serving code into 1 MB or 1 KB is irrelevant to the latter two groups of
people.

------
xg15
Please disable your adblocker!

Do you accept our cookies, tracking and privacy policy?

Would you like to subscribe to our newsletter?

Please give us access to your location!

This site is way better if you download our app!

Please allow us to load another 30MB of JavaScript!

------
gruez
What's stopping a site from loading JavaScript via a side channel (encoded
inside a PNG, for example) and then eval-ing the payload? I guess you could
prevent this by limiting the number of characters you can send to eval, but
I'd imagine that would break a lot of sites. Plus, you can probably bypass the
eval limit by rolling your own eval (basically writing a JavaScript
interpreter in JavaScript).
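
To illustrate the side channel (purely a sketch; it assumes one code byte is
hidden per pixel's red channel, with a zero byte terminating the payload):

```js
// Decode JavaScript smuggled into a PNG's pixel data, then eval it.
// A byte limit on script resources never sees this payload, because
// PNG is lossless and the bytes survive intact.
const img = new Image();
img.crossOrigin = "anonymous"; // needed if the image is cross-origin
img.onload = () => {
  const canvas = document.createElement("canvas");
  canvas.width = img.width;
  canvas.height = img.height;
  const ctx = canvas.getContext("2d");
  ctx.drawImage(img, 0, 0);
  const pixels = ctx.getImageData(0, 0, img.width, img.height).data;
  let src = "";
  for (let i = 0; i < pixels.length; i += 4) { // RGBA; red channel only
    if (pixels[i] === 0) break; // 0 marks the end of the payload
    src += String.fromCharCode(pixels[i]);
  }
  (0, eval)(src);
};
img.src = "payload.png"; // hypothetical image carrying the script
```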

~~~
sneakernets
Simple: stop loading those PNGs, or strip them of all non-image content. The
fact that you can inject code into an image sounds like a security flaw
waiting to happen.

~~~
IggleSniggle
Any medium can be encoded and decoded. You transpile your code into pixels,
CSS, whatever medium is considered "ok." Images just happen to be large, so
they can hold a lot of information. You can't just inherently know what is and
isn't encoded information and strip out "the bad stuff."

Those pixels do not execute themselves (nor do cookies, headers, or whatever
else we are still allowing). They still need an interpreter, and interpreters
are indeed security flaws. Pushing people away from scannable code and towards
hiding their code in other file types will cause a proliferation of
interpreters, and a proliferation of security flaws.

~~~
sneakernets
Image files should only be for images, plus what little metadata is needed to
display them properly. Anything else should be discarded.

~~~
IggleSniggle
What things should and should not be does not affect what they can be. The
point is that the image data itself can carry an encoding. It's not "extra"
metadata; it's the actual image being used to relay additional information
covertly. How do you differentiate between which pixels are "good" and which
ones are "bad"? There is no way to know.

To process such a file, you DO need an interpreter that follows the "rules" of
the encoding, but the interpreter can be small, concealed, and varied,
resulting in a cat and mouse game if you are trying to block such
implementations.

------
mbell
IMO the critical thing needed to stop JS bloat is a decent standard library.
The _vast_ majority of code on an average site tends to be the same thing
implemented 45 times over, because the stdlib in JavaScript is anemic. The
current process of adding one function at a time to the language spec is
rather pointless for solving this issue, since it's often years after a
function is added before anyone can realistically rely on it existing. In the
meantime, we have to keep loading polyfills just in case. And at the current
rate of functionality additions, it'll be decades before the stdlib is decent.

What I really want to see happen is WHATWG start a standard lib project. In my
mind it would look something like this:

* Open source project that implements the standard library in Javascript only and holds the canonical test suite. Something akin to core-js but the 'official' version of it and with features outside the language spec.

* Every browser agrees to preload / cache all recent versions of the standard library (or most recent version for each major release assuming semver). This allows all programmers to load the version they need without concern for the performance hit.

* User loads the standard library via a script tag with src something like: [https://ecmascript.org/stdlib/v1/lib.js](https://ecmascript.org/stdlib/v1/lib.js). Doesn't really matter what it is, just some recognized URL the browser knows about that encodes the version.

* Each browser can provide a native implementation of anything in the stdlib so long as it passes the spec. Browsers could even optimize which pieces of the stdlib it parses / loads based on this. e.g. If the browser has a native Promise implementation then it doesn't need to load the Promise code from the stdlib.

* Be reasonable but aggressive about adding to the stdlib. Its scope should be wide and cover common use cases, e.g. I shouldn't have to write a URL-parsing class or a throttle function every time I start a new project (which is where we are today). There are plenty of projects to look at for learning what people need (lodash, etc).

Obviously this would not cover everything; it's not going to add `await` to
browsers that don't support it. But I think we're at the point where what we
are most in need of is not language features (and mostly these are
transpilable anyway), but stdlib functionality.
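
For what it's worth, the "native implementation wins" bullet above is
essentially the feature-detection guard that core-js-style polyfills already
use; a minimal sketch (`structuredClone` is an arbitrary example, and the
fallback here is deliberately naive):

```js
// Install a stdlib feature only if the browser lacks a native one,
// so browsers with native support never run the JS implementation.
if (typeof globalThis.structuredClone !== "function") {
  globalThis.structuredClone = (value) => JSON.parse(JSON.stringify(value));
}
```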

------
osrec
Hmm, not sure I like this idea. I'm a very pro-PWA/webapp sort of person, so
anything that restricts flexibility on the webapp side annoys me. Especially
given the number of native apps that are equally bloated and virtually never
share or reuse code between apps (think of the 10 MB native note-taking apps
out there).

Perhaps we just let quality rise to the top? If your website is full of ads
and slow, then someone else with better, less annoying execution will
eventually win more users.

------
Zecar
I use Super Stop, and I'm in the habit of hitting Shift+Escape on basically
every website I visit. It kills all AJAX calls and content loading. 9/10
websites load perfectly in the first second, and stopping them after that just
prevents popups, tracking, etc. It would be nice to have a plugin that
automatically stops websites after some preset interval.
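
The core of such a plugin would be tiny; a content-script sketch (the
2-second delay is an arbitrary assumption, and window.stop() only aborts
loads in flight at that moment, it doesn't block future ones):

```js
// Emulate pressing Stop (Shift+Escape with Super Stop) automatically,
// a fixed interval after the initial document has rendered.
window.addEventListener("load", () => {
  setTimeout(() => window.stop(), 2000);
});
```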

------
lmilcin
Uh... this is wrong on so many levels.

What is going to be the limit?

Who is going to be setting it?

How is it going to be calculated? Is it the amount of loaded files? What if
the page is small but then loads more code dynamically later?

How do I, as a developer, make sure my website works everywhere?

What about sites that use a lot of code but still work fast? The amount of
code does not directly translate to slowness. You can have little code and a
slow website, or a lot of code and a fast website. As an example, single-page
applications tend to load all of their code up front but still manage to be
responsive.

~~~
throwawayy1001
> What is going to be the limit?

He suggests 1 MB; personally, I think even that is too much.

> How do I, as a developer, make sure my website works everywhere?

1 MB for a single page isn't enough?

> What about sites that use a lot of code but still work fast?

It's fast if the resources are near your device (CDN); most websites don't do
that at all.

~~~
nicoburns
> 1 MB for a single page isn't enough?

What about single-page apps, where the whole app is "a single page"? With this
proposal, even lazily loading the extra bits would hit the limit.

~~~
krapp
I strongly suspect the true purpose of this would be to make javascript
unusable for all but the most trivial cases, and to train end users to fear it
like a virus, or find its presence annoying, making it an anti-feature.
Killing SPAs would probably be a feature in that case.

------
holoduke
This author must have some personal hatred of JavaScript. Otherwise, it just
does not make sense.

------
robbrown451
I 100% agree with this:

"But there's a downside to this content blocking: it's hurting many smaller
sites that rely on advertising to keep the lights on. More and more of these
sites are pleading to disable content blockers"

The solution proposed is unworkable for all the reasons others have given. But
there really is a baby/bathwater situation with ad blockers that attempt to
kill all advertisements.

I don't know what the solution is, but I think it is an important problem,
deserving of a more sophisticated solution than blanket ad blockers or
proposals like this one. People here are smart; I hope we can do more than
just state reasons why we can't do better than what we are doing.

~~~
burkaman
Most people block ads because they don't want to see ads, not because they
want less javascript. The solution is to find a better business model, not
make smaller ads.

~~~
robbrown451
I think it is a small subset of people who see things that black and white. I
agree it is not simply "wanting less javascript", but I think a large number
of people would not mind small ads with minimal bandwidth requirements,
minimal CPU usage, and minimal slowing of the page, that don't pop up in your
face, are not visually distracting, and are relevant to the material on the
site.

I don't think "get a different business model" is a reasonable option for many
web sites, unless the whole point of the site is to support a product (for
instance Adobe's Photoshop web page, or whatever).

------
Wowfunhappy
> Content blockers [...] prevent abuse by ad networks and many people are
> realizing the benefits of that with increased performance and better battery
> life. But there's a downside to this content blocking: it's hurting many
> smaller sites that rely on advertising to keep the lights on. [...] In
> effect, these smaller sites are collateral damage in a larger battle. And
> that's a big problem for the long-term health of independent content on the
> web.

I've got another way to solve this problem.

Content blockers should operate on a _blacklist_ instead of a _whitelist_.
Advertisements appear by default, but if a given site annoys you enough, you
can go into your ad blocker's preferences and add it to your blacklist, and
then you'll never see any ads on that site again.

Why is this not even an _option_ in any of the major ad blockers? I know it's
possible in uBlock Origin via tweaking advanced settings[1], but it should be
a built-in, user-friendly feature.

[1] [https://github.com/gorhill/uBlock/wiki/Dynamic-filtering:-tu...](https://github.com/gorhill/uBlock/wiki/Dynamic-filtering:-turn-off-uBlock-everywhere-except)

~~~
hungryfoolish
>Advertisements appear by default, but if a given site annoys you enough, you
can go into your ad blocker's preferences and add it to your blacklist, and
then you'll never see any ads on that site again.

That's literally what ABP has. Ads which meet the Acceptable Ads criteria
appear by default, and the rest are blocked by default. Users can, however, go
and disable that setting so that all ads are hidden by default.

~~~
Wowfunhappy
A built-in whitelist ≠ a user-created blacklist.

~~~
hungryfoolish
EasyList would be a user-created blacklist, right? Every ad blocker I know of
uses that.

~~~
Wowfunhappy
I mean per site that displays ads, not the ads themselves.

Right now, most people I know who use ad blockers will "whitelist" websites
they want to support and/or whose ads they find nonintrusive.

I really think the default should be switched, so websites _can_ display ads
by default but a user can blacklist domains they dislike. At minimum, it
should be _possible_ to switch to this behavior.

------
zelon88
Just thinking about this problem and the proposed solution is making me
flustered.

Remember those popups in IE, "Do you want to continue running scripts on this
page?" Nobody knew what they were, where they came from, or what they were
talking about. Even worse, nobody knew what the Yes/No buttons were for, and
if you clicked "Yes", sometimes it would just keep popping up over and over
again.

They got rid of that for a good reason (although it's still alive and well in
MSHTA.exe). It sucked and offered nothing of value to most users.

On the other hand, the New York Times's website has 5 external scripts that
total over 1 MB of just text. They've got their CMS, 4 DIFFERENT ad agencies,
analytics, and some other crap the site probably never "needed" before 2010.
That's disgusting. And that's what you get from a $300M+ company that RELIES
on technology to stay in business. Missing a feature? Fuck it, just have the
client pull down another 400 KB of JS from a sixth CDN. Never mind that the
actual CONTENT that attracted the user in the first place is measured in
bytes.

So while I think this is horribly misguided, I do agree that something needs
to be done to disincentivize lazy JavaScript programmers from pasting all
their problems away. Perhaps the lock icon in the address bar should also turn
yellow or red to reflect external script resources? Or whenever multiple
frameworks are combined, or multiple ad networks used?

Almost every page of my WP website has just 8 internal resources and a page
size of <310 KB. There's no reason the NYT can't stay in that ballpark with
their deep pockets. Megabytes of tracking and bullshit for what is basically a
white page with text is disingenuous.

------
maccam94
We're in a kind of interesting economic situation when it comes to consumer
compute and bandwidth resources. There's no "billing" for compute time on the
server side or on the client side. Bandwidth costs are also negligible on most
connections. This means there are no pressures to utilize these resources
efficiently beyond user annoyance (which usually isn't a strong enough
motivating factor to modify behavior). Is the solution some sort of
bidirectional billing scheme?

"This page will cost $0.02 to retrieve. Pay by running 20FLOPs of code?"

"OK"

"This page will cost $0.05 to retrieve. Pay by running 500FLOPs of code?"

"No, pay via $fiat_transfer_method"

------
Veedrac
One thing I would love personally is the ability to restrict JS usage to a CPU
limit, which increases in response to certain events and gradually reduces to
0 if unused. This could be:

* A number of seconds on page load,

* a fraction of a second in response to network activity (e.g. image loads),

* a small number of frames in response to significant user interaction (mouse clicks, typing),

* one frame after less-significant user interaction (mouse movement), limited only to local operations (no network activity).

The vast majority of sites should need nothing more than this, so opt-in to
unlimited usage should be fine. Sites where the user has enabled extra
permissions like notifications should be allowed extra time for those.
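
Concretely, this amounts to a token bucket keyed to input events; a sketch
with invented numbers:

```js
// Token-bucket sketch of the budget above: events deposit frames of
// script time, the engine spends one per rendered frame, and unused
// budget simply runs out rather than accumulating without bound.
class ScriptTimeBudget {
  constructor() { this.frames = 0; }
  grant(n, cap = 600) { this.frames = Math.min(this.frames + n, cap); }
  onPageLoad()     { this.grant(180); } // a few seconds at 60 fps
  onNetworkEvent() { this.grant(10); }  // fraction of a second
  onClickOrKey()   { this.grant(30); }  // significant interaction
  onMouseMove()    { this.grant(1); }   // one frame, local work only
  tryRunFrame() { // the engine asks before running this frame's JS
    if (this.frames === 0) return false; // budget has decayed to 0
    this.frames -= 1;
    return true;
  }
}
```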

------
andrewmcwatters
So it's totally okay when we block advertisers but when developers are the
ones under scrutiny, suddenly the user-centric argument is out the window.

I think this is a great idea. It puts pressure on developers and makes
experiences better for users. The average American Internet speed is sub-100
Mbps, but average LTE speeds are closer to 12 Mbps, and websites usually opt
for responsive layouts over separate mobile sites. This means you're
downloading the full resources of a desktop site, and the mobile device is
just adjusting to media queries.

5 MB at 12 Mbps is over 3 seconds (5 MB × 8 = 40 megabits; 40 ÷ 12 ≈ 3.3 s).
That's bullshit. Put pressure on developers, make a better web.

------
davb
I really hope interest-based advertising dies. I'm not convinced it works, and
every time it's brought up, someone links to a study suggesting it's not
particularly effective. I actually don't mind ads all that much if they're
static, don't track me, and are rendered server-side (no JavaScript).

Base it on the content of the page, not the person visiting. I've never
clicked on a retargeted ad for sneakers that follows me to a tech site, but an
ad for (for example) DataGrip on an article about SQL tooling might actually
interest me!

------
duxup
This whole thing seems like adding an arbitrary knob for arbitrary reasons,
considering that it would impact sites that have nothing to do with the
author's problems with JS.

------
zzzcpan
I think it's incorrect to assume that small websites rely on advertising to
keep the lights on. They are the ones who definitely can't make enough on ads
to sustain themselves and are always funded primarily some other way. But at
the same time, by running ads they let advertising companies profit from all
of them in aggregate.

Given that, I don't see any point in trying anything that somehow keeps ads
around. Intrusive online advertising doesn't really need to exist.

------
deadmetheny
I came to these comments to see if it was mostly webshit programmers
complaining about the horror that would be unleashed if JS had limits, and I
was not disappointed.

A lot of people use their phones to access the Internet and have hard data
caps, and webpages don't need to load several MB of JS to display 55 KB of
text. There are certainly use cases for JS that justify that amount of script,
but shoving in ads, trackers, and widgets isn't one of them.

------
pornel
There exists a less blunt proposal to add resource limits to web pages:

[https://www.igvita.com/2016/03/01/control-groups-cgroups-for...](https://www.igvita.com/2016/03/01/control-groups-cgroups-for-the-web/)

It's aimed at limiting the resource usage of third parties (ads), and at pages
voluntarily limiting their own usage, but presumably browser extensions could
add the limits too.

------
demarq
Doing more with fewer bytes is a virtue and a great show of skill among
software developers. Outside that infinitesimally small bubble, however...
users could barely give a damn.

Also, I fear that a lot of this is JavaScript phobia formed by the mindset
that the web is supposed to be documents, not full-blown applications. After
all, why complain about 2 MB of JS when the page has a 10 MB banner image and
an autoplaying 40 MB HD video?

------
8bitsrule
_smaller sites are collateral damage in a larger battle._

Smaller sites could avoid farming their content out to 2 or 20 locations on
the cloud.

The user can already add limits. Try using uBlock Origin configured to allow
inline and first-party scripts, and images from everywhere... and nothing
else.

If the page comes up blank, bye-bye. Don't visit it any more. They made their
choices, let them live with it.

------
jakeogh
Surf (a suckless WebKit2 frontend) can be configured to disable JS by default;
if you really need it, hit Ctrl-Alt-s and voilà, the junk gets executed.
Browsing without JS is wonderful.
[https://surf.suckless.org/](https://surf.suckless.org/)

------
tanilama
I am not sure the proposal is solving the right problem:

> The situation I'm envisioning is that a site can show me any advertising
> they want as long as they keep the overall size under a fixed amount, say
> one megabyte per page.

Even with minification and compression, I don't see how 1 MB could work...

------
shanehoban
I believe a huge saving across the web could come from something much simpler,
considering the ubiquity of jQuery.

A build tool that scans your JS code and includes only the jQuery (or
equivalent library) functions it finds you actually using.

~~~
Technetium_Hat
This exists already. Modern JS build tools are actually pretty sophisticated.
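
With ES modules you import only what you use, and the bundler (Rollup,
webpack, etc.) drops the rest; a sketch using the lodash-es package:

```js
// Tree shaking in practice: only `throttle` (plus its internal
// dependencies) lands in the bundle; the rest of lodash is dropped
// at build time.
import throttle from "lodash-es/throttle";

window.addEventListener("scroll", throttle(() => {
  console.log("scroll handled at most every 200 ms");
}, 200));
```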

------
waste_monk
Example.com does not use any scripting, let alone 5 MB worth. In fact, it only
transfers 1.24 KB of plain CSS/HTML.

Whoever wrote that comment was a liar and a peddler of nonsense.

------
k__
lol, good luck getting this through in Safari given the current state of
Apple's websites.

~~~
dijit
So, I figured you were probably right, and went and checked Apple's websites
for their largest products to see how bad it was.

The largest sum of JavaScript was on the Mac Pro (2013) page, with just under
200 KB.

That's a lot less than 1 MB.

~~~
k__
Sure, their sales pages are in shape.

But the developer stuff in the walled garden, not so much.

The App Store Connect login/start page has 2 MB of JS, and the App page has
>3 MB.

Overall, the whole Apple dev experience is sluggish.

------
amiga-workbench
Love it, I believe Chrome mobile already completely kills JS on GPRS
connections.

------
lurker458
The ability to restrict CPU and memory consumption per page would make more
sense.

------
naaaaak
Disable Javascript by default and make turning it on the exception. The web
would be a much better place.

