Limiting JavaScript? (timkadlec.com)
94 points by happy-go-lucky 47 days ago | 96 comments



> Sizes are being used as a fuzzy proxy here which makes sense—putting a cap on CPU usage and memory is a lot harder to pull off. Is focusing on size ideal? Probably not. But not that far off base either.

No: this doesn't make sense :/. The core problem is that stuff sits around executing some tiny input handler or animation in a loop, burning CPU. When I have tracked the tabs that are the worst performers down to the code causing a problem, it is never a large amount of code: it is some stupid mechanism that polls the position of something (like the cursor or the scrollbar), or is trying to push some analytics to a server.

This really has nothing to do with the amount of code being downloaded. I realize some people do complain about how much stuff they have to download, but that just isn't what is actually causing most people problems. Sure, tracking CPU is sort of annoying, but it absolutely isn't hard. Chrome is already running these things in separate processes (for security), and the operating system is tracking the time used for each thread: you can just ask it and impose some kind of limit if that is what you care about.

I mean, in this article I see ideas for size limits for images, which is at least consistent... but that is going way way too far: 1MB just isn't good enough for a reasonable image. If you care so much about bandwidth, make a bandwidth cap for the page and if it exceeds it--across all media--figure out some way of blocking or punishing the site.

What most of us care about is that there seems to be no limit on the CPU usage of any given page. This is easy to fix--it is a virtual machine, after all!--by just doing the same trick Erlang uses: compiling to a preemptive fiber and then limiting its execution time slices.
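
For comparison, the cooperative version a page author can do today looks roughly like this; a minimal sketch with illustrative names (an engine-level preemptive scheduler, as suggested above, would not need the page's code to cooperate at all):

    // Cooperative approximation of time-sliced execution: process work in
    // small chunks and yield back to the event loop once a ~10ms budget is
    // spent. (Sketch only; names are illustrative.)
    function runInSlices(items, processItem, budgetMs = 10) {
      let i = 0;
      function slice() {
        const start = performance.now();
        while (i < items.length && performance.now() - start < budgetMs) {
          processItem(items[i++]);
        }
        if (i < items.length) setTimeout(slice, 0); // yield, resume later
      }
      slice();
    }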

What I know I care a lot about is when a tab I haven't looked at in three days is suddenly using CPU time _at all_. Just make it so background tabs get severely limited in their ability to do background execution and eventually get stopped entirely, and the problem is essentially solved.

(Chrome, which is apparently already big on these size limits, doesn't do this, and I swear it is because it is against Google's interests to do it as it mostly makes it more difficult to do stuff like tracking and advertising :/.)


> What I know I care a lot about is when a tab I haven't looked at in three days is suddenly using CPU time _at all_. Just make it so background tabs get severely limited in their ability to do background execution and eventually get stopped entirely, and the problem is essentially solved.

Chrome does throttle background tabs: https://developers.google.com/web/updates/2017/03/background...

It doesn't throttle them all the way to zero, though. If they did that they'd break things like sites that change their favicon to signal "unread message".
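
Pages that want to cooperate can already go further themselves via the Page Visibility API; a rough sketch (pollForUpdates is a placeholder for app-specific work):

    // Pause periodic work while the tab is hidden, resume when visible.
    let pollTimer = null;
    function startPolling() {
      if (!pollTimer) pollTimer = setInterval(pollForUpdates, 30000);
    }
    function stopPolling() {
      clearInterval(pollTimer);
      pollTimer = null;
    }
    document.addEventListener('visibilitychange', () => {
      document.hidden ? stopPolling() : startPolling();
    });
    startPolling();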

(Disclosure: I work at Google, though not on Chrome)


Yes. Without background processing, every web-based service that has live updates would break: email (Gmail/outlook.com/...), chat (Slack/Messenger/WhatsApp/Discord), SNS (FB/Twitter). Even Stack Overflow and GitHub have various forms of live updates.


On mobile platforms, apps don't use polling loops for that sort of stuff, precisely so that power usage can be optimized. Time for something similar for the web?


Aren't notifications the same thing, with the only difference being that they go through Apple/Google's centralized server, so your phone doesn't have to listen to multiple servers but just one?


A simple short-term workaround would be a setInterval equivalent that can be set to "every so often" instead of a hard number, leaving it up to the browser or OS to coalesce updates into periodic chunks of activity.
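
Something close to that can be approximated today by handing the exact timing to the browser; a rough sketch with illustrative names (requestIdleCallback lets the browser pick the moment, bounded by a timeout):

    // "Fuzzy" interval: aim for roughly every targetMs, but let the browser
    // choose the exact moment via requestIdleCallback, falling back to a
    // plain setTimeout where it isn't available.
    function fuzzyInterval(callback, targetMs) {
      function schedule() {
        setTimeout(() => {
          if ('requestIdleCallback' in window) {
            requestIdleCallback(() => { callback(); schedule(); }, { timeout: targetMs });
          } else {
            callback();
            schedule();
          }
        }, targetMs);
      }
      schedule();
    }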


background connections !== polling loops

Many services use an open tcp connection (e.g. websocket) rather than polling. Mobile platforms are doing the same thing, just on a system level. I guess service workers are the closest web analogue.
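
The push style looks roughly like this (placeholder URL and handler), as opposed to a setInterval fetch loop:

    // Push over one open connection instead of a polling loop: the handler
    // only runs when something actually arrives.
    const socket = new WebSocket('wss://example.com/updates');
    socket.addEventListener('message', (event) => {
      const update = JSON.parse(event.data);
      showUnreadBadge(update); // e.g. swap the favicon or update a counter
    });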


Firefox also has similar logic to throttle background tabs.


> Chrome, which is apparently already big on these size limits, doesn't do this

Not sure what you mean, Chrome's background tabs are heavily throttled. Killing completely after a day or two would be nice. I use an extension for that.

I think you're missing the goal when you say what "most of us care about", though. Background tabs aren't an issue for people without many tabs open (many people) or who are on mobile (even more). Slow loading pages definitely are.


> 1MB just isn't good enough for a reasonable image.

That didn't make sense to me: the small (~700px wide) images I have in the library are all 50±20kB.

I went to Pexels, which hosts free stock photos, and I took the first one off their front page that had enough colors and didn't show faces: https://www.pexels.com/photo/person-pouring-coffee-on-white-...

Its maximum size is excessive unless you aim to support 4K: 5472×3648. It weighs 4.9MB. A lot!

I then went to Squoosh.app, which allows one to optimize images in various ways. The default option – MozJPEG at 75% quality – reduced the image size by more than half, down to 2.24MB, with no apparent loss of quality even at high zoom. Illustration: https://i.imgur.com/2bkHrot.jpg

Do you really need to serve a 4K+ image, though? I reduced the image to 1920×1280, using the same app, with the same compression settings. 184kB! Illustration: https://i.imgur.com/52MctSN.jpg

At 33% zoom (which is necessary for a reasonable comparison, since Squoosh stretches the smaller image for comparison), the compressed image looks very good. It lacks the noise the original had, and looks more glossy. There are also advanced settings that one could tinker with, perhaps achieving better compression with equal losses.

Is it a big deal? Perhaps – especially when you look to present the image as-is, with minimal losses between conversion from RAW to, say, PNG. For most websites, though? I reckon it's not going to be a problem: it's the sense of the image that matters, not the details.

And if you regularly serve 1MB+ images, maybe there's some sort of an indicator or tag that you could apply that will tell the browser: "Hey, look, I know you want to save bandwidth, but it's kinda my schtick to show really good images, so let me through, yeah?"


> unless you aim to support 4K

We're right at the point where people are starting to actively support 4K in web apps. Sure, not many web apps actually need it, but the ones that actually do (like photo browsers) definitely need it if they want to keep up with relevant trends over the next five years or so.


Sorry, it's not the 90s anymore. There's no reason not to support 4K, especially as such displays become more commonplace.


"Sorry". Jesus. Sure.


Modern computers are so fast they can process gigabytes upon gigabytes of data. And often, the content we browse on the web is text based, or text with some images. Still, web sites manage to suck up the remaining capacity. Why? Web browsers are faster than ever before, but web sites are bloatier than ever before, eating up all the hardware and software capabilities.

I don't think that writing lots of lean web sites and hoping for people to switch to them is the right approach. The approach chosen here, using the power of the user agent, seems the right one.


> Web browsers are faster than ever before, but web sites are bloatier than ever before, eating up all the hardware and software capabilities.

Are they? Have web browsers actually gotten any faster at all over the last, say, 5 years? 10?

JS engines got a bit faster, but what about CSS & HTML parsing? 2D rendering performance? Layout engine performance? DOM performance? Mozilla made a bit of noise about this a year or two ago with their whole Project Quantum push - but had you ever heard a peep about this stuff prior to that? Or since? Nobody benchmarks this stuff, and yet it's insanely critical to interactive performance. But since it's harder to measure than JS performance, the only thing ever measured is JS performance. And occasionally, rarely, page load speeds.

Open up a 10MB plain text file in Chrome and it completely falls over. Zero JS. Zero CSS. Zero HTML. Just plain text. Are modern browsers really fast?

And for what it's worth modern computers are wide - 4 core with SMT is damn near low end these days. Yet the web is still incredibly stuck in the single-thread mode of operation. Both the browser internally and the platform itself (WebWorkers are far too slow, heavy, and restricted to meaningfully be used to offload interactive work). And there's almost no work being done to address this. WASM's threads are the only sliver of light here on the platform side. Is it really surprising that people throw RAM at the problem as a result? Throwing more caches at things is the natural response to being heavily starved for CPU on the single thread you can use.
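
To make the worker point concrete: even the minimal round trip below pays a structured-clone cost on both the input and the result (inline worker built from a Blob URL so the sketch is self-contained; the "work" is a stand-in):

    const worker = new Worker(URL.createObjectURL(new Blob([`
      onmessage = (e) => {
        const sum = e.data.reduce((a, b) => a + b, 0); // stand-in for real work
        postMessage(sum);
      };
    `], { type: 'text/javascript' })));
    worker.onmessage = (e) => console.log('result from worker:', e.data);
    worker.postMessage(Array.from({ length: 1e6 }, (_, i) => i)); // cloned, not shared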


Agree 100% with you. All we have to do is look at the Opera browser with their Presto rendering engine to realise the truth of this - it really rattled Internet Explorer and Microsoft with its super fast rendering speed and small size that made it easy to download on slow connections. It made Internet Explorer and Firefox look like clunky, slow, bloated pieces of software. Opera was so good that they were able to charge for it, and despite the free browsers available, many people bought it.


I'm not sure that's a fair comparison though, Presto didn't support HTML as thoroughly and it has since dropped out of the race entirely.

We can't be piling more and more high-level crap onto the standard and expect Moore's law to keep up with it.


As far as I remember, Opera with its own rendering engine used to be one of the most web standards compliant browsers.

That said, my point stands - I was pointing out how browsers like Firefox and Chrome are still bloated software compared to early Opera (pre-Blink, Presto versions). Them "dropping out of the race" is irrelevant to that aspect.


It's actually the opposite: Opera (Presto) was stricter in its adherence to standards. It was the first/second (depending on OS) to pass the Acid2 test, and was consistently among the top scorers.

That said, there were often compatibility problems on various websites, because they favored IE (which wasn't standards compliant), and rarely if ever tested on Opera.


Exactly.

JS is absurdly fast (comparatively).

The DOM/rendering is sluggish (comparatively).

This observation is the driving point behind the virtual DOM.

Also why high performance rendering uses Canvas or WebGL, at the expense of debugging tools, browser extensions, and accessibility.
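
The core of the virtual-DOM idea fits in a few lines; a toy sketch for flat text nodes only, not any particular library's implementation:

    // Describe the UI as cheap JS objects, diff old vs. new, and touch the
    // (comparatively slow) real DOM only where something changed.
    const h = (tag, text) => ({ tag, text });

    function patch(parent, oldNode, newNode, index = 0) {
      const el = parent.childNodes[index];
      if (!oldNode) {
        const child = document.createElement(newNode.tag);
        child.textContent = newNode.text;
        parent.appendChild(child);
      } else if (!newNode) {
        parent.removeChild(el);
      } else if (oldNode.tag !== newNode.tag) {
        const child = document.createElement(newNode.tag);
        child.textContent = newNode.text;
        parent.replaceChild(child, el);
      } else if (oldNode.text !== newNode.text) {
        el.textContent = newNode.text; // the only DOM write on a text change
      }
    }

    // patch(document.body, h('p', 'old'), h('p', 'new')) rewrites one text node.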


> Web browsers are faster than ever before

That's not true at all; browsers used to be much faster when HTML was simpler.

> ...but web sites are bloatier than ever before, eating up all the hardware and software capabilities.

Actually, even a rather simple website can be very slow to render if it does a significant amount of dynamic updates. DOM performance is an absolute disaster, hence all these "virtual dom" implementations like React. CSS layout is incredibly expensive. That has nothing to do with website bloat in and of itself; it's the platform coming bloated out of the box.
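
A concrete example of that expense: interleaving DOM reads and writes forces a synchronous reflow on every iteration, and merely reordering them fixes it (illustrative sketch with a placeholder selector):

    const rows = document.querySelectorAll('.row');

    // Slow: read (offsetWidth), then write (style), per element; layout is
    // recomputed on every loop iteration ("layout thrashing").
    rows.forEach((row) => {
      row.style.height = row.offsetWidth / 2 + 'px';
    });

    // Faster: batch all reads, then all writes, so layout is recomputed once.
    const widths = Array.from(rows, (row) => row.offsetWidth);
    rows.forEach((row, i) => {
      row.style.height = widths[i] / 2 + 'px';
    });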


Folks like Alex Russell have a weird and intense hatred of how people actually use JavaScript. Of course he dresses it up as caring about user experience, but - surprise - every single time the solution to a problem turns out to be "don't write React/Angular", which is not a very pragmatic stance.

This wouldn't be such a problem, except that he has huge sway over the language via TC39 and over the web via his work at Google, and keeps trying to foist over-designed, complex solutions like Web Components and PWAs onto developers.


> Per-script max size: 50kB

> Total script budget: 500kB

This limit would break nearly all modern SAP apps. The Bootstrap 4 minified JS bundle is 49kB. Also, the other limits are far from reasonable.

It would be wiser if Chrome/Firefox/etc. would target ad networks. It would be nice to have optimised ads that are a tiny fraction of the website. Some ads download several megabytes just to show video GIFs. These are the bastards that waste a good chunk of my 4G data plan.

Websites need to start to be reasonable about the amount of ads per page. When I pause my ad blocker, I get scared by the websites I usually visit: so many god damn ads everywhere, it makes me wonder why ad blocker users are not at 99% instead of the current 30%.

For website owners: if your website/app is slow, users will stop using it and other websites will replace it. It's your own problem, not the community's problem.

I don't think it's fair to limit JavaScript/browsers because some morons code garbage without caring about it. It is more than possible to code fast JavaScript apps.


> This limit would break nearly all modern SAP apps

I presume you mean SPA, and you’re probably correct. What limits do you think are reasonable?

To me, if a SPA is > 500kb of JS my assumption is it’s unnecessarily bloated.

Gzipped, aren't most modern frameworks/libraries < 150kB, with many being considerably less?

Perhaps the issue isn’t proposed limits, but instead the state of most modern SPAs
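
For what it's worth, teams can already self-impose exactly this kind of budget at build time; a sketch of a webpack config fragment, with the numbers mirroring the article's proposed limits:

    // webpack's performance hints can fail the build when bundles exceed a size.
    module.exports = {
      // ...entry, output, loaders...
      performance: {
        hints: 'error',                 // or 'warning'
        maxAssetSize: 50 * 1024,        // per emitted asset, in bytes
        maxEntrypointSize: 500 * 1024,  // per entry point, in bytes
      },
    };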


Personally, I think we should just leave it to users to vote with their feet. If I think a site is too slow to load or too slow to use, I'll stop using it. If I'm happy to wait for 10MB of JS to run a web app, then let me.


YES - I hate Google, their AMP team, or anyone else trying to tell us how to "fix" the web.

It used to be innocuous when Google had 20% browser market share, but now they act like they own how users should experience the web. Like the uBlock origin guy said... if Chromium keeps heading down this path it should no longer be called a "user-agent" because it's no longer acting on behalf of users.


This exactly. I'm not sure why this needs to be enforced when large apps already reap what they sow in terms of bad SEO and bad UX.

There are plenty of reasons why you might want to serve heavy apps or raw photos on the web. This is the beauty of the web: freedom!


> I presume you mean SPA, and you’re probably correct. What limits do you think are reasonable?

Yes, I meant SPA.

Well, there shouldn't be a limit, in my opinion. JavaScript is powerful enough to allow a lot of heavy stuff like games, desktop-class applications, etc. Example (no longer in the Chrome Web Store): https://www.youtube.com/watch?v=MfZpRtuPu-o

The issue with SPAs is simple: you could use, let's say, React with Apollo GraphQL and the final bundle would be below the 500kB. The problem is when you need some UI components; the obvious way is to search for a plugin for that. Many React libraries use stuff like jQuery, Lodash, etc. under the hood that will increase the final bundle size. Over time, it just adds up with every new feature you want to add. Example: Kendo UI for React is just a wrapper around the jQuery version. Many other libraries follow this approach. To make things worse, the same package can be added to the client with different versions because they're dependencies of different packages.

We also live in the age of isomorphic JavaScript (code that runs on the server and client), so there is an "extra" bundle pushed to the client so that the app becomes more responsive without needing the server as much.

Of course gzip helps to reduce the bandwidth, but it doesn't solve the problem that more lines of JavaScript means more time needed to interpret the code and more memory "wasted". On frameworks like MeteorJS, you could reach a bundle of 10 megabytes or more very, very easily. A few seconds into a Meteor app on mobile iOS, the app would crash. But no one forced me to use Meteor at that time; there were plenty of better options. Mea culpa, period.

This problem of slow webpages/"slow" JavaScript is quite old. Most modern libraries/frameworks already help developers ship faster apps by having good defaults. Examples: Gatsby, the Apollo GraphQL client, etc.

Enforcing limits could bring us to a world where we need to use iframes and subdomains to get an app running. For sure no one wants that.


I don't think a size limit is reasonable at all. According to the article, 500kB of JS takes about 3-4MB of RAM once it's parsed. So having 500kB costs me a bit of parsing time and a handful of MB of RAM, neither of which is really a bottleneck.

If there was some size limit X, then any SPA requiring more javascript would just be split up into modules that dynamically load and unload as they are needed. But that just makes the total instruction parsing time worse, for saving single-digit MBs of RAM.

If that does anything to the problem of slow javascript at all it makes it worse, because now I might have to wait for the right module to load and parse.
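
Concretely, that splitting looks something like this (placeholder module and element names); the heavy code is only fetched and parsed when it's needed, which is exactly where the extra wait shows up:

    document.querySelector('#open-editor').addEventListener('click', async () => {
      const { openEditor } = await import('./editor.js'); // network + parse happen here
      openEditor();
    });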


Correct me if I'm wrong, but I have a feeling what you describe could be prevented, and that even if you wanted to load/unload modules, the sum of the JS could still be limited to a certain size.


> To me, if a SPA is > 500kb of JS my assumption is it’s unnecessarily bloated.

I'm guessing you've never written something that can, for example, dynamically generate XLSX files out of filtered datasets in the browser. Sure, we could do it on the server - but our users value fast and regular turnaround time on feature updates far, far more than the site loading a little slower every few weeks when the cached JS is replaced.
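
A sketch of the kind of in-browser export being described, assuming the SheetJS (xlsx) library as a dependency; filteredRows and the file name are placeholders, and this one dependency alone can add hundreds of kB of JS to the bundle:

    import * as XLSX from 'xlsx';

    function exportToXlsx(filteredRows) {
      const sheet = XLSX.utils.json_to_sheet(filteredRows);
      const book = XLSX.utils.book_new();
      XLSX.utils.book_append_sheet(book, sheet, 'Export');
      XLSX.writeFile(book, 'export.xlsx'); // triggers a download in the browser
    }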


> I'm guessing you've never written something that can, for example, dynamically generate XLSX files out of filtered datasets in the browser.

Guess again :)

> our users value fast and regular turnaround time on feature updates far, far more than the site loading a little slower

That might be acceptable, for a time. Particularly if you’re working on a product that doesn’t have many/any backend developers. Or a backend. Or a product where the developers are more specialized in frontend. Or a product where the backend does not have a mechanism for handling long running tasks

But at some point for features like various file exports in the browser, one might want to fix the “why can’t we build X feature and delight customers as quickly server side”

Or not ¯\_(ツ)_/¯


> one might want to fix the “why can’t we build X feature and delight customers as quickly server side”

Sure. It's just that there's a two- or three-year-long to-do list of features before we can get back to optimizing things that none of the users care about anyway.


Users definitely care if an app gets slower every few weeks


If any of our users even notice that once or twice a release cycle the page for the web app takes a little longer to load, they haven't cared about it enough to actually mention it. By contrast, we get a constant flow of new feature requests.


Doesn’t sound like a very good metric to use. Ideally companies are actively talking to their users about the product


The browser, thanks to JavaScript, has become a platform. It has displaced native desktop development. As a developer, it would suck to have limits imposed on me. Another thing to manage.

Personally, I don't see a problem to be solved here. Bloated sites with ads will always exist. The solution is quite simple - don't visit them. They will eventually die or be replaced by something leaner. A great example is GitHub which has replaced Sourceforge.


Statists will downvote this post, they believe regulation can solve the issue.

Libertarians will upvote this post, they believe the market can solve the issue.


This article is the programmers' equivalent of Grover Norquist saying "I don't want to abolish government. I simply want to reduce it to the size where I can drag it into the bathroom and drown it in the bathtub."

Only with Javascript.

Also, libertarians are statists. Weak statists, but statists nonetheless.


Anarchists will look at this post and shake their heads in despair.


I used JavaScript white-listing for a few years, but now I only use an ad-blocker - which is way less work, and solves most of the issues. Monetization from ads was nice while it lasted; it got destroyed by click farms and spam sites. You can no longer earn money from ads unless you are big enough to talk directly to advertisers. For small-time content providers I recommend something like Patreon instead of ads. I think we need to invest some thought in micro-transactions though, as right now 10% or more of every transaction goes to middle-men.


I'm all for more tools for users to limit resource abuse, but I don't see it changing a thing. The precedents are already set.

If the user doesn't give the developer what he/she wants, the developer will:

- block them until the user allows however much resource abuse the developer desires (ad-block blocking scripts), or nag them to change their settings

- try to evade the blocking using any and every unblocked mechanism (ads via websockets to get around request filters, etc.)

- do absolutely nothing and let the site stay broken

The problem isn't that there are resources to abuse. The problem is the significant motivation to abuse them.


Finally all those hours I wasted on code golf might prove useful after all.


From the performance perspective, it would be useful to also limit JavaScript execution time per time slice. Parse/compile time should be included in that, of course.

From a privacy perspective javascript should also be limited to the same origin. That way you can run local trackers when a user visits the site but not do tracking on 3rd party sites just by loading some cruft into the page.

Ads can be served in iframes, without javascript or cookies (cf. sandbox attribute).


You realize that companies could create a backend file that would load the external tracking.

Ads in iframes are not smart enough to show anything related as there is no content.


If they add their own backend stuff they might as well use self-hosted tracking.

As for the frames, that's on the embedder to provide the necessary information instead of doing user tracking.


I don't like Web Apps, I like Web Pages. As much as I want to get rid of JavaScript, there are simply no alternatives.

We should limit JavaScript, but not by its code size. We should include more native functions across browser vendors so we can reduce the use of JavaScript. Have a minimal JavaScript standard library.


If the server side developers gave a s#!t, they'd just optimize for all clients. No need for clients to request it.

Also, are any of these limits going to apply to XHR? Or can I just use a loader and eval to get unlimited JS? And if the limits do apply, I assume that means gmail and maps simply stop working at some point?
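
The loader being described would only be a few lines (placeholder URL); a cap on script download size never sees the second payload:

    // The initial payload stays tiny; the real code arrives later over fetch/XHR.
    fetch('/bundle.part2.js')
      .then((response) => response.text())
      .then((source) => {
        (0, eval)(source); // indirect eval: runs in global scope, like a <script>
      });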


> I assume that means gmail ... simply stop working at some point?

Gmail still works?

Seriously though the limits applying to XHR is a really good point. I'm guessing browsers would have to enforce some artificial restriction on the eval function, possibly prompting the user with "This webpage is attempting to run a large amount of [insert user-friendly layperson terminology here]. This may be related to advertiser abuse. Do you want to allow this?"

But it's not clear how they plan to solve it for iframes either, which could essentially grant unlimited JS as well.

They have good intentions with the discussion and it's an interesting idea but I don't see it becoming reality until they can solve these problems which seem actually rather hard.


Doing this is how you get server side developers to give a shit.


I dislike the assumption that the backenders don't care.

I personally do. But I'm never given the choice to make a site lean. The priority is always "make it work" which is of course the right thing to do first, but after that nobody gives you time to make it smaller and faster. There's always the next ticket in the backlog and you cannot argue.

So please, don't bash backenders. Many of us care and do our best with the very limited time budget we manage to STEAL to do optimizations. But proper optimizations require dedicated time and effort with focused sessions -- and we are never given those.


Personally, if it ever happens, I'm going to tell users to download the native app. Those who want to download it will do it. Those who won't, no worries. I'm not going to pander to every browser's arbitrary requirements. All the hate JS is getting is beyond stupid.


I will not be installing your app and I will advise all friends, family, and acquaintances to do the same. Native apps are 1000x worse than the web in the risks to users. Even if I trust you, I also have to trust the authors of every library your app uses not to pwn my computer or use my camera or mic, scan my network, capture my screen, read the clipboard constantly, read and/or upload all my files, etc...


>Even if I trust you, I also have to trust the authors of every library your app uses not to pwn my computer or use my camera or mic, scan my network, capture my screen, read the clipboard constantly, read and/or upload all my files, etc...

You have to do that with all of the software you run, and the operating system you run it on, anyway.


And that, I believe, is the true reasoning behind this move. As web apps drive traffic away from stores, and thus drive away revenues from such stores, limiting JS would be a way of forcing apps back to the stores.


> All the hate JS is getting is beyond stupid.

This is an emotional statement. Mind elaborating it with facts?


A 500kb-per-page limit would be a big middle finger to my company's web apps and most of my company's users, simply because those web apps each do a substantial amount of stuff (document generation, complex visual editors of business domain data, elaborate searching and editing functionality in huge tables of data), and the overwhelming majority of our users value constant iteration on new features (of which there is still a huge to-do list) over optimization.


> ...is that they aren’t going to roll something out to the broader web that is going to break a ton of sites. If they did, developers would riot and users would quickly move to another browser.

Yeah, like Firefox, Safari and... which other browser exactly?

With the near-monopoly state of the browser ecosystem, that particular argument I quoted above isn't very relevant these days.


Google itself could do a lot to solve this problem if it wanted to: Just deprioritize the search rankings of sites that pull in a lot of external javascript, or that do unnecessary client-side rendering (where the definition of "unnecessary" would obviously need some careful consideration).


Do we really want Google to editorialize in this way?


That would mean a huge chunk of modern websites would just disappear from search. As a techie I'd rejoice, but let's be realistic -- people go to search engines to get stuff done. They'll be very frustrated if they cannot.


I would rather like to see limits self-imposed by websites, similar to CSP.


I think CSP shows self regulation doesn't work in practice. At least not without incentives which don't currently exist. Therefore it's more practical to put control in the hands of the user agent.


Sounds good to me.

Although, that does not do much about small hot spots/loops.


Take a wild guess what would happen if this gets accepted.

That's right: the ad networks will hyper-optimize their script sizes and runtime footprint. They'll just become much better.


Google does an endless dance around how they should handle the ad blocking problem. Don't be fooled by this, the solution has been in their face for years.

Most websites are reached from Google search. If Google de-ranked slow, ad filled, and paywalled sites, the internet would fix itself overnight.

But they will never do this, because Google cares about protecting their own ads above everything else.


I disable JavaScript in browsers and have done so for many years, if it's hard or awkward to disable it in a specific browser then it instantly gets the flick and I substitute one that's more amenable to having its JS operation switched to 'off' mode.

Why do I bother going to this trouble when, these days, most of the web considers JavaScript 'on-mode' as the 'essential' default? Well, I've multiple reasons, the first of which is usability. Whenever I have to use a browser where I am unable to disable JavaScript (i.e. on machines that I don't own or control), I feel both frustrated and no longer in control of my browsing experience. Here are a few of my reasons:

1. Speed. Fundamentally, I find that with JavaScript 'on' the browser's usability suffers enormously; its response speed drops to the point where it's damn difficult or painful to use (essentially, the browser's ergonomics have taken an unacceptable nosedive). If you've ever browsed the web without JavaScript for any length of time then you'll greatly appreciate the truly enormous increase in rendering (display) speed of web pages when JS is turned 'off'. Moreover, the browser not only renders pages much more quickly but the rendering is also much smoother—gone are the pauses and jerky page-loading that so often plague JavaScript's operation.

1.1 Why users actually put up with such unacceptable response times I can only attribute to the fact that most have never used a browser with JavaScript disabled. This alone is an indictment of the web/development industry: for when websites downgrade users' browsing [usability] experiences for their own explicit benefit (and/or pecuniary interest) then they are effectively exploiting users. Essentially, users do not really benefit from the use of JavaScript, but websites do, and they do so mightily!

2. Security and Privacy are so much easier to enforce when a browser's JavaScript is disabled. Right, that's a sweeping statement but it's easy to test. Using your browser's default settings and without additional add-ons or plugins (with the exception of, say, a JavaScript on/off toggle add-on), go to security/vulnerability-testing sites such as Steve Gibson's (GRC's) ShieldsUP!! or the EFF's Panopticlick, and check your browser's privacy and security with and without JS. You'll be surprised. Moreover, many of the privacy-invading techniques used by websites to steal your personal info are killed stone-dead if JavaScript is disabled.

3. Neutering JavaScript works wonderfully as a first line of defence against ads and ad/user-tracking. Even without AdBlock or similar ad-blocking software, ads are essentially yesterday when JavaScript is disabled! Adding ad-blockers, etc. later only improves one's blocking experience. Make no mistake, JavaScript's main web function is to make it dead easy for websites, advertisers and Tech Giants to track you wherever you go across the web as well as to supply you with targeted advertising, etc.—everything else—all of JavaScript's other features are only of ancillary benefit (and prior to JS's introduction, the web had other alternatives).

Nowadays, I'm essentially out of touch with the latest ads as I never see any. …And what a truly wonderful condition that is.

4. Websites that Require JavaScript. When I encounter a website that absolutely requires JavaScript to function, so conditioned are my reflexes that I find I've backed out and off it without even having realized it. I can whizz through dozens and dozens of news items on Hacker News and easily bypass any sites that will not function without JS. I've never needed to worry, as on the Web there are always thousands of equivalent or alternative websites that are more 'cooperative' from which to choose.

5. In very rare instances when I must visit a site that requires JavaScript to function, I've a browser add-on that has an icon on the navigation toolbar which allows me to simply toggle JS on and off whenever required. Accidentally leaving JS on is almost impossible as the icon changes from green to red when off. Similar methodologies apply on my rooted smartphone: along with the absolute prerequisite of completely removing (deleting) Google's GApps, the 3rd-party browser I use has a feature to turn JavaScript quickly off.

I'm a heavy web user and have been so for decades, I often literally peruse thousands of web pages per day without any need for JavaScript whatsoever. I only add that I feel sorry for the many thousands of you who are welded on to addictive sites where JavaScript is necessary.

Tragically, JavaScript's unfortunate arrival on the web several decades ago was the beginning of the end of the old fast web as we once knew it, and if we are to ever reclaim the web for users—claw power back from the Tech Giants like Google, Facebook et al—then we will have to begin by severely curtailing JavaScript's power.

Limiting JavaScript as outlined in the article isn't anywhere near a satisfactory solution. For starters, can you imagine the fights and disagreements over how these various, essentially arbitrary, limits would be set?

Keep in mind that it is JavaScript that fuels the Tech Giants' presence on the web and thus they're the ones who are its 'true' pushers. Like drug peddlers, they've forced this horrible, unnecessary, pernicious JavaScript scripting 'kludge' onto us users so as to maximize their business models—that of maximizing their profits, and they've done so at the expense of us users. In a much more user-centric web environment, none of us users would ever need this JavaScript 'junk'.


> Essentially, users do not really benefit from the use of JavaScript, but websites do, and they do so mightily!

Bullshit. Lots of features that are essential in people's everyday use cases need JavaScript on the web.

> 4. Websites that Require JavaScript. When I encounter a website that absolutely requires JavaScript to function, so conditioned are my reflexes that I find I've backed out and off it without even having realized it. I can whizz through dozens and dozens of news items on Hacker News and easily bypass any sites that will not function without JS. I've never needed to worry, as on the Web there are always thousands of equivalent or alternative websites that are more 'cooperative' from which to choose.

Good for you, if your use is just browsing HN. As for most people out there who use it for work and personal reasons, I am glad JavaScript is there to provide features that are not possible without it.

> 5. In very rare instances when I must visit a site that requires JavaScript to function, I've a browser add-on that has an icon on the navigation toolbar which allows me to simply toggle JS on and off whenever required. Accidentally leaving JS on is almost impossible as the icon changes from green to red when off. Similar methodologies apply on my rooted smartphone: along with the absolute prerequisite of completely removing (deleting) Google's GApps, the 3rd-party browser I use has a feature to turn JavaScript quickly off

Guess what language that browser add-on you are using is written in.

> In a much more user-centric web environment, none of us users would ever need this JavaScript 'junk'.

In a user-centric environment, we put people's needs first. So having a language that empowers developers to put features that are useful to the user is the primary focus. If you want to create JS-less websites, you STILL can do so.

I understand that JavaScript has much to improve. But your hatred for it borders on the ideological.


> Guess what language that browser add-on you are using is written in.

While certainly curious, it is in no way hypocritical to use a hammer to smash a hammer factory.


> Lots of features that are essential in people's everyday use cases need JavaScript on the web.

I also disable JavaScript by default. And yet none of the websites that I enable JavaScript on have essential JavaScript features; they merely don't present the content in plain HTML.

> empowers developers to put features that are useful to the user is the primary focus

No, there is and always was a tension between usability and developers putting up new features. If you truly focus on user experience you definitely can't empower developers to invent features, you have to constrain them and force them to follow UX guidelines.


Do you never buy anything online? Or play an online game? Or use Google maps? I can think of lots of websites that I regularly use where JavaScript is important. I agree it is best to limit it, but it definitely has lots of uses.


With the exception of certain games, none of those other applications require JavaScript.

Form submissions and such have been part of HTML for a long time, and map viewing most certainly doesn't need JS. See this comment from 3 years ago:

https://news.ycombinator.com/item?id=10872194

Unfortunately the map images in the article there no longer work because Google has decided to "deprecate" that API, but there's absolutely nothing about serving what is essentially a large tiled image that requires the capability to run arbitrary code on the client. In fact, here's a tile URL I just found that still works: http://mt1.google.com/vt/lyrs=y&x=1325&y=3143&z=13

JavaScript is like Flash: useful or even essential for some things, but far too overused these days.


Sure, map viewing definitely doesn't need JavaScript, but Google Maps definitely does. I don't see how you could reasonably implement something like Google street view without JavaScript. Even just having the different locations on the map fade in and out of view as you zoom requires JavaScript, as far as I know. Personally, I don't want to have to use a stripped down version of maps, I want to be able to go into street view and see the locations, and smoothly see more detail as I zoom in.

Perhaps a better example is video calling. Definitely can't do that without JavaScript. My point is that disabling JavaScript entirely eliminates several classes of websites. I think it is better to have reasonable limits to JavaScript than make these types of websites impossible.


Why does this functionality have to be embedded in a website? I use google earth and skype to look at maps and make video calls. I don't feel like I'm missing out.


FWIW I generally take out my phone and use the Android Google Maps app, because it is considerably faster than waiting for the web version to load.


> And yet none of the websites that I enable JavaScript on have essential JavaScript features; they merely don't present the content in plain HTML.

You're right, that can be a significant problem. Often what I do is to turn JS on and let the page load then turn it off again. When I encounter Smart Alec sites that try to catch people like me out by refreshing JS every few seconds, I either toggle network access to the internet off or capture the text by various other means.

There's always a solution one way or the other.


> Guess what language that browser add-on you are using is written in.

The issue is not that the code that websites make your browser execute is written in a bad language, but that websites make your browser execute code.


I expected flak over the aforementioned comment, especially so on Hacker News, given that many of its readers earn their incomes from JavaScript. It's not JavaScript per se that's the main problem; rather, it's the uses—the undeniable abuses—to which it has been put by many unscrupulous web players. Essentially, the abuse of users by developers and powerful websites is primarily channeled through this tool; any other equivalent mechanism would have received equal criticism from me.

(Incidentally, I note you've not seen fit to address this aspect of my earlier comment nor the significant matter of JavaScript's propensity to highly degrade the speed of webpage rendering).

> In a user-centric environment, we put people's needs first. So having a language that empowers developers to put features that are useful to the user is the primary focus. If you want to create JS-less websites, you STILL can do so.

By using JavaScript do you honestly believe you are actually putting people first, or is this just developer/website-owner rhetoric? With few exceptions, what you are saying I consider mostly fanciful. We've seen sweeping statements such as 'put people's needs first' and 'the user is the primary focus' before; for decades similar lines were peddled by Microsoft until they became such a joke that it wasn't game to push them any longer (I needn't remind you that this BS came from the company that arguably did even more damage to the Web than JavaScript did).

The principal problem with JavaScript is that it has evolved into an extremely powerful tool—in fact it is now too powerful, and consequently the power it has unleashed lies very lopsidedly in the hands of developers and website owners (as well as unscrupulous hackers); the only time ordinary users ever get a look-in is when programmers permit users to access web features of which the website strictly approves. Users essentially have no control over JavaScript's ability to invade their privacy etc. (without resorting to third-party browser add-ons or turning off JavaScript altogether, and even that can be difficult these days, as some browsers have removed the switch that turns it off).

JavaScript has always been developed with the at-developers'-convenience philosophy in mind; this has been true ever since the days of LiveScript. This design philosophy never changed with Microsoft's later involvement, except to make matters very much worse with its non-standard, non-cooperative approach to everything in the IT world, its JScript and later .NET being typical examples of the problem. Now we have WebAssembly/Wasm and essentially the same rules apply: the W3C may think it runs the show, but it's Google, Apple, and Microsoft that effectively run it.

Had developers had users' ethics on their side, these problems could have been avoided early on. It would have been possible to separate JavaScript's multi-paradigm language/features into two separate languages. One language would only carry out essential glue-like functions (to use Andreessen's well-known quip) to connect it with HTML so as to perform basic transactions with web servers etc. The other would carry out all those other nefarious tasks so reviled by people like me, and users could block it at will without affecting the presentation of web pages. Naturally, nuking this script would be beneficial: trackers, etc. would be blocked and users would experience a substantial increase in speed as web pages would render much faster. (Right, I'm grossly oversimplifying these processes, but I'm not discussing the intricacies of languages here.)

Have you actually taken the time to survey typical pages these days? I'll bet not, for how could you honestly justify or defend them as they are? It's only possible if you're on the receiving end of those cents. (I'll demonstrate later in a separate reply with some of my own stats.).

> Bullshit. Lots of features that are essential in people's everyday use cases need JavaScript on the web.

In the present Web climate I agree that one does need JavaScript, especially if one's the type of person who has minimal attention span and cannot live without dissolves, pop-ups, ads, moving images and every other kind of conceivable visual distraction. Of course knowledgeable users also need sufficient cognitive faculties to enable them to dismiss away all concerns and worries they may have about them being continually monitored and their privacy violated. Tragically, many millions of other ordinary Web users are still not aware of these issues, nor are they aware of the harmful consequences.

That said, I fully accept there's a place for legitimate contextually-based animations, graphs and videos etc. to be displayed within web pages as well as the need for mechanisms to allow users to access payment systems etc. but strictly on the proviso that they are integral to the primary content of that page. Whilst these days JavaScript is often the main enabling technology that's used to deliver such content it doesn't necessarily have to be so as other ways still exist. In principle, I've no objection to using JavaScript in this context but I see no way of limiting its use to just this purpose.

I am sorry that you've missed the main thrust of my argument. What many website owners and developers simply do not realize, or just blatantly ignore, is how truly alienating it is for text-based junkies like me to be, say, well into reading the second paragraph of a story only to have a pop-up (or worse, an overlay) suddenly appear with the aim of having me join a mailing list or such. …And, as we already well know, that's just the beginning of the assault on us users when we visit websites. As far as I am concerned, such behavior is just not on.

The reason why JavaScript is so on-the-nose is that it's the first-line weapon websites and developers use in their assault on users' senses. As with anyone defending himself or herself from attack, one's first aim is to neutralize the enemy's main weaponry; it thus makes common sense to nuke JavaScript whenever possible.

As JavaScript has become so all pervasive, and with the rise of its even more insidious WebAssembly/Wasm derivative, it's easy to envision a not-too-distant Web where both have taken over—a Web where browsers will have access to very few or even no pages at all unless JavaScript is enabled. If this ever eventuates then I foresee a need for tailor-made JavaScript engines that would modify the way JavaScripts and Wasm work. 'Modified' JavaScript engines would have features that would allow their operational parameters to be reconfigured in quite powerful ways that would enable users to claw back control from websites, and they would do so in much more sophisticated ways than say NoScript or various ad blockers do now (in fact, these engines might even run their own user-defined add-ons).


The analogy of JavaScript to addictive drugs should be comical but instead I'm starting to find this "JavaScript derangement syndrome" stuff just goddamn tiresome.

If you're not visiting websites that rely on JS, you're not using the same internet as the vast majority of people who use the internet. Good for you, I guess. Good luck looking at a map in a web browser without JS, although I'm sure you'd never sully your computer by visiting a Google website.


That is part of why it pisses people off - that so much of what uses JavaScript is completely unnecessary and often a haphazard security risk.


You can argue the same about blocking ads. Yes, not having ads and JavaScript is a very different user experience. Very much not the same internet. But why would you want it any other way? Why go back to that ad-riddled, annoying, slow, insecure, manipulative web that the vast majority apparently use?


> ' I'm sure you'd never sully your computer by visiting a Google website.'

Absolutely not, as I have no need to do so: there's so much more on the Web other than Google, Facebook and Amazon. I can't even remember my usernames, let alone the passwords.

Clearly, what I do is irrelevant to everyone else (and no one else would care anyway). My major concern is that those who find themselves having to use Google or Facebook essentially have no other alternatives. In this way Big Tech has effectively monopolized the Web, and I consider that completely unacceptable. So should most other people.


Certainly it would seem that most holdovers no longer have objective arguments that would tip most people off JavaScript on the cost/benefit scale. I believe it's worth considering that most of the people left with that viewpoint may have a.. psychological need to keep JavaScript off. Very strong personal preference or otherwise. And that's fine!

I get the fatigue though. At this point we are pretty far out from this being a real bottom line issue; not since most businesses even stopped requiring Javascript be disabled. We have finally arrived at the promised land; SNI, Javascript, and greenfield browsers. Rejoice!


> Good luck looking at a map in a web browser without JS,

I don't care. I can use:

- Marble

- VikingGPS

- FoxtrotGPS

And so on.


But the whole point of JavaScript is that, instead of having to install programs, you can run them in the browser. Obviously pretty much every website can be replaced by a desktop application, but that isn't really practical for the vast majority of people, and I don't really see what problem you are solving.


> Websites that Require JavaScript. When I encounter a website that absolutely requires JavaScript to function, so conditioned are my reflexes that I find I've backed out and off it without even having realized it. I can whizz through dozens and dozens of news items on Hacker News and easily bypass any sites that will not function without JS. I've never needed to worry, as on the Web there are always thousands of equivalent or alternative websites that are more 'cooperative' from which to choose.

One of my side-projects is a solitaire based card game that I am writing a server for in Rust and a client for in JavaScript.

If I presented visitors with js disabled the opportunity to download a native client (yes, real native not just handing you a single-executable bundle of a browser and the web app), would you download it?

a) If I provided the binaries hosted on my server, served to you over TLS?

b) If I linked you to it in Windows Store / Mac App Store / an Ubuntu PPA on launchpad.net / Google Play Store / iOS App Store / F-Droid?

c) If I told you to install the Rust tool chain via https://rustup.rs/ and to run “cargo install” plus the name of my package that would be hosted on crates.io?

d) If I linked you to the build instructions in the wiki on GitHub that would tell you how to build it from source?

Please rank these from most likely to least likely that you would be willing to do if you were interested in my solitaire game.

And also, how would I best demonstrate the value to you of my solitaire game? Embedded video (plain video tag that doesn’t require js)? Screenshots and text? A link to the video on YouTube?


> If I presented visitors with js disabled the opportunity to download a native client (yes, real native not just handing you a single-executable bundle of a browser and the web app), would you download it?

Most likely yes.


I tend to agree on all your points. But when a site I need to browse does require JS, simply "turning it on" is, in my opinion, much too lax. You really need more granularity in managing javascript than the binary enable/disable feature available on browsers: namely javascript blocking extensions (noscript, umatrix, etc...). Being able to block 3rd party javascript is invaluable, for example.


Absolutely agree. Earlier I attempted to reply to another post wherein I covered this issue but my post was too long so it wasn't posted. FYI, here's the extract which specifically deals with this issue (mind you, it's not a practical solution yet, for as far as I know no one has yet written the code to do the job):

" <...> As JavaScript has become so all pervasive, and with the rise of its even more insidious WebAssembly/Wasm derivative, it's easy to envision a not-too-distant Web where both have taken over—a Web where browsers will have access to very few or even no pages at all unless JavaScript is enabled. If this ever eventuates then I foresee a need for tailor-made JavaScript engines that would modify the way JavaScripts and Wasm work. 'Modified' JavaScript engines would have features that would allow their operational parameters to be reconfigured in quite powerful ways that would enable users to claw back control from websites, and they would do so in much more sophisticated ways than say NoScript or various ad blockers do now (in fact, these engines might even run their own user-defined add-ons).

Users could then gain the upper hand over websites by essentially feeding back any information that would satisfy a website or trackers; depending on the circumstance data could be accurate, part-accurate, part-obfuscated, misinformation or all or part randomized and tailored for all or just specific websites [right, it needs to be very flexible]. For example, all machine and O/S parameters could be obfuscated or scrambled, misinformation supplied such as saying ads were being displayed or clicked on when neither was the case, personal information scrambled or obfuscated and trackers supplied with misleading and deceptive junk. Furthermore, the process could be fully automated to allow users a smooth and unhindered Web-viewing experience.

If you think these suggestions harsh or unfair then I should not have to remind you that this is effectively what thousands of commercial websites and especially Big Tech—Google, Facebook et al—are already doing with your personal data (remember Cambridge Analytica?). Essentially, most users don't have a clue about the extent of the personal data that's collected from them by these websites nor of its contents or how it is actually processed nor do they know to whom it's sold. Moreover, websites unfairly vie for both users' attention and personal data by using tactics which are unethical, overly-invasive, highly-obfuscated and deceptive.

Whilst many users have yet to realize it, they're already in an undeclared war with websites and developers and especially Big Tech. The power imbalance already favors large websites with large resources and ready access to considerable funding, it's comparatively easy for them to refine and escalate their invasive processes as and when is necessary. Moreover, these practices are made much easier for websites to implement by the fact that netizens have little or no effective legislation to protect them against unfair practices, as with respect to such matters governments have long gone AWOL. It's very clear the odds are heavily stacked against ordinary Web users. Thus, given these circumstances, it's not at all unreasonable to expect users to have to deploy obfuscation and deception as quite legitimate countermeasures to protect both themselves and their data. We users desperately need tools such as a modified JavaScript engine to effectively even the playing field.

If website owners consider themselves hard done by my proposals in these comments—in the sense that they would not make sufficient income if they didn't force users to accept abusive JavaScript scripts, invasive ads, privacy hacks and trackers, not to mention having to commit the unacceptable practice of dangling irresistible and addictive 'baubles, bangles and beads' in front of those who are easily distracted and or addicted—then I'd suggest that the Web would be much better off without them. Let them go broke, their absence would only be beneficial in that there would be less network congestion for everyone else after they've gone. <...>"


You missed one big advantage: Power consumption. Cool, silent, and the battery lasts a lot longer on my laptop.


Touché

Absolutely agree.


I agree with so much of this. I think the only real thing holding back a limited-JavaScript world is getting a majority of users to limit or disable JavaScript. I don't think it will happen, because so many businesses rely on the interactivity and tracking that JavaScript gives them. Try explaining to the lady down the street what disabling JavaScript does for her when the one website she needs to visit requires JS and can't be used, and see the outcome of that. If turning off JS does not impact a user's most visited sites then it is easy to do, but if not then it is a lost cause. I guess browsers enforcing limits is the next best option, but it'll see push back no matter what.


> I don't think it will happen <...>

I agree with you that it certainly won't happen in the current web climate. In an earlier reply I've made allowances for this - see my reply above to kirion25.


How do you value usability but then have every other website basically disabled? The net trade-off there surely isn't in the direction of usability, versus simply installing an ad blocker.

If I installed ad blockers on some publicly accessible computer, or a computer for multiple users, I think very few would complain about issues. If I disabled JS on the other hand...


> How do you value usability but then have every other website basically disabled?

Clearly, it depends on how one uses the web (i.e. what sites one frequents). In my case, well over 95% of the sites I visit work without JavaScript. The reason for this is I primarily visit sites that have text as their main content.


I'm just curious. By any chance are you a math trader on board game geek?



