> insofar as it usually allows Javascript on the local site by default
NoScript can do this too BTW. The web is almost unusable without that setting enabled.
The drawbacks you identified are exactly the same on NoScript so they should not be used as a reason to pick one over the other.
It takes a LONG time to build up a good list of whitelisted sites; now that mine is more or less usable I don't intend to switch - especially since I disabled the "show webpage with ads" thing in NoScript, so I don't have the problem described in the article.
One advantage of uMatrix: You can filter based on host-destination pairs.
That is, in NoScript you can only filter based on the destination. For example, if you allow one site to run scripts from cloudfront.net, every site can. Effectively, the rules are:
DENY * *
ALLOW * cloudfront.net
In uMatrix, you can write rules based on host-destination pairs, permitting scripts from, e.g., cloudfront.net to run on your bank's website but not on any others:
DENY * *
ALLOW mybank.com cloudfront.net
Finally, you can filter by host & destination & function:
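For example, in the same informal notation as above with a request type added (this is the idea, not uMatrix's literal rule syntax):
DENY * * *
ALLOW mybank.com cloudfront.net script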
Google, Bing, Gmail, Yahoo Mail, Facebook, Twitter, Reddit, HN, Slashdot, and Github all work with Javascript disabled. If a site does not (e.g., Office 365, Slack, Instagram), that tells me something about the competence of the developers they are willing to hire.
It could just serve the chat up to that point as plain text. Yes, you'd have to manually refresh to see updates, etc., but if you are blocking JS, you should expect reduced functionality from web apps.
It's the outdated notion that "the web is still a collection of hypertext documents". Yet the reality is that JS webapps have been the standard for some time now, and won't be going away any time soon.
I agree the trend is toward thick Javascript apps, but I call that the death of the web and a wholesale regression to client/server architecture where data is entombed and only usable by a single piece of software that you don't maintain.
That's a pretty harrowing view. I see the movement of logic from the server to the client as liberating, personally, as it gives me more control over it.
Javascript is open source by its very nature. So if I want to tweak how Slack works, for instance, I can do so via extensions or by writing code myself. In the case of completely client-side apps, I can even back them up for my own use, as I've done with a regex testing tool.
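To give a sense of what I mean, a few lines in a Greasemonkey/Tampermonkey-style userscript are enough to rewrite part of a page you visit (the selector below is made up purely for illustration):

  // ==UserScript==
  // @name   Tweak a web app's client-side behavior
  // @match  https://app.slack.com/*
  // ==/UserScript==
  // Hypothetical selector - the point is only that the client code runs on my
  // machine, so I can inspect it and layer my own changes on top of it.
  document.querySelectorAll('.p-channel_sidebar__banner').forEach(el => el.remove());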
You've referred to it as a regression, but take Twitter as an example. Would it make sense to have the server generate the HTML for each and every request, and send a whole new page whenever you click a link? It's actually much faster to have the client send a small AJAX request (give me tweet #3242565), have the server respond with just that data, and have the client update the page with it.
Twitter on its face is pretty simple. There are only a few page templates it needs to know ahead of time (timeline, individual tweet). So by moving to a data-passing model you actually send far less data. The "javascript payload" being too large can be mitigated by rendering the first view on the server, as React supports.
In the case of Twitter, you're still dependent on the server for info either way. But this lets you conserve data, reduces server load, and yields a faster client interaction.
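A rough sketch of that exchange (the endpoint and JSON shape here are invented for illustration, not Twitter's actual API):

  // Ask the server for one tweet's data instead of a whole new page.
  async function showTweet(id) {
    const resp = await fetch('/api/tweets/' + id);  // small request: "give me tweet #id"
    const tweet = await resp.json();                // small response, e.g. { author, text }
    // Fill in the existing client-side template with the returned data.
    document.querySelector('#tweet .author').textContent = tweet.author;
    document.querySelector('#tweet .text').textContent = tweet.text;
  }
  showTweet(3242565);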
As I said, the web isn't static documents anymore. We've moved beyond that and it's actually pretty great. HN's overall position on this subject disappoints me for that reason.
> The web is almost unusable without that setting enabled.
The web (i.e., the web of hypertext documents linked together) is perfectly usable with JavaScript disabled; what doesn't work are all the single-page apps which hijack the web's infrastructure in order to break it. This is somewhat like a medicine which does not interfere with the operation of human cells but which disrupts the replication of viruses.