Excellent checklist. Anyone who is directly accountable for operations but has only a non-voting advisory role in platform selection will have flashbacks reading this list.
It’s a zero-day only at the point when it’s been exploited without having been reported. “Zero” refers to the number of days since disclosure at the time the exploit occurred. If it’s patched before it’s exploited, it wouldn’t be considered a “zero-day exploit”.
I believe if, say, 10 days had passed since the report, it would be called a “10-day” exploit. But it’s also security research jargon that I’m not familiar with in a practical sense so I may be wrong.
That smiley face is very optimistic. I ticked several showstoppers, and it's still grinning. What a can-do trooper!
Also, when using this analysis platform to vet production architectures, the more terrifying a SaaS/PaaS option is, the more clearly the indicator "green lights" that plan.
I got 18% on GMail. I pay them, other people pay them, and they don't have much downtime. IMAP support hasn't changed in decades and pushes each email to your local machine. If they start to suck one day, I just point my MX records somewhere else and nobody will even know. All in all, one of the least scary services I use.
Most of the 18% comes from them not being bound to any contract. I am not sure I have ever signed a contract that binds the other party (I don't even think my offer to buy a home completely precluded the seller from backing out), so it doesn't especially worry me. Having a backup and a changeable address assuages any fears I have. (I feel the same way about cloud providers. If GCP nukes me, it will be an annoying day getting going on AWS, but hardly the end of the world.)
You just point the MX records of your custom domain to a different provider, you mean? The vast majority of Gmail users do not have custom domains, and directly give out @gmail.com addresses to other users and services.
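For those who do have a custom domain, the switch really is just DNS. A hypothetical zone snippet (domain and mail-host names are made up):

```
; example.com zone file (hypothetical provider hostnames).
; Moving to a new email provider means swapping these MX records;
; addresses like you@example.com keep working unchanged.
example.com.  3600  IN  MX  10  mx1.provider-a.example.
example.com.  3600  IN  MX  20  mx2.provider-a.example.
```

The preference numbers (10, 20) just order the servers; mail falls back to the higher number if the lower one is unreachable.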
The thing is, Google Workspaces base plan is only $6/user/mo. If you value your email (which, as annoying as email can be, you should, since it's often the key to all of your other online accounts), $6/mo is... intensely reasonable. (And you get other stuff with that too; it's not just email.) Unfortunately, we've been trained to expect that everything on the internet is free nowadays, so even that sounds like an expensive service.
But agreed, most people just have @gmail.com addresses, and thus no portability. Google of course knows this, and that's by design. I wish we had a popular email service that would register a domain for you as a part of the signup flow. That would be pretty cool.
You can host them at Apple if you have iCloud+, the cheapest tier of which is $0.99/mo.
MS charges you about half of what Gmail does (through resellers) if you don't want the Office programs included, and about the same if you do.
Zoho Mail is about $3 a month, and then there is that service I can't remember (just email?) that will let you host the entire domain for $10 a year.
I got 66% for Gmail, but a bunch of these did feel a little difficult to judge. Like "Bugs are not fixed." Is that any bugs? All bugs? The majority of bugs? Important bugs? Etc, etc. I went with no for "bugs are not fixed", but yes for "bugs are not acknowledged."
Honestly, the biggest potential avenue for being screwed by Gmail is being locked out of all your other accounts that use your Gmail address as a login credential. That's sort of covered by a few of the options, but I think it deserves a question of its own (e.g. "My access to other services is mediated by this service.")
I also drew a line between Gmail and GSuite. GSuite would obviously win back some points in the "I pay for it" category.
This seems to be quite applicable for services. If you took the questions and applied them to products, especially operating systems or programming languages, we'd all be in deep, deep trouble.
I use WikidPad[1] to store my notes, appointments, etc. I've been doing so for more than a decade.
I switched to Linux last year, only to find that WikidPad does not work correctly. There were breaking changes made to one of the libraries, so none of the dialog boxes work correctly.
After missing doctor's appointments (that I didn't know about, because they were in WikidPad), I decided that WikidPad was something I had to be able to use, and switched back to Windows 10.
I'm fortunate in that I'm not using source code, but rather a Windows Binary, so I should be good, effectively forever... or until Microsoft decides that only "apps" will be allowed.
Look at all the mess that happened when Python decided that we must all convert to Python 3 to satisfy a Unicode fetish, and made a ton of breaking changes along the way.
---
Also, I became interested in a C program written about 21 years ago[2], and it took the efforts of some kind souls here on HN to help me get it to run in a modern environment, because C itself has changed so much in the meantime.
I guess there should be a checkbox, or maybe even a slider, for "how much do you care about this service?" or "how much does this service impact your life?"; if HN dropped off the face of the earth tomorrow, I'd be unhappy, but life would go on, whereas for many people a gmail account ban would be catastrophic.
Hey NoScript peeps, (or other users without Javascript), I dig the way you roll. That's why this page is mostly static and doesn't generate the list dynamically. The only things you're missing are a progressive descent into darkness and a happy face who gets sicker and sicker as you go on. Oh, and there's a total at the bottom, but anyone who uses NoScript can surely count for themselves.
I ask this out of genuine curiosity - what is HN's obsession with disabling JavaScript? Are y’all browsing the internet like this? If so, how and why? Isn’t most of the modern web dependent on JavaScript?
Anytime I’ve shared a personal project on HN someone has commented that it doesn’t work with JavaScript disabled.
"The modern web" often depends on JS, but the World Wide Web was originally created as a system for sharing documents, and some people refuse to accept the slow change towards web-apps, especially in cases where it's unnecessary. For instance, news articles don't need JS.
JavaScript is fine for e.g. territorial.io - where JS isn't an implementation detail but critical to the core concept - but if it's unnecessary, then some people despise being forced to use it.
Disabling JS cuts down on render-blocking third-party code, tends to speed up page loads by removing the need to download scripts, has no degradation of the content 99% of the time, and is an extremely effective ad/tracker blocker. Overall, it's not hard to see why some people love it.
JS also breaks some web browsers, which pisses off anyone who likes those web browsers.
JS is often proprietary code running on the client's computer, or at least hard-to-verify open-source code that's running on the client's computer and creates unnecessary remote code-execution security holes.
I don't care personally, but I can see why some people do and I think "the modern web" is rather problematic and needs to be reformed.
I think JS can actually be a good thing overall (when executed properly), because it reduces the need to load content from the server and thus can make the user's app more resilient against inconsistent internet connections. Also, JS runs client-side which is better than relying on code that runs serverside, as a rule. (See "service as a software substitute".)
> the World Wide Web was originally created as a system for sharing documents
I agree with your post but every time someone says this about the early web I feel a strong stabbing sensation in the back of my mind. It's just wrong. The web was never just a document sharing system. In Tim Berners-Lee's original memo he talks about dynamic pages that are built on the fly, and 'special links' to external apps.
To quote the memo:
"Hypertext allows documents to be linked into "live" data so that every time the link is followed, the information is retrieved. If one sacrifices portability, it is possible so make following a link fire up a special application, so that diagnostic programs, for example, could be linked directly into the maintenance guide."[1]
Dynamic apps running in browsers weren't really a thing until the later 90s, but accessing apps via the Common Gateway Interface that rendered HTML on each request was a thing right from the start. That's where I started my career. The web has never been about just static documents. If nothing else, it wouldn't have been necessary if that's what Tim was proposing; we had networked access to document systems before the web was a thing.
But also... Tim wrote his memo 34 years ago. 'The modern web' in the sense of Web 2.0 has been around for at least 20 of those years. Even if the original web was about document sharing, there's no reason to believe that's still the case.
You aren't the only graybeard in the room. What you're describing isn't the original intent or usage of the web, but the very beginnings of the ongoing process of enshitification of the web writ large. Yes, from jump there were folks that refused to meet the media on its terms. Table based layouts, JavaScript googly eyes, "jello mold" layouts, flash-based fonts, etc. This trend has culminated in usability issues, page weights, and browser feature creep that can best be described as abusive bordering on sociopathic.
The original memo outlining the vision of the web was not describing "the original intent or usage of the web" but rather was "the very beginnings of the ongoing process of enshitification of the web writ large"?
This seems like a very extreme degree of revisionism.
Like, you could say, "the original vision was bad, because it included the seeds of enshitification". That seems plausible to me, from the perspective of people that just want static hypertext.
But it was clearly seeded in the original vision, not something that developed later.
I'm clearly not as familiar with Tim's writing as you. Please direct me to the relevant parts that call for walled gardens, grotesque disregard for screen readers, and all of the myriad layers of gratuitous protocol/language/feature cruft that's been troweled over the original medium in a largely successful attempt to convert it into a strip mall?
I'm not familiar with his writing at all. But conveniently, the person you replied to included a link to the full text they were quoting from. Did you miss that?
> JS runs client-side which is better than relying on code that runs serverside
For what purposes/criteria? You can't just say "better" in a vacuum and not have an argument in HN comments.
The server-side CPU is often a lot more efficient and less thermally throttled, for example. A lot of client-side stuff seems like double-billing for energy: serialize data on server, deserialize it on the client, do a transformation which has also just been sent over by the server, then and only then render data.
There are distinct advantages of client-side rendering, but it depends on usage patterns.
One case is when you're something like Reddit, Facebook or Twitter: the transformation logic (i.e. JSON-to-HTML) is cheap in bandwidth to send once and have the client re-use countless times. Bonus points: you can re-use the backend code for various frontends (web app, native iOS/Android/... app, Electron app).
The second case is anything that has a large number of one-time visitors: delivery of the transformation logic can be done using a cheap (and visitor-local) CDN, and you only need to scale the backend servers according to load - and you only need to pay for this task, as the computational cost is outsourced to the visitors.
The case where server-side rendering shines however is infrequently visited sites or ones that don't have external or internal interfaces, like a company's or personal blog or homepage: there, the effort required to maintain complex logic on both server and client is greater than just throwing up Wordpress.
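To make the first case concrete, a minimal sketch of the transformation logic that gets shipped once and reused: a tiny JSON-to-HTML renderer (the `{ author, body }` post shape is made up for illustration):

```javascript
// Minimal client-side "JSON -> HTML" transformation: the server ships
// plain data, and this small function (downloaded once, cached) renders
// each item. Hypothetical post shape: { author, body }.
function escapeHtml(s) {
  return s.replace(/[&<>"']/g, (c) => ({
    "&": "&amp;", "<": "&lt;", ">": "&gt;", '"': "&quot;", "'": "&#39;",
  }[c]));
}

function renderPosts(posts) {
  // Re-used for every page of results; only the JSON crosses the wire.
  return posts
    .map((p) => `<article><b>${escapeHtml(p.author)}</b><p>${escapeHtml(p.body)}</p></article>`)
    .join("\n");
}
```

The same backend JSON endpoint can then feed this web renderer and any native app alike.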
You're listing the benefits for the website, and you're right -- those benefits are why sites do this.
But what it's really doing is lightening the burden on the server by increasing the burden on the client. There is little benefit to your readership. And allowing client-side scripts also decreases security and privacy.
>Also, JS runs client-side which is better than relying on code that runs serverside, as a rule.
This is not true; server-side code is a fixed, known environment, so it is strictly better for any critical code, and especially anything to do with security or authentication.
Do you really want your super important and critical stuff to be handled by code that can be examined and modified on the fly with no damns given with the server completely unaware?
Not to mention server-side code by its nature doesn't execute client-side, which means less end user system resources used and thus less electricity consumed by and billed to the end user.
> Do you really want your super important and critical stuff to be handled by code that can be examined and modified on the fly with no damns given with the server completely unaware?
Is this supposed to be an obvious "no"? To me it's a pretty resounding yes.
Personally, I'd like servers to do a good job of returning well-structured data quickly, and to be able to manipulate that however I want on the client side.
Anything that is "super important and critical" will need to be validated on the server no matter what. But by pre-validating things in the browser, the user can have a more intuitive, seamless experience. Is the username you're typing already taken? No need to press the submit button, then scroll around trying to find the correct error message, we can give you feedback while you type.
You are right that there are tradeoffs, and I agree that client resource usage is under-analyzed, but sensible use of client-side code can make things like forms, visualisations, tables, etc much easier to use.
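A rough sketch of the client-side half (names are illustrative; the server must still re-run equivalent checks, and "already taken" needs a server round-trip regardless):

```javascript
// Client-side pre-validation for instant feedback while typing.
// Anything here can be bypassed, so it's purely a UX nicety:
// the server re-validates on submit no matter what.
function usernameProblems(name) {
  const problems = [];
  if (name.length < 3) problems.push("too short (min 3 chars)");
  if (name.length > 20) problems.push("too long (max 20 chars)");
  if (!/^[a-z0-9_]*$/i.test(name)) problems.push("letters, digits and _ only");
  return problems; // empty array = looks fine so far
}
```

Wire that to the input's `input` event and render the list next to the field, and the user never has to hunt for an error message after submitting.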
> Are y’all browsing the internet like this? If so how and why?
Yes. All JS turned off by default unless it's on a whitelist. Takes a few days to get the whitelist set up, but it's fairly straightforward.
Between this and uBlock, it makes the web a lot safer and less resource-intensive to browse. For example, people frequently cite Chrome as being faster than Firefox, but with my setup that hasn't been the case in a very long time.
I've also generally found a strong correlation between websites that don't work or work badly without JS and them being websites I don't want to frequent. And when the website doesn't work without enabling most of the JS, including all the ad and tracking stuff? That's a good sign that I probably should never whitelist that website.
There's also just the matter of principle. If a website works just fine for my needs with most or all of the JS turned off, it wasn't really necessary in the first place was it?
I use and recommend the same approach, and add the following trick: some sites will use CSS to hide content until JS runs and twiddles some attribute to un-hide everything. For those sites, disabling CSS will often make the content readable without the need to enable JS. This bookmarklet disables CSS:
javascript:(function(){ var i,x;for(i=0;x=document.styleSheets[i];++i)x.disabled=true;})();
This doesn't work for anything that runs a `<div id="root" style="display:none">` style hider, and in most cases it messes up the layout completely because it renders all images without explicit height/width attribute at full native resolution.
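For the inline-style case, a small companion helper can clear those attributes instead (a rough sketch; it will still mangle layouts that rely on hidden scaffolding):

```javascript
// Clears inline display:none / visibility:hidden on the given elements.
// Accepts any iterable of element-like objects with a .style property,
// so in a bookmarklet you'd call it as:
//   unhideInline(document.querySelectorAll('[style]'));
function unhideInline(els) {
  let changed = 0;
  for (const el of els) {
    if (el.style.display === "none") { el.style.display = ""; changed++; }
    if (el.style.visibility === "hidden") { el.style.visibility = ""; changed++; }
  }
  return changed; // how many style properties were cleared
}
```

This won't help when the hiding happens in a stylesheet class, which is where the CSS-disabling bookmarklet comes in; between the two tricks, most JS-gated text becomes readable.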
NoScript and uBlock Origin are what I use. My two must-have extensions for any browser I regularly use.
With NoScript, I started off with everything blocked, and just stuck the extension icon into the toolbar. Whenever a site wouldn't load properly, I'd temporarily enable the JS just for that single domain, and see if that got it to work. If it didn't, I would usually poke about and see what other domains need to be unblocked. It's pretty common that you might need to unblock some CDN or a 'static' subdomain to get the site to load. If it's a website I expect to come back to, I will then switch those exemptions to permanent, but otherwise I try to avoid leaving a lot of cruft in the whitelist.
Sites include the page origin, and any other sites and domains from which requests are made. Many common advertising and tracking domains are blacklisted by default.
Rules may be written specific to a site or as default rules.
You leave Javascript enabled, open 15 tabs, and in all likelihood 2 or 3 of them are discreetly running cryptominers, another 2 or 3 are using the timed visited links attack to sniff your browser history [1], another 4 or 5 are installing permacookies, and Google and Meta and Twitter are now aware of 12 out of the 15 tabs you opened because those pages have a Google analytics or AdSense JS snippet or a like/retweet button somewhere on the page.
Javascript is a security and privacy nightmare. It's frankly absurd to me that we default to Javascript switched on everywhere for everything. It feels like chmodding your entire home directory to 777 on a shared system.
JS is also an accessibility nightmare, but unfortunately turning it off doesn't fix that.
Every normal person uses it daily, yes, but "without issue" is less obvious; that they don't notice being stalked or having their battery drain faster doesn't make it not a problem.
There's a ton of issues here; it's just that most people either don't know about them or don't realize the full implications. And some people just don't care because they feel powerless: "Yes, I allow these companies to track me, because what else can I do if I want to use my devices like all other people do?"
0) Not seeing ads, nag screens, gratuitous requests for sending notifications, etc, and often even avoiding paywalls.
Of course there are less drastic ways to achieve all this. With JavaScript off, many things on the internet look broken, so it's pleasant that certain things become manifestly un-broken.
For example, analytics scripts don’t necessarily need to be enabled even if other scripts are. NoScript in particular allows such options.
Not only analytics but advertising and general “tracking” stuff. There are many reasons, and any individual may have one or more that inform the decision.
If someone “offers advice” that a website “doesn’t work with JavaScript disabled”, consider that they are suggesting you try implementing progressive enhancement (https://en.wikipedia.org/wiki/Progressive_enhancement), as is done on the linked page. Basically: get the page working with HTML only, then add CSS, then add JS.
> Isn’t most of the modern web dependent on JavaScript?
Way more than justified, sadly. Web pages are supposed to serve documents which can be read, there is zero need for client-side scripting in that case. And that's the predominant use case.
Yes, javascript is useful for the occasional site that needs client-side interactivity. Games that run on the browser are a good use case. Maybe the only one.
But for documents? No, absolutely not needed.
As to why object to it? Code running on my computer being served from an untrusted third party is always a huge security risk. I don't want to run your code, I don't trust some random website admin. Yes, browsers do their best at containing the malware but it will never be completely perfect. The only perfect solution is to simply never run any code being served off some random website. Which means disabling javascript as much as possible.
>Are y’all browsing the internet like this? If so how and why? Isn’t most of the modern web dependent on JavaScript?
It's not all or nothing. I enable JS for the bank, webmail, etc websites I visit regularly and trust. There's no need to have it enabled on the rest of the shit that I only visit when it gets linked on HN.
Yes, I'm browsing with JavaScript disabled by default. It cuts out an incredible amount of crap. With JavaScript, web sites (especially "news" sites) are unusably slow due to all the crap advertising pushing GB of crap JavaScript at you.
Sturgeon's law is understated. 99.987% of anything in an iframe is crap.
Make HTTP request(s) outside browser then view with HTML reader. Many other ways to do it. Depends on personal preference.
Why: Because it's more flexible and efficient than using a browser. For example: 1. send multiple HTTP requests in a single TCP connection; 2. filter the response body, e.g., using UNIX utilities; 3. it's both fast and reliable. There is no waiting.
"Isn't most of the web dependent on JavaScript?"
For ads and tracking, yes. For displaying text, generally no. In the latter case, some sites may use Javascript to request text (JSON) from some other path or domain. In these cases, one can send the requests to the other path or domain without using JS. If the user prefers only text content, without graphics, this is especially convenient.
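To illustrate that last point: once you've fetched the JSON directly, keeping only the text is trivial. A sketch in Node, with a hypothetical endpoint and field names (real sites vary, so inspect the actual response first):

```javascript
// Pull the readable text out of a hypothetical article-API response.
// The { title, body } shape is made up; check the site's real JSON
// (e.g. via the browser's network tab or curl) to find the fields.
function extractText(json) {
  const doc = JSON.parse(json);
  return `${doc.title}\n\n${doc.body}`;
}

// Fetching without a browser might look like (endpoint illustrative):
//   const res = await fetch("https://news.example/api/article/123");
//   console.log(extractText(await res.text()));
```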
> If so how and why? Isn’t most of the modern web dependent on JavaScript?
I expect a webpage to be a webpage, not an "app". I always think it's funny that many pages will print the message "This app requires JavaScript to run" when they fail to work without JS. "Apps", I feel, should be executables I run on my machine after installation. I mean, that's bad enough -- how do I know a video game isn't scouring my system for all my personal data to sell to advertisers as it admitted in the EULA I didn't read? But now webpages can be fly-by-night Turing complete brain surgeries on my computer? No thanks.
Gee, thanks W3C, for making battery life and other things queryable from JS, queries which are at best useless and at worst tools to track and datamine me. (By the way, all the ads, sometimes also malware, loaded via JS sure eat into that battery life. Yay!) Let me whole-heartedly subscribe to this Molochian acceleration toward a value-extractive singularity so that I can visit a webpage! Maybe I'll also get to enjoy some shitty animations or standard workflow-breaking behavior (e.g., pagination/right-click context menu hijacking) while I'm at it, to get in the way of the content I wanna get. :D
Try it sometime! Turning off JavaScript makes web browsing noticeably faster and less memory-intensive. You may be surprised how much still works fine with it off. Software like NoScript lets you allow it temporarily if something really does need it.
> what is HNs obsession with disabling JavaScript?
It's a defensive measure to help prevent tracking and other attacks.
> Are y’all browsing the internet like this?
I am.
> Isn’t most of the modern web dependent on JavaScript?
A well-designed webpage degrades gracefully and is still usable without JS. There are plenty of badly-designed websites that don't do this, of course. I just don't go back to those.
I disable js in safari (which is my main browser) on ios. If there is something that I really want to see that requires js, I switch to brave which has js enabled. But most of the time if a site requires js, I just leave the site. I now find browsing the web with js enabled to be intolerable.
At one time the concept of running arbitrary unfamiliar code without so much as glancing at it was recognized as the horror it is.
If I'm visiting your webpage it is because there is information as HTML that I wish to read. Why do you need to issue commands to my computer? That's invasive and unnecessary and certainly not something I'm enthusiastically consenting to.
Many JavaScript payloads are also huge. Millions of humans do not have a reliable and fast cable or fiber ISP connection.
I can't speak for anyone else besides myself, so I can't answer the question "what is HNs obsession".
But I can share my thoughts on NoScript:
An ad-blocker acts as a blacklist of sites/domains which are not allowed to run JavaScript in your browser. So basically everything is allowed by default - and the plugin downloads a list of known sites/domains that serve ads, track you, serve malware etc. That list is curated by someone else - and with the goal of disabling as many ads as possible while breaking as little site functionality as possible.
NoScript allows you to manage a whitelist of sites/domains which are allowed to run JavaScript in your browser. So by default everything is forbidden - and only sites/domains explicitly added to that list can run JS. And you have to curate that list yourself - so it's up to you to add sites/domains to those you want to trust permanently/temporarily. This is more involved, requires more effort, and many sites will break - losing legit features, sometimes even breaking static content.
So, there are some parallels in there - one is a general blacklist that's remotely managed for you - and the other is a whitelist that you have to manage on your own.
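The contrast boils down to the default decision; roughly (a toy sketch, not NoScript's or any ad-blocker's actual implementation):

```javascript
// Blacklist (ad-blocker style): allow by default, deny listed domains.
// Whitelist (NoScript style): deny by default, allow listed domains.
function scriptAllowed(domain, mode, list) {
  const listed = list.includes(domain);
  return mode === "blacklist" ? !listed : listed;
}
```

An unknown tracker gets through the blacklist until its curators catch up; the whitelist blocks it from day one, at the cost of you doing the curating.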
You get more control and more security - at the cost of increased hassle, as many sites will just be broken out of the gate, and some will still be broken even after you (temporarily) allow first-party scripts and a CDN or two. Some less recommendable sites need JS from 20+ domains (some of which dynamically load in even more domains) to work, while other sites work perfectly fine with just first-party scripts allowed and nothing else. For me that's interesting to know.
But you also get to see and learn a lot about what's going on under the hood, which might be interesting too, if you work in the field.
It's up to everyone to decide for themselves, whether that's worth the hassle - for most people it's probably not. But if you are technically minded and have already gained a little bit of experience with NoScript - the hassle isn't actually that great.
All pages you use regularly will be permanently whitelisted - many other pages can deliver you their static content fine even without JS - lots of pages are pretty easy to whitelist temporarily in just a few clicks - and the rest are often best to stay away from anyway.
I've been running like this ever since Javascript started to take over from Flash, circa 2010 or so. RAM usage is decreased, CPU usage is decreased, UI spam is basically gone, it often disables hostile page formatting, it lets me see which websites are slapped together like a five year old played with LEGOs and are thus security nightmares, and it generally just makes things safer and faster.
> what is HNs obsession with disabling JavaScript?
Javascript is an inherent security vulnerability because it allows other entities to execute arbitrary code on your computer system. Disabling it is basic prudence at this point.
> Are y’all browsing the internet like this?
Yes. Every system I use online has Javascript disabled by default for all sites and a curated whitelist.
> If so how and why?
Why is already answered above. How is via Firefox + NoScript + uBlock Origin. This allows me to control which domains are allowed to execute Javascript with a default-deny policy and blanket disable XSS.
> Isn’t most of the modern web dependent on JavaScript?
Yes, but most of it renders text just fine without it, and the ones that don't are usually sites I don't want to be on anyway. Most of the modern web is also garbage that is actively harmful to the security of your system and to your psyche.
Javascript is an attack vector, and a way for some ads to slip past the ad blocker.
While some websites do need JS and therefore get it selectively enabled (if I trust them), if all I want to do is read some text then I generally have a better experience without JS.
>what is HNs obsession with disabling JavaScript? Are y’all browsing the internet like this? If so how and why? Isn’t most of the modern web dependent on JavaScript?
You know how most security vulnerabilities are some form or another of Remote Code Execution, right?
Well, now consider what JavaScript usually is: Code from a Remote server that Executes on its own locally. The remote server might not have anything to do with the website you intended to visit, too.
The literal nature of JavaScript is a security threat and liability.
Yes, I do this too, but my use case is slightly different from other commenters', so let me share it. It helps me prioritize the content I'm really interested in, by making me put in the effort to open it in a new private window. I fall for clickbait and click on links, and when I find a page isn't visible with JS disabled, I mostly close it unless I really want to see it. So it acts as a first-level filter.
Personally I selectively disable scripts because it's not any 3rd party's business to know which sites I visit on my private device. And if a site works without any JavaScript, that's the ideal scenario: I don't have to hunt through the scripts to see which ones are actually required for the site to work.
I'm not, no. 15 years ago I would NoScript for security reasons. But I visit far fewer shady sites now, and times have changed with browser security. Never did I NoScript for philosophical reasons.