“Computer security 80% solved if we deprecate technology shown in this graphic” (twitter.com/matthew_d_green)
358 points by mariuz 9 months ago | 416 comments



As long as a platform is capable and powerful enough to do many things, there will be malware for it. The reason most (consumer-facing, at least) malware doesn't target Linux is that its desktop market share is around 3%. It's far more effective to target Windows on the desktop, since you can reach way more users that way.

The only other alternative is turning your computer into a glorified phone (a.k.a. a locked-down media consumption device) where everything is nicely sandboxed and nothing has any kind of permission to do "bad" things. (Except tracking. Because guess what, the company who makes the OS also sells ads.)


Popularity was the canned cope for why Windows 95 through XP were riddled with so much malware. But then Microsoft started taking security more seriously with Vista and onwards. The situation didn't turn into sunshine and roses, but it did improve dramatically. It turns out that popularity wasn't the problem, the problem was the insecure nature of the software. There is of course still a lot of room for improvement.


As opposed to the Linux security best practices of curl | bash? I have no choice but to set up my computer to run untrusted code, on a CPU which itself might be spying on me -- I don't feel like my environment is inherently more secure than Windows at all. Just less popular.


> As opposed to the Linux security best practices of curl | bash?

Just because some people like to ask you to install their software that way doesn't make it "Linux security best practices" and it doesn't mean you need to follow those directions.

You can review whatever you're running, and you should if you want to install that way but feel it's insecure. At a minimum, download the script to an actual file that you keep around for a while and run it from there, so if something weird does seem to be happening you can at least see what the script was attempting.
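Something like this, with example.com standing in for wherever the instructions actually point:

  # Fetch to a file instead of piping straight into a shell
  curl -fsSLo install.sh https://example.com/install.sh
  less install.sh    # skim what it's actually going to do
  bash install.sh    # run it only once you're satisfied, and keep the file around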

Or, just refuse to install software that way. There's almost always a different way, and that's just provided for convenience. If people are opting for the unsafe method because it's convenient, I don't think that says as much about the OS as it does the people using it.


I don't disagree with what you're saying, but:

> You can review whatever you're running

How realistic is this for regular users? And even power users, in some cases. Let's say you download the install script. It's either hundreds of lines or it in turn downloads and runs some blob. Are you comfortable asserting your review is enough?

Is this truly so different to clicking on some random Windows installer?

If the same kind of Windows non-power users start running Linux and it becomes a really widespread desktop OS, would the situation be particularly different?


Regular users on linux shouldn't be downloading software through their web browser at all; that's a Windowsism. Regular users on linux should be using their package manager to install new software. Say what you want about Debian's volunteers, but they're a hell of a lot more trustworthy than the average windows software download website.


Shouldn't perhaps (with caveats, and therein lies the rub), but is there a reason to believe that they won't?

I'm torn on this. On the one hand, yes, a "regular" user should be using a distro that has a wide array of natively packaged software, and relying on that as much as possible. But not all software is distributed this way.

And many "regular" users will be coming from a Windows background, meaning they're not going to recognize the fact that the site they found when googling for "Install Spotify on Ubuntu" that tells them to open a command prompt and paste this command or download this .deb file is actually malicious.

In practice, they're susceptible to the same kinds of attacks they would be on Windows.


That blame still appropriately lies with maladaptive behaviors learned from Windows. The only way to completely stop users being susceptible to the "attack" of them phrasing their desires as web searches and then blindly following whatever malicious instructions come up is to fully remove administrator privileges and lock them out of "their" computers. But doing this at the level of the OS producer is utterly at odds with the foundation of a free and open society.

The incremental way to solve this problem is through various rules based around users engaging with details of the OS. One very simple one of these is "only install software through the system package manager". If users violate those rules, short of the above "solution", there is literally nothing that can be done to help them.


Yeah this conversation is borderline philosophical. What does "secure software" mean? As a software engineer I've always thought about secure software as software that does not have bugs that can be exploited by non-authorized users. Be it privilege execution, code injection, remote code execution etc.

As an end-user, I choose to use Linux because it does not stand between me and my computer. I am the master of the machine. I tell it what to do, and it obeys. That is the relationship I want to have with a piece of tangible property that I paid money for.

So if I do something unsafe, even through ignorance or naivety, I still see that as being my fault. Not the software's. In other words, the software was behaving as expected. There were no bugs. It did what the authorized user told it to do.

But I can see the point of view that secure software could also mean software that makes it difficult for the authorized user to do dangerous things. Especially in an organization setting where the user is not actually the owner of the machine, but is using company equipment and software.


> That blame still appropriately lies with maladaptive behaviors learned from Windows.

As a Linux nerd who started tinkering around 2001, I’m having a hard time with this framing. Being a Linux user around that era involved regularly compiling code that I practically couldn’t review. While the steps involved were different, the behaviors associated with installing software were not inherently more secure than Windows at the time.

The main differences were: 1) I was more technically savvy and thus not as likely to install something obviously dangerous and 2) One just wasn’t very likely to run into a repo impacted by supply chain compromise or somesuch. Those same behaviors today are pretty risky in comparison, leading to threads like this one.

I’d argue that it’s less about Windows being a source of maladaptive behavior, and more about Windows attracting users who don’t know better than to behave the way they do. If those same low-tech users were somehow using Linux at the time, I don’t think they’d have learned anything more adaptive, and/or Linux would have been a much larger target, changing the threat landscape entirely.

But to go a step further, I guess the question that comes up for me is: why does it matter if it’s the fault of Windows? The typical “OS war” rhetoric never made much sense to me.

To me, the point of running Linux or really any desktop OS is to allow me to go beyond walled gardens.

My brother is a non-technical artist/creator type. If he relied only on the system package manager, he might as well use an iPad.

What you’re framing here as “violating rules” is really just “using the computer for its intended purpose”.

Following those rules may fix the security problem, in the same way that never leaving my apartment will probably protect me from most seasonal illnesses.

We need better ways to universally indicate trust in distributed software beyond system package managers.


> The main differences were: 1) I was more technically savvy and thus not as likely to install something obviously dangerous and 2) One just wasn’t very likely to run into a repo impacted by supply chain compromise or somesuch. Those same behaviors today are pretty risky in comparison, leading to threads like this one.

Yes, and I would add 3) there was no meaningful malware made for Linux back then, because it wasn't a target for malware authors. Otherwise you (and I!) would have been hit. I didn't understand half the stuff I installed on my Linux box back then, and many install instructions were positively arcane.

> To me, the point of running Linux or really any desktop OS is to allow me to go beyond walled gardens. My brother is a non-technical artist/creator type. If he relied only on the system package manager, he might as well use an iPad.

Fully agreed! Furthermore, anyone using Linux back then would have found your opinion entirely uncontroversial. The point of Linux back then was freedom to do whatever. Gardens -- even non-walled gardens -- were not the point.


Sure, if you turned the 90's/2000's herd of Windows users loose on Linux, you'd likely run into many of the same issues. But that's still modulo the needed software being available in a package manager, or at least carrying the higher barrier to entry of building from substantive source code, rather than users resorting to sketchy warez or "legitimate" download sites that bundled in less-severe malware.

But the real condemnation of Windows isn't just an "OS war", but rather the fight is about the virtue of understanding the software in front of you. This is critical to security - not in the small (the idea that an individual can audit every line of code they run is a fallacious straw man), but rather on the large scale. Having published source code allows the building of a distributed consensus that something can be trusted, not beholden to the existing power structure, whereas without that you're stuck trusting the word of a single company and/or their auditors.

Like today we see malware continually being bundled with proprietary software, so often that it's considered banal and just how things are (eg web surveillance). Yet in Libre land when something like this happens it's still seen as an exceptional occurrence that needs to be stomped down hard. That directly follows from the underlying attitudes of unilateral "trust us" versus community consensus.

> We need better ways to universally indicate trust in distributed software beyond system package managers.

I'd love to hear you expound on this. The only two solutions I can see are trusting and sandboxing. Sandboxing and capability security were a radical pie-in-the-sky idea two decades ago, but now we've got two extremely popular implementations - web JavaScript and Android. So it feels a lot less like a lofty perfect solution, despite the current failings (both of those implementations have shirked being secure against many types of attacks!).

Is there some third way, especially that isn't just a combination of the two basic existing approaches? I'd love to hear one!


What about things that are not in their package manager, like most games?

"Only download through this walled garden [Steam, GOG Galaxy, etc]"? So walled gardens are the answer?


Gardens are the solution, but people shouldn't be locked into any garden against their will. Users should be free to choose the garden they prefer any time they wish, or to start their own garden and invite others to visit it.

I choose the F-droid garden and the OpenSUSE garden. Other people may prefer other gardens, and they should be free to choose the ones they prefer as I am free to choose mine.

When people criticize walled gardens, it's because the wall is like the Berlin Wall; a wall designed to keep people in against their will.


> When people criticize walled gardens, it's because the wall is like the Berlin Wall; a wall designed to keep people in against their will.

Fair enough. You are right there.

But in essence, it's not that Linux is "safer" than Windows against malware. It's that it's a nerdier culture with different practices that don't translate well to the mainstream. Like user kbenson above who suggested "reviewing the installer"... I hope we all agree that's ridiculous, right?


That is not what I said; there was a conditional on that sentence. The real point I was making was the last paragraph of what I posted. Don't use that method if you care about security; it's never, ever been the sole option in any case I've seen which wasn't meant to be nefarious (at a minimum, there should be directions to manually do step by step what the script automates).

I think everyone jumped on the "review what you run" part because they weren't paying attention to what I actually said and it looked similar to other arguments; that has less to do with what I said and more to do with people wanting to have that argument yet again.

The bottom line is that Linux is no different than Windows in this respect (look at the Deno install directions if you want to see the PowerShell equivalent of curl piped to bash); this is more a matter of developer communities being okay with this method and promoting it regardless of OS. In that respect, saying "Linux security best practices of curl | bash" was just inflammatory and wrong, and deserved to be called out as such. It's not only a Linux thing, and it's nothing like a best practice (it's a convenience method that trades away security), so the statement is just plain wrong.


Ok, I'll readily admit I misunderstood that part of your comment then. I stand corrected.

So let me rephrase what I think is the key point here:

Linux is only "safer" than Windows because it has fewer users and they tend to be more technically minded.

However, were Linux to somehow become as mainstream a desktop OS as Windows, two things would happen:

- The userbase would become less technically minded and security aware. I don't want to call them "stupid", though; they likely know other stuff instead. I can't drive a car, for example; am I stupid?

- It would become a juicier target for malware creators, and therefore malware would be as widespread as in Windows-land.

There's nothing magical in Linux that would protect a large and careless enough userbase.


I would say Linux and Windows are likely roughly equivalent in almost all safety concerns for regular users (if we normalize for how good a target they are, which makes Windows get targeted more in absolute terms, and ignore the technical merits of the kernel and access control/ACL systems, which I think aren't really what this discussion is about).

There might be a slight edge for Linux in that the main way of getting software, the base OS repos and package manager, is about as trustworthy as you can get if you trust the OS enough to run it in the first place, and distros generally package and ship a lot more software that they've vetted, built themselves, and verified against the system than Windows does, which means the regular user doesn't have to lower that trust as much.

Windows is getting closer, in that they have their store now and you can even use winget to install things from it, but those things aren't packaged by MS (even if they might be vetted to some degree), so it's not quite the same. In some aspects it's better and in some worse, actually (there's a benefit to the OS maintainers tracking and building stuff themselves).

Beyond the main OS software, it gets into trust relationships and quickly becomes exactly the same regardless of OS, as long as you're allowed to install arbitrary software. This is as opposed to a walled garden, which explicitly trades away convenience (of one type) and choice for security (and uniformity, though that's not a given), in the same way, but the opposite direction, as curl piping a script to bash, which trades away security for convenience. I wrote elsewhere in this thread (multiple times, to various degrees) about how the trust relationship to the source is the real question, and not really different across the desktop OS cases (in case I wasn't clear and you want more words on it to clarify my opinion).

> There's nothing magical in Linux that would protect a large and careless enough userbase.

Nope, there isn't, just as there isn't in Windows or Mac OS (yet, but they're both heading that direction to some degree with their stores).


Yes, gardens are one answer, and likely the best one currently, for non-webapps. Distro package repositories themselves are the original gardens. People tend to give them a pass because good incentives have kept them decently honest, but distro package repositories are fundamentally gardens.

Gardens allow you to make a small number of trust decisions, and then trust all the software they have vetted by extension.

Note that I'm leaving out "walled" because multiple software sources can coexist. "Walled" only comes about when some company tries to constrain you to their singular source.


If someone gives you a guarantee of safety, you get to blame them when things go wrong. If you demand to strike out on your own, you have no one to blame but yourself. And you should honestly be proud of taking the risk; it's literally the only reason to use all this proud, evocative language about being trapped and needing to be free.

You want to be cutting edge, but not get cut.


Wait. Linux users "strike on their own" all the time!

Who here is a Linux user and never downloaded stuff outside the repo, or compiled sources and run them without reviewing every security loophole? Linux users are the most "demand-ey" of users, even starting flamewars over being forced to do things this way or that way!

I'm really skeptical that this wouldn't introduce malware if malware authors deemed Linux a worthy target.


Just gardens. Package repositories are just that, you can pick whatever you want.

Games are a bit of a special case, as they don't exactly play nice with Linux and many of them also run through a compatibility layer like Proton.


But that's just it. Games, and games downloaded from dubious sources, are one of the primary infection vectors. It doesn't happen enough on Linux because there aren't enough Linux users to make it a worthwhile target for malware authors!


For this and other reasons (including compatibility issues), when I want to download a game, I will prefer a version for DOS, NES/Famicom, Game Boy Advance, etc, instead of native code on Linux or Windows. (I also prefer FOSS when available, especially for native code; I will not run any non-FOSS program on my computer that is not running as native code.) (And, in some cases, the game might follow rules that I can just reimplement myself instead. For some types of puzzle games, it is relatively easy to reimplement it in Free Hero Mesh, so I am glad I wrote that program. Other times, other reimplementations can be written.)


<citation needed>

Most people don't go on random torrent sites to get their games.

And most people that do also don't get infected because usually the top rated torrent is just fine. I pirated a lot as a dumb kid without income and haven't managed to catch anything at least.


> That blame still appropriately lies with maladaptive behaviors learned from Windows. The only way to completely stop users being susceptible to the "attack" of them phrasing their desires as web searches and then blindly following whatever malicious instructions come up is to fully remove administrator privileges and lock them out of "their" computers.

And making it a class at school. We have universal education in most places, we can use it for something useful. There's no reason that we have to capitulate to corporations and their moats. We can teach children how the devices that surround them and order them around work, and how to deal with the predators that they'll encounter while interacting with them.


The way to solve it would be to streamline adding new repositories for third-party stuff.

Way too often it's "download some dumbass script running some half-assed autodetection" just to add a line of text to a config file and a GPG key.
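On apt-based systems, everything such a script actually needs to do is a couple of lines (example.com stands in for the real vendor):

  # Fetch the vendor's signing key into a keyring apt can use
  sudo install -d /etc/apt/keyrings
  curl -fsSL https://example.com/key.gpg | sudo gpg --dearmor -o /etc/apt/keyrings/example.gpg
  # Add the repo, pinned to that key
  echo "deb [signed-by=/etc/apt/keyrings/example.gpg] https://example.com/apt stable main" \
    | sudo tee /etc/apt/sources.list.d/example.list
  sudo apt update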


I would argue that to make a distro that has a wide array of natively packaged up-to-date software in a world of significant (let's say 25% and above) desktop Linux use, it would be necessary to discard the level of curation that currently provides the security such distros uphold.

That is to say, if desktop Linux use became significant enough to challenge Windows and macOS, then distros would either lag with out-of-date packages, not cover widely requested packages, or give up security through curation. Any of these would likely lead to users downloading packages to install manually.

I say this as someone that has been running desktop linux on and off since 1992; and I wholeheartedly believe that there's no inherent reason linux desktop use cannot be popular.


It's easy: Windows insecurities are the fault of Windows. Similar Linux insecurities are the fault of Linux users.

That's a general theme in OS wars, Windows users are eager to criticize Windows, while Linux and macOS users are eager to defend their OS.


In the specific case of running something blind with curl| bash instead of using a package manager or using something that maintains a degree of isolation (flatpak, docker containers, ...), yes it's the fault of users.

I don't criticize Windows when a user installs malware and clicks "Yes" when prompted to allow the app to make changes to the computer; in the same way, I don't criticize Linux distros for allowing users to install something by piping curl directly into bash without checking. I appreciate not having an OS that's a walled garden, unlike what happens with phones, but that does put some responsibility on users.

For a real-world example, if someone gets an STD from having unprotected sex, the blame lies on them for doing so.


A package manager is less safe than curl|bash: they do the same thing, but the package manager does it with root privileges, while curl|bash can run in a user context.


> A package manager is less safe than curl|bash: they do the same thing

This is so far from true, I wish this misconception didn't exist.

First, a package in a distro repo is maintained by known people, the package maintainers. Some do an exceptional job, some put in less effort, but at the very least there is some oversight on what gets into an official distro package. And there is an audit trail of who approved what and when.

Distro packages are signed and verified by the tooling, so inserting a malicious package is much more difficult.

Distro packages are also versioned and prior versions remain available, so if there is an issue we can trace who/when/why it was introduced and when it was fixed.

None of these protections exist when you just curl some random executable off some third-party website and hope for the best.

Realize that the website might return a different executable every time. Even if you & I curl it within seconds, we might be served different content. It might return malware in a tiny percentage of cases to avoid detection. It may return code without any versioning, or code that claims a constant version even though the content changes. It is impossible to have reproducible installs when you just retrieve something via curl without any validation or versioning.

(You could, of course, curl it to a file and then compute a hash on the file, assign it a version number, store and look up known hash/versions and so on... but now you're down the path of building an adhoc package manager to resolve the problems with curl|bash. So... might as well use the mature package manager your distro already has because it already solved these problems.)
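Even the minimal version of that ad-hoc approach is only a few lines of shell (example.com is a placeholder):

  # Save the script, record its hash, and only ever run the bytes you reviewed
  curl -fsSLo install.sh https://example.com/install.sh
  sha256sum install.sh | tee install.sh.sha256
  # Later, or on another machine: verify before running
  sha256sum -c install.sh.sha256 && bash install.sh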

> while curl|bash can run in user context

No, it runs as whatever user you run it as. You can find plenty of websites saying their installer needs to be run as root.


Package manager != distro repo.


That only makes sense if you don't understand all the things package managers generally do, why curl|bash is as bad as it is, and specifically why it's even worse than 'curl > file.sh; . file.sh'.

- Package managers generally sign their packages, and provide ways to prove the integrity of the downloaded packages, so even if someone hijacks your DNS or exploits the box you're downloading your packages from and injects a malicious one, it will be rejected on your side. Yum/dnf do this with GPG, for example (see the sketch after this list).

- Package managers usually track what was done and what files were put where, so there's an easy way to see what was installed and clean it up (which may also be automated). This isn't perfect, as the packages can run scripts as part of install usually, but it is helpful.

- Packages from package managers are vetted by the team that puts them out and, depending on the system, built by that same team. For stores, often it's just vetted in some way and the submitter builds it, but package systems are almost always vouching for the packages you get. In the case of the OS package manager, you obviously trust them if you're running the OS, otherwise this discussion has no meaning.

- Whether things are installed as root or as a user matters (you can curl|bash as root too, and many instructions prefix it with sudo...), but I think it's usually more important how trusted the sources are. The OS packagers, a well-trusted group, a company with something to lose, etc. are more likely to have put systems into place to ensure safety (and to protect themselves from problems if they are hacked). See above. Regardless of whether they're run as root or not, I think they're somewhat safer and more trustworthy, but this is a personal choice, since it's largely based on your own trust of the sources and the systems used in the chain of getting the software to you.
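To make the first point concrete, here's roughly what that verification looks like on an RPM-based system (the key path is illustrative; it varies by distro):

  # Import the distro's public signing key (usually shipped with the OS)
  sudo rpm --import /etc/pki/rpm-gpg/RPM-GPG-KEY-example
  # Check a downloaded package's digests and signature against known keys
  rpm -K some-package.rpm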

Ultimately, it's less an OS issue and more an issue of what the individual is comfortable with, which spans operating systems. People do stuff just as unsafe as curl|bash on windows all the time too.


>Package managers generally sign their packages

At least on Ubuntu and CentOS I had no problem installing unsigned packages: no warnings, nothing. It's allowed by design. Try it yourself.

>Packages from package managers are vetted by the team that puts them out

It's called a distro repo; package managers don't contain packages.

>In the case of the OS package manager, you obviously trust them if you're running the OS, otherwise this discussion has no meaning.

This has nothing to do with security, you can trust your system to run malware correctly too, just like any system does it.


> At least on Ubuntu and CentOS I had no problem installing unsigned packages: no warnings, nothing. It's allowed by design. Try it yourself.

Manually, through yum localinstall, by referencing a local RPM, or with a manual rpm command? Or an unsigned package in the remote repos? I've seen repos configured with signing keys have problems where the install command fails, so I assume you're referring to manual installs.

I think what you're seeing is that the package manager utilities (yum/dnf/apt/whatever) are all capable of verification, but are also happy to install things without verification in many cases. But if the RPM you downloaded is signed and RPM has had signatures loaded into it and there's a mismatch between what the rpm utility knows about and what the RPM you're installing is signed with, rpm will complain very loudly and fail (I have had to add --nosignature to rpm commands in some cases and import keys into rpm in others).

In addition to signatures at the RPM level, the repos themselves often indicate a GPG key that packages are signed with. The system package maintainers (which is what I was somewhat ambiguously referring to in my prior comment as package managers, meaning the managers of the system packages) sign all packages they publish with it, so the integrity of updates and additional software they provide can be confirmed.

Given that, I'm not sure how you can maintain that the package management utils on systems do the same thing as piping arbitrary internet content to a bash prompt. I think you were just possibly a bit mistaken about what the package management utilities are really doing and enforcing with their signatures.

> It's called a distro repo; package managers don't contain packages.

It's called the packages the OS provides. It's called many things. My terminology was somewhat ambiguous.

> This has nothing to do with security, you can trust your system to run malware correctly too, just like any system does it.

Sure it does. If you believe your OS providers would ship you malware, then you don't have to care what other software you're downloading and running; you've already set your trust level of the system to "none".

If you do trust the OS provider (whether that be a Linux distro, MS for Windows or Apple for Mac OS) to not be malicious (and please, let's forestall any digression into privacy; we're talking about malicious intent not allowed by EULAs), then you should trust the other software they provide that's verifiably from them.


I disagree. With the package manager you know that it's been vetted by the distribution maintainers; they ensure that when you install something through the package manager the package comes from them, and they ensure the binaries do not change. The package maintainers will also remove packages that have major security issues, and the package manager will track the apps installed and automatically update them if there's a security fix.

Curl | bash usually runs in a user context (although I've seen curl | sudo bash in the wild), but there are also concerns about vetting who wrote the instructions, whether they actually point to the app or to a modified version containing malware, etc...


It's called a distro repo; package managers don't contain packages and can be used to install packages from any source. Some version-sensitive programs are distributed as standalone packages because distro repos move slowly.


> Regular users on linux shouldn't be downloading software through their web browser at all; that's a Windowsism.

I strongly disagree. "Only download from here; if it doesn't have what you want, tough luck".

Also, this seems like an argument in favor of a walled garden. If so, I suppose that would fix Windows.


> "Only download from here; if it doesn't have what you want, though luck".

It's not that doing otherwise is prohibited. It's that doing otherwise should get your hackles up.

Which is why it isn't this:

> this seems like an argument in favor of a walled garden.

There are no walls. It's just a garden. But you have to understand that if you leave the garden, you're on your own.

For software developers and IT professionals, that's fine. They have a professional knowledge of the reputation of the source or know how to read the code, or how to set up a virtual machine if they want to try it but don't trust it. And if an ordinary user who is rightly wary of doing that still wants to get the latest AI thing from github, they call up their friend the software developer or their company's IT department or pay a computer repair shop they trust to set it up for them.

But that should be rare, because anything which is both popular and safe should promptly get added to the package manager.


Agreed, not a "walled" garden but a garden. Essentially an app store.

So essentially if Windows had this, problem fixed?

Or put another way, if most users came to Linux and started downloading crap from everywhere, there would be incentive for malware authors to write it for Linux, bringing it to the current situation with Windows?


> So essentially if Windows had this, problem fixed?

The problem is that the nature of Windows isn't to have this. Linux package managers are run by the community. They basically include anything popular that meets the licensing requirements to allow them to redistribute it. Windows instead has a "store" that wants to extract a vig, but because most of the software people use on Windows is commercial, the vendors avoid the store to avoid the vig, and the default is to install things from random websites.

And even if they fixed that, people are used to doing it the other way, vendors have already spent the time to set up alternate distribution infrastructure and they don't trust Microsoft not to reverse course once a critical mass of things are using their system and they can increase the friction to installing things from outside of it and then turn the screws on anything inside it.

To make it work there you would need the distribution system to be run by multiple independent third parties so they'd have to compete with each other to keep distribution margins low. There probably is a market for a third party "store" for Windows that pays the 3% to the credit cards and then distributes the apps via P2P so they can charge no more than 5% to app developers, which the app developers might then actually use because they don't have to build their own system for payments and updates. But the stores want to charge 30% and then the developers don't want to use the stores.


> The problem is that the nature of Windows isn't to have this.

Interestingly, they're getting a version of it. The winget command-line package manager that Windows now ships with has two sources: the MS store, and the winget community repo. The community repo is something anyone can submit to, and it goes through some vetting process[1].

  PS C:\> winget search perl
  Name             Id                            Version                Match          Source
  --------------------------------------------------------------------------------------------
  Perl Formatter   9MSQVFZPRG3Z                  Unknown                               msstore
  Strawberry Perl  StrawberryPerl.StrawberryPerl 5.32.1001              Tag: perl      winget
  MAMP & MAMP PRO  MAMP.MAMP                     4.2.0                  Tag: perl      winget
  EditPlus         ES-Computing.EditPlus         5.6                    Tag: perl      winget
  XAMPP 8.1        ApacheFriends.Xampp.8.1       8.1.12-0               Tag: perl      winget
  XAMPP 8.2        ApacheFriends.Xampp.8.2       8.2.4                  Tag: perl      winget
  Sharperlight 5.4 PhilightPtyLtd.Sharperlight   5.4.60                                winget
  Wtils            Perlt.Wtils                   v1.0                                  winget
  Paperlib         FutureScholars.Paperlib       2.2.3                                 winget
  Teambition       Alibaba.Teambition            2.0.3                  Tag: paperless winget
  DingTalk         Alibaba.DingTalk              7.0.30-Release.6019103 Tag: paperless winget

It's not quite the same though, as there are different considerations between a repository where a unified group has decided what should be included and has built (or slightly modified existing) packages for it, and a repo where anyone can submit a package that goes through some level of vetting. In the end I still believe most of this discussion is really about individuals and how much trust they extend to different groups and sources, and is not really about Linux or Windows in particular.

1: https://github.com/microsoft/winget-pkgs


The problem on Windows isn't exactly the lack of apt-get. It's the general problem of not making paid software distribution sufficiently fungible. Which is a cross-platform problem but manifests most on Windows because that's the platform where people use the most third party commercial software.

The Linux community "solves" this problem by directing commercial software vendors to the door and implementing anything they want as open source. But having something that works for this would help even there, for the cases where that hasn't worked, like AAA games.

Ironically the best way to solve this for proprietary software would be an open system. Have someone create a software distribution framework that uses open source code and federated P2P for distribution with pluggable payment processors. Make it an interoperable standard.

The idea is that it's a distribution system with no central distributor, a payments system which is independent of a payment processor. The vendor chooses which payment processors they want to accept (which might just be all of them), then the user chooses which ones they want to pay with, and if the vendor gets into a dispute with a payment processor their users automatically get diverted to a different one. If they get tired of their hosting provider they move to another one without the users ever noticing.

But then you need someone to develop it when its very purpose is to make sure nobody can extract high rents.

A coalition of mid-sized developers might be wise to pool their resources. Or, for that matter, get in with a bigger pool, because this is a problem that exists for subscription services and small businesses in general.


> But then you need someone to develop it when its very purpose is to make sure nobody can extract high rents.

It doesn't address a very important aspect of this which is trust, and that's most of the point of this discussion. People use repos usually because there's some level of trust in the repo maintainers. If anyone can push anything, then it's a liability to have that repo configured. If it requires careful vetting, then that costs money, and requires a central authority, which means it doesn't really matter whether it's P2P or not (except to lower cost), as it's centrally managed anyway.

Theoretically I could see a system like that in place where the "network" is all open and P2P and you just subscribe to sets of packages that have been "signed" by an authority you trust, but I'm not sure that the P2P portion is really all that useful then.

The whole reason the default repos in a Linux distro are things people feel safe running whatever they find in is because they know a group of people they trust has vetted it. If you're running Debian/Ubuntu/RHEL/Rocky/Windows/MacOS you've already trusted the maintainers of their default repos/etc. by the nature of running their OS in the first place. People also often choose to trust large companies (Adobe, VMware, Google for Chrome in some cases) and/or well known groups/projects (Apache, ffmpeg, etc.) when they distribute software separately, even if downloaded manually. Finally, people make ad-hoc choices about random, less well-known sites and people, and that's where random Windows executables or Linux binaries, or installer scripts downloaded and run or piped from curl to bash, come in.

All those levels of trust and those parties exist for every OS. Even Linux has its fair share of third-party downloaded applications people use, depending on what they use their system for. Some communities of people (e.g. developers) are much more comfortable with ad-hoc installation methods like curl|bash than others, and that's across OS boundaries. That's really what I meant way upthread when I said this isn't a Linux problem, it's a people problem.


Windows does have this, of course, so the question is: have security issues on Windows decreased in line with the rise of users acquiring packages through the Windows Store, or is there just as much malware in the store?

(Genuine questions, I don't know the data, or even if the data exists outside of MS)


These are dishonest arguments.

1) Download random shit from the internet at your own risk. If you're given a vast supply of safe software, and you choose not to use it, remember that you're a grown up and you should do what you like.

2) Nobody is objecting to walled gardens with no walls. Almost nobody, I should say; I've seen people tell Apple users that the fact that they are happy with the app store makes them bad in some way, but those people are shitheads. The reason to attack Apple is on behalf of their users, not some perverse brand nationalism.

If an Apple user can install whatever they want, and end their relationship with the Apple corporation at any time, that's winning. If the vast majority of Apple users decide that they value whatever contract (implicit or explicit) that Apple has made with them, and enjoy the relationship and the stewardship of the app store, that's a choice they're making as free people. And under the pressure of free people, the app store would have to improve anyway. I certainly have affection for what Debian does (and for everybody who wrote the software packaged in Debian.) Why shouldn't they feel that for Apple?


I misspoke, it's indeed a "garden", not a walled garden.

Linux users often rail against Apple's gardens, so it'd be dishonest to pretend otherwise. I should know! I've been a Linux user for 20 years now.

> If you're given a vast supply of safe software, and you choose not to use it, remember that you're a grown up and you should do what you like.

But lots of software in Linux isn't available in any repos. For example, games and stuff a typical mainstream user would expect. So Linux couldn't be turned into a "safe" mainstream OS unless it adopted a more diverse "app store", like macOS.

But this could very well be done by Windows, so it's not that one OS is "safe" or "safer" than the other. It's essentially a popularity thing.

> Download random shit from the internet at your own risk

And here we have it! Linux users "download random shit at their own risk" because they are not mainstream users; their needs are served by their distro's repo because their needs are different. If Linux was a mainstream OS, with the kinds of users that come with it, it would either have to turn into macOS or Windows. Either draconian measures (a single store where you can buy everything), or no measures at all (== malware).

Expecting people to "review the installer" is ridiculous.


Regular users on linux shouldn't be downloading software through their web-browser at all.

They should be adding a repository trusted with keys but so far UI/UX for it is horrible for regular users. Still better than... whatever the fuck windows is doing tho.


> Also, this seems like an argument in favor of a walled garden.

It is always entertaining to see HN's commentariat both rail against walled gardens by (for example) Apple or Android that are aimed at making life easier for regular people, while advocating them for Linux.


I think it's pretty common these days to have people git clone a repo and then build it. Not everything is on a package manager, and I see fewer new things on aptitude. At best, they're available as modules in npm or pip to be installed globally
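In practice that means the familiar dance, something like this (placeholder URL; build steps vary by project):

  git clone https://example.com/someproject.git
  cd someproject
  less README.md    # the README usually spells out the build steps
  make && sudo make install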


I just said ‘git clone’ to my wife and she slapped me in the face. I’m sorry but for the aforementioned regular users this is nowhere near common.


Agreed about `git clone`, but installing things from the web is one of the expected usages of any system. For regular users. Lots of indie and non-commercial (and even commercial) stuff to download this way.

In Linux, .sh installers are common. GOG games get distributed this way. If your wife still metaphorically slaps you when you mention .sh installers, it's only because she doesn't play games on Linux. She wouldn't know how to use apt either.

I think in the end the truth is that Windows is more targeted by malware because it's more widespread than Linux.


Yeah but I think it's unfair to compare average linux users to average windows users. They're not the same kind of users. Most (desktop) linux users are software engineers


This is giving me flashbacks to my consulting days. The IT people were all forced to call it “JitLabs” and “JitHub” because HR considered git to be offensive.


God forbid you edit images, they'd have had a heart attack...


Are you British? God forbid you used MongoDB


I would guess most of those are tools aimed at developers, who can take that risk if they wish. For most users, almost anything they want is either in official repos, or in Flatpaks, which offer some sandboxing (although I guess a malicious Flatpak could just ask for excessive permissions, like a random apk)


Developers are less common than regular users, but still they are among the "common" users of operating systems, so that use case must be handled. Malware on Windows also gets distributed in tools supposedly for developers, after all.


That is nonsense. On various distributions, packages are just packed stuff from the vendor site.

A package manager has little to do with security, unless you count hash checking as such. It's about automation.

Besides, Windows has had multiple good package managers for a long time now.

BTW, to demonstrate the invalidity of the argument, you don't have to look further than the nvm package manager...


> That is nonsense. On various distributions, packages are just packed stuff from the vendor site.

If it's actually from the vendor that's already an improvement over the typical Windows experience.


Regular users on Linux shouldn’t run commercial software?


Is commercial software incapable of being packaged?


No, but packaging a software package for every Linux distro that exists is unfeasible. Not that I care though, I don't run commercial software. But, you know, devil's advocate and all that. Still, I completely understand why someone might be frustrated by the way software is usually installed in Linux if they were, say, a game developer.


99% of software packed for Debian will just work with any of the derivatives. No idea what it looks like on the RPM side, but as long as your distro is new enough, 3rd party software packaged for Ubuntu usually works on Debian, and 3rd party software packaged for Debian near-always works on derivatives.


Yes, packaging a software package for every Linux distro _is_ unfeasible, but have you ever used Linux? There are snaps, flatpaks, and AppImages, which can all run in any distro, and are generally more secure than "native" packages (for lack of a better word).


> Snaps

A technology superseded by Flatpaks, yet pushed incessantly by Canonical, a befuddling move that I still don't quite understand. Rough to use in any other distro.

> AppImages

Speaking from experience, these don't run on every distro. So they fail to fulfill their intended purpose. As far as I'm concerned, that makes distributing software as AppImages a no-go.

> Flatpak

Better than any of the technologies previously quoted, but it is not without its own issues. The chances of a Flatpak working on any particular distro are acceptably high, but they still suffer from the same problem AppImages do. I've had an instance where an app refused to run on OpenSUSE, even though it was working completely fine on Fedora (I was using Flathub's repo on both distros, not Fedora's, just to clarify). I think it was Firefox, though I'm not 100% on that.

Still, I've yet to see commercial software distributed as a Flatpak. My guess is that it's all more of a hassle than it's worth. Which, I guess, you could say about packaging commercial software for Linux in general. So we're back to square one with the chicken-and-egg problem that Linux suffers from. Though nowadays it's less severe, what with the existence of SteamOS and all of that, so at least there is some market share, small as it is.



How do you define "commercial software"? Spotify, Zoom, Steam, Discord, Postman, IDEA Ultimate, and lots of other end-user software that is built by companies and where people pay for things (i.e. commercial software?) is available through Flathub.

Most commercial software in general can be downloaded as a free demo version and then activated with a license key or account, and that model works really well with Flatpak and even Flathub.


> packaging a software package for every Linux distro that exists is unfeasible.

For every Linux distro, sure, but it is feasible to create an apt repo and a Yum repo, and don't those cover the vast majority of distros by usage?


Incapable of being packaged? Usually not. Incapable of being included in a distro’s repositories? Usually, yes.


Commercial software vendors can provide the source and build procedure.


How can you vet the source and build procedure?

Assuming this is a commercial vendor not available through your package manager, and that you must go to the website, pay and get a download link (with source in this scenario), how is this fundamentally different to a Windows user paying for and downloading something bundled with malware?

Were Linux to go mainstream, it'd be unrealistic to ask users to vet the source code! Who has the time and expertise? You fundamentally rely on others to tell you it's safe. On Linux it's a safe bet, since malware authors are less interested in targeting it.


>Regular users on linux should be using their package manager to install new software.

Unfortunately not at all feasible for a small dev releasing software on Linux. The choice is either to support 15 different package managers yourself (including QA!) or just hope that the community gets it right.

I eventually refused to support any user who didn't download the AppImage directly from my GitHub page because the distro-specific versions would frequently break.


Aside from Chromebook and Android, there aren't enough Linux users for this to mean much. I'd venture that "regular" Linux users aren't "regular" in the manner that you describe -- we build, and that's why we're on Linux. We "regulars" download software that moves faster than maintainers can keep up, because that's necessary for the things we're building.


>Say what you want about Debian's volunteers, but they're a hell of a lot more trustworthy than the average windows software download website.

Really? Software developers, who distribute through their websites, have an economic incentive to not give users malware. I'm not sure the same applies to Debian's volunteers. I don't even know who these volunteers are.


The average Windows user does a web search for software and very often finds not the first-party website operated by the developer, but websites like SourceForge, CNET, Softpedia, etc. Downloading sketchy freeware from third parties is Windows culture. This culture is encouraged by Microsoft not vetting and packaging free software themselves like Linux distros do.


I don't know where you've gotten the idea that Windows will just run whatever software you provide it without saying anything. Executables must be signed with a certificate from a trusted CA. You can get this trust by buying a certificate and waiting for reputation to build (which means any malware you produce can be traced back to your business), submitting the software to Microsoft for malware analysis, or waiting a very long time for reputation to build[0].

If your executable doesn't have trust, a scary warning pops up (or Windows blocks the app from running) and tells the user "Windows Defender SmartScreen prevented an unrecognized app from starting. Running this app might put your PC at risk." This seems about as effective as having a bunch of random people vetting packages for a Linux distro.

[0]: https://stackoverflow.com/questions/48946680/how-to-avoid-th...


Sure, if by "scary warning" you mean the click through nags that Windows pops up early and often (sometimes multiple times for a single action) and that have trained generations to ignore software warnings and dialogs in general.

I honestly just installed my first non-throwaway Windows VM in a long while, and I was appalled how the state of the art in Windows "security" is still stuck where it was a decade ago.


Not only that, but for a while, a lot of Windows developers had links to sketchy mirrors right on their own web pages! They've normalized sending users to sites with names like DonkeyMirror.ru to download their official ZIP files.


> Downloading sketchy freeware from third parties is Windows culture

No, it's "computer illiterate" culture. Windows has a few package managers available these days (including a first party one). Developers on windows install things the same way that linux users do, though not usually building the software along the way (though I often have to use cmake with visual studio)


> I don't even know who these volunteers are.

They're probably on here, reading your comments, or reading LWN.

You have more chance of reaching a DD and reading their work than you do of reaching a commercial software author.


And how is a user supposed to know that the company is a trustworthy one just selling their software, and not a scam?

A single Debian volunteer would have to do quite a bit of work to get into a position of being able to just push malware into the repo; and if they did, it lands in Debian unstable/testing first, so there is also a pretty good chance it would be noticed.


> Really? Software developers, who distribute through their websites

Yeah, then some company installer-hijacks your software and SEOs your site. Case in point, VLC (for Windows of course)


> Software developers, who distribute through their websites, have an economic incentive to not give users malware

You're putting way too much faith in the efficient market fallacy. In reality, proprietary software companies are incentivized to distribute malware to increase their own control and their bottom line. Prominent examples being BonziBuddy, Sony Rootkit, Denuvo, all the crapware that comes bundled with Android/Windows, web ads, web surveillance, etc. Like every other day there is a new HN topic about how some company violated the trust they had built and screwed over users.


<violent head shake, spilled drink>

Excuse me, what?

Downloading via web browser was the original means (besides ftp) of getting anything. Hell, tarball distribution was how everyone used to move bits around.

Package maintainers are not Linux. Never will be, never have been. Linux may start with a distro or live CD, but from there it's you arranging things in a way that best works for you.

Or are you going to try to sell me on the idea that Linux From Scratch, which basically pushes you to wget source tarballs, is peak Windowsism?

If anything, distribution package managers are more of a windowsism than anything else. About the most I tend to allow myself is to use the apt-ified form of software install after I've torn apart an sbopkg build from source. Even on windows I've gotten to the point I've started dumping symbol tables from binaries, for all the cold comfort and reminder that the world is a capitalist hellhole that offers nothing but clients of servers looking to charge you rent anymore.

How do you ever expect to learn how your computer works and how to drive it if you don't read?


Yeah, I'm nodding in agreement with you.

I'm surprised by some of the answers I'm getting -- and I'm both a Linux fan and an almost exclusive user for the past 20 years. Yet I don't delude myself about the ton of crap I download in order to get things to be the way I want. Sometimes it's Steam, sometimes it's GOG, sometimes it's the official repo, sometimes it's a PPA, sometimes it's just random stuff on the web.

And yes -- downloading stuff from the web is how it's supposed to be used. Have people really changed so much that this is now frowned upon?

In any case, I still think we're "safe" because malware authors don't think it's worth their time to target Linux.


> Regular users on linux shouldn't be downloading software through their web browser at all; that's a Windowsism

have you ever met a regular user?


> Regular users on linux shouldn't be downloading software through their web browser at all; that's a Windowsism.

Sure, downloading executables and running them in UAC-protected environment is a Windowsism. Linux way is to copy commands from a random web page and run them as root. Of course all the commands on how-to sites in search results are trustworthy!


> How realistic is this for regular users? And even power users, in some cases. Let's say you download the install script. It's either hundreds of lines or it in turn downloads and runs some blob. Are you comfortable asserting your review is enough?

> Is this truly so different to clicking on some random Windows installer?

Yes, because you literally can't look in a random Windows installer (or, at least, it's not made to allow you to do so). It's true that many users won't have the competence to read and understand source code, but … it seems like that may be a genuinely unsolvable problem (if you want powerful software to be available to non-dev users); I don't know much about my car, but I could learn, and, when it is genuine complexity making understanding difficult rather than intentional black-boxing and obfuscation, I don't blame that on the car manufacturer.


> Yes, because you literally can't look in a random Windows installer

Most Windows installers are regular archive formats with either MSI information or an executable tacked on. They open just fine in 7-Zip. Of course, analyzing the binary files inside the installer is another matter.


MSIs often contain CAB files, which 7-Zip can also open, though files inside MSIs usually aren't named the same as the files they expand to (e.g., .dl_ for .dll).
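
For instance, a rough sketch of poking at an installer with stock tools (file and directory names here are illustrative):

    # list the contents of an MSI (7-Zip treats it as a compound archive)
    7z l setup.msi

    # extract everything for a closer look
    7z x setup.msi -oextracted/

    # or, on Windows itself, an administrative install unpacks an MSI
    # without actually installing it
    msiexec /a setup.msi /qn TARGETDIR=C:\unpacked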


I think realistically you cannot expect any user, in any system, to be able to review arbitrary scripts.

Experts can, but it's asking too much of regular users who aren't programmers.

And that, therefore, is the answer to why Linux is "safer" than Windows.


Do regular users `curl|bash`? From my experience that's a pretty rare occurrence: stuff users need is generally in the repos.


What's confusing here is that desktop Linux almost doesn't have "regular users". You're a power user already if you are running apt, much less curl|bash. It's not super meaningful to ask what the small number of "regular" desktop Linux users are doing; what is meaningful is to ask what they would do if they existed.

Which is probably the same thing they do on Windows: use a browser to download and run whatever program claims to do what they want.


Ubuntu has an app store (at the time, a graphical interface to apt); when my grandpa was running Linux that's what we taught him to use. (Not that he installed things often: LibreOffice, an email client and a browser were pretty much all he needed.)


Exactly! You articulated that better than I did.


> Do regular users `curl|bash`?

They don't (unless following instructions). But that's my point: downloading stuff and "reviewing" it is not feasible except for power users -- which are not the scenario we're describing -- and not even then! Can you tell me you trust yourself to review a non-trivial install script?

> stuff users need is generally in the repos

Even games? If we're talking about regular users, they'll want to play games and other things not packaged with their repo.


Whether regular users actually realize this or not, they shouldn't download random binaries or scripts from random sites and run them on Linux any more than they should on Windows.

Contextually, it feels different, but it's not. Not really. If you want safe, there needs to be a chain of trust or something analogous, whether that's vetted repos, trusted companies (i.e. "able to be tracked down and sued") you're installing from, or an individual or group with a vested interest in keeping things safe (a project that has a track record).

Should regular users be using a script from some site to install stuff? Probably not. It's not safe. But that's not a Linux problem as much as it's a developer ecosystem problem and people not recognizing it as unsafe when they'd be leery of doing the same thing on Windows.

As an example, I give you Deno's installation instructions page[1]. Notice the equivalent of curl-pipe-bash as the first available Windows installation method? You can do that, or you could winget install it from the Windows Store, which presumably goes through some vetting process. macOS is in there as well with a bunch of possibly unsafe options (depending on how much you trust each package system...).

This isn't an OS problem, it's a community problem. Either we have the option of people being able to do less safe things, or we all run the equivalent of iOS and can only install and run software vetted by others. Pick your poison.

1: https://deno.land/manual@v1.35.0/getting_started/installatio...


It is not a remote theoretical possibility, just something no one has attempted: PID 1 (e.g. systemd) can be replaced with an infected version of systemd, and then imagine what isn't possible once you, the malware, are systemd, the ringmaster.

This Linux superiority complex isn't rooted in reality, not that I'll ever pick Windows as my daily driver.


Exactly.

I like Linux and have been using it on my personal and work computers for decades. But I don't delude myself about how much of its safety is inherent to the system and how much is just popularity.


"How realistic is this for regular users?" Doesn't matter. Irrelevant question. The fact that it's possible vs not-possible is the infinitely more important distinction.

"Is this truly so different..." Yes.

"...would the situation be particularly different?" Yes.

It only takes a few people to actually check something for everyone else to benefit from the results, and we even benefit from the mere possibility that anyone could at any time.

The fact that this benefit isn't 100% (we still discover 20 year old critical flaws in widely dispersed open source code) doesn't change the fact at all.


Unpopular opinion: "regular users" are, by their very nature, incapable of using any networked operating system with a 100% certainty of not infecting themselves with malware.

They're not qualified to only make safe decisions during their computing because they're not educated enough to understand what makes any given action safe or unsafe.

Using a computer is fundamentally not like using a car. Using a car, by and large, does not change. The only major exceptions are the user failing to properly maintain it, changing weather conditions, and changing traffic conditions.

Once a driver has driven in any given permutation of traffic condition and weather condition, as long as they've maintained their vehicle, the driver's experience will be almost identical when they find themselves in that same permutation of conditions again.

This consistency allows drivers to build experience in adjusting their driving to operate in those conditions, which makes them better at it in those same conditions in the future.

We let laypeople drive, even those who haven't the slightest idea of how their braking system works mechanically, because there is an extremely limited range of outcomes from pressing the brake pedal at a given pressure in a given set of conditions provided it's maintained.

The scope of inputs we give drivers is ultimately tiny.

Computers are not like this. The safety habits you learned in 1995 are not going to cover every threat you encounter in 2005, the safety habits you learn in 2005 won't cover every threat in 2015, and likewise from 2015 to 2025.

As long as we give users a broad range of possible inputs, they will find ways to screw themselves with their own incompetence.

The reason iPhones and macOS computers are perceived by the layperson to be more secure isn't that they're inherently less hackable, it's that they treat the average user like the moron the average user actually is, by substantially restricting that user's input freedoms. How many millions of iPhone users didn't get hacked because the developer denied them the freedom to sideload arbitrary unsigned IPAs?

With great freedom comes an increased responsibility to understand the consequences of one's own actions. Users are lazy. Many are stupid. They do not read very much of anything. They do not understand the systems they are using and they don't want to.

As a technologist, I love having the freedom of an unbridled OS that lets me do whatever I want, including deleting the whole file system. That kind of freedom just isn't optimal for a typical user's security.

This may sound misanthropic to you, but look no further than the scores of people who microwaved or soaked their iPhones because 4chan made spoofed ads that looked like real Apple ads, promising a software update that made it possible to charge one's iPhone by microwaving it, or one that enabled waterproofing.

Users really are that stupid, and will ultimately find ways to harm themselves and their devices any way you allow them to, so long as there's a competent adversary trying to get them to do it.


> Unpopular opinion: "regular users" are, by their very nature, incapable of using any networked operating system with a 100% certainty of not infecting themselves with malware.

Unpopular? I'd go so far as to say it's a given, and even that an "expert user" isn't going to reach 100% certainty while still using the system for its purpose in almost all cases, unless it's air gapped or their permissions have been reduced to the point they can't do certain things (which might help the regular user as well).

> Using a computer is fundamentally not like using a car. Using a car, by and large, does not change.

Except in the way that it's exactly like using a car. That is, in that it's someone operating a complex piece of machinery within narrow bounds that make it generally safe, but sometimes things happen either from the operator stepping outside of those bounds for convenience or inattentiveness or because of outside actions that make it unsafe.

> We let laypeople drive, even those who haven't the slightest idea of how their braking system works mechanically, because there is an extremely limited range of outcomes from pressing the brake pedal at a given pressure in a given set of conditions provided it's maintained.

I would say it's more because "normal" operation of a car only requires being trained to a specific level on specific capabilities. A professional driver that races may use the controls of the car very differently and achieve a much different outcome (the e-brake is just for when parked? Says you...).

We do tend to only legally allow specific types of car use in specific contexts though, so that's food for thought.

> Users really are that stupid, and will ultimately find ways to harm themselves and their devices any way you allow them to, so long as there's a competent adversary trying to get them to do it.

I totally agree. I just don't think that Linux is particularly worse than Windows these days with regard to the trouble you can get into (you can run PowerShell scripts to do installs too, and I've seen the PowerShell equivalent of curl | bash).

There's a whole host of behaviors that people view as different when the context changes that aren't really different in practice. Running random executables on Windows is generally unsafe, and most people develop that sense after a while (either from being told or the hard way). Doing the same on Linux is unsafe in many ways too (except that often there's some additional trust we layer on some of the places we're getting the executables from), and running random shell commands isn't really any different, but people feel like it is because it's no longer in the context of Windows. That doesn't really make it better, it just makes people feel better about it.

If you want to be safe, you either stick with a vetted source you trust, such as the package repo for the OS or software originating at a company you trust (which might just mean they're someone possible to track down and sue, so they're less likely to go rogue), or one with a reputation they don't want to screw up and a mechanism in place so you're fairly sure you're using code from them (e.g. GitHub and a trusted author or project). Other than things fundamentally like that, you're just rolling the dice. Which happens, and we've all done it, usually without problem. Which makes us complacent.


regular users don't use linux lol...


That misses the point. There was a context to my reply!

I know regular users don't use Linux. What I'm refuting is the notion that Linux is safer than Windows because "you can review the install script".

What I'm arguing is that you really can't review anything. Suppose Linux were to magically go mainstream on the desktop: you cannot ask users to review installers. That's crazy.

Finally, what I'm supporting is the assertion that Linux is safer from malware precisely because it's less widespread than Windows, making it a less interesting attack target for malware creators.


This kind of comment reeks of idealism. Sure, you can look at every single thing you install because of the OSS nature of Linux, but you are incredibly naive if you delude yourself into thinking a) that you have the bandwidth to do this and stay meaningfully productive, and b) that you have the technical abilities to sufficiently evaluate what each piece of software does.

This take is reductive and should not be taken seriously.

edit: Editing to say, I do agree with the OP in spirit, you should try to avoid running untrusted software. But the devil here is in the details, it's simply not an easily feasible goal.


If the alternative is not even source code but just a black-box binary, then it's at least better to be able to inspect the sources.


I agree. But I think we shouldn't equate the fact that we-have-the-technical ability-to-do-this with we-have-the-bandwidth-and-the-skills-required-to-do-this.


The discussion was a comparison against the security culture on Windows. Source availability is a bare minimum; obviously by itself that's not enough, one wants the source to have been cross-checked by lots of eyes.

My personal opinion regarding security is that what matters is the easy ability to apply sandboxing, at various levels, on the user side for software we trust less. (By "user side" I mean not depending on the developer to package the app a certain way, etc. A virtual machine or a chroot jail are examples of what I mean.)
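
bubblewrap (the sandboxing layer underneath Flatpak) is one way to do that from the user side today. A minimal sketch, with the application name being a placeholder:

    # read-only view of the root filesystem, fresh /dev, /proc and /tmp,
    # and no network access at all
    bwrap --ro-bind / / \
          --dev /dev \
          --proc /proc \
          --tmpfs /tmp \
          --unshare-net \
          ./untrusted-app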


My point isn't that you should review the scripts you download to install, it's that you should avoid using those if you actually care about security. In every single case I've seen where they're not actually nefarious, they are merely convenience methods that automate a bunch of manual steps outlined in instructions which you can also follow.

And that's in the cases where they (or trusted third parties) aren't actually packaging it already for more normal installation in the OS you're on.

So, I'm not being idealistic and saying "read the code of everything you run". I'm saying "realize that running random scripts is equivalent to running random executables in a lot of cases, so maybe don't do that no matter the OS; it's not an OS problem and one OS won't really protect you regardless of whether it's Linux or Windows or macOS", and thus it's not really a Linux problem at all. As an example, I'll leave you with this[1], which I find fairly relevant since it has the PowerShell equivalent of curl piping to bash.

1: https://deno.land/manual@v1.35.0/getting_started/installatio...


On the other hand, it seems entirely possible to use only apt-get install and be happy on Linux.


Nowadays, you can also mostly live with just choco install on Windows. Not as well vetted as Debian packages, but a lot better than googling for installers.

But that only helps power users. On Linux, learning these things is simply a necessity, because installing things outside your package manager is even less user friendly.


This is the essence of most of the rhetoric I'm seeing in this thread: Linux is more secure because its users are more technical.


I thought Chocolatey software is not vetted but managed by individuals, like the AUR in Arch.


I don't know, I think the graphical frontends to package managers on Linux are much easier than downloading and installing an EXE on Windows.


I wonder how many preinst/postinst scripts are being read by apt users prior to install/upgrade? These run as root, making them a bit worse than the typical curl|bash after all.
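
(You can actually pull those maintainer scripts out of a .deb before installing, if you're so inclined; the package name here is illustrative:)

    # download the package without installing it
    apt-get download somepackage

    # extract the control area, which holds preinst/postinst/prerm/postrm
    dpkg-deb -e somepackage_*.deb control/
    less control/postinst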


You have so many more practical options for even amateur security auditing on Linux. You can trivially spin up a temporary OS/chroot/container and run your specimen inside it, or run it under strace and log every system call, or statically sift through an executable for strings like IP addresses in an instant using basic tools. Bash install scripts can simply be grepped. It's an environment that gives the user control by default, and as such it's that much harder for executables to gain the upper hand.
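
A quick-and-dirty triage pass might look something like this (binary and script names are illustrative):

    # log every system call the specimen makes, following forks
    strace -f -o trace.log ./suspect-binary

    # sift the binary for embedded URLs and IPv4 addresses
    strings ./suspect-binary | grep -E 'https?://|([0-9]{1,3}\.){3}[0-9]{1,3}'

    # install scripts are just text; grep them before running
    grep -nE 'curl|wget|sudo|rm -rf' install.sh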

Obviously you don't do that for every binary you run, but you have options if there's something you're a bit suspicious of.


All of those options have Windows analogues though. Windows users can spin a VM, run procmon, and even have access to text editors.


Indeed. "Run a program or browser extension in Sandbox mode" is a great deal easier in Windows than the equivalent in Linux.


If people were doing their due diligence on every binary they execute, malware beyond highly targeted zero-days would be nonexistent.


Yeah, it's so naive that it's not even worth commenting on.


While reviewing the software you are running would definitely help, it is also utterly impractical as a security measure for more than a small fraction of the folks who use software.


This is a large part of why I'm a heavy proponent of Flatpak/Flathub, Snaps and AppImage. The applications themselves may have a heavier payload, but they run in relative isolation. It's an overall better option for security. Not to mention app/security updates won't affect the core OS, and vice versa.

It's not a panacea, but it's the best option for most people. I get why some may not like it, but I don't quite get the visceral resistance.


Same reason windows nerds have a visceral resistance to software that costs money, and mac nerds have a visceral resistance to the idea that window management should be more sophisticated than pixel-hunting through a morass of overlapping crap. Nerds over-invest in a hobby/product, get locked into the inertia, start to identify with it, and must then defend it irrationally.


The amount of money game studios, Steam and others make would counter your first point... and compared to what? Linux, where nearly everything is free and users generally won't pay for software?


You're right, add an asterisk for games. My whole point is the nerd will get stuck/fixated on his original way of doing things, which for a Windows nerd (like my past self) is spending your meager cash to cobble together a custom pc powerful enough to play games as a child/teenager, and pirating everything possible. Windows was The Best OS Ever (because you have no choice) - just like your [Xbox|PS2] was the Best Console Ever (because you couldn't afford both).

Of course, the adult version of this nerd will be able to weave much better post-hoc rationalizations. Head on over to ars technica or reddit or macrumors or linustechtips video comments for thousands of examples.


Honestly, my biggest gripe against Windows is the current direction of monetization of users... When I saw ads in my start menu search results, I was out. I've still used it a couple of times for work, and I still have it on my desktop, though I've booted to that drive maybe twice. I've spent a fair amount of time getting some Windows things running on Linux.

All said, I like and dislike aspects of Windows, Mac and Linux... they all have faults. I'm a bit more forgiving of Windows in terms of security today (after a decade-plus of Microsoft working very diligently at it) than a couple of decades ago, when I saw the likes of ILOVEYOU and, a year or two later, the SQL Server worm. Those were just stupid decisions all around (running email in a local/full-access security context instead of an internet/untrusted one), and similar for the SQL issue.


Linux security best practices are to (a) only run open source software with all code changes publicly visible on a version control website, and (b) rely on an expert maintainer to have performed at least minimal review on the software.

Granted, you might sometimes need to run something else, perhaps even closed source software. But insofar as that's considered necessary, the security posture of Linux isn't significantly worse than Windows, where almost everything is installed that way.


Yup, basically.

The situation is relatively straightforward, though people (whether from bias, a desire to just argue, or trolling) complicate it over and over again:

UNIX, and specifically Linux as a descendant, was evolved with very sensible and fairly solid security models (in multiple ways - including balancing simplicity [making it easier for users to specify and have that specification actually match their intention] against flexibility / rigor). Furthermore, from early days, there were heated substantial arguments about security vs. usability.

When I was younger, I had a more "Theo de Raadt" POV - it should be way more secure. But, I think that the arguments people like Torvalds made about "enough trouble getting adoption AT ALL", in essence, were better arguments.

Linux has been pretty good through the years. Far from perfect, but a good enough mixture in terms of balancing "getting sh1t done" against "keeping people safe".

Windows is a mess. It's always been a mess (though, to be fair, it DID really improve between 2000 and 2010, but only to the point sort of REQUIRED to continue to be commercially viable). The incentives etc. are all different. The M$ model is always "make things as easy as possible to just start using" and "try to keep everyone chained to the platform, in part through the otherwise almost altruistic method of never breaking ancient software".

There is no question that security, in terms of what is best for the user, is not the key principle/driver there. You can judge that however you like, or not at all; it doesn't implicitly mean Windows is "worse", because that always depends on what is "important"... what perspective you're looking at it from.

But, I certainly find that model ugly and unfortunate, personally.


"curl | bash" exists because there are too many Linux distributions and forks of distributions, making it basically impossible to distribute Linux software any other way without losing your mind creating hundreds of different repos.


I don't think that correctly encapsulates what's going on.

Step by step directions on how to install are done because of what you say.

A script you execute from the internet is a convenience method that automates those steps. For that convenience, you are trading away some level of security. It may be a very small amount, if you're getting the script from a well-respected and secure source. It may be quite a lot if it's from some random blog or 4chan comment.

For decades now there have been methods to deal with the problem of multiple OS targets. A self extracting shell script, so the installation script is packaged with the script, is still used by companies that need to ship in a mostly distro agnostic way. You still get a single package which can be signed or include checksums. Anything that the web script would do would be just as easily accomplished.
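
(makeself is the long-standing tool for building those; the file names here are illustrative:)

    # bundle a directory plus its install script into a single .run file
    makeself.sh ./myapp-1.0/ myapp-1.0.run "MyApp 1.0 installer" ./setup.sh

    # the user downloads one artifact, can verify a published checksum,
    # and only then runs it
    sha256sum -c myapp-1.0.run.sha256
    sh myapp-1.0.run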

Directions to pipe a download of a script into an interpreter exist because a subset of people don't give a shit. It's nothing to do with Linux; it's everything to do with people not caring. As evidence, I give you Deno's install page, where they let you use PowerShell to retrieve and automatically execute a PowerShell script[1].

Note how they themselves call it out as a convenience. Many things in life are convenient but aren't really all that secure, like leaving your keys in the ignition because you live somewhere remote and feel secure. This is just another one. People like to play the odds.

1: https://deno.land/manual@v1.35.0/getting_started/installatio...


Convenience matters a ton. My rule of thumb is that each step halves adoption, so the fraction of potential users who actually install will be ~0.5^steps.

The only time this doesn’t hold is when there is a very strong overriding driver or a mandate pushing people to overcome friction.


I don't disagree with that, but I think that's less of a "Linux security best practice" issue and more of a problem a community has, and that community isn't even necessarily "Linux users" as much as it's "developers" and "self described power users" of any OS.

Linux probably happens to have a higher proportion of those, which might be why it's seen more as a Linux issue, but I don't really see them necessarily as intrinsically related.


> Just because some people like to ask you to install their software that way doesn't make it "Linux security best practices" and it doesn't mean you need to follow those directions.

It's the de-facto option for installing cross-distro software on Linux, especially if it's not in a package manager repo.


> You can review whatever you're running

You could also review remotely hosted OOXML and its chain of oddly side-effecting dependencies.


The same can be said of Windows users.


Morally, curl | bash is no different from downloading a package from the provider directly, or adding a repository managed by the provider. As people are well aware, it will not protect you from the software provider being malicious. Neither will obtaining that proprietary installer from adobe.com for Windows/Mac. I'd argue the security level goes from worst to best:

1. curl | bash = npm/pip/cargo/whatever install = developer provided package repositories = proprietary software installers

2. App stores from proprietary OS vendors. You still don't really know what's in the software, but at least you already have to trust Apple/Microsoft if you're using those OSes and they can remove detected bad behaviour globally.

3. Package repositories from trusted traditional linux repositories. You can view the purported source code of the build, plus there is now someone who can block bad packages.

4. Package repositories from linux repositories with public build processes. Not just the scripts, but being able to see the execution of the build and have it signed to prove where it came from (as opposed to Joe packager's personal machine then FTPing it up).

5. Making your own copy of every piece of software, auditing the entire source code, building it in an environment you control, and keeping the artifact you then sign somewhere you control. This is so much work that nobody does this.

People like to tut tut at curl | bash, but most of them are happy to do everything else in line 1, and maybe trust line 2. This is not the position of safety and moral superiority it's portrayed as.


If you're on a non-standard architecture, you're absolutely doing a lot of #5.

For that matter, I build my own Firefox even though Fedora offers a package because I like the concept of doing so, I can submit fixes, and I can do local optimizations. It's a lot of work but it's hardly infeasible.


Do you also build your own kernel, glibc, dnf, openssl, pipewire/pulseaudio, systemd, ffmpeg, ffmpeg extensions, gtk, gcc, rustc, llvm, python, coreutils, ca certificate bundle, x11/wayland, clang, nodejs?

I'm guessing no. So your overall security posture is line 3, where you're trusting fedora to be the gatekeeper for you.


6. Use Flatpak/Snap/AppImage, which run in relative isolation from the core OS.


Flatpak itself is really a different dimension of security to that discussed here. It solves a different problem (the software might have exploitable security vulnerabilities) than the one discussed here (the software itself might be malicious). Running software that secretly phones home everything you do in the program is still going to be a problem in flatpak.


I wouldn't say that it's a different dimension. If you install a Flatpak (and you check that the permissions it asks for make sense), the application will not be able to do as much damage even if it's malicious.

Furthermore I'd argue that a big reason (2) > (1) is not that Google/Apple are that great at detecting malicious applications, but that malicious applications also have a harder time getting too many permissions with their system.

And furthermore, a reason why "curl | bash" is bad, is that you are piping arbitrary code straight into a shell, which gives no chance for the system to know which permissions the code needs. Whereas if you do a "curl ... && flatpak install ...", it can.
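
For what it's worth, you can inspect and tighten those permissions per app from the CLI; the app ID here is a placeholder:

    # show the static permissions an app ships with
    flatpak info --show-permissions org.example.App

    # revoke blanket home-directory access for just this app
    flatpak override --user --nofilesystem=home org.example.App

    # grant access only to a specific directory instead, read-only
    flatpak override --user --filesystem=~/Documents:ro org.example.App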


This argument is moot once you allow said software to have access to your files...which you will usually have to if you want that software to be useful to you.


Access scoped to a directory is a big improvement over an unsandboxed process. Any unprivileged program can trivially steal your browser sessions by reading your profile dir, but with Flatpak it would be possible to only grant it access to your "documents". I don't know if this is currently done in practice though, or if it's still common to just grant full access to ~, including dotfiles. Even if so, the technology is there, showing a clear route to improvement.


> the Linux security best practices of curl | bash?

1. Widely criticized.

2. Not something the OS does, or even encourages. The OS permits it in the same way that the OS permits you to set your root password to hunter2 and run telnetd. You can't, and shouldn't, stop people from deliberately screwing themselves.


How do you know my root password? Delete this.


What is it? I only see “*******”.


Context for the uninitiated

http://bash.org/?244321


lol, yes. See, when YOU type hunter2, it shows to us as ****


> Linux security best practices of curl | bash

I don't know who tells you about Linux security, but you should replace them.

The best practice for installing software on Linux is to use the package manager and install from the repositories of your Linux distro, or trusted software vendors.


> The best practice for installing software on Linux is to use the package manager

Bingo. In Windows and even MacOS, it is normalized behavior to download and run software with your web browser. Want VLC? Google for VLC then maybe end up on a website like sourcef*rge that adds malware to the installer. On Linux, this sort of workflow is possible and permitted, but not encouraged. Instead users are encouraged to only install software through their package manager.
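
On Linux the encouraged path is a one-liner against the distro's signed repositories; a Debian/Ubuntu-flavoured sketch:

    # search the repos, then install from them
    apt search vlc
    sudo apt install vlc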

I can leave my dad with a Xubuntu install and trust him to not download malware because I taught him how to use the package manager, and warned him against trying to download software with his browser as though he were using Windows. 15 years like this and he still hasn't messed it up. With Windows he had new malware every week. Downloading and running strange software off the web is normal windows culture and windows scarcely even provides a better alternative to it.

(The "Windows Store" is an improvement to this situation I guess, but from what I understand most software available through it isn't free. This means windows users are incentivized to fall back on old habits and go scrounging around on the web for free binaries to blindly run.)


Worrying about the security of curl | bash when talking about an operating system where the modus operandi for installing software was downloading closed-source and often obfuscated binaries from random websites and running them is insane.


Honestly, I feel like more people have been infected with malware from App Stores than from downloading and executing shit from random websites. Even Warez sites from back in the day were more trustworthy.

Cuz if you download from a random site, you might think twice about what it is, is the source trustworthy, etc? But the App Store, well Apple and Google tell me it's 100% safe, so just download all kinds of trash.


For mobile app stores you are right, but only because they are the ONLY place most people will obtain software for their device from. You don't download binaries and install them from web sites on Android or Apple.

For desktop operating systems, I think you are probably wrong. Downloading and installing binaries from web sites is a huge cause of malware infection. And warez sites were not trustworthy at any time.

I doubt much malware has been installed via Linux package managers or by the Windows app store (if anyone actually uses that).


Browser extension 'stores' are terrible too. Both Google's and Mozilla's.

On the other hand, I do trust F-Droid. The vetting of ideologically motivated volunteers beats the vetting of disinterested corporations.


Oh, this is balderdash. Firstly, if it's curl | bash you actually do have the choice to download the script before running it and review its contents.

Secondly most software on Linux is not installed this way - it's installed through the distro's package manager, flathub, Steam etc. where it actually is way more vetted than a random download. Of course you can install random downloaded appimages etc. if you want as well because this is Linux and it doesn't treat you like a child in a sandbox, you own your system, you do what you want with it.

Which gets to my last point - the software which is installed through curl | bash is generally targeting developers and frankly, as a developer, you should know what you're doing. You take the risk where the risk is small (on your throwaway dev VM), you vet & review the code first where the risk is real (on a production server or something).

Your comment was counterfactual nonsense.


The security practice on Windows is to click yes on any prompt shown to you.


That's not just Windows; someone (from Mozilla, I think) had a presentation showing that people can't make heads or tails of the SSL certificate error window and consider the whole thing to be "click yes to get on with things".

Found it: https://inoio.de/images/something-happened.jpg could be from https://www.usenix.org/sites/default/files/conference/protec...


There are a bunch of bad legacy technologies still baked into Windows for which I can't see a non-Windows equivalent.

For example: credential hashes. They can be used as bearer tokens, and a sufficiently privileged one can log into absolutely any system in the entire domain and do anything.


> the Linux security best practices of curl | bash?

You do realize that even this is the same as downloading and running an executable from a website, which is still the norm on Windows, right?


In fact, curl | bash is safer, because you can replace "| bash" with "| less" and inspect the script.


I would argue that Windows checking certificate signatures provides a lot more safety to the vast majority of users than manually inspecting a bash script.


Yeah, you don't actually want to pipe to less and then rerun piped to bash because it's vulnerable to a TOCTOU problem. You want to save it the first time, then read it in your editor.

When you distribute those scripts around your org, you also want to verify them with a sha256 hash or something like that. This is what I do for installers I use this way and other software downloaded over HTTPS at work.

I also generally avoid installers that download other installers. If you've got a script that downloads platform-specific installers, you may prefer to write out specific instructions for each platform and download the platform-specific binaries directly. A shell installer like this running curl is a red flag imo.
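
A minimal sketch of that save-then-verify habit (the URL and hash file are placeholders):

    # save the script instead of piping it straight into a shell
    curl -fsSL https://example.com/install.sh -o install.sh

    # read it before running it
    less install.sh

    # if the vendor or your org publishes a hash, check against it
    sha256sum -c install.sh.sha256

    # only then execute it
    bash install.sh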


But isn't installation via curl more of a PEBKAC issue? You don't have to pipe it to a shell immediately -- pretty sure you could curl the install script and manually verify it. And isn't the CPU-might-be-spying concern an invariant?

One certainly cannot change how one feels about one's security, but those don't seem to be reasons Linux is inherently more or less secure than Windows...


> As opposed to the Linux security best practices of curl | bash?

Gotta take offense here. That's a macOS paradigm.

All Linux distros have proper package management, always cryptographically signed and increasingly reproducibly verified, and extremely broad coverage of virtually all the software in the community. The closest you get to this kind of thing as an "official install mechanism" is e.g. bootstrapping a package repo for third party software, which has you hand-verify the keys.
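
(The modern apt flavour of that bootstrap looks roughly like this; the URL, key path and repo names are placeholders:)

    # fetch the vendor's signing key into its own keyring
    curl -fsSL https://example.com/key.gpg | gpg --dearmor | \
        sudo tee /usr/share/keyrings/example-archive.gpg > /dev/null

    # pin the repo to that key, so only packages it signed are accepted
    echo "deb [signed-by=/usr/share/keyrings/example-archive.gpg] https://example.com/apt stable main" | \
        sudo tee /etc/apt/sources.list.d/example.list

    sudo apt update && sudo apt install example-package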

People who pull unverified code to their boxes are virtually all developers cloning stuff to build.


"Best practice" is a bit exaggerated.. or tbh a joke. Any reasonable software I have lately seen distributing that way (and that were few ones) usually come with disclaimers like "beware that you must trust us" or pointing to alternatives in the direction of package managers..

> to run untrusted code,

No again, that depends on whom you trust, right? If you trust no one, it is all up to you, certainly... and at least you have the theoretical possibility of reviewing almost everything (which other people actually do).


A CPU that might be spying on you has nothing to do with malware. This discussion is about the differences between OSes, which are worth talking about in and of themselves.


Average security is whatever is easy and common (usually common because it's easy). It's common for regular users to install via their software center, where they will find their office software, browser, and other common apps.

It's common for some developer-oriented software to list a lazy way to install from a trusted source. Those developers also aren't running curl $URL | bash on a $URL from a scam email they just got. It is indeed bad practice, but it's a relatively contained bad practice, and it's not reasonable to compare this situation, where some developer-oriented software recommends a controllable insecure method, with the common experience of hundreds of millions of Windows users constantly installing all software by downloading and clicking on executables, the functional equivalent of curl|bash.


This is an absurd example; ironically, what makes it "safer" in reality is that the only people who use it know how dangerous it theoretically could be, and thus are able to reason about its source. How often has curl | bash ACTUALLY been a vector for problems?


You can curl into bash all you want, but how's that better than just running an Installer.exe? Most software I install on Linux is at least somewhat verified by a maintainer to be sane; there are no maintainers on Windows, just HTTPS servers with binaries.


Although curl|bash seems to be common, I don't like it and I don't do it. If I need it (rather than just using the package manager), I will first save the script to a file (I still usually use curl for this, but will not pipe it directly into bash since that is too insecure), and then possibly modify the file if I want to. However, such a thing should usually not be needed if you use package managers, run programs under emulation, or download and compile the source yourself (possibly after reading it if you are concerned about it; I use a non-Unicode locale for further security when doing this), etc.

The CPU spying is more of a real problem, though.


I've never run curl | bash once, and I've used Linux fairly regularly for the past 20 years. If I absolutely have to install something where the only way is to pipe a curl into bash (which is incredibly rare), I first download the script, review it and only then execute it. Doing curl | bash is absolutely not a security best practice.


But realistically what those scripts do is always download and run even more code, and you probably aren't reviewing that.


Yup; that's why, if I do run one of these, it's really a last resort, and I'll most likely run it in an isolated container.
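
Something along these lines, say, with a throwaway Docker container (the URL is a placeholder):

    # save and skim the script first
    curl -fsSL https://example.com/install.sh -o install.sh
    less install.sh

    # then run it in a container that disappears afterwards
    docker run --rm -it -v "$PWD/install.sh:/install.sh:ro" \
        ubuntu:24.04 bash /install.sh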


> As opposed to the Linux security best practices of curl | bash?

At the time Windows had far worse vulnerabilities than running untrusted code.

You didn’t need to download or run anything, just connecting to the internet without a firewall (which was common in the dial-up era) exposed services that could be exploited by a remote attacker.


You are supposed to run software packaged by your distribution, and that's trusted.

The curl | bash is bad practice and shouldn't be used.

You have way more control and security using a Linux distribution. The objectives of that and of Windows are completely different, and that affects users' security.


> As opposed to the Linux security best practices of curl | bash?

the only thing I can recall asking me to do that is Rust... (which tracks, as the Rust ecosystem seems to have decided that the only kind of safety that matters is memory safety :) ).


That's not true. wget is more popular for this. And trusting PGP keys of random dudes adds an additional layer of (in)security.


Indeed. If you think Linux is inherently more secure than Windows, then I dare you to curl/wget some random sh script and run it as root.


Make sure to also test the safety of a spoon by scooping your eye out with it.


Linux best practice is to download packages from a signed repository; curl|bash is JS bros sniffing glue again.


Linux security will never stop feeling weird to me. Like when I try to do something that needs to be started with sudo, instead of warning me and just asking me if I want to run the command as root, I have to go back and input it again with sudo. Then some programs like VLC just straight-up refuse to run as root.
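
(Incidentally, most shells soften that first annoyance: in bash, `!!` expands to the previous command, so you don't have to retype it.)

    $ apt install vlc    # fails: permission denied
    $ sudo !!            # re-runs the same command as root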


> Like when I try to do something that needs to be started with sudo, instead of warning me and just asking me if I want to run the command as root, I have to go back and input it again with sudo.

This sounds like a misconfiguration or lack of support for policykit to me.

> Then some programs like VLC just straight-up refuse to run as root.

That, IMHO, should be the standard behaviour of most non-basic/GUI programs; running as root is overall a terrible idea.


Those are two great things for security that prevent you from:

    $ rm -rf *
    No permissions in this directory - want to try with sudo? (Y/n)


> As opposed to the Linux security best practices of curl | bash?

This comparison would only be valid if almost everything you install on Windows were at least theoretically inspectable before installation, instead of, well, almost nothing. LOL, nice cope.

Anyway, here's a Bash function you can add to your dotfiles to add confirmation to that sort of "workflow":

    confirm() {
      local tmpfile response
      tmpfile=$(mktemp)
      # tee splits stdin to stderr (so you can read the script) and to the temp file
      tee "$tmpfile" >&2
      echo >&2
      # Prompt the user, reading from the terminal since stdin is the pipe.
      read -p "Do you want to pass this code along? [Y/n] " response < /dev/tty
      case "$response" in
        [nN]*)
          echo "Operation cancelled." >&2
          rm "$tmpfile"
          exit 1
          ;;
        *)
          echo "Proceeding..." >&2
          cat "$tmpfile"
          rm "$tmpfile"
          ;;
      esac
    }
Now you can just take those one-shot install lines and stick this "confirm" function in the pipe like so:

curl <url> | confirm | bash ...

There's probably a slicker way to do this, if you're super into Bash. And you may want to `set -o pipefail` in general, so that the "exit" code of 1 actually gets seen...


Microsoft cannot fix this problem, because fixing it requires fundamentally pissing off large portions of users who do not want to change how they use Windows. They don't want to lose access to their legacy software. Start putting everything legacy in a nice little container, and people will freak out when they notice FPS loss or some things being a little odd. Limit people's Windows installations to only permit apps that use the newer permissions system, and they'll screech about wanting to install Chrome. Microsoft's attempts throughout the years, like 10S, indicate as much.


> You limit people's windows installations to only permit apps that utilize the newer permissions system, and they'll screech about wanting to install chrome. Microsoft's attempts throughout the years, like 10S indicate as much.

I don't think the 10S example works here at all. 10S was never locked down for security; it was locked down because it was for cheap devices meant to drive sales through the mandatory Microsoft Store software distribution. The deal offered with 10S was, effectively: you get cheap Windows, and it comes with software-distribution strings attached.

People understandably hated having a crippled copy of Windows on their cheap computer, because app availability on the Microsoft Store was poor at the time (and still is today). Whether it had security benefits wasn't the issue there; people just wanted to use their computers to run a Windows app and rightly got upset when 10S sometimes couldn't.

10S was arguably much more a product planning/marketing decision to offer cheap Windows PCs at ChromeBook price points, even if there were some security implications.


It works very well, because the Microsoft Store was trying to operate as other locked-down stores do, by enforcing better practices. Store apps used to have to use the new permissions system, for example. 10S was locked down for security, and Microsoft even encouraged it for bolstering secure environments on your own after they dropped the cheap-device angle; Microsoft also benefited through control of the Store. The two aren't mutually exclusive.

So yes, the security benefits is the issue. Once you impact people's ability to install Chrome and give it full permissions, they scream. 10S didn't allow this, and Google (just using them due to popularity and because they weren't trying to be malicious here either) didn't give a damn about adhering to the new app format's restrictions on permissions.


Sometimes you have to ignore what people want because the status quo does too much harm.

People objected to seatbelts in cars when they were introduced. Some people still do.

People object to EVs and don't believe that burning hydrocarbons is a problem.

We've mostly banned smoking in public places. A lot of bar and club owners thought that would kill their business, but it didn't.

We banned lead paint, despite the fact that it worked really well and covers just about anything without needing multiple coats.

It's easy to think of more examples.


Lead paint harmed a lot of innocent third parties, as did smoking near other people. Not sure I see the analogy.

When MacOS killed 32-bit libraries, it didn't save me from harm. It just made MacOS incapable of playing old steam games, and therefore my macbook air was no longer an acceptable laptop for vacations. I'm not saying it was a crime for Apple to change it -- OSes change. But it wasn't a benefit to me, and I took my business elsewhere.


Also Windows Vista. IIRC one of the main reasons users disliked it was incompatibility with some drivers and other software, but AFAIU a lot of these breakages were due to that software assuming admin permissions, and hooking into the kernel in undocumented ways, whereas Vista started the process of forcing software to actually use proper interfaces and made them actually get user permission to escalate privileges. A good thing, but which caused teething problems at first. I don't even blame users for their reaction, as the fact was their software wasn't working, regardless of the cause, but we got through that stage and now modern Windows is much more stable. I do still scorn Apple and others who criticised the existence of UAC - they should have been calling Microsoft out for taking so long to introduce it!


Well, the problem is that there are really many different markets for Windows. If my major use of Machine123 is to play old games, of course I'm going to be unhappy when Microsoft compromises my ability to do that. For that use, it might be better to lock down the OS in other ways (i.e., no internet).

The users were sold Windows as a solution to doing a wide variety of things. Now those things are getting compromised. They are not wrong that it was oversold.


> You start putting everything legacy in a nice little container, and people will freak out when they notice FPS loss or some things being a little odd

God forbid we waste a little processing power on security instead of the ever-expanding slime of bloated frameworks and nonsensical UI.

Seriously, someone competent, please bring an OS to market that can waste my CPU cycles on a robust sandboxing model, a la Android. Take my money. I'm tired of spending it on Apple's constantly degrading UX disaster and security half-assery.


> waste my CPU cycles on a robust sandboxing model, a la Android

Unfortunately Android's sandboxing sometimes literally wastes CPU cycles – when Google forced people more seriously to use the new scoped storage API, people stumbled across quite a few performance bugs once you stray past the very simple use cases.

And like almost all attempts at file sandboxing (except, to a limited extent, Apple's implementation for Macs), it's broken when interacting with more complex file formats that don't consist of a single atomic file. Using a file explorer to directly open that kind of file (e.g. a local collection of HTML files) in another app has become impossible that way, because the sandboxing system will only grant access to the one single file you've clicked on, and ignore any related files that are implicitly required, too.


It already happens. DirectDraw games run like shit on Windows 8 and up; you need to use DXGL or something like it that wraps ddraw.dll calls into modern DirectX or OpenGL.


As-is, every unprivileged application you run can

- take a screenshot

- record the screen

- capture audio output and input (microphone)

- use and record the camera, if present

- read almost every file

- write and delete most files (excluding some OS-owned ones, without elevation at least)

- capture mouse and keystrokes

- use the internet without too much restriction

(Tbf, it’s mostly the same on Linux with X11/Xorg, but at least there’s more/better sandboxing and packaging like flatpak - and Wayland).

macOS, AFAIK, is leading the way in this regard.


I think the error in your comment is assuming there was one cause, or "the" problem.

Popularity is a factor. Poor design of 90s software is another factor. Neither of these are 100% of the problem.


Agreed. macOS has much stronger security, even though Apple didn't wait for its users to get pounded by malware first. I think it has something to do with basic philosophy: MS supports corporate certified malware. If Adobe Creative Cloud needs a feature that uses 25% CPU 24/7 with telemetry and maxed permissions, MS is gonna support it.


You're conflating telemetry with permissions. macOS doesn't attempt to stop apps reporting how they're used, why would it? Instead Apple gathers such data and then keeps it for itself, requiring devs to go via Apple to get it.

macOS does have stronger security, but it's security in the form of stopping apps from accessing files and such until they have permission.


Mac OS (Classic Mac) had basically no security in the Win95 timeframe. It didn't even have address space isolation between processes.


Aren't there 50% more Android users than Windows users at the moment?

One would assume the average person has more personal data lying around on their phone than on their PC.

If popularity was all that mattered, Android would be the top target.


It is the top target, and so many Android phones are infected with malware that people just consider it normal these days; hell, it often comes preinstalled. The difference is that the people making the malware also have huge teams of lawyers, so they can get away with anything they want.


I think there's a difference between AT&T's contacts app and Mydoom.

YMMV.


Mac System 7 to OS 9 were not exactly security paragons, but they had far fewer virus issues. And that was the main alternative that anyone actually used.

Popularity was definitely a huge factor.


The possibility of there being a confluence of problems is one that's hard to grok, but doing so is ultimately fruitful.


> The only other alternative is turning your computer into a glorified phone (a.k.a. a locked-down media consumption device)

There’s a third alternative: keep the platform powerful but increase the default isolation level for third-party software and let the user choose what permissions it has. macOS is headed in this direction. Qubes is a more radical example and I think probably the future of desktop computing: everything will run in its own virtual machine.


This is where Linux is going as well. I believe that in the future most people's base systems will contain only necessary packages, with other software run out of Flatpaks or a comparable technology separate from the host.


This is how I run everything outside of coreutils in Linux.

Either with VM isolation or packed away in a container. Host is basically a hypervisor with external monitoring and logging.


Flatpak right now isn't really a security boundary. That said, I don't think there's any particular reason it couldn't or shouldn't be in the future. The model seems amenable to it, and maybe it was at one point the intent even.


macOS phones home before running an unverified app. I understand why they do it, but I'd rather they didn't. And I definitely hope that Linux doesn't go that route.

Windows ... well my expectation is low with regards to phoning home and security defaults.


> MacOS phones home before running an unverified app. I understand why they do it, but I'd rather they didn't.

Considering Apple was brazen enough to name the software that phones home "Gatekeeper", it is all too clear why they do it.

https://support.apple.com/guide/security/gatekeeper-and-runt...


... because it supervises and scrutinizes "traffic" coming through the "gate" for security reasons, before letting it inside the "walls" where it might do harm? Like a gatekeeper does?


This is already painful, though. The filesystem becomes a disjointed mess. Doing anything in an app becomes fighting permissions. Weird things break (e.g. I can drag and drop a screenshot from my downloads folder into the Discord Flatpak, but not from my screenshots folder).

Then, of course, apps simply get sloppy in requesting permissions... and in turn malware pretends to do the same.

The end result is now doing anything takes a half dozen prompts and isn't any more secure. It just sucks for me as a user.


The restrictions are a starting point for improving system security, not an end point. I agree that the prompts are annoying, there need to be better ways to delegate permissions.


Windows tried that with MSIX which brought sandboxing and a permission/capability system. They even made using that system a requirement for getting on the Microsoft Store. Developers nearly universally rejected dealing with that.

Now there's the Windows Sandbox, which tries to provide a strong security boundary through virtualization while still having the kernels cooperate on performance-critical matters (memory, CPU time and graphics).

Over time someone (maybe even Microsoft) could expand that to a Windows-based QubesOS-light. Having the user segment applications into containers (with temporary containers for sketchy stuff) but allowing all applications to show windows in the same Window manager might be a viable tradeoff that fills most security needs without breaking compatibility with any software.


We already have proof the model works both technically and commercially: phones. This is how Android and iOS have worked basically since the start. I imagine the reason most developers rejected MSIX was rational (there was no incentive to take a harder option when the older option was still there), but they could have done it if they were forced to. Exceptions are programs which actually do need high privileges, or those which require a degree of interaction with other programs that the safe interfaces do not allow (though Android anticipated this problem with the "intents" system from the start, which solves a lot of use cases).


The MSIX/AppContainer story is... long, sigh. IMO developers mostly rejected those technologies for reasons unrelated to permissions+capabilities:

- MS tried to use it to force developers to distribute their apps through the Store

- Eventually they gave up and let people use it outside the Store, but didn't invest enough in bug fixes and backporting the fixes, so it's impractical to use outside the Store

- It requires code signing outside the Store which is a huge pain


MSIX gets conflated with UWP but these days they are separate.

For a while, Microsoft tried to insist that Store apps were UWP apps, but UWP was basically a new OS platform and not really backwards compatible at all. By then, though, most Windows code had already been written and new development wasn't driving rewrites, so this attempt to get people off Win32 failed. Devs just rejected the store instead.

Project Reunion tried to fix this by bringing together the UWP and Win32 worlds. As part of that MSIX did indeed become available outside the app store and for Win32 apps which run as "full trust" i.e. with all permissions, and it was indeed unfortunately quite buggy. However, some bugs are fixed and others can be worked around. Conveyor [1] is a packaging tool (disclosure: made by my company) that can make and sign MSIX files from any platform, and it works around those bugs so you get a reliable install experience and of course, all the nice MSIX features that motivated the technology in the first place like Windows keeping your app up to date in the background, delta upgrades, sharing of files between different apps, clean uninstalls, being installable without admin privs and so on.

You don't have to code sign apps, btw. If you're OK with requiring admin escalation then Conveyor will self-sign your MSIX and install the relevant cert using a small stub EXE that the user invokes. This lets you get the MSIX feature set without paying for certificates. It's not really recommended though for anything except open source apps, as virus scanners will molest your processes in various ways.

That said, getting into the MS Store costs $19 for an individual and then MS code signs it for you, so it's cheap and easy. Conveyor can also release through that, including from Linux/macOS, after you get approved, but the approval process is more of a handwave these days than anything else.

So that's the current state of play: Win32 apps, unsandboxed, distributed either in or out of the Store using MSIX, signed or unsigned. Now Microsoft is working on bringing sandboxing and permissions to the Win32 world, not just UWP apps. However, it'll still only work for apps packaged as MSIX.

[1] https://hydraulic.dev/


I like the sandboxing model, but virtual machines add way too much abstraction and overhead. Apple's approach is really good here IMO: as a user I need to be able to do things like select any file at any time, and it will, under the hood, add that selection to the sandbox.
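
The mechanism behind that is essentially a broker: the file dialog runs outside the sandbox and hands back access to just the chosen file. Here's a rough sketch of the same idea using Unix fd passing (Python 3.9+ on a Unix-like OS; the hardcoded path stands in for whatever the user would pick in the dialog):

    import os
    import socket

    USER_PICKED = "/etc/hostname"  # stand-in for the user's dialog selection

    def broker(sock):
        # Trusted side: opens the chosen file read-only and passes only the fd.
        fd = os.open(USER_PICKED, os.O_RDONLY)
        socket.send_fds(sock, [b"granted"], [fd])
        os.close(fd)

    def sandboxed_app(sock):
        # Untrusted side: receives a capability (an open fd), not a path,
        # so it can read this one file and nothing else.
        msg, fds, _flags, _addr = socket.recv_fds(sock, 1024, 1)
        with os.fdopen(fds[0]) as f:
            print(msg.decode(), "->", f.read().strip())

    if __name__ == "__main__":
        parent_sock, child_sock = socket.socketpair()
        if os.fork() == 0:  # child plays the sandboxed app
            parent_sock.close()
            sandboxed_app(child_sock)
        else:               # parent plays the trusted broker
            child_sock.close()
            broker(parent_sock)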


I wonder how much overlap exists between the security fanatics and those who post the "why my OS laggy??" threads around here about once a month.

The collective brain damage from a 500ms+ start menu has probably already placed us in a death spiral.

Not saying you can't make an OS secure AND fast, but it definitely seems challenging if you also want to keep support for applications written 20 years ago.


VM technology currently adds a lot of conceptual and computational overhead in most cases but there are examples of it being integrated more seamlessly. WSL 2 is a good one.


> There’s a third alternative: keep the platform powerful but increase the default isolation level for third-party software and let the user choose what permissions it has.

That's the direction Windows is going as well with the MS-Store and appx/msix bundles.

There's also process memory protection using virtualization: https://support.microsoft.com/en-us/windows/core-isolation-e...


This is difficult to do effectively.


How do you mean? Phone OSes prove the model works, at least for ordinary "productivity apps". Linux kernel features to enforce it exist and are well tested, and are used by Docker and Flatpak. Microsoft implemented the technology, but had trouble getting developers to actually commit to it (which is rational: why choose a harder option if there's no incentive?). And macOS seems to be doing a good job of balancing increased default restrictions with maintaining escape hatches where needed.
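
For a concrete taste, here's a rough sketch of driving bubblewrap (the `bwrap` tool Flatpak builds on; it needs to be installed) from Python to run a command under those kernel primitives, no root required:

    import subprocess

    def run_sandboxed(cmd):
        # Read-only system files, private /tmp, fresh namespaces (including
        # network), and the sandbox dies if the parent process does.
        bwrap = [
            "bwrap",
            "--ro-bind", "/usr", "/usr",
            "--symlink", "usr/bin", "/bin",
            "--symlink", "usr/lib", "/lib",
            "--symlink", "usr/lib64", "/lib64",
            "--proc", "/proc",
            "--dev", "/dev",
            "--tmpfs", "/tmp",
            "--unshare-all",
            "--die-with-parent",
        ]
        return subprocess.run(bwrap + cmd).returncode

    if __name__ == "__main__":
        # The sandboxed shell sees no $HOME and has no network access.
        run_sandboxed(["sh", "-c", "ls / && echo sandboxed"])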


There’s a lot of software that doesn’t fit into the phone OS model that people generally find useful to have available. macOS has tried to bring some of this to the desktop, but beyond straight ports of mobile apps, their efforts to allow apps to progressively do more powerful things have largely been failures.


I agree, but I’m definitely curious what you see as the biggest challenges given your extensive experience with Apple OS internals.


Users are typically not in a good position to make decisions about permissions that are bolted onto a model that didn’t have them to begin with. Think of all the annoying permissions that macOS has currently that you mostly have to click through for a lot of stuff to work (“I want access to your desktop folder”, etc). Software that isn’t designed for a more limited model from the start often does not interact well with efforts to contain it.


It seems like your thesis is disproved by your own examples.

Linux is hugely popular on phones (Android), which are every bit as juicy a target (if not more so at this point) as desktops.

There is mobile malware, but it's far rarer and harder to come by.

But then there's just the fact that the way software is installed on a Linux machine is wildly different from how you'd install it on Windows. Just getting a binary blob to run requires some heroic effort (to the point where we've pretty much decided it's easier to distribute via containers rather than compiled binaries for a given desktop).

And if we expand beyond the desktop, we find linux everywhere in the server world. Easily the most popular OS to run server software. Which makes it a hugely valuable target for hackers. They'd love nothing more than to compromise a bank server.

To say there's nothing about Linux that makes it inherently more secure than Windows seems just unreal. Because *nix was designed around multiple users from the ground up, user permissions have been baked into the common flow for decades. That alone creates a huge layer of security that makes things like rootkits or worms running as root super hard to pull off. Old Windows (9x through XP) pretty much gave everyone admin permissions by default. Writing or changing a system32 DLL was child's play.

To exploit Linux, you have to either trick a user into running something with elevated permissions or find a vulnerability in software running with root permissions. To exploit Windows (particularly older Windows), you have to trick a user into running your software.


Android malware isn't rare or hard to come by at all. Every so often even Google has to remove a whole lot of apps from the store due to malware, and these are apps that went live and were downloaded. Some of it is even more intense than desktop malware, since people rarely store contacts or SMS on a desktop (the Joker malware, for example). You've negated your entire point.


The vast majority of "malware" on phones isn't software that exploits security bugs in the system software. Instead it's software that effectively asks the user nicely to give up their information, using mechanisms provided by the system software to do just that. This isn't something you can trivially prevent, as some software really does need access to your location, contacts, SMS, etc.

The response to true malware on Android isn't looking for and removing APKs from compromised devices after the fact, it's patching the vulnerabilities in system APIs.


The same is true for Windows... zero-day exploits are hard and get patched rather quickly in all major operating systems.

It is always the abuse of legitimate features that is the problem.


More intense than the desktop? I think not.

At the Windows malware peak, your system could be infected merely by having an internet connection. How many Android worms are there? None that I can think of.

Heck, Windows PCs were regularly infected by browsing the wrong website. Or getting served a malicious advertisement. Can you honestly say that people are getting infected on Android regularly by surfing the internet?

The vast majority of Android malware relies on social engineering to get the end user to grant a malicious app permissions to be malicious. That's hardly a failing of the OS. It's also nowhere near as bad as "I'm online and now risk being infected".


> Heck, Windows PCs were regularly infected by browsing the wrong website. Or getting served a malicious advertisement. Can you honestly say that people are getting infected on Android regularly by surfing the internet?

I have used Windows for nearly 2 decades and I can't tell you the last time my system was infected. I do agree that browsers are the largest vector of attack but that also means browser vendors share some of the largest responsibility for creating secure systems.


Two decades ago was near the end of that Windows-malware peak. And really, if you were behind a NAT two decades ago, that would have stopped nearly all of it, so you might not have noticed how bad it was unless you were supporting a lot of Windows machines in varied environments.

There was a span of a few years when a Windows box connected directly to the Internet, using a public address, would reliably get pwned before long, even with nobody using it. But that was quite a while ago, and, again, just being behind a NATing router mostly solved the problem (assuming nothing infected ever connected to your local network).


Why would you need root on a typical Linux system? A program running under your user account can alter config (in your home directory) to make sure it always runs when you log in, can add programs to your PATH, can access all your personal files, and can debug-attach to all processes and do what it likes with them (and your personal data) in order to spread mayhem. A lack of root access seems irrelevant to an individual user.

Android is more secure by not acting like a typical Linux install; it's not really evidence of Linux being a tricky target.
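
To make the point concrete, here's a quick sketch that enumerates some of those user-writable persistence spots; the paths are common Linux conventions rather than an exhaustive list:

    from pathlib import Path

    HOME = Path.home()

    # All writable by an ordinary user account; no root needed to persist.
    USER_PERSISTENCE = [
        HOME / ".bashrc",               # runs on every interactive shell
        HOME / ".profile",              # runs at login
        HOME / ".config/autostart",     # desktop autostart (.desktop files)
        HOME / ".config/systemd/user",  # per-user systemd services
        HOME / ".local/bin",            # often early on $PATH
    ]

    def audit():
        for path in USER_PERSISTENCE:
            if not path.exists():
                continue
            if path.is_dir():
                names = sorted(p.name for p in path.iterdir())
                print(f"{path}: {names if names else '(empty)'}")
            else:
                print(f"{path}: last modified {path.stat().st_mtime:.0f}")

    if __name__ == "__main__":
        audit()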


> Why would you need root on a typical Linux system? A program running under your user account can alter config (in your home directory) to make sure it always runs when you log in, can add programs to your PATH, can access all your personal files, and can debug-attach to all processes and do what it likes with them (and your personal data) in order to spread mayhem. A lack of root access seems irrelevant to an individual user.

All true, but I guess I'd just say that the main difference is that without root, recovery/removal of the virus is fairly simple (as is detection). To recover: reboot, log in as a different user, restore the .bashrc/startup configurations to defaults, remove the virus. Done.

If a virus gets root access, really the only safe way to recover is a full system wipe and reinstall.

But I would say that typical Linux is more secure than Android. It's fairly uncommon to install software from untrusted sources. On Debian, I'd do an `apt install xyz` for most stuff. I'm not typically installing unvetted software from the internet, certainly not something emailed to me.


Android basically doesn’t rely on users and groups in a traditional sense for security. Most of its security model was bolted on top of Linux.


> The reason why most (consumer-facing at least) malware isn't targeting Linux is because its desktop market share is like 3%.

I don't buy that argument anymore. In the mobile space, Android (Linux) is the biggest player. It is even bigger than Windows if both are counted among end users[1], and I don't see as many people complaining about malware on Android as on Windows.

Of course, I don't think MS is incompetent with regard to Windows security. But there are design decisions that make it historically problematic. The fact that Win9x had zero process isolation (although it was possible since the i386) while people expected programs to keep working on WinXP (NT kernel), the fact that centralized software distribution is a relative novelty on Windows (compared to apt, which has existed since 1998), and many other minor things, like extension hiding, make it an easier target than ChromeOS, iOS, Android, macOS and GNU/Linux.

I remember people saying "when Linux becomes as popular as Windows, you'll see it being targeted by malware devs". Well, consider smart TVs, infotainment, servers, supercomputers, embedded systems, and mobile (especially Android). Linux has been bigger than Windows for a long time. I don't think its lower desktop market share is the main reason for its lack of malware.

[1] https://gs.statcounter.com/os-market-share#monthly-202206-20...


I don't think that's a fair comparison, because Android phones allow the user to download signed apps from a "curated" store by default (I'm using the term curated very loosely here, but Google does make efforts to remove malware from their store).

If, by default, users only downloaded software directly from the Microsoft store, would Windows achieve a similar level of security?

As for smart TVs, infotainment, servers, etc. they all share the commonality that the end user doesn't typically download untrusted software. And if they do, it's typically from a vendor's own store.


> I don't see as many people complaining about malware on Android as people complain about it on windows.

Can't believe I'm seeing such a statement on HN. Android phones are arguably worse on malware-related threat vectors, especially when most OEMs themselves package in the majority of malware on consumer phones to begin with. Even assuming less dumb users, at its best it is an unholy combination of adware mixed with spyware, ridiculous amounts of tracking in the name of "telemetry" and consumer-hostile design choices often literally designed to make the user choose the wrong option. Combine that with how ridiculously easy it is to get malware installed on Android (the most popular apps and games on the Play Store are all adware, installing compromised "modded" apks that "unlock premium features" is just one tap away) and you get a platform that would make any infosec manager cry. At least Windows PCs are controllable by the organization's administrator, how are you going to control people's phones unless you start issuing company phones as well?


What a ridiculous take to even imply that malware on Android is comparable to the malware dumpster fire on Windows. There is more malware on Windows than there are apps. 99% of what is considered "malware" on Android is aggressive adware that is mostly benign. Remember when the security pundits were predicting that the Stagefright exploit would infect billions of Android phones? Stagefright didn't amount to anything.

> At least Windows PCs are controllable by the organization's administrator, how are you going to control people's phones unless you start issuing company phones as well?

You mean the same "controllable" Windows PC's that are responsible for nearly 100% of the ransomware, virus and malware infections in corporations? Right.


As the original comment said

> As long as some platform is capable and powerful for many things, there will be malware.

Android != GNU/Linux. iOS != macOS. GNU/Linux, Mac, and Windows are far more capable and powerful than mobile platforms, and therefore far more susceptible to malware. Plenty of Linux-based servers are hacked every day, and plenty of scanning bots target Linux-based software vulnerabilities over the internet.


Devices with Snapdragon and A-series SoCs are more powerful than your average corporate PC. The issue with Windows is that it has numerous attack vectors, and once you compromise one, the others will likely be compromised as well. Additionally, not all Android phones are equal. This is one of the main reasons Stagefright was so ineffective. An OS built by Google is probably not going to be susceptible to an attack that works on a Samsung Android build and vice versa. This is further complicated by the fact that not all smartphones are running the same OS version, which explains why writing malware to attack all of Android is so futile. Your malware may work on a Samsung Galaxy S23 running Android 13, but it will probably not work on the same phone running a different version of Android.


I don't fully agree on this one.

Linux has a wider attack vector since there are tons of packages out there. Yet the core has a lot of attention and many eyes on it, just because it is so open.

Vulnerabilities get patched sooner rather than later. Linux and GNU packages are running basically the entire internet, so there is definitely incentive to break into it.

It's also a lot clearer in linux when a process is doing something it shouldn't, since it's a lot easier to probe into it to check what's going on.


> It's also a lot clearer in linux when a process is doing something it shouldn't, since it's a lot easier to probe into it to check what's going on.

Is this true? It's been a while but I remember being able to set performance monitors on almost anything in Windows. It seemed to have very robust instrumentation support.


> Linux has a wider attack vector

Nit but you probably mean attack surface. A vector doesn't have a width.


Is there any evidence that security vulnerabilities are on average fixed faster in the major Linux distributions than in Windows?


Some. Here's one for the Linux kernel: https://googleprojectzero.blogspot.com/2022/02/a-walk-throug...

The dataset is quite small, but on average it took Linux 25 days to fix a 0-day while it took Microsoft 83 days.


Does that metric include the delay of fixes getting incorporated into Linux distributions (and pushed out, assuming automatic updates—maybe not a good assumption) or Windows fixes getting deployed via Windows Update?

Edit: I don’t know much about this topic, but thought “time to deployment of a fix” might be more useful. Edit again: also unclear if the comparison is “apples to apples”.


I doubt it, any more than it includes the time it takes procrastinating users to actually update their systems.


The dataset does not appear to discuss the lifecycle of Linux distributions taking the security patches from upstream, nor the update process for all of the downstream distributions.

Something that's been widely discussed elsewhere is how often security issues are silently fixed in Linus's repo and therefore not picked up by distributions for their stable/LTS releases.

I buy the immediacy of patches if you compile your own kernel from the latest kernel.org sources, not if you're relying on distributions.


> Vulnerabilities get patched rather sooner than later.

Unless they're in the file systems, in which case it's in the too hard basket.


Sandboxing isn’t incompatible with a highly customizable OS. Malware is really more a question of software being able to install itself without the user’s control, and the inability to remove such installations after the fact.

Windows suffers from malware in no small part due to the system’s design rather than simply being common. Plenty of alternatives have more users than Windows did back in the late ’90s when it was a huge target.


I think the crux of the issue is that making a system that's customizable AND sandboxed AND user-friendly multiplies together into development and testing effort that's not palatable for most right now.

Even the open source offerings that add sandboxing often drop either the customizability or the user friendliness.


> Sandboxing isn’t incompatible with a highly customizable OS.

This is true, but there is much badly designed sandboxing. Many of the sandbox systems on Linux have problems, including: too-broad permissions; assuming text (including file names) is Unicode; failing to consider that the permissions a program needs can differ by command-line switches and configuration files; failing to consider other things the user might want to connect, such as access to devices through files and other programs; and the case where a program prompts the user at run time for the name of another program to run (using popen or something like that), where that program should run with its own permissions instead. (Many of my programs that have an interactive mode do run other programs with popen, prompting the user at run time to specify what command to run, so it can vary.)
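
To pick just one of those: on Linux, file names are bytes, so a sandbox that compares paths as Unicode text can be confused by a perfectly legal name that isn't valid UTF-8. A small demonstration sketch:

    import os
    import tempfile

    with tempfile.TemporaryDirectory() as d:
        # A legal Linux file name that is not valid UTF-8.
        raw_name = b"report-\xff.txt"
        open(os.path.join(d.encode(), raw_name), "wb").close()

        # Listing as str smuggles the bytes through surrogate escapes, so
        # naive text comparisons against an allowlist will not match.
        for name in os.listdir(d):
            try:
                name.encode("utf-8")
                print(repr(name), "is clean UTF-8")
            except UnicodeEncodeError:
                print(repr(name), "is NOT valid UTF-8 (surrogateescape)")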

I had my own ideas of operating system design, which is different from POSIX (although there is a POSIX compatibility layer available as a C library; the kernel need not know anything about it). One of its major ideas is that it uses capability-based security with proxy capabilities; these can be used for sandboxing and other high-level features implemented in terms of proxy capabilities. All I/O (except the Yield and Quit syscalls) requires capabilities; capabilities (as well as links) can also be passed in messages through other capabilities, including the initial message. There are many other ideas too, including ways for programs to be connected together and with user code by user specifications. These ideas allow working without dropping customizability, etc. (in fact, the way I thought of doing it would actually improve customizability, even for programs that do not normally have this feature).


This is such crap.

A very long time ago, Windows normalized the absolute worst security practices ever. This was never meaningfully addressed/punished publicly and we just kind of drifted to today -- where we're stuck with absurdities like the fact that you can't use a USB key literally as intended. No other product is this bad in terms of security; bread will not destroy your toaster the way a USB key can your computer.

You can't JUST put this on market share.


Windows being inherently insecure hasn't been the issue since at least mid-XP. It's also a completely moot point, because nearly any security breach since then could have been conducted over snail mail; it has nothing to do with how computers are programmed. While everyone was complaining about Windows security, stuff like Shellshock sat in the open for decades, including on all the servers that were supposedly so secure.

Every organization that does internal phishing testing still fails every time. Any modern discussion about information security that doesn't deal with that is a red herring, and provides zero utility to anyone who isn't the enemy of a nation-state. Focusing on the remaining few buffer overflows that take a chain of ten other exploits to even reach in the first place, while everyone's data and info is leaked daily because the CEO clicks everything in an email, is a dereliction of duty. It's like investing in Star Wars and magic lasers that can't work while placing nukes on Moscow's doorstep.

"Security Researches" keep looking for the buffer overflows because that's fun and they don't want to admit that the real problem is a social one because that's hard and boring and doesn't let them play with the newest fuzzer or get them a $100k bounty.


I get what you mean here, but I still can't help but think that the Windows "make everything run really easy" mentality still crept in and stuck around, and to this day prevents software from implementing better ideas of "who or what really needs to run executable code?"

Like Javascript? Sure, it's VERY versatile, but "just download arbitrary code and run it in the browser?" That should have never happened in the way it has.


Linux has both a huge attack surface and a huge focus from the malware and TA (threat actor) sphere. I worry about my Linux environment far more than my Windows environment. Properly securing, monitoring and responding in a Linux environment is much harder than in a Windows environment for a SOC. The enterprise tool set lags in this space by a lot, and the TAs targeting Linux are generally FAR more sophisticated.


This is definitely true, and I think there are at least two points worth considering here.

- Part of what makes the mainstream OS terrible is the mere fact that it is mainstream. If Linux hit 60-70% adoption, a plague of terrible software, adware, malware, and more would start degrading its quality.

- Despite the points above, it would be really nice if some of the lousy things pointed out in the graphic were deprecated.


Free Software licenses typically say basically “I’m giving you this for free, so you take it as it is, no promises.” You get the guarantees that you pay for.

But nobody would write some of the absolute schlock they get over in proprietary-land if they didn’t think they could dupe unwitting consumers into paying for it.


Wait, you say malware doesn't target Linux because it has no market share? The OS that is ubiquitous on servers and mobile?


Typically there aren’t users on the servers. The servers aren’t used for browsing the internet or checking emails.


The rewards are greater on the server.

If you pwn just one server you can attack thousands of people, their data, their credentials, etc.

Saying that there's no malware for Linux because there's no reward is myopic - the payoff is potentially larger.


> Saying that there's no malware for Linux because there's no reward is myopic - the payoff is potentially larger.

I didn’t say that. My comment was really just noting that most attacks (mal/ransomware/phishing/exploits) very frequently need some sort of user interaction. Without users, or users doing user stuff, it's harder to get things to execute on the machine/server. Sure, if the server is in the DMZ and unpatched then, yes, it will be hammered by scanners and automated exploiters. With proper security hygiene and a proper patch cadence, servers are usually more protected through defense in depth and the lack of a human.


You are correct, you most certainly did not say that, and I apologise for implying that you did.

Mea culpa.


They're configured and administered by professionals though, not your grandma (probably).


One would hope, but that doesn't solve many problems.

Just this week I did some work for a client (a tech company) on a public facing webapp.

After fixing the issues, I gave the manager and their architects who had been reviewing my PRs a short list of errors that I noticed in the current app.

They politely declined to have them fixed, but want to proceed with another engagement for more features.

You just cannot win sometimes ...


I think we've probably all seen mistakes being made by people who should know better in the industry but servers must still be a much harder target than, for example, my dad, who somehow gets his browser hijacked by a different malicious extension a couple of times a year.

It's been probably 20 years since I've seen passwords stored as plain text at any company I've dealt with, which is some progress at least!


I think there's a unique element to Windows with its attempt to be extremely backwards compatible. This can be a tremendous boon when, say, running older software targeting a previous OS, but it introduces vulnerabilities since your dependency tree has such deep roots. It's definitely a good target because god knows how many banks/hospitals/etc are running Windows and have critical business data in Excel sheets or Power BI or whatever, but it doesn't help that Windows itself is constructed of layers and layers of older code that can't be sacrificed without wrecking some client workflow. I mean, that screenshot of 10 different design styles in Windows 11 kinda goes to show how much of it is just ported over arbitrarily.


It's not just the kernel, it's the whole ecosystem. How does Linux have anything resembling OLE? SMB? Sharepoint paths on the public Internet?

It's an operating system that automatically executes code found on USB sticks.

How is this even a discussion?


I'm far from a security expert, but this article (or a related one) was posted recently on HN and covers some fairly technical reasons why Linux isn't as secure as commonly believed [0]. While that may be true, I think in practical terms you are less likely to encounter malware etc. on Linux than Windows, but that could very well be because it is a smaller target, as you mention.

https://madaidans-insecurities.github.io/linux.html


You can lock down your computer and keep it a computer. For an example, look at the work going on with Fedora Silverblue [1], where you have an immutable OS install and use containers and Flatpaks for everything. It is coming along nicely, with side projects to allow for customization via Dockerfiles [2].

[1] https://fedoraproject.org/silverblue/ [2] https://github.com/ublue-os


Steam OS also works this way.


> The only other alternative is turning your computer into a glorified phone (a.k.a. a locked-down media consumption device) where everything is nicely sandboxed and nothing has any kind of permission to do "bad" things.

Your understanding of security research is badly out of date.

Turns out that, since the 80's, we've found a lot of ways to make computing platforms much more secure while sacrificing little flexibility.

For instance: capabilities. Apps working under a capability model get denied access to resources by default. If you later determine that yes, you want a program to be able to access the internet, then you can grant it that capability...but, say, only while the program is running, and you can revoke it at any time.

The dichotomy of "either malware or a locked down media-consumption device" is completely false.
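
A rough sketch of what grant-and-revoke looks like in the object-capability style (class names here are illustrative, not any real API): the app only ever holds a proxy, so the grantor can sever access at any moment.

    class Revoked(Exception):
        pass

    class Capability:
        """Proxy standing between the app and a real resource."""
        def __init__(self, resource):
            self._resource = resource

        def invoke(self, method, *args):
            if self._resource is None:
                raise Revoked(f"capability for {method!r} was revoked")
            return getattr(self._resource, method)(*args)

        def revoke(self):
            # Severing the reference kills every holder's access at once.
            self._resource = None

    class FakeNetwork:
        def fetch(self, url):
            return f"contents of {url}"

    if __name__ == "__main__":
        cap = Capability(FakeNetwork())          # grant while running
        print(cap.invoke("fetch", "https://example.org"))
        cap.revoke()                             # user changes their mind
        cap.invoke("fetch", "https://example.org")  # raises Revoked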


Heterogeneity is an important part of defense in depth. Monocultures are more likely to be attacked.


> As long as some platform is capable and powerful for many things, there will be malware.

This is true, but there are also degrees of it. Windows in particular is a graveyard of discarded tech waiting to be galvanized by malware, because of the backwards compatibility and because of Microsoft's habit of abandoning half-done frameworks and APIs. Apple's stuff is much tidier just because they regularly deprecate and compress their fully owned stack (although they have their turds too, of course). In Linux, there's terrible fragmentation and a lot of ancient, barely maintained stuff, but at the same time it can be customized to only include best practices and omit a lot of dead weight.


> The reason why most (consumer-facing at least) malware isn't targeting Linux is because its desktop market share is like 3%.

This is also why there's not as much software in general. So if a lack of regular software is a valid reason not to use Linux, a lack of malware is also a valid reason to use it.


> As long as some platform is capable and powerful for many things, there will be malware.

Might want to rephrase that, uhm, Linux?

Windows is a platform that is accessible to the dumbest (and most disinterested) users in the world. No offense, but phishers, malware authors and spammers all rely on a sucker buying OEM every minute.


I see a ton of Linux malware as part of my job, but it's a different kind from Windows malware (which I also reverse/research). On Linux the focus is on server/enterprise, so things like webshells, miners, and data scraping are very common.


This, and I think the Linux desktop is way less attractive as a target since it's not used on corporate machines as often. Corporate Windows desktops are probably much more interesting data-wise.


Proper sandboxing has been a missing feature in Unix ever since the SUID bit was introduced, and it was slowly mitigated by adding layers of virtualization instead of OS-level controls.


This argument is very unpopular when it’s used to explain why iOS is locked down.

Usually there is a lot of pushback along the lines that APIs should simply be made secure.


I thought pip and the apt repos were found to have really malicious packages you could easily pull and install.


Apple Macs had like single-digit market share in the 1980s, yet were hotbeds for viruses.


You are underestimating the number of IoT botnets that run on Linux and the amount of Android malware.


Would argue 95+% of people would be fine with this kind of setup on their desktop.

