Maybe I just don't understand The Linux Way, but I've truly never seen the point in the package manager debate. There exists some website that is the authoritative source for the software. That may be a normal website or it may be a GitHub repo, but it is there. A package manager associates it with some slug, but so does the browser; the latter just has more dots in it. A package manager pulls it from some centralized archive, but you weren't manually typing the URL to begin with. It does in fact centralize the update flow, but that's only beneficial when you update everything at once, which I don't have any reason to be doing: either I want it on latest, with the only interruption being "not quite yet", or I don't want it updating at all. The primary meaningful value-add of a package manager is the package archive being manually checked for compatibility, which is actually bogus if you scratch the surface, and regardless isn't really necessary on Windows, where everything just works. Downloading and executing an MSI does not seem to be a meaningfully different operation than writing the apt-get command; I think it has a lot more to do with the visual effect of it not happening in a web browser and showing up in the downloads list.
Aside from centralized, trusted package archives with signed packages, vetting, and secure downloads and verification, one difference is that with individual install.exe's, the installer can do whatever it wants. Windows package managers are just a facade over this: they just run the installer for you.
An actual package manager _is_ the installer: it's part of the OS and knows how to correctly install and fully uninstall things. Someone else in this thread wrote a nice summary of this aspect here: https://jmmv.dev/2022/03/a-year-on-windows-winget.html#unins...
This part is mostly true of MSIs: they're just a package format like .deb or .rpm; they just don't have any of the other infrastructure around them.
> It does in fact centralize the update flow, but that's only beneficial when you update everything at once, which I don't have any reason to be doing: either I want it on latest, with the only interruption being not quite yet, or I don't want it updating at all
Having this all centralized is actually really nice, so I wouldn't discount that. If you want to pin a package at a particular version, you can and the package manager will leave it alone. If you don't want things updated yet, just don't run the update yet.
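To make the pinning point concrete, here's a toy sketch (nothing like apt's real resolver; all package names and versions are invented) of a centralized upgrade pass that honours held packages, in the spirit of `apt-mark hold`:

```python
# Toy sketch of a centralized upgrade pass that skips pinned ("held")
# packages. Package names and versions below are made up.

def plan_upgrades(installed, latest, held):
    """Return {name: new_version} for everything an upgrade run would touch."""
    return {
        name: latest[name]
        for name, current in installed.items()
        if name not in held and latest.get(name, current) != current
    }

installed = {"editor": "1.0", "browser": "99.0", "toolchain": "11.2"}
latest = {"editor": "1.1", "browser": "100.0", "toolchain": "12.0"}
held = {"toolchain"}  # pinned: the upgrade run leaves it alone

print(plan_upgrades(installed, latest, held))
```

Everything not held gets brought to latest in one pass; the held package stays put until you unpin it, which is exactly the "update everything except what I said" behaviour you don't get from 100 separate auto-updaters.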
I'm always uneasy when I install software on Windows because it means I have to trust another third party. If I have 100 programs on my Windows PC, I've had to trust 100 third-party organisations with my computer and all the data on it. Not only that, but I've also had to trust my own judgement to find the correct website for all those programs, and not copycats that wrap the program in malware. Sometimes I check Wikipedia to find the official site, and sometimes I find an official-looking GitHub repo with a lot of stars and work back to find the official site, but sometimes it doesn't matter anyway because the program itself becomes malware (e.g. CCleaner).
With a package manager I have to trust one organisation, which is the same one I get my operating system from. I don't even have to trust the website that hosts the packages, since the packages are signed. It's true that someone might try to sneak malware under the package maintainer's nose with obfuscated source code, but that's much harder than sneaking it into a binary. You're practically guaranteed that the software running on your PC matches the publicly available source code, and it's built with the latest compilers and hardening flags. (7-Zip used to build without any hardening flags because they made the binary larger.[1])
It's not even just security issues I'm concerned about when it comes to trusting software. I also have to trust the software not to make global, permanent changes to my system, which isn't uncommon with Windows software[2], but basically unheard of with Linux packages - and when it does happen, it's Microsoft doing it[3]. So I'd say the primary meaningful value-add of a package manager is trust.
> Downloading and executing an MSI does not seem to be a meaningfully different operation than writing the apt-get command;
For fetching development libraries, it's huge.
If I run `make` and it turns out I need libfrobnicate-dev, I can just quickly `sudo apt install libfrobnicate-dev`. I don't need to pull up a browser, Google it, find the MSI, make sure I'm grabbing the right version (seriously, why the hell does 32-bit Windows still exist?), download, and run it.
Oh, and what about when that dev library gets a new version? `sudo apt upgrade`. Done. All my libraries get updated, no need to track down 20 MSIs.
> Downloading and executing an MSI does not seem to be a meaningfully different operation than writing the apt-get command
If nothing else, unlike downloading and executing an MSI, apt will download the package, verify its checksum, and verify the signature on the repository index that checksum came from.
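The checksum half of that is easy to sketch. Assuming the expected hash was read out of the signed package index (the GPG signature check on the index itself is left out here), the core comparison apt performs looks roughly like:

```python
import hashlib

def checksum_ok(package_bytes, expected_sha256):
    # apt compares the hash of the fetched .deb against the value
    # recorded in the signed index; a mismatch aborts the install.
    return hashlib.sha256(package_bytes).hexdigest() == expected_sha256

pkg = b"pretend this is a .deb"             # stand-in payload
expected = hashlib.sha256(pkg).hexdigest()  # would come from the signed index

print(checksum_ok(pkg, expected))         # untampered download passes
print(checksum_ok(pkg + b"!", expected))  # any modification fails
```

A random MSI downloaded over the web gets none of this unless the vendor bothered to sign it and you bothered to check.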
A debate requires two sides. The package manager "debate" only has one side. I've never seen someone defend the Windows way of downloading apps (until now).
Probably because "the Windows way" seems normal, and it's not something most people even think could be done differently, let alone whether it should be done differently.
In my (albeit limited) experience of showing someone how to get something on Linux, the reaction usually is: "What? ...Why? Is it installed now?", shortly followed by: "Is there like an app I can use to install what I need instead?"