Open source drivers are rare on Windows because manufacturers almost always ship proprietary drivers that are good enough, and Windows users clearly have no issues running closed-source software.
Proprietary drivers on Linux are often crap, if they even exist at all.
Linux purposely makes proprietary drivers crap. The kernel offers no stable binary interface, so out-of-tree drivers break every time the kernel updates unless they're maintained in the kernel tree. This forces manufacturers to either open source their drivers, ship none at all, or put in the work to keep them up to date with every release.
It seems like forcing that all-or-nothing choice made a lot of OEMs open source their drivers or provide none, which led to the community writing them instead.
No it doesn't: the devs refuse to provide stable in-kernel APIs because they want the flexibility to modify them as they please when a better solution comes along. Maintaining support for proprietary drivers is also harder because they're black boxes, not only for debugging but also for security and stability.
Nvidia is basically the one major holdout these days, and its proprietary driver for Linux is very good, so it's not as if it's impossible to maintain a proprietary driver in the Linux ecosystem. The motivation comes from Linux being huge in accelerated computing and 3D, not from any particular love for Linux on Nvidia's part.
Indeed, the lack of a stable interface has made it cumbersome to maintain an out-of-tree driver, which is GREAT, since it means hardware vendors are more likely to open source their drivers or at least publish enough documentation for a third party to write them. This ends up being a huge part of Linux's success: it supports the widest range of hardware of any system 'out of the box', and that hardware support then works on every platform Linux runs on, which is practically everything under the sun.
And as if that weren't enough, it's also a boon for alternative systems that will never see official proprietary drivers due to being niche, since they can port Linux drivers or even add Linux driver compatibility layers.
As the situation on Windows shows us, the alternative is drivers that are crap for other reasons. If Linux offered a stable binary interface for drivers, we'd have proprietary drivers that "worked" but were still crap insofar as they were essentially malware, as is the case with this Wacom driver.
Probably because desktop users expect their hardware to keep working for a long time and to keep up with OS updates. On mobile, people have been conditioned to accept throwing a device away and buying a new one every two years.
> Device managers don't care about Linux anyway, and wouldn't suddenly start caring if Linux announced a stable ABI.
From what I've seen, facilitating proprietary drivers seems to be the motivation of most people lamenting the lack of a stable ABI. One example is the comment I responded to: "Linux purposely makes proprietary drivers crap. [...]"
Discounting proprietary drivers under the assumption that they wouldn't be written anyway, what does a stable ABI afford us? Out-of-tree FOSS drivers? In other words, drivers that aren't good enough to be accepted into the kernel?
Out-of-tree FOSS drivers are mostly unaffected anyway, since most of the time you can just recompile them against the latest kernel.
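To make the "just recompile" point concrete, here's a minimal sketch of an out-of-tree module; the file name, messages, and Kbuild line are made up purely for illustration:

    /* hello_oot.c - a hypothetical out-of-tree module, not tied to any real hardware. */
    #include <linux/module.h>
    #include <linux/init.h>

    static int __init hello_oot_init(void)
    {
        pr_info("hello_oot: loaded\n");
        return 0;
    }

    static void __exit hello_oot_exit(void)
    {
        pr_info("hello_oot: unloaded\n");
    }

    module_init(hello_oot_init);
    module_exit(hello_oot_exit);
    MODULE_LICENSE("GPL");

    /*
     * With a one-line Kbuild file ("obj-m += hello_oot.o"), running
     *     make -C /lib/modules/$(uname -r)/build M=$PWD modules
     * rebuilds it against whichever kernel is currently installed; DKMS exists
     * to automate exactly this rebuild on every kernel update.
     */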
Also, when I said Linux made proprietary drivers crap, I meant that as a good thing. It led to open source drivers where there otherwise would have been none. Some OEMs like AMD eventually went open source on Linux while remaining proprietary on Windows.
There is also the little matter of Windows (since 7 I think?) requiring kernel drivers to be code signed, unless you want to run your system with a permanent "Development mode" text overlay, not to mention the arcane procedure required to activate that in the first place. (You can't add another cert to the trusted set, either.)
So that puts a little damper on the whole "open source" thing. Of course, it's also not effective at all: Stuxnet was famously signed with a certificate stolen from Realtek.
The process to get a driver signed doesn't seem too hard for an open source project. The biggest hurdle is the certificate costing around $300/year as far as I can tell, so it would need to be a project with a reliable stream of donations or an author (or authors) willing to pay for it.
Not too long ago I had to do some INF editing to get a driver installed on Win10. The editing invalidated the signature, so it (silently!) refused to install, but booting with the "Disable driver signature enforcement" option let it install, and it continued to load and work normally even after I booted back into normal mode. This was only a few months ago, so unless something drastic has changed since then, maybe it's not that hard to install drivers with bad signatures (or missing ones, which amounts to the same thing if an arbitrary signature will do). I thought I'd be out of luck and have to resort to something deeper and less reliable like kernel patching (tools exist for that, but they get flagged as malware, and you have to redo it after every update...), so this was an unexpected positive.
Editing the INF invalidates the driver package's signature, which can also be worked around by adding a certificate to the Trusted Publisher store, which is mutable (as Zadig/libwdi does), but the actual kernel-mode .sys binary still needs a valid Authenticode signature unless the system is in driver Developer Mode. Your method worked for installing a modified INF file, but it will not work for installing a modified binary.
Sounds like my Linux experience in the late 1990s: lots of weird invocations without understanding what they do, just to keep the system barely functional. The roles sure have reversed...
As for manually forcing a particular signed binary for a specific device, the “Have disk...” or “manually select from” route still works without that developer mode nonsense.
That's trivially easy to get around by installing your own CA cert when you install the driver.
This is arguably worse security-wise, but it makes the driver install process look identical to the way it used to be as far as the average consumer can tell. This is why (IMO) free software is so important, to the point where I've begun to agree with the radicals and think it should be mandatory.
There are two separate authentication processes for drivers on Windows: Authenticode signing of the kernel-mode driver (.sys), which is strictly enforced, and driver package signing (the .cat/.inf installation package), which relies on a mutable root store called the Trusted Publisher system store. Zadig works by adding its own root certificate to the Trusted Publisher store and self-signing the installation packages, but the three possible installed drivers (WinUSB, libusb0, and libusbK) all still carry valid Authenticode signatures.
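For illustration, the package-signing half of that trick looks roughly like this through the Win32 CryptoAPI; this is a sketch of the general technique, not Zadig's actual code, and cert_der/cert_len stand in for a DER-encoded certificate you already have (writing to the machine store also requires administrator rights):

    #include <windows.h>
    #include <wincrypt.h>

    /* Sketch: trust a (self-generated) publisher certificate so a self-signed
     * driver package (.cat/.inf) installs without prompting. */
    static BOOL trust_publisher_cert(const BYTE *cert_der, DWORD cert_len)
    {
        /* Open the per-machine "TrustedPublisher" system store (mutable). */
        HCERTSTORE store = CertOpenStore(CERT_STORE_PROV_SYSTEM_A, 0, 0,
                                         CERT_SYSTEM_STORE_LOCAL_MACHINE,
                                         "TrustedPublisher");
        if (!store)
            return FALSE;

        /* Add (or replace) the certificate in that store. */
        BOOL ok = CertAddEncodedCertificateToStore(store, X509_ASN_ENCODING,
                                                   cert_der, cert_len,
                                                   CERT_STORE_ADD_REPLACE_EXISTING,
                                                   NULL);
        CertCloseStore(store, 0);
        return ok;
    }

Note that nothing here touches the .sys itself; the kernel-mode binary still has to pass Authenticode checks when it's loaded.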
Yes, it is its own list for software publisher signing specifically, and is separate from the Trusted Root Certification Authorities certificate store.
Windows users should push for FOSS drivers as well, even when the proprietary ones run perfectly. Privacy and security issues aside, being forced to depend on a closed driver means the manufacturer can make the product obsolete just by dropping support on newer Windows versions, essentially turning it into trash and forcing the user to buy a new one.
The vast majority of Windows users don't really care about that though.
In the 90s, software modems ("winmodems", see [0]) were popular because they were cheaper than using dedicated hardware for generating and decoding the audio signals sent over the phone line. Those would break if the manufacturer didn't upgrade their driver for newer versions of Windows, since they're completely software driven.
I'd be very surprised if things have changed since then, and I bet that the majority of consumers would just pick the cheapest option at the big box store.
I remember getting PCI based Winmodems working on my old machines back in the late 90s/early 2000s. A lot of people bought ISA modems or physical COM port models so they wouldn't have to deal with that bullshit.
Now most Linux distributions are littered with binary blobs in linux-firmware that have to be loaded for everything from Wi-Fi to Bluetooth. We've gone in the total opposite direction from where we should be... except for, like, amdgpu.
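For context, those blobs get pulled in by the kernel's firmware loader at probe time; here's a rough sketch of the usual pattern, with a made-up device and file name:

    #include <linux/firmware.h>
    #include <linux/device.h>

    /* Sketch of how a driver loads a blob shipped in the linux-firmware package;
     * "vendor/wifi-blob.bin" is a made-up name. */
    static int example_load_blob(struct device *dev)
    {
        const struct firmware *fw;
        int err;

        /* Looks the file up under /lib/firmware. */
        err = request_firmware(&fw, "vendor/wifi-blob.bin", dev);
        if (err)
            return err;

        /* A real driver would now upload fw->data (fw->size bytes) to the device. */

        release_firmware(fw);
        return 0;
    }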
Winmodems were not popular with anybody except manufacturers, and exponentially increased the support issues with modems as an entire technology. Read: people had more problems with Winmodems than they did before.
Not true in all locales. In 2001 I bought a computer locally (Haifa, Israel) with a winmodem. I installed Red Hat on it, so obviously the winmodem would not work. I could not find a hardware modem locally. The only solution that I had was to go with an ADSL line (250k up, 750k down) which cost a fortune but whose modem would plug into the network card.
It would be a full two years before I would see any other home users on anything other than dialup. 750k down in 2001 was so impressive, you could start listening to songs on Kazaa as soon as the download started!
The Windows approach to driver certification makes this really difficult. Microsoft virtually requires every driver maintainer to pay $100-odd every couple of years for a signing certificate (and, ironically enough, this is done in the name of improving driver quality). For a corporation that's nothing, but it's a pretty steep ask for the hobbyist maintainers that OSS drivers tend to rely on.
> Proprietary drivers on Linux are often crap, if they even exist at all.
Not so with Nvidia GPUs. The open drivers are awful; the proprietary drivers are good.
(But it IS the case with AMD GPUs, to the point where the proprietary driver seems to perform worse[0] and everyone pretends it doesn't even exist, which is upside-down and unintuitive coming from (a) Windows and (b) Nvidia.)
I have to use their proprietary drivers and I beg to differ. Nvidia drivers are still crap, given all the pain you have to endure to get them running. Yes, Nvidia has managed to sneak into the Linux world through CUDA, but as far as ease of use goes, they are still nothing short of crap. Not to mention if you want to use anything other than Ubuntu.
The proprietary drivers are not "good". They don't support Wayland, which has been the default on many distros for years. They also don't support prerelease or custom kernels. I had to build a custom kernel to include some patches for new hardware I just got and found out it was impossible for me to use the Nvidia drivers on it. I ended up getting an AMD card because of that.
As a developer, however, I find Nvidia's closed-source drivers buggy as hell. The number of issues, and the number of times they break the spec, is astounding and a constant annoyance. AMD and Intel via the open Mesa drivers are blissful in comparison, plus there's the amazing debuggability.
I would argue that Nouveau is bad purely because of poor performance, and the proprietary drivers are merely tolerable to set up but perform well once you have them working.
One thing I can say for Nouveau over the proprietary drivers is that it actually works without any real fuss. I've run into numerous instances where the proprietary drivers would prevent the system from booting. And I've yet to get them to work at all with any realtime kernel in Manjaro.
And then we get into the nightmare that is any laptop with an integrated Intel GPU and a dedicated Nvidia GPU...
> I would argue that Nouveau is bad purely because of poor performance
That and, if you have a G-SYNC monitor (which, in retrospect, you shouldn't, but I and several friends of mine do), it won't work at all with the Nouveau driver. :D