Wayland misconceptions debunked (drewdevault.com)
254 points by Sir_Cmpwn 12 days ago | 229 comments

There are an awful lot of comments on here that are trying to protect Nvidia's anti-Linux position.

Nvidia's position, in effect, is that their GPUs are not Linux compatible. The Linux ecosystem relies on the Mesa stack (Mesa, DRI, DRM, GBM, KMS, etc.), and DRM is part of the upstream kernel itself... Nvidia has openly declined to use any of these APIs.

This is exactly equivalent to Nvidia not using WDDM and DXGI. If Nvidia chose to do that, Microsoft would outright ban them from driver certification (thus the platform entirely), so why is it wrong for Linux and Wayland to do the same?

The Linux community reached out to Nvidia and helped them along with Nouveau, and at one point even threatened to sue them. Linus Torvalds has told them that he doesn't care about them if they won't actually participate in Linux, and that the kernel will continue to break their drivers because they just don't listen.


> There are an awful lot of comments on here that are trying to protect Nvidia's anti-Linux position.

No, you are taking a with-us-or-against-us mentality. Users and application developers want things to work as broadly as possible, regardless of the graphics card or driver being used. Whether that helps or hurts various parties regarding Nvidia's closed-source stance is irrelevant.

> Nvidia's position, literally, is their GPUs are not Linux compatible

Citation needed. If this is their position, why would they offer Linux driver downloads? Not supporting something in a way Linux would prefer is not the same as not supporting something.

>This is exactly equivalent to Nvidia not using WDDM and DXGI. If Nvidia chose to do that, Microsoft would outright ban them from driver certification (thus the platform entirely), so why is it wrong for Linux and Wayland to do the same?

Microsoft can get away with this precisely because it sells Windows; it lowers the price to vendors (such as Dell) on the condition that they only use certified drivers. It is entirely possible to run Windows with uncertified drivers. In any case, it is my understanding that this is a licensing issue, and Nvidia would support more APIs if it could do so while keeping the driver proprietary.


>Citation needed. If this is their position, why would they offer Linux driver downloads? Not supporting something in a way Linux would prefer is not the same as not supporting something.

It's not so much that they are incompatible; it's that they weren't written for Linux. They use the same driver as on Windows, adapted for Linux. That also allows them to avoid any derivative-work clause that would kick in due to the GPL.


And why would this not be acceptable? Shouldn’t we be happy that Nvidia is adapting this driver?

Not if it goes against Linux standards. That's just accepting the lowest-common-denominator driver. If AMD can do a better job with AMDGPU, so can their much larger competitor.

That's the sort of talk only companies with the market share of Microsoft or the ecosystem of Apple can afford.

I am a Linux desktop user myself but:

a) The AMD drivers were unusable crap for a very long time; things have improved only thanks to the open source driver (that ATI/AMD did not write, they only contributed specs). Basically, 3D and OpenGL on Linux were for years synonymous with "buy Nvidia, it works".

b) I don't understand this "holier than thou" attitude when the Linux desktop isn't even a blip on the radar and contributes pretty much zero to the bottom line of these companies. However, it tends to have the most vocal, noisy and abusive users ...

Instead of being glad that Nvidia has actually supported Linux (and FreeBSD and a few other free OSes) for ~20 years now (before ATI even had a Linux driver at all!), we heap abuse on them because their drivers aren't to our taste? Would it be better if they cut their losses and stopped providing this support? Especially for stuff that has no alternatives, such as CUDA? And no, Intel integrated graphics really isn't an option for high-performance graphics, and ATI's open source drivers are quite a bit slower. Of course, if all you need is an accelerated desktop for nice transparency/compositing and accelerated video playback, then you won't see this.

I don't get where you got the idea of the "lowest common denominator quality driver" from Nvidia. Sure, there are bugs and issues (e.g. with suspend), but in my experience the Nvidia graphics drivers were the least troublesome when it came to application support, OpenGL spec + extension support, and various bugs. And I have been using 3D drivers on Linux essentially from the very beginning, when top-end hardware meant a 3DFX card or, later, a Riva TNT. My diploma thesis ran on a 3DFX back in the day ...

That said, it doesn't mean we shouldn't complain when something doesn't work but let's not forget the size of the market. A dose of reality helps - jumping up and down and yelling really really loud to look important doesn't make one so.


> that ATI/AMD did not write, they only contributed specs

This is a lie.

All of the main contributors for the amdgpu drivers are AMD employees. The files in drivers/gpu/drm/amd/amdgpu all have AMD copyrights in the headers.
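This is easy to sanity-check against a local kernel clone; a rough sketch (the `linux` checkout path is an assumption, and the exact counts vary with the clone's age):

```shell
# Assumes a Linux kernel checkout at ./linux.
cd linux
# Top authors of the amdgpu driver, summarized by commit count and email:
git shortlog -sne --no-merges -- drivers/gpu/drm/amd/amdgpu | head -n 5
# How many commits to that directory come from @amd.com addresses:
git log --format='%ae' -- drivers/gpu/drm/amd/amdgpu | grep -c '@amd.com'
```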


>Would it be better if they cut their losses and stop providing this support? Especially for stuff that doesn't have any alternatives, such as CUDA?

They'd lose more if they did that. Do you think the whole world of HPC/ML/scientific computing will switch to Windows to use Nvidia cards? The answer to that is no. They'll just move over to AMD. ROCm is coming along fairly well. Or something else will pop up. Don't get too attached to blobs.


They could drop graphics support on Linux with no significant losses to their hpc/ml market.

Yes, that is somewhat true, I think. They could just expose an accelerator device without any graphics capabilities. I was replying to the parent's point that CUDA is not a hostage: they lose more if they drop CUDA on Linux.

I'll grant you, 18 years ago, Nvidia's drivers were superior by virtue of existing at all. That's a long time ago. The competition has improved, and the Nvidia experience has steadily gotten worse. I have a desktop machine at work that locks up hard with the proprietary driver, and another one where X randomly crashes if you use nouveau. Any goodwill they had with me for providing proprietary drivers on Linux over a decade ago has long since evaporated. Their hardware may benchmark well, but in my experience it's unstable and I suspect they're papering over a poor design with opaque drivers.

> And why would this not be acceptable?

Quoting the article:

> Actually, Nvidia doesn’t support us. There are three standard APIs which are implemented by all graphics drivers in the Linux kernel: DRM (direct rendering manager), KMS (kernel mode setting), and GBM (generic buffer management). All three are necessary for most Wayland compositors. Only the first two are implemented by the Nvidia proprietary driver. In order to support Nvidia, Wayland compositors need to add code resembling this:

   if (nvidia proprietary driver) {
        /* several thousand lines of code */
   } else {
        /* several thousand lines of code */
   }
And it goes on, at length, describing the problems this causes. It's not like it's a mystery.

Numbers speak better, though. Usage stats show that, with the AMD open driver stack becoming very competitive, Linux users are gradually preferring AMD to Nvidia for gaming GPUs. The only thing holding AMD back now is sketchy availability of high end cards.

And for low end and integrated graphics, using Intel or an AMD APU is an obvious choice, especially in laptops. Optimus is such a horrible mess that no Linux user should come anywhere near it.


The question in my mind is, I suppose, whether Linux desktop users are ever going to be enough of a critical mass for Nvidia to care if they lose a few sales to AMD here and there.

My gut feeling says no.


Linux desktop alone probably won't be enough to force them to open up their drivers or to stop being a thorn in the side of Nouveau development. But further advances from AMD in the datacenter might already push their concerns over the edge.

Intel joining the high end GPU market with open drivers might also help speed things up, including dislodging the CUDA lock-in.


Oh I'm not so sure about Microsoft doing that. Probably everybody has heard about this story now [1], but in case anyone hasn't:

> I first heard about this from one of the developers of the hit game SimCity, who told me that there was a critical bug in his application: it used memory right after freeing it, a major no-no that happened to work OK on DOS but would not work under Windows where memory that is freed is likely to be snatched up by another running application right away. The testers on the Windows team were going through various popular applications, testing them to make sure they worked OK, but SimCity kept crashing. They reported this to the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if it did, ran the memory allocator in a special mode in which you could still use memory after freeing it.

Well, maybe Microsoft of 2019 might behave differently, after spending decades in desktop OS market domination, but what's more relevant is how they got there in the first place.

[1] https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...


I had not heard of it yet, so thanks. And well, makes sense, different priorities ...

https://xkcd.com/619/


NVIDIA GPUs on Linux are used by GPGPU data center customers, rendering data center customers, weird cryptocurrency miners, and then a tiny number of everyone else. The everyone else is predominantly certified RedHat / CentOS or Ubuntu animation and engineering Linux applications users. Then finally people who install Linux on their NVIDIA equipped laptops and desktops.

The last group is vanishingly small. They’re neither a meaningful number of users nor a meaningful number of payers.

Linux should do its best to support word of mouth evangelists with the best user experience possible, because even though free software evangelism is definitely viral, retention favors convenience over all other factors when the price is zero.

But this has always been about ego. Jensen Huang thinks he’s the biggest hot shot in the world, and that culture pervades his company. But Linux is going to be around a lot longer than somebody’s game rendering coprocessors.

I’d appeal to humility, not some meaningless details about APIs. Definitely not a business case.


What about Nvidia-equipped laptops that come with Ubuntu pre-installed?

Their position is a bit more complicated than that. Nvidia provides drivers for Linux at all thanks to Nvidia customers that use Linux, like the VFX industry. Nowadays there is also a lot of deep learning and crypto mining happening on Nvidia GPUs with Linux running on the CPU.

Having a hard stance on Nvidia, which I support, makes no sense if they have no interest in Linux support at all.


>Nvidia's position, literally, is their GPUs are not Linux compatible.

Nvidia's Linux drivers work fine for me, although granted they are hard to install (hard compared to apt install).


At least on Ubuntu if you have 3rd party driver sources enabled it's literally an apt install away. Definitely not as good as being already installed like AMDGPU, but it's been pretty trivial installing NVIDIA drivers for me over the past years.
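For reference, the usual route on recent Ubuntu releases looks roughly like this (a hedged sketch: `ubuntu-drivers` ships in the `ubuntu-drivers-common` package, and the specific driver version shown is illustrative, not a recommendation):

```shell
# List the driver packages Ubuntu recommends for the detected hardware:
ubuntu-drivers devices
# Install the recommended proprietary driver automatically:
sudo ubuntu-drivers autoinstall
# Or pick a specific driver branch by hand (version is illustrative):
sudo apt install nvidia-driver-430
```

This only applies on an actual Ubuntu system with the restricted component enabled.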

Nvidia’s position seems to be that Linux needs to grow up and provide a stable driver API and ABI, and stop claiming anyone is a “bad actor” just because they don’t want to put their drivers under the GPL, make them part of the Linux kernel source tree, and essentially give up control of them.

I’m honestly surprised the major distros haven’t gotten together to do this, in a shared fork if necessary. It’s outright embarrassing that this attitude still infects the Linux project.


This is perhaps the most important feature of Linux, and far from a problem. Nvidia needs to get over itself and put its driver in the upstream tree. AMD did it, and it's been a huge success. Linux's unstable internal API and GPL'd nature have been a huge driving force behind the availability of open source drivers. Hell, many distros with stable driver APIs like *BSD have Linux to thank for many of the drivers they have, because without Linux there never would have been open source drivers or specs in the first place.

None of the distros are going to get behind your fork, because they all agree with Linux on this matter, rightfully so. Nvidia can fuck off with their proprietary crap; Linux has no intention of bending over for that garbage.


And someone called Nvidia not simply handing over their IP to the entire world a “childish tantrum.”

Linux needs to get over itself and recognize that it’s part of a larger world, and that its “my way or the highway” attitude is unnecessarily constraining.

The fact that people are coming to you and saying “I’d like to use my Nvidia card with Linux but Linux isn’t supported” does mean it’s a problem—and the problem is Linux’s, not Nvidia’s.


Sorry, that's bollocks. AMD did not put their driver into the upstream tree! What is in that tree is the open source driver developed by volunteers, based on ATI specs.

ATI/AMD still has their own proprietary, closed blob driver that is not in the tree - and which actually runs quite a bit faster than the open source one.


    $ cd ~/sources/linux/drivers/gpu/
    $ git log . | grep "@amd.com" | wc -l
    42418
    $ cd ~/sources/mesa
    $ git log | grep "@amd.com" | wc -l
    4817
>ATI/AMD still has their own proprietary, closed blob driver that is not in the tree - and which actually runs quite a bit faster than the open source one.

It doesn't run faster; this is nonsense. I've personally met the AMDGPU driver developers, and if you ask them they'll even tell you that basically no one needs AMDGPU-Pro. It's not faster; this is bollocks.


To add to this, AFAIK there are two main reasons why they still have AMDGPU-PRO: it has the more compatible, better-performing OpenCL implementation (which you can run alongside Mesa, actually), and until recently it supported slightly different display features and more compatibility profile functionality. As Mesa becomes more compatible with bizarre certified industrial applications, and the upstream AMDGPU kernel driver enables more exotic display features, I suspect AMD might choose to just maintain Mesa for these GPUs.

AMD are the primary developers of the upstream amdgpu kernel driver. Take a look at those who are making commits:

https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

You are probably confusing the situation with nouveau, which is indeed a community-developed, reverse-engineered kernel driver for Nvidia.


> ATI/AMD still has their own proprietary, closed blob driver that is not in the tree - and which actually runs quite a bit faster than the open source one.

It is generally slower than Mesa + upstream AMDGPU and has been for more than a year.


No? https://github.com/torvalds/linux/blob/master/drivers/gpu/dr... and all the other files in the folder are by AMD.

> Nvidia’s position seems to be that Linux needs to grow up and provide a stable driver API and ABI, and stop claiming anyone is a “bad actor” just because they don’t want to put their drivers under the GPL, make them part of the Linux kernel source tree, and essentially give up control of them. I’m honestly surprised the major distros haven’t gotten together to do this, in a shared fork if necessary. It’s outright embarrassing that this attitude still infects the Linux project.

I don't find it embarrassing that Linux maintainers prefer to leave every line of code in the kernel up for improvement, rather than desperately replicating known misbehaviours until a new five-to-ten year waterfall release comes. AMD's drivers work great, I buy AMD because instead of getting upset that they don't get to call all the shots, they put in the work and make peace with the establishment.

NVIDIA would rather throw a pathetic, childish tantrum and blame everyone else for their arrogant position which leaves their customers begging maintainers to comply and work on their own time to enrich NVIDIA.


How does video playback acceleration work on AMD cards with VLC or mpv?

> How does video playback acceleration work on AMD cards with VLC or mpv?

It works great in mpv; I don't use VLC so I can't comment on that. The major problem right now with video acceleration is that Chromium (or maybe it's my distribution's build of it) seems to have enabled VA-API acceleration, but the Mesa VA-API state tracker (I think; I'd be glad to be corrected on this) is missing the API that would tell Chromium which pixel format to expect the decoder's output in. So Chromium assumes it'll be in a certain format, which it turns out not to be, and this leads to hideous corruption.

I suspect that if and when there is any significant amount of HEVC content on the web, this issue will have been resolved by then. For now there are configuration parameters which can be used to work around the issue by disabling the pixel format Chromium defaults to, and to do so just in Chromium, but a fix (which will be in Mesa 19) is required for that configuration to work with Chromium (due to its unusual process naming).

I have used the MJPEG UVD decoder through ffmpeg to decode my 4K webcam's output and it works a treat.
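For anyone wanting to check VA-API decode on their own AMD setup, a rough recipe (assumes `vainfo` from libva-utils is installed and `video.mkv` is a placeholder for a local file):

```shell
# List the codec profiles and entrypoints the VA-API driver exposes:
vainfo
# Ask mpv to use VA-API decoding; it falls back to software if unsupported:
mpv --hwdec=vaapi video.mkv
```

mpv's terminal output reports whether hardware decoding actually engaged for the file's codec.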


You've encountered another issue like the Nvidia/Wayland one. Google has closed the hardware acceleration issue in Chromium on Linux as "won't fix".

My reading of the situation is: That's because it's not theirs to fix. The API is missing in Mesa, and implementing it there would fix the issue. I don't begrudge Google and the broader Chromium developer community for closing it.

So video acceleration works on Firefox?

It's kinda hard to tell; I'm not sure Firefox supports any decode acceleration on Linux. If it does, it's not obvious when it's active, and Firefox uses quite a bit of CPU while playing video regardless.

What's embarrassing is Nvidia's inability to provide an upstream driver themselves, or to let nouveau, which already exists, use all their hardware features[1]. The Linux kernel has quality rules, and if Nvidia can't follow them, that's their problem. It should not be in the interest of Linux maintainers to accept junk upstream.

1. https://www.phoronix.com/forums/forum/linux-graphics-x-org-d...


And Nvidia has rules around access to its intellectual property. If Linux can’t follow them, Linux doesn’t get Nvidia’s code.

Nvidia can sit on their property all they want, and lose market share as a result, while other GPU makers have no problem making open upstream drivers. That actually benefits everyone but Nvidia ;)

In practice, though, that's likely the least of their reasons not to open their driver. More probably, they just don't want to expose some embarrassing code mess or non-compliant cheating in their graphics stack, and they don't want to let anyone else decide how their GPUs get used without Nvidia dictating terms. If they control the accelerated driver, they can dictate and demand more money in some use cases from those who want a more permissive driver. All of that has very little to do with intellectual property and much more to do with market manipulation.


What a refreshing difference from other Linux ecosystem changes! Errors like enforcing client-side decorations get corrected, there seems to be activity to add the features missing compared to X, and most importantly: instead of replacing the existing working solution with broken crap because it's modern, Wayland slowly integrates itself into the ecosystem, and it has Xwayland, which should minimize breakage when moving over in a few years. And if I look at the stutter X produces with my window manager every time the compositor is off, and given that said compositor, compton, is abandoned with no better alternative in sight, I understand that there might be a need for a different solution.

I got hit by strange Wayland bugs when distros like Ubuntu and Fedora activated it too early and I'm not looking to make the switch anytime soon, but I wish the project the best and am happy to see it seems to be on a good way.


> Wayland doesn’t support Nvidia!

There is a very high likelihood that when a user complains about Wayland support for Nvidia they mean this:

* user already assumes Nvidia is a bad Linux actor

* user knows Nvidia only supports buggy blobs

* user may even know Nvidia is bad to people trying to unblob the blobs

* user already went through the trouble of doing what it takes to get the bad Linux Nvidia blobs to work to play games on their machines

* user notices that the bad buggy Linux Nvidia blobs fail to work at all with Wayland

* user is understandably even more frustrated than they were before. So they ask you-- a Wayland person-- how can it be even worse for them now when they've already been generating much more complex and power-hungry graphics with their buggy blob than they'd ever do with Wayland.

If you're representing Wayland, this problem is your problem whether you want it or not. Perhaps the answer is to make the protocol worse so that the Nvidia user doesn't remain eternally, maximally frustrated. Perhaps the answer is that that isn't possible and you're stuck in the same boat as the user. Perhaps it means you start a hunger strike or gather thousands of devs to protest outside Nvidia's headquarters until they improve. I have no idea.

But please don't do that FLOSS thing where all the devs on a project teach each other to sync on, "Sorry Mario, but your princess is in another castle." That pattern becomes its own problem and can quickly end up discouraging development, even from potential devs who don't care at all about Nvidia support.

> Wayland doesn’t support screenshots/capture!

What is the current UX for screenshotting, screencapture, and screencapture with audio for the most featureful DE that uses Wayland atm? Is there a program out there that offers push-button access for these three features (choosing the most sensible default format in each case)?


>There is a very high likelihood that when a user complains about Wayland support for Nvidia they mean this

As the actual person who receives these complaints... this isn't true. Once it's explained, though, the users still stick around and get angry, because they made dumb, uninformed choices as consumers and think it's our fault.

>If you're representing Wayland, this problem is your problem whether you want it or not.

It's not. We can just choose not to solve it. Use X and buy smarter when the next hardware upgrade comes around, or wait until your hardware is supported by nouveau if you don't want to upgrade any time soon.

>What is the current UX for screenshotting, screencapture, and screencapture with audio for the most featureful DE that uses Wayland atm? Is there a program out there that offers push-button access for these three features (choosing the most sensible default format in each case)?

I don't really know what the situation is for more noob-friendly DEs like GNOME, but on sway this tool is the i3-equivalent of push-button simplicity:

https://wayland.emersion.fr/grim/
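For the curious, typical grim invocations look like this (a sketch; `slurp` and `wl-copy` are separate tools assumed to be installed, and all three require a running wlroots-based compositor):

```shell
# Screenshot all outputs to a file:
grim shot.png
# Screenshot an interactively selected region (slurp prints the geometry):
grim -g "$(slurp)" region.png
# Pipe a screenshot straight to the clipboard via wl-copy:
grim - | wl-copy
```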

Push-button screen capture isn't there yet.


"the users still stick around and get angry because they made dumb, uninformed choices as a consumer and think it's our fault."

If this is the attitude of Wayland developers, it goes a long way towards explaining Wayland's ten year road of non-adoption.


You phrase that as though there is a real choice here. There is not. The open source community has had experience with how to interact with closed source drivers since Linux 0.01 in 1991, and that experience screams "don't do it". Xorg has been a technical disaster for more than a decade; I have sympathy for the people who want to use Nvidia cards, but they are going to have to use X.org and like it anyway, because that is all Nvidia supports.

Attempting to run a project based on the opinions of a group of Nvidia devs who (1) don't care about your project and (2) don't care about your goals is technical madness. It took the kernel more than a decade of stubbornness before all the other device driver writers caved and supported decent design. As far as I recall Nvidia is literally the last major device driver who refuses to play ball with open source. If Nvidia users can't use Wayland for driver reasons that is because of Nvidia's choices and there isn't anything the Wayland devs can do without repeating all the mistakes of X.org.

If it takes another decade before Nvidia caves and behaves like a good corporate citizen then that is a decade well spent by the Wayland devs. Long term maintainability will eventually trump short term user issues; just like it did for wifi et al. It is unfortunate Nvidia is making their user's lives hard, but the writing has been on the wall since AMD started open sourcing their driver stack in the 2008 era.


The problem here isn't the interaction with the Nvidia devs. Like you said, flatly refusing to work with their binary drivers is working as intended. The problem here is the interaction with Wayland's users and potential users. It is perfectly possible to post exactly what you said here, along with other useful information about why and how people should expect their Nvidia cards to fail to behave, instead of saying things like "users are idiots and will blame us no matter what we do". Communicating with users better will not only reduce anger at Wayland, accelerate its adoption, and improve its quality; it'll direct at least some of that anger at Nvidia, thereby directly enhancing the no-cooperation strategy you describe here.

Android and ChromeOS show it is possible to have it the other way when FOSS religion doesn't stand in the way.

Android shows exactly what that would look like: you get a blob-like kernel/stdlib/driver package from your vendor, then build your operating system around that and never speak about it again, because it's a brittle mess of "good enough".

The only reason Google and a few big players can ever ship updates is by playing hard with those vendors, and negotiating upgrade paths in advance. Nobody is interested in doing that for the desktop.


Exactly the example of the FOSS religion I was talking about.

Game devs just want to put shiny pixels on the screen; no one cares if it is a blob or not.

AMD is open source and there is hardly any advantage: X routinely crashes, there's no support for the GPUs that were dropped from the rebooted driver, and you're stuck with fewer capabilities than fglrx.

Hollywood studios are quite happy with NVidia drivers.


Android and ChromeOS show how any device support is abandoned as soon as it ships.

My phone is running kernel 3.10 (the same as rhel7, except without backported fixes), and it isn't going to get anything newer.


AMD also decided to abandon my Radeon, dropping DX11 class features and video hardware decoding on their rebooted open source driver, so where is the difference?

Yes. Even if the analysis is exactly correct, this is not an acceptable way to think about or communicate the results. It leads thought in anti-user directions and shuts down communication.

Compare to something like this:

"We need to provide potential users of Wayland with better information, both to better calibrate their expectations and ensure that anger is properly directed and to help them make more informed buying decisions in the future."

Of fucking course you have a communication problem when your relationship with your userbase is founded on logic like "doing anything other than calling them idiots and ignoring them is useless and we shouldn't surrender to idiocy".


In general, users of $foo have no problem researching their hardware purchases to make sure that they are compatible with $foo. ...unless foo=FOSS. When buying a printer, many users have no problem researching the printer to make sure it works with their specific model of iPad, then turn around and say that it's their GNU/Linux distro's fault that it doesn't work with that printer.

If, as a consumer of software and hardware, you presume that it's incumbent upon people expending their own energy for free to cater to your personal situation and/or purchasing habits, you'll keep taking whatever you're given and I don't feel bad when creators ignore you. "Market adoption" is not a driving force here.

By the same token, if, as a developer of software, you presume that it’s incumbent upon people spending their own money to buy hardware that caters to your software design, one should not feel bad if users ignore your project and use alternatives that are compatible instead – namely X.

If Wayland developers truly do not care about adoption of their software, then that’s fine. But it means that UI frameworks which do care about adoption will have to continue to support both X and Wayland as backends indefinitely, increasing complexity. That’s okay for now, since removing X support would be a long way off in any case. But if enough years go by without at least a plausible route towards being able to remove it in the future, they may eventually decide to address the complexity issue by dropping Wayland instead.


Yep, that's the trade-off. Volunteer projects just aren't going to do whatever it takes, because market share doesn't override all other considerations.

But maybe that's okay. With a viable alternative, we can be patient.


Here's to another 10 years of non adoption, if wayland means "we don't care about popular (even dominant) use cases, now piss off".

The alternative is that the volunteers cave in to strangers who bought anti-FOSS hardware by making the volunteers' software much crummier over a long period of agonizing work. I think volunteers wanting clean FOSS being expected to do that shows a negative, selfish attitude plus lack of cooperation on the NVIDIA consumers making those demands, not the developers.

To illustrate with my own example, I bought a Linux-compatible laptop when I wanted to run Linux on a desktop. That rewarded the seller for the effort they put in, reused the FOSS work already performed, and required no demands of volunteers. These things are a two-way street, so I met them in the middle. Did it again with a Thinkpad because I want to try the BSDs later this year. Supported all of them.

If you're trying to maximize adoption, you have to tolerate people's indifference and selfishness. The devs don't want to maximize sales for an anti-FOSS company. So they're telling that company's customers what the issue is, suggesting a switch, and having something waiting for AMD buyers.


You can't complain about adoption while taking a holier-than-thou stance towards users.

Since when has FOSS meant that I _can't_ work with proprietary bits? As a user, this is my choice. If software is going to dictate where I spend my money, then I'm less inclined to adopt it. That's not freedom.


I have distinct memories of people buying PC hardware specifically because it came with a "compatible with Windows Vista" sticker, and they were thinking of switching.

I didn't read any "holier-than-thou" sentiment in the parent comment. It's not unreasonable to expect users to think about the things they want to be compatible with when buying hardware.

"I want to be able to use the next version of Windows"... buy hardware that's known to be compatible with the next version of Windows. "I want to use this printer from my iPad"... buy a printer known to be compatible with that model of iPad. "I want to use this printer from GNU/Linux"... buy a printer known to be compatible with GNU/Linux. "I want to use OpenBSD"... buy hardware that's known to be compatible with OpenBSD. "I want to use Ubuntu 17.10"... buy hardware that's known to be compatible with Wayland.


The “holier-than-thou” part of the parent comment was calling Nvidia “anti-FOSS.”

Nvidia is going to do what Nvidia is going to do, for their own reasons. Unless you have some hidden inside information you shouldn’t assume they’re doing it in order to hurt Linux or Open Source.

They’ve probably done some evaluation of what they would get out of putting their drivers into the kernel tree versus keeping them to themselves, and decided that it wasn’t worth the work, expense, risk to their IP, etc.


Your memory involves a sticker on the box with the information you needed. You could draw an analogy if there were a "compatible with linux" sticker on some hardware.

People generally don't understand what linux is, or what a distribution is, or what versions of any of the above are new or upcoming. None of these problems apply to Microsoft, since there are many fewer versions, less fragmentation, and a marketing budget.


That's an unfair comment on 2 counts.

Firstly, it's refuting a different point than the one I was making. If the problem were "it's too difficult to determine if hardware is compatible with the distro I want to use" (which is a real problem), then the comment would at least be relevant. But the comment I was replying to said "If software is going to dictate where I spend my money [what hardware I buy], ..."; they were rejecting the validity of the claim that they should consider which software they use when making a hardware purchase, a claim that holds for no OS and that no reasonable user expects to hold.

Secondly, you are demanding an impossible task. Because GNU/Linux distros have less marketing budget, and not enough dominance, and can't convince hardware vendors to put a sticker on the box... they need to have better hardware compatibility than Microsoft, and just be compatible with all the hardware? Microsoft has vastly more resources to dedicate to hardware compatibility than just about any other organization, and hardware vendors themselves test their hardware with Windows. Expecting a less-popular desktop operating system to work with more arbitrary hardware than Windows does is unreasonable.

I get that "being able to try it out with the hardware I already own" is a hugely powerful thing. But most users who want to make any other switch accept that they might need to buy some new hardware when doing it. A user switching from a Windows laptop to an iPad as their daily-driver accepts that they may need to get a new printer that's compatible. They may even realize that they have an older iPad, and research the printer they want to make sure it's compatible with their model, and not blindly trust the AirPrint badge on the box. Few users would think that level of due-diligence is unreasonable. Saying "that research is difficult for GNU/Linux distros" is very different than saying "expecting any level of research at all is unreasonable".


"You can't complain about adoption while taking a holier-than-thou stance towards users. "

He was saying they're getting plenty of adoption. They're just not supporting users on specific hardware whose vendor is trying to block such support. It still is freedom; it just doesn't cover that specific hardware.

"Since when has FOSS meant that I _can't_ work with proprietary bits? As a user, this is my choice. "

It really isn't if you're just a user. It's the developers' choices that dictate what software you can run on which hardware. Once their choices are made, you choose between what each offers. In this case...

"If software is going to dictate where I spend my money, then I'm less inclined to adopt it."

Nvidia is spending money on software that tries to block you from easily using free software with it. The volunteers developing one of those free packages refuse to build support for an anti-FOSS company that puts up obstacles. Instead, they're directing their work toward companies that help them a bit, or that at least don't put up obstacles to their work.

As a user, it would be weird for you to claim to want free software while buying a piece of hardware whose developers are working against that goal. They'll also use your dollars from that purchase to do more activity that reduces your freedom as a user. You're free to choose to buy that hardware, but there's no reason for volunteers, much less those maximizing free software, to be forced to do painful work to support your choice. It's reasonable to say you're on your own if your choices create unnecessary obstacles.


> As the actual person who receives these complaints... this isn't true. Once it's explained, though, the users still stick around and get angry because they made dumb, uninformed choices as a consumer and think it's our fault.

In terms of sheer bang for your buck, Nvidia's cards have been on top for a while now (there's a good reason cryptocurrency miners have been using them), on Linux their proprietary drivers are the most stable in games - unless your only goal is to run Wayland, it's not exactly a "dumb, uninformed choice".

This seems like a clear problem with the Wayland developers' mentality if they expect consumers to make purchasing decisions based on their platform alone.


>In terms of sheer bang for your buck, Nvidia's cards have been on top for a while now

But they have not been on top recently; their time on top was long enough ago that cards bought then are generally supported by nouveau, and cards bought since should be AMD. On top of this, the reality is that most people don't need top-of-the-line GPUs. Last year's GPUs will run next year's games at max specs.

>Linux their proprietary drivers are the most stable in games

were


> Last year's GPUs will run next year's games at max specs.

If you want to run at 1080p@60Hz, perhaps, but many people are now moving to 1440p, 2160p, 144Hz, or some combination thereof, which can easily strain even the latest GPUs.


You can get high-end AMD cards for these, so it's not an issue if you want to use open drivers. Last year was a bit tight due to the cryptocurrency mining rush (AMD GPUs were hard to buy, prices were crazily inflated, and so on). This year should be better for gaming with open drivers, especially after Navi comes out.

Last I heard, AMD drivers, both proprietary and open, had severe stability issues on Linux. Is that no longer the case?

Not anymore for the open drivers. Mesa developers have pushed hard to fix gaming-related bugs. If you know of some that are still broken, please open a bug on the Mesa bug tracker and also list it here:

https://www.gamingonlinux.com/wiki/Games_broken_on_Mesa

This list is monitored by Mesa developers.

These days AMD explicitly recommend using open drivers for gaming.


I bought an AMD Radeon RX 580 for my desktop for gaming, and it's been working great so far with the default open-source drivers that come with Ubuntu (18.04 onwards). I didn't have to do anything.

[flagged]


Nice argument

> on Linux their proprietary drivers are the most stable in games

Not anymore, as long as you are using latest Mesa. Also, Nvidia usage on Linux is gradually dropping. See: https://www.gamingonlinux.com/index.php?module=statistics&vi...

So I agree with the position that compositor developers should not waste their time on supporting the blob. If Nvidia are so eager, let them contribute the support themselves, like they proposed for KWin.


From the site:

> This is currently in BETA. There are 8338 registered users on GamingOnLinux. These statistics are gathered using their manually entered data on their profiles. Please update your profile here to make this as accurate as possible.

> As with any survey, this won't be 100% accurate and should be taken with a pinch of salt.

8K users on a site about gaming. Doesn’t include RHEL/CentOS in its distribution chart. Way too myopic of a sample to even declare NVIDIA usage on Linux is dropping. For all we know it could be that new users are signing up that don’t have NVIDIA hardware, it doesn’t show a shift away from NVIDIA.

I know that your comment is about games, but over in the professional CG/VFX and AI/ML spaces, AMD is rarely touched. For both workstations and server racks.


I was talking about gaming. RHEL/CentOS are server distros, not gaming oriented by any means. Fedora is more like it in such cases, if you prefer that family of distros.

> it doesn’t show a shift away from NVIDIA

It does, as confirmed by many users who choose AMD for their next upgrades. Their choice of Nvidia was due to better-performing drivers in the past, a difference which is non-existent these days. So they drop Nvidia and get AMD because of much better Linux integration (from the kernel to the whole graphics stack and DEs), which is a consequence of AMD's drivers being open.

> but over in the professional CG/VFX and AI/ML spaces, AMD is rarely touched

Not from what I've heard. AMD is better for GPGPU due to real asynchronous compute which Nvidia lacks. If anything, AMD hardware is more compute oriented, and Nvidia one is more gaming oriented.

AMD plan to address it with their next architecture (so called "super-SIMD"). But compute hardware support is already good there.

You might refer to CUDA lock-in, and some higher end libraries for AI being CUDA only. That's surely a problem, but not with hardware itself. AMD are working on addressing that as well (ROCm project).


> RHEL/CentOS are server distros, not gaming oriented by any means. Fedora is more like it in such cases.

That's why I ended by saying I know that parent comment and site was about gaming, but I wanted to add a bit of extra context. And these 'server' distros are the primary workstation hosts, both bare bones and PCoIP for pretty much the entire CG industry.

> It does, as confirmed by many users who choose AMD for their next upgrades.

It doesn't at all. This graph only shows a very generic view at best. If you wanted to see a shift, you would need a graph of users changing their profiles from NVIDIA hardware to AMD hardware, and a separate graph of new users coming in with AMD gear. This is just a conglomerate of information in a single graph; there are way too many inferences that can be made from it that can't be proven without more specific data.

> Not from what I've heard. AMD is better for GPGPU due to real asynchronous compute which Nvidia lacks. If anything, AMD hardware is more compute oriented, and Nvidia one is more gaming oriented.

Source? Working over in CG/VFX myself, NVIDIA is pretty much the only hardware to choose. For rendering, everything is CUDA based. Application support for AMD, not stellar. NVIDIA offers the most stable platform for our use cases so far. Release notes of software literally point out that AMD support is not as extensive as NVIDIA and subject to being unstable.

> You might refer to CUDA lock-in, and some higher end libraries for AI being CUDA only. That's surely a problem, but not with hardware itself.

There doesn't have to be a problem with the hardware; adoption will be zero as long as alternative OpenCL/Vulkan compute based libraries don't exist. I know there's been a long-term project to get OpenCL working as an engine for TensorFlow; it's been a bit since I checked its progress. The OpenCL stack for AMD also doesn't have a stellar reputation for producing consistently competitive results. Phoronix released a strict OpenCL comparison adding the Radeon VII to the mix with ROCm 2.0, and as a developer it wouldn't give me confidence that, compared to CUDA, I'm going to get the results I'm looking for. NVIDIA has a very mature stack and support with CUDA and its associated libraries. AMD is going to need to supply that for people to wean themselves off of the green train.

Granted for more AI based work, one might look towards the Instinct lineup.

https://www.phoronix.com/scan.php?page=article&item=radeon-v...


> It doesn't at all.

If you didn't follow this, it doesn't to you. It does to those who pay attention and constantly see people switching to AMD and commenting about their reasons. And those reasons are quite obvious, really. Nvidia did nothing to improve their integration with Linux (the main issue is their unwillingness to upstream their kernel driver), while AMD did a lot to improve their drivers. It's only natural to expect Nvidia's Linux gaming usage to continuously drop as a result.

> Source?

https://www.techpowerup.com/215663/lack-of-async-compute-on-...

AMD hardware is known to be better for GPGPU for a long time already.

> There doesn't have to be a problem with the hardware, adoption will be zero as long as alternative OpenCL/Vulkan compute based libraries don't exist.

If developers are using closed-source libraries, they are at the mercy of vendors who might be in the pocket of Nvidia. And nothing stops open-source libraries from supporting more than CUDA by using Vulkan and OpenCL for compute needs. Given that Vulkan is still rather new, it can take time for libraries to pick it up for compute scenarios, but it will happen either way. Overall, I don't see a good future for CUDA if it remains Nvidia-only. Same as with gaming, open solutions will catch up, and the Nvidia lock-in will crumble.


> If you didn't follow this, it doesn't to you. It does, to those who pay attention and constantly see people switching to AMD and commenting about their reasons.

I'm talking about this purely from a data point of view. I'm referring to the graph, and it has too many holes to be used as a sweeping generalization about the market and Linux users as a whole.

To be clear, I'm not arguing for or against NVIDIA/AMD. I'm just trying to point out the issues with using that graph as definitive evidence of anything.

> Nvidia did nothing to improve their integration with Linux, while AMD did a lot to improve their drivers.

And I applaud AMD for that. They've come a long way. However (and this is personal opinion based on experience), I don't see Wayland as a particularly massive reason. Wayland is what this thread is about, anyway. There are still plenty of people using Xorg who are happy with it and don't run into any issues, myself included. That's not to say I don't support Wayland development, or that I hope NVIDIA never adds support for it; it's just such a 'young' piece of tech that still needs some maturing.

> AMD hardware is known to be better for GPGPU for a long time already.

Raw power is only a piece of the pie. Some people take issue with that statement, but it's reality.

To the point in your link, Pascal fixes the async problem present in Maxwell.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-...

> And no one stops open source libraries from not being CUDA only and from using Vulkan and OpenCL for compute needs.

And yet they primarily are CUDA only. It's a conscious and intentional decision by the developers of those libraries to start with CUDA and stick to it. And the community not adding OpenCL support shows that AMD hardware isn't even in the space for that application.

I think that we could go back and forth with this, providing counter examples for each point the other brings up. I merely wanted to offer up an enterprise oriented view of the picture, when most people are looking at individual usage.


> I'm referring to the graph

The graph makes sense if you analyze the context. It's as I said above: an increasing number of users are switching to AMD. Just check any thread in the Linux gaming forums on the topic of "what should my next GPU be".

> I don't see Wayland as being as a particularly massive reason. <...> don't run into any issues,

It's one of the reasons. There are many issues with Nvidia. Abysmal integration due to the lack of an upstream driver, such as broken vsync, no standard hardware sensors, no PRIME support, and so on and so forth, are all well-known Nvidia problems. To address them, Nvidia should either open up their driver or support Nouveau to begin with. So far they have shown no interest in either.

> It's a conscious and intentional decision by the developers of those libraries to start with CUDA and stick to it.

That's too bad. Avoid libraries which proliferate lock-in. If their developers don't know any better, find those who do and support their efforts instead. It should be in the interest of actual developers who use said libraries, not to be locked into single hardware vendor.


> AMD hardware is known to be better for GPGPU for a long time already.

Maybe, maybe not, but the async compute argument you are making only applies to graphics applications that use compute shaders on maxwell - it doesn't apply to pure gpgpu without graphics.


Cryptocurrency miners preferred AMD cards, actually. And not the latest ones, but Polaris (4x0, 5x0), which provided the best efficiency wrt power, performance and price.

It's gamers, that prefer Nvidia. Nvidia has better optimized driver stack, although they also sometimes play fast and loose.


Originally, yes; in recent years, however, that changed. Look at the cards used for ZCash, Monero, etc.

Miners are increasingly moving to specialized ASICs anyway. Which is good for gamers who needs GPUs :)

And anyone doing deep learning.

It is an uninformed choice if you want to run Wayland. That doesn't mean there aren't other factors at play. For example, back when I was shopping for a laptop two years ago, AMD GPUs weren't even available from most vendors. Even now, Linux vendors such as System76 and Entroware don't ship AMD cards in their laptops. CUDA dominating the deep learning space doesn't help either. That said, my next hardware upgrade will definitely have an open-source-supported GPU, or I'm just not buying.

>It is an uninformed choice if you want to run wayland

Creating a product that only works for people who already a) know about it and b) buy their hardware specifically to support it doesn't seem like a good strategy to achieve widespread adoption, which requires converting people who don't currently use or know about wayland.


Nvidia goes out of their way not to play nice with Linux. It is no wonder that the Linux ecosystem will not accommodate their special requirements.

When you purchase Nvidia, you are giving money to Nvidia. If you have a problem with some software not working with a product you purchased, contact the people you gave money to for support.


> In terms of sheer bang for your buck, Nvidia's cards have been on top for a while now

https://www.phoronix.com/ has a recent benchmark of absolute card performance[1], and sometimes does performance-per-dollar, as here[2] last year with OpenCL (reporting AMD's FLOPS as cheaper, when ROCm satisfices). I've found the site helpful when tracking bleeding-edge Linux support for high-end graphics cards.

[1] https://www.phoronix.com/scan.php?page=article&item=radeon-v... [2] https://www.phoronix.com/scan.php?page=article&item=nvidia-r...


> I don't really know what the situation is for more noob-friendly DEs like GNOME, but on sway this tool is the i3-equivalent of push-button simplicity:

On Gnome you can bind shortcuts to save to a file, or copy to the clipboard, the whole screen, the current window, or a rectangular screen region. In the latter case, you drag a rectangle with the mouse to choose the area you want to screenshot. There is a visual indication of the copied region (it blends a white rectangle over it), and the clipboard works with all applications (both Wayland and XWayland).

For screen recording I've used Peek, which is pretty nice, too.
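For comparison, on sway and other wlroots compositors the equivalent is usually a couple of config bindings around grim and slurp. A minimal sketch (the key choices and output path are my own, assuming grim, slurp, and wl-clipboard are installed):

```
# Hypothetical ~/.config/sway/config bindings
bindsym Print exec grim ~/screenshot.png                 # capture the whole screen to a file
bindsym Shift+Print exec grim -g "$(slurp)" - | wl-copy  # drag a region, copy it to the clipboard
```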


> If you're representing Wayland, this problem is your problem whether you want it or not.

You're absolutely, completely, 100% correct. In this scenario it is the Wayland developer's problem - wanted or otherwise - that the user is incredibly frustrated. Like most frustrated people, the user chooses to blame what seems like a reasonable target who coincidentally isn't them. Such as Wayland!

This anger and frustration is something that Wayland devs unquestionably have to deal with. It's just perhaps possible that the choices made by NVidia driving this completely reasonable frustration might not be something Wayland devs have to own.

Again, you're completely right. Unhappy users are going to blame Wayland devs. There just might be some room for subtlety.


It’s not just choices made by Nvidia, though, it’s equally due to choices made by Linux.

Linux is trying to force its unique “all drivers are open source and in our tree” view of the world on everyone. Linux could stop doing this and start guaranteeing stable interfaces.

If stable interfaces are available and Nvidia doesn’t use them, then it might be reasonable to call them “a bad actor.”


That's disingenuous. Maintaining a stable driver API is far from free, and the project has done just fine without one since its inception. Nvidia is not playing by the rules, and there's an alternative vendor. Why spend extra effort just to help them?

Assuming that a stable kernel API would just make things better ignores the potential downsides. I don't recall the name of the talk, but someone did comparative research into BSD kernel security and found that backwards compatibility was a significant source of bugs and that OpenBSD has benefited greatly from their aggressive removal of such code.

There's no reason to doubt that this holds true for Linux as well.


> What is the current UX for screenshotting, screencapture, and screencapture with audio for the most featureful DE that uses Wayland atm?

The Gnome UX is pretty nice, see my sibling comment.


> So in short, people asking for Nvidia proprietary driver support are asking the wrong people to spend hundreds of hours working for free to write and maintain an implementation for one driver which represents a harmful force on the Linux ecosystem and a headache for developers trying to work with it. With respect, my answer is no.

Interestingly, this is more of an indictment of Wayland than anything in the article that's being "debunked". I don't dispute that it is difficult work, but you'll have to choose between making a viable X replacement, and dying on this particular hill.


Wayland is flourishing - everywhere except among Nvidia proprietary driver users. In reality it's them who are dying on their hill, left behind on crappy old software.

No, in reality desktop Linux, including Wayland, is floundering. This is contributing to that floundering by splitting the effort, creating an environment filled with vitriol on both sides, making it harder for things like games to say they support the set of things called "Linux", and so on.

I'm not here to make Linux succeed. I'm here to make good software. Users who know better will use Linux and users who don't won't. Either way Linux will keep on trucking, as it has for a very long time. It shows no sign of stopping.

I'm using Wayland and love it and what it represents for the future of Linux. Desktop security is more important than supporting NVIDIA's hostile products.

Thanks for your work on wlroots and other software I've enjoyed. I imagine I'll be switching to sway soon as well. There was something preventing me from adopting it yet but I can't remember what it was. I believe it was on the roadmap though.

I'm glad you posted this article because my new year's initiative is to donate to a couple dozen FOSS projects and you definitely deserve some of that.


Given that Wayland still doesn't properly support remote desktop, a feature that has been working continuously in Windows for decades through tons of graphics model changes, I'm not sure I'd play the "users who know better" card.

[flagged]


No, it is not. The article basically just says, "yeah, we don't support remoting, someone's got to make it work". For users, the second part is redundant, because the only thing that matters to them is that remoting doesn't work, and they can't really fix it on their own.

That aside, I have to say that if your responses are representative of the overall attitude towards such questions in the community, I doubt things are going to improve anytime soon.


Thank you.

Linux desktop is thriving in technical terms. I like what Wayland has brought. There is NO split effort. The direction is clear and everyone is on board except nvidia.

Really? Nvidia is the dominant GPU hardware manufacturer, because of CUDA. You cannot do deep learning without CUDA (OpenCL is not nearly as productive for development). Without deep learning, you are out of the game in the most exciting fields of software engineering. So for many software engineering professionals, Nvidia is absolutely the only option, and it is everything else that has to adapt to it.

Thankfully, Nvidia makes Linux drivers, otherwise I would have to (gasp) use Windows as my development machine. Unfortunately, Wayland doesn’t work. I love Wayland and its clean API, but I can’t use anything else except Nvidia, because there is no CUDA.


Deep learning has nothing to do with Wayland and is pretty uninteresting in my opinion. It's a matter of perspective.

And fine: if you have a difference of opinion, just use X.


I do GPGPU work and I require the proprietary nvidia driver, which I use only for CUDA computations. I'm also excited about wayland, would like to start using sway as my daily driver, and have even considered contributing. However, your attitude, and the comments you make about nvidia users, makes me feel completely unwelcome in the sway/wlroots/wayland community (despite using an Intel GPU for graphics!). Am I correct in this perception, and do you believe your attitude positively affects the health of your projects?

Nvidia provides good support for CUDA on Linux, but not good support for graphics on Linux. You recognize this, and use a different brand GPU for graphics. The Nvidia users that Sir_Cmpwn is frustrated with are those that think that the same GPU should be good for both tasks, and that everyone working on graphics do all the extra work to adapt to the whims of Nvidia engineers who don't care about their graphics use-cases.

The comment that Sir_Cmpwn was replying to literally said "Nvidia is absolutely the only option, and it is everything else that has to adapt to it." Now that is the harmful attitude. Using Nvidia for CUDA doesn't mean that you need to use Nvidia for graphics. If you do choose to use Nvidia for graphics (despite a clear compatibility warning from the community), you shouldn't demand that the graphics people bend over backwards and "[have] to adapt to [Nvidia]."

And you, coffeecat, don't seem to demand that.


It's one thing to tell users to get lost, if they are making unreasonable demands (or any demands at all, for that matter). It's another thing to make vitriolic, divisive comments about all (not some) nvidia users, or to ban all users of proprietary drivers from an issue tracker.

I am not demanding anything, I just think that this attitude is unproductive and harms adoption and never wins. This is why Linux, which okays using firmware blobs and adapts to hardware, is the OSS champion, and Hurd is vaporware.

Wayland isn't refusing to support Nvidia for ideological reasons, they're refusing to support Nvidia for pragmatic reasons. It's not "ideologically, we'd like to be able to modify the Nvidia blob", it's "pragmatically, we'd need to make this specific change to the Nvidia blob to make it work".

You're correct that Nvidia proprietary driver users are unwelcome. We lay this out explicitly in our GitHub issue template, in fact. If you use that Intel GPU instead we'll get along swimmingly.

I do use the Intel GPU for all things graphical. But as I stated, I require the proprietary nvidia driver as well. Your issue policy is "If the nvidia module is loaded on your system, you are not permitted to file bugs of any sort" (https://github.com/swaywm/sway/issues/3039). From this, I can only infer that your attitude is to childishly exclude a large segment of users (owners of optimus laptops) on whose hardware free drivers are loaded, but who also have some use (which is none of sway/wlroot's concern) for their second nvidia GPU.

You should actually read the comment that was linked in that issue as well:

> [...] if using hybrid graphics where the primary is Intel and the secondary is nvidia, sway would continue to work (just without nvidia monitors) and the user would never see the log message. Then they're likely to create issues related to outputs not working, causing us to invest time in troubleshooting it, only to find that they are using nvidia and never saw the warning."

So yes, you can run sway with the nvidia module loaded, just please unload it and test again before you report bugs.


  > So yes, you can run sway with the nvidia module loaded, just please unload it and test again before you report bugs.
If that were the language used in the issue tracker, then fine. It's sensible to request that users reproduce the bug with the proprietary driver blacklisted before reporting. But the attitude they're projecting is hostile and unconstructive. The bug tracker says:

  > If you are using the nvidia proprietary driver for any reason, you have two choices:
  >
  > 1. Uninstall it and use nouveau instead
  > 2. Use X11+i3 and close your browser tab
  >
  > If `lsmod | grep nvidia | wc -l` shows anything other than zero, your bug report is not welcome here.
Furthermore, sway's behavior is to check whether the proprietary driver is loaded and, if so, exit unless sway is started with the flag --my-next-gpu-wont-be-nvidia. It's hostile, childish, and very off-putting to a potential user and contributor.

Technically, I think they can do that. That's what the fork button is for. However, if they are willing to alienate graphics enthusiasts, they won't be getting any respect.

I don’t want to settle for Intel GPU, because I have a clearly superior GPU available (Nvidia). I don’t want to settle for X, because I have a clearly superior API available (Wayland). I have a 4k OLED display, because if it is there, why settle for something worse? I realize that enthusiasts are rare, but it’s us who help iron out the bugs for the newest hardware, and it is sad to see such an alienating attitude from fellow cutting-edge developers.


So you are feeling self-righteous and looking down on people who have the wrong hardware?

Hardware that is wrong because the (market-dominant) company behind it does not make an open-source driver?

I mean, you do realize, that Open-Source is sadly not the norm and rather something very odd to traditional industries?

So to people who do not worship the GPL (or have even heard of it, and who just want their software to work), you might be a bit off-putting. Which might not be the way to increase the significance of Open Source. Which is a bit sad, because I love OSS, and yes, I know about the GPL etc.; even I am put off by that arrogance and ignorance. I am not saying you need to support X or whatever, but maybe do not direct your hatred of Nvidia at the users of Nvidia. Maybe they are even new to this. Those users would just shake their head and go away (back to Windows). Hurray.


There is a difference between not supporting something and actively trying to sabotage it at every turn.

Nvidia doesn’t try to sabotage OS. They just don’t want to support existing APIs that the rest of the OS community uses, because they already have a similar API, readily available for everyone. The same problem is with Wayland developers — they already have an API that Intel and AMD agreed to use, and they do not want to make a special case for Nvidia. These kind of questions are resolved by asking oneselves who is having greater adoption — Nvidia or Wayland.

Nvidia's primary Linux user base cares about CUDA. As long as those needs are met, I don't see much changing in the short term. Gaming and other use cases are still insignificant in terms of numbers.

Is there any distro using Wayland out of the box, aside from Fedora?

> There are two approaches to this endorsed by different camps in Wayland: these Wayland protocols, and a dbus protocol based on Pipewire.

The Pipewire approach looks best for screenshots and video recording. I hope all Wayland compositors will settle on it.

> Secondary clipboard support <...> wp-primary-selection

I guess the standard is very recent. Last time I tried, KWin didn't support it yet.

> We fought long and hard over this and we now have a protocol for negotiating client- vs server-side decorations, which is now fairly broadly supported, including among some of its opponents.

A major problem with Mutter remains. Mutter developers don't want to depend on GTK for whatever reason, but neither do the developers of SDL, mpv, and other non-GTK applications and toolkits. The result is ugly-looking windows for such cases in Gnome.

> You have to step up, though: no one working on Wayland today seems to care.

It's a problem with Wayland in general. The end user gets the short end of the stick, since with the removal of some functionality that was in X, some use cases are cut off. And there is no single player like with the X server: tons of compositors are developed by different people. So even raising an issue, let alone coming to some consensus, takes a very long time.

I'm not saying Wayland isn't the way to go, just pointing out it's a very hard transition.

> Wayland doesn’t support Nvidia!

How is the progress of using Vulkan for Wayland compositors and avoiding the whole EGLstreams mess?

I see this: https://github.com/KhronosGroup/Vulkan-Docs/issues/294

But it's not clear from it whether now there is a way to avoid using GBM or EGLstreams.


The problem is that it requires dbus, which not everyone is on board with. Our argument is that the Wayland compositor should speak the Wayland protocol. That being said, the software I mentioned can be used to bridge between the two camps.

Following your edit: yes, the new standard protocol for primary selection is very new. But everyone supports the GTK protocol and the transition will not be visible to users.

Dude, so many edits: "But there is no one player like with X server" -> this player is wlroots

Oh my god man, stop editing your comment: "How is the progress of using Vulkan for Wayland compositors and avoiding the whole EGLstreams mess?" -> hard to say now, it's very very early.


Thanks for answering :)

> this player is wlroots

One of the players still, since for example KWin isn't using it. I.e. it unfortunately came too late to unify the situation.


A strong majority of Wayland compositors are using wlroots. Those that don't are in the minority, but they are mostly cooperative with wlroots aside from GNOME and E.

Isn't GNOME the biggest single desktop by market share?

Only because Ubuntu users started implicitly using it. KDE looks more popular among those who are explicitly choosing a DE.

But how do you know if someone explicitly chooses GNOME when it's the default?

Looking at GOL stats for example, Gnome usage grew when Ubuntu switched from Unity to Gnome (on roughly the amount of Ubuntu users). Before that, KDE had higher usage overall. So my conclusion is that Gnome's majority today can be driven by implicit Ubuntu's choices.

It's hard to measure. Probably?

That's good, but even if one isn't cooperative, it complicates things. And Gnome is quite big.

> Mutter developers don't want to depend on GTK for whatever reason

GTK is 100% designed to be a (Wayland/X/Windows/Mac) client. Shoving it into the display server is bound to end in disaster.


> Shoving it into the display server is bound to end in disaster.

KWin has no problem using Qt, so I don't buy the disaster argument in general for such solutions. But maybe GTK is a lot worse.


AFAIK KWin mostly uses QtCore and maybe QtOpenGL and does most of the XCB/Wayland stuff by itself, precisely because they're a X11/Wayland compositor, not client.

But it's enough to be able to draw basic window decorations (i.e. a border with a title bar and control buttons). So why can't Mutter do the same to address the case of SDL-like windows that would prefer server-side decorations?

It absolutely can, Mutter developers just choose not to.

> But it's not clear from it whether now there is a way to avoid using GBM or EGLstreams.

There's no avoiding GBM or EGLstreams if you want a compositor that can get an OpenGL/VA-API/whatever handle (as opposed to a pixmap in system RAM) from the client and compose it on the GPU (as opposed to it being rendered by the client by whatever means it wants, possibly transferred to system RAM, and then copied by the compositor back to the GPU).


Do you mean even a purely Vulkan-based compositor will need GBM/EGLstreams? Or only when it needs to create an OpenGL context (which obviously needs to be supported as well)?

Then I suppose Nvidia should finish their proposed common memory manager, which looks stalled now.


Even a Vulkan compositor wants to compose OpenGL (or other non-Vulkan) clients, so it needs a generic way to work with cross-API handles to buffers allocated in video RAM. That's what GBM/EGLstreams/that stalled Nvidia memory manager solve.

> Things like sending pixel buffers to the compositor are already abstracted on Wayland and a network-backed implementation could be easily made. The problem is that no one seems to really care: all of the people who want network transparency drank the anti-Wayland kool-aid instead of showing up to put the work in. If you want to implement this, though, we’re here and ready to support you! Drop by the wlroots IRC channel and we’re prepared to help you implement this.

Somebody has actually done this!

Erik De Rijcke put together Greenfield [0], which is a Wayland compositor that relays surfaces with H.264 or JPEG over WebRTC. It is exactly as cool as it sounds, and it seems like it actually might work pretty well. It's the sort of thing that could be standardized, with a bit of work.

[0]: https://github.com/udevbe/greenfield


There are a number of counter arguments to the parent article in "Why I'm not going to switch to Wayland yet."[1]

On the subject of leaving the responsibility of taking screenshots to the Wayland compositor:

"for simple things using the compositor's screen shot tool is fine. But what if I don't like the screenshot tool for my compositor of choice? My experience with the GNOME screenshot tool (granted this was pre-wayland) was that it wasn't as good as, say, shutter, which has a lot of options, let's you easily crop and edit the screenshot from inside the screenshot tool etc. And then swaygrab doesn't even (currently) have an option to capture a rectangular region."

There are some other things "Why I'm not going to switch to Wayland yet" mentions which are important to me, like Wayland's lack of color picker tools and xdotool functionality.

The parent article says:

"Wayland doesn't have network transparency! This is actually true! But it's not as bad as it's made out to be. Here's why: X11 forwarding works on Wayland. Wait, what? Yep: all mainstream desktop Wayland compositors have support for Xwayland, which is an implementation of the X11 server which translates X11 to Wayland, for backwards compatibility. X11 forwarding works with it! So if you use X11 forwarding on Xorg today, your workflow will work on Wayland unchanged."

So why wouldn't I just use xorg to begin with?

Overall, Wayland just seems immature to me, and not a viable competitor to Xorg, which has had all of this functionality for decades. As an Xorg user, I really struggle to come up with compelling reasons to switch.

[1] - https://old.reddit.com/r/wayland/comments/85q78y/why_im_not_...


>screenshot quote

I addressed this in my article, please read it. And you can select a rectangular region on sway today, using tools which are portable across many Wayland compositors.
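For what it's worth, the region-selection workflow alluded to above can be sketched like this. grim and slurp are assumed to be installed (they are separate wlroots-ecosystem tools, not part of sway itself), and the guard lets the script degrade gracefully without them:

```shell
# Hedged sketch: region screenshot on wlroots-based compositors.
# slurp lets the user drag-select a region and prints a geometry string
# such as "10,20 300x200"; grim captures that geometry to a PNG.
out="region-$(date +%Y%m%d-%H%M%S).png"   # timestamped output filename
if command -v grim >/dev/null 2>&1 && command -v slurp >/dev/null 2>&1; then
    grim -g "$(slurp)" "$out" && echo "saved $out"
else
    echo "grim/slurp not installed; nothing captured"
fi
```

Because both tools speak standard Wayland protocols rather than sway-specific ones, the same one-liner should work on other wlroots compositors too.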

>So why wouldn't I just use xorg to begin with?

You can, no one's stopping you.


It’s still fuckin sad that we haven’t managed to build something lasting..

Hey Drew (and other contributors), thanks for all your work on this. I switched from i3 to Sway last week and everything has been quite stable. My only gripe has been that screenshotting/screencasting apps can only capture the full screen and can't capture a single window yet. The portal APIs seem to support this, but the wlroots protocols only allow export from composited outputs, not from un-composited wl_surfaces. Hope this gets some love soon.

Sway is a do-ocracy. Get in there and do it! It's not as hard as you think.

I am aware that I may be the one who ends up implementing that :)

How would something like Easystroke [1] need to be implemented for Wayland? Would this have to be inside the compositor as it's like a global hotkey?

[1] https://github.com/thjaeger/easystroke/wiki


Yes, this would have to live in the compositor, or suitable protocol extensions would have to be developed and implemented.

How would you envisage it being implemented in sway, for example? It's quite esoteric to maintain alongside a minimalistic compositor like sway, and moreover it probably requires a GUI for configuration. Would a PR implementing it be accepted?

It's probably just not a feature we'd support at all in sway.

I am really unstatisfied about the status of remote desktop session on GNU/Linux and other unices in general.

RDP on Windows works amazingly, it's really like having your remote machine at your fingertips (as long as network holds).

On GNU/Linux on the other hand, most solutions are hacky or cumbersome or non-standard or a linear combination of the previous adjectives.

I really wish I could have truly good remote desktop support on GNU/Linux.


Have you ever tried X2Go? I've used it a few times and it's been incredible - instant feedback exactly how you describe RDP.

There's also just regular old rdesktop, which seems to work pretty well if you're mostly connecting to Windows servers.


Yes I've tried X2Go and rdesktop with xrdp. Meh.

X2Go was cool and worked well, but support for various distros was a bit unclear (only Ubuntu? I can't remember, that was like four years ago), as was under which conditions you could actually resume your session and how (what if I disconnect unexpectedly? How do I list my current sessions? Is there audio in/out forwarding? Clipboard sharing?). The website homepage failed to address many of these questions, ultimately failing to answer the question "can I actually rely on this thing?".


IMHO hotkeys are something basic. I understand Wayland is being designed with security in mind, but Wayland developers must understand that what they're trying to do is, ultimately, to create an X replacement. And X is, mainly, user-oriented.

You can't have a graphical, user-oriented, server and tell the user to "hey, just use a shell script and a control binary, and then hook a hotkey into your window manager and make it run that script". That's just not how user friendliness works.

Try to find a common ground between Wayland being secure and Wayland being practical.
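To make the criticized workflow concrete, here is roughly what it looks like in a sway-style config. This is a hedged sketch: the bound commands and script paths are made-up examples, not anything sway ships.

```
# ~/.config/sway/config -- every hotkey is hand-wired to an external command
bindsym Print exec grim screenshot.png          # screenshot via an external tool
bindsym $mod+g exec ~/bin/my-gesture-helper.sh  # hypothetical user script
```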


I think the wayland devs aren't focusing on replacing absolutely everybody's X-based workflows, but on providing a reasonably engineered component that integrators can build a desktop environment on top of. In practice, we'll probably get a distro shipping a self-contained DE with a centralised hotkey setup, and I expect that will do the job for the vast majority of users. Maybe someone else is building a phone OS and using wayland for the graphical bits and the hotkey thing isn't even going to be on their radar.

But why does every single DE have to reinvent the wheel with hotkeys? Why can't we have that provided by Wayland (or maybe by a libhotkey, if you wish)? Why must so much effort be wasted across the entire Linux ecosystem replicating the same functionality over and over?

Not every single DE, but the compositor a DE uses must have it implemented.

Though Compositors may share libraries as well, and a unifying library might very well be developed for this.


My current showstopper with Wayland is that no screensharing application works with it: Skype, Meet, TeamViewer. I need them to work with my customers so I'm staying on Xorg until that gets implemented.

Same here. If Skype and Chrome screen sharing worked in Wayland, I would switch right away, as it seems everything else works well.

Personal criticism: it's not FUD if it's correct. A lot of the things the author concedes are actual limitations, even if not inherent ones (did anyone ever claim they were?), and they are reasons I can't use Wayland, or at least would be massively inconvenienced.

I never understood these complaints. The main use cases work, and I have almost never found a Linux user who was unhappy with Wayland. Once in a while a power user, but that one is smart enough to install X if he needs some special functionality.

I never had a problem, and the next time you upgrade, check the hardware compatibility and buy what you need. Till then, just run an X stack.


Wayland doesn’t support full-screen sharing, for example in Google Meet (aka Hangouts). Hopefully this is fixed soon (via dmabuf-export, mentioned in the article) but until then (and for the past couple years) it’s a major hassle for a lot of people who had Wayland forced onto them by their desktop environment. Even if this isn’t Wayland’s fault they would be wise to assist with a fix soon.

Regarding gaming, by the way: XWayland already works quite well for it (which is critical, since Wine for example still requires X, and many older games depend on X as well), but there seems to be some issue with radv at present which requires setting an explicit number of back buffers to avoid frame-rate capping that disregards vsync settings:

https://github.com/doitsujin/dxvk/issues/806

If games use SDL, you can also select Wayland using SDL_VIDEODRIVER=wayland. But there are still some glitches. I tried it with ScummVM, for example, and it produced a double cursor.
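Since SDL just reads an environment variable at startup, backend selection can be sketched as below; printenv stands in for the actual game binary purely so the snippet is runnable as-is:

```shell
# SDL2 picks its video backend from SDL_VIDEODRIVER at startup.
# Replace "printenv SDL_VIDEODRIVER" with a real SDL2 game (e.g. scummvm).
SDL_VIDEODRIVER=wayland printenv SDL_VIDEODRIVER   # prints "wayland": native Wayland backend
SDL_VIDEODRIVER=x11     printenv SDL_VIDEODRIVER   # prints "x11": fall back to XWayland
```

The x11 fallback is handy exactly for glitches like the double-cursor case: you can force a misbehaving game back onto XWayland without touching its config.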



Great write up. That said I found the section about nvidia drivers confusing.

> All three are necessary for most Wayland compositors. Only the first two are implemented by the Nvidia proprietary driver.

Ok. But what about nouveau? Is nvidia supported using the open-source drivers (the only ones I will use) or are these drivers too lacking?

Are the wlroots developers completely disregarding nvidia-support over issues only found with the proprietary drivers, or are there actual unworkable nvidia issues regardless of driver used?


Yes, nouveau is of course supported.

But nouveau has trouble supporting the newest cards currently, since Nvidia seems to try their best to make their lives hell.


The Ubuntu 18.04 Gnome Wayland session works well on nouveau. (I've used one consumer card from 2010 or so and a pro card from 2013 or so. Also tried a newer pro card that nouveau didn't support, regardless of Wayland. In that case, the easiest solution was to remove the card and buy AMD.)

Nouveau does run on a GTX 1080 Ti, but I’ve got a weird setup with 3x4K screens, so the driver needs to increase the GPU clock, and that fails because reclocking is part of the proprietary blob.

So it should work with most setups until you try to reach a certain performance threshold.


What's the status for global push-to-talk for voice chat apps like Discord? Does that also fall under hotkeys/"no one working on Wayland today seems to care"?

> Wayland can be keylogged, assuming the attacker can sneak some evil code into your .bashrc

I’m not sure what scenarios are being envisioned here, but isn’t that absolutely something a nefarious app could do, because the app will be run as you, and you have write access to your .bashrc?

As an example, since it’s fresh in my memory: if the folks who put dodgy cryptocurrency-stealing code into npm packages had instead put this keylogger in there, that would have worked, right? You run npm install, npm runs as you, the package runs as you, the package updates your .bashrc.
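A minimal sketch of that attack surface (bash-specific, and demonstrated against throwaway temp files rather than the real ~/.bashrc): a single appended line is enough to log every command the user types, with no display-server access at all.

```shell
# PROMPT_COMMAND runs before every interactive prompt; "history -a" flushes
# each newly typed command into a file of the attacker's choosing.
rc=$(mktemp) log=$(mktemp)
printf "PROMPT_COMMAND='history -a %s'\n" "$log" > "$rc"
# Simulate a victim's interactive session that sources the malicious rc file:
bash --rcfile "$rc" -i <<'EOF' 2>/dev/null
echo typed-by-victim
EOF
grep typed-by-victim "$log"   # the typed command was captured
```

This only captures shell commands, not raw keystrokes, but it illustrates the point in the article: an attacker who can edit your .bashrc does not need X11's input model to spy on you.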


Yes, but that package could also do all sorts of other things, like encrypting your ~ and demanding ransom. It's not Wayland's problem to solve this - sandboxing is a separate matter. Wayland makes it easier to sandbox graphical applications but it is not a complete sandboxing solution.

Has Wayland been released yet? Does it still freeze the whole Gnome desktop when dragging a Firefox tab from one window to another (asking for a friend)?

Wayland the protocol was released a long time ago. Wayland implementations like GNOME are in varying states of completion. I don't know about GNOME in particular, but my compositor - sway - is shipping 1.0 within the next month or two.

> https://swaywm.org/

Looks good! Do you think fancy animations could be done with tiling (wobbling, fading, and swiping effects when switching from one pane to the other, etc.)?


We're not interested in those for sway, but we work with a project called wayfire via wlroots that you may be interested in:

https://wayfire.org/


I saw wayfire last week and looks cool, a bit of compiz/beryl revival on wayland :)

I think sway is pretty nice as it is, but I would love to see a few things to make it feel "modern":

  - round borders
  - simple drop shadows
  - a way to add small effects when opening/closing/selecting windows. 
I was wondering how difficult it would be to allow running custom shaders on those actions. For example, one could then very easily implement fade in/outs, graying out all windows except the selected one, and other custom effects, just by adding something to the config file like on_selected_glsl: 'path_to_fragment.glsl' (the unixporn community would love it :)

Anyway, I'm pretty happy on the current state, so thanks for the amazing work :)



Wayfire (already mentioned by Drew) indeed has wobbly windows, the desktop cube and even the titular fire :)

https://www.youtube.com/watch?list=PLb7YRKEhWEBUIoT-a29UoJW9...


Cool. I used to love tiling managers (awesome-wm user) and I always thought it was lacking a smoothness factor when transitioning from a pane to another or when rearranging containers. Something that wouldn't make you think you just blinked and the windows were moved, a way to follow where visual objects are going to.

Something like this (notice how the resize is smooth rather than instantaneous, which lets your eyes follow the movement, while the left and right aligns are instantaneous and feel like something hit you out of nowhere, since you can't track the transitions with your eyes): https://streamable.com/wb36p


It’s the default on a number of Linux distros. I’ve never seen it crash when dragging tabs between windows, but it is still less stable than Xorg, in my experience. Full crashes are very rare as of recently, however. I’m impressed with how much it’s improved, but I don’t use it myself, as it isn’t supported on OpenBSD (where Xenocara is the X11 implementation).

Sure:

$ loginctl show-session 3 -p Type (on a Kubuntu 18.04 LTS default install, not tweaky tweaky)

Type=x11

https://bugzilla.redhat.com/show_bug.cgi?id=1399093

To be fair, I remember video playback was waaaaaay smoother with Wayland than X last time I tried. But these kinds of bugs happened to me last year. I don't have time anymore for distribution tweaking: either the current LTS works or I pick something else (that's why I don't mind running X or Wayland these days: whatever works is okay).


Firefox-wayland was released in Fedora maybe a year ago. I can't remember anything related in GNOME.

I remember that one https://bugs.launchpad.net/ubuntu/+source/nautilus/+bug/1704... (does apply to ff drag and drop too if I recall correctly, carried from 17.10 to 18.04) and https://gitlab.gnome.org/GNOME/gtk/issues/895.

I've been using Sway on my laptop recently and it's been a great experience. It looks great (easily customizable) and runs smooth.

Wayland is a great demonstration that not all change and deviation from tradition is met with vitriolic anger from the Linux community (as has been asserted). Wayland meets its fair share of skepticism, but it seems cool heads prevail and the situation stays civil. This gives me confidence in the maturity of the community and the long-term viability of Wayland.

I wonder if some of the ire was bled off by Mir? I mean, when Canonical announced Mir, people mainly responded with anger that they weren’t contributing to Wayland which was a longer-standing attempt to replace X11.

That's an interesting hypothesis, I hadn't considered that.

> So in short, people asking for Nvidia proprietary driver support are asking the wrong people to spend hundreds of hours working for free to write and maintain an implementation for one driver which represents a harmful force on the Linux ecosystem and a headache for developers trying to work with it. With respect, my answer is no.

I don't see how Wayland can succeed without supporting Nvidia. Yes, they'll have to write several thousand more lines of code, but if you're going to make a desktop environment in 2019, you're going to have to put in that effort to support the graphics card that most people have. It would be great if Nvidia had better Linux support! But they don't and washing your desktop platform's hands of the issue just makes the whole project DOA.

Are there any competitors to Wayland that don't take a hard-line stance against supporting the most popular and most powerful graphics cards in the world?


Not even Apple, apparently: https://appleinsider.com/articles/19/01/18/apples-management...

As someone who hasn't bought a GPU in several years, but might be in the market for one soon, I have to wonder: what the heck is going on with Nvidia?

I remember back in the day, on Linux, they were the graphics card to get if you wanted the highest performance, and could get their proprietary drivers working reliably (I never could). But now they seem to be alienating even closed-source partners.

As someone who thinks any GPU made in the past 5 years is more than good enough, avoiding the Nvidia driver situation (on every platform) would be worth the slight performance loss of getting literally any other card.


Nouveau works. Use it instead.

Also: it's possible to have a definition of success which excludes users of the proprietary driver. We don't have the same value system.

>the [...] most powerful graphics cards in the world?

Citation needed? And the highest of the high-end graphics cards are excessive vanity most of the time; you don't need the tip-top to play any modern game at any settings you want.


Your work on sway and wlroots is phenomenal, but I feel like there is quite a gap here in terms of why people are concerned about Nvidia support.

Yes, Nvidia are not great when it comes to OSS, but like it or not, for doing real work in ML, scientific computing, or video and photo editing, Nvidia with CUDA is simply a much better experience and much more broadly supported.

While I don't think it should be the job of wlroots/sway to fix Nvidia's less-than-ideal driver chain, I think it is important to remember that a lot of people get a lot of productive work done with Nvidia cards and would love to shed some of the pain of X while still being able to use their high-end GPUs for both work and graphics.

It seems fairly clear that with macOS focusing less and less on high-end machines, there is a gap for high-end Unix workstations, and hopefully desktop Linux can fill that gap, but right now that likely means Nvidia support... I think the best answer is just to buy AMD and let market forces get Nvidia to change, but that still doesn't mean everyone can go that path right now and get their day job done.


>Yes, Nvidia are not great when it comes to OSS, but like it or not, for doing real work in ML, scientific computing, or video and photo editing, Nvidia with CUDA is simply a much better experience and much more broadly supported.

Fine. If you do these things, stay on X.

>While I don't think it should be the job of wlroots/sway to fix Nvidia's less than ideal driver chain, I think it is important to remember that a lot of people get a lot of productive work done on with Nvidia cards and would love to get rid of some of the pain of X while still being able to use their high end GPUs for both work and graphics.

I honestly just don't care. They should boycott Nvidia and find something else to do with their time if they really want better support from Wayland. I'm not going to bend over for Nvidia. Ever.


Nor do I think you should. My only point is that people's FUD about Wayland often has roots in FOMO about not being able to use the new and shiny for their workflow, with no clear path to when/how that will be fixed, regardless of whose "job" it is to fix the problem.

The people who demand change are wrong too. I am super happy people like you are doing what you do; otherwise this OSS thing really would never work.

But to be clear, while I think your stance is completely reasonable and justified, it is just a hard pill for some people to swallow. That doesn't mean that those who are jerks about it are justified, but it also doesn't mean that the concerns people raise aren't warranted.


Where is the current Nouveau work for supporting these features on Nvidia cards? I feel like that's where the focus should be (where you should contribute time or money, convince your company to sponsor, etc.) to get the features generally available on Linux without the proprietary drivers.


> Nouveau works. Use it instead.

It works beautifully with sway. Less so with KWin for me at least, although apparently NVIDIA are doing some work to fix that.

Reclocking support on recent GPUs is also a problem for anything that wants full performance - they really need to sort that out. I get that this is hard to do.

This whole situation is just really annoying, and it must be immensely frustrating to be on the receiving end of a lot of the complaints, especially when the sway developers' position of not wanting to essentially write everything twice just for NVIDIA support is totally reasonable.


> Nouveau works. Use it instead.

No it doesn't. I have a dual-GPU laptop with an Intel UHD 620 and a GeForce MX 150, and until I blacklist nouveau the laptop is unusable. It either won't boot, or when it does boot it will freeze the system for 10 seconds every 30 seconds.

So use the Intel GPU instead.

Sure, that's what I am doing anyway. Just wanted to point out that nouveau is not really a great option.

Working and actually taking advantage of the performance I paid for are two different things.

This is a very dismissive attitude that embodies the problem with Wayland. Millions of gamers absolutely do need that acceleration they paid for, it's not "vanity".

> We don't have the same value system.

I just want a usable Linux desktop that leverages the hardware in my machine. The "values" here seem to be: we acknowledge the extra work required to support mainstream graphics cards, we just don't want to do it and it's really your fault as a user for buying it/asking us to support it.


No, they don't. I'm a gamer myself (infrequently these days to be fair). You can have excellent performance with maxed out settings on big ol' 4K displays... on last year's GPUs, which work fine with Nouveau.

>it's really your fault as a user for buying it/asking us to support it.

Yes, I'm soooo sorry that I don't want to do free work to help you prop up a company that lives to make me suffer as a developer...

I'm clearly exaggerating here, but I hope the point is made. Like the article says, just keep using X; I don't care.


> No, they don't. I'm a gamer myself (infrequently these days to be fair). You can have excellent performance with maxed out settings on big ol' 4K displays... on last year's GPUs, which work fine with Nouveau.

I don't have a horse in this race because I still do all my gaming on Windows (literally the only reason I still have a Win10 license), but I think this is generally true. Up until last year I was playing current titles on a GTX 660 Ti on high settings (not ultra) and was getting very reasonable performance. I now have a 1070 and a 4K monitor, and it can play all the titles I own on ultra settings at 1440p @ 60 Hz. It can't really push ultra in 4K at that refresh rate, but I think people who need those settings are a pretty small minority at this point. It's important to remember that the market for desktop computer games has taken a back seat to the console market for a long time. Many, if not most, current AAA titles are developed for consoles first, and consoles are all running hardware that is a couple of years old at this point. Anyway, keep up the good work!


Oh, cool, what kind of games are you getting playable performance (60+ fps?) with? I was under the impression that nouveau was basically unsuitable for modern/AAA videogames, maybe I need to reevaluate!

Well, like I said, I don't game much. But recently I've been playing a little bit of DOOM at 4K on Vulkan via Xwayland using AMDGPU and have had no issues. I also had success in playing Final Fantasy X with PCSX-2 recently (at 1080p I think, but I'm pretty sure I had room to spare in terms of performance anyway). I also play osu! fairly often, but that's not exactly graphically intensive.

Again, you are completely ignoring people's use cases with smug, glib replies. I assure you Nouveau absolutely does not work fine with newer cards, and never will, because those cards are locked into a low power state under Nouveau (due to Nvidia firmware fuckery). We are talking about 10% (if that) of the performance level vs. the closed drivers. This absolutely, unquestionably, makes a huge amount of intensive 3D software (not only games) completely unusable.

If you are wondering why people are pissed about Wayland, it's precisely awful attitudes like this that sweep aside their real concerns and use cases with broken ideas that don't work, have never worked, and likely never will work.

Don't expect people to be on board when you so blatantly don't care about their needs.


I don't care about their needs. I've made this clear many, many times. But I do care about the needs of many thousands of users who choose not to reward abusive companies by buying their products.

You can keep using X with Nvidia. It's not my problem.


So perhaps you should complain less when people are hostile about wayland, because this is one of many aspects of how completely inadequate it is as an X11 replacement.

It doesn't make sense to be hostile about Wayland just because you can't use it due to your poor choice of graphics hardware. X still works. Just use that! No one needs to get upset.

Wayland's plan is to supersede and then deprecate X11 (i.e., X11 development stops). Again, you understand this and are just being glib, offering solutions _you know to be broken_ (because getting rid of X11 is the entire goal of Wayland to begin with).

There is a ton of dishonesty underlying your arguments, and you sure as hell don't do any favors for Wayland, or how people perceive it when you use dishonest bs like this.


It is then up to Nvidia to find a way forward, once X11 development stops. After all, they monopolized driver development for their GPUs, so when something breaks, they get to keep all the pieces.

This is the real answer. The Wayland devs are the X11 devs. When the music stops and X11 development ends Nvidia is going to have to make the choice between supporting Wayland or dropping support for graphics on Linux.

Maybe your misconception TFA is debunking is that wayland centers your use case. ¯\_(ツ)_/¯

Nvidia makes many millions each year from CUDA on Linux. If they want to keep biting the hand that feeds them, there are lots of places that would be happy to embrace ROCm once it becomes viable.

Much more likely is that these shops just never install Wayland. I think people are underestimating the switching costs for moving from X to Wayland especially when that move requires buying brand new video cards.

I think the average user is going to "switch" to wayland by being handed a new laptop that happens to run a recent-enough gnome that uses wayland as the default.

> to have to put in that effort to support the graphics card that most people have

FYI: most people have Intel GPUs. More than both AMD and Nvidia combined (somewhere around 70%, with AMD around 13% and Nvidia around 17%). Intel and AMD are supported, ergo for most people it works just fine. It could work with Nvidia too, if Nvidia were cooperative.


> I don't see how Wayland can succeed without supporting Nvidia.

Well, they should get their act together and work with the rest of the community instead. In KDE, people working in KWin said that they will not support NVIDIA for Wayland, unless NVIDIA itself steps in, proposes a change, gets it approved, and maintains it.

That's what they ended up doing so far, although the change hasn't been merged anywhere yet AFAIK.


The GNOME project supports Nvidia in its current beta.

Awesome! Does it use Nouveau or the Nvidia binary drivers? There's a big difference in performance between the two.

It works with Nvidia's driver; recently Nvidia began working on some patches for KDE as well, I believe.

Yep, the aforementioned patch: https://phabricator.kde.org/D18570

I heard that in a world based on the Wayland protocol, applications implement the window decorations.

If so, I'm interested in how we can expect consistency, and, e.g., how we can move the window of an unresponsive application.

This could well be a misconception, which is why now seems a decent time to ask.

These seem to be good qualities of X. Wayland has been a number of years in the making, so I have somewhat lost track of my source of this information.

Edit: Apologies, I did read the fine article. It appears I must have skipped a section.


See discussion about "server-side decorations" in the article.

Yeah, that's mentioned in the linked article as one of the common misconceptions.
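(For the curious: the negotiation the article describes happens through the xdg-decoration protocol extension, where the client and compositor agree on who draws the decorations. Below is a trimmed-down sketch of that protocol's XML, reproduced from memory of the unstable v1 version, so treat the exact names and values as illustrative rather than authoritative.)

```xml
<!-- Simplified sketch of the xdg-decoration extension. The client
     requests a decoration mode; the compositor replies with a
     configure event stating the mode it will actually use. -->
<interface name="zxdg_toplevel_decoration_v1" version="1">
  <enum name="mode">
    <entry name="client_side" value="1"
           summary="the client draws its own window decorations"/>
    <entry name="server_side" value="2"
           summary="the compositor draws the window decorations"/>
  </enum>
  <request name="set_mode">
    <arg name="mode" type="uint" enum="mode"/>
  </request>
  <event name="configure">
    <arg name="mode" type="uint" enum="mode"/>
  </event>
</interface>
```

A compositor that answers every set_mode with server_side gets consistent decorations across all cooperating clients, which also lets it keep drawing the titlebar (and offer close/move) when the application itself hangs.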

Does Wayland support ripping of DRM'ed video streams?

Most compositors don't support DRM (that kind) at all (yet); only recently was a protocol extension published, along with a merge request for Weston and, I guess, something on the driver side?

Most of the answers are either "well that's true, the protocol doesn't allow it, but implementations do it in non-standard ways" or "we are working on it" or "oh this is not a problem with Wayland, see, Wayland is just a protocol". i.e. at least half of the criticism, as of today, is still practically true.

You can't even open application windows on Wayland without a protocol extension.

The core Wayland protocol doesn't do these things by design, but that doesn't mean end-users can't do them on Wayland compositors which support these use cases in the places that they're designed to be supported.
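(To make the "protocol extension" point concrete: ordinary desktop windows come from xdg-shell, an extension that effectively every desktop compositor ships. A rough sketch of the relevant requests, from memory of the protocol, so the exact signatures may differ slightly from the published XML:)

```xml
<!-- Rough sketch of the xdg-shell handshake a client performs to
     turn a bare wl_surface into a normal toplevel window. -->
<interface name="xdg_wm_base" version="1">
  <request name="get_xdg_surface">
    <arg name="id" type="new_id" interface="xdg_surface"/>
    <arg name="surface" type="object" interface="wl_surface"/>
  </request>
</interface>
<interface name="xdg_surface" version="1">
  <request name="get_toplevel">
    <arg name="id" type="new_id" interface="xdg_toplevel"/>
  </request>
</interface>
```

The point is that "it's an extension" doesn't mean "it's optional in practice": xdg-shell is universally available, so clients can rely on it the way X clients rely on window managers following ICCCM/EWMH.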


The end user doesn't care about protocols or compositors. The end user wants a system that can replace Xorg and do the same or even a better job. And, as of today, Wayland fails miserably at that.

>The end user doesn't care about protocols or compositors

Which is why they shouldn't care that the core Wayland protocol doesn't implement it. These use cases WORK!



