
Sway 1.0 is not going to support the Nvidia proprietary driver - abstractbeliefs
https://drewdevault.com/2017/10/26/Fuck-you-nvidia.html
======
shmerl
KWin developers decided not to play this game either and just ignore
EGLStreams[1]. Gnome on the other hand budged and implemented an extra path.

The main point that Martin made back then:

 _> Overall we are not thrilled by the prospect of two competing
implementations. We do hope that at XDC the discussions will have a positive
end and that there will be only one implementation. I don't care which one, I
don't care whether one is better as the other. What I care about is only
requiring one code path, the possibility to test with free drivers (Mesa) and
the support for atomic mode settings. Ideally I would also prefer to not have
to adjust existing code._

KWin, though, has the benefit of supporting X11, which Nvidia blob users can rely
on. Wayland-only compositors are really in a tough spot until this mess is
resolved.

1\. [https://blog.martin-graesslin.com/blog/2016/09/to-eglstream-or-not/](https://blog.martin-graesslin.com/blog/2016/09/to-eglstream-or-not/)

------
taesis
Sway [1] is a tiling Wayland [2] compositor [3], in case anyone was lacking
context.

[1]: [http://swaywm.org/](http://swaywm.org/)

[2]:
[https://en.wikipedia.org/wiki/Wayland_(display_server_protoc...](https://en.wikipedia.org/wiki/Wayland_\(display_server_protocol\))

[3]:
[https://en.wikipedia.org/wiki/Compositing_window_manager](https://en.wikipedia.org/wiki/Compositing_window_manager)

~~~
djur
Why does a window manager developer have to support specific graphics hardware
in Wayland? Isn't abstracting away that kind of detail the first
responsibility of a windowing system and/or framebuffer manager?

~~~
alxlaz
Wayland does not have the "window manager" concept. The compositor (which, for
the lack of a better term, well, composites windows) implements the display
server protocol as well. It creates and hands off surfaces to the clients, and
those clients directly render their contents there. So it's somewhat akin to a
window manager + some display server parts.

(Note: I have written very little Wayland-related code, and quite some time
ago, so do not put too much faith in the explanation below).

Managing device-accessible surfaces is very obviously hardware-specific, so it
was abstracted behind a number of mechanisms. One of these is the GBM API,
which is implemented (classically) by everyone except NVIDIA, and is the API
that sway is going to use.

NVIDIA's proprietary drivers do not support GBM, and NVIDIA is pushing for its
own solution called EGLStream. EGLStream does solve several problems that GBM
has, but it seems to me that what makes it most attractive to NVIDIA is that
it is theirs (it is an open standard but has only one serious commercial
implementation, on a single vendor's devices, so at this point it might as
well be closed). If you are curious, there is a wider discussion on the topic
here: [https://lwn.net/Articles/703749/](https://lwn.net/Articles/703749/) .

~~~
djur
This is very useful context, thanks. I hope things standardize quickly,
because having to worry about whether your terminal emulator is compatible
with your desktop environment, which is compatible with your graphics card and
kernel, seems like it'd be a huge step backward for the Linux desktop. Xorg has
been something I haven't had to even think about for the better part of a decade
now, and that's the way it should be.

------
eartheaterrr
In my early stages of learning to program, I had numerous issues with running
Linux desktop on my Nvidia chipped laptop. I remember the frustration very
clearly - I hardly slept for ~3 days trying to fix my broken Ubuntu
installation and I was unable to continue with my programming exercises. I
couldn't afford a new laptop, much less a MacBook, and Windows didn't support
bash, nor did Windows have the community support that I needed. My feeling of
failure in getting my machine to run was demoralizing, so much so that I
considered quitting my pursuit of an IT career.

As painful as it was, I managed to get my machine working. But it led me to
never trust an Nvidia chip for my Linux desktop ever again. I wish I had
documented the insane finagling I had to perform just to get my machine in a
workable state. Maybe a year later, I encountered another Nvidia issue on my
Ubuntu laptop, and I posted my notes on a blog:
[http://www.lukeswart.net/2014/05/nvidia-optimus-with-bumblebee-on-linux/](http://www.lukeswart.net/2014/05/nvidia-optimus-with-bumblebee-on-linux/)
I wonder how many aspiring computer professionals have quit because of the
barrier in getting a Linux desktop to run smoothly.

I am indebted to the community members who devote their time and expertise
online to help Linux desktop users. Even if they can be a little snarky :-) If
someone is learning to program and only has a hand-me-down laptop with an
Nvidia gpu, then a Linux desktop may be the only reasonable option. I just
hope this can be more accessible. But as the author points out, Nvidia is only
making this problem worse.

~~~
Pokepokalypse
ha.

Had all those same problems with an ATI-chipped laptop.

Your problems were not nvidia-specific.

Also: Windows has supported bash for a very long time - under Cygwin. (Also,
Cygwin used to have pretty decent community support, back in the 1990s.)

I'll say this also: Ubuntu is probably the EASIEST distro to get into a
working state, with either Nvidia or ATI (the proprietary drivers have their
warts - but the worst are in their installers)... as long as you aren't
itching to get special features working, like power management or GPU
switching. Those things DO work, and CAN be made to work, with given chipsets,
and kernels, and driver revs - if you're smart, and lucky. For the rest of us,
there's the open source drivers, which are generally pretty good, and have
gotten MUCH better over the past 10 years.

~~~
saas_co_de
I have been using Ubuntu daily for the past 7 years with very few issues.

If you can buy a laptop that is known to work that is best. If you have
existing hardware then it is sort of a crapshoot.

Desktop is easier. Just stick to Intel (CPU, iGPU, ethernet, and wireless) and
you will have no problems at all; even with some Realtek and Broadcom stuff
thrown in, it usually works.

------
kungfooguru
It is infuriating that anyone finds this inappropriate. Nvidia is the one that
sells products and doesn't care enough about their Linux users to work with
the community. If you want the Nvidia proprietary driver to work with Sway,
complain to Nvidia about how they've handled their driver on Linux for years.

We need more projects to do what Sway and Kwin have done.

------
nhumrich
I don't disagree with this article; however, it's articles with this strong
language and "better than you" attitude that give Linux a bad name.

~~~
opencl
NVIDIA repeats the exact same terrible behavior with Linux stuff over and over
again (as the article put it, "throwing code/blobs over the wall and expecting
everyone to change for them") and IMO they deserve large amounts of angry
public criticism over it until they stop.

~~~
synicalx
I kind of sympathise with NVIDIA in all this; they've got a very tiny and very
vocal market share that behaves very very differently from their main customer
base.

With Windows users, NVIDIA know that they want pretty games and they want them
prettier every year, and they also know that Microsoft is probably going to
work with them to make that happen. They've also got game developers and
hardware vendors lining up to get NVIDIA logos on their products.

Whereas in Linuxland, they've got a kernel, a billion side-projects that may
or may not be widely used and they all want NVIDIA to conform to THEIR way of
doing things, an angry Finnish gentleman who just shouts and swears at things
he doesn't like, and a largely non-gaming and very fragmented user base that
could be using any number of desktop environments/kernel versions/drivers etc.
On top of all that, this menagerie only accounts for about 3.8% of the desktop
market. And they're also kicking and screaming that NVIDIA isn't doing enough
to support the things that they want to do.

Sure NVIDIA could be working faster to better support Linux by throwing more
people and money at it, but they don't have to and until more people use Linux
they don't really have much of an incentive to do so. They'll probably have
better Linux support, eventually, and if that fits in with their business plan
then that's fine. If you're a Linux user and NVIDIA doesn't do what you want
it to do, then that's fine as well - vote with your wallet and buy from
someone else.

~~~
danieldk
_Whereas in Linuxland, they've got a kernel, a billion side-projects that may
or may not be widely used and they all want NVIDIA to conform to THEIR way of
doing things, an angry Finnish gentleman who just shouts and swears at things
he doesn't like, and a largely non-gaming and very fragmented user base that
could be using any number of desktop environments/kernel versions/drivers
etc._

There is one big elephant in the room here that you are missing, and that is
GPU computing. This is probably a big market for them now (Teslas go for a
couple of thousand a pop) and the vast majority of it is on Linux. Their
ecosystem with CUDA and cuDNN is pretty awesome; on the other hand, their
driver has issues.

At any rate, it's a market that they cannot afford to lose. As OpenCL support
for e.g. machine learning libraries such as Tensorflow is progressing, they may
not keep the lock on the market that they have now.

~~~
ktta
>As OpenCL support for e.g. machine learning libraries such as Tensorflow is
progressing

I wonder if the complex OpenCL API throws off many ML developers. Also I think
they tend to focus on CUDA because there is much less chance of it breaking
between devices. OTOH, each vendor has their own .so for OpenCL and they
usually tend to be buggy and stay a point version or two behind.

This is the explanation I tell myself when I wonder why the _hell_ Google
chose CUDA over OpenCL.

~~~
thawkins
Also, on machines with mixed Intel/Nvidia setups, why can't I use the Intel
solely for display and the Nvidia solely for compute? Why does the GPU have to
be tied into the display system? It's a pain in the ass. To use CUDA you almost
must make the GPU work as a display system.

~~~
ktta
You should be able to do that. You can use the Nvidia GPU in a 'headless'
mode.

~~~
tankenmate
The issue is that on many hybrid laptops the Nvidia card is wired to the
display, but the built-in Intel chipset isn't wired to the display, or is only
wired to the HDMI port.

------
bitL
If I want to play games on Linux at the same quality/speed as on Windows, I
have to use Nvidia. If I want to do Deep Learning on Linux, I have to use
Nvidia. If I want to plug in 3x 4K monitors on Linux, I have to use Nvidia. If
you can't come to an agreement with them, what am I supposed to do?

~~~
Sir_Cmpwn
You just play games at marginally inferior quality. Our new work on Sway also
will let you use multiple GPUs without any proprietary interfaces, so you can
do 3x 4K on AMD too. Regarding Deep Learning, that's too bad. If you can
afford a 3x 4K setup though you can probably afford to have a Nvidia GPU for
Deep Learning and an AMD GPU for your desktop.

There are compromises you can make, most of them easy. None of this is my
problem as the maintainer of Sway.

~~~
falcolas
> None of this is my problem as the maintainer of Sway

So, in short, you're firing your Nvidia users? Sometimes it has to be done,
but you'll have to be very clear that you don't want Nvidia users.

Passive-aggressive attacks in an indirectly linked blog post against the
userbase and Nvidia themselves are not sufficient, you will have to just be
explicit about it in your GitHub README.

~~~
Sir_Cmpwn
Well, I'm firing my Nvidia proprietary driver users. Really, I never "hired"
them in the first place. The ones who use nouveau are cool in my book. I don't
feel the need to call this out in the README.

Also, this article contains pretty forwardly aggressive attacks, I think.

~~~
SEJeff
Your users (those with 4 x 30" monitors like me) will be firing you as well.
It isn't my fault that Nvidia makes superior GPUs and writes shit software :)

~~~
Sir_Cmpwn
I use 4 displays, 3x 1080p and 1x 4K. I'm happy to lose users who don't do
their research and misportray the superiority of Nvidia, though.

~~~
SEJeff
I do CUDA work (OpenCL is close, but not nearly as powerful) along with
running big monitors. I've done my research, and there is Nvidia, and then
there are inferior options.

~~~
majewsky
A friend of mine is doing heavy CUDA work at the Helmholtz research center in
Dresden, and at the same time he's desperately waiting for AMD Vega cards to
be in stock somewhere so that he can build a desktop without shitty drivers.

~~~
jhasse
Reference Vega cards are available in stock. Check mindfactory.de for example.

~~~
majewsky
Yeah, but the reference coolers are apparently not very good. Yesterday
evening, my friend was pondering if he could remove the reference cooler and
install an aftermarket CPU cooler.

------
teekert
"Nvidia module taints kernel"

I remember seeing that scroll by on my Gentoo install about 15 years ago. This
strange sentence actually got me reading deeply into the what and how of open
source. I had been hoping for a nice Nvidia open source driver ever since...
until Intel took that need away. I'm not a gamer/deep learner, so I always opt
for Intel integrated graphics nowadays; it saves money, hassle, and power.

I always believed that if AMD/ATI got their S together and made a super nice
open source driver, they'd make a lot of money! Guess I've always
overestimated the Linux users' market share.

~~~
KozmoNau7
The current open source AMD driver is supposedly rather nice. I'm seriously
considering replacing my ancient GTX460 with a shiny new RX560.

------
Jasper_
I'm sure you're aware that they started a new open-source project for buffer
allocation? GBM has some major design flaws which make it not that great for
most drivers, and they have been trying to suggest alternatives.

[https://lists.freedesktop.org/archives/dri-devel/2016-October/120067.html](https://lists.freedesktop.org/archives/dri-devel/2016-October/120067.html)
[https://github.com/cubanismo/allocator](https://github.com/cubanismo/allocator)

~~~
Sir_Cmpwn
Author here. I should probably mention this, I'll add some information. It's
not all roses here, either, for what it's worth.

~~~
Jasper_
> Edit: It’s worth noting that Nvidia is evidently attempting to find a better
> path with this new GitHub project. I hope it works out, but they aren’t
> really cooperating much with anyone to build it - particularly nouveau. It’s
> more throwing code/blobs over the wall and expecting everyone to change for
> them.

This is far from the truth. This is a piece of code that everybody has wanted
for the longest time (surface allocation is far from a solved problem), the
solution was discussed with the nouveau developers at XDC2016 and XDC2017.
People from ARM and Red Hat have both contributed to the project.

[https://www.x.org/wiki/Events/XDC2016/Program/jones_unix_dev...](https://www.x.org/wiki/Events/XDC2016/Program/jones_unix_device_mem_alloc/)
[https://www.x.org/wiki/Events/XDC2017/Program/#james_jones](https://www.x.org/wiki/Events/XDC2017/Program/#james_jones)

~~~
Sir_Cmpwn
Here's a more recent article to get you up to speed:
[https://www.phoronix.com/scan.php?page=news_item&px=Nouveau-XDC2017](https://www.phoronix.com/scan.php?page=news_item&px=Nouveau-XDC2017)

------
yongjik
From TFA:

> When people complain to me about the lack of Nvidia support in Sway, I get
> really pissed off. It is not my fucking problem to support Nvidia, it’s
> Nvidia’s fucking problem to support me. Even Broadcom, fucking Broadcom,
> supports the appropriate kernel APIs. And proprietary driver users have the
> gall to reward Nvidia for their behavior by giving them hundreds of dollars
> for their GPUs, then come to me and ask me to deal with their bullshit for
> free. Well, fuck you, too. Nvidia users are shitty consumers and I don’t
> even want them in my userbase. Choose hardware that supports your software,
> not the other way around.

Now, contrast with [1]:

> I first heard about this from one of the developers of the hit game SimCity,
> who told me that there was a critical bug in his application: it used memory
> right after freeing it, a major no-no that happened to work OK on DOS but
> would not work under Windows where memory that is freed is likely to be
> snatched up by another running application right away. The testers on the
> Windows team were going through various popular applications, testing them
> to make sure they worked OK, but SimCity kept crashing. They reported this
> to the Windows developers, who disassembled SimCity, stepped through it in a
> debugger, found the bug, and added special code that checked if SimCity was
> running, and if it did, _ran the memory allocator in a special mode in which
> you could still use memory after freeing it._

[1] [https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost-the-api-war/](https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost-the-api-war/)

~~~
Sir_Cmpwn
Author here. You know Microsoft engineers were paid 6 figure salaries to work
on that, right? I have a Patreon where I make enough money to recoup _half_ of
the _costs_ of maintaining these projects.

~~~
yongjik
"I'm doing it as volunteer" is not an excuse for being unprofessional.

~~~
StavrosK
I would expect professionalism from someone I'm paying. I don't expect
professionalism from someone who's basically giving me free stuff.

If someone offers to paint your fence for free, and your fence has some really
nasty angles at some point, would you say the volunteer was unprofessional if
they said "yeah I'm not going to paint that part"?

~~~
yongjik
> I don't expect professionalism from someone who's basically giving me free
> stuff.

Ironically, that's exactly what Microsoft has been saying about Linux during
those bad old days. Linux was successful because its developers didn't say "If
you want to talk to a professional, buy something else."

I'm not saying an OSS developer should bend over and take abuse from users.
But I believe there's a line between that and calling users "entitled brats"
just because they use vendor-supplied drivers for their $500 cards.

~~~
kelnos
No, he's calling his users "entitled brats" because they come into his channel
and bitch and whine and demand he support them... for free.

~~~
sgift
After the post and all his comments here I have my doubts that "bitch and
whine" isn't actually "people ask me questions, I realize that they use Nvidia
so their questions are by definition whining and bitching."

~~~
kelnos
You're certainly entitled to your opinion, but speaking as someone who has
actually maintained widely-used open source software: you'd likely be shocked
at the level of entitlement I've encountered among OSS users.

------
jbeard4
I bought a Lenovo Thinkpad W520 back in 2012. This laptop has an NVIDIA
Optimus chipset, and for 4 years, I attempted to get triple-head support
working under Linux (outputting to the laptop screen, and two monitors).
Finally, in 2016, I was able to install the NVIDIA proprietary drivers through
apt on ubuntu 16.04, and everything worked, including triple-head support.

Overall, all the hardware in the laptop is now working perfectly, and has been
extremely stable. It took a long time, but I'm glad it's finally working using
the proprietary drivers. Unfortunately, I haven't gotten this to work yet
using nouveau. I'm currently running Mate and XMonad on X, and I have a
feeling it's going to be a few more years before I will be able to run Wayland
on this machine.

------
axiom92
Obligatory Linus rant on Nvidia:
[https://www.youtube.com/watch?v=IVpOyKCNZYw](https://www.youtube.com/watch?v=IVpOyKCNZYw)

------
woahhvicky
There should probably be a user space layer that abstracts over direct kernel
graphics APIs. There’s no reason GBM vs EGLStreams shouldn’t be an
implementation detail.

Another note: a window manager author shouldn’t have to deal with these sorts
of issues. It puts the “Wayland way” into question. I guess this is what wlc
and wlroots are for, but it seems like those approaches have flaws, given
there was a need for wlroots to even exist in the first place.

~~~
ascent12
GBM already IS a userspace layer that abstracts over driver-specific kernel
interfaces. That's the whole point.

You also seem to have missed the whole point of wayland's design. There is no
separation between "window managers" and "display servers"; there are just
"compositors". So naturally it is something that needs to be dealt with. wlc,
libweston, wlroots (for which I am an author, responsible for the DRM/GBM
code), etc., are designed to make it easier to write compositors and allow for
more code reuse. So yes, someone using one of these libraries wouldn't need to
deal with these details, but we're actually writing such a library.

~~~
badsectoracula
> You also seem to have missed the whole point of wayland's design. There is
> no separation between "window managers" and "display servers"; there are
> just "compositors".

I think the point is that this design is bad.

------
rjbwork
I'm tired of iNVtel. I'm buying AMD gfx and compute next time. I'm not the
power gamer I used to be, and their new stuff is powerful enough, while their
CPUs are getting pretty awesome.

------
finchisko
I have exactly the opposite experience. All my computers used to have ATI
cards. I forget the exact issues I had, but I remember reading all those issue
FAQs basically saying "buy Nvidia" to solve your problem. I was always jealous
of Nvidia owners for their Linux support, to the point that I actually bought
Nvidia. I haven't experienced any problems with Linux since. I wonder when the
situation reversed and ATI became the better choice? And why do I always have
to be on the wrong side when choosing a graphics card?

~~~
jorams
The situation is annoying. Up until recently, when you wanted high performance
drivers for Linux you needed an AMD or NVIDIA card with proprietary drivers.
The NVIDIA driver was a bit annoying in some ways, but those issues could be
worked around. The result was a stable installation with great GPU
performance. Meanwhile, the AMD driver was somewhere between unusable and
terrible. It updated so slowly that you had to stay on ancient versions of X
to even be able to use it. For users of rolling-release distributions, that's
simply unworkable, and so they bought NVIDIA.

There were also open source drivers, which worked quite well, but they weren't
even close in terms of performance. (Intel has always played nicely, and their
official GPU drivers are open source and work well. Their performance falls
into this same category.)

Recently, however, AMD has replaced its proprietary driver with a new, open
source, high performance driver. AMD is now a serious choice. Sadly the tables
have turned, with NVIDIA holding back progress on Wayland by not cooperating
with the community.

~~~
majewsky
Can confirm. If you have a card that supports the amdgpu driver, then the
experience is great. Everything just works, and at competitive speed no less.

------
hornetblack
There's also the new external image/fence and semaphore support in Vulkan,
which looks like it performs the parts done by GBM without referencing GBM
explicitly.

Last I checked it was only in a Mesa dev version, but it would be a standard
that hopefully everyone using Vulkan can support.

~~~
dogma1138
Didn’t Vulkan adopt EGL? EGLStreams is a Khronos open standard.

~~~
hornetblack
Nope. You use Vulkan WSI and Swapchain extensions instead of EGL.
[https://www.khronos.org/registry/vulkan/specs/1.0-wsi_extens...](https://www.khronos.org/registry/vulkan/specs/1.0-wsi_extensions/html/vkspec.html#wsi)

------
Ericson2314
Yeah NVIDIA's market share makes me sad. AMD seems to have become a much
better team player last I checked.

------
moonbug22
Just another reason not to care about the CADT.

------
frik
Thanks to the hype and bubble around ICOs and cryptocurrency mining, all the
AMD graphics cards are way overpriced.

------
jackmott
Can someone ELI5 why one would want sway vs just using i3?

~~~
yellowapple
Sway uses Wayland, which is shiny and new and fast.

i3 uses X11, which is old and crusty and not particularly fast.

~~~
djur
But apparently a big difference between X11 and Wayland is that people
implementing X11 window managers don't have to go out of their way to support
particular graphics hardware. I have never in almost two decades of using X11
on Linux found myself worrying about whether my window manager supported my
graphics card.

~~~
pritambaral
That's because X11 is a standard, running on a stack of other standards (DRM,
KMS, etc.).

Wayland is also a standard, that should be running on a stack of other
standards (DRM, GBM, etc.), but then Nvidia comes along and goes "Fuck GBM.
I'll make my own, incompatible "standard"".

~~~
badsectoracula
No, that is because X servers only provide the mechanism (window system,
graphics, events, input, etc.), not the policy, while providing functionality
for the clients to implement that policy themselves. So a window manager can
be made with, and rely only on, functionality provided by X11.

On Wayland the "window manager" (compositor) has to provide both the mechanism
and the policy.

------
dingo_bat
> Nvidia users are shitty consumers and I don’t even want them in my userbase.

Watch out, we have a wannabe Linus over here!

------
captainmuon
Wow, so much vitriol. I'm glad I _don't_ like tiling window managers [x].
Sway seems well done, so I would be using it and then be annoyed by the
maintainer.

Wayland, systemd, heck, that they removed ifconfig - these are all changes I
understand and welcome from a technical POV, but that lose me as a user. The
pinnacle of desktop linux was around 2009, afterwards usability seemed to
regress. I'd love to use Linux, but currently I feel more comfortable on Mac
or Windows.

Doesn't help that leading projects on the linux desktop often have such
abrasive developers.

\----

[x] I tried them, but I found they are not really for me:

\- I need something that passes the coworker or girlfriend test: Other people
should be able to sit down and use my computer.

\- Keyboard shortcuts don't actually make me faster. I find personally there
is a lot of cognitive load trying to remember the keys to press, even after a
lot of practice. I like to offload this to my eyes and arms, and use a mouse
to e.g. arrange windows. That is the reason I gave up and now use arrow keys
in VIM, and VS Code when possible.

