Chromium blacklists Nouveau (freedesktop.org)
321 points by trulyrandom 71 days ago | 388 comments



For anyone interested, the link really should be the full bug report: https://bugs.chromium.org/p/chromium/issues/detail?id=876523

The current link frames it as though Chrome were being unfair, but reading their reasoning it makes total sense. Key quote from said bug report:

> We thought about blacklisting nouveau driver long time ago, and decided to let more adventurous users to play around. Now that Ubuntu ships with nouveau on default, maybe it's time to blacklist it in Chrome.

> We received quite some bug reports on other rendering issues with Nouveau, and this bug is just one of them. So we will disable all GPU acceleration by default. If someone wants to bypass that, there are two options: 1) install proprietary NVidia drivers (see #24) 2) run Chrome with --ignore-gpu-blacklist, but be aware you are taking a risk here

> Unfortunately we don't have the resources to test every variation of every GPU/driver combination on linux, let alone investigate & fix bugs in drivers. We want a stable & secure browser first, a GPU-accelerated one second, only if possible.

> The default driver on Ubuntu LTS has severe issues, asking non-technical users to update their driver is just not acceptable as a prerequisite to use Chrome.

> If someone is interested in well-scoping the brokenness (version range and/or devices affected), we're happy to take a patch to the blacklist.

So the driver fails in some (common enough) cases. They let it run anyway, but now that it's the default on Ubuntu a lot of non-technical users are or will be affected, so they block it, while still letting you bypass the block if you want. And if newer versions fix it and someone can help them figure out which versions work for what, they're willing to narrow the blocking.


This is actually a reasonable position in my opinion, but there are two problems (neither of them is Google or Chromium's fault):

1) Nouveau is buggy and problematic (related more to performance and a lack of support for many features)

2) The Nvidia drivers are buggy and problematic (related more to setup and compatibility with the rest of the system)

Both of these problems can be traced back to the same root cause, which is simply that Nvidia refuses to do a good job of supporting Linux in any way. They could give Nouveau some funding and information about their hardware, but they don't do that. They could develop proper Linux drivers, but they don't do that either. Every way you look at it Nvidia is at the root of the problem.

The solution on the consumer side is to not buy Nvidia products, and to encourage other people to not buy Nvidia products. The solution on the producer side is to contribute to Nouveau (easier said than done unfortunately, your average web dev can't just jump in and reverse engineer graphics drivers on a weekend).


>which is simply that Nvidia refuses to do a good job of supporting Linux in any way.

I really dislike the way Nvidia makes it difficult for projects like Nouveau to reverse engineer an open driver. However, I don't understand what you mean by poor support for their Linux driver; it's probably the one thing I can't fault them for, as it performs well and is stable.

I'm with you on not buying NVidia products though, that is the best way to send a message.


YMMV I suppose. Some people seem to have good experiences with their driver, but mine have been terrible. I've used Ubuntu as my daily driver for 7 years now and Nvidia's driver has been the #1 source of crashes and configuration problems: half the time I can't even get 3D accel working with it turned on, I've spent hours trying different versions that some forum reports as having fixed some problem but they don't, I can't figure out why things aren't working, and upgrading from 16.04 to 18.04 borked the whole system so I had to switch to Nouveau. The list of Nvidia driver pain goes on forever, and this is across multiple installs of Ubuntu.

All my experience has been on laptops and there are lots of reports of Optimus related problems on Linux so there's that. I gave up on the hope of having Optimus actually work years ago. I would like to just be able to enable the latest version of Nvidia's driver and see my laptop start using the discrete gpu. That would be a big step forward from the past 7 years.

FWIW there's Linus' famous Nvidia F-bomb :) "Nvidia has been the single worst company we ever dealt with" https://www.youtube.com/watch?v=IVpOyKCNZYw


FWIW it's quite possible that's not Linux-specific; a bunch of people have problems with NVIDIA driver crashes on, say, Windows too.

There's a reason Microsoft shimmed the ability for the display driver to crash and restart without taking out the rest of Windows...


You should give the SGFXI[1] script a try. It's a bit of a hassle to use video drivers from outside the apt repos, but that script basically always manages to set up a working driver for me.

[1] https://smxi.org/docs/sgfxi-manual.htm


One other factor: on 18.04, I have found it nearly impossible to use the NVidia driver. After installing, whatever I do, nouveau takes over. This was not the case on previous Ubuntu versions. Even blacklisting nouveau does not solve the problem: with it on modprobe's blacklist and in the kernel arguments, it somehow still manages to start up. No idea how or why. I solved it once before re-installing my operating system, but I haven't figured it out again since. I think I eventually had to actually move/delete the nouveau .ko files, which is a very destructive way to deal with a problem and interferes with apt, and even then they got rebuilt and reinstalled by dkms and took over the display again! I did find a permanent solution, but it escapes me at the moment.
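For what it's worth, the usual recipe is roughly the following (a sketch, assuming standard Ubuntu tooling; the step people most often miss is rebuilding the initramfs so the blacklist applies at early boot):

    # /etc/modprobe.d/blacklist-nouveau.conf
    blacklist nouveau
    options nouveau modeset=0

    # rebuild the initramfs so the blacklist is honored before the display comes up
    sudo update-initramfs -u
    # then reboot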

Anyways, it would be good if NVidia and Ubuntu got together and at least made it easier to make use of the official driver.


> 2) The Nvidia drivers are buggy and problematic (related more to setup and compatibility with the rest of the system)

Aren't scientific compute farms running on the Linux kernel with Nvidia cards? They wouldn't be used if the drivers were that buggy, so I think that claim needs more precision.


Is that science? A bunch of computing units running secret code?!

Reproducibility in computational science is recognized as a huge problem: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3383002/

A lab or institution claiming a publishable result based on discounted hardware from the manufacturer running code that does who knows what for their TensorFlow/whatever calculations is engaged in something other than science. I wouldn't go quite so far as to say that they're just engaged in advertising demonstrations, but it's coming close.


Let's step back from the brink. Plenty of good science is done on proprietary foundations. It would be unrealistic to do otherwise. Reproducibility doesn't necessarily mean every component of the system is completely open and inspectable.


> Reproducibility doesn't necessarily mean every component of the system is completely open and inspectable.

If it's not "open and inspectable" then it cannot be reproduced by another group/researcher. It's just unsupported claims built on a wonky methodology.

I wonder what percentage of "Machine Learning" results are based on such shaky foundations, and when they'll receive the same amount of skepticism and scorn we see heaped (deservedly) on the field of psychology.


> If it's not "open and inspectable" then it cannot be reproduced by another group/researcher.

That doesn't follow. They can (attempt to) reproduce with the same hardware, firmware, and drivers. They can reproduce with similar hardware from another vendor (AMD). If you're using OpenCL, you can reproduce on CPUs, albeit much slower and probably only for partial results. Maybe you want to break out some pen and paper to verify a few calculations of the CPU for even more partial results because you don't trust Intel's firmware blobs, even though you've presumably cross checked against AMD hardware, but you've gotten quite silly at that point.

Maybe some of these changes won't give you bitwise identical results - but then again, neither does most science. You check if the broad results and patterns are statistically similar. Maybe they are, and the results survive a large variety of similar but different test setups. Maybe they don't, and you discover an uncontrolled variable, like the exact chemical composition of the surface of your glass beakers being contaminated by impurities, or your AI being sensitive to the exact floating point rounding behavior of your GPU.

If anything, you want to see if your results survive variations in the uncontrolled variables, to make sure you've properly determined what factors are controlling the results.


What you say is correct, but partial.

If you cannot reproduce then you do not know whether it is due to the equipment/materiel varying or not and you have severely reduced options for investigating that.

The problems with even reproducing and testing the bugs exposed by the specific situation under discussion illustrate the difficulty of working with a non-modifiable, unexaminable platform.

In your example, the researchers would be forbidden by IP laws from examining and testing the surface of the beakers for contaminants and publishing a description which allowed other researchers to make similar modifications.

It's a deeply sub-optimal situation in which to obtain something that starts to approximate reality. Not impossible, just difficult, and likely to lead to shenanigans.


The cards are not the issue. Drivers can be problematic with certain workloads (like rendering UIs) and not others (like loading and running non-graphics programs). It's also a lot easier to debug and deploy a fix to a farm of identical servers with identical drivers than it is to be compatible across a huge breadth of configurations.


>> Unfortunately we don't have the resources to test every variation of every GPU/driver combination on linux

So Google doesn't have the resources to test the default driver on the most widespread linux distribution for one of the two most widespread GPU producers in the market.


No, in fact one of the bug comments refutes this sarcasm:

> The default driver on the distribution we support is broken.

They do test the default driver on the distributions they support. The tests show that the driver is unacceptably unstable. It may work better on other distros/versions, but they don't have time to test those. They're happy for others to help out in testing these unsupported combinations, however:

> Again, if someone wants to spend the time to test thoroughly to narrow down the blacklist, we will accept patches.


Not for a free product, no. It's not our place to decide for Google what they want to do. If we want our open source software to work we should make it work. You can rebuild chromium to change the blacklist, or just use the command line to tell it to ignore the blacklist. It's not like Google doesn't provide options.

And in any case the reason nouveau sucks (and yes, folks, nouveau sucks) isn't Google's fault to begin with.

You seriously want to complain about Google's lack of support for an open source driver that barely exists because of NVIDIA's lack of support?


It's free in the sense that you don't pay for it with money, but considering how much Google pays Apple to make Google the default search engine, I would say the Google browser is a big money maker.


No, Chromium is free in the GNU sense, which is what I meant. You don't whine at open source providers to support features you want, it's bad form. Go fix it yourself, or use hardware that doesn't require this kind of hacky reverse engineering.


It's fairly uncommon for open source software to blacklist components that (attempt to) implement a compatible interface. Usually selecting an appropriate hardware driver is left in the hands of the distribution or administrator.


Yes, and the distributor or administrator is free to rebuild the browser or just set the flag to disable blacklisting. It’s only default behavior that changed.


Chrome has the special sauce — distributors and administrators cannot rebuild it, only Chromium.


Flip the flag then.

It's crazy to suggest they should give web access (not just webgl, but CSS styling) to a buggy GPU driver by default without an opt in by the user.
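For reference, the opt-in is just the launch flag quoted from the bug report; something like (the binary name varies by distro):

    chromium --ignore-gpu-blacklist
    # or for the branded build
    google-chrome --ignore-gpu-blacklist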


Is that different than the branding/trademark issue that Firefox also had for years?

(Not bring cynical, I genuinely don't know.)


Yes, it's different. You can build trademark Firefox so long as you do not modify it at all. You cannot build Chrome from the chromium sources — there are components which are not present in the open source repository. It's not just the trademark, unlike Firefox vs Iceweasel.


Thanks for the explanation!

From reading more about it, it seems these are all benign (in the sense that, e.g., Chromium can't ship H.264 because it requires licensing, but Chrome can).


> You don't whine at open source providers to support features you want, it's bad form. Go fix it yourself

This just about sums up the problem with open source. The response is always "if you don't like it, fix it yourself". Because everybody is a developer with copious free time to learn how to fix the problem in their favorite product. It's not like cloning and building Chromium takes hours. Or like you would lose anything by using Chromium instead of Chrome.


> Because everybody is a developer with copious free time to learn how to fix the problem in their favorite product.

But that's the point. The response always works because it's proportional to the problem. For problems that are legitimately small and you could fix yourself, or pay someone to do it for an amount of money that a normal person could feasibly pay, it's a valid way to solve the problem.

And for problems bigger than that, it invites the user to consider what they're really asking for and who they're asking for it. Actually fixing nVidia's dumpster fire would be a huge ordeal for anyone other than nVidia. Asking a third party to do it without documentation... let's just say there is a reason nouveau is in the state that it's in, and it's not a general lack of interest in fixing it.

So in that case "go fix it yourself" means "if you think it's so easy then let's see you do it."


I don't see what the issue here is. If you don't have the time to do it, then get someone else to do it. If they won't do it for free, then you pay them. The unfortunate part is that if you regularly buy brand new Nvidia cards, the cost of the proprietary driver is built into the price of the card, so it sucks that you would have to pay twice for drivers. But this is clearly a result of Nvidia's decision; they are the ones forcing you to take personal responsibility for getting your own libre drivers working, not the developers working on said libre (and gratis) drivers.


Sorry for the confusion, I see mentioning Chromium by name made my comment a bit misleading. I was responding to the parent comment (not to the original post) and using Chromium as an example of how it's unrealistic to ask users to fix things themselves when using OSS. I was not trying to comment on Google's decision on this particular issue here (though I have mild thoughts on that too).


The proprietary version is "if you don't like it, learn to like it."


Which is the same as the open source version?


No, with open source you have the choice to do or pay someone for the work. With closed source you get what the devs give you.


That’s still Hobson’s choice.


> Because everybody is a developer with copious free time to learn how to fix the problem in their favorite product

It takes copious free time to learn how to run chromium with an --ignore-gpu-blacklist argument? What kind of strawman are you trying here?

Look. The open source NVIDIA driver is terrible, and Google doesn't want it messing up the experience of the people using its product. Who are you to demand that Google fix someone else's terrible driver (which is terrible through no fault of Google's -- NVIDIA would prefer it not exist at all)?

If you really cared about this at all, you'd be upset with NVIDIA. But you're not. You apparently don't even like open source software, which presumably means you're using the binary drivers and unaffected, right?


I'm not complaining about the decision, I'm just saying the rationale given is ridiculous.

The driver is hopeless? Fine.

You don't want to spend money on people that will most probably install an ad blocker? Go ahead.

You don't have the resources? Come on.


Obviously it’s not that Google doesn’t have the resources. They have enough resources to fund whole new OSes. Of course they could fund testing.

The Chrome team, on the other hand, does not have the resources. Which is to say that this is not the thing they believe is most important to work on at the margin, given the number of engineers Google will fund for the project.


Which, to be clear, seems entirely justifiable because software rendering exists.

If Nvidia wants their cards to work well on an all-free-software stack, they can provide a free software driver. If users care about making an unofficial driver work, they can do the work or fund the work. Chrome is just trying to get web pages on the screen and it doesn't seem like it's worth the boil-the-ocean level of effort of taking responsibility for implementing a reverse-engineered graphics driver when they can just get web pages on the screen using software rendering.


As far as I understood they could get pages on the screen even with the nouveau driver.


That's the part that I don't understand. The bug report says on Linux the same rendering is used for the UI as well as the page content... but only the UI was ever having a problem?

Something doesn't add up to me there, and no one that I saw ever addressed that in more detail.


If the chrome team doesn't have the resources for testing the most common Linux configurations, it's not a good sign. Linux market share is low but it's a useful signifier.


You're the only one saying they don't. They are saying the most common Linux configuration causes issues, so they blacklist the buggy driver in it.

They are also saying that the driver may work fine on some less common Linux configurations, and the Chrome team does not have the resources to test those.

You are literally saying they said the opposite of what they said, then blaming them for it.


They were using "not enough resources" to support the idea of blacklisting. If they meant it as you say, that means they're blanket-blacklisting the most common config despite having the resources to test it, which is even worse.


I'm not sure what is unclear to you here. They do support the idea of blacklisting for every possible unpopular config, despite it possibly working. This is not only common sense, it is what every company does. When you write your software, do you test it on every possible distribution users could use, or just the popular ones?

The common distribution was blacklisted not because it was not tested, but because it clearly has issues observed and reported by the users.


...no, "every company" does not blacklist unpopular configs.


If your software doesn't work in a popular config they don't blacklist it in all the untested less popular configs? You are clear they are blacklisting the GPU driver, not blacklisting a linux install - right?

If a driver does not work in popular configs, and needs to be blacklisted in those configs, your solution is to still whitelist it in other configs and have the browser likely not work.

yes, "every company" does exactly what google did. you are being purposely dense, trying to catch a "technically correct" chance of not admitting the stupidity of your question. The problem is, you are not even technically correct - just purposely dense.


Huh? It appears they did test it, and for that matter (according to the linked post) so did the Nouveau devs who also found it failed the easily accessible test suite.


We agree Nouveau is completely blacklisted right? That's the main problem here.

And the test suite failed on a minor issue where the cause was uncertain, which gets in the way of checking for the bigger definitely-the-driver problem.


If you can hang the GPU with regular webgl content, that's not a minor issue.


That's the reason I have webgl disabled even though I'm on Windows with a supported driver. It opens a gaping security/DoS hole without any benefit (do you ever see webgl used in any website?)


They have the resources to test that configuration. They just don’t have the resources to fix the driver bugs the tests found.


I mean, just because it's one of two GPU producers, that doesn't mean the nvidia product line is simple. There are desktop chips, there are notebook chips, there are multiple versions of the same card made by different manufacturers who add their own little touches, there are different chipsets, there are different configurations, there are different X11 setups, there are different desktop environments which may enforce different settings, etc.

It's not as simple as "one of the two most widespread GPU producers in the market".


And it's not one of two, it's one of three. Intel has huge market share too, especially in things like notebooks.


Unfortunately, 'the most widespread Linux distribution' is rounding error in the overall scheme of things from a desktop standpoint so it's actually pretty good that they support it as well as they do. Adding Nouveau to the mix is rounding error on the rounding error. So from Google's standpoint, sure they don't have the resources to test it relative to the number of impacted users. Looking at it another way: it's not like Apple or Microsoft will (ever) provide better support so they can get away with it.

I solved the Chrome/Chromium on Linux problem long ago by switching to Firefox. I'll still fire up Chromium for Google sites and a handful of other sites, but for the most part Firefox has been getting it done for me.


They don't have the resources to "fix" it, they tested it and decided it was not stable enough. What they did was right. Firefox entirely disables GPU acceleration, so I am not sure what your argument is.


No, they don't want to waste time trying to make a broken driver work.

It's an Ubuntu/Nvidia problem, not a Google problem. AMD and Intel open source drivers work just fine.


Exactly. Nvidia have been relentlessly hostile to Free Software forever. No one who wants an easy life or a stable, upgradeable operating system would buy an Nvidia card.


> most widespread linux distribution for one of the two most widespread GPU producers in the market

That's still less than a percent of the market I think. Maybe shave off another order of magnitude. So the economic argument is not the one to make. However, it sucks nonetheless.


It's still probably hundreds of thousands of users. I hate arguments along the lines of "but Linux is only a fraction of the market!" because the total OS/browser market is in the billions of people. Even small fractions of that pie could found their own countries and be of reasonable size for a nation.


>found their own countries and be of reasonable size for a nation.

Honestly, you could more likely fit the number of people using nouveau with Chromium in a stadium. From the small pie of Linux users, you drill down to Nvidia, and then further down to nouveau. Laptop users, even with Nvidia cards, will only use the Intel GPU unless they explicitly enable discrete mode without installing the Nvidia drivers. Desktop users obviously only buy Nvidia if they want to use the binary driver. What do you think remains of the pie? I haven't met a single person in real life who uses this stack. I love nouveau. That doesn't change the numbers though.


Then of those people, there is a fraction that uses the Nvidia cards that are impacted, and an even smaller number of users will run into issues.

To save those users from themselves, Google has now disabled GPU acceleration for all Ubuntu users.

Seems a bit silly to me.


If the market share were compelling enough, couldn’t some other maintainer step up and do this work? OSS doesn’t necessarily mean support for everything under the sun at all costs; Chromium may be open source but that doesn’t mean they have carte blanche from Google to test every possible combination of distro and GPU.


This kind of problem specifically doesn't map to "user finds problem, puts money where mouth is on a bug bounty to fix it", because figuring out that your Blink-based browser isn't hardware accelerated is a nontrivial task. Most won't even realize it until they go "oh wow, why's it so much faster on Windows?", and then they won't know where to start looking to find out why.


Why should those would-be maintaners help Google? There's always Firefox to fix.


Firefox disables GPU acceleration on Linux entirely, largely for the same reasons.


I'm pretty sure Mozilla just don't care about Linux these days. It's been getting steadily worse, and 64.0 is almost unusable for me due to a bug that has nothing to do with graphics which hangs the whole thing for minutes at a time.


>Now that Ubuntu ships with nouveau on default, maybe it's time to blacklist it in Chrome.

I don't get this. All Linux distros under the sun ship with nouveau, and always have. It is a kernel driver after all. What else are they going to ship?


I think their point is that by default, even with the "use recommended drivers" option enabled, Ubuntu uses the nouveau driver by default instead of installing and using the NVIDIA driver.


The Nvidia driver is proprietary. If Nvidia opened the source, maybe they would ship with that instead.


Well, installing recommended drivers on Ubuntu installs the Intel or AMD microcode, which is definitely proprietary, so I don't think that's a particularly reasonable argument.

The NVIDIA driver is the driver that yields the best performance and experience; AFAICT the only reason it's not shipped by default is because the kernel is then considered "impure" or something (because source isn't available to debug issues in the NVIDIA driver), hence bugs can't be filed upstream for any issues in the kernel if the NVIDIA driver is installed.

However, none of this is really the concern of the Chrome team. Their responsibility is to give the user the best experience, which they have defined as security, stability, and performance in that order. Since the nouveau driver is causing stability bugs, it's perfectly reasonable for them to fallback to software rendering and take the performance hit in order to preserve stability.


> the kernel is then considered "impure" or something

The term is "tainted".
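For the curious, the kernel exposes this as a bitmask, so it's easy to check whether your running kernel is tainted (bit 0 is the proprietary-module flag):

    cat /proc/sys/kernel/tainted
    # 0 = untainted; an odd value means a proprietary module (e.g. nvidia.ko) is loaded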


> What else are they going to ship ?

Including Nvidia's binary driver by default. Obvs Debian, gNewSense, and GUIX can't do that, but since when have corps cared about them?

Like, don't get me wrong, I'm a FOSS or GTFO kind of guy, running Debian and Replicant, but shipping proprietary drivers by default is what chrome is hinting at obviously.


You can't, for the same reason you can't ship ZFS.

The combining of proprietary and GPL code has to be made by the user and the result is un-distributable. That's why distributions cannot ship it.

What they can do is to make the above step easy (which Ubuntu does).


It's way more complicated than that.

AFAIU, Nvidia's legal argument is that their driver can't be 'derived from' the Linux kernel, even though it's linked into the kernel, because it's older than their Linux port and is only linked against their GPLed (and, if push comes to shove, dual-licensed GPL and proprietary) kernel abstraction layer. Judges don't care about linkers, and the GPL doesn't actually say anything about linking.


This is not the derivation argument (that would be forcing GPL on Nvidia, which they want to counter, of course).

This is about GPL infringement; you can combine GPL-licensed code with differently licensed code, but only in private, without distributing the result to others. Once you distribute the result, you are breaking the GPL and thus have no right to distribute.


The GPLv2 literally defines itself in relation to trees of derived works:

> The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language.

The argument that I've seen is that even when linked together, the Nvidia binary driver isn't a 'derivative under copyright law' because it doesn't have Linux-specific entry points and is older than their Linux port, and therefore isn't part of "The 'Program'" as defined in the GPL.


GPL defines itself by four essential freedoms.

Your argument is about why some piece of software should (or should not) be relicensed to GPL.

My argument takes as given that that piece of software is currently not GPL licensed; it avoids the relicensing topic and instead looks at how to handle the current situation w.r.t. distribution of the combined work. Use by the end user is a non-issue, thanks to freedom 0.


Their argument is that even if linked and distributed, there is no GPL violation for the reasons above.


I thought you could just have the drivers downloaded and installed during installation, i.e. never technically shipped with the OS.


That's somewhat similar to what Ubuntu does, with the additional drivers applet.
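Roughly, the command-line side of that applet looks like this (a sketch; the ubuntu-drivers tool ships with recent Ubuntu releases):

    # list detected hardware and the recommended driver packages
    ubuntu-drivers devices

    # install the recommended (possibly proprietary) drivers
    sudo ubuntu-drivers autoinstall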


This isn't correct. A WiFi router ships with GPL Linux and proprietary router software. The company / foundation creating the distribution just needs to have a license for the proprietary bits.


No, that's not entirely true either. Depending on where the proprietary bits are, it's the other side that is the problem: it is a violation of the GPL. So the usual workaround is to shift the responsibility to the user, so that the non-compliant step is done by the user, by way of a script that runs on the user's behalf.


It's not non-compliant when the user does it, as long as the user doesn't copy and distribute the results.


If the wifi routers are shipping with binary kernel modules, then yes, they are breaking the GPL. This problem is actually common with embedded devices.


I never understood why NanoBSD didn't take off after the Software Freedom Conservancy lawsuits.


First, if you're talking about the lawsuits over wireless routers, that wasn't Conservancy, that was the Software Freedom Law Center (SFLC) working with the FSF. (And some feel those lawsuits were poorly timed, preempting active negotiations in progress and being largely responsible for Cisco ceasing large Linux-based projects in progress.)

And second, to answer your question, partly because Linux is sufficiently good and has wide enough hardware support that it's still easier to use Linux legally under the GPL than to use another operating system, and partly because if you're going to use another operating system with narrower hardware support you might as well use a tinier embedded one and ship cheaper hardware (which is what Linksys/Cisco did with the WRT54G, shipping VxWorks and less RAM, and then selling the more-Linux-capable WRT54GL as a higher-end product).


Beyond the whole FOSS aspect, there are good technical reasons too. For instance, the blob won't work with Wayland (at least right now, AFAIK), and Ubuntu was (is) debating switching to Wayland as the default.


If Nouveau isn't working all that well, then clearly a switch to a new graphics stack that would require it is waaay premature, given how prevalent NVIDIA graphics hardware is.


It does actually work fairly well for non-gaming uses.


> blob won't work with wayland (at least right now AFAIK)

AFAIK it works with Gnome's Wayland Implementation, but not with the one from KDE.


My impression was that the 17.10 Weyland stuff was kind of a mess so the switch is probably premature at this point.


Fedora is Wayland-by-default for several releases already.


I had a handful of laptops at work that were briefly on 17.10 (because 16.04 had a showstopping bug for them) and the graphics were a complete headache until I switched them to X. Luckily 18.04 reverted to X and made my life easier.


I've never had a problem with it; but then, my laptops all have Intel GPUs, and my only desktop with an AMD GPU handles it nicely too.

Chances are that you are using the modesetting X driver anyway (i.e. the KMS driver, the same driver/backend that is used by Wayland).


Fedora is a bleeding edge distribution whose sole purpose is to test out new features.


Sort of. It falls back to X for drivers that don't support Wayland, which has included nvidia for a long time. (Maybe there is some progress there more recently.) And the proprietary nvidia driver is widely used.


Does it really? I had grotesque lag when installing Fedora on a System76 laptop with an Nvidia card and had to switch to X manually.


Wait until they join up with Yutani.


I think what they mean is "it's now enabled and used by default even for the click-click-click-through-the-menu (non-technical) users", not that the driver is technically present.


> What else are they going to ship ?

That's not the point. The point is that it results in brokenness, and the easiest way to mitigate that for the bulk of non-technical users is to use software rendering instead (i.e. blacklist it). They're open to more refined solutions, but these will take some effort.


The VESA driver?


How does llvmpipe do these days? I remember being excited about it ~5 years ago and I can only imagine both LLVM performance and CPU speeds have gone up.


llvmpipe is the GPU part, not the display part of the graphics card.

Otherwise, when used in a KVM VM, it can run the desktop compositor quite nicely.


But that's the part that's crashing. I'm suggesting Chrome use llvmpipe to do whatever they needed hardware acceleration for, generate a single boring texture on a single big rectangle, and pass that through to the graphics driver. That way Chrome itself isn't loading the Nouveau libGL and is much more insulated from incompatibilities between Chrome and Nouveau, whosever fault it might be.


WebGL requires far more than just blitting onto a single boring texture.


That's why you go through llvmpipe: you use the Mesa OpenGL implementation in LLVM-JIT-accelerated but pure software mode to implement WebGL, get the results onto a texture, and pass that texture onto your actual graphics card.

https://www.mesa3d.org/llvmpipe.html
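As a rough sketch of what that means from the outside (this just forces Mesa's software path for a whole process; it isn't the per-texture handoff described above):

    # force Mesa to use its software rasterizer (llvmpipe, if it was built in)
    LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep "OpenGL renderer"
    # typically reports something like: OpenGL renderer string: llvmpipe (LLVM ...)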


That seems like something you'd prefer not to do if your OpenGL driver wasn't broken.


That's what swiftshader does.


Whoa, I totally missed that! https://blog.chromium.org/2016/06/universal-rendering-with-s... So is it the case that you get decent (though perhaps not great) WebGL performance even if you have no GL-compatible graphics driver at all?

Then are Nouveau users only losing performance and not functionality as a result of this Google decision?


I didn't look at the bugs at issue, but I rather doubt mode switching is what Chromium cares about. It's sure to be video decode or GPU output.


Chrome without acceleration == one very hot laptop in many cases (e.g. watching videos on YouTube).


Chrome without acceleration ~~ Firefox without acceleration. I don't think Firefox is enabling hardware video decoding in Linux, even on a fully-free driver?


Acceleration in browser is not only about video decoding.

Compositing is GPU-accelerated too. Chrome has it enabled by default; Firefox doesn't.

In the end, both browsers treat Linux as a second-class citizen. To make things even weirder, ChromeOS uses the same API (VA-API) and the same driver (for Intel GPUs) for video decoding acceleration that stock Linux does, and video decoding is supported on ChromeOS but not on Linux... And Firefox is just a plain mess.


The main issue I have with the gpu blacklist is that chrome provides the user with no information or notice when it triggers.


One other problem with this is that now, if you want to run Chrome with --ignore-gpu-blacklist (or any other flag), you have to do that for every one of those bloated Electron apps that you have to run. There's no way to set a system-wide flag.

I'm facing this problem and it's very annoying. I have to run Chrome with --disable-gpu-driver-bug-workarounds, which ironically makes it work perfectly; with the "driver bug workarounds" enabled, it's terribly broken. But now, any Electron app I run (e.g. VSCode, Postman, Discord, etc.) is also broken by default and needs the flag as well. Frustrating.
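A partial workaround is to wrap the launchers so the flag is always passed; a sketch, assuming the Electron app forwards Chromium switches from its command line (most do) and that /usr/bin/code is the real binary on your system:

    #!/bin/sh
    # ~/bin/code -- hypothetical wrapper, placed earlier in $PATH than the real binary
    exec /usr/bin/code --disable-gpu-driver-bug-workarounds "$@"

It's still per-app, though, which is exactly the annoyance.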


Because it shows that there are different opinions about what a user can reasonably expect, the key quote for me is the following:

> Please remove the blacklist as all Nouveau users expect this corrupted behaviour. If at all possible it may be worthwhile to let users know it doesn’t have to be this bad if they flip to the binary driver. Perhaps a pop up letting them know?


[flagged]


It's not just gpu/driver, it's library versions, compile flags, compiler versions plus all the flags (static and dynamic).

Let's say 50 options. If it costs $1 to test each combo, it would cost 2^50 dollars, around $10^15. Google's profit last year was only on the order of $10^10.

Now of course I'm being facetious but do not underestimate the cost of testing.


That cost assessment is bizarre. Consider how many variants there are across the entire Google software library; somehow I doubt they're spending trillions on testing.


Feel free to use other browsers that do test this.


Other browser vendors don't have Google's resources.

And still, Mozilla manages to do a better job in this regard.


The bugginess of the nouveau driver has been an accepted fact since time immemorial, hence Torvalds' middle finger on the subject. Nvidia is slowly coming around, but the arc of history isn't treating it well.


The author, Ilia, noticed that this is on HN and responded in the mailing list thread. I don't know if I agree with the take, but I'll repost because I found it interesting.

---

I've glanced at the HN discussion about this situation [...] and it does seem like people are focusing on the wrong thing... the important bit isn't that nouveau crashes and burns in some situations—everyone already knew that, including the users of nouveau who continue to use it nonetheless. It's that if every piece of software feels free to ignore a system integrator's or user's wishes, then the user now has to know how to override that behaviour separately in every application. The situation is that Distro X has decided that nouveau is the right thing for its users. A user can disable that by uninstalling or otherwise disabling nouveau if they wish. But now chrome comes along with its own set of rules. What if every application starts doing that?

It should also be noted that outside of a few pathological cases, like creating 2GB+ textures which never happens in practice, nouveau works just fine for me. For other people, it dies at random intervals, irrespective of whether they're using chrome or not. While this is a non-ideal scenario, chrome shouldn't be in the business of worrying about things like that. It just confuses the situation for everyone.


> every piece of software feels free to ignore a system integrator

Yes, they do, and they have very good reasons for it. Working with distros to ensure they ship the right libraries takes an infinite amount of effort, so companies that want to ship a product to their users will work around it.

This also happens in the server space with the massive popularity of containers and language-specific package managers.

The traditional Linux Distro model is extremely flawed.


They already do this with fonts.


Nouveau are doing their best to reverse-engineer cards. NVIDIA doesn't help at all, not providing any docs and not interacting with the rest of the community. On the other hand, Intel and AMD have high-quality drivers and are part of the open-source community.


> NVIDIA doesn't help at all, not providing any docs and not interacting with the rest of the community.

That is simply not true.

Various docs, provided based on requests by Nouveau developers: https://download.nvidia.com/open-gpu-doc/

Documentation released in the last year include e.g. https://download.nvidia.com/open-gpu-doc/MemoryTweakTable/1/... https://download.nvidia.com/open-gpu-doc/MemoryClockTable/1/... https://download.nvidia.com/open-gpu-doc/BIOS-Information-Ta...

The announcement of the first piece of documentation in 2013: https://www.mail-archive.com/nouveau@lists.freedesktop.org/m...

Commits with @nvidia.com addresses (including Reviewed-By tags etc.) to nouveau driver: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

You can also regularly see posts by NVIDIA employees on the nouveau mailing list.

Sure, they are contributing nowhere near as much as AMD or Intel, but to say they are not interacting at all or not providing any docs or not helping at all is clearly false.


It's true they don't literally do nothing, but are Nouveau developers getting the information they need about newer hardware? I think these benchmarks are illuminating: https://www.phoronix.com/scan.php?page=article&item=nouveau-...

You get 3x better performance with a GTX 780 Ti than a GTX 980 Ti. From reading the explanation from a Nouveau developer in the comments, I get the impression that NVIDIA just wants the open source drivers to work well enough that you can boot your computer and download the proprietary drivers.


I think the biggest missing piece currently is signed PMU firmware for GM20x to allow reclocking, which NVIDIA has not released and does not seem to currently be planning to (even though they have released some other GM20x firmware).

This post from a Nouveau developer a year ago provides a summary, which I think is probably still accurate: https://www.phoronix.com/forums/forum/linux-graphics-x-org-d...


Isn't the reclocking support for GM20x and newer the biggest pain point for last several years?


AFAIK yes, and I'm not aware of any positive developments on that front lately, though I haven't been following that closely.


Ya. It's too bad their hardware isn't more competitive. I am definitely rooting for AMD to wow us next generation like they did in the CPU market.

Things are getting far better for OSS GPU drivers. If you wanted any 3D acceleration in the past, your only choice was the proprietary Nvidia driver or a broken proprietary ATI driver. (The ATI Windows drivers were pretty buggy too during this period.)

Back then the proprietary Nvidia driver kept much better pace with their Windows offering. Last I checked, their proprietary Linux driver was almost a year behind (point-release-wise) the Windows offering.


> I am definitely rooting for AMD to wow us next generation like they did in the CPU market.

Don't set your expectations too high. Their next generation of consumer graphics cards, rumored to be announced at CES next week, are reportedly still going to be based on a new revision of their current GCN architecture (which originally debuted in 2011 and is very much showing its age these days). Rumored performance for the "high-end" model will be somewhere around the GTX 1080 or RTX 2070 (though reportedly at a much lower price compared to either of those cards).

If you're waiting for them to produce an all-new GPU architecture the rumors are that they are working on one to be launched in the 2020-2021 timeframe. That's also when Intel is rumored to be preparing to launch their own dedicated GPUs so hopefully we will be going from 0 properly high-end GPUs with open-source drivers to 2 by 2021.


I got an amd fury x a while back because the nvidia binary drivers regressed severely with my old card, and I wanted 4K anyway.

It's by far the best Linux GPU I've ever encountered, and that's with open source drivers that don't set the kernel taint bit.

Its performance in benchmarks was much better than the comparably priced NVIDIA. It is also the quietest gamer GPU I’ve encountered (I bought one with slightly nicer fans).

Maybe things have changed (but the 1080 was certainly out back then), but this card wins on every metric other than CUDA support (nonexistent) and absolute performance (close enough to fastest, and certainly overkill for my Linux Steam collection at 4K).


I'm not saying that AMD's current cards are slouches (I'm planning to buy one myself this year when I build my next computer), but they certainly do not meet the performance bar set by Nvidia's current high-end models (the RTX 2080 Ti and Titan RTX), which is what I mean by "properly high-end". Those are admittedly niche products due to their high pricing, but it is important for AMD to have a competitive product in that segment, since many people don't do a lot of research; if all they know about GPUs is "Nvidia has the fastest cards" (or an equally uninformed salesperson tells them that) and they buy a 1060 over a 580 based on that "knowledge", then AMD has lost a sale.


> they certainly do not meet the performance bar set by Nvidia's current high-end models

Assuming you can't tell the difference between, say, 100fps and 150fps, isn't the 'performance bar' a bit arbitrary after a certain point if the hardware runs the vast majority of the software thrown at it?


Objectively you are correct. However, as the rest of my comment indicates there are important marketing and brand-value perception reasons to have a true high-end card: how many professional gamers (streamers, e-sports, extreme overclockers, etc.) who do buy this class of GPU are there playing with AMD cards vs Nvidia cards? How many (tens/hundreds of) millions of views do they collectively get playing games with Nvidia hardware? Nvidia has likely gotten millions of dollars in effectively free marketing just for having these cards.

AMD's GPU market share has been shrinking with the general gaming audience despite the fact that their mid-range cards have been very price-competitive with Nvidia's offerings during the same period they've lacked serious competition at the high end. While focusing on the mid-range, where the highest volume of GPUs is sold, is the rational market strategy, it may not be a winning strategy in the real world.


> will be somewhere around the GTX 1080 or RTX 2070

Aren't those still adequate to run like, 99.9% of all games on the market? I would (and do) gladly pay for hardware that isn't the absolute fastest if the manufacturer employs a Linux driver team to make that hardware work on Linux.


I agree with you however as I explained in my reply to hedora there are important reasons why AMD should have a card in this performance class.


> Back then the proprietary Nvidia driver kept much better Pace with their windows offering. Last I checked their proprietary Linux driver was almost a year behind (point release wise) the windows offering.

Am I reading this right? Nvidia's drivers were never out of sync for a year with regard to supported GPUs, OpenGL, or Vulkan features. It's easy to see that by looking at the Vulkan beta drivers page at https://developer.nvidia.com/vulkan-driver. (That's just a convenient example; non-beta drivers follow the same cadence, but there's no single page I can link.)


AMD GPU hardware is definitely competitive, they just aren't top of the line. At the mid range, they are very competitive, and that's what most people need anyway.

My next GPU will be an AMD GPU, as soon as my GTX 960 stops being sufficient (and no Wayland is certainly an issue).


In fact, when buying graphics cards I look much more at driver support than at pure FPS performance. I have a Dell XPS with NVIDIA graphics and an AMD RX460 in my desktop PC.

And in my experience, the NVIDIA driver landscape is just a mess. The state of Nouveau can be seen in the article above, and the proprietary driver causes all kinds of weird issues (e.g. UI spinners rotating at different speeds, fans spinning faster than they are supposed to, etc.). On the other hand, the (open source) AMD driver seems to have evolved quite well over the last few years.

As a result, I hate my NVIDIA card and like my AMD card quite a lot. Every time I see those Steam survey results I wonder why there are still so many people buying NVIDIA cards for their Linux boxes. They probably just trust the benchmarks and don't compare the experience first hand.


In the last 6 years I've used a 7870, a 290, and a 580 as my desktop's GPU. All have been stellar cards that have only gotten better over time.

Back when I got that 7870 circa 2012 I was taking a major risk: at the time the 6870 was the premier FOSS card and radeonSI was brand new, with growing pains. The 290 was another risk, being the first major architecture update to GCN. But I was validated in both purchases, with cards that were usable at first and within a few months became excellent.

I've played a lot of the major Linux AAA releases on these cards shortly after release - Borderlands 2, Civ 5, Metro Last Light / Redux, Tomb Raider, etc. There used to be quite a few bugs at release and glitches. Nowadays everything is pristine. And I get about twice the framerate in BL2 today than I did four years ago on that 290.

I'm almost certainly going to buy a Navi card next unless we have another crypto bomb ruin the market again.


AMD GPUs are definitely competitive unless you've made a business out of computing things on the GPU. For most daily tasks like graphics (games) or even moderate number crunching (OpenCL), AMD GPUs are perfectly usable and high value.


How big is the gap between opencl and cuda right now?


These days Vulkan can be used for compute as well, and it gives the user a far more refined interface than OpenCL. There's very little reason to limit oneself to the latter.


Any Vulkan library you can suggest? Last time I checked Vulkan, it required an enormous amount of boilerplate just to connect to the GPU, compile, send data, and run code. So I ended up using OpenCL for years. Unless there are good high-level libraries around, I would still stick to OpenCL.


I see it repeated all the time that AMD has great open source drivers, but their latest drivers on their 5xx line card couldn’t run my 4K display at 60hz over Display Port without splitting the screen into two separate X displays. Nvidia’s proprietary driver can handle it no problem.

Needless to say I have an AMD card, with awesome open source drivers, sitting in a bin. While the horrid binary-blob Nvidia actually drives my display.


That's a curious situation. I run my 4k displays over DisplayPort with an RX 580 on Ubuntu 18.04. However, I haven't gone out of my way to obtain drivers from AMD. I'm just using the drivers that ship with the kernel. It might be worth trying a vanilla installation of Ubuntu to see if you can get your set up working.


I did try both the kernel open source drivers and AMD’s pro edition.

Most likely you aren’t using Display Port 1.2 MST, which is what the Dell UP3214Q requires for 60hz 4K.

My testing was with vanilla Ubuntu 18.04.


For those closer to the hardware with insight into the emerging architectures and market dynamics, how far away is AMD/Intel/IBM/Google from catching NVIDIA from a technology perspective, and does NVIDIA have a defensible competitive advantage now that market forces have moved well beyond gaming and shifted US strategy toward a spending spree of buying up all IP?


What’s the rationale for NVIDIA for this? As the whole point is to expose hardware features through an API, what do they gain by keeping it closed? Are there features implemented purely in software?


The most commonly used APIs for graphics (X11 with all the "accelerated" rendering, OpenGL) match the actual current GPU hardware so poorly that most of the driver bloat is about dealing with that mismatch in the most efficient and transparent way possible. The drivers also must contain a collection of compilers and assemblers and other stuff (because the OpenGL spec says so). So there is a lot of secret sauce type stuff in there that these companies don't want you to know about. Whether this is justified is a completely different question.


The code might expose the implementation for a lot of NVIDIA specific features and libraries. Having that open and possibly ported to other hardware would not be in NVIDIAs interest.


[flagged]


Hey, I think you're being downvoted a bit harshly for your statement so I've decided to help you out by not suing you.

Cheers.


Both projects seem to put the blame on the other project. IMO, there are at least two other factions that can be blamed equally well.

First, nvidia for not providing OSS drivers and second, the user who runs an OS on hardware that can't be properly supported on this OS at the moment.

For years I've had the rule that I prefer hardware with proper Linux support. I even go a bit further and try not to buy hardware that doesn't have OSS drivers unless there really is no other option (nowadays, that mostly means smartphones).


> nvidia for not providing OSS drivers

It's worse than this. Nvidia does not merely not provide open source drivers, they actively preclude work on Nouveau.

Phoronix link: https://www.phoronix.com/scan.php?page=news_item&px=Nouveau-...


Ouch, that's mean. IMHO the only way to react to that is to close down shop. Ideally including Linux throwing out support for Nvidia cards.

I have a strong opinion on this, as I was involved a few years ago in the Kindle homebrew community. After a while I decided that my time could be used better than on a product by a company that tries to prevent us from making its product better and more interesting to its users.

There is so much energy and time wasted when you have to fight against the manufacturer, and in the case of a graphics card, the result is that it merely works as well as the other graphics card option that is fully supported OOTB. And that's just for a piece of hardware that you can replace, on average, for a few hundred bucks.

Is that really worth the hassle?


I've personally actively avoided buying anything with Nvidia hardware in it for 8 years now because of how awful their software business is. That has meant I've even skipped over tablets and phones using their chips and will never buy a Nintendo Switch because... well... firstly consoles are bullshit crippling perfectly good computers with proprietary OSes but secondly because it has an Nvidia chip in it.

I tell everyone I know not to buy Nvidia. I have built gaming computers both professionally and for family, and I've actively advised against Nvidia parts because of their business practices, and when I buy notebooks it's a real PITA to get one without some Nvidia GPU in it, but I do that too.

But Linux cannot "throw out" support for Nvidia. Nvidia's driver is wholly independent of Linux; that's why it's such a PITA to work with. It's a DKMS blob driver, and way too much other useful stuff depends on that interface to break it just to stop Nvidia. Fortunately nobody is going out of their way to support Nvidia either, such as their stupid attempt to fragment Wayland development with the eglstreams nonsense.

The most valuable thing any of us can do is constantly decry Nvidia for being the reprehensible company they are and use what influence we have over those we know to distance them from buying their products the same way you should distance friends and family from buying Nestle or Exxon products or should avoid shopping at Walmart.


If you do any machine learning you'll likely still need Nvidia for CUDA, it was the deal breaker for going AMD in my case.


Would you mind giving an overview of their reprehensible business practices for those of us who just buy GPUs without following the companies?


I'm nowhere as fanatical about it since when it's come to corporations there is no black and white, but there at least few cases I can recall:

* Even decade ago they did a lot of shady stuff when they initially separated GeForce and Quadro series. There was time when you can make little hardware mod on Geforce product that make driver think you have Quadro and magically a FP64 become twice faster and CAD application performance was doubled and tripled.

* Their OpenGL implementation was always non-standard and allowed both code and shaders to work in situations where they clearly should not. As a result, games and apps that were debugged on Nvidia hardware would never work on the more conformant Intel / AMD drivers.

* As for Nouveau, back in 2014 Nvidia started to require signed firmware files [0]. They promised shortly after to provide Nouveau with the files under clear licensing, but it took years. When the files were finally released, everything related to power management was missing, and it still hasn't been released to this day. So Nouveau can't implement reclocking. On top of that, they intentionally obfuscate the way firmware files are stored within the driver binary and also make the GPU load firmware via DMA to make extraction much harder.

* Sabotaging OpenCL in general by not implementing the spec for years and keeping it generally much slower than it's supposed to be.

* Several years ago Nvidia twice added checks to their Windows drivers to stop consumer GPU drivers from working when the card is passed into a virtual machine. Both were bypassed by hiding the KVM identifier and changing the HyperV hv_vendor_id, but it was annoying [1].

* Nvidia forbade the use of consumer hardware drivers in datacenters [2]

* Nvidia tried to push a crazy anti-competitive program to make exclusive deals with GPU vendors so they couldn't sell AMD GPUs in any kind of "gaming" product line [3]

Again, it's not like AMD is perfect, but that's a lot of shady stuff, and I likely haven't even listed half of it.

[0] https://hardware.slashdot.org/story/14/09/27/1254219/nvidia-...

[1] http://vfio.blogspot.com/2014/08/vfiovga-faq.html

[2] https://news.ycombinator.com/item?id=16002068

[3] https://news.ycombinator.com/item?id=16675557


> There was a time when you could make a little hardware mod to a GeForce product that made the driver think you had a Quadro, and magically FP64 became twice as fast and CAD application performance doubled or tripled.

Do you think that overclocking a CPU is "magic", too? You were just running the hardware out of spec! Yes, FP64 and CAD were a lot faster, whatever; are you going to trust these faster-reached results to always be correct? Or that the card isn't going to break down in the midst of some important workload? The whole point of labeling something "Quadro" is to say "these cards are good for that sort of critical stuff - they've been extensively tested for reliability".


What's bad about Nestle? I know their products are very expensive, but aside from that? Care to elaborate?


Among the most recent ones, lobbying for water privatization: https://www.google.com/search?client=opera&q=nestle+water+pr...

And the usual reliance on palm oil, as another example.


This is a reference to the baby milk scandal [1]. Nestlé would market their powdered baby milk in developing countries as being better than breast milk, even though it's not. Worse than that, they would give away samples to new mothers for free or at a substantially reduced price; the mothers would then lose the ability to lactate because they were not actively doing it. After that they would be forced to buy the powder, and not all of them could afford it.

This was at its worst in the late 1970s, but it seems the practice hasn't entirely stopped even today.

[1] https://en.m.wikipedia.org/wiki/Nestlé_boycott


If everyone in the FOSS world shows them the middle finger like Linus did, maybe, just maybe, they will become a bit friendlier. But our only true hope is for this company to go bankrupt and out of the market.


Well, not quite. The charitable way to put it is that they are requiring signed firmware in hardware, in order to help stem fraud by 3rd party resellers (where some card model X is sold to naïve buyers as a more desirable model Y, with hacked firmware to match), and the Nouveau project is obviously impacted.


Problem is that Nvidia did much more than just signing firmware:

* Separated the firmware needed for a basic driver from that needed for a fully functional one, and gave Nouveau only the bits they wanted.

* Obfuscated and hid the firmware files within the driver binaries so they are harder to extract.

* Changed the loading process so the GPU reads the firmware via DMA, making reverse engineering of firmware loading harder.

It's pretty clear that all of that was exclusively done to harm Nouveau.


> It's worse than this. Nvidia does not merely not provide open source drivers, they actively preclude work on Nouveau.

If the noveau developers had true hacker spirit, they would move nouveau development into the darknet.


> First, nvidia for not providing OSS drivers and second, the user who runs an OS on hardware that can't be properly supported on this OS at the moment.

nVidia does provide perfectly working closed source drivers on Linux though - how is it their fault that some 3rd party implementation of the drivers doesn't work in a stable manner?


Define "perfectly working". If I download the Linux sources, compile the kernel and install it, there is no official Nvidia driver to be seen.

I didn't say that Nvidia is at fault for the state of the nouveau driver. I said they are at fault for not providing OSS drivers. Nvidia doesn't want to play by the rules of Linux development but they still want to be on Linux. That's legally possible but why should the user or distro manager give Nvidia a pass here when other companies play by the rules?

Think about it. Which components would you be willing to download drivers for if they weren't OSS? Your monitor? Your sound card? Your harddisk? Your network card (oops)? Your mouse or keyboard?


If you download the current Windows ISO image and install that offline, you won't get a proper driver for your nVidia card either. Yet, this is accepted as normal.

Some Linux distributions that do not hamper themselves artificially on philosophical grounds offer nvidia binary drivers in their package repositories like they do with every other driver they contain. Your focus on the kernel sources is too narrow.

There absolutely was a time when you had to download all these drivers you mention separately - for Windows. And people put up with it. I think even today the Windows installer is forced to offer an option to install 3rd party RAID controller drivers into the installation environment...


Parent is referring to the fact that Linux drivers are meant to be baked into the kernel and Nvidia doesn't wish to play ball.

You can't compare Linux and Windows when it comes to drivers, they have different methods of providing those drivers.


Yeah, and Linux's insistence on not having a stable ABI for drivers is problematic for Nvidia, just as Nvidia's insistence on not open sourcing their drivers is problematic for Linux.


I can fully compare those, your wish to the contrary notwithstanding:

- both are viable desktop operating systems

- nVidia successfully and consistently releases high quality video drivers for both


I installed Ubuntu last month; I didn't do anything special and the nVidia drivers are running on my system, providing an excellent gaming experience.

So to me it seems to work just fine.


I'd say, if you care about using an open source driver, you'd avoid Nvidia to begin with and use AMD or Intel instead.

Here is a very good post from one of the key Nouveau developers, about why you should do exactly that:

https://www.phoronix.com/forums/forum/linux-graphics-x-org-d...


And avoid Chromium, because there are a number of dependencies under 'third-party' with no/unclear licensing (and which are thus assumed to be non-free).


AMD cards are crap for anything advanced: they couldn't recognize one of my monitors (>2 monitor setup), and they're not playing fair with their AMF extension on Linux (it works great on Windows though). Even on Windows, I cannot get the card to properly detect my monitor resolutions (it's sitting behind a multiplexer, though there's no problem with Nvidia).


AMD cards work just fine. Not sure what kind of issues you are having with your dual-monitor setup. Maybe you are using an ancient kernel?

What's an AMF extension?


Triple monitor on two physical screens, and no, AMD cards do not work, in this case using the DisplayPort output and a DVI->HDMI adapter. And yes, it works with a third physical monitor (not acceptable in my setup). Nvidia/Nouveau do not show these problems.

AMF == Advanced Media Framework.


On Linux you have vaapi for hardware encoding.

AMF doesn't seem to have a Linux version to begin with, so not sure what you were testing exactly: https://github.com/GPUOpen-LibrariesAndSDKs/AMF/issues/4


HEVC hardware encoding is not supported by the "amdgpu" VAAPI libs (tested about 6 months ago with an RX560; did you get it working?).


NVidia comes out of this looking very bad. It's not as if a GPU can't have good Linux support (see AMD, Intel), but theirs doesn't. Although I haven't spent my hard-earned money on their hardware since 2005, I have several machines at work that are burdened with it. Some lock up with Nouveau drivers, others crash with NVidia drivers. I can only conclude that the hardware is unstable.


Then it is good. The worse NVIDIA's public image is, the better.


The problem from my perspective is that a major GPU vendor seems to think it's appropriate to paper over hardware problems with proprietary drivers, and they get away with it. While their reputation ought to suffer for that, I'd rather see them mend their ways. I'd love to take NVidia off my blacklist.


It's not the Chromium team's job to debug nouveau. Switching it to be the default for everything was obviously premature.


Yet I use Firefox with Nouveau just fine...


Firefox also disables GPU acceleration with it by default.


It works on my machine, ship it!


Uh no, it means that you can make it work and that some people are willing to spend the time to do it right. You'd think that if Mozilla can afford to do this, so could Google. The only way the problems in the drivers can get fixed is through usage; if Chrome flat out refuses to work with the driver, then the issues can't be fixed.

This also illustrates why it's important to have alternatives to Chrome for anybody who cares about open source.


I assess with high confidence that that was a satirical comment.

It could be working on FF only through a quirk in their rendering, rather than testing. Chrome certainly has a test suite -- did anyone run it with Nouveau and turn bugs into PRs? And arguably, that job belongs to the desktop owners as well as the Nouveau and Chromium teams. Browsers are insanely complicated, and who is paying for this work?

If I could make my employer pay for a meaningful number of Ubuntu desktop licences I would, but TBH I need it to work with the nVidia driver not nouveau.


Maybe it's working through a quirk, or maybe the Firefox team writes better code that handles a wider range of drivers. I choose to use Firefox because the team behind it cares about my needs. If Google can't afford to make a browser that works with open source drivers, I will simply choose not to use it and encourage others to use Firefox instead. This is precisely why having alternatives to Chrome is important.


Pretty sure that was sarcasm, just fyi


With hardware acceleration enabled?


Yeah, I've seen it crap out now and then, but generally works.


Isn't this the key quote:

> I did run the WebGL CTS suite, but that resulted in some

> hangs from the the max-texture-size-equivalent test, and some

> browser-level weirdness after some tests where later tests all fail

> (due to what I have to assume is a browser bug).

Isn't it more likely that they are Nouveau bugs, and are part of the problem here?


> hangs from the the max-texture-size-equivalent tes

I reported this bug back in 2015 [1] and they have made no progress on it. It seems Nouveau just doesn't have the resources or manpower. I can't really blame Google here. You don't want the browser to be able to hard-lock the system (not even a Magic SysRq will reboot it).

[1] https://bugs.freedesktop.org/show_bug.cgi?id=92136


> You don't want the browser to be able to hard lock system(not even a Magic SysRq will reboot it)

Wait, this isn't just rendering bugs a la "some WebGL things will show green instead of red", it kills your system entirely?

I thought the only effect this blacklisting had was making WebGL unavailable for everyone instead of buggy for a small minority. This just flipped my opinion.

Related, this might also explain the strange bluescreen bug my girlfriend had with Windows 8 + Firefox on her laptop. She switched to Google Chrome a few years ago because every time she used Firefox, her laptop would bluescreen within a few hours (either while Firefox was still running, or at some point later). I didn't know bugs outside of the OS could still do that kind of thing (I thought they started squashing such bugs with Windows 95 and finished somewhere around Vista) and wrote the Firefox thing off as triggering some extremely obscure Windows OS-level issue. Maybe I should check if there is a driver blacklisted by Chrome, though it seems more likely Chrome just doesn't happen to trigger it (no other software on the machine triggers it either; it really is only Firefox).


Driver crashes can easily take down a system. That's why driver quality matters so much. This is true on every OS. (Although, Windows 7+ has mechanisms to help recover from GPU driver crashes specifically, which usually work.)

Also, Chrome can render WebGL on the CPU. It's slower but will still work.


Doesn't hardware accelerated graphics on Firefox on Linux require modifying a preference? AFAIK you need to go to about:config and enable layers.acceleration.force-enabled. (This is speaking as someone who uses Firefox on Linux as my main browser.)

The bug for enabling hardware rendering by default on some systems appears to still be open: https://bugzilla.mozilla.org/show_bug.cgi?id=594876
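
For completeness, here is a sketch of setting that pref from the shell instead of clicking through about:config, via a user.js file in the profile directory (the directory name below is a placeholder; substitute your actual profile):

    # Firefox reads user.js from the profile directory at startup and applies the prefs in it
    PROFILE_DIR=~/.mozilla/firefox/xxxxxxxx.default   # placeholder: adjust to your profile directory
    echo 'user_pref("layers.acceleration.force-enabled", true);' >> "$PROFILE_DIR/user.js"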


Yes, Firefox is stuck with blacklisting basically everything. I guess the developers haven't revisited the driver situation recently.

Just set these in your $HOME/.profile

    export MOZ_WEBRENDER=1
    export MOZ_ACCELERATED=1


Please don't encourage people to turn on WebRender without at least warning them that it is experimental, still under development, can cause your computer to blow up, etc. Also the environment variables you specified will force it on even on non-Nightly builds which is very undesirable since those builds are not getting fixes uplifted.

(I work on WebRender for Firefox)


I've been using WebRender on beta builds on Linux for several months already - it works quite well (AMD / radeonsi).

WebRender is more raw indeed, but regular layers acceleration should surely be re-evaluated instead of the sweeping blacklisting that happens now.



I'm not the only one who is using layers with acceleration just fine though.


Thanks!

Edit - will that work for the snap version BTW?


You'd need to figure out how environment variables work in Snap and set them there, I haven't used it myself.
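
One thing that might work is setting the variables just for the launch command instead of in .profile (an untested sketch, assuming the snap passes the caller's environment through to the app):

    # Set the variables only for this one launch of the snap-packaged Firefox
    MOZ_WEBRENDER=1 MOZ_ACCELERATED=1 snap run firefox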


> One idea is to flip GL_VENDOR to some random string if chromium is running.

What? Don't do this - never do this. Regardless of whether or not you agree with Chromium's decision, it is their decision to make as the project maintainers; trying to trick the software with falsified data is overstepping your bounds, and both corrodes trust in your driver and leads to whitelists.

I thought we had moved past the days of fake user-agents.


> trying to trick the software with falsified data is overstepping your bounds.

Fwiw, I agree with this statement only because Chrome lets users override the default behavior.

In the hypothetical event that Chrome was closed-source and forcibly disabling acceleration under a driver, I'd consider fakery on the part of the driver to be perfectly warranted.


What we have is two sides accusing each other of overstepping their bounds: Chromium making decisions about drivers, which is normally the domain of the operating system, and the driver making decisions about what kind of systems the browser will support.

The more rational behavior is for Chromium to let the operating system decide what software and drivers it wants to use, and do what they want with bugs from operating systems that they don't want to support.

Whitelists won't work, since all that will happen then is that the project will do what Internet Explorer did, i.e. call itself "compatible" and use the GL_VENDOR string of a compatible driver.


> Chromium making decision about drivers which normally is the domain of the operative system

Chromium isn't loading their own graphics drivers. They are deciding to fall back to software rendering if they determine that the card/driver/OS combination does not properly support the API that enables them to use the hardware-acceleration features of the card to render. They're still rendering using the Nouveau driver on the OS, they're just not using the APIs that they have determined to be not functioning properly. Detecting what the underlying graphics hardware and driver supports and falling back to what is functional is an incredibly common part of software using GPUs.


Totally agree.

If you want to use hardware acceleration then go to about://flags and search for blacklist

"Override software rendering list: Overrides the built-in software rendering list and enables GPU-acceleration on unsupported system configurations. – Mac, Windows, Linux, Chrome OS, Android"

Simple and if you have telemetry logging on then you are possibly advertising that nouveau works for you.
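
If you'd rather not flip the flag permanently, the command-line switch mentioned in the bug report does the same thing for a single session (a sketch; the binary name depends on your distro/package):

    # One-off launch with the GPU blacklist ignored, at your own risk
    chromium --ignore-gpu-blacklist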


The problem is that the Nouveau (and Ubuntu) devs have no way to overturn Google's decision in Chrome. Ubuntu can compile Chromium with a modified blacklist, but Google distributes Chrome in a deb. You would need some special logic to catch the Chrome .desktop on install and add in the unblacklist flag.


The user can decide on his own if he wants the application to run on an unsupported driver or not.

The application decided by default not to offer support for it.

The one who should absolutely not decide for the user is the unsupported driver itself, so I think the hierarchy is just fine as it is.


They could add “--ignore-gpu-blacklist” to their .desktop entry for Chrome.


Ubuntu doesn't have a .desktop entry for Chrome. It comes with the deb. Desktop files are installed by the programs that use them, hence why I said you need to catch the Chrome install and modify the desktop file.
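
As a rough sketch of what such post-install logic could do for a single user, assuming the deb installs its entry at /usr/share/applications/google-chrome.desktop with an Exec line pointing at /usr/bin/google-chrome-stable (both are assumptions about the package layout):

    # Copy the packaged entry into the per-user applications directory, then append the flag
    cp /usr/share/applications/google-chrome.desktop ~/.local/share/applications/
    sed -i 's|^Exec=/usr/bin/google-chrome-stable|& --ignore-gpu-blacklist|' \
        ~/.local/share/applications/google-chrome.desktop

The per-user copy takes precedence over the system one in most desktop environments, so it should survive package upgrades as long as Google doesn't change the Exec line.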


Nouveau is the joke.

It _still_ doesn't work with Pascal cards, which are 3 years old now, not to mention the newer Turing cards. It's only good for obsolete hardware from more than two generations ago. The first thing I have to do before installing a fresh Ubuntu or Fedora is look up how to blacklist nouveau, as the system won't even boot otherwise.

It’s time to deprecate it for good everywhere.


No, it is NVIDIA who is the joke; Nouveau doesn't support these cards purely because of NVIDIA policy and management. And they are the only ones who deserve the blame in this case, not Nouveau or Google.


AMD has their own set of quirks; they are not open sourcing their main driver either. Nouveau was always a stopgap measure, there to boot the system and make it somewhat usable until you install the official binary driver; so is amdgpu. Even if you are an open source enthusiast who wants nothing but FOSS on your system (and this is a very minor use case), these drivers do no better than your integrated Intel graphics, which has been integrated into the CPU directly since time immemorial.

Nouveau and amdgpu were always barely working, just enough to get something on the screen — no state of the art 3D acceleration, no CUDA/OpenCL. But there is a big difference between “barely working” and “black screen on boot, always”.


"Nouveau and amdgpu were always barely working" — nah, you are spreading misleading information and lumping together completely unrelated projects at different stages of development.

Nouveau was almost close to usable at some point... until Nvidia started the whole signed firmware nonsense. Then it quickly became intolerable due to Nvidia supplying no signed firmware, or heavily crippled firmware. These days you are actually better off using the software renderer/llvmpipe (which at least might benefit from a powerful processor). "amdgpu" is the name of the modern AMD kernel driver that replaced the "radeon" driver. Both drivers were developed with direct help from ATI/AMD. Unlike its older predecessor, current amdgpu versions include the "display core" code that was designed to be shared between the Linux and Windows AMD drivers (not sure if that part has worked out yet). All Linux drivers for currently produced AMD cards are open-source, both the kernel and userspace parts. AMD also has a "value-adding" package that is based on their own userspace driver (which is open-source) and simply adds a few closed-source components to it.

"amdgpu" is not barely working — AFAIK, it is the only currently available AMD kernel driver, and it worked great for me, both 2D and 3D.


I gave AMD a try with a couple of different cards and couldn't get them working properly. One was in a dual-GPU setup: AMD failed, Nvidia worked. Then I needed hardware-accelerated HEVC encoding and bought an RX 560; it works great on Windows, but AMD isn't playing fair with AMF on Linux, so once again I was failed by AMD, and I didn't want to install the proprietary Nvidia drivers to use VDPAU, so that's a stalemate. Then I tried the AMD card in a split virtual monitor setup, and once again AMD failed me: I couldn't use both DVI & DisplayPort at the same time. Now I run a 1060 with nouveau and some quirks, the best experience so far. Oh, and on Windows, my RX 560 can't properly get the resolution of the monitor behind an HDMI demux; the previous Nvidia card worked like a charm.

AMD 0 - Nvidia 3


How do you run a 1060 with nouveau? My 1080 doesn't even boot to anything except a black screen.


It just worked out of the box with the last Fedora.


amdgpu is not a stopgap driver. It's a driver mature and reliable enough to play video games, including a lot of Windows games via Proton. I don't see a single reason to use a proprietary OS or driver anymore.


As long as you don't have a funky setup, sure, but I've never been able to get AMD cards working properly for what I wanted to do (dual-GPU setup, AMF on Linux, and multi-virtual monitors, i.e. 2 simultaneous displays on 1 monitor, or even getting the proper resolution from a monitor behind an HDMI demux), both with the open source `amdgpu` AND the proprietary Linux drivers.


CUDA/OpenCL?


>We can also just take this, as yet-another nail in the nouveau coffin.

I don't understand the dramatics. Nouveau is great but being blacklisted on Chromium is not the biggest issue for it right now imo. A lot of Linux users (myself included) use Firefox anyways, and it's still a default in many operating systems. And you can of course override this. I would place lack of reclocking support much higher than this in terms of issues. And of course, the bugs that caused Chromium to decide to blacklist Nouveau in the first place are still going to affect other accelerated applications.

I'm a big fan of Nouveau; it is what I use on my laptop when in Linux. At the very least, it performs more than adequately for running a compositing desktop environment, and for many common configurations I've had few issues booting things up; the few issues I have had were generally solved by using a newer kernel.

If we really want Nouveau to succeed though, the support needs to start coming from Nvidia. AMD and Intel both have in the past supported open source drivers to an extent and it really shows. Nvidia's contributions have been pretty piss poor, and unfortunately it seems they just don't care, which is sad.

I guess I do understand the frustration, but given the fact that this is in response to Ubuntu shipping Nouveau as a default, I think Nouveau's death is a bit overstated.


It's weird to work on a project to support hardware when the manufacturer actively tries to thwart you with signed firmware. We need more effort going into open hardware imho.


This sucks. Chromium doesn't do accelerated video decode for Intel on Linux either. And now this. What's the AMD scene like?


The amdgpu drivers in Linux are fantastic these days. They work out of the box with most hardware. No proprietary binaries are needed, other than the firmware blobs. AMD is doing a really solid job of supporting their Linux user base.


AMD drivers have an OS-independent hardware abstraction library at the core and they "only" have to adapt it to different systems: https://github.com/GPUOpen-Drivers/pal


So does nVidia.


Except the AMD driver is 1) open source and 2) works.


Exactly my point. The fact that nVidia also has an OS-independent abstraction library shouldn't stop them from making it work, at least.


No multi-GPU support (at least 2 years ago, using the binary drivers), no AMF support on Linux, and couldn't get multi-display support on a split screen. AMD is fail for me...


Yes, but does Chromium play nice with acceleration?


For rendering yes. For video decoding no. Chromium needs a patch to use VA-API[1].

[1]: https://wiki.archlinux.org/index.php/Chromium#Hardware_video...
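
Before blaming the browser, it's worth checking what the driver exposes at all (a sketch, assuming the vainfo tool from libva-utils is installed):

    # List the VA-API profiles/entrypoints the driver advertises, filtered to common codecs
    vainfo | grep -iE 'h264|hevc|vp9'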


Seems to be. From about:gpu:

    Graphics Feature Status
    Canvas: Hardware accelerated
    Flash: Hardware accelerated
    Flash Stage3D: Hardware accelerated
    Flash Stage3D Baseline profile: Hardware accelerated
    Compositing: Hardware accelerated
    Multiple Raster Threads: Enabled
    Native GpuMemoryBuffers: Software only. Hardware acceleration disabled
    Out-of-process Rasterization: Disabled
    Hardware Protected Video Decode: Hardware accelerated
    Rasterization: Software only. Hardware acceleration disabled
    Skia Deferred Display List: Disabled
    Skia Renderer: Disabled
    Surface Control: Disabled
    Surface Synchronization: Enabled
    Video Decode: Hardware accelerated
    Viz Service Display Compositor: Disabled
    WebGL: Hardware accelerated
    WebGL2: Hardware accelerated


Which laptops have AMD chips? I've looked casually, but only seen Nvidia.


Nvidia also does an excellent job of supporting linux. It's just not open source.


Excellent is a stretch. It works for the things they support; look at Optimus and Wayland for instance. Also, they keep Nouveau from working by signing blobs and enforcing checks. And this is years after Linus flipping the bird and other repeated requests.


Optimus was the single most pain-in-the-ass thing I have ever experienced in Linux. Never again am I going to run Linux on an Optimus machine with Bumblebee.


Never again am I going to buy an Optimus machine.

EDIT: Wait, there's an open source thing? Is Bumblebee any good? Optimus is such a pain in my ass that it's making me want to smash this laptop.


They became famous on the net with this bug: https://github.com/MrMEEE/bumblebee-Old-and-abbandoned/issue...


I don't know, I upgraded to 18.10 and my XPS's brightness control doesn't work. The Prime switchy thing can't change cards without a reboot, and if the screen goes to sleep between switching cards and rebooting, it never comes up again. Actually, with 18.10, it never comes up again anyway. I have a whole host of issues with nVidia cards, my next card is going to be AMD.


18.10 isn't even supported by Dell.


That's not the point. The switching issue at least is present on 18.04 as well. Basically, Ubuntu changed the mechanism from bbswitch in 16.04 to a different one in 18.04. The 18.04 one is buggy and inferior at the moment.


Yeah, I spent an hour trying to get Bumblebee to work, to no avail. Alas.


Awesome, a laptop with one-year OS support...


Dell doesn't support non LTS (long term support) releases. The next LTS release of Ubuntu is 20.04.


What does this mean? What sort of support did I get from Dell for 18.04? I had to download and install everything myself.


Try Arch with the 4.20 kernel.


I might get angry enough to switch to Arch at some point, but I have too many Ubuntu-specific preferences and scripts to easily switch...


Their driver is really behind in terms of supporting newer libdrm features. It's also not well integrated in the ecosystem. They basically don't talk to the other Linux graphics devs.


Well, there is that whole GBM/EGLstreams thing with Wayland. Also their supported architectures list is basically x86 these days.


Chromium seems to do accelerated video decode with Intel GPU on Linux for me:

    Graphics Feature Status
    Canvas: Hardware accelerated
    Flash: Hardware accelerated
    Flash Stage3D: Hardware accelerated
    Flash Stage3D Baseline profile: Hardware accelerated
    Compositing: Hardware accelerated
    Multiple Raster Threads: Enabled
    Native GpuMemoryBuffers: Software only. Hardware acceleration disabled
    Out-of-process Rasterization: Disabled
    Hardware Protected Video Decode: Hardware accelerated
    Rasterization: Hardware accelerated
    Skia Deferred Display List: Disabled
    Skia Renderer: Disabled
    Surface Control: Disabled
    Surface Synchronization: Enabled
    Video Decode: Hardware accelerated
    Viz Service Display Compositor: Disabled
    WebGL: Hardware accelerated
    WebGL2: Hardware accelerated


You can get native GPU buffers by enabling the zero-copy rasterizer in flags and passing --enable-native-gpu-memory-buffers via chromium-flags or as an argument.

They work on ChromeOS the same way they do on desktop Linux and I've been using them for months without issue.
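
A one-off way to test that (a sketch using Chromium's zero-copy flag alongside the one above; adjust the binary name for your distro):

    # Enable zero-copy rasterization and native GPU memory buffers for this session only
    chromium --enable-zero-copy --enable-native-gpu-memory-buffers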


I have Chrome Version 65.0.3325.162 (Official Build) (64-bit) on 4.19.13-300.fc29.x86_64 with i915 graphics

00:02.0 VGA compatible controller [0300]: Intel Corporation Skylake GT2 [HD Graphics 520] [8086:1916] (rev 07) (prog-if 00 [VGA controller])

And chrome://gpu says Video Decode: Unavailable

And it points to this bug: https://bugs.chromium.org/p/chromium/issues/detail?id=137247


You need a VAAPI build of Chromium, it isn't on by default.

Arch has a PKGBUILD in the repo: https://aur.archlinux.org/packages/chromium-vaapi/ and I think there are builds for most popular distros out there nowadays.

If you are on Arch or one of its derivatives you can get a -bin version from the AUR, or use ArchlinuxCN, which builds it in their repo.
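
For example (a sketch, assuming you already use an AUR helper such as yay):

    # Build and install the VA-API-enabled Chromium package from the AUR
    yay -S chromium-vaapi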


The Chromium in Fedora's standard repo is built with it, and when I run it chrome://gpu does say video hardware acceleration is enabled. But at least with a handful of YouTube videos, it doesn't matter: it still uses 200%+ CPU, same as Chrome (no acceleration); the laptop still gets hot and the fans still run.


Are you on Wayland? That only works on X. See the comments section of https://fedoramagazine.org/chromium-on-fedora-finally-gets-v...


I haven't been following that bug, but a couple of things. Did you use ignore-gpu-blacklist? If you do, it overrides some stuff, but not everything. Also, the about:gpu page is not entirely accurate (particularly after ignoring the blacklist) from what I remember. I think you should play a video and go to chrome://media-internals to figure out which path it is using. There have also been patches to enable acceleration for Intel, but they weren't merged upstream. So if you got a patched version, or your distro did that for you, it could also work. Finally, it works on ChromeOS, which also uses Intel hardware, but that's because Google supports ChromeOS.


https://wiki.archlinux.org/index.php/chromium#Hardware_video...

You can get hardware video on Intel and AMD.


Only decode; there is no VA-API support for AMF, say to get hardware HEVC encoding.

