The linked title makes it sound as if Chrome were being unfair, but reading their reasoning it makes total sense. Key quote from said bug report:
> We thought about blacklisting nouveau driver long time ago, and decided to let more adventurous users to play around. Now that Ubuntu ships with nouveau on default, maybe it's time to blacklist it in Chrome.
> We received quite some bug reports on other rendering issues with Nouveau, and this bug is just one of them. So we will disable all GPU acceleration by default. If someone wants to bypass that, there are two options: 1) install proprietary NVidia drivers (see #24) 2) run Chrome with --ignore-gpu-blacklist, but be aware you are taking a risk here
> Unfortunately we don't have the resources to test every variation of every GPU/driver combination on linux, let alone investigate & fix bugs in drivers.
> We want a stable & secure browser first, a GPU-accelerated one second, only if possible.
> The default driver on Ubuntu LTS has severe issues, asking non-technical users to update their driver is just not acceptable as a prerequisite to use Chrome.
> If someone is interested in well-scoping the brokenness (version range and/or devices affected), we're happy to take a patch to the blacklist.
So the driver fails in some (common enough) cases. They let it run anyway, but now that it's the default on Ubuntu a lot of non-technical users are or will be affected, so they block it, while still letting you bypass the block if you want. And if newer versions fix it and someone can help them figure out which versions work for what, they are willing to narrow the blocking.
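For anyone who wants the bypass, it really is just a launch flag; a minimal sketch (binary names vary by distro and package, so treat these as examples):

    google-chrome --ignore-gpu-blacklist       # Google's .deb build
    chromium-browser --ignore-gpu-blacklist    # Ubuntu's Chromium package
    # chrome://gpu afterwards shows which features are actually hardware accelerated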
This is actually a reasonable position in my opinion, but there are two problems (neither of them is Google or Chromium's fault):
1) Nouveau is buggy and problematic (related more to performance and a lack of support for many features)
2) The Nvidia drivers are buggy and problematic (related more to setup and compatibility with the rest of the system)
Both of these problems can be traced back to the same root cause, which is simply that Nvidia refuses to do a good job of supporting Linux in any way. They could give Nouveau some funding and information about their hardware, but they don't do that. They could develop proper Linux drivers, but they don't do that either. Every way you look at it Nvidia is at the root of the problem.
The solution on the consumer side is to not buy Nvidia products, and to encourage other people to not buy Nvidia products. The solution on the producer side is to contribute to Nouveau (easier said than done unfortunately, your average web dev can't just jump in and reverse engineer graphics drivers on a weekend).
>which is simply that Nvidia refuses to do a good job of supporting Linux in any way.
I really dislike the way Nvidia makes it difficult for projects like Nouveau to reverse engineer an open driver; however, I don't understand what you mean by poor support for their Linux driver. It's probably the one thing I can't fault them for, as it performs well and is stable.
I'm with you on not buying NVidia products though, that is the best way to send a message.
YMMV I suppose. Some people seem to have good experiences with their driver, but mine have been terrible. I've used Ubuntu as my daily driver for 7 years now and Nvidia's driver has been the #1 source of crashes and configuration problems: half the time I can't even get 3D accel working with it turned on; I spend hours trying different versions that some forum reports as fixing a given problem, but they don't, and I can't figure out why things aren't working; upgrading from 16.04 to 18.04 borked the whole system and I had to switch to Nouveau. The list of Nvidia driver pain goes on forever, and this is across multiple installs of Ubuntu.
All my experience has been on laptops and there are lots of reports of Optimus related problems on Linux so there's that. I gave up on the hope of having Optimus actually work years ago. I would like to just be able to enable the latest version of Nvidia's driver and see my laptop start using the discrete gpu. That would be a big step forward from the past 7 years.
You should give the SGFXI[1] script a try. It's a bit of a hassle to use video drivers from outside the apt repos, but that script basically always manages to set up a working driver for me.
One other factor: on 18.04, I have found it nearly impossible to use the NVidia driver. After installing, whatever I do, nouveau takes over. This was not the case on previous Ubuntu versions. Even putting nouveau on the blacklist does not solve the problem: even when I add it to modprobe's blacklist and to the kernel arguments, it somehow manages to start up. No idea how or why. I solved it before, eventually, but I can't quite remember how, and since re-installing my operating system I haven't figured it out again. I think I eventually had to actually move/delete the nouveau .ko files, which is a very destructive way to deal with a problem and interferes with apt, and even then they got re-built and re-installed by DKMS and took over the display again! I did eventually find a permanent solution, but it escapes me at the moment.
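For reference, the conventional recipe (roughly what I mean by modprobe's blacklist plus kernel arguments; the file names are just the usual convention):

    echo "blacklist nouveau"         | sudo tee    /etc/modprobe.d/blacklist-nouveau.conf
    echo "options nouveau modeset=0" | sudo tee -a /etc/modprobe.d/blacklist-nouveau.conf
    sudo update-initramfs -u    # rebuild the initramfs so the blacklist applies at boot
    # plus nouveau.modeset=0 on the kernel command line, then reboot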
Anyways, it would be good if NVidia and Ubuntu got together and at least made it easier to make use of the official driver.
> 2) The Nvidia drivers are buggy and problematic (related more to setup and compatibility with the rest of the system)
Aren't scientific server farms running on the Linux kernel with Nvidia cards? They would not be used if the drivers were that buggy, so I think that claim needs more precision.
A lab or institution claiming a publishable result based on discounted hardware from the manufacturer running code that does who knows what for their TensorFlow/whatever calculations is engaged in something other than science. I wouldn't go quite so far as to say that they're just engaged in advertising demonstrations, but it's coming close.
Let's step back from the brink. Plenty of good science is done on proprietary foundations. It would be unrealistic to do otherwise. Reproducibility doesn't necessarily mean every component of the system is completely open and inspectable.
> Reproducibility doesn't necessarily mean every component of the system is completely open and inspectable.
If it's not "open and inspectable" then it cannot be reproduced by another group/researcher. It's just unsupported claims building on a wonky methodology.
I wonder what percentage of "Machine Learning" results are based on such shaky foundations and when they'll receive the same amount of skepticism and scorn we see heaped (deservedly) on the field of psychology.
> If it's not "open and inspectable" then it cannot be reproduced by another group/researcher.
That doesn't follow. They can (attempt to) reproduce with the same hardware, firmware, and drivers. They can reproduce with similar hardware from another vendor (AMD). If you're using OpenCL, you can reproduce on CPUs, albeit much slower and probably only for partial results. Maybe you want to break out some pen and paper to verify a few calculations of the CPU for even more partial results because you don't trust Intel's firmware blobs, even though you've presumably cross checked against AMD hardware, but you've gotten quite silly at that point.
Maybe some of these changes won't give you bitwise identical results - but then again, neither does most science. You check if the broad results and patterns are statistically similar. Maybe they are, and the results survive a large variety of similar but different test setups. Maybe they don't, and you discover an uncontrolled variable, like the exact chemical composition of the surface of your glass beakers being contaminated by impurities, or your AI being sensitive to the exact floating point rounding behavior of your GPU.
If anything, you want to see if your results survive variations in the uncontrolled variables, to make sure you've properly determined what factors are controlling the results.
If you cannot reproduce then you do not know whether it is due to the equipment/materiel varying or not and you have severely reduced options for investigating that.
If you look at the problems with even reproducing and testing the bugs that the specific situation under discussion exposes it is illustrative of the difficulties involved with working with a non-modifiable, unexaminable platform.
In your example, the researchers would be forbidden by IP laws from examining and testing the surface of the beakers for contaminants and publishing a description which allowed other researchers to make similar modifications.
It's the most sub-optimal situation in which to obtain something which starts to approximate reality. Not impossible, just difficult, and likely to lead to shenanigans.
The cards are not the issue. Drivers can be problematic with certain workloads (like rendering UIs) and not others (like loading and running non-graphics programs). It's also a lot easier to debug and deploy a fix to a farm of identical servers with identical drivers than it is to be compatible across a huge breadth of configurations.
>> Unfortunately we don't have the resources to test every variation of every GPU/driver combination on linux
So Google doesn't have the resources to test the default driver on the most widespread linux distribution for one of the two most widespread GPU producers in the market.
No, in fact one of the bug comments refutes this sarcasm:
> The default driver on the distribution we support is broken.
They do test the default driver on the distributions they support. The tests show that the driver is unacceptably unstable. It may work better on other distros/versions, but they don't have time to test those. They're happy for others to help out in testing these unsupported combinations, however:
> Again, if someone wants to spend the time to test thoroughly to narrow down the blacklist, we will accept patches.
Not for a free product, no. It's not our place to decide for Google what they want to do. If we want our open source software to work we should make it work. You can rebuild chromium to change the blacklist, or just use the command line to tell it to ignore the blacklist. It's not like Google doesn't provide options.
And in any case the reason nouveau sucks (and yes, folks, nouveau sucks) isn't Google's fault to begin with.
You seriously want to complain about Google's lack of support for an open source driver that barely exists because of NVIDIA's lack of support?
It's free in the sense that you don't pay for it with money, but considering how much Google pays Apple to make Google the default search engine, I would say the Google browser is a big money maker.
No, Chromium is free in the GNU sense, which is what I meant. You don't whine at open source providers to support features you want, it's bad form. Go fix it yourself, or use hardware that doesn't require this kind of hacky reverse engineering.
It's fairly uncommon for open source software to blacklist components that (attempt to) implement a compatible interface. Usually selecting an appropriate hardware driver is left in the hands of the distribution or administrator.
Yes, and the distributor or administrator is free to rebuild the browser or just set the flag to disable blacklisting. It’s only default behavior that changed.
Yes, it's different. You can build trademark Firefox so long as you do not modify it at all. You cannot build Chrome from the chromium sources — there are components which are not present in the open source repository. It's not just the trademark, unlike Firefox vs Iceweasel.
> You don't whine at open source providers to support features you want, it's bad form. Go fix it yourself
This just about sums up the problem with open source. The response is always "if you don't like it, fix it yourself". Because everybody is a developer with copious free time to learn how to fix the problem in their favorite product. It's not like cloning and building Chromium takes hours. Or like you would lose anything by using Chromium instead of Chrome.
> Because everybody is a developer with copious free time to learn how to fix the problem in their favorite product.
But that's the point. The response always works because it's proportional to the problem. For problems that are legitimately small and you could fix yourself, or pay someone to do it for an amount of money that a normal person could feasibly pay, it's a valid way to solve the problem.
And for problems bigger than that, it invites the user to consider what they're really asking for and who they're asking for it. Actually fixing nVidia's dumpster fire would be a huge ordeal for anyone other than nVidia. Asking a third party to do it without documentation... let's just say there is a reason nouveau is in the state that it's in, and it's not a general lack of interest in fixing it.
So in that case "go fix it yourself" means "if you think it's so easy then let's see you do it."
I don't see what the issue here is. If you don't have the time to do it then get someone else to do it. If they won't do it for free then you pay them. The unfortunate part is that if you regularly buy brand new Nvidia cards then the cost of the proprietary driver is built into the price of the card, so it sucks that you would have to pay twice for drivers. But this clearly is a result of Nvidia's decision; they are the ones who are forcing you to take personal responsibility for getting your own libre drivers working, not the developers working on said libre (and gratis) drivers.
Sorry for the confusion, I see mentioning Chromium by name made my comment a bit misleading. I was responding to the parent comment (not to the original post) and using Chromium as an example of how it's unrealistic to ask users to fix things themselves when using OSS. I was not trying to comment on Google's decision on this particular issue here (though I have mild thoughts on that too).
> Because everybody is a developer with copious free time to learn how to fix the problem in their favorite product
It takes copious free time to learn how to run chromium with an --ignore-gpu-blacklist argument? What kind of strawman are you trying here?
Look. The open source NVIDIA driver is terrible, and Google doesn't want it messing up the experience of the people using its product. Who are you to demand that Google fix someone else's terrible driver (which is terrible through no fault of Google's -- NVIDIA would prefer it not exist at all)?
If you really cared about this at all, you'd be upset with NVIDIA. But you're not. You apparently don't even like open source software, which presumably means you're using the binary drivers and unaffected, right?
I mean, just because it's one of two GPU producers, that doesn't mean the nvidia product line is simple. There are desktop chips, there are notebook chips, there are multiple versions of the same card made by different manufacturers who add their own little touches, there are different chipsets, there are different configurations, there are different X11 setups, there are different desktop environments which may enforce different settings, etc.
It's not as simple as "one of the two most widespread GPU producers in the market".
Obviously it’s not that Google doesn’t have the resources. They have enough resources to fund whole new OSes. Of course they could fund testing.
The Chrome team, on the other hand, does not have the resources. Which is to say that this is not the thing they believe is most important to work on at the margin, given the number of engineers Google will fund for the project.
Which, to be clear, seems entirely justifiable because software rendering exists.
If Nvidia wants their cards to work well on an all-free-software stack, they can provide a free software driver. If users care about making an unofficial driver work, they can do the work or fund the work. Chrome is just trying to get web pages on the screen and it doesn't seem like it's worth the boil-the-ocean level of effort of taking responsibility for implementing a reverse-engineered graphics driver when they can just get web pages on the screen using software rendering.
That's the part that I don't understand. The bug report says on Linux the same rendering is used for the UI as well as the page content... but only the UI was ever having a problem?
Something doesn't add up to me there, and no one that I saw ever addressed that in more detail.
If the chrome team doesn't have the resources for testing the most common Linux configurations, it's not a good sign. Linux market share is low but it's a useful signifier.
you're the only one saying they don't. they are saying the most common linux configuration causes issues, so they blacklist the buggy driver in it.
they are also saying that the driver may work fine on some less common linux configurations, and the chrome team does not have the resources to test those.
you are literally saying they are saying the opposite of what they said, then blaming them for it.
They were using "not enough resources" to support the idea of blacklisting. If they meant it as you say, that means they're blanketly blacklisting the most common config despite having the resources to test it, which is even worse.
I'm not sure what is unclear to you here. They do support the idea of blacklisting for every possible unpopular config, despite it possibly working. This is not only common sense, it is what every company does. When you write your software, do you test it on every possible distribution users could use, or just the popular ones?
The common distribution was blacklisted not because it was not tested, but because it clearly has issues observed and reported by the users.
If your software doesn't work in a popular config they don't blacklist it in all the untested less popular configs? You are clear they are blacklisting the GPU driver, not blacklisting a linux install - right?
If a driver does not work in popular configs, and needs to be blacklisted in those configs, your solution is to still whitelist it in other configs and have the browser likely not work.
yes, "every company" does exactly what google did. you are being purposely dense, trying to catch a "technically correct" chance of not admitting the stupidity of your question. The problem is, you are not even technically correct - just purposely dense.
Huh? It appears they did test it, and for that matter (according to the linked post) so did the Nouveau devs who also found it failed the easily accessible test suite.
We agree Nouveau is completely blacklisted right? That's the main problem here.
And the test suite failed on a minor issue where the cause was uncertain, which gets in the way of checking for the bigger definitely-the-driver problem.
That's the reason I have WebGL disabled even though I'm on Windows with a supported driver. It opens a gaping security/DoS hole without any benefit (do you ever see WebGL used on any website?)
Unfortunately, 'the most widespread Linux distribution' is rounding error in the overall scheme of things from a desktop standpoint so it's actually pretty good that they support it as well as they do. Adding Nouveau to the mix is rounding error on the rounding error. So from Google's standpoint, sure they don't have the resources to test it relative to the number of impacted users. Looking at it another way: it's not like Apple or Microsoft will (ever) provide better support so they can get away with it.
I solved the Chrome/Chromium on Linux problem long ago by switching to Firefox. I'll still fire up Chromium for Google sites and a handful of other sites, but for the most part Firefox has been getting it done for me.
They don't have the resources to "fix" it, they tested it and decided it was not stable enough. What they did was right. Firefox entirely disables GPU acceleration, so I am not sure what your argument is.
Exactly. Nvidia have been relentlessly hostile to Free Software for ever. No one who wants an easy life or a stable, upgradeable operating system would buy an Nvidia card.
> most widespread linux distribution for one of the two most widespread GPU producers in the market
That's still less than a percent of the market I think. Maybe shave off another order of magnitude. So the economic argument is not the one to make. However, it sucks nonetheless.
It's still probably hundreds of thousands of users. I hate arguments about "but Linux is only a fraction of the market!" because the total OS / browser usage market is in the billions of people. Even small fractions of that pie could found their own countries and be of reasonable size for a nation.
>found their own countries and be of reasonable size for a nation.
Honestly, it's more likely you could fit the number of people using nouveau with Chromium into a stadium. From the small pie of Linux users, you drill down to nvidia, and then further down to nouveau. Laptop users, even with nvidia cards, will only use the Intel GPU unless they explicitly enable discrete mode without installing the nvidia drivers. Desktop users obviously only buy nvidia if they want to use the binary driver. What do you think remains of the pie? I haven't met a single person in real life who uses this stack. I love nouveau. That doesn't change the numbers though.
If the market share were compelling enough, couldn’t some other maintainer step up and do this work? OSS doesn’t necessarily mean support for everything under the sun at all costs; Chromium may be open source but that doesn’t mean they have carte blanche from Google to test every possible combination of distro and GPU.
This kind of problem specifically doesn't map to "user finds problem, puts money where mouth is on bug bounty to fix it", because actually figuring out that your Blink-based browser isn't hardware accelerated is a nontrivial task. Most won't even realize it until they go "oh wow, why's it so much faster on Windows?", and then they won't even know where to start looking to find out why.
I'm pretty sure Mozilla just don't care about Linux these days. It's been getting steadily worse, and 64.0 is almost unusable for me due to a bug that has nothing to do with graphics which hangs the whole thing for minutes at a time.
>Now that Ubuntu ships with nouveau on default, maybe it's time to blacklist it in Chrome.
I don't get this. All Linux distros under the sun ship with nouveau, and always have. It is a kernel driver after all. What else are they going to ship?
I think their point is that by default, even with the "use recommended drivers" option enabled, Ubuntu uses the nouveau driver by default instead of installing and using the NVIDIA driver.
Well, installing recommended drivers on Ubuntu installs the Intel or AMD microcode, which is definitely proprietary, so I don't think that's a particularly reasonable argument.
The NVIDIA driver is the driver that yields the best performance and experience; AFAICT the only reason it's not shipped by default is because the kernel is then considered "impure" or something (because source isn't available to debug issues in the NVIDIA driver), hence bugs can't be filed upstream for any issues in the kernel if the NVIDIA driver is installed.
However, none of this is really the concern of the Chrome team. Their responsibility is to give the user the best experience, which they have defined as security, stability, and performance, in that order. Since the nouveau driver is causing stability bugs, it's perfectly reasonable for them to fall back to software rendering and take the performance hit in order to preserve stability.
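For what it's worth, Ubuntu does ship tooling that makes installing the recommended proprietary driver nearly a one-liner after install; a sketch (the package name is an 18.04-era example):

    ubuntu-drivers devices            # lists the detected GPU and the recommended driver package
    sudo ubuntu-drivers autoinstall   # installs it, e.g. nvidia-driver-390 on 18.04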
Including Nvidia's binary driver by default. Obvs Debian, gNewSense, and GUIX can't do that, but since when have corps cared about them?
Like, don't get me wrong, I'm a FOSS or GTFO kind of guy, running Debian and Replicant, but shipping proprietary drivers by default is what chrome is hinting at obviously.
AFAIU, Nvidia's legal argument is that their driver can't be 'derived from' the Linux kernel even though it's linked into the kernel, since it's older than their Linux port and is only linked against their GPLed (and, if push comes to shove, dual-licensed GPL and proprietary) kernel abstraction layer. Judges don't care about linkers, and the GPL doesn't actually say anything about linking.
This is not the derivation argument (that would be forcing GPL on Nvidia, which they want to counter, of course).
This is about GPL infringement; you can combine GPL-licensed code with differently licensed code, but only in private, without distributing the result to others. Once you distribute the result, you are breaking the GPL and thus have no right to distribute.
The GPLv2 literally defines itself in relation to trees of derived works:
> The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language.
The argument that I've seen is that even when linked together, the Nvidia binary driver isn't a 'derivative under copyright law' because it doesn't have Linux-specific entry points and is older than their Linux port, and therefore isn't part of "The 'Program'" as defined in the GPL.
Your argument is about why some piece of software should (or should not) be relicensed under the GPL.
My argument starts from the fact that that piece of software is currently not GPL licensed; it avoids the relicensing topic and explores how to solve the current situation with respect to distribution of the combined work. Use by the end user is a non-issue, due to freedom 0.
This isn’t correct. A WiFi router ships with GPL Linux and proprietary router software. The company / foundation creating the distribution just needs to have a license for the proprietary bits.
No, that's not entirely true either. Depending on where the proprietary bits are, it's the other side that is the problem: it is a violation of the GPL. So the usual workaround is to shift the responsibility to the user, so that the non-compliant combining is done by the user, by way of a script that runs on the user's behalf.
If the wifi routers are shipping with binary kernel modules, then yes, they are breaking the GPL. This problem is actually common with embedded devices.
First, if you're talking about the lawsuits over wireless routers, that wasn't Conservancy, that was the Software Freedom Law Center (SFLC) working with the FSF. (And some feel those lawsuits were poorly timed, preempting active negotiations in progress and being largely responsible for Cisco ceasing large Linux-based projects in progress.)
And second, to answer your question, partly because Linux is sufficiently good and has wide enough hardware support that it's still easier to use Linux legally under the GPL than to use another operating system, and partly because if you're going to use another operating system with narrower hardware support you might as well use a tinier embedded one and ship cheaper hardware (which is what Linksys/Cisco did with the WRT54G, shipping VxWorks and less RAM, and then selling the more-Linux-capable WRT54GL as a higher-end product).
Beyond the whole FOSS aspect, there are good technical reasons too. For instance, the blob won't work with Wayland (at least right now, AFAIK), and Ubuntu was (is) debating switching to Wayland as the default.
If Nouveau isn't working all that well, then clearly a switch to a new graphics stack that would require it is waaay premature, given how prevalent NVIDIA graphics hardware is.
I had a handful of laptops at work that were briefly on 17.10 (because 16.04 had a showstopping bug for them) and the graphics were a complete headache until I switched them to X. Luckily 18.04 reverted to X and made my life easier.
Sort of. It falls back to X for drivers that don't support Wayland, which has included nvidia for a long time. (Maybe there is some progress there more recently.) And the proprietary nvidia driver is widely used.
I think what they mean is "it's now enabled and used by default even for the click-click-click-through-the-menu (non-technical) users", not that the driver is technically present.
How does llvmpipe do these days? I remember being excited about it ~5 years ago and I can only imagine both LLVM performance and CPU speeds have gone up.
But that's the part that's crashing. I'm suggesting Chrome use llvmpipe to do whatever they needed hardware acceleration for, generate a single boring texture on a single big rectangle, and pass that through to the graphics driver. That way Chrome itself isn't loading the Nouveau libGL and is much more insulated from incompatibilities between Chrome and Nouveau, whosever fault it might be.
That's why you go through llvmpipe - you use the Mesa OpenGL implementation in LLVM-JIT-accelerated but pure software mode to implement WebGL, get the results onto a texture, and pass that texture on to your actual graphics card.
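For anyone who wants to try something similar by hand, Mesa already lets you force the software rasterizer per process with an environment variable; a rough sketch (assumes a Mesa libGL; glxinfo comes from the mesa-utils package):

    LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep "OpenGL renderer"   # should report llvmpipe
    LIBGL_ALWAYS_SOFTWARE=1 chromium-browser                   # run the browser entirely on the software GL path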
That's not the point. The point is that it results in brokenness, and the easiest way to mitigate that for the bulk of non-technical users is to use software rendering instead (i.e. blacklist it). They're open to more refined solutions, but these will take some effort.
Chrome without acceleration ~~ Firefox without acceleration. I don't think Firefox is enabling hardware video decoding in Linux, even on a fully-free driver?
Acceleration in browser is not only about video decoding.
The composition is GPU accelerated too. Chrome has it enabled by default, Firefox doesn't.
In the end, both browsers treat Linux as a second-class citizen. To make things even weirder, ChromeOS uses the same API (VA-API) and the same driver (for Intel GPUs) for video decoding acceleration that stock Linux does, and video decoding is supported on ChromeOS but not on Linux... And Firefox is just a plain mess.
One other problem with this is that now, if you want to run Chrome with --ignore-gpu-blacklist (or any other flag), you have to do that for every one of those bloated Electron apps that you have to run. There's no way to set a system-wide flag.
I'm facing this problem and it's very annoying. I have to run Chrome with --disable-gpu-driver-bug-workarounds, which ironically makes it work perfectly---with the "driver bug workarounds", it's terribly broken. But now, any Electron app I run (e.g. VSCode, Postman, Discord, etc.) is also broken by default and needs the flag as well. Frustrating.
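The least-bad workaround I've found is passing the flag to each app individually; a sketch, assuming the app forwards unknown switches to Chromium as most Electron apps do (binary names are just examples):

    code --disable-gpu-driver-bug-workarounds       # VS Code
    discord --disable-gpu-driver-bug-workarounds    # Discord
    # or bake the flag into each app's launcher by editing the Exec= line of its .desktop file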
Because it shows that there are different opinions about what a user can reasonably expect, the key quote for me is the following:
> Please remove the blacklist as all Nouveau users expect this corrupted behaviour. If at all possible it may be worthwhile to let users know it doesn’t have to be this bad if they flip to the binary driver. Perhaps a pop up letting them know?
That cost assessment is bizarre. Consider how many variants there are in the entire Google software library; somehow I doubt they're spending trillions on testing.
The bugginess of the nouveau driver has been an accepted fact since time immemorial. Hence the reason for Torvalds' middle finger on the subject. Nvidia is slowly coming around but the arc of history isn't treating it well.
The author, Ilia, noticed that this is on HN and responded in the mailing list thread. I don't know if I agree with the take, but I'll repost because I found it interesting.
---
I've glanced at the HN discussion about this situation
[...] and it does seem like
people are focusing on the wrong thing... the important bit isn't that
nouveau crashes and burns in some situations—everyone already knew
that, including the users of nouveau who continue to use it
nonetheless. It's that if every piece of software feels free to ignore
a system integrator's or user's wishes, then the user now has to know
how to override that behaviour separately in every application. The
situation is that Distro X has decided that nouveau is the right thing
for its users. A user can disable that by uninstalling or otherwise
disabling nouveau if they wish. But now chrome comes along with its
own set of rules. What if every application starts doing that?
It should also be noted that outside of a few pathological cases, like
creating 2GB+ textures which never happens in practice, nouveau works
just fine for me. For other people, it dies at random intervals,
irrespective of whether they're using chrome or not. While this is a
non-ideal scenario, chrome shouldn't be in the business of worrying
about things like that. It just confuses the situation for everyone.
> every piece of software feels free to ignore a system integrator
Yes, they do, and they have very good reasons for it. Working with distros to ensure they ship the right libraries takes an infinite amount of effort, so companies that want to ship a product to their users will work around it.
This also happens in the server space with the massive popularity of containers and language-specific package managers.
The traditional Linux Distro model is extremely flawed.
Nouveau are doing their best to reverse-engineer cards. NVIDIA doesn't help at all, not providing any docs and not interacting with the rest of the community. On the other hand, Intel and AMD have high-quality drivers and are part of the open-source community.
You can also regularly see posts by NVIDIA employees on the nouveau mailing list.
Sure, they are contributing nowhere near as much as AMD or Intel, but to say they are not interacting at all or not providing any docs or not helping at all is clearly false.
You get 3x better performance with a GTX 780 Ti than a GTX 980 Ti. From reading the explanation from a Nouveau developer in the comments, I get the impression that NVIDIA just wants the open source drivers to work well enough that you can boot your computer and download the proprietary drivers.
I think the biggest missing piece currently is signed PMU firmware for GM20x to allow reclocking, which NVIDIA has not released and does not seem to currently be planning to (even though they have released some other GM20x firmware).
Ya. It's too bad their hardware isn't more competitive. I am definitely rooting for AMD to wow us next generation like they did in the CPU market.
Things are getting far better for OSS GPU drivers. If you wanted any 3D acceleration in the past, your only choice was the proprietary Nvidia driver or a broken proprietary ATI driver. (The ATI Windows drivers were pretty buggy too during this period.)
Back then the proprietary Nvidia driver kept much better pace with their Windows offering. Last I checked, their proprietary Linux driver was almost a year behind (point-release wise) the Windows offering.
> I am definitely rooting for AMD to wow us next generation like they did in the CPU market.
Don't set your expectations too high. Their next generation of consumer graphics cards, rumored to be announced at CES next week, are reportedly still going to be based on a new revision of their current GCN architecture (which originally debuted in 2011 and is very much showing its age these days). Rumored performance for the "high-end" model will be somewhere around the GTX 1080 or RTX 2070 (though reportedly at a much lower price compared to either of those cards).
If you're waiting for them to produce an all-new GPU architecture the rumors are that they are working on one to be launched in the 2020-2021 timeframe. That's also when Intel is rumored to be preparing to launch their own dedicated GPUs so hopefully we will be going from 0 properly high-end GPUs with open-source drivers to 2 by 2021.
I got an amd fury x a while back because the nvidia binary drivers regressed severely with my old card, and I wanted 4K anyway.
It’s by far the best Linux GPU I’ve ever encountered, and that’s with open source drivers that don’t set the kernel taint bit.
Its performance in benchmarks was much better than the comparably priced NVIDIA. It is also the quietest gamer GPU I’ve encountered (I bought one with slightly nicer fans).
Maybe things have changed (but the 1080 was certainly out back then), but this card wins on every metric other than CUDA support (nonexistent) and absolute performance (close enough to fastest, and certainly overkill for my Linux Steam collection at 4K).
I'm not saying that AMD's current cards are slouches (I'm planning to buy one myself this year when I build my next computer), but they certainly do not meet the performance bar set by Nvidia's current high-end models (the RTX 2080 Ti and Titan RTX), which is what I mean by "properly high-end". Those are admittedly niche products due to their high pricing, but it is important for AMD to have a competitive product in that segment, since many people don't do a lot of research: if all they know about GPUs is "Nvidia has the fastest cards" (or an equally uninformed salesperson tells them that) and they buy a 1060 over a 580 based on that "knowledge", then AMD has lost a sale.
> they certainly do not meet the performance bar set by Nvidia's current high-end models
Assuming you can't tell the difference between, say, 100fps and 150fps, isn't the 'performance bar' a bit arbitrary after a certain point if the hardware runs the vast majority of the software thrown at it?
Objectively you are correct. However, as the rest of my comment indicates there are important marketing and brand-value perception reasons to have a true high-end card: how many professional gamers (streamers, e-sports, extreme overclockers, etc.) who do buy this class of GPU are there playing with AMD cards vs Nvidia cards? How many (tens/hundreds of) millions of views do they collectively get playing games with Nvidia hardware? Nvidia has likely gotten millions of dollars in effectively free marketing just for having these cards.
AMD's GPU market share has been shrinking with the general gaming audience despite the fact that their mid-range cards have been very price-competitive with Nvidia's offerings during the same time period they've lacked serious competition at the high-end. While focusing on the mid-range where the highest volume of GPUs are sold is the rational market strategy it may not be a winning market strategy in the real world.
> will be somewhere around the GTX 1080 or RTX 2070
Aren't those still adequate to run like, 99.9% of all games on the market? I would (and do) gladly pay for hardware that isn't the absolute fastest if the manufacturer employs a Linux driver team to make that hardware work on Linux.
> Back then the proprietary Nvidia driver kept much better pace with their Windows offering. Last I checked, their proprietary Linux driver was almost a year behind (point-release wise) the Windows offering.
Am I reading this right? Nvidia's drivers were never out of sync for a year with regard to the supported GPUs, OpenGL and Vulkan features. It's easy to see that by looking at the Vulkan beta drivers page at https://developer.nvidia.com/vulkan-driver. (That's just a convenient example; non-beta drivers follow the same cadence, but there's no single page I can link.)
AMD GPU hardware is definitely competitive, they just aren't top of the line. At the mid range, they are very competitive, and that's what most people need anyway.
My next GPU will be an AMD GPU, as soon as my GTX 960 stops being sufficient (and no Wayland is certainly an issue).
In fact, when buying graphics cards I look much more at driver support than at pure FPS performance. I have a Dell XPS with NVIDIA graphics and an AMD RX460 in my desktop PC.
And in my experience, the NVIDIA driver landscape is just a mess. The state of Nouveau can be gathered from the article above, and the proprietary driver causes all kinds of weird issues (e.g. UI spinners rotating at different speeds, fans turning faster than they are supposed to, etc.). On the other hand, the (open source) AMD driver seems to have evolved quite well over the last few years.
As a result, I hate my NVIDIA card and like my AMD card quite much. Every time I see those steam survey results I wonder why there are still so many people buying NVIDIA cards for their Linux boxes. They probably just trust the benchmarks and don't compare the experience first hand.
In the last 6 years I've used a 7870, 290, and 580 as my desktops GPU. All have been stellar cards that have only gotten better over time.
Back when I got that 7870 circa 2012 I was taking a major risk - at the time the 6870 was the premiere foss card and radeonSI was brand new with growing pains. The 290 was another risk being the first major architecture update to GCN. But I was validated in both purchases with usable cards at first that within a few months became excellent.
I've played a lot of the major Linux AAA releases on these cards shortly after release - Borderlands 2, Civ 5, Metro Last Light / Redux, Tomb Raider, etc. There used to be quite a few bugs at release and glitches. Nowadays everything is pristine. And I get about twice the framerate in BL2 today than I did four years ago on that 290.
I'm almost certainly going to buy a Navi card next unless we have another crypto bomb ruin the market again.
AMD GPUs are definitely competitive unless you made a business out of computing things in GPU. For most daily tasks like graphics (games) or even moderate number crunching (OpenCL), AMD GPUs are perfectly usable and high value.
These days Vulkan can be used for compute as well, and it gives the user a far more refined interface than OpenCL. There's very little reason to limit oneself to the latter.
Any Vulkan library you can suggest? Last time I checked Vulkan, it required an enormous amount of boilerplate just to connect to the GPU, compile, send data, and run code. So I ended up using OpenCL for years. Unless there are good high-level libraries around, I would still stick to OpenCL.
I see it repeated all the time that AMD has great open source drivers, but their latest drivers on their 5xx line of cards couldn’t run my 4K display at 60 Hz over DisplayPort without splitting the screen into two separate X displays. Nvidia’s proprietary driver can handle it no problem.
Needless to say, I have an AMD card, with awesome open source drivers, sitting in a bin. While the horrid Nvidia binary blob actually drives my display.
That's a curious situation. I run my 4K displays over DisplayPort with an RX 580 on Ubuntu 18.04. However, I haven't gone out of my way to obtain drivers from AMD; I'm just using the drivers that ship with the kernel. It might be worth trying a vanilla installation of Ubuntu to see if you can get your setup working.
For those closer to the hardware with insight into the emerging architectures and market dynamics, how far away is AMD/Intel/IBM/Google from catching NVIDIA from a technology perspective, and does NVIDIA have a defensible competitive advantage now that market forces have moved well beyond gaming and shifted US strategy toward a spending spree of buying up all IP?
What’s the rationale for NVIDIA for this? As the whole point is to expose hardware features through an API, what do they gain by keeping it closed? Are there features implemented purely in software?
The most commonly used APIs for graphics (X11 with all the "accelerated" rendering, OpenGL) match the actual current GPU hardware so poorly that most of the driver bloat is about dealing with that mismatch in the most efficient and transparent way possible. The drivers also must contain a collection of compilers and assemblers and other stuff (because the OpenGL spec says so). So there is a lot of secret sauce type stuff in there that these companies don't want you to know about. Whether this is justified is a completely different question.
The code might expose the implementation of a lot of NVIDIA-specific features and libraries. Having that open, and possibly ported to other hardware, would not be in NVIDIA's interest.
Both projects seem to put the blame on the other project. IMO, there are at least two other factions that can be blamed equally well.
First, nvidia for not providing OSS drivers and second, the user who runs an OS on hardware that can't be properly supported on this OS at the moment.
For years I have had the rule that I prefer hardware that has proper Linux support. Going even a bit further, I try not to buy hardware that doesn't have OSS drivers unless there really is no other option (nowadays, that means mostly smartphones).
Ouch, that's mean. IMHO the only way to react to that is to close down shop. Ideally including Linux throwing out support for Nvidia cards.
I have a strong opinion on this, as I was involved a few years ago in the Kindle homebrew community. After a while I decided that my time could be used better than on a product by a company that tries to prevent us from making its product better and more interesting to its users.
There is so much energy and time wasted when you have to fight against the manufacturer, and in the case of a graphics card the best outcome is that it merely works as well as the other graphics card option that is fully supported OOTB. And that's just for a piece of hardware that you can replace, on average, for a few hundred bucks.
I've personally actively avoided buying anything with Nvidia hardware in it for 8 years now because of how awful their software business is. That has meant I've even skipped over tablets and phones using their chips and will never buy a Nintendo Switch because... well... firstly consoles are bullshit crippling perfectly good computers with proprietary OSes but secondly because it has an Nvidia chip in it.
I tell everyone I know not to buy Nvidia. I have built gaming computers both professionally and for family, and I've actively advised against buying Nvidia parts for them because of their business practices, and when I buy notebooks it's a real PITA to get one without some Nvidia GPU in it, but I do that too.
But Linux cannot "throw out" support for Nvidia. Nvidia's driver is wholly independent of Linux, which is why it's such a PITA to work with. It's a DKMS blob driver, and way too much other useful stuff depends on that interface to break it just to stop Nvidia. Fortunately nobody is going out of their way to support Nvidia either, such as their stupid attempt to fragment Wayland development with the eglstreams nonsense.
The most valuable thing any of us can do is constantly decry Nvidia for being the reprehensible company they are and use what influence we have over those we know to distance them from buying their products the same way you should distance friends and family from buying Nestle or Exxon products or should avoid shopping at Walmart.
I'm nowhere near as fanatical about it, since when it comes to corporations there is no black and white, but there are at least a few cases I can recall:
* Even a decade ago they did a lot of shady stuff when they initially separated the GeForce and Quadro series. There was a time when you could make a little hardware mod on a GeForce product that made the driver think you had a Quadro, and magically FP64 became twice as fast and CAD application performance doubled or tripled.
* Their OpenGL implementation has always been non-standard and allows both code and shaders to work in situations where they clearly should not. As a result, games and apps that were debugged on Nvidia hardware would never work on more conformant Intel / AMD drivers.
* As for Nouveau, back in 2014 Nvidia started to require signed firmware files [0]. They promised shortly after to provide Nouveau with the files under clear licensing, but it took years. When the files were finally released, everything related to power management was missing, and it still hasn't been released to this day, so Nouveau can't implement reclocking. On top of that, they intentionally obfuscate the way firmware files are stored within the driver binary and make the GPU load firmware via DMA to make extraction much harder.
* Sabotaging OpenCL in general by not implementing the spec for years and keeping it much slower than it's supposed to be.
* Several years ago Nvidia twice added checks to their Windows drivers to stop consumer GPU drivers from working when the GPU is passed through to a virtual machine. Both were bypassed by hiding the KVM identifier and changing the Hyper-V hv_vendor_id, but it was annoying [1].
* Nvidia forbade the use of consumer hardware drivers in datacenters [2].
* Nvidia tried to push a crazy anti-competitive program to make exclusive deals with GPU board vendors so they couldn't sell AMD GPUs in any kind of "gaming" product line [3].
Again, it's not like AMD is perfect, but that's a lot of shady stuff, and I likely haven't even listed half of it.
> There was a time when you could make a little hardware mod on a GeForce product that made the driver think you had a Quadro, and magically FP64 became twice as fast and CAD application performance doubled or tripled.
Do you think that overclocking a CPU is "magic", too? You were just running the hardware out of spec! Yes, FP64 and CAD were a lot faster, whatever; are you going to trust these faster-reached results to be always correct? Or that the card isn't going to break down in the midst of some important workload? The whole point of labeling something "Quadro" is to say "these cards are good for that sort of critical stuff - they've been extensively tested for reliability".
This is a reference to the baby milk scandal [1]. Nestlé would market their powdered baby milk in developing countries as being better than breast milk, even though it's not. Worse than that, they would give away samples to new mothers for free or at a substantially reduced price, who would then lose the ability to lactate because they're not actively doing it. After that they would be forced to buy the powder, and not all of them could afford it.
This was at its worst in the late 1970s, but it seems the practice hasn't entirely stopped even today.
If everyone in the FOSS world shows them the middle finger like Linus did, then maybe, just maybe, they will become a bit friendlier. But our only true hope is for this company to go bankrupt and out of the market.
Well, not quite. The charitable way to put it is that they are requiring signed firmware in hardware in order to help stem fraud by 3rd party resellers (where some card model X is sold to naïve buyers as a more desirable model Y, with a hacked firmware to match), and the nouveau project is obviously impacted.
> First, nvidia for not providing OSS drivers and second, the user who runs an OS on hardware that can't be properly supported on this OS at the moment.
nVidia does provide perfectly working closed source drivers on Linux though - how is it their fault that some 3rd party implementation of the drivers doesn't work in a stable manner?
Define "perfectly working". If I download the Linux sources, compile the kernel and install it, there is no official Nvidia driver to be seen.
I didn't say that Nvidia is at fault for the state of the nouveau driver. I said they are at fault for not providing OSS drivers. Nvidia doesn't want to play by the rules of Linux development, but they still want to be on Linux. That's legally possible, but why should the user or distro maintainer give Nvidia a pass here when other companies play by the rules?
Think about it. Which components would you be willing to download drivers for if they weren't OSS? Your monitor? Your sound card? Your harddisk? Your network card (oops)? Your mouse or keyboard?
If you download the current Windows ISO image and install that offline, you won't get a proper driver for your nVidia card either. Yet, this is accepted as normal.
Some Linux distributions that do not hamper themselves artificially on philosophical grounds offer nvidia binary drivers in their package repositories like they do with every other driver they contain. Your focus on the kernel sources is too narrow.
There absolutely was a time when you had to download all these drivers you mention separately - for Windows. And people put up with it. I think even today the Windows installer is forced to offer an option to install 3rd party RAID controller drivers into the installation environment...
Yeah, and Linux's insistence on not having a stable ABI for drivers is problematic for Nvidia, just as Nvidia's insistence on not open sourcing their drivers is problematic for Linux.
AMD cards for anything advanced are crap: they couldn't recognize one of my monitors (>2 monitor setup), and they're not playing fair with their AMF extension on Linux (it works great on Windows though). Even on Windows, I cannot get the card to properly detect my monitor resolutions (it's sitting behind a multiplexer, though there's no problem with nvidia).
Triple monitor on two physical screens, and no, AMD cards do not work, in this case using the DisplayPort output and a DVI->HDMI adapter; and yes, it works with a third physical monitor (not acceptable in my setup). Nvidia/nouveau do not show these problems.
NVidia comes out of this looking very bad. It's not as if a GPU can't have good Linux support (see AMD, Intel), but theirs doesn't. Although I haven't spent my hard-earned money on their hardware since 2005, I have several machines at work that are burdened with it. Some lock up with Nouveau drivers, others crash with NVidia drivers. I can only conclude that the hardware is unstable.
The problem from my perspective is that a major GPU vendor seems to think it's appropriate to paper over hardware problems with proprietary drivers, and they get away with it. While their reputation ought to suffer for that, I'd rather see them mend their ways. I'd love to take NVidia off my blacklist.
Uh no, it means that you can make it work and some people are willing to spend the time to do it right. You'd think that if Mozilla can afford to do this, so could Google. The only way the problems in the drivers can get fixed is through usage; if Chrome flat out refuses to work with the driver then the issues can't be fixed.
This also illustrates why it's important to have alternatives to Chrome for anybody who cares about open source.
I assess with high confidence that that was a satirical comment.
It could be working on FF only through a quirk in their rendering, rather than through testing. Chrome certainly has a test suite -- did anyone run it with nouveau and turn bugs into PRs? And arguably, that job belongs to the desktop owners as well as the nouveau and chromium teams. Browsers are insanely complicated, and who is paying for this work?
If I could make my employer pay for a meaningful number of Ubuntu desktop licences I would, but TBH I need it to work with the nVidia driver not nouveau.
Maybe it's working through a quirk, or maybe Firefox team writes better code that handles a wider range of drivers. I choose to use Firefox because the team behind it cares about my needs. If Google can't afford to make a browser that works with open source drivers, I will simply choose not to use it and encourage others to use Firefox instead. This is precisely why having alternatives to Chrome is important.
> hangs from the max-texture-size-equivalent test
I reported this bug back in 2015 and they have made no progress on it. It seems Nouveau just doesn't have the resources or manpower. I can't really blame Google here. You don't want the browser to be able to hard-lock the system (not even a Magic SysRq will reboot it).
> You don't want the browser to be able to hard-lock the system (not even a Magic SysRq will reboot it)
Wait, this isn't just rendering bugs a la "some WebGL things will show green instead of red", it kills your system entirely?
I thought the only effect this blacklisting had was making WebGL unavailable for everyone instead of buggy for a small minority. This just flipped my opinion.
Related, this might also explain the strange bluescreen bug my girlfriend had with Windows 8 + Firefox on her laptop. She switched to Google Chrome a few years ago because every time she used Firefox, her laptop would bluescreen within a few hours (either while Firefox was still running, or at some point later). I didn't know bugs outside of the OS could still do that kind of thing (I thought they started squashing such bugs with Windows 95 and finished somewhere around Vista) and wrote the Firefox thing off as triggering some extremely obscure Windows OS-level issue. Maybe I should check if there is a driver blacklisted by Chrome, though it seems more likely that Chrome, like all the other software running on that laptop, just doesn't happen to trigger it; it really is only Firefox.
Driver crashes can easily take down a system. That's why driver quality matters so much. This is true on every OS. (Although, Windows 7+ has mechanisms to help recover from GPU driver crashes specifically, which usually work.)
Also, Chrome can render WebGL on the CPU. It's slower but will still work.
Doesn't hardware accelerated graphics on Firefox on Linux require modifying a preference? AFAIK you need to go to about:config and enable layers.acceleration.force-enabled. (This is speaking as someone who uses Firefox on Linux as my main browser.)
Please don't encourage people to turn on WebRender without at least warning them that it is experimental, still under development, can cause your computer to blow up, etc. Also the environment variables you specified will force it on even on non-Nightly builds which is very undesirable since those builds are not getting fixes uplifted.
> One idea is to flip GL_VENDOR to some random string if chromium is running.
What? Don't do this - never do this. Regardless of whether or not you agree with Chromium's decision, it is their decision to make as the project maintainers; trying to trick the software with falsified data is overstepping your bounds, and both corrodes trust in your driver and leads to whitelists.
I thought we had moved past the days of fake user-agents.
> trying to trick the software with falsified data is overstepping your bounds.
Fwiw, I agree with this statement only because Chrome lets users override the default behavior.
In the hypothetical event that Chrome was closed-source and forcibly disabling acceleration under a driver, I'd consider fakery on the part of the driver to be perfectly warranted.
What we have is two sides accusing each other of overstepping their bounds: Chromium making decisions about drivers, which is normally the domain of the operating system, and the driver making decisions about what kinds of systems the browser will support.
The more rational behavior is for Chromium to let the operating system decide what software and drivers it wants to use, and to do what it wants with bug reports from operating systems it doesn't want to support.
Whitelists won't work, since all that will happen then is that projects will do what Internet Explorer did, i.e. call themselves "compatible" and use the GL_VENDOR string of a compatible driver.
> Chromium making decisions about drivers, which is normally the domain of the operating system
Chromium isn't loading their own graphics drivers. They are deciding to fall back to software rendering if they determine that the card/driver/OS combination does not properly support the API that enables them to use the hardware-acceleration features of the card to render. They're still rendering using the nouveau driver on the OS, they're just not using the APIs that they have determined to be not functioning properly. Detecting what the underlying graphics hardware and driver supports and falling back to what is functional is an incredibly common part of software using GPUs.
If you want to use hardware acceleration, then go to chrome://flags and search for "blacklist":
"Override software rendering list: Overrides the built-in software rendering list and enables GPU-acceleration on unsupported system configurations. – Mac, Windows, Linux, Chrome OS, Android"
Simple, and if you have telemetry logging on, you are possibly advertising that nouveau works for you.
The problem is that the Nouveau (and Ubuntu) devs have no way to overturn Google's decision in Chrome. Ubuntu can compile Chromium with a modified blacklist, but Google distributes Chrome in a deb. You would need some special logic to catch the Chrome .desktop on install and add in the unblacklist flag.
Ubuntu doesn't have a .desktop entry for Chrome. It comes with the deb. Desktop files are installed by the programs that use them, hence why I said you need to catch the Chrome install and modify the desktop file.
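For what it's worth, here is a rough sketch of what that catch-and-modify step could end up looking like. The path and Exec line below are assumptions about a typical Chrome deb install, not something taken from the bug report, so check your own system before copying it:

    # ~/.local/share/applications/google-chrome.desktop
    # A user-local copy of the system launcher; per the XDG spec it should take
    # precedence over the one in /usr/share/applications/ and survive package updates.
    [Desktop Entry]
    Name=Google Chrome
    Type=Application
    # --ignore-gpu-blacklist re-enables acceleration despite the blacklist;
    # you accept the crash risk described in the bug report.
    Exec=/usr/bin/google-chrome-stable --ignore-gpu-blacklist %U

A distro that wanted to ship this by default would still need post-install logic to regenerate the copy whenever the Chrome package changes its launcher.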
>We can also just take this, as yet-another nail in the nouveau coffin.
I don't understand the dramatics. Nouveau is great but being blacklisted on Chromium is not the biggest issue for it right now imo. A lot of Linux users (myself included) use Firefox anyways, and it's still a default in many operating systems. And you can of course override this. I would place lack of reclocking support much higher than this in terms of issues. And of course, the bugs that caused Chromium to decide to blacklist Nouveau in the first place are still going to affect other accelerated applications.
I'm a big fan of Nouveau; it is what I use on my laptop when in Linux. At the very least, it performs more than adequately for running a compositing desktop environment, and for many common configurations I've had few issues booting things up, with the few issues I have had generally solved by using a newer kernel.
If we really want Nouveau to succeed though, the support needs to start coming from Nvidia. AMD and Intel both have in the past supported open source drivers to an extent and it really shows. Nvidia's contributions have been pretty piss poor, and unfortunately it seems they just don't care, which is sad.
I guess I do understand the frustration, but given the fact that this is in response to Ubuntu shipping Nouveau as a default, I think Nouveau's death is a bit overstated.
It's weird to work on a project to support hardware when the manufacturer actively tries to thwart you with signed firmware. We need more efforts going into open hardware imho.
It _still_ doesn’t work with Pascal cards, which are 3 years old now, not to mention the newer Turing cards. It’s only good for obsolete hardware more than two generations old. The first thing I have to do before installing fresh Ubuntu or Fedora is looking up how to blacklist nouveau, as the system won’t even boot otherwise.
No, it is NVIDIA who is the joke; Nouveau doesn't support these cards purely because of NVIDIA policy and management. They are the only ones who deserve blame in this case, not Nouveau or Google.
AMD has their own set of quirks; they are not open sourcing their main driver either. Nouveau was always a stopgap measure, to boot the system and make it somewhat usable until you install the official binary driver; so is amdgpu. Even if you are an open source enthusiast who wants nothing but FOSS on your system (and this is a very minor use case), these drivers do no better than your integrated Intel graphics, which has been integrated directly into the CPU since time immemorial.
Nouveau and amdgpu were always barely working, just enough to get something on the screen — no state of the art 3D acceleration, no CUDA/OpenCL. But there is a big difference between “barely working” and “black screen on boot, always”.
"Nouveau and amdgpu were always barely working" — nah, you are spreading misleading information and lumping together completely unrelated projects at different stages of development.
Nouveau has been close to usable at some point... until Nvidia started the whole signed firmware nonsense. Then it quickly became intolerable due to Nvidia supplying no signed firmware, or heavily crippled firmware. These days you are actually better off using a software renderer/llvmpipe (which at least might benefit from a powerful processor).
"amdgpu" is a name of the modern AMD kernel driver, that replaced "radeon" driver. Both drivers were developed with direct help from ATI/AMD. Unlike it's older predecessor, current amdgpu versions include the "display core" code, that was designed to be shared between Linux and Windows AMD drivers (not sure, if that part has worked out yet). All Linux drivers for currently produced AMD cards are open-source, both kernel and userspace parts. AMD also has a "value-adding" package, that is based on their own userspace driver (which is open-source) and simply adds few closed-source components to it.
"amdgpu" is not barely working — AFAIK, it is the only currently available AMD kernel driver, and it worked great for me, both 2D and 3D.
I gave AMD a try with a couple of different cards and couldn't get them working properly. One was on a dual-GPU setup: AMD failed, nvidia worked. Then I needed hardware-accelerated HEVC encoding and bought a rv560, which works great on Windows, but AMD isn't playing fair with AMF on Linux, so once again failed by AMD, and I didn't want to install the proprietary nvidia drivers to use VDPAU, so that's a stalemate. Then I tried the AMD card on a split virtual monitor setup; once again AMD failed me, I couldn't use both DVI & DisplayPort at the same time. Now I run a 1060 with nouveau and some quirks, the best experience so far. Oh, and on Windows, my rv560 can't properly get the resolution of the monitor behind an HDMI demux; the previous nvidia card worked like a charm.
amdgpu is not a stopgap driver. It's a mature and reliable enough driver to play videogames, including a lot of Windows games via Proton. I don't see a single reason to use a proprietary OS or driver anymore.
As long as you don't have a funky setup, sure, but I've never been able to get AMD cards properly working for what I wanted to do (dual-GPU setup, AMF on Linux, and multi-virtual monitors, i.e. 2 simultaneous displays on 1 monitor, or even getting the proper resolution from a monitor behind an HDMI demux), both with the open source `amdgpu` AND the proprietary Linux drivers.
The amdgpu drivers in Linux are fantastic these days. Works out of the box with most hardware. No proprietary binaries needed, other than the firmware blobs. AMD is doing a really solid job of supporting their Linux user base.
AMD drivers have an OS-independent hardware abstraction library at the core and they "only" have to adapt it to different systems: https://github.com/GPUOpen-Drivers/pal
No multi-GPU support (at least 2 years ago, using the binary drivers), no AMF support on Linux, and couldn't get multi-display support on a split screen. AMD is fail for me...
Excellent is a stretch. It works for things they support. Look at optimus and wayland for instance. Also, they restrict nouveau from working by signing blobs and enforcing checks. And this is years after Linus flipping the bird and other repeated requests.
Optimus was the single most pain-in-the-ass thing I have ever experienced in Linux. Never again am I going to run Linux on an Optimus machine with Bumblebee.
I don't know, I upgraded to 18.10 and my XPS's brightness control doesn't work. The Prime switchy thing can't change cards without a reboot, and if the screen goes to sleep between switching cards and rebooting, it never comes up again. Actually, with 18.10, it never comes up again anyway. I have a whole host of issues with nVidia cards, my next card is going to be AMD.
That's not the point. The switching issue at least is present on 18.04 as well. Basically, Ubuntu changed the mechanism from bbswitch in 16.04 to a different one in 18.04. The 18.04 one is buggy and inferior at the moment.
Their driver is really behind in terms of supporting newer libdrm features. It's also not well integrated in the ecosystem. They basically don't talk to the other Linux graphics devs.
You can get native GPU buffers by enabling the zero-copy rasterizer in flags and passing --enable-native-gpu-memory-buffers via chromium-flags or as an argument.
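For anyone wanting to try that, a sketch of the two flags in a flag file; note that ~/.config/chromium-flags.conf is read by some distro launch wrappers (Arch's chromium package, for example), not by upstream Chromium itself, so passing the flags directly on the command line is the portable fallback:

    --enable-zero-copy
    --enable-native-gpu-memory-buffers

The zero-copy rasterizer can equally be flipped on via its entry in chrome://flags instead of the command-line flag.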
They work on ChromeOS the same way they do on desktop Linux and I've been using them for months without issue.
The Chromium in Fedora's standard repo is built with it, and when I run it chrome://gpu does say video hardware acceleration is enabled. But at least with a handful of youtube videos, it doesn't matter, still uses 200%+ CPU, same as Chrome (no acceleration), laptop still gets hot and fans still run.
I haven't been following that bug, but a couple of things. Did you ignore blacklists? If you do ignore blacklists, it overrides some stuff, but not everything. Also, the about:gpu page is not entirely accurate (particularly after ignoring blacklists) from what I remember. I think you should play a video and go to chrome://media-internals to figure out which path it is using. There have also been patches to enable acceleration for Intel, but not merged upstream. So if you got a patched version, or your distro did that for you, it could also work. Finally, it works on ChromeOS, which also uses Intel hardware, but that's because Google supports ChromeOS.
As a Linux user for over 15 years, this is one of the main reasons I am replacing Linux on my next machine, probably with Windows (sigh). Maybe it's just me getting old, but I'm done screwing around with computers to get them to do what they are supposed to do. It's frustrating and embarrassing.
It's not just that some drivers or hardware have obvious problems. It's virtually impossible to get a system where all of the components have vendor-supported open-source drivers, or, when they exist, that a distro supports them out of the box. Many components' drivers only officially support two operating systems. If they don't support your operating system (Linux), then when you have a problem, you're SOL. It's like buying a car that when the transmission blows, you need to hire a mechanical engineer to reverse-engineer it. Better start taking the bus to work, because this could take a while.
Some laptops (about 9 product lines from 3 vendors) make official Linux distro releases available. So you're stuck with a small selection of machines and basically only one distro. Even if that is ok with you, the drivers still won't all be vendor-supported open-source. And on top of that, Linux may have wacky, constantly changing ways of handling the hardware, like hybrid GPU support, or low power mode for Intel wifi chips. Your machine will always be an awkward stepchild.
A limited hardware selection isn't good. Laptops are hard to design. There are always design and procurement tradeoffs, and the fewer tradeoffs they make, the more expensive it gets. So to get a good experience on a good machine, you need to buy the most expensive one. At that point you might as well buy a Mac.
Finally, there's actual user experience. I've used several distros and environments the past few years, and they all had pitiful user experiences on my laptops. CPU scaling on battery/AC doesn't work out of the box, most of my media keys don't work, hybrid GPU takes a computer science degree to get working properly (with the right combination of drivers and software), echoing the wrong value to a /sys/ entry that had no kernel documentation permanently fixed the cpu fan at top speed, and the latest vanilla kernels with an old kernel config simply won't boot.
I know this isn't everyone's experience, but the only reason I'd recommend a Linux desktop is if they were looking for a complicated new hobby.
> The only reason I'd recommend a Linux desktop is if they were looking for a complicated new hobby.
Not sure if this was just a typo and you meant "laptop", but I imagine most of the problems you're describing could be avoided on a self-built desktop, because you can choose all the hardware for compatibility.
I haven't used Linux seriously in years, but this is how I'm able to make Hackintosh work without constant fiddling, which all logic dictates should be more difficult than Linux.
This doesn't reflect my experience at all. As a Linux user for 20 years I've had my fair share of times when I was ready to throw in the towel, but the last 3 years or so have renewed my faith in the OS, particularly the big mainstream distros like Fedora and Ubuntu. I can't remember the last time I ran into a big system-wide breaking problem.
Not wanting to play fanboy but I was a huge fan of Windows 10 throughout the prerelease and for a good while after release, but it seems to be going downhill in recent updates. It seems like now would be the absolute worst time to switch back.
Agreed. It's very easy to install the proprietary driver in Ubuntu and in my experience it Just Works™. Why not default to that? The vast majority of Ubuntu users aren't open source evangelists who'd choose to use an inferior driver purely for ideological reasons.
My understanding is _a lot_ of stuff in Nvidia drivers isn't _technically_ owned by them. As in, they contract company A to make thing Y under license Z. After 20+ years of legacy those weird licenses add up, and no one feels comfortable releasing the code. Whether or not that's true? Hard to say... For comparison, AMD has been (since 2015) working hard to make nearly all parts of their drivers upstream in some fashion (amdgpu). While that's not a 1:1 comparison (AMD has less capital than Nvidia), it shows it _is_ possible to free these drivers.
The real answer is that releasing source code means exposing how (some of) their cards really work. A company that relies on selling identical hardware at vastly different prices won't ever do that. Using game-specific speed-hacks to "fix" games purposefully written to violate standards is another issue, especially when those games were made with help from Nvidia engineers. Why give up such an ability?
I also suspect that they make use of multiple patented technologies, both in hardware and software. When Java was re-licensed under the GPL, one of the most prominent pain points, the one that caused endless whining on the part of OpenJDK users, happened to be its font renderer. And we all know that font smoothing is tricky business, and all font-smoothing tech in existence is patented by MS/Apple/Adobe. When you start replacing closed-source code with a free replacement, those patented pieces tend to quickly come up — especially when open-source projects go to great lengths to work around patent issues instead of shoving them under the carpet.
This is another reason why we shouldn't let current web browsers decide the future of the web... which to them will become a proprietary black box of slow JavaScript spyware and horrid markup languages.
I didn't see anything about a black box, except that users had reported issues and they didn't feel that they could investigate them because there were too many variances in Linux distros.
There are screenshots of the issue in the report. They've also said that there is a pattern of Nouveau instability that you don't see in proprietary drivers. It's all in the bug report.
If we are speaking about fairness, we should blacklist everything. No available GPU and its drivers, whether official or not, on any OS, can be trusted in its current state.
A reverse-engineered driver for Nvidia GPUs on Linux that is open source. Since it is reverse engineered, it tends to be buggy. Most Linux distributions use this driver by default, since it is open source.
There is an arguably superior driver supplied by Nvidia. It is faster, has better hardware support, and is arguably more stable (1). However, it ships as a closed source binary blob with a source-based shim layer to integrate it into the kernel.
(1) My wife has 2 Linux desktops with Nvidia GPUs. She ran nouveau by default. However, on both machines, it would cause a kernel oops and lock the screen up. My wife assumed it was her entire machine locking up, but I realized from the stack traces that were recorded that it was the GPU driver. I then switched the machines to the Nvidia driver, and they have been stable ever since.
I personally always buy hardware from Nvidia, because they provide a binary driver for FreeBSD. They are one of the few companies in the non-server space to provide good packaged drivers for FreeBSD.
> ...it would cause a kernel oops, and lock the screen up. My wife assumed it was her entire machine locking up...
Thank you! I think you might have just solved an issue for me that happens intermittently with my Linux Mint 18.3. At least we'll see once I dive into the output of stack traces, etc.
Buy an AMD or Intel GPU. They're cost-effective (or non-negotiably bundled, sometimes), they're well supported, and the vendors make a considerable effort to ensure that stable, compliant drivers are available on every major operating system, and that the free ones are no less stable nor any less compliant.
AMD's performance is competitive dollar-for-dollar, and the top of their lineup is about 80% as good as the absolute highest-tier competitor most of the time.
On a more serious note, what is the motivation for Nouveau? Are the drivers provided by Nvidia not good enough, or do we just need an open source alternative?
We have Thinkpad Linux laptops running with NVidia at work. The proprietary driver has quality issues, the biggest of which for us is power management and poor battery life. Nouveau performs much better in this regard, even though they had to reverse engineer the hardware (!) I will be advising my IT department not to continue to buy NVidia.
Not sure if this is relevant to you, but I thought I'd share an experience with my T440p that uses a quad-core Kaby Lake CPU and optimus graphics. I run Fedora on it. I am using the HD 4600 gpu which is wired to the LCD, and I occasionally use the nvidia gpu w/ proprietary drivers via optirun.
I'd always seen terrible power efficiency on battery with Fedora. It was much worse than any previous Thinkpad. I mostly treat it like a luggable, always plugged into the wall. I wasn't sure if it was the CPU or chipset, but it never got into deeper package-level idle states like other machines.
However, I recently discovered that if I suspend and resume the laptop right after I switch from AC to battery power, the laptop runs in a much more efficient mode. It gets into those deeper idle states and can get almost 6 hours of life out of its aging batteries for basic office/communication tasks with wifi. If I don't suspend it once, it will only get close to 2.5 hours even if almost completely idle the whole time.
I have a Dell XPS with an Nvidia 1050, running Fedora 29.
The Nouveau driver was so bad here that I blacklisted all the drivers and just use the Intel GPU. It's the difference between 3 hours or 6 hours of battery.
It apparently kept the Nvidia GPU powered on even when no program wanted to use it.
>It apparently kept the Nvidia GPU powered on even when no program wanted to use it.
Are you sure you didn't install nvidia drivers? That is a common symptom with nvidia binary drivers. Nouveau on the other hand is fine with powering the device off when not used. For nvidia drivers, the bbswitch method is more reliable, but is apparently deprecated by distros these days.
On all my laptops, nouveau never outperformed the proprietary driver in battery life. It didn't even support power management on most GPUs - when did that change?
That is extremely difficult. Lenovo doesn't make any high end laptops without nvidia. You'll be restricted to the X series and the T series. Dell is also quite similar, although at one point they had intel only options for their precision lineup. Basically, all vendors tack on the nvidia card once you get into high end (i.e., real i7 with 4-6 cores, xeon etc.).
> Basically, all vendors tack on the nvidia card once you get into high end
Yup. By adding a Nvidia card (or, far less commonly, an ATI/AMD one) they can sell that laptop as a specialty "Gaming" machine - and with margins on commodity hardware being so razor-thin these days, that's exactly what shrewd marketers need!
Afaik 10 years ago nvidia drivers were the most stable and performant ones on Linux. But yes, they are proprietary and implemented in userspace for Xorg.
Today the Linux graphics stack architecture has been significantly revamped, and it requires more kernel code than before for the DRM/DRI APIs and Wayland. nVidia just can't fit into this architecture with its proprietary/NDA model without violating the GPL for kernel modules.
So it sounds like the kernel team implemented the APIs in a way that prevents nVidia from integrating with them without open-sourcing the driver? Were those changes made as a political statement?
I'm well out of my depth here and welcome corrections:
Managing and multiplexing access to hardware is a core responsibility of the kernel. When video drivers are implemented in user space as X11 has traditionally done there are a bunch of issues: the program that includes them has to be setuid root to have the right access to the hardware, opening up the possibility of security bugs; only one such program can run at a time; and if something goes wrong in user space, it can bring down the system.
The newer kernel subsystem responsible for handling video cards is the Direct Rendering Manager, with the unfortunately confusing acronym DRM. [1] nVidia declined to support DRM in its proprietary driver for quite some time but eventually chose to do so.
They have some quibbles with GBM, one of the parts of the kernel API, but they didn't raise them until after the API was implemented in the kernel, was implemented in the AMD and Intel drivers, and had become a dependency for the Wayland reference compositor Weston. They refused to implement GBM and instead implemented their own alternative, EGLStreams, which is far more complicated. The open source world believes this is the wrong architecture and has had a lot of discussion about how to (reluctantly) support it. The current KDE and Gnome approach seems to be to accept patches from nVidia for EGLStreams support [2][3], but not everyone feels the same way. [4]
nVidia has implemented DRM already. They could implement GBM if they wanted to. I'm sure that having their proprietary driver in kernel space instead of user space makes licensing more complicated, but it hasn't stopped them yet.
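To make the GBM part of this concrete, here is a minimal sketch (mine, not from the thread) of the allocation path a Wayland compositor expects the driver to provide; the device path is just an example and error handling is abbreviated:

    /* Open a DRM device and ask the driver itself to allocate a
     * scanout-capable buffer surface via GBM. */
    #include <fcntl.h>
    #include <unistd.h>
    #include <gbm.h>

    int main(void) {
        int fd = open("/dev/dri/card0", O_RDWR);        /* device path varies per system */
        if (fd < 0)
            return 1;

        struct gbm_device *gbm = gbm_create_device(fd); /* the entry point the driver must back */
        if (!gbm) {
            close(fd);
            return 1;
        }

        struct gbm_surface *surf = gbm_surface_create(gbm, 1920, 1080,
                GBM_FORMAT_XRGB8888,
                GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);

        /* A compositor would now hand gbm/surf to EGL and start rendering.
         * EGLStreams replaces this driver-side allocation model entirely,
         * which is why compositors need separate code paths for it. */

        if (surf)
            gbm_surface_destroy(surf);
        gbm_device_destroy(gbm);
        close(fd);
        return 0;
    }

Compile against libgbm (pkg-config gbm); the point is only to show that GBM is a small, driver-backed allocation API rather than anything exotic.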
1. Lacking support for old GPUs in a maintained driver. Nvidia's approach for old GPUs is "use an old driver," even for cards as recent as the 6xx series.
2. People like to run fully open source systems
3. Adding support for things like KMS and Wayland is really up to the whims of nVidia as to when it happens with the closed driver, while an open driver is more likely to do so (or at least can be forked to do so). In the past, features like config-less X also came much later to the nvidia drivers, which made it harder for people to get started on Linux.
Your point #2 is especially important: with all the commotion surrounding the various "management engines" (in effect a completely independent system with privileged access to the carefully secured main system) it is surely obvious that a powerful GPU running unverified code is a foolish thing.
Point #3 is also really good. Who wants to wait around at the mercy of a corporation which has an active interest in NOT being transparent about "its" IP (as speculated elsewhere most of the GPUs probably infringe on some or other patent)?
Anyone buying an nvidia card for Linux should know exactly what they are doing: funding an anti-Free Software company, deeply entrenched in the ethos of secrecy, patents and bullshit; saddling themselves with a white elephant which will suck hours of their precious time.
As I understand it, the latter, and it is completely justifiable to do so. Requiring users to run binary blobs in order to use their hardware is non-free.
Meh, I typically use Chrome through a virtual machine with Spice which has broken OpenGL support, so I wasn't getting hardware acceleration anyway. I still get screen tearing in Firefox, for Christ's sake.
There is an easy fix for this:
Set layers.acceleration.force-enabled: true in about:config.
I believe this is because Firefox assumes it's the responsibility of the compositor you are running to properly vsync, but if you're not running a compositor you can turn on this option.
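If you'd rather keep that setting in a file than click through about:config, the same preference can go in a user.js in your Firefox profile directory (the profile path below is a placeholder):

    // ~/.mozilla/firefox/<your-profile>/user.js  (read on every startup)
    user_pref("layers.acceleration.force-enabled", true);

The caveat elsewhere in this thread still applies: forcing acceleration on bypasses Mozilla's own driver checks, so you are opting into whatever driver bugs they were avoiding.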
Will this make Firefox the best browser on Linux by default then?
Also, as mentioned in the bug thread, if nouveau is actually stable in newer versions, then, similar to user agents, the newer nouveau versions could change their name to bypass the blacklisting. No need to pretend to be NVIDIA, just something different than nouveau, like 'nuevo' or something.
Per the comment from abrowne Firefox does not do hardware acceleration by default on Linux for any hardware. So same as Chrome for Nouveau but Chrome does support acceleration for other drivers out of the box. It seems slightly easier to enable it for Firefox but both browsers provide a way to enable acceleration if you want to take the risk.
Chromium said they'd do that if someone tested which Nouveau versions work fine and provided a test suite to use. Ilia Mirkin (in the link) said he tried to run their test suite but got errors on the latest Nouveau (possibly due to a browser issue). So as it stands now Chromium doesn't know which if any versions of Nouveau are stable.
Browsers should not need to have GPU acceleration anyway if they are just browsers. But instead, as is Chrome's/ium's style, it wants to be an OS instead of a browser. All other browsers are trying to follow suit, but that still doesn't make it right. The more hardware functionality exposed, the less secure a browser is.
From: Drew DeVault <sir@cmpwn.com>
To: graphics-dev@chromium.org
Subject: nouveau blacklisted in Chromium
I'm writing to complain regarding the decision to blacklist nouveau in
Chromium. You're creating a hostile relationship with one of the most
important projects on Linux. If you do anything about this problem, you
should be using your influence to pressure Nvidia into being a better
citizen on Linux. Hell, you could even sponsor the nouveau developers
for a fraction of a fraction of a fraction of a percent of Google's
budget. Instead of doing any of the right things, you chose to become a
bad actor on Linux.
Closing bug reports from nouveau users and directing them to the
freedesktop bug tracker? Fine, you don't need to fix someone else's
bugs. Blacklisting the driver? Not even remotely okay.
I strongly condemn your decision and I expect it to be rolled back as
soon as possible.
--
Drew DeVault
> It's not the browser's place to have any code whatsoever which even so much as glances at the graphics vendor to make meaningful decisions.
I'm curious why you hold this position. I think it makes sense for the browser to attempt to deliver the best experience it can to users, and they may believe they get better performance and behavior from unaccelerated rendering. It's no different from websites looking at user-agent strings to work around known bugs, right? It's okay for a user to override their user-agent string, and fine for browsers to mention other browsers for compatibility, but uncool for a browser to not identify itself at all and steal another browser's user-agent string.
The OpenGL API includes vendor strings for a reason. Chrome isn't doing `strings` on the library or anything.
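For anyone who hasn't poked at GL directly, this is roughly all a consumer has to do to learn who it is talking to; a short sketch of my own, which assumes a GL context has already been created and made current (GLX/EGL setup omitted):

    #include <stdio.h>
    #include <GL/gl.h>

    /* Print the identity strings the driver advertises about itself. */
    void log_gl_identity(void) {
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));   /* e.g. "nouveau" */
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    }

Chromium's real detection is more involved (it also looks at driver versions, device IDs and the OS), but it is built on strings the driver itself advertises, which is exactly why the "flip GL_VENDOR to a random string" idea upthread is so corrosive.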
I get the predicament that caused Chromium to blacklist Nouveau. Nothing against the incredible work done in Nouveau, it's just reality: given a thousand eyes, apparently not all bugs are shallow (see also: Heartbleed and Shellshock). That being said:
>I'm curious why you hold this position.
Agreed. There are countless AAA titles (games) that do vendor checks on the GPU. Game developers are a bunch of people who have been working with GPUs for more than a decade, possibly approaching two decades. For some reason, GPUs are the one thing that the OS HAL isn't able to sort out. Windows (and Mac?) drivers are always proprietary, so different hardware vendors are all Windows developers have to worry about. I can't imagine the headache of dealing with differing hardware vendors (and families), on top of differing driver vendors.
Reality fucking sucks and I guess this is just Chromium acknowledging it. I doubt there's a grand conspiracy here.
I think Chrome is empirically quite user friendly - it's not installed by default anywhere and people choose to use it. It is true that Chrome isn't particularly power-user friendly for certain use cases, but that's different.
Also, if you look at the bug report, it's quite clear they spent a lot of time trying to make nouveau work.
Well, people have always used bad software; popularity has never been a good metric of quality. Nothing about Chrome protects users from Google. This is a fundamental role of the browser.
That may be true but I'm not sure how that is relevant. Why would a browser not protecting users from Google be positively correlated with a browser not wanting to successfully render pages?
There's more to being user friendly than having a dumbed down UI and being technically correct.
Helping corporations spy on users is definitely not user friendly, and Chrome is by far the worst offender there. Saying, "nobody cares," because they're ignorant that it's going on is no excuse, either.
That may all be true, but I'm really unsure what that has to do with user-friendliness in the way described. The argument was an attempt to rebut my claim that Chrome is trying to do the best job possible of rendering pages, on the grounds that Chrome isn't user-friendly. If their goal is to spy on users and have a dumbed-down UI, isn't it all the more important that pages successfully render, so that the sheeple keep using it?
User-friendliness is usually understood as being about usability and interface design. Spyware-riddled software may be shitty, and that should be called out, but it's orthogonal to user-friendliness.
Edit: You seem to disagree. Google "user-friendly" and realize that I'm not defending Chrome.
Just because corporate language doesn't include any way of representing consumer and user interests outside of what can be represented in a transaction doesn't mean it isn't worth expressing.
There's plenty of ways to express it. Call it exploitative, dishonest, money-grubbing, soulless. If you instead call it "non-user-friendly," expect people to be confused, because that's not what "user-friendly" means.
As someone who has used IE (especially 5-9) far too many times because I had to, Chrome could impale my palms with stakes and I'd still probably call it more user friendly.
Note: I edited my comment to simply be a copy of the email I sent to the devs. You quoted the original text.
>I think it makes sense for the browser to attempt to deliver the best experience it can to users, and they may believe they get better performance and behavior from unaccelerated rendering.
So should they also blacklist Windows because it's spying on its users? No, of course not. They should do their best to deliver a good experience in the domains for which they are responsible.
>It's no different from websites looking at user-agent strings to work around known bugs
They don't blacklist Windows because it spies on users, but they absolutely blacklist anything from AV to malware because it causes instability. https://blog.chromium.org/2017/11/reducing-chrome-crashes-ca... This seems like exactly the same thing. They're responsible for delivering a browser that renders webpages successfully. They're doing what needs to be done to make this happen.
> So should they also blacklist Windows because it's spying on its users? No, of course not. They should do their best to deliver a good experience in the domains for which they are responsible.
If a Windows graphics acceleration component spied on its users, Chromium should absolutely blacklist that component, and use software rendering instead.
[Clarification: at the time I was writing this comment, parent referred to the team that closed the bug as sons of bitches]
The language you use is unnecessary. (Don't mean this in the blanket sense -- calling a SOB a SOB is fine -- but in this particular case.)
The Chrome team was receiving bug reports due to driver issues, and now this driver is the default. They can either compromise the experience for some Ubuntu users (with crashes, not slowdowns) or blacklist the driver or fix the driver.
If you don't mind the possibility of your driver crashing, you are free to instruct chrome to ignore the blacklist, something I personally did for some time.
Here's why I used that language (and though I edited the comment, it was not because I rethought the use of the phrase): it is not the Chromium team's place to make this decision, and this decision actively harms the health of the Linux ecosystem. The decision of which driver to use is up to the user and up to the distro, and Chromium has absolutely no right to interfere with that. Nvidia is one of the worst actors on the Linux scene, and overstepping their bounds to prop them up is despicable.
Imagine if Google had blocked Tor users in China in favor of Project Dragonfly instead. The attitude of Chromium in this matter is directly supporting bad actors who actively do harm to their users and to the rest of the world around them. That's why they're sons of bitches.
What you're suggesting is actively harming users to put pressure on NVIDIA to become a better citizen in the Linux ecosystem. I don't find this very ethical, but I don't think this would work anyway. When Chrome crashes, users blame Chrome. But when Chrome runs slower on NVIDIA, users blame NVIDIA.
They aren't "interfering." They are falling back to software rendering when the GPU driver is incapable of supporting valid calls without crashing. If Windows or macOS started shipping buggy/crashing OpenGL drivers and not fixing them, Chromium would be entirely justified to fall back to software rendering there, and it would not be "interfering" with Windows or macOS development to do so.
>If Windows or macOS started shipping buggy/crashing OpenGL drivers and not fixing them, Chromium would be entirely justified to fall back to software rendering there
And they do just that, quite often! Just take a look at the list of drivers/hardware/os/feature combinations that are blacklisted in their file at [1].
Tons of MacOS and Windows things on there (and Android, and Linux!)
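For the curious, the entries in that list are plain data records keyed on OS, vendor/device IDs, and driver/GL strings. The sketch below is only illustrative: the id, description, and exact field spellings are my approximation from memory, not copied from the real file:

    {
      "id": 999,
      "description": "Illustrative entry: disable all acceleration with the Nouveau driver on Linux",
      "os": { "type": "linux" },
      "gl_vendor": "(?i)nouveau",
      "features": ["all"]
    }

Narrowing an entry like this to known-bad driver versions is the kind of patch discussed elsewhere in the thread.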
I have no idea how you interpret this as propping up Nvidia. They're blacklisting an (unofficial) Nvidia driver because it doesn't work. They're not telling users to use the official non-free ones; they're falling back to software rendering. Users who want acceleration can just buy an Nvidia competitor's card. Isn't that what you want?
They're blacklisting ice cream because they're lactose intolerant. They're not telling users to eat Brussels sprouts, they're falling back to plain bread. Users who want to eat something sweet can just buy some licorice. Isn't that what you want?
While I mostly agree with what your email says, if I received a message that said, "I expect it to be rolled back as soon as possible" and accused me of being "a bad actor on Linux" I would probably be quite put off. You make some good points, but rather than coat them in honey, you soaked them in vinegar. Might get a better result by improving your tone, and maybe giving them credit for at least open sourcing their work and supporting graphics acceleration on at least some cards (firefox doesn't). I also appreciate that they support linux so well, even if not perfectly.
Not by default. GPU acceleration is off by default in Firefox on Linux. Distros may change that in their bundles, but the official builds have it turned off.
My unmodified Firefox stable channel on Linux has WebGL enabled OOTB and my distro does not have any specific flags specified for WebGL (or GL in general) in the package.
> GL layers acceleration is not yet enabled by default (see bug 594876). You can enable it by setting layers.acceleration.force-enabled=true in about:config.
That means off by default. Also maybe relevant is that I'm on the Firefox Graphics team so I have some knowledge of this.
If you want to verify, download a stock Firefox build from Mozilla, run it on a new profile, and check what it says for Compositing in the Graphics section of about:support. OpenGL means acceleration, Basic means software.
I get you have an axe[0] to grind with NVIDIA, but is attacking people trying to mitigate around a broken driver really the right direction? By your message, we should also be "pissed off" at you for disabling a driver that "works" in your project too.
The Nvidia driver does not "work" in my project. I have added no code which explicitly blacklists it, and the day Nvidia releases a driver which implements the required APIs it will work with no changes in my code.
> the day Nvidia releases a driver which implements the required APIs it will work with no changes in my code.
How about the day when Nvidia releases a buggy (system-crashing?) implementation of the required APIs? How well will your code work without changes, and which course of action will you take?
I'll not change my code for their sake. I already get enough annoyed Nvidia users who think it's my fault their driver doesn't support GBM, so annoyed Nvidia users who think it's my fault their driver is buggy won't be much different. I'll do what I already tell them: it's Nvidia's problem, report it to them. Of course, if some characteristic of Nvidia's implementation demonstrates previously unknown bugs in sway (bugs in sway, not in the driver), sway will change to accommodate that.
Your project, your rules - but that attitude doesn't scale well.
Chromium chooses to prioritize users' experience over scoring ideological points. That means attempting to work around known external bugs instead of letting things blow up and sending users to pound sand on some unresponsive third-party mailing list.
I'm not sure how you can fault them for that, especially since pretty much all successful large-scale projects take the same approach -- including the Linux kernel.
The new library which replaces wlc, wlroots, does not have that special code necessary to support the stupid Nvidia special-snowflake API.
In simple terms, sway+wlroots DOES NOT have this line of code:
if (nvidia proprietary driver) {
suck();
}
Rather, it looks like this:
do_important_gpu_stuff(); // Nvidia does not implement this
The Nvidia proprietary driver doesn't support the APIs necessary to run Sway. This is the correct relationship: the driver is responsible for implementing APIs and the software is responsible for consuming them. Nvidia didn't hold up their end of that bargain and thus does not work.
In Chromium, the following line of code DOES exist:
if (nouveau driver) {
suck();
}
Nouveau implements the necessary APIs, but Chromium explicitly blacklists it. Nouveau holds up their end of the bargain here and Chromium does not.
This is not correct. They may have methods that match the required signatures, but the APIs do not do what they are supposed to do. "Crashing my computer" is not a documented feature of any webgl API I'm aware of.
From an API perspective, Nouveau lies (perhaps not intentionally, but in reality) to its consumers. Nouveau isn't holding up any part of the bargain. Having correct method signatures doesn't mean you implement an API if you return wrong results.
Where is this idea that nouveau is crashing coming from?
And don't be ridiculous. All software has bugs, including Chromium. Should I detect Chromium and blacklist it on my sites in preparation for future bugs? Just a few weeks ago I found a Chromium bug when I was working on a website. I did not blacklist the browser, naturally.
This subthread[0] implies that there have been nouveau-caused crashes for webgl since 2015. The linked bug[1] seems pretty cut and dry. Nouveau completely locks the system, Nvidia doesn't.
>All software has bugs
If those bugs crash the system, the software is made the default, and the workaround is only a performance degradation, then yes. Degraded WebGL performance is superior to the system locking up for nontechnical users.
>Just a few weeks ago I found a Chromium bug when I was working on a website. I did not blacklist the browser, naturally.
Did they fix it in less than 2 years? Did it crash the end-user's computer? Did people navigating to your website report the bug to you?
Did you perhaps work around the bug? For example by using a polyfill that would provide degraded performance but still allow the end user to access your content? Because that's almost exactly analogous to this situation, and exactly what everyone does.
>This subthread[0] implies that there have been nouveau-caused crashes for webgl since 2015. The linked bug[1] seems pretty cut and dry. Nouveau completely locks the system, Nvidia doesn't.
Crashing on a page designed to exhaustively test GL features is one thing. How about crashes which actually affect end-users? I expect that the volume of users who will be negatively impacted by blacklisting is far higher than the volume of users impacted by serious nouveau bugs like this. The difference is that the former user hasn't been annoying Chromium devs in their bug tracker yet.
>Did they fix it in less than 2 years?
No. The bug turns 2 next month and shows no signs of being fixed.
>Did it crash the end-user's computer?
No, but like I said I don't think many nouveau users are actually affected by bugs of this scale. The bug which prompted the blacklisting and being discussed today is not such a bug.
>Did people navigating to your website report the bug to you?
Yes.
>Did you perhaps work around the bug?
Yes
>For example by using a polyfill that would provide degraded performance but still allow the end user to access your content
No. If the workaround caused severe performance degradation and encouraged users to switch to proprietary software, I would have patched Chromium.
>No, but like I said I don't think many nouveau users are actually affected by bugs of this scale. The bug which prompted the blacklisting and being discussed today is not such a bug.
The bug being discussed today is just one of "some bug reports on other rendering issues with Nouveau". It fails the conformance tests, and a quick search showed that same crash bug has been reported to Chrome. Let me say that one more time:
There is a set of conformance tests to verify API compatibility. Nouveau fails them. It's not just buggy, it is literally non-API-conformant. You can't keep pretending it's API-conformant when it fails the API conformance tests. When Nouveau is actually API-compliant, perhaps it will get un-blocked.
Until then, Chrome is doing exactly what you'd do if Nvidia claimed to implement the APIs but just made the system crash instead: block a nonconformant API implementation that lied to its clients and degrade gracefully instead.
>There is a set of conformance tests to verify API compatibility. Nouveau fails them. It's not just buggy, it is literally non-API-conformant. You can't keep pretending it's API-conformant when it fails the API conformance tests. When Nouveau is actually API-compliant, perhaps it will get un-blocked.
It doesn't matter. Nouveau is a buggy implementation, but an implementation nevertheless. No one is going to blame Chromium for their desktop freezing up. It's the user's decision to buy Nvidia and Ubuntu's decision to use a buggy driver. They've weighed the tradeoffs and come to a decision which is theirs to make. It's NOT Chromium's place to make that call.
These bugs are not causing daily issues to Chromium users, or even frequent issues, for nearly all users. The blacklisting does affect all users, daily.
It is not Chromium's place to decide which driver their users will use. It's not okay to write vendor specific code. It is not okay. You write code for the APIs and if the implementation of those APIs is buggy, it's their bug, not yours. Changing your code to fix someone else's bug is the objectively incorrect thing to do. Fucking over an important upstream Linux driver in favor of a proprietary driver from a bad actor is outright morally wrong.
> Crashing on a page designed to exhaustively test GL features is one thing. How about crashes which actually affect end-users? I expect that the volume of users who will be negatively impacted by blacklisting is far higher than the volume of users impacted by serious nouveau bugs like this. The difference is that the former user hasn't been annoying Chromium devs in their bug tracker yet.
Do you honestly think that it's reasonable or acceptable for Chrome to allow an arbitrary website to hard lock a user's machine?
Do you honestly think that there isn't a page which can crash any driver? WebGL is a poorly thought out mess.
That being said, you're right, but a more appropriate response than blacklisting nouveau would be blacklisting this particular WebGL feature on nouveau.
Apologies. My original reply (now the deleted comment in thread) was on a misread comment (or edited not sure). Thanks for the expanded context, I still disagree with this grinding though.
Existence or lack of the correct APIs is not the reason Nouveau has been blacklisted - it's the functionality behind them that appears to be buggy and causing issues.
> If you do anything about this problem, you should be using your influence to pressure Nvidia into being a better citizen on Linux.
By making Chrome significantly slower on nvidia systems they effectively are pressuring Nvidia to act better. Trying to work around the problems would be more like supporting the status quo instead of driving change.
That would be true only if they weren't rewarding the proprietary driver. If their goal was to pressure Nvidia into being a better player in FOSS, they'd disable GPU acceleration on the proprietary driver.
It works fine on Arch. I'm pretty sure the issue is more to do with Ubuntu's packaging. For them to disable it everywhere due to an issue with Ubuntu makes no sense. Besides, Ubuntu packages old software several releases behind stable.
If you want to support nouveau, then you can put the work in. You have no right to demand that others do work for you. Especially when they already have a functioning alternative solution in software rendering.
I am not demanding that they do work. I'm demanding that they not deliberately sabotage the nouveau driver. If they find the nouveau driver lacking, they should improve it, not sabotage it. If they don't want to put the work in, then they should just leave well enough alone.
>The Nvidia driver does not "work" in my project. I have added no code which explicitly blacklists it, and the day Nvidia releases a driver which implements the required APIs it will work with no changes in my code.
That is a "should" based on your feelings about Nvidia, not about what is the correct decision for users. They are falling back to software rendering when the GPU driver does not properly support the hardware acceleration. That's not uncommon, and it's certainly not the outrage you are trying to make it out to be here. Everybody gets it, you disagree with Nvidia's decision to not have open-source drivers. Not everyone else shares your views there, and it is not their obligation to support your views.
>That is a "should" based on your feelings about Nvidia, not about what is the correct decision for users.
No, it's not. If you encounter a bug in a piece of software, you should report it to the maintainers of that software. That's true even for, say, bugs found in the Nvidia proprietary driver. Expecting Chromium to fix it is expecting them to do more work, which is the exact thing others have railed on me for "expecting" from them (which I haven't, to be clear once again).
> If you encounter a bug in a piece of software, you should report it to the maintainers of that software.
I mean, sure. But that's not really the whole of it here. Chromium has encountered multiple bugs in a piece of software, and they decided they don't want to expend the resources to reproduce those bugs and deal with them. In the meantime, those bugs mean that the driver is not properly implementing the functionality for all users, and so Chromium has decided to simply not use that part of the driver. They aren't doing anything to the OS, they aren't bypassing the OS, they're just rendering on the CPU because the driver doesn't properly implement the hardware acceleration features. There is no obligation whatsoever to use software that you know doesn't work just because the OS ships it.
> Also, with this email it seems it is you who is turning a routine technical decision into a hostile relationship.
Yeah, I don't get that. Does the author think that this'll make the Chromium project more willing to sponsor work on nouveau, or for that matter, to expend some of their influence on nouveau's behalf via Nvidia? It makes no sense to me.
If you can't stand for that, then do something about it. Fix bugs, or pay people to fix bugs. On what grounds are you entitled to demand anything of the Chromium team?
Not sure what you mean by this. If there are versions of nouveau 'in the wild' (widely used by non-technical users) that result in brokenness under some circumstances-- and bugfixes in the driver are not promptly available, it makes sense to prevent use of the hardware driver i.e. "blacklist" it. So, are you saying that using nouveau does not result in broken behavior, and that any such brokenness is quickly fixed?