Nvidia seeks peace with Linux, pledges help on open source driver (arstechnica.com)
278 points by bitops on Sept 24, 2013 | 80 comments

Here is the full text of the email, also sent to the Mesa3D development community.

---------- Forwarded message ---------- From: Andy Ritger <aritger@nvidia.com> Date: Mon, Sep 23, 2013 at 11:44 PM Subject: [Nouveau] offer to help, DCB To: nouveau@lists.freedesktop.org

Hi Nouveau developers,

NVIDIA is releasing public documentation on certain aspects of our GPUs, with the intent to address areas that impact the out-of-the-box usability of NVIDIA GPUs with Nouveau. We intend to provide more documentation over time, and guidance in additional areas as we are able.

As a first step towards that, we've posted a document here:

that documents the Device Control Block ("DCB") layout in the VBIOS. The DCB describes board topology and the board's display connectors.

I suspect much of the information in that document is not news for the Nouveau community, but hopefully it will be helpful to confirm your understanding or flesh out the implementation of a few unhandled cases.

A few of us who work on NVIDIA's proprietary Linux GPU driver will pay attention to nouveau@lists.freedesktop.org and try to chime in when we can.

If there are specific areas of documentation that would most help you, that feedback would help NVIDIA prioritize our documentation efforts.

If you have specific questions for NVIDIA, you can ask here, or direct them to: open-gpu-doc@nvidia.com. I can't promise we'll be able to answer everything, but we'll provide best-effort in areas where we are able.

Thanks, - Andy Ritger
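(For the curious: the DCB is a small binary table inside the VBIOS. A hypothetical parser for its header might look like the sketch below. The 0x36 pointer, the header layout, and the 0x4edcbdcb signature come from the Nouveau community's prior reverse-engineering notes, not from anything confirmed here, so verify them against NVIDIA's released document before relying on them.)

```python
import struct

DCB_SIGNATURE = 0x4EDCBDCB  # magic for DCB 3.0+ (assumption from prior RE work)

def parse_dcb_header(vbios: bytes):
    """Locate and parse the DCB header in a VBIOS image.

    The 16-bit pointer at offset 0x36 and the header layout below follow
    the nouveau community's reverse-engineered notes; treat them as
    assumptions, not confirmed facts.
    """
    (dcb_ptr,) = struct.unpack_from("<H", vbios, 0x36)
    version, hdr_len, entry_count, entry_size = vbios[dcb_ptr:dcb_ptr + 4]
    (signature,) = struct.unpack_from("<I", vbios, dcb_ptr + 6)
    if signature != DCB_SIGNATURE:
        raise ValueError("no DCB 3.0+ signature at pointed-to offset")
    entries_off = dcb_ptr + hdr_len
    entries = [vbios[entries_off + i * entry_size:
                     entries_off + (i + 1) * entry_size]
               for i in range(entry_count)]
    return version, entries
```

Each returned entry is one connector/output descriptor; decoding the bitfields inside them is exactly what the released document covers.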

One thing I guess I don't understand is why Nvidia's own drivers are closed source to begin with. What do they gain by it? Is it that open source drivers would make their hardware easier to reverse engineer? Is it that the drivers themselves are an important piece of IP, and competitors would learn all these cool tricks they use in their drivers? Neither of those fit my model of the role and value of drivers. It seems like they'd just want to maximize the usefulness of the hardware they sell. But I'm obviously missing something.

Edit: Thanks for the responses. That's quite a variety of answers.

Disclosure: I very briefly worked for a competitor.

A) Drivers reveal a surprising amount of "secret sauce" for a GPU. The "public-facing" interface to a GPU is something like HLSL or GLSL, but the real interface is some proprietary ISA. The manufacturers are slowly moving in this area, but the culture of the ISA being proprietary remains. That leads to things like implementation details getting inserted into ISA docs, or code comments, or driver architecture designs, because they're for internal use only anyway. This philosophy is slowly changing, but it's still the case quite a bit.

B) There's much less difference than you would expect between a $500 gaming GPU and a $2,000 workstation GPU. It's totally plausible that you are one FOSS driver away from a much more expensive GPU. That is a problem for everybody, including the gamers who want to keep buying $500 GPUs.

C) There is a VERY complicated balance of power between OEMs (who control the purse), Intel (who sometimes competes and sometimes complements the video card manufacturers) and Microsoft (for better or for worse, a de facto standards body) and increasingly, new kinds of competitors like PowerVR. So this is just a hypothetical, but one way it could go down is like this: NVidia decides to fully embrace FOSS. Microsoft gets pissy and tries to write them out of the standard. NVidia forced to lower prices to get PC makers to buy. Apple swoops in and places a huge order of the now-cheap parts. Intel gets pissy that Apple is shipping too many Nvidia GPUs per Intel GPU, raises Apple's CPU pricing to cover the loss. The price of Macs goes up driving some customers to iPads. PowerVR's royalty fees double, and since they don't have to manage any inventory, they are increasingly more attractive to investors than traditional manufacturers like NVidia. NVidia's stock price falls, and some C-level forgoes a vacation in Europe.

> B) There's much less difference than you would expect between a $500 gaming GPU and a $2,000 workstation GPU. It's totally plausible that you are one FOSS driver away from a much more expensive GPU. That is a problem for everybody, including the gamers who want to keep buying $500 GPUs.

I wasn't aware that there was a difference at all. Haven't there been cases of changing a single resistor to switch the card's mode?

That is just how chip manufacturing works.

In theory you make one model of chip. You manufacture them, then test them: what clock speeds are they stable at? Are any of the cores faulty? Many (probably most) will have defects. Some will be outright duds and must be thrown away. In the rest, the bits with the defects are switched off and the chip is sold as lower-end hardware.

The common, less powerful failure cases cost less, and the rare, more powerful successes cost more.

The high end stuff sells for more than the cost of manufacturing and the low end stuff goes for less. They offset each other.

So when people force a model of GPU to perform at a higher level, they are basically ignoring the QC and are going to be dealing with undefined behaviour. Did the card get binned lower just because it produced more heat at the higher levels (in which case a high-end heatsink might counteract the issue), or because there is a core that returns faulty data?

That's only the theory though.

In reality it might be artificially screwed with to increase profits. Maybe there are many more successes than the desired price tiers call for. Maybe the cheaper parts sell better, so bits of the chips get turned off even though they passed QC. There are only two companies selling gaming GPUs, so there could be an artificial duopoly: Nvidia only has to be price-competitive with AMD.
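The binning pipeline described above is easy to caricature in code. Every number here (defect rate, core count, clock levels, tier cutoffs) is invented purely to illustrate how one die design fans out into several price points:

```python
import random

def bin_chip(defect_rate=0.03, cores=16, max_clock_mhz=1600):
    """Toy model of binning: test each core, find a stable clock, and
    assign the die to a product tier. All numbers are invented."""
    good_cores = sum(random.random() > defect_rate for _ in range(cores))
    stable_clock = max_clock_mhz - random.choice([0, 100, 200, 300])
    if good_cores == cores and stable_clock >= 1500:
        return "workstation"   # rare: fully working and fast
    if good_cores >= cores - 2:
        return "gaming"        # a couple of cores fused off
    if good_cores >= cores // 2:
        return "budget"        # heavily cut down
    return "scrap"             # too many defects: thrown away

random.seed(0)
tally = {}
for _ in range(10_000):
    tier = bin_chip()
    tally[tier] = tally.get(tier, 0) + 1
print(tally)
```

Run it and you see the point of the comment: the same manufacturing line naturally produces a distribution of tiers, and the pricing of each tier has to cover the whole wafer.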

That's binning, sure.

But I thought pro chips actually tended to be clocked a bit lower than gaming chips. There's no 'ignoring QC' there if you trick the board, quite the opposite.

> But I thought pro chips actually tended to be clocked a bit lower than gaming chips. There's no 'ignoring QC' there if you trick the board, quite the opposite.

Yes, the pro GPUs have different clocks and ECC memory, and there may be other differences on the board as well.

Here's a great example:

Resistor hack turns a Nvidia GTX690 into a Quadro K5000 or Tesla K10


> Resistor hack turns a Nvidia GTX690 into a Quadro K5000 or Tesla K10

No it doesn't. This was discussed earlier in the thread you link to. This hack triggers a bug in the display driver, which allows you to work around a silly limitation in the Linux driver.

You can do the same with a software hack in the kernel that spoofs the PCI vendor:device identifiers. And the driver bug is probably fixed so it won't work any more. Hopefully the silly monitor count limitation (not present on Windows drivers with same GPU) is lifted as well so this hack isn't needed any more.

This resistor hack does not change the GPU clocks or enable units that have been fused off. It does not turn a gaming gpu into a pro gpu.

What is the biggest difference between a gaming gpu and a professional gpu? Warranty and support. This also accounts for a big slice of the price difference.

There are also other differences such as error correcting memory in pro gpus, different clocks, etc. Some units that are not essential for gaming performance have been fused off in gaming gpus so that the chips can run faster and hotter.

It's funny how you never hear anyone complain about the fact that Intel's i3, i5 and i7 models are almost the same chip but there's a huge price (and perf) difference. Intel's latest product line has 22 chips of the same design, and they definitely don't have 22 semiconductor production lines.

> I wasn't aware that there was a difference at all. Haven't there been cases of changing a single resistor to switch the card's mode?

This was debunked in an earlier HN discussion, all the resistor hack does is spoof the PCI vendor:device identifier. The fused off parts won't come back up. It won't turn a gaming gpu into a pro gpu, it only triggers a bug in the driver that allows multiple displays (what a silly limitation).
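As an aside, the vendor:device identifier being spoofed is just a value the board reports over PCI config space; on Linux you can read it straight out of sysfs with no hardware hack at all. A sketch (the paths are standard sysfs; on a machine with no display controller it simply returns an empty list):

```python
from pathlib import Path

def pci_display_ids(sys_pci=Path("/sys/bus/pci/devices")):
    """List (address, vendor, device) for PCI display controllers.

    This is exactly the identifier the resistor hack spoofs; whatever it
    reads, everything fused off in silicon stays fused off.
    """
    found = []
    if not sys_pci.is_dir():              # e.g. a minimal container
        return found
    for dev in sorted(sys_pci.iterdir()):
        cls = (dev / "class").read_text().strip()
        if cls.startswith("0x0300"):      # VGA-compatible display controller
            vendor = (dev / "vendor").read_text().strip()   # 0x10de = NVIDIA
            device = (dev / "device").read_text().strip()   # distinguishes SKUs
            found.append((dev.name, vendor, device))
    return found

print(pci_display_ids())
```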

Couple of guesses

a) If the drivers were open, patent trolls and competitors would sue.

b) I have heard stories about some graphics vendors who have a lot of duplicated code in their drivers for supporting different generations of cards. Instead of building abstractions that achieve generality, they've succumbed to the copy-paste rabbit hole.

c) If we all saw the mess that lives in the drivers, none of us would be okay with running it as a kernel module.

d) There are questionable optimizations that vendors would be caught on.

e) the legal department requires too many changes/checks (no swearing in comments, proper licence headers, ...), so it is too much work

f) convoluted build process which is impossible to recreate outside NVidia

g) some code is licenced from somebody else

h) obvious NSA backdoor built in

Linux doesn't seem to have a problem with swearing in its source. I don't know why Nvidia would, either: the words exist for a reason, and people use them to fill a conversational need. If you don't like certain syllables in succession, grow up.

Here is an anecdote about the Netscape Open Source release which became Mozilla.

> When we created mozilla.org and released (most of) the source code to Netscape Confusicator 4.x, Netscape's lawyers made us go through a big "sanitization" process on the source code. Largely this consisted of making sure we had the legal rights to all the code we were releasing, and making sure every file had proper and accurate copyright statements; but they also made us take out all the dirty words. Specifically, "any text containing vulgar or offensive words or expressions; any text that might be slanderous or libelous to individuals and/or institutions."
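A crude version of that "sanitization" pass is just a recursive grep over the tree. The word list and the demo file below are purely illustrative:

```shell
# A toy version of the "sanitization" grep lawyers ask for before a
# source release. The word list and the demo file are illustrative only.
tmp=$(mktemp -d)
cat > "$tmp/demo.c" <<'EOF'
/* WTF is this branch even for? */
int main(void) { return 0; }
EOF
matches=$(grep -rniE '(wtf|fixme|damn)' "$tmp" || true)
echo "$matches"
rm -rf "$tmp"
```

The hard part, of course, is not finding the words but getting every hit reviewed and signed off, which is what made the Mozilla release take so long.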


Even John Carmack was pressured by the lawyers when he open-sourced Doom 3.


Because not every project is headed by Linus Torvalds. That man has almost no filter (which in my opinion is a good thing).

However, that same type of stuff may not reflect other organizations and members of those organizations in a good light. So there are definitely concerns there.

I'm a developer, and my company's closed source code definitely has its share of profanity and "WTF"s. But that would not fly for things we make available on GitHub, just due to concerns of our image. Most developers wouldn't give two shits about the comment...

Do you honestly believe that there would be any damage to your company's image that would last more than five seconds, among the small minority of people who would even see an article regarding "improper language"?

Unless your fellow developers are posting extremely obscene racial slurs or threatening interns with rape, in which case your company has bigger issues to worry about.

These problems almost entirely disappear by providing documentation. Nobody that I know is literally asking for the source to the Windows driver.

Yes, then all they'd have to do is hire a bunch of people to write documentation, like AMD's done for their open source driver. To the extent that they have documentation currently, it's likely wrong, incomplete (because the driver guys can just ask the hardware guys what's up), and it's got various trade secrets and future device plans in it.

a) and c) seem likely.

Also, if folks had the source, they'd likely start doing optimizations for particular cards and drivers; that being the case, really gnarly backwards-compatibility issues would start to plague vendors.

drivers themselves are an important piece of IP

I think it's this. From what I've been told, the line between software (driver) and hardware (graphics card/chip) is blurry, and moves around from generation to generation. The "graphics processing model" is only partly executed in hardware; the rest is in software.

I'd say Nvidia uses closed source mostly to push CUDA and discourage people from using OpenCL. E.g. their OpenCL driver imposes a 3 GB memory limit for kernels (or at least used to in the past), but with CUDA they will enable 64-bit addressing if necessary (on the same GPU hardware).

more likely: IP encumbrances in code (or specifications) that they've licensed (or licensed access to).

This is the story they always use when asked. They claim that the board contains proprietary components licensed from third parties, and that it would violate their licensing agreement if the code pertaining to those components was released.

This is a very convenient PR move because it shifts all the blame onto groups of unnamed entities, and tends to place the burden on the open-source advocate to get all of these unnamed entities to modify their licensing agreements.

And that makes it untrue ... how, again?

As the others said, it's not that it's necessarily factually untrue; it's just a cop-out. If the company cared about open source, it could negotiate with suppliers directly and refuse to use components with incompatible licensing agreements.

The party line statement is structured to make nVidia appear helpless and as if the burden is upon the open-source advocate to get the component vendors to allow their code to go open, but this is not correct. If nVidia cared about open-source, they'd use their clout to make sure they got what they wanted.

The grandparent post doesn't say that the story is untrue. It implies a certain amount of scepticism, which is probably warranted: Nvidia must have some amount of leverage to negotiate such terms with its suppliers, so if such terms do exist, it's because Nvidia has accepted them. Of course there are other factors they have to consider in those agreements, but that's a more complex story.

No-one said it was outright untrue, but neither has any company ever detailed which 3rd parties are responsible, so there's no way of finding out just how big a problem it really is.

I'd always assumed it was because they were tweaking things under the hood to specifically target benchmark tests.

The nVidia proprietary driver (same codebase for both Windows and Linux) is chock full of optimizations for specific game exes. Today, games are the benchmarks, and optimizing for them in the same way is, I think, very reasonable. This is not "cheating"; it is what makes the graphics in today's games possible at a reasonable framerate.

That's been found several times with both nVidia and AMD/ATI in the past without needing to see the driver sources.

The Windows driver and Linux driver are based on the same codebase, and NVidia is (rightly so) afraid of security issues in their drivers that could allow somebody to hack somebody's computer remotely with, say, a special TF2 map.

Linus' response:

  We'll see. I'm cautiously optimistic that this is a real
  shift in how Nvidia perceives Linux. The actual docs 
  released so far are fairly limited, and in themselves they 
  wouldn't be a big thing, but if Nvidia really does follow 
  up and start opening up more, that would certainly be great.

Why not include the rest? Or at least, the rest of what was mentioned in the article.

They've already been much better in the ARM SoC space than they were on the more traditional GPU side, and I really hope that some day I can just apologize for ever giving them the finger.

BTW, that difference comes from the different amount of proprietary IP in the respective fields.

There really is comparatively little "proprietary IP" on the kernel side of a GPU driver. In almost all cases, the kernel is responsible for things like setting modes, managing memory allocation/mapping between the GPU and CPU domains, and transmitting command buffers from userspace to the hardware. Where there is proprietary stuff, it's generally related to DRM (HDCP) or video playback, which aren't covered by the kernel DRM framework anyway and wouldn't prevent a free interoperable driver.

I mean, GLSL compilers and optimizers. And as time goes on, Nvidia is moving more of the fixed function out of silicon and into GPU software, which is effectively binary blobs that need to get uploaded to the GPU. This all seems like quite a bit of "proprietary IP".

There is a lot of proprietary IP in a modern GPU driver: https://news.ycombinator.com/item?id=6176498

Shrug; you're talking about reverse engineering hardware implementation from the details of the transferred data. Source access is hardly required for that, and the Nouveau driver we're talking about is an existence proof. And in any case all modern GPU manufacturers except NVIDIA ship GPL kernel modules and aren't leaking "proprietary IP" to each other. This isn't an issue of secret sauce in the kernel, sorry. NVIDIA just don't want to spend the effort to conform to someone else's architecture.

Even NVIDIA takes this approach on their mobile integrated GPUs on Tegra (GPL'd kernel driver).

I wish they'd do so on the desktop too. Having the kernel driver (which really does not need to be all that much code) be open source makes it massively easier to upgrade the kernel around the GPU driver.

As someone who wasn't aware of the issue with Optimus nVidia cards before I purchased my laptop, this is fantastic news. I get support thanks to the wonderful work of the bumblebee team but it's never really felt "solid" to me. However, this is most likely because of my inexperience with the subject.

I feel as though I represent a fairly decent number of people when I thank Valve for the steps being taken to make Linux better as I use it as my primary OS.

There's supposed to be native Optimus support in the latest drivers. Ubuntu is aiming to support it for 13.10 (and backport to LTS), see https://wiki.ubuntu.com/X/HybridGraphics

Let's hope it works out. There are still some showstopper bugs, like suspend not working and the touchpad locking up.

Bumblebee was always a stopgap and a kludge, and a lot of credit is due to the enterprising Bumblebee hackers.

I've yet to hear whether or not the new Optimus drivers reduce heat generation for multi-monitor setups (which are fixed to max power mode in Linux when more than 1 monitor is in use, resulting in more fan noise).

Would hope so, but never get much a reply from Nvidia's Linux board: https://devtalk.nvidia.com/default/board/98/

As it stands, I was forced to modify the VBIOS in order to undervolt the card and finally get the GPU fan in my laptop to shut up. Silence is golden.

I've yet to get bumblebee working on my laptop. Same system works fine under Windows. :(

If Nvidia could fix Optimus for other OSes, there might be more Linux users/gamers.

Aye, I chose my laptop (ASUS G55VW) mostly for its lack of Optimus (the Intel GPU is dead weight). I'm a Linux nut, but instability like that seen with Optimus (despite the amazing efforts of the Bumblebee team) is a no-go for me.

Hopefully, this new openness will lead to better general support for Nvidia cards, as it is there are distros (Arch for instance, my favorite) where Nouveau and Nvidia proprietary driver cannot detect my GPU properly.

As far as the timing of this move relative to the announcement of SteamOS is concerned, I'm having difficulty seeing the reasoning behind it. Wouldn't Nvidia just continue work on their proprietary Linux drivers, and wouldn't most gamers (who are unlikely to care about FOSS) simply use them? I imagine OEMs would include the best drivers available for the devices they choose to use, which at this point means proprietary.

Surely SteamOS will be using a proprietary driver build. There are reasons to desire DRM/KMS-based drivers, and Valve might have been pushing this to NVIDIA, but surely "open sourceness" isn't going to be a driving feature for SteamOS.

And in any case the NVIDIA proprietary userspace driver stack doesn't work with the Nouveau kernel driver anyway. Maybe this means there is hope that it will someday.

I can imagine that Valve engineers, used to their pretty open engine development practices, are starting to talk more with Nvidia hardware people and asking for a lot more improvements, improvements Nvidia could be starting to realise it might benefit from offloading to the open source world.

When Microsoft or Sony come knocking to have this conversation, they might be more prepared to put a specialist hardware team between the system developers and hardware makers, coupled with a very strict QA process.

This is pretty close to my interpretation. Basically I think that Valve has forced the hands of the graphics card makers to elevate the Linux platform to first class citizen status. Due to the poor integration of the Nvidia proprietary drivers with the Linux kernel, it seems like they have two options to get their drivers into a better state:

1. Hire a lot of kernel hackers

2. Try to get the open source driver maintainers to improve integration in a way that is closer to the actual hardware (without giving everything away), and then somehow tie their proprietary driver into that integration code.

The letter certainly doesn't express interest in improving the capabilities of the nouveau drivers to a point that they can compete with the proprietary drivers in terms of 3d performance, and I think that it would be prudent of the open source driver maintainers to be skeptical of Nvidia's motives given the timing and limited nature of the release.

I agree completely. I'd add:

3. Improve the relationship between NVIDIA and the open-source community.

If AMD were both much better on Linux (due to #2/#3) and "good enough" for a substantial portion of the users now on SteamOS, that would be a lot of lost revenue for NVIDIA.

Too bad that, as the target audience of Steam on Linux (mainly, a hacker who also plays games and doesn't buy Windows licenses), I can't go with either company's products.

Nvidia is direly amoral in their completely asinine hatred of FOSS and their inability to contribute. I won't support a business with that mentality. They either grow up and play ball in mesa or I don't give them a cent.

But AMD has terrible performance and instability in both drivers, and they still have proprietary firmware. It is like trying to have one's cake and eat it too: "hey look, we have FOSS drivers, except not!" is not good enough. The whole point of FOSS drivers is not hiding anything from users, and having binary firmware blobs that make their devices impossible to reimplement without reverse engineering defeats the purpose.

Then there is Intel, who just makes a foss driver as their only option and does it right. So my next build is using 4600 HD graphics and I'll just play minecraft and quake live rather than whatever shiny new shooter Valve brings to Linux.

Of course, Intel has it somewhat easier, considering their GPU+driver combo only needs to play minecraft and quake live.

NVIDIA's kernel module right now just pokes a hole the size of a truck and reams data in from userspace.

Valve is looking at using Wayland to power Steam, which means that NVIDIA basically need a proper DRM/KMS part in the kernel, and that part will likely be the DRM-side of nouveau.

My guess is that NVIDIA will contribute to the kernel parts of nouveau using open documentation to get patches accepted, while continuing to work on their closed-source userspace.

Ubuntu is moving to Mir instead of X11, and existing proprietary drivers will not work after the switch. It would benefit all parties if graphics support were not a deal-breaker for upgrading to 14.04, which is when I think Canonical plans to support mainly Mir. From what I've read, Canonical is actively working on getting this hardware support sorted out by talking to vendors. Maybe this is part of that effort?

Nvidia has already committed to supporting KMS etc in the future, Canonical definitely had a lot to do with that, and I'll bet as did Valve.

That's probably more closely related to the bad reception NVidia is getting from current Linux users, who have a less closed option nowadays.

It's almost certainly not gamers who complain about the lack of free drivers. It's people like me who want good performance but don't like the problems the proprietary drivers bring, or people concerned about the driver's security (everybody is paranoid now) and not wanting it in their datacenter.

Valve's had a surprisingly good track record with getting companies on-board with improving their code. They managed to get a lot of improvements into Apple's OpenGL implementation, which is something that Blizzard couldn't do for years.

This happens every two years.

Nvidia: we will help the open source community!

> delivers binary blob

nvidia: we will help the open source community with a better integrated driver

> delivers deb and rpm that inject binary blob in kernel

... what they are launching today is information that Nouveau would get anyway after a couple of weeks with the new cards. All this will do is get you the same crappy support, two weeks in advance.

This is a bit surprising. I was expecting an increased effort into their proprietary drivers due to Valve's Linux announcements, but not this.

Of course they could do both, but isn't Nouveau 3D performance so far behind the proprietary drivers that nobody is going to be using it for gaming anyway (at least for a long time)?

I guess it could be that they're just buying some goodwill from the general community in anticipation of working closer with it.

> but isn't Nouveau 3D performance so far behind the proprietary drivers that nobody is going to be using it for gaming anyway (at least for a long time)?

One of the main reasons for the lack of performance is clock/power management of the cards. The proprietary driver clocks up the GPU while nouveau does not. I would guess this is one of the reasons for why they asked for the power management documentation.

> isn't Nouveau 3D performance so far behind the proprietary drivers that nobody is going to be using it for gaming anyway (at least for a long time)?

Not really that bad. On Kepler+ cards it sucks, but I threw Nouveau on my old gtx 285 pc back before I realized Nvidia were anti-opensource assholes (tbh, just like Intel is with cpus) and it can run most games at 60 fps at 1080p as long as they are running at medium or so settings.

That has to be taken in context, though, since I was only running Minecraft, darkplaces, doomsday, flash games, and Humble Bundle games like Torchlight and Bastion on it for my mother to play. But you can enable reclocking, and it does run OK, though it is around half as fast as the proprietary driver in the same games at best.
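For anyone wanting to try reclocking: on newer kernels nouveau exposes it through debugfs, but the exact path and the level tokens vary by kernel version and chipset, so treat this as a sketch rather than a recipe:

```shell
# Inspect nouveau's performance levels via debugfs (path is kernel-version
# dependent; on older kernels this file does not exist at all).
out=$(cat /sys/kernel/debug/dri/0/pstate 2>/dev/null \
      || echo "pstate interface not available here")
echo "$out"

# To force the highest listed level (for example "0f"), as root:
# echo 0f > /sys/kernel/debug/dri/0/pstate
```

Without reclocking the card stays at its boot clocks, which is a big part of the performance gap against the proprietary driver.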

You're making the assumption that the problems with performance are caused by many small issues. It could very easily be the case that a couple medium-sized changes will drastically improve performance.

This is pure conjecture on both our parts, of course. I was simply taking issue with your prediction of Nouveau not being useful for a 'long time'. It almost sounded like you were suggesting this was a useless gesture on Nvidia's part.

I don't think it's a useless gesture, just that it seems more like a PR move directed at the community than directly related to SteamOS.

"NVIDIA is releasing public documentation on certain aspects of our GPUs, with the intent to address areas that impact the out-of-the-box usability of NVIDIA GPUs with Nouveau."

Seems like they just want Nouveau to work better during normal desktop usage. I doubt they plan to release enough information to make them competitive with their proprietary drivers.

you don't suppose this came about because Valve announced SteamOS?

edit: noticed it's in the article.

Just came here to make the same observation; I think this move is two-fold:

  1. We are doing this anyway because of Steam OS.
  2. Might as well get good community karma for it.
Win-win for us I suppose, another viable entertainment platform to compete with Google/Amazon/Microsoft/Apple.

Valve is one company which has been developing a distro internally. nVidia is another company which has been working on polishing up some documents in order to release them on their FTP site. Yesterday, both companies stepped forward and announced what they've been up to lately. Neither of them has noted any relevance at all to the other's developments, and (as an outsider) it seems unreasonable to speculate on such a link based on timing alone. SteamOS is unlikely to ever use nouveau, and nVidia is even more unlikely to consider SteamOS a reason to release documents for nouveau. Many more projects are ongoing all over the world, and any two of them could at some point make announcements on the same day.

>nVidia is another company which has been working on polishing up some documents in order to release them on their FTP site. //

You know this for a fact? Couldn't the announcement have been made now to gain traction in the Linux community and push Nvidia forward as the graphics company for Linux?

It surprises me a bit. To be honest, I'd have thought Nvidia would need to document their hardware for internal purposes, and that the detail already in the internal documentation would be more than sufficient for the Nouveau team to use. But I'm an outsider to GPU engineering, and programming in general really, so a misapprehension is quite likely.

>Many more projects are ongoing all over the world and any two of them could at some point make announcements on the same day.

Indeed but you'd expect Nvidia, or similar, to be manoeuvring themselves to align with Steam's announcement.

If this were about PR it wouldn't be a post to a technical mailing list about a single document being dropped and mention of work going on to provide more (but unlikely all) docs in the future.

Just because two things happen on the same day (or within a week of each other, etc) doesn't mean they're closely connected.

Having worked closely with silicon vendors in the past: the situation where there are no internal docs, or where the docs are horribly outdated or incorrect compared to the HDL or source code, is pretty common. When the only consumer of the docs is an internal team, it's not uncommon for them to be nowhere near what an external team would expect or need (or what a company would be willing to release without serious cleanup and checking).

>If this were about PR it wouldn't be a post to a technical mailing list //

You're probably right.

That said, your view of PR seems naive. It looks far better when someone on social networks discovers that CompanyN was "already" working on something than when there's an official press release. The channel used here (mailing list to social networks) doesn't tell us much, IMO.

>it's not uncommon for those docs to be nowhere near what an external team would expect or need //

You think they'd be worse than the nothing that is currently provided by many hardware manufacturers?

Canonical has been courting GPU manufacturers (AMD, Nvidia, Intel) for quite some time, so they can definitely take some credit for Nvidia's recent forthcoming behavior.

One has to wonder about Canonical and Intel falling out recently.

There was a thread on Reddit a while back where a few users said that Nvidia is not that great at documenting its hardware internally.

"Nope. One of the reasons is that NVIDIA still doesn't document anything. A friend of mine worked there, and he told me that most of the knowledge was gathered by having little gatherings with the "village elders" as they were called. Seriously. He even leaked a humorous internal video to me called "Zero Documentation" in the style of Zero Punctuation."


IMO there's one primary reason Nvidia does not release open source drivers, and never will, despite whatever they say now. It has nothing to do with lawyers, or protecting hardware secrets - that's all just smokescreen to obscure the real reason.

If open source drivers were available, it would be possible to port the low level code to any operating system environment anyone wanted to. And that means future, experimental operating systems and GUIs too. Effectively this would enable OS/GUI innovation, allowing radical new 3D graphics based UIs, escaping the Microsoft Windows, Linux and Apple trap.

Considering the dismal state of UIs now (Windows 8, cough, say no more) you can imagine what the appearance of a well designed, sensible and user-enabling OS/GUI in the next few years would do to Microsoft.

Open source 3D drivers would definitely result in a quite rapid overturn of the present OS monopoly-by-three applecart. So the powers that be in the personal computer market are going to allow it ... over their dead bodies.

Incidentally, 3Dfx did make the full source code for their Glide drivers available (for money). I worked at a company that bought the 3Dfx drivers, and I personally ported them (they targeted Windows and Linux) to a minimal MIPS-based platform intended for gambling machines. It worked and passed all the test code. Then the company went through a "local CEO was ripping off the company, sack him and kill all his projects" spasm, and the machine never got to market. Soon after that, 3Dfx was deep-sixed and their patents sucked into the Nvidia pool.

Personally I've always suspected those events may have had something to do with certain parties making sure an open source 3D engine never happened - precisely because it would be a threat to the OS status quo.

Here's the relevant part of the video that the picture is from:


Direct link to thread on lists.freedesktop.org:


Skeptical panda is very skeptical. If these open source drivers perform as well as their proprietary ones I will eat a whole crow pie.

My guess is this has something to do with SteamOS and the recent rise in gaming on Linux.

Finally! I have an image of Linus flipping the bird in my image library :)

Still refusing to buy anything with Nvidia in it for my personal use.

My akmods in fedora broke a few days ago, and thus I am still on the 3.10 kernel.

Great! Looking forward to seeing this progress.

