(This is also why DRM never works; the type of person that could implement it correctly wouldn't want to implement it correctly.)
Also, nobody really cares if they work on proprietary code. Such developers are rare.
A good middle ground seems to be doing what GitHub does: you keep working on your closed projects using open-source technologies and methodologies.
Admittedly, it does a lot more than that as well, and I love it for what it is, but it couldn't really exist in its current form if it weren't for open source.
EDIT: In addition, almost all of the public code on GitHub is open source, and they have a bunch of open source repositories themselves. https://github.com/github
I think this open-source-like way of operating makes their great coders happy to work on a closed source project.
There are of course many brilliant programmers who are not into open source (like Philip Zimmermann), I don't deny it.
A few people off the top of my head who are noted for being good programmers but whose code, I think, is mostly not publicly available: Peter Norvig (though I doubt he's writing much code nowadays), Paul Graham, Sophie Wilson, Charles Simonyi (though it'll be a while before he's forgiven for "Hungarian notation").
(I should add in the interests of fairness that while I was typing that list my brain was angrily trying to remind me of Jamie Zawinski and Don Knuth, both of whom are much more open-source types.)
Yes, that was my point. I'm not saying there are no good closed-source programmers, just that it's much easier to become known and respected as a good programmer by writing free software.
Anyway, John Carmack.
Commander Keen, Wolfenstein, Doom and Quake were all absolutely ground-breaking. Probably ditto for the megatexture stuff in Rage.
It's fun to hate on Zuckerberg, because it's Facebook. But the guy produced brilliant stuff that was ahead of its time.
I agree with you about Windows but note that outside of the U.S. very few software engineers ever even get to work with OS X.
Some of the people used Linux in VMs; on the other hand, some new students installed XP on their notebooks. In 2012.
And I think this was even a rather progressive lab, I suppose industry only uses Windows.
The majority of developers (in Europe or elsewhere) work for large companies and are not the kind to go to software conferences. Note that this doesn't mean they write CRUD; I know people who do amazing GPGPU stuff that will never see the light of day outside their BigCo's internal use. They are the ones Nvidia should be targeting with their drivers.
I've yet to meet one such developer who'd use a Mac for work. It's my anecdote vs. yours at this point, though, so I might be wrong about Mac users being a vanishingly small minority in this group (though likely not by a huge margin, given how the corporate world is).
Edit: It is also quite possible that not all of the developers who present at conferences with Macs use those at work if their workplace isn't BYOD.
But I think you are correct that the majority of development work for business is in the MS world. At least that is the case if you look for work - everything is .NET.
For a long time, the Nvidia proprietary driver has been the best GPU driver for Linux - not that it has been super good, but the competition has been really awful. Only in recent years has there been any competition if you want a GPU with good 3D features. Intel has had decent 3D only in their latest few generations of integrated GPUs, and ATI drivers have ranged from not working at all to barely usable.
That being said, this is a very welcome change and will hopefully make the nouveau drivers better in the long run. In particular, this is good news for the common Linux graphics architecture (DRI/DRM, etc.) and Wayland.
Nvidia would be the way to go for professional applications on Linux (think 3D modelling and such).
Also stuff like this http://www.nvidia.co.uk/page/gelato.html
Microsoft realized this a long time ago, and so did Apple (starting with iOS only). It took Google even more years after that to finally put 2 and 2 together, and realize that the reason why so many still choose iOS over Android is because of gaming.
Same goes for Facebook with Farmville, and so on. So yeah, if you want a very successful platform/OS, definitely focus on gaming as much as you can. It's what brings the "mainstream" in.
This is a common misconception nowadays, but if you look at the early keynotes, Jobs was more dismayed than thrilled by his baby being co-opted by games. He resigned himself to it more than realized it.
It just happens that — on iOS anyway — gaming concerns and Apple concerns align in that both want a good equilibrium between CPU and GPU rather than a CPU-focused device.
Linux exposes a lot of options that Windows and OS X will not.
Imagine if these particular gamers switch to Linux. They could significantly improve the platform by trying everything they can for that extra performance.
The improvements could have far-reaching consequences, possibly even helping people who solve hard problems (HPC). It is not out of the realm of possibility.
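As a concrete illustration of the kind of knobs Linux exposes, here are a few tuning commands of the sort performance-hungry gamers poke at (a hedged sketch: the exact sysfs paths, available schedulers, and governor names vary by kernel version, distribution, and hardware, and the PowerMizer attribute assumes Nvidia's proprietary driver and its nvidia-settings utility are installed):

```shell
# Switch the CPU frequency governor to "performance" for core 0
# (repeat per core; path may differ on your kernel)
echo performance | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

# Pick a different I/O scheduler for the primary disk
# (available choices are listed when you read this file)
cat /sys/block/sda/queue/scheduler
echo deadline | sudo tee /sys/block/sda/queue/scheduler

# Pin the proprietary Nvidia driver to its maximum-performance PowerMizer mode
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"
```

None of these have a supported equivalent a user can reach on Windows or OS X without third-party tools, which is the point: the knobs are there for anyone willing to experiment.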
That's not the motivation. The motivation is the ease of transition that is finally coming to fruition. Linux used to have larger barriers in place for a layman.
It's very cool that they're starting to drop docs to support the open drivers, but I suspect it's going to be a long road from these small steps to the open drivers being fully competitive with the closed ones...
I think that's only true when it comes to 3D drivers. They wrote the 2D nv driver.
I do remember people complaining it was full of magic numbers to obfuscate what it was doing but it was indeed open source and written by nvidia.
It is not intended as a decent 2D-only driver or anything like that.
Try getting Optimus working on a laptop under Linux. I have an Nvidia video card in my Dell that's basically a paperweight unless I run Windows.
I hadn't seen any possibly-relevant follow-ups until now though. Here's hoping!
In the key areas where the documentation will be needed and have an impact - power management and clocking, the 3D engine, etc. - Nvidia isn't promising anything. If they hired a bunch of OSS developers and promised to eventually release all docs, like AMD did, that would be news.
So no Nvidia - I am still not buying anything you sell.
FWIW, most of the Nvidia proprietary driver runs in a user-space library anyway. It would be a really bad idea to have your OpenGL implementation and other high-level facilities running in kernel space.
There is one aspect which really could gain from this: the eventual reduction of the harm from the "3/11 rule" fallout.
If and when Nvidia's hardware works acceptably well out of the box on Linux systems, there will be less need for annoyed forum postings and endless "help, please" threads. AFAIK Google weighs recently updated pages more heavily than older ones, so the negative PR hits from badly behaving hardware should fizzle and die out in time.
From a purely personal point of view, a unified DRM/KMS access scheme will be nice with Wayland in mind.
Especially after the NSA scandal, we should really demand open source firmware from hardware manufacturers with every chance we get.
While this is arguably true for most hardware, GPUs are different from mundane hardware like ethernet adapters. The driver source code would reveal secrets about the hardware architecture, and the architecture changes with every generation. The advantage is not in the software itself; it is in not showing the cards you are playing with.
In theory, it would be possible to give out the source code for the video drivers, but it would not make sense to give out the source for chips still in development. At best, you could get source-code dumps after a new chip has rolled out. This is not really open source.
Yes, I want quality open source video drivers but on the other hand I want bleeding edge hardware and competition that drives hardware development.
Disclaimer: I write GPU drivers for a living and I'm an open source enthusiast.
This will allow them to play nice with the kernel, support previous products while taking bug reports and patches, and help game engine designers target 90% of nVidia's user base without signing an NDA.
Proprietary drivers usually are the result of licensed code that can't be open sourced or IP concerns (giving away the "secret sauce", patent worries, etc).
Provided it can't be updated behind your back, I don't see proprietary firmware as particularly harmful in itself -- it'd be nice if it were open and made the device less of a black box, but from the POV of the driver or user, there's not really much difference between a device with an embedded CPU running some firmware and a device implementing the same functionality in the ASIC.
For example, if Nvidia decides to focus and invest all their effort solely into Mir and Unity support for their closed drivers, thus giving everybody who backs Wayland the shaft in the process, it's going to end up being a major problem for developers like myself. Philosophically and pragmatically.
If they pull an AMD, except they do it right (i.e., just deprecate the proprietary driver for enterprise compute usage and focus entirely on making Gallium great), they have my business forever.
I'm thinking of getting a Tegra Note next month, on the premise that at least Nvidia provides open source Tegra drivers through a third party, so the platform is hackable. The new Nexus 7 is a disgrace, since smart people have to waste their time reverse engineering the GPU in the Snapdragon because Qualcomm is being a dick.
That Thumbs.db pretty much says it all :)