
The Truth on OpenGL Driver Quality - jcastro
http://richg42.blogspot.com/2014/05/the-truth-on-opengl-driver-quality.html
======
vernie
The write-up by the Dolphin team is also worth reading: [https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and...](https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame/)

------
overgard
Not a vendor per se, but since they pretty much prevent you from installing
any updated drivers, I'll also say: Apple's OpenGL support is horrid. Like,
buggy, entirely behind the times, incredibly slow. I tend to reboot my Mac
into Windows whenever I need to do actual graphics coding, which is a shame.

~~~
ekianjo
> Apple's OpenGL support is horrid.

Funny, Macs still get a lot more game ports than Linux so far (while Linux
is progressively getting there).

~~~
overgard
Well, sure, but I don't see what that proves. Game developers find a lot of
hacks to work around bad software if there's a market.

~~~
ekianjo
Well, I see a lot of developers bitching about how hard it is to port stuff
to Linux, but I don't recall so much bad-mouthing when it came to porting
stuff to the Mac in the first place.

~~~
Freaky
A friend of mine works on ports for both platforms, and he _vastly_ prefers
the graphics driver situation on Linux, both in terms of driver quality and
vendor support.

------
ZenoArrow
So... Vendor A - NVidia, Vendor B - AMD, Vendor C - Intel. What are the two
Intel drivers?

~~~
Spidler
Windows and Linux drivers.

~~~
zdw
I'd assume:

    Vendor C #1 = Intel on Linux
    Vendor C #2 = Intel on Windows

Given the "open source wiz kids to keep driver #1 plodding forward" and "GL on
this platform is totally a second class citizen" comments in each description.

~~~
stormbrew
I can't imagine anyone taking it up, as it'd be a monstrous project, but it'd
be interesting to see someone port the Linux OSS Intel drivers to Windows
and/or OSX.

~~~
datenwolf
You'd need access to the Windows OpenGL ICD DDK, which is one of the few
things Microsoft doesn't hand out freely without signing an NDA (unless the
situation has somehow changed within the past few years).

OTOH the ICD API should be easy enough to reverse engineer. The registry keys
where the OpenGL ICD is registered are well known, and there are plenty of
drivers in plenty of versions you can dissect to learn how to talk "ICD".

------
Vektorweg
I think the greatest problem is still OpenGL itself. Vendors have a hard time
implementing it and the other Khronos specs.

~~~
overgard
I don't know why you were downvoted for this. The OpenGL spec is a mess. I
think a lot of the driver issues arise from the fact that OpenGL carries so
much backwards compatibility and so much complexity.

~~~
vitd
Not to mention how obtuse it is to learn. Even without the backwards
compatibility issues (which are very real), the entire mental model of how
OpenGL works is completely messed up.

Things like having a texture ID, which you then bind to a particular target on
a particular texture unit in order to use it, make so little sense to someone
learning it for the first time. On the CPU, I just pass a pointer to my
image to a function to manipulate it. I don't have to put it into a special
slot of a special structure in a special place in memory! It took me years to
understand many of these things, and I see others struggling in exactly the
same way on Stack Overflow, for example. So sad.
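
The bind-to-edit model being complained about can be sketched as a state
machine. This is an illustrative Python simulation whose names merely mirror
the GL calls (it is not real OpenGL), showing how an upload function never
names the texture it modifies:

```python
# Illustrative simulation of OpenGL's bind-to-edit texture model.
# The method names mirror the GL API, but this is plain Python, not GL.

class GLState:
    def __init__(self, num_units=16):
        self.textures = {}      # texture id -> pixel data
        self.next_id = 1
        self.active_unit = 0
        # each texture unit has its own binding per target, e.g. "TEXTURE_2D"
        self.bindings = [{} for _ in range(num_units)]

    def gen_texture(self):
        tex_id = self.next_id
        self.next_id += 1
        self.textures[tex_id] = None
        return tex_id

    def active_texture(self, unit):
        self.active_unit = unit

    def bind_texture(self, target, tex_id):
        # binding selects the texture both for drawing *and* for editing
        self.bindings[self.active_unit][target] = tex_id

    def tex_image_2d(self, target, pixels):
        # note: no texture id argument -- it modifies whatever is bound!
        bound = self.bindings[self.active_unit][target]
        self.textures[bound] = pixels

gl = GLState()
tex = gl.gen_texture()
gl.active_texture(0)
gl.bind_texture("TEXTURE_2D", tex)
gl.tex_image_2d("TEXTURE_2D", [255, 0, 0])  # uploads to `tex` implicitly
```

The upload goes through the current unit's current binding rather than a
handle you hold, which is exactly the indirection the comment above finds so
confusing.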

~~~
overgard
Yeah, the API is basically insane. Not to mention how many "best" practices
aren't. For instance, I remember a few years ago the advice was always "use
VBOs, don't use display lists, they're deprecated". OK, but when I benchmarked
them against each other, display lists were still twice as fast as very
carefully constructed VBOs for the geometry I was rendering. Wtf.

~~~
orbifold
If I'm not mistaken there are hardware reasons why VBOs will never be as fast
as other methods; at least that is what I dimly recall hearing in a talk by a
guy at Valve / Nvidia.

------
clarry
It's not just affecting 3D applications that need OpenGL though. I find it sad
that we have no modern standard akin to VESA / VBE -- to get any reasonable
graphics at all at your native display resolution, you have to live with an
incredibly complicated (and unreliable) graphics stack. Purely CPU based
rendering (which was fast enough in the 90s) is no longer a choice really.

~~~
rasz_pl
SDL

~~~
euank
To expound on this comment: SDL1 defaults to fully software-rendered output.
You can hack in OpenGL hardware rendering, but it's fairly common for SDL1
games and such to be fully software rendered and run fine.

SDL2, in general, is hardware rendered.

SDL1 is a good counter-example to "software rendering isn't quick enough
anymore". You can make perfectly performant 2D games with SDL1's software
rendering... 3D, not so much.

~~~
tagrun
People usually use SDL1 _with_ OpenGL, and I don't understand what part of
that is a "hack". Is using OpenGL with a manually created context and tens of
platform ifdefs less hacky? Or is EGL/glut/<your favorite wrapper> the one
true way of doing things?

~~~
euank
It's not a hack in the sense of being held together by tape, precarious, or
any such thing; it's a hack because it's going around the back of SDL to do
the graphics, even though SDL is supposed to do all the graphics for you.

Perhaps my wording was poor. It's more of a hack than in SDL2, where all the
SDL functions support hardware rendering with no need to ever touch OpenGL
directly.

The general point still stands that you can write performant software
rendering in SDL1.
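
At its core, the kind of 2D software rendering being defended here is just
copying sprite pixel rows into a framebuffer. A minimal illustrative sketch
in Python (not the SDL API; `make_surface` and `blit` are made-up names
standing in for an SDL1-style software blit):

```python
# A toy software blitter: surfaces are rows of pixels, and blitting is
# copying one surface's rows into another with bounds clipping. Real SDL1
# does the same thing in optimized C against a framebuffer.

def make_surface(w, h, fill=0):
    """Create a w x h surface filled with a single pixel value."""
    return [[fill] * w for _ in range(h)]

def blit(src, dst, x, y):
    """Copy surface `src` onto `dst` at (x, y), clipping to dst bounds."""
    for row, src_row in enumerate(src):
        dy = y + row
        if 0 <= dy < len(dst):
            for col, pixel in enumerate(src_row):
                dx = x + col
                if 0 <= dx < len(dst[0]):
                    dst[dy][dx] = pixel

framebuffer = make_surface(8, 8)
sprite = make_surface(2, 2, fill=7)
blit(sprite, framebuffer, 3, 3)   # sprite lands at rows 3-4, cols 3-4
```

For a 2D game this inner loop runs over a few hundred thousand pixels per
frame, which is well within reach of a modern CPU; a 3D scene multiplies the
per-pixel work enormously, which is why the same approach stops scaling.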

------
svdree
This guy should try some OpenGL-ES development on Android. Suddenly vendors A,
B and C don't sound so bad anymore...

------
izacus
Most of those things pretty much stem from the fact that OpenGL is nowhere
near a priority for those companies -- D3D game performance in benchmarks is
what sells the accelerators. Additional features like HW video encoders come
second.

OGL support is far behind.

------
mschuster91
I'm completely missing the mobile space in this write-up, but I bet the
situation there is even worse than ATI's (there's no way to update the GPU
drivers, short of a full OS upgrade on Android).

~~~
simcop2387
It absolutely is, from what I've heard. There are devices where the Mali GPU
works well in some games/apps with one version of the OS and is severely
broken in a newer version, because the driver broke a lot of things to make
other apps work.

It's one of the big reasons I'm hoping that, with efforts like the Raspberry
Pi's GPU driver being open-sourced, we'll start to see some more consistent
drivers for mobile chipsets.

~~~
mschuster91
I fear more a world like DOS/Win 3.11-era games, where games ship their own
drivers, like it used to be done for sound cards back in the old days...

~~~
mrec
Or move the whole rendering to a server in the cloud, where any awfulness is
at least going to be consistent and predictable. This is what NV were (are?)
pushing as "Grid", though I haven't heard all that much about it. They _say_
they can keep latency at acceptable levels, but it strains credulity a little
bit.

~~~
yuhong
I think it is a real solution to the cheating problem too, far better than say
VAC:
[http://games.slashdot.org/comments.pl?sid=3105&cid=1442601](http://games.slashdot.org/comments.pl?sid=3105&cid=1442601)

~~~
mrec
Yeah. Piracy too.

------
ksec
I have been wondering about this for a long time. The biggest cost in GPU
development isn't the hardware itself but the drivers. Nvidia famously said
they have far more software engineers than hardware people. And as history
tells us, having great hardware on paper means nothing when your drivers
aren't up to standard (S3, PowerVR on desktop). Hence the smaller GPU makers
were forced out because they didn't have the resources to compete on the
software front.

And yet over the decades nothing has improved. When browsers wanted to do
hardware acceleration, many laptop GPUs were blacklisted simply because they
don't get any driver updates. The situation is much better on the Mac because
the drivers and the testing happen to be done by the same people.

There are people who wanted the GPU to be just another CPU (Intel's
Larrabee), but none of them have succeeded.

I would have thought that with GPU IP being licensed around, driver quality
being in the hands of the vendor would have improved the situation a bit.
However, it seems no one wants to invest in it.

So do we have no solution for this? Rumor has it Apple is designing its own
GPU. Maybe they are tackling the driver problem themselves by getting rid of
it?

~~~
anon4
Larrabee did succeed - it just got rebranded as Xeon Phi and is sold as a
high-performance computing PCI card.

~~~
pjmlp
Not really, at least for those of us who sat in on an Intel talk at GDC
Europe back in 2009 about how Larrabee would revolutionize graphics
programming.

------
rwbt
ATI drivers are horrible on the Mac OS X platform too, even though Apple
controls all the drivers on the platform. Intel's GPUs are slow, but their
drivers are very stable and work most of the time.

~~~
mwfunk
That's actually not true; the NVIDIA and ATI drivers are maintained by the
vendors.

~~~
duaneb
Then how come you can't manually install driver updates? Because Apple
controls the release channels and public facing interface.

~~~
Audiophilip
You can install Nvidia drivers on your own for OS X, look up "Nvidia Web
Driver". Here's an example:
[http://www.nvidia.com/download/driverResults.aspx/73628/en-u...](http://www.nvidia.com/download/driverResults.aspx/73628/en-us)

~~~
rwbt
AFAIK, that driver only applies to CUDA and other proprietary Nvidia
platforms, not necessarily OpenGL.

~~~
Audiophilip
I think you misunderstood something. If you read the info text for the driver
on the linked page, it explicitly mentions that you need a _separate_ driver
for CUDA.

GPU driver:
[http://www.nvidia.com/download/driverResults.aspx/73628/en-u...](http://www.nvidia.com/download/driverResults.aspx/73628/en-us)
Additional CUDA driver: [http://www.nvidia.com/object/macosx-cuda-6.0.37-driver.html](http://www.nvidia.com/object/macosx-cuda-6.0.37-driver.html)

------
dkarapetyan
Can someone explain why the drivers are closed source? Is it the fear that the
hardware could be reverse engineered from just the driver?

~~~
wtracy
That, and sometimes device A is x% faster than competing device B because the
_driver_ for device A is faster, not because of the silicon. Open that up, and
you give away your competitive advantage.

Also, there are always rumors about $LOW_COST_PRODUCT being the same silicon
as $HIGH_COST_PRODUCT from the same manufacturer, just with a few features
turned off in the driver. This probably isn't true, but nobody can rule it
out.

~~~
mschuster91
Rumors? Uh, no. NVIDIA is known for this
([http://www.tomshardware.com/news/Nvidia-GTX-680-GTX-770-BIOS...](http://www.tomshardware.com/news/Nvidia-GTX-680-GTX-770-BIOS-Flash-Hack,22561.html)).
IIRC one could also flash a consumer-grade card with a Tesla BIOS and
"convert" a couple-hundred-dollar card into a thousand-dollar card.

~~~
Crito
While technically the same hardware, isn't it the case that the "higher end"
products are the units off the production line that met a higher QA bar?

That's my understanding of how it works for CPUs. A four-core CPU with
questionable functionality on the fourth core may be sold as a "3-core" CPU.
Depending on how "questionable" the fourth core was, though, it might be
possible to use it anyway.

~~~
mden
I don't think production quality is the only reason. Sometimes it's cheaper to
design one product and sell it as multiple products to capture more value. For
example, some people are willing to pay $1k for a graphics card while others
are only willing to pay $300. So you sell the same graphics card to both
groups, but for one you artificially lower its capabilities. This lets you
capture the market a lot more efficiently than selling your product at either
a low price or a high price.

~~~
corysama
With Tesla cards, you get a professional-oriented driver that is good for Maya
and CUDA, but not optimized for games. There are also a few hardware features
that are important for pros but not an issue for gamers: stuff like ECC RAM,
double-precision floating point, and better handling of multiple 3D viewports.

But, it's my understanding that most of what you pay for when you buy a Tesla
card is support. If you call Nvidia saying Maya has a driver problem with your
Tesla card, they will pay attention. If Maya has a problem running on a
GeForce card, they will direct you to the forums.

------
AshleysBrain
In my experience of writing a GL Windows desktop app alone, driver bugs have
been the #1 cause of stability problems for real end users. It's a complete
nightmare, with pretty much every manufacturer. I guess that's what happens
when hardware companies need to write software.

~~~
kevingadd
Have you considered shipping a D3D rendering backend for your Windows clients?
Or is it too much duplication?

~~~
AshleysBrain
There's no time for that at a startup, and we had previously used D3D and hit
other annoying problems (horrible text rendering, annoying redist issues,
etc.), so it just had its own set of problems.

------
veesahni
Is it practical to create a driver abstraction that masks cross vendor issues
and provides a consistent interface to the dev? Like what jquery did for
browsers.

~~~
exDM69
> Is it practical to create a driver abstraction that masks cross vendor
> issues and provides a consistent interface to the dev? Like what jquery did
> for browsers.

Yes and no. The situation with OpenGL and drivers is quite different from that
of browsers and jQuery.

OpenGL drivers are generally quite good at implementing the API as it is
specified, and this is tested with a huge bunch of conformance suites. It's
not like ancient browsers, where one vendor's understanding of the CSS box
model differed from the others'.

However, OpenGL has a huge number of different versions; some require hardware
support (major versions, like GL 4.x vs. 3.x), while others might be
software-only additions (minor versions, GL 3.2 to 3.3). And then there are
lots of API extensions that may or may not be available. This is a relatively
simple problem to solve, and we have tools like Regal and ANGLE to patch over
the little things.
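
The "relatively simple" part of the problem looks roughly like this: query
what the driver exposes once, then gate code paths on it. An illustrative
Python sketch (the function names are made up for this example; in real GL
the extension list comes from `glGetString`/`glGetStringi`, though the
extension names used below are real ones):

```python
# Runtime capability checking: parse the advertised extension list once,
# then pick the best supported code path with graceful fallback.

def parse_extensions(ext_string):
    """GL classically reports extensions as one space-separated string."""
    return set(ext_string.split())

def pick_upload_path(extensions):
    # prefer newer functionality, fall back when it's absent
    if "GL_ARB_buffer_storage" in extensions:
        return "persistent-mapped buffers"
    if "GL_ARB_vertex_buffer_object" in extensions:
        return "classic VBOs"
    return "client-side arrays"

exts = parse_extensions("GL_ARB_vertex_buffer_object GL_EXT_texture_sRGB")
path = pick_upload_path(exts)  # falls back to "classic VBOs" here
```

Wrappers like Regal essentially automate this kind of dispatch. What no
amount of such checking can catch is the case below: the driver advertises a
feature, accepts the calls, and then renders garbage.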

But the real problem is functional bugs: incorrect pixels on the screen,
ranging from a minor annoyance to a completely destroyed image, and more
severe issues like application crashes, a completely corrupted
display/desktop, blue screens, and kernel hangs. This is something no
abstraction layer can fix.

------
programminggeek
I assume driver incompatibility and things of this nature are one reason why
so many games are built on engines like Unreal, Crytek, Unity, etc.? At some
point you need to be working on shipping a game, not writing driver
compatibility code that a framework could/should handle for you.

------
lolo_
Not to stray too off topic but what does TDR stand for? :)

~~~
klodolph
Apparently it stands for Timeout Detection and Recovery: a Windows feature
that detects that a video card is unresponsive and resets it.

[http://nvidia.custhelp.com/app/answers/detail/a_id/3335/~/td...](http://nvidia.custhelp.com/app/answers/detail/a_id/3335/~/tdr-\(timeout-detection-and-recovery\)-and-collecting-dump-files)

------
mattgreenrocks
What's horrible is I've done almost zero OpenGL dev and I could pick these out
_immediately_ based on anecdotal experiences with drivers on Windows. Vendor B
might be getting better (supposedly), but I still have a tendency to shy away
from their drivers (thus, them) due to past experiences.

~~~
anon4
Well, let me put it like this: I bought a Radeon 9600 back in 2003, when ATi
still existed as an independent company. In the time since, I haven't seen
enough improvement to justify buying anything other than nVidia.

To be fair, they are better - I haven't lately heard of them crashing the
kernel so hard that the Alt-SysRq combos don't work.

------
ah-
Any idea whether the Intel OS X driver is related to the Windows one, or also
completely different?

------
MichaelGG
What do they mean by nVidia embedding developers on teams? Like covertly
getting nVidia employees hired by game teams to sabotage non-nVidia?

~~~
mcpherrinm
Not covert at all. Nvidia engineers come to your office and work on your code
for you. They know more of the secret sauce and have contacts inside their
company, so they can help you get your performance up on their chips.

~~~
nitrogen
So, they're Field Application Engineers, basically?

------
dman
Any ideas on whether Nvidia and Intel will support OpenCL 1.2 on their GPUs on
Linux anytime soon?

~~~
sharpneli
Even though this is off-topic, I'll answer: not going to happen.

Nvidia hates OpenCL because it eats into their CUDA business, and Intel
practices market segmentation: their Linux implementation supports only the
CPU and Xeon Phi, while the Windows one supports the GPU. They don't want
server builders to purchase their desktop chips with GPUs; they want them to
purchase the massively expensive Xeon Phi and use normal Xeon CPUs.

------
Someone
"you can _git_ your hands dirty and try to fix it and submit a patch."

Typo or funny wordplay?

~~~
awalton
Intel's OSS driver is maintained in git.

So, wordplay.

