
Crashes, Hangs and Crazy Images by Adding Zero: Fuzzing OpenGL Shader Compilers - ingve
https://medium.com/@afd_icl/crashes-hangs-and-crazy-images-by-adding-zero-689d15ce922b
======
londons_explore
Exploits in shader compilers will likely be a big avenue for exploiting
platforms in the future because basically all desktop systems use Nvidia, and
an exploit of the shader compiler can usually get you direct root access.

~~~
pcwalton
On the Mac it's worse: the drivers are not NVIDIA's (which are relatively high
quality) but instead Apple's own, and they have become horrendously bad in the
past few years. I've lost count of the instances of just plain broken
functionality and hard system crashes I've had to deal with doing graphics
programming with Apple drivers, including a random out-of-bounds memory read
(security sensitive? who knows!) that was particularly fun.

As far as I can tell, Apple has set a new record for how quickly you can go
from the best OpenGL drivers in the industry to the worst.

~~~
outworlder
Well, this is likely due to the refocus on "Metal". Which is just DirectX _de
novo_.

It's not only bugs, it's the lack of features. It's out of date, to the point
that companies are no longer releasing games for it. For instance, Frontier
has an Elite: Dangerous version for the Mac, but it doesn't have the
expansions, due to the lack of compute shaders, which are not supported by
Apple's crappy OpenGL implementation.

Not that Macs are good gaming machines. But this only compounds the problem.

~~~
pcwalton
> Which is just DirectX _de novo_.

Except without the market share in high end gaming/graphics that made D3D
viable.

I wish they would realize that the people they're hurting with this decision
are not their competitors (who are actually probably happy to see them
underinvest in good GL drivers) but rather us developers, who do have to
develop cross-platform apps for market share reasons and so end up shouldering
the costs of needless fragmentation.

------
dekhn
I should mention that I once spoke to the head engineer for Nvidia graphics
cards, and somebody asked him what happens if you try to render a texture
filled with NaNs (a common result of computing 0/0). He said it renders as
the color "Nvidia green".
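For reference, IEEE 754 makes NaNs easy to manufacture in shader-style arithmetic, and every comparison against NaN is false, which is why a driver ends up having to pick some arbitrary color for them. A quick Python sketch of the usual sources (not anything from the article):

```python
import math

# Typical ways a shader ends up with NaN texels (IEEE 754 semantics):
nan_a = math.inf - math.inf   # inf - inf -> NaN
nan_b = math.inf * 0.0        # inf * 0   -> NaN
# (0.0 / 0.0 also yields NaN in IEEE arithmetic; Python raises instead.)

# NaN compares false against everything, including itself, so ordinary
# clamping logic like "if x < 0: x = 0" silently passes NaN through --
# the driver has to pick *some* color for it.
assert math.isnan(nan_a) and math.isnan(nan_b)
assert not (nan_a < 0.0) and not (nan_a > 0.0)
assert nan_a != nan_a
```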

~~~
frozenport
They look black to me...

~~~
SolarNet
Are you rendering natively or through ANGLE? OpenGL or DirectX? Which kind of
NaNs?

~~~
frozenport
Native OpenGL 4.3, on a 970M. The NaNs are qNaNs, part of a quad texture.
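For anyone unsure about the qNaN/sNaN distinction: in IEEE 754 binary32 a NaN has all exponent bits set and a nonzero mantissa, and the mantissa's most significant bit is the "quiet" bit. A small Python sketch over the raw bit patterns (bits only, since round-tripping an sNaN through a host float can quiet it on x86):

```python
def quiet_bit(bits32):
    """Return True if the float32 bit pattern is a quiet NaN."""
    exponent_all_ones = (bits32 >> 23) & 0xFF == 0xFF
    mantissa = bits32 & 0x7FFFFF
    assert exponent_all_ones and mantissa != 0, "not a NaN pattern"
    return bool(bits32 & 0x400000)  # mantissa MSB = quiet bit

QNAN = 0x7FC00000  # quiet NaN (the usual default result, e.g. from 0/0)
SNAN = 0x7FA00000  # signaling NaN: quiet bit clear, mantissa nonzero

print(quiet_bit(QNAN))  # True
print(quiet_bit(SNAN))  # False
```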

------
nix0n
Content-free article, full of "read the next post"

~~~
JayOtter
I imagine a lot of the specific detail has been omitted to avoid this exploit
being used for malicious purposes.

EDIT: Followup post is here: [https://medium.com/@afd_icl/first-stop-amd-
bluescreen-via-we...](https://medium.com/@afd_icl/first-stop-amd-bluescreen-
via-webgl-and-more-ba3eaf76c5fb#.on6gjmlrz)

~~~
nix0n
Much better, thanks for posting

------
mastax
In a perfect world the hardware manufacturers would fuzz their own drivers
before release, rather than waiting for someone else to do it and then
ignoring their bug reports.
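The article's technique (as the title hints) is metamorphic: apply a transformation that provably cannot change the image, such as adding zero, then flag any driver whose output does change. A toy sketch of such a mutator, assuming GLSL source held as a plain string (the regex is a deliberate simplification, not the authors' actual tooling):

```python
import re

def add_zero(shader_src):
    """Wrap every float literal in '(x + 0.0)'.

    The shader's semantics are unchanged, so a correct compiler must
    produce the same image; a changed image, crash, or hang is a bug.
    """
    return re.sub(r"\b(\d+\.\d+)\b", r"(\1 + 0.0)", shader_src)

frag = "void main() { gl_FragColor = vec4(0.5, 0.25, 0.0, 1.0); }"
print(add_zero(frag))
```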

------
kevin_b_er
Wasn't this type of thing a concern back in 2011?

Microsoft talked about the dangers of this back then.
[https://blogs.technet.microsoft.com/srd/2011/06/16/webgl-
con...](https://blogs.technet.microsoft.com/srd/2011/06/16/webgl-considered-
harmful/)

------
Bartweiss
Now I'm wondering if it's possible to create pathological images that _should_
render vastly differently in the face of small changes. Specifically, anything
which responds so nonlinearly to alteration that the small floating point
variance mentioned could produce drastically new results.
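One candidate for such a pathological input is any chaotic computation, where a one-ULP difference grows exponentially. A sketch using the logistic map at r = 4 (a stand-in here for an iterated shader expression, not something from the article):

```python
def iterate(x, steps, r=4.0):
    # Logistic map: x -> r * x * (1 - x). At r = 4 it is chaotic, so
    # nearby trajectories separate by roughly a factor of 2 per step.
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = iterate(0.3, 60)
b = iterate(0.3 + 1e-12, 60)  # a perturbation far below shader precision
print(abs(a - b))  # typically a large gap; the exact value is, by construction, sensitive
```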

~~~
xsmasher
That sounds similar in concept to the CSS Acid tests.

[http://origin.arstechnica.com/staff.media/acidtrip.gif](http://origin.arstechnica.com/staff.media/acidtrip.gif)

------
Kenji
That reminds me of the time I was developing shaders for OpenGL ES on
Android: since there were barely any shader debugging tools on that platform,
virtually the only option was debugging 'by eye'.
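The standard trick in that situation is to write the value you want to inspect into the fragment color and judge it visually. A sketch of the mapping as a hypothetical helper (mirroring a GLSL clamp into the red channel; not from the comment above):

```python
def debug_color(value, lo=0.0, hi=1.0):
    """Encode a scalar in the red channel, the way a debug shader might:
    gl_FragColor = vec4(clamp((v - lo) / (hi - lo), 0.0, 1.0), 0.0, 0.0, 1.0);
    """
    t = (value - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)  # clamp into [0, 1] for display
    return (t, 0.0, 0.0, 1.0)

print(debug_color(0.25))  # -> (0.25, 0.0, 0.0, 1.0)
```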

