
You Compiled This, Driver – Trust Me - ingve
http://www.joshbarczak.com/blog/?p=1028
======
mattst88
The article is very impressive, and the author is an experienced graphics
developer, which makes me wonder: why not do these experiments using the free
software graphics drivers in Mesa?

We (the team that writes the Intel driver in Mesa) would love to be involved
in things like that and would be happy to answer questions where possible --
#intel-gfx on Freenode or mesa-dev@lists.freedesktop.org.

The documentation doesn't contain much, if anything, in the way of performance
data like instruction latency, so we've implemented methods similar to what
the author used to measure the number of cycles a piece of code takes
(the INTEL_DEBUG=shader_time environment variable). We'd likely even learn
something about the hardware ourselves by being involved in experiments like
these.
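For anyone curious, using that debug flag is just a matter of setting the environment variable before launching a GL application. A minimal sketch (this assumes Mesa's Intel driver on Linux; "./my_gl_app" is a placeholder for your own OpenGL application, not something from the thread):

```shell
# Enable Mesa's per-shader cycle reporting for the Intel driver.
export INTEL_DEBUG=shader_time
# ./my_gl_app   # uncomment on a machine with an Intel GPU and Mesa drivers
echo "cycle reporting enabled: INTEL_DEBUG=$INTEL_DEBUG"
```

The driver then prints per-shader timing data on exit, which is roughly what the article reconstructed by hand on Windows.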

------
pjc50
_If even one game does this, the IHVs and OSVs will freak out, remove pre-
compiled binaries from future drivers, and never let us have them again, and
you’ll be the one responsible for ruining all the fun. You don’t think they’ll
be able to remove it after you ship it? There are ways..._

Future of computing right here.

~~~
nhaehnle
Most importantly, the whole point of having a driver do compilation for you is
compatibility. AMD's GCN, for example, currently has three different
generations out, and there are small differences in the set of supported
instructions _and in the instruction encoding_.

Not having to deal with legacy stuff in instruction decode is one of the big
advantages GPUs have over CPUs, and it relies on having the driver do the
compilation.

All that said, it's nice that people are having fun with AMD's and Intel's GPU
ISA, but honestly, why on Earth do they do it on Windows? There are perfectly
fine open source drivers available on Linux that you can easily modify to play
around with the ISAs as much as you wish.

~~~
robbies
In my experience, the vast majority of realtime rendering engineers work on
Windows (because of the products). The landscape is starting to change, but
I'd still spitball 99% Windows ownership. Even for console development, the
platform is Windows.

------
emmab
Down
[http://webcache.googleusercontent.com/search?q=cache:R2PmRfU...](http://webcache.googleusercontent.com/search?q=cache:R2PmRfUY2T4J:www.joshbarczak.com/blog/%3Fp%3D1028+)

------
0xcde4c3db
Does anyone know whether this functionality is expected in Vulkan? Will it be
covered by uploading SPIR-V binaries instead of native code? My understanding
is that the main motivation for glProgramBinary() is to bypass a bunch of
higher-level compiler passes, and getting back native code is just a result of
vendors doing The Simplest Thing That Could Possibly Work rather than a
deliberate feature. Is that right, or is there more to it that I'm missing?

~~~
nhaehnle
You can get the main motivation for glProgramBinary right from the horse's
mouth (i.e.
[https://www.opengl.org/registry/specs/ARB/get_program_binary...](https://www.opengl.org/registry/specs/ARB/get_program_binary.txt)
):

 _This is a very useful path for applications that wish to remain portable by
shipping pure GLSL source shaders, yet would like to avoid the cost of
compiling their shaders at runtime._

~~~
Qantourisc
I see this a lot with the NVidia driver on Linux games. Games were slow to
load; NVidia's fix -> a compiled shader cache. The blobs are stored in ~/.nv
(ftp://download.nvidia.com/XFree86/FreeBSD-x86/295.71/README/openglenvvariables.html)
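The cache is controlled through environment variables documented in that README; a rough sketch of inspecting and configuring it (variable names per that documentation, though behavior may vary across driver versions):

```shell
# NVidia on-disk shader cache controls (per the linked README).
export __GL_SHADER_DISK_CACHE=1                 # 1 enables the cache
export __GL_SHADER_DISK_CACHE_PATH="$HOME/.nv"  # where blobs are stored
# Peek at what the driver has cached so far, if anything:
ls "$HOME/.nv" 2>/dev/null || echo "no cache directory yet"
```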

~~~
robbies
This is relatively old NV functionality. However, there were some complaints
about this folder ballooning after installing a couple of driver updates, each
of which caused brand new shader blobs to be built. Not a huge issue, but it
spurred the extension's adoption, letting games manage the process themselves.

------
stevebmark
A blog post about shaders without any screenshots of shaders :(

~~~
nitrogen
It seems less about shaders, and more about the hardware that runs them.

