
Linux Graphics Demystified (2014) [pdf] - gshrikant
https://keyj.emphy.de/files/linuxgraphics_en.pdf
======
iso-8859-1
Keith Packard is a big contributor to Linux graphics; a recent blog post of
his notes:

> So, you've got a fine head-mounted display and want to explore the delights
> of virtual reality. Right now, on Linux, that means getting the window
> system to cooperate because the window system is the DRM master and holds
> sole access to all display resources. So, you plug in your device, play with
> RandR to get it displaying bits from the window system and then carefully
> configure your VR application to use the whole monitor area and hope that
> the desktop will actually grant you the boon of page flipping so that you
> will get reasonable performance and maybe not even experience tearing.
> Results so far have been mixed, and depend on a lot of pieces working in
> ways that aren't exactly how they were designed to work.

His blog has some recent updates. He is consulting for Valve, working on VR.

[https://keithp.com/blogs/DRM-lease/](https://keithp.com/blogs/DRM-lease/)
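The fix described in that post is DRM leases: the compositor, as DRM master, hands a subset of its display resources (the HMD's connector plus a CRTC) directly to the VR process. A rough sketch of the libdrm side, assuming the `drmModeCreateLease()` API that came out of this work; `hmd_connector_id`, `crtc_id`, and `master_fd` are placeholders:

```
/* objects the lessee is allowed to use */
uint32_t objects[] = { hmd_connector_id, crtc_id };
uint32_t lessee_id;

/* returns a new DRM fd restricted to exactly those objects; hand it
 * to the VR process, which can then page-flip on the HMD directly,
 * without the window system in the loop */
int lease_fd = drmModeCreateLease(master_fd, objects, 2, 0, &lessee_id);
```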

~~~
digi_owl
Best I can tell, he is the only one willing to touch Xorg internals. The rest
just flit around the edges, delete code, and hack on Wayland because it's
shinier.

Xorg will live and die with Packard, sadly.

~~~
mattst88
This is an extremely simplistic view that misunderstands the situation.

For one, the people working on Wayland made enormous contributions to the X
Server long before Wayland existed. The creator of Wayland, Kristian Høgsberg,
did AIGLX, just to name one example.

------
ChuckMcM
There are few things I hate with more passion than I do the Linux "Graphics
Stack." I realize that it inherited a toxic culture from UNIX (remember the
X/Motif/xNews/Suntools/etc. wars? I do.), and even today there is active
warfare in the stack: Wayland vs. Xorg, GTK vs. Qt, GNOME vs. KDE vs.
*desktop, HW vendors vs. FOSS, OpenGL vs. OpenGL, etc.

<rant> A great example of this is how broken the simplest of things can be.
I've got a machine sitting next to me which refuses to boot into a "good"
graphics configuration: it boots with a 'safe' default of a 1024 x 800 screen
(where the actual screen is a 2K screen) even though it is connected over
HDMI (which can tell it everything it needs to know). It uses a widely
deployed nVidia card and the current non-free, non-FOSS nVidia graphics
stack. Re-install the driver package and it works (until reboot). Yes, there
is a knob somewhere that screws it up, but there are a billion knobs from a
dozen layers: is it DRM/KMS? Is it LightDM? Is it Nvidia? Is it OpenGL? Is it
a friggin' boot option buried in the boot args? It could be anywhere, and
it's going to take me a few hours to figure out where it is, which in terms
of wasted salary time is enough for me to buy a Mac Pro or a Windows box that
works every time, all the time. </rant>
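For reference, the "HDMI can tell it everything it needs to know" part is EDID, the 128-byte descriptor block the monitor reports over the display cable (the kernel exposes the raw bytes under `/sys/class/drm/*/edid`). A minimal sketch of the two sanity checks any EDID consumer performs first: the fixed 8-byte header magic, and the checksum that makes all 128 bytes sum to zero mod 256.

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* An EDID base block is exactly 128 bytes. It starts with a fixed
 * 8-byte magic header, and all 128 bytes must sum to 0 mod 256
 * (byte 127 is a checksum chosen to make that true). */
static const uint8_t EDID_MAGIC[8] =
    { 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00 };

int edid_header_ok(const uint8_t blk[128])
{
    return memcmp(blk, EDID_MAGIC, sizeof EDID_MAGIC) == 0;
}

int edid_checksum_ok(const uint8_t blk[128])
{
    uint8_t sum = 0;
    for (size_t i = 0; i < 128; i++)
        sum += blk[i];
    return sum == 0;
}
```

If either check fails, the driver falls back to a safe default mode, which is one way to end up at 1024x768-class resolutions on a 2K panel.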

~~~
ploxiln
It's the nvidia card. The binary drivers are good for 3D acceleration, but bad
for system integration. The open source drivers are not good enough.

Intel and AMD graphics have much better open source drivers (but are not as
good at 3D acceleration).

~~~
vetinari
The current AMDGPU driver is quite good, actually. Better at 3D accel than the
proprietary one.

------
wodny
There is another great resource mainly about X11 called Xplain[1] with its
accompanying repository[2] with some hidden (not yet ready I suppose)
chapters. For example the override-redirect description[3].

[1]:
[https://magcius.github.io/xplain/article/](https://magcius.github.io/xplain/article/)

[2]: [https://github.com/magcius/xplain](https://github.com/magcius/xplain)

[3]:
[https://magcius.github.io/xplain/article/menu.html](https://magcius.github.io/xplain/article/menu.html)

~~~
digi_owl
Sadly it feels mostly like a PR piece for Wayland.

~~~
gue5t
All understanding of X11 feels this way. Read the docs and source yourself
sometime.

------
datenwolf
I don't like these slides. It's not that they're terribly wrong. But they
gloss over some of the really important aspects. And they make actually rather
simple problems appear harder than they are.

Take for example slide 29. This slide suggests that off-screen redirection of
OpenGL applications (as required for composition) is something special that
needs to be treated differently from non-OpenGL graphics. This is simply not
true. If OpenGL is used in a _window system integrated_ context (WSI is a
rather new term that has only recently been properly defined, but the
principle has been the same since the beginning; what's new is that since
OpenGL 3 you can use a GL context _without_ WSI), the window framebuffer is
_not_ managed by the OpenGL implementation but by whatever windowing system
is used (e.g. X11, Win32 GDI, etc.), and the OpenGL implementation just
borrows it. And the same mechanism (and, incidentally, the same code paths)
that allows a WSI drawable to be used as a rendering destination for OpenGL
also allows the flow of data to be turned around, using a WSI drawable as a
source for texture access. Somewhere at the bottom it's all just pointers to
regions of graphics memory, after all.

It's a pretty simple process, actually, and the only complicated things are
the weirdly convoluted APIs that have grown around it to expose something
that has always been there but had been hidden from applications before. But
consider how quickly AIGLX was hacked together after Xgl showed up: IIRC,
there were just a couple of months between them.

That's the main insight that led to the Wayland project: do away with the API
cruft and expose the one thing that has been possible all along anyway.

Oh, and it should maybe also be pointed out that GLX_EXT_texture_from_pixmap
is useful for much more than just composition, and can be used without
Composite redirection.
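Concretely, the redirect-then-texture path described above is just a short call sequence (a fragment sketch only; FBConfig selection with GLX_BIND_TO_TEXTURE_RGBA_EXT, attribute lists, and error handling are omitted, and `dpy`, `win`, `fbconfig`, `pixmap_attribs`, and `tex` are placeholders):

```
/* Composite redirects the window's output into an off-screen pixmap */
Pixmap pix = XCompositeNameWindowPixmap(dpy, win);

/* wrap that pixmap in a GLX drawable */
GLXPixmap glxpix = glXCreatePixmap(dpy, fbconfig, pix, pixmap_attribs);

/* borrow it as the storage of a texture and draw with it as usual */
glBindTexture(GL_TEXTURE_2D, tex);
glXBindTexImageEXT(dpy, glxpix, GLX_FRONT_LEFT_EXT, NULL);
/* ... render the desktop using tex ... */
glXReleaseTexImageEXT(dpy, glxpix, GLX_FRONT_LEFT_EXT);
```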

And then there's slide 13, which is simply wrong in stating that there was a
time when "Indirect Rendering (…) didn't allow for hardware acceleration."
That's not what indirect rendering implies. It just implies that there is no
fast path between the application process and the graphics hardware, which
slowed down data transfers. But display lists back then were a staple of
OpenGL, and they did offer (and, for the legacy code that uses them, still do
offer) excellent performance; it actually took some time for the Buffer
Object based vertex array drawing code paths in OpenGL drivers to catch up
with display list performance. And one could use display lists over indirect
GLX just fine (and actually the ARB_vertex_buffer_object extension defines
GLX opcodes, so you can even have buffer objects over indirect contexts,
too).
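The display-list point is easy to see in code: the geometry crosses the wire once when the list is compiled, so replaying it over an indirect connection is a single tiny request per frame (legacy-GL fragment, for illustration only):

```
GLuint list = glGenLists(1);

/* over indirect GLX the vertex data is shipped to the server once, here */
glNewList(list, GL_COMPILE);
    glBegin(GL_TRIANGLES);
    /* ... glVertex3f() calls ... */
    glEnd();
glEndList();

/* each frame afterwards is just "replay list N" on the server side */
glCallList(list);
```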

~~~
grover_hartmann
Hi, are you the guy in this video?

[https://www.youtube.com/watch?v=ZTdUmlGxVo0](https://www.youtube.com/watch?v=ZTdUmlGxVo0)

~~~
digi_owl
And the guy jumping the stage is Lennart Poettering, of Pulseaudio and Systemd
"fame"...

------
shmerl
Interesting, but as expected somewhat outdated, e.g. missing amdgpu/radeonsi,
Vulkan, etc.

~~~
TD-Linux
Luckily, those mostly slot in at clean locations in the presentation: amdgpu
replaces radeon, vulkan replaces the bottom half of mesa.

~~~
wodny
And there are quite a few schematics on Wikipedia:

[https://en.wikipedia.org/wiki/Direct_Rendering_Manager](https://en.wikipedia.org/wiki/Direct_Rendering_Manager)

It's a MESA of technologies :)

------
bonzini
The arrows in slide 45 look like they are in the wrong direction?

------
kdtop
Outstanding article

------
theparanoid
Title should be "Linux Graphics Demystified (2014)"

------
andrewstuart
Are there any/many notable Linux applications that are graphics intensive?

~~~
iso-8859-1
Many. All the games from
[https://www.feralinteractive.com](https://www.feralinteractive.com) .

Blender can be extremely demanding. Cryptocurrency mining is also very hard on
the GPU.

~~~
Rjevski
Crypto mining is not "graphics" though and bypasses X11 (it talks to the GPU
via CUDA or OpenCL). You can happily mine without ever having an X server
running.

~~~
iso-8859-1
What defines graphics? There is linear algebra in all sorts of things. I can
use OpenGL without rendering to screen. Is it still graphics to you?

~~~
Rjevski
What I meant is that mining has a totally different way of using the GPU than
graphics. For OpenGL you have to have X11 running and the correct libraries
installed. Mining bypasses all that and uses OpenCL or CUDA instead.

