
Hmm... this is a very basic overview. For example, it doesn't explain what Gallium3D actually is, or how the proprietary drivers (nvidia/fglrx) fit into the picture.

Does anyone here know more details?

nvidia and fglrx ship a string of binary blobs that plug into each spot on the stack: kernel module, X server, GLX. How those blobs communicate with each other is proprietary and beyond what I know.

Gallium is an interface for doing 3D work. It describes a very general, shaderful, batched renderer. "Shaderful" means the renderer uses shaders instead of old-school fixed-function TCL and pixel combiners; "batched" means the renderer prefers to operate on pre-cooked buffers full of data rather than taking vertices one at a time.

Gallium matters because it is a much more direct, low-level interface for describing hardware, which lets hardware-specific optimizations be decoupled from API-level logic. We have implemented GL, D3D (not open-source!), VG, X11 (EXA, Xv, XvMC, VDPAU), etc. on top of Gallium, without having to write entirely new backends each time. It makes drivers easier to write. Gallium's interface is also less verbose than the user-level APIs, so the actual drivers are 30-50% smaller in lines of code.

