
A funny note: during the Longhorn days (PDC 2003), some MS devs indicated that the color pipeline for everything in Windows was moving to a 128-bit pixel struct, some floating-point format, I think. Of course that never shipped, but it sounds cool.

I think Carmack mentioned that in some extreme-but-possible cases, 64-bit isn't enough to represent the results correctly.




I have to assume that's 128-bit precision in 3D rendering -- partly because it's Carmack, but largely because 64-bit color would be... well beyond human perceptual limits.

This StackExchange post claims ~2.4 million colors as the limit of human color vision, citing CIE color science: http://photo.stackexchange.com/questions/10208/how-many-colo.... 64-bit color in total would produce ~2e19 distinct values; 64 bits per channel would produce an even more absurd ~6e57. (Beyond the ridiculous numbers, I'd guess 64-bit is meant per channel, since 64 isn't evenly divisible by 3 channels.)
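
To put rough numbers on that, here's a back-of-the-envelope check (my own sketch, not from the SE post; the only figures taken from it are the ~2.4M and ~100M claims):

    /* Count of representable values for a few candidate bit depths, compared
       against the distinguishable-color figures quoted in the SE post.
       This is just 2^bits, ignoring gamut and encoding details entirely. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        int bits[] = {24, 64, 192};   /* 24-bit total, 64-bit total, 64 bits x 3 channels */
        for (int i = 0; i < 3; i++)
            printf("%3d bits -> ~%.1e representable values\n", bits[i], pow(2.0, bits[i]));
        printf("claimed human limit: ~2.4e6 colors (~1e8 including luminance)\n");
        return 0;
    }

That prints roughly 1.7e7, 1.8e19, and 6.3e57, i.e. even 24-bit total already exceeds the ~2.4M figure.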

And yes, higher bit precision is really important for drawing 3D graphics. Look at the price premium you pay for Nvidia's Quadro cards, which can process at higher bit precision than stock consumer cards... they're typically 5-10x the price of a consumer card: http://www.nvidia.com/object/quadro.html

Mostly advertised for high end CAD or VFX, where bit precision errors can result in odd display artifacts or incorrect rendering.


I think the context was that after doing multiple layers of texture mapping on a 3D object, a 128-bit color space wasn't enough to be fully accurate. I'll see if I can find what I'm most likely misremembering.

Thanks for the SE link. 2.4M can't be right, or we wouldn't be able to see banding all over the place, right? The link says that including luminosity it might be up to 100M. So 24-bit wouldn't cut it, but 30-bit would?
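
For what it's worth, the arithmetic does land that way (my own quick check, assuming the ~100M figure is the relevant threshold):

    /* 24 bits -> ~16.8M values, below a ~100M distinguishable-color threshold;
       30 bits -> ~1.07B values, comfortably above it. */
    #include <stdio.h>

    int main(void) {
        printf("2^24 = %lu\n", 1UL << 24);   /* 16,777,216    */
        printf("2^30 = %lu\n", 1UL << 30);   /* 1,073,741,824 */
        return 0;
    }

Whether a given panel actually resolves those steps cleanly is a separate question, of course.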


Like all things perceptual, an exact answer is maddeningly difficult to pin down (individual-to-individual variation, viewing conditions like ambient light, etc.).

Even at 10-bit you could in theory see banding if your color discrimination were good enough, I think. It depends on the material... but the 10-bit displays I've seen look awfully, awfully good. (And the ability to represent more dynamic range starts to matter -- you need a wide brightness scale to show off the bit depth, too. A monitor I saw at NAB this year has two backlights and can put out a lot more light... campfires glowed, headlights looked genuinely bright, very real... that's the real direction for displays, I think.)


Probably four 32-bit floats per pixel for ARGB; that's what most GPUs have used internally for shader computations since DirectX 10.
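
Roughly this layout, I'd guess (a minimal sketch of my own; the struct name and channel order are illustrative, though the overall bit layout matches DXGI's R32G32B32A32_FLOAT format):

    /* Hypothetical 128-bit floating-point pixel: four 32-bit IEEE floats.
       Name and channel order are made up for illustration; D3D10-class
       hardware exposes this layout as a 32-bit-float-per-channel format. */
    #include <assert.h>

    typedef struct {
        float a, r, g, b;   /* linear-light values; can exceed 1.0 for HDR */
    } Pixel128f;

    static_assert(sizeof(Pixel128f) == 16, "expected 128 bits per pixel");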



