
I've got a wide-gamut monitor, and I vehemently disagree. Banding issues are real, but in my experience they're minor inconveniences compared to the huge step forward in realism that wide gamut brings.

Part of it is probably the kind of content you care about: "artificial-looking" UI, such as typical desktop apps, cares least (imho) about gamut. But if those apps don't use subtle gradients, you won't notice banding in them either. For photographic content, on the other hand, you really notice those new colors you just can't represent in sRGB. I would really regret having to give that up, and would tolerate even visibly annoying banding every day if it meant having pictures that actually look like real life, and not like muddy, lifeless copies.

But to be clear: I very rarely notice any banding whatsoever, and when I do it certainly seems to be due to the source, not the processing. Perhaps it's an OS/driver thing, or perhaps it's because most pictures contain enough noise to act as a poor man's dithering, but banding just doesn't seem to be a meaningful issue. I mean: it's visible if you have large-area subtle gradients, but not to a huge degree, and well... don't do that?

Most things I see are either entirely flat (no banding) or photographic (noisy enough that you won't easily see banding).

I agree that a deeper color space is about time, although I suspect that going to 16-bit floating point is an overreaction for most scenarios. Floating point isn't free, and neither are all those extra bits. With a decently high gamma, you can probably get away with just 10 bits per channel, which would conveniently keep a pixel within 32 bits for efficient packed processing. And in the odd case where you really want to spend more than 10 bits on color detail, even for HDR you don't need floating point - due to gamma correction, 2 extra bits means 20-50 times more light - more than enough for any kind of HDR that's likely to be displayable any time soon (and really, even if we could make peaks of 30,000 nits - does that sound like something you want to look at?)
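To put numbers on that "20-50 times" claim: under a power-law (gamma) encoding with exponent g, extending the code range by a factor of 2^n multiplies the representable peak light by (2^n)^g. A quick sketch (the function name is mine, not from the thread):

```python
# How much extra peak luminance n extra code bits buy under a
# power-law encoding with exponent gamma: light scales as code^gamma,
# so doubling the code range per bit scales light by 2^gamma per bit.
def extra_light(extra_bits, gamma):
    return (2 ** extra_bits) ** gamma

for g in (2.2, 2.8):
    print(f"gamma {g}: 2 extra bits = {extra_light(2, g):.1f}x the peak light")
```

For common display gammas of 2.2 to 2.8 this gives roughly 21x to 48x, which is where the 20-50x range comes from.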

10-bit with a log or gamma encoding is widespread in film and video work, and I've never seen a banding problem, even with purely generated gradients. The Rec. 2020 UHD standard does recommend 12-bit gamma encoding to deal with its ludicrously wide gamut.

For HDR, a PQ (perceptual quantisation) encoding curve is already standardised by SMPTE as ST 2084 - page 8 in these slides goes through how it was worked out, right from human visual system basics: https://www.smpte.org/sites/default/files/2014-05-06-EOTF-Mi...
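For reference, the PQ EOTF (code value to absolute luminance) is compact enough to sketch directly from the constants published in SMPTE ST 2084:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as defined in the standard.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code):
    """Decode a PQ-encoded value in [0, 1] to absolute luminance in nits."""
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))  # 10000.0 nits: the full PQ peak
print(pq_eotf(0.5))  # ~92 nits: half the code range covers only dim levels
```

The interesting property is visible in that last line: the curve spends most of its code values on dark and mid tones, matching human contrast sensitivity, which is exactly why 10-12 bits of PQ can cover a 0-10,000 nit range without visible banding.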

Given the abundance of log and gamma encodings for display imagery, you might wonder why true linear 16-bit float is so common in CG production - why not log-encode those 16 bits and get much smoother gradients? Maybe the answer is that during production those linear files often also encode non-image data like vertex positions and normals, and perceptually "good" quantisation of those could lead to unexpected precision problems...
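One wrinkle worth noting: binary floating point is already roughly logarithmic on its own, since the gap between adjacent half floats is a fixed fraction (2^-10, from the 10 explicit mantissa bits) of the value's magnitude. A tiny sketch of that spacing (helper name is mine), assuming values in float16's normalized range:

```python
import math

def f16_spacing(x):
    """Gap between adjacent IEEE 754 half-precision values near x
    (normalized range only): 2^(e-10) where e = floor(log2(x))."""
    return 2.0 ** (math.floor(math.log2(x)) - 10)

print(f16_spacing(1.0))     # ~0.001: plenty fine for linear light near white
print(f16_spacing(4096.0))  # 4.0: coarse for large vertex coordinates
```

So linear float16 already behaves somewhat like a log encoding for light, while a deliberate perceptual remapping of the same bits could make errors in large-magnitude geometry data much worse.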
