> Originally it's just about being able to show both really dark and really bright colors.
Sort of. The primary significance of HDR is the ability to specify absolute luminance.
Good quality SDR displays were already very bright and already had high native dynamic range (a large ratio between the brightest white and darkest black). The issue was that the media specifications did not control brightness in any way. So a 1.0 luminance pixel was simply the brightest value the display could show (usually governed by a brightness setting), and a 0.0 luminance pixel was the minimum brightness the display could show (unfortunately, usually also affected by the brightness setting, thanks to backlighting).
What HDR fundamentally changes is not the brightness of displays or their dynamic range (some HDR displays have worse dynamic range than some older SDR displays), but the fact that HDR media specifies absolute luminance. This means creators can now push highlights (stars, explosions) close to the peak brightness of a bright display while diffuse whites stay dimmer.
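To make that concrete, here is a rough sketch of the difference; the numbers and the clip-to-peak behaviour are assumptions for illustration, not how any real display pipeline handles out-of-range values:

    # Rough illustration only: the numbers and the clip-to-peak behaviour are
    # assumptions for the example, not what any real display pipeline does.

    def sdr_pixel_nits(code_value, display_peak_nits):
        # SDR: the media only says "this pixel is X% of maximum"; the actual
        # light output depends entirely on the display and its brightness setting.
        return code_value * display_peak_nits

    def hdr_pixel_nits(target_nits, display_peak_nits):
        # HDR: the media says "this pixel should be target_nits"; a display that
        # can reach it shows it as-is, one that can't has to tone-map it down
        # (crudely modelled here as a clip to its peak).
        return min(target_nits, display_peak_nits)

    # The same SDR value lands at very different brightnesses on different displays:
    print(sdr_pixel_nits(1.0, 200))   # 200.0 nits on a dim laptop panel
    print(sdr_pixel_nits(1.0, 600))   # 600.0 nits on a bright desktop monitor

    # An HDR highlight asks for an absolute brightness regardless of the display:
    print(hdr_pixel_nits(900, 1000))  # 900 nits, reproduced as intended
    print(hdr_pixel_nits(900, 600))   # 600 nits, clipped on a dimmer panel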
Prior to HDR, a "good bright display" was really just a calibrated display with its brightness incorrectly turned up too high, making everything uniformly brighter. With HDR, a good bright display is one calibrated to the correct brightness, with the ability to show highlights and saturated colors at much more power than a normal diffuse white.
You're right about higher bit depths, though. Because HDR media describes a much wider dynamic range on a properly calibrated display (though not necessarily compared to a typical over-bright SDR display), it uses a different transfer curve (the Perceptual Quantizer) to allocate bits more appropriately, and typically shows no banding with only 10 or 12 bits. https://en.wikipedia.org/wiki/Perceptual_quantizer
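To get a feel for how the PQ curve spends those bits, here is a small sketch of the ST 2084 encode/decode using the constants from that article; it is just the curve itself, ignoring everything else an HDR pipeline does (metadata, primaries, tone mapping):

    # ST 2084 (PQ) constants, as given in the linked article.
    M1 = 2610 / 16384          # 0.1593017578125
    M2 = 2523 / 4096 * 128     # 78.84375
    C1 = 3424 / 4096           # 0.8359375
    C2 = 2413 / 4096 * 32      # 18.8515625
    C3 = 2392 / 4096 * 32      # 18.6875

    def pq_encode(nits):
        # Absolute luminance (0..10000 nits) -> PQ signal value in [0, 1].
        y = max(nits, 0.0) / 10000.0
        return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

    def pq_decode(signal):
        # PQ signal value in [0, 1] -> absolute luminance in nits.
        e = signal ** (1 / M2)
        return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

    # Most of the 10-bit code range goes to everyday brightness levels, which is
    # why 10 or 12 bits is typically enough to avoid visible banding over the
    # whole 0-10,000 nit range.
    for nits in (0.1, 1, 10, 100, 1000, 10000):
        print(f"{nits:>7} nits -> 10-bit code {round(pq_encode(nits) * 1023)}")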
Thanks for this; the part about “absolute luminance” is pretty helpful!
So what you’re saying is that with HDR, whoever creates the video adds absolute brightness levels for some (or all?) pixels, and an HDR-capable screen would then be a screen with an API like ‘draw_pixel(x,y,color,absolute_luminance_in_nits)’, whereas an SDR screen would instead have ‘draw_pixel(x,y,color,relative_luminance)’ with relative_luminance in the range [0,1]?
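Roughly like this, as a totally made-up sketch (I realize no real screen exposes per-pixel calls like this; it is just to pin down the contrast I mean):

    # Hypothetical interfaces for illustration only; not a real display API.

    def sdr_draw_pixel(x, y, color, relative_luminance):
        # relative_luminance in [0, 1]: 1.0 means "as bright as the display is
        # currently set to go", whatever that happens to be in nits.
        ...

    def hdr_draw_pixel(x, y, color, absolute_luminance_in_nits):
        # An absolute target, e.g. 900 means "show this at 900 nits"
        # (if the display can actually reach that).
        ...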
I can imagine how the monitor brightness control would work in SDR, but what about HDR? If I lower the screen brightness, and the screen is asked to draw a pixel at 900 nits, what happens?
For a real use case, suppose I want to watch an HDR movie on my MacBook - should I switch the display mode to “HDR Video (P3-ST 2084)”, which is one of the presets that locks the display to a specific brightness?