
So I’m not convinced this is actually “billions of colours”. Technically, it means having a 10-bit colour encoding over the wire, such that you can express over a billion colours. The distinction between 8-bit and 10-bit isn’t sRGB vs HDR; it’s precision, in the same way that dithering a GIF down to a maximum of 256 colours still lets you display the sRGB colour space but limits you to only 256 colours of it. https://helpx.adobe.com/photoshop-elements/using/dithering-w...

Similarly, you can turn on that little High Dynamic Range checkbox and get HDR but only have 16.7 million colours at your disposal, because the output is 8 bits per colour rather than 10.
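
To spell out where “millions” vs “billions” comes from, it’s just 2^bits per channel, cubed. A quick Python sketch of the arithmetic (nothing display-specific, just the counting):

    # Per-channel bit depth determines how many distinct colours are encodable.
    for bits in (8, 10):
        levels = 2 ** bits          # levels per channel (R, G, B)
        colours = levels ** 3       # total encodable colours
        print(f"{bits}-bit: {levels} levels/channel, {colours:,} colours")

    # 8-bit:  256 levels/channel,  16,777,216 colours ("millions")
    # 10-bit: 1024 levels/channel, 1,073,741,824 colours ("billions")

None of that says anything about gamut or brightness, which is the whole point: bit depth and HDR are separate axes.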

And it’s really hard to tell the difference sometimes between 8-bit HDR and 10-bit HDR. Like really hard. Like usually only visible when doing colour grading, where you need every possible nuance of data to accurately shade and re-colour your pixels. https://youtu.be/MyaGXdnlD6M

Of course I imagine there’s also good vs bad dithering, and the output to the attached laptop screen is probably better than the multiple cables and adapters required to output to TVs and external displays, but... the easiest way to tell whether something supports billions of colours is to go into monitor preferences and look for 10-bit or 422 or 444. If you see 420 or 8-bit, you might technically still have HDR, but you don’t have “billions of colours”.
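
The 422/444/420 numbers are chroma subsampling, and they’re usually what gives the game away, because link bandwidth is what forces the fallback. A rough Python sketch of the uncompressed data rates at 4K60 (it ignores blanking and link-encoding overhead, so treat the figures as ballpark only):

    # Rough uncompressed video data rate, ignoring blanking intervals and
    # link-encoding overhead (real requirements are higher).
    def gbps(width, height, fps, bits_per_sample, samples_per_pixel):
        return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

    configs = {
        "8-bit 4:2:0":  (8, 1.5),   # chroma at quarter resolution
        "8-bit 4:4:4":  (8, 3.0),
        "10-bit 4:2:0": (10, 1.5),
        "10-bit 4:4:4": (10, 3.0),
    }
    for name, (bits, samples) in configs.items():
        print(f"{name}: ~{gbps(3840, 2160, 60, bits, samples):.1f} Gbit/s")

10-bit 4:4:4 at 4K60 lands around 15 Gbit/s before overhead, which (as far as I know) is more than an HDMI 2.0 link can actually carry once that overhead is added, hence cables and adapters quietly dropping to 8-bit or 4:2:0.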




Author here: I hear you. I had to double-check a bunch of times whether you can get HDR with 8-bit displays. It seems like not all TV panels out there are 10-bit, and they still claim to support HDR.

That's why I played the Spears and Munsil test pattern video, which has an 8-bit and a 10-bit pattern in the same video. The 10-bit pattern was smooth, which convinced me it was outputting a 10-bit signal. I also confirmed the TV and monitor I used have 10-bit panels (not 8-bit + FRC).
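
For what it's worth, you can also roll a crude version of that banding test yourself: a Python/Pillow sketch that writes the same grey ramp as a 16-bit and an 8-bit PNG. Big caveat: whether the image viewer and OS actually hand more than 8 bits per channel to the display depends on the software, so the Spears and Munsil disc is still the more trustworthy check.

    import numpy as np
    from PIL import Image

    # Horizontal grey ramp. On a genuinely >8-bit pipeline the 16-bit file
    # should look smoother than the 8-bit one, which bands into ~256 steps.
    width, height = 3840, 400
    ramp = np.linspace(0, 65535, width).astype(np.uint16)
    img16 = np.tile(ramp, (height, 1))
    Image.fromarray(img16, mode="I;16").save("ramp16.png")

    # Same ramp quantised to 8 bits, for a side-by-side comparison.
    img8 = (img16 >> 8).astype(np.uint8)
    Image.fromarray(img8, mode="L").save("ramp8.png")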

> monitor preferences and look for 10-bit or 422 or 444. If you see 420 or 8-bit

I tried the monitor info but didn't find this information, nor was it in the TV info. Apple also hides this information in their System Report.

If you have other tests in mind, I'm happy to test more and get to the bottom of this :)


If you have an LG OLED TV you can press 11111 in the Channels > Channel Tuner menu (I think? Google this if it doesn’t work) and it will show you a screen of input stats. If you use the arrow keys and OK button you can open an HDMI sub-menu which shows you the input signal as 8-bit or 10-bit. Via a Club3D DP1.4 to HDMI 2.1 adapter I was only able to get 8-bit HDR according to the LG TV. Your other monitors might tell you more details, maybe? I’m away from my setup but I could post a photo later and more details on the connections and cables. I tried 3 different USB-C DisplayPort adapters with the same results on each... I plan to test what the LG TV says for an Nvidia 2060 soon too, to compare...


I've read users' reports of using that input stats screen to confirm 10-bit input.

They used a certified HDMI 2.1 cable to an Nvidia 3090; a 3080 should also do it. Not sure if the new consoles are pushing 10-bit, but they should be.

Can't test it myself yet as my LG CX and Nvidia 3090 are in the mail.


True HDR needs local dimming, and very few monitors outside of mini-LED, OLED, and reference displays have large amounts of it, but it is coming to consumer displays.


Higher color depth enables new DRM possibilities, because the more colors there are, the better the chance that data streams can be imperceptibly encoded on top of video.
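
The textbook toy version of that idea is LSB steganography: overwrite the least significant bit of each sample with payload bits. Real forensic watermarking is far more sophisticated (it has to survive re-encoding), but a small numpy sketch shows why extra bit depth gives more headroom to hide data below the visible threshold:

    import numpy as np

    # Toy LSB embedding in fake 10-bit luma samples. A 1-level change out of
    # 1024 is ~0.1% of full scale; out of 256 levels it's 4x closer to visible.
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 1024, size=(2160, 3840), dtype=np.uint16)
    payload = rng.integers(0, 2, size=frame.shape, dtype=np.uint16)  # bits to hide

    marked = (frame & ~np.uint16(1)) | payload      # overwrite the LSB
    recovered = marked & 1                          # extract on the other end

    assert np.array_equal(recovered, payload)
    print("max per-pixel change:",
          int(np.abs(marked.astype(int) - frame.astype(int)).max()))
    # -> 1 level out of 1024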

I sure as hell can't tell the difference between an image with 16M colors and one with a billion colors, so sometimes I think the above is the only reason why it exists or will be used when it's prevalent. But I'm older, so maybe my vision simply isn't as good.


Don't know what Apple does, but as far as I know the HDR label on TVs is a bit like the USB bucket of blatant lies. They can call themselves HDR if they accept the signal; they don't need to be able to display it correctly. So a 6-bit panel is allowed to call itself HDR if it can process the input somehow...


I believe the problem is indeed “HDR compatibility”, though having read the spec just now, no TVs appear to be truly HDR yet as defined by the ITU; they generally support the P3 colour space, which is a subset of BT.2020. Citing Wikipedia for the statistics:

> In coverage of the CIE 1931 color space, the Rec. 2020 color space covers 75.8%, the DCI-P3 digital cinema color space covers 53.6%, the Adobe RGB color space covers 52.1%, and the Rec. 709 color space covers 35.9%.

Technically DCI-P3 as used by projectors isn’t Display P3 as used by computers and smartphones but the numbers should give you an idea.
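
If you want to sanity-check those figures, a rough way is to compare the triangle areas spanned by each standard's primaries on the CIE 1931 xy diagram. It won't reproduce the Wikipedia percentages exactly (those are measured against the whole spectral locus), but the ordering and rough ratios come out the same:

    # Gamut triangle areas in CIE 1931 xy, using the primaries published in
    # each spec. Rough comparison only; not the same as "coverage of CIE 1931".
    def area(primaries):
        (xr, yr), (xg, yg), (xb, yb) = primaries
        return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

    gamuts = {
        "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
        "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
        "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    }
    ref = area(gamuts["Rec. 2020"])
    for name, prims in gamuts.items():
        print(f"{name}: ~{area(prims) / ref:.0%} of the Rec. 2020 triangle")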

What you want to look for is a high rating for “colour volume”, such as https://www.rtings.com/tv/tests/picture-quality/color-volume... It varies based on LCD vs OLED. Even a fancy LG OLED might only cover 87% of the DCI-P3 colour space, missing out on the brightest whites but absolutely nailing the darkest blacks in low-light viewing.


For TVs there is the "Ultra HD Premium" (seriously, who comes up with these?) label, which sets some baseline standard for HDR quality.

For displays, DisplayHDR provides certification; I'd aim for at least DisplayHDR 1000 for "true" HDR performance.


> For TVs there is the "Ultra HD Premium" (seriously, who comes up with these?)

The USB consortium, who else? :)



