
4k, HDMI, and Deep Color - unwiredben
http://www.toonormal.com/2014/01/10/4k-hdmi-and-deep-color/
======
jrockway
I hate HDMI and wish it would die in favor of DisplayPort for everything. HDMI
even makes manufacturers print HDMI on the front of their monitors for a
discount on the royalties. That's why your new $3500 monitor says "HDMI" on
the front of it for no reason: to save ten fucking cents on licensing HDMI for
the unit. Meanwhile, DisplayPort is free.

HDMI must die.

~~~
freehunter
My laptop (Thinkpad) outputs over DisplayPort. I tried to pick up a DP to HDMI
cable from Best Buy when my DVD player died right before we were trying to
have a movie night, and the sales clerks had never heard of DisplayPort. Every
time I brought it up, they handed me either a VGA or HDMI cable, or insisted
that DisplayPort didn't exist, that there was only Mini DisplayPort. Why the
hell would they name it Mini DisplayPort if a full size DisplayPort never
existed?!

So yeah, in the "real world", DisplayPort doesn't exist outside of last-gen
MacBooks.

~~~
coldtea
Well, "Best Buy" is not the real world though.

Beyond your DisplayPort incident, Best Buy in general is a provincial consumer
electronics store, with ignorant sales personnel and a quite narrow selection.
It might have been good enough for 1995, but not in 2014.

It's like visiting a Walmart for delicatessen items.

~~~
freehunter
People _do_ go to Wal Mart for delicatessen items. Wal Mart sells packaged
lunch meats, sliced cheese, etc. Wal Mart even has ready-to-eat sandwiches
available. They're terrible, but they're cheap and not bad enough to make
people stop eating them. Likewise, Best Buy is indeed the real world. Mom is
going to go to Best Buy to pick up a new laptop. Dad's getting his new TV
there (with Monster cables so he doesn't get an HDMI virus). Your cousin was
there last weekend buying the camera that looks the best on the display stand.
Because, if Best Buy doesn't have it, there's no one to sell it to me. That
means I need to go on Amazon and shop around, or ask the family tech guy, and
he always gives weird suggestions or asks questions around what I'm really
looking for. Best Buy just tells me what to buy.

There's a reason your local deli has one store with three clerks and is only
open until 9pm, while Wal Mart is bigger than Jesus and open 24/7. We don't
live in the real world. The real world doesn't demand 4K with cables that can
carry the signal at full color, because it has different priorities than we
do.

~~~
coldtea
> _People do go to Wal Mart for delicatessen items. Wal Mart sells packaged
> lunch meats, sliced cheese, etc. Wal Mart even has ready-to-eat sandwiches
> available. They're terrible, but they're cheap and not bad enough to
> make people stop eating them._

I was using the term "delicatessen" as a quality identifier, not merely as a
product category. Any old cheese and lunch meat won't do.

> _Likewise, Best Buy is indeed the real world. Mom is going to go to Best Buy
> to pick up a new laptop. Dad's getting his new TV there (with Monster
> cables so he doesn't get an HDMI virus) (...) Because, if Best Buy doesn't
> have it, there's no one to sell it to me._

Sure, but it's not like it's 1995 and the local Best Buy in Boise, ID is the
only option. Tens of millions of people have an Amazon account and order from
it, and from plenty of even more dedicated electronics sites.

What's difficult about "going on Amazon and shopping around"? We're not
talking about our cousin or dad here; you were relating your own story of
visiting Best Buy for that cable for your Lenovo. Why even go there?

~~~
freehunter
> _What's difficult about "going on Amazon and shopping around"?_

It makes people have to think, and in a lot of cases people don't want to have
to think. I posted something longer on this subject recently:

[https://news.ycombinator.com/item?id=7105674](https://news.ycombinator.com/item?id=7105674)

> _Why even go there?_

Because I needed a cable today, not tomorrow or the day after. Amazon can't
match that in my location. That's where I ran into the issues that DisplayPort
is not mainstream, and Best Buy only carries things that Uncle Joe is going to
need for the TV and laptop he bought at Best Buy.

I understand you used the word deli to mean quality, but there's a reason
Apple still has less than 10% market share in desktops/laptops. People are
more often than not satisfied with "good enough".

------
cjensen
The article asks: _Which is the better trade off? More color range per pixel,
or more pixels with color channels?_

The answer is more color range per pixel.

Chroma is subsampled compared to Luma because that's literally how vision
works: the Human eye has far more Rods (brightness sensors) than Cones (color
sensors) per unit area. Increasing the color resolution on a TV[1] to match
the luma resolution is literally a waste of bits[2]. On the other hand, the
color receptors are very sensitive to the exact shade of color[3], so
increasing color bit depth is not a waste.

[1] Increased color resolution on a computer monitor _is_ useful because you
sometimes closely examine a small section of the monitor -- and possibly lean
in to examine it better.

[2] The old NTSC analog system used the same principle: color was encoded
using less bandwidth than the luma.

[3] As the author mentions, this is best observed in a gradient where the TV
must present the gradations in sufficiently small steps to fool the eye into
seeing a continuous gradation.
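
A rough sketch, in Python, of what 4:2:0 chroma subsampling keeps and throws
away. The luma/chroma weights are the usual Rec. 709-style ones; treat the
whole thing as an illustration rather than any decoder's actual code.

    # Toy 4:2:0 chroma subsampling on a 4x4 image (illustrative only).
    def rgb_to_ycbcr(r, g, b):
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        cb = (b - y) / 1.8556
        cr = (r - y) / 1.5748
        return y, cb, cr

    # A hard red/blue vertical edge: luma is close, chroma differs sharply.
    image = [[(255, 0, 0)] + [(0, 0, 255)] * 3 for _ in range(4)]
    ycbcr = [[rgb_to_ycbcr(*px) for px in row] for row in image]

    # 4:2:0 keeps every luma sample but only one chroma sample per 2x2 block.
    for by in range(0, 4, 2):
        for bx in range(0, 4, 2):
            block = [ycbcr[by + dy][bx + dx] for dy in range(2) for dx in range(2)]
            cb = sum(p[1] for p in block) / 4
            cr = sum(p[2] for p in block) / 4
            for dy in range(2):
                for dx in range(2):
                    y = ycbcr[by + dy][bx + dx][0]
                    ycbcr[by + dy][bx + dx] = (y, cb, cr)

    # The shared chroma smears the red/blue boundary while luma stays intact --
    # which is why flat-colored sprites and text (the GameDev case) suffer most.
    print(ycbcr[0])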

~~~
cclogg
If screens move to deep color, does that mean their contrast ratio would also
change to show such great variations? Ie an extreme would be the brightness of
the sun down to pure darkness. Or would it basically be the same as now, just
we could fit more variations of color into the same contrast? It's hard to
visualize because 8bpp color already seems like it covers every RGB I would
need to see...

~~~
baddox
The ability of a screen to display large contrasts is important, and tends to
improve with every generation of displays, but it's entirely separate from the
way colors are represented digitally.

------
bryanlarsen
I'm not sure why this is important. Obviously the author is much more
knowledgeable than I am, so perhaps I can be enlightened.

As noted in the article, Deep Color is important so that sampling errors do
not accumulate while images are filtered, processed & combined. But a
television or monitor is the final step, it should be performing very little
image processing. ("Should" is the operative word here; that's not necessarily
true.)

If you used HDMI to connect cameras to recorders or to connect effects
processors, this would be important. Does anybody do this for 4K?

~~~
icegreentea
It is possible to see banding at 8bit with a final product. Imagine a flat
gradient across the screen from black to white. 255 levels over say 2000
pixels (close enough) on a screen like... 6 feet across. Unless the gradient
is dithered (either in the TV, or at the source...), you get ~8 pixel wide
bands, which works out to 1/4th inch bands, which is definitely visible in
certain situations.

Dithering is certainly a solution, but higher bit depths also work, and in a
way that's much cleaner.
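
The back-of-the-envelope numbers, plus the dithering fix, in Python. The
screen width and pixel count are just the figures from the paragraph above;
the noise dither is a sketch of the idea, not any TV's actual algorithm.

    import random

    levels = 256          # 8-bit luma
    width_px = 2000       # roughly 4K-class horizontal resolution
    width_in = 72         # "6 feet across"

    band_px = width_px / levels
    print(f"band width: {band_px:.1f} px = {band_px * width_in / width_px:.2f} in")
    # ~7.8 px, ~0.28 inch bands on a slow gradient -- wide enough to see.

    # Dithering hides the steps: quantize a smooth ramp to 8 bits with a
    # little random noise instead of a hard cut at each band boundary.
    ramp = [x / (width_px - 1) for x in range(width_px)]            # ideal 0..1
    banded = [round(v * 255) for v in ramp]                         # hard steps
    dithered = [min(255, max(0, round(v * 255 + random.uniform(-0.5, 0.5))))
                for v in ramp]
    print(banded[784:796])    # a run of one code, then an abrupt step
    print(dithered[784:796])  # noise replaces the step; averaged over nearby
                              # pixels the ramp reads as smooth again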

EDIT: This implies of course that the final display is also capable of high
bit-depth display. Obviously, if the display is limited to 8bit, then it's
kinda useless. I think the author's point was that HDMI 2.0 -should- be kinda
future proof, and -should- anticipate widespread adoption of high bit-depth
displays.

~~~
Zardoz84
> It is possible to see banding at 8bit with a final product. Imagine a flat
> gradient across the screen from black to white. 255 levels over say 2000
> pixels (close enough) on a screen like... 6 feet across. Unless the gradient
> is dithered (either in the TV, or at the source...), you get ~8 pixel wide
> bands, which works out to 1/4th inch bands, which is definitely visible in
> certain situations.

If you have an RGB image where each color component uses a byte to represent
it, you should have an effective luma bit depth of 8 bits. Well, this is false
for LCD screens, because they have an effective luma bit depth of 7 or 6 bits
(or worse), so even using a VGA signal (an analog signal) you get banding on
these screens, whereas on a good old CRT screen you never notice it.

------
zanny
I wonder why we don't use the CIE XYZ color space for monitors - in practice,
humans see far fewer shades of blue than of red, so dedicating the same pixel
depth to both seems like a waste. I'm not a panel manufacturer, so I imagine
it is hard to architect a panel with subpixels of varying signaling depth,
right?

Also, are there even 12-16 bit color panels in production? I like to hope 4K /
8K mean the end of the pixel race, because we really need adaptive vblank
(from DisplayPort) at 100 Hz or more before we keep pushing the pixel density.
Hell, most of my family can't tell the difference between 480p and 1080p
already because the colors are so bad on most consumer TV panels.

------
bhauer
Awesome highly-detailed write-up. Thank you.

I agree that deep color is fascinating, especially in light of the fact that
deep color for end users has been just over the horizon for so long (I've
ranted previously about my disappointment that 24-bit color has been the
pinnacle of desktop computing for well over a decade). I personally find
24-bit banding quite annoying, especially with animation and fade effects,
which visually exaggerate the limitation.

I would absolutely love a 50" concave OLED display with high-DPI, high-speed
deep color. For the time being, in the real world of compromises, I'd be happy
with either HDMI 2 or DisplayPort 1.2+, since I am presently dealing with 30Hz
at 4K. Given that GPUs available today support 4K at higher refresh rates on
DisplayPort, my current very slight preference is DisplayPort over HDMI 2
(which as far as I know is not supported by any GPU I can buy today).

------
drakaal
Filled with wrong information.

First and easiest: anything that you can display over DVI can be displayed on
a display that implements DVI over HDMI, so DVI doesn't look better than HDMI
for any technical reason. This is how all those non-standard resolutions over
HDMI work, and it allows for color combinations that are not part of the
standard.

Now that that's established: 4K doesn't actually require HDMI 2.0 to work,
because any combination of resolution and color that is supported by the
"speed" of the wire and can be communicated in the "handshake" between devices
can work now.

Unlike most standards, HDMI has "requirements" for certification, but those
are not upper limits, they are lower limits.

So 4K at 24 FPS and 4:2:0 could be negotiated on an HDMI cable that is 1.1
compliant.

The author should read less Wikipedia. When I was at Microsoft's Media Room I
had to have IT block Wikipedia because engineers were getting so much wrong
information from its articles. Read the standards, talk to device
manufacturers. Wikipedia is not a place for deep tech, it is a place for a
quick answer. If it isn't a knol of data like a stat for transfer speed,
assume someone like this author wrote it, and it will be inaccurate.

~~~
lstamour
I have to agree. I didn't see any mention of H.265 (edit: excuse me, HEVC),
yet I'm sure I've read that it's what's going to speed up 4k beyond current
HDMI 1.4 limits. (Though I see now via Google that HEVC requires quite a bit
of hardware for real-time compression of a live signal at broadcast quality.)
That said, I wouldn't bash Wikipedia too much. Write up better sources and I'm
sure the editors will notice ;-)

~~~
drakaal
Nope. We'd fix things and they'd come back.

------
cromwellian
Using 12-bits per color would seem to give more precision within that axis,
but doesn't seem to say anything about the dynamic range covered. sRGB is a
subset of what human vision is capable of, e.g. the CIE XYZ/RGB colorspace.

Seems like the ideal display can:

1) match the dynamic range of the real world in luma and chroma
2) have resolution that makes seeing pixels very hard
3) have enough precision/gradation within the dynamic range to avoid banding
4) support high framerates

~~~
klodolph
There are several errors in this comment.

1. The CIE XYZ color space is different from the CIE RGB color space.

2. "Dynamic range" is not about gamut but the ratio between white and black.
A monochrome display can have high dynamic range.

3. Neither the CIE XYZ nor the CIE RGB color space corresponds to human
vision. The CIE RGB color space uses three monochromatic primaries (700 nm,
546.1 nm, and 435.8 nm), so there are visible colors that it can't represent
without using negative numbers (such as the blue color of an argon laser). The
CIE XYZ space is a linear transformation of the CIE RGB space designed to
represent all visible colors without using negative numbers; the trade-off is
that most of the colors in XYZ are imaginary -- they cannot be perceived by
humans, recorded by cameras, or displayed by monitors. The color space closest
to human perception is CIE LMS, which still contains many imaginary colors.
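
A quick numeric illustration of the "imaginary colors" point, in Python. The
XYZ-to-linear-sRGB matrix below is the commonly published one; take the exact
coefficients as an assumption for the sake of the sketch.

    # Negative channel values mean no physical RGB display can show the color.
    XYZ_TO_SRGB = [
        [ 3.2406, -1.5372, -0.4986],
        [-0.9689,  1.8758,  0.0415],
        [ 0.0557, -0.2040,  1.0570],
    ]

    def xyz_to_linear_srgb(x, y, z):
        return tuple(m[0] * x + m[1] * y + m[2] * z for m in XYZ_TO_SRGB)

    print(xyz_to_linear_srgb(0.0, 1.0, 0.0))           # "pure Y": R and B go negative
    print(xyz_to_linear_srgb(0.3127, 0.3290, 0.3583))  # D65-ish white: all positive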

If you want better colors, look no further than Rec. 2020, which is probably
going to come around soon. It still won't represent argon-laser blue, but the
cost of representing all colors on a real, physical display is prohibitive (to
the point that no prototypes even exist). Agreed that sRGB is a rather small
color space, but improvements in color representation have to take into
account the engineering needed to make them happen. The question "Which
primaries should I use?" was thoroughly explored during the development of
Rec. 2020, and I suggest you read their rationale.

~~~
cromwellian
Well, if you want to be pedantic, "dynamic range" (DNR) is not about white and
black, it means the ratio of the largest/smallest values of a changeable
quantity. This can be sound, it can be light, it can be a neuron response. The
bits of a color component give you how many finite pieces you can slice up the
range into. An encoding (linear/exponential/etc.) gives you a projection from
that bit representation into the range of reconstructed values. My general
point is, it's not purely about the # of bits, it's also about the range and
projection function used.

But yes, you are right about the other stuff. But I'm not concerned about what
can be achieved with real physical display panels today. I was talking about
idealism, not incrementalism: I want to look at a display and almost not
notice it's there, like looking through a window in my wall. Is Rec 2020 going
to get us there?

~~~
klodolph
You're asking for something that _nobody_ is prepared to deliver, and nobody
has plans for delivering it, and nobody knows what kind of technology would
even be used to create such an experience.

1. Natural contrast ratios are so far beyond the limits of current display
technology that it would take a complete revolution in material science just
to figure out what to make the actual screen out of.

2. Natural colors have a gamut that cannot be reproduced using the
fixed-primary model. You would have to do something like put a dynamically
tunable laser inside each pixel.

It's like asking for a USB port on your computer in 1949.

However, if you're prepared to make some small compromises:

1. Okay, contrast ratios are limited by glare from ambient light.

2. Okay, we'll use three primaries: red, green, and blue.

THEN, Rec. 2020 is what you want.

> "dynamic range" (DNR) is not about white and black, it means the ratio of
> the largest/smallest values of a changeable quantity.

In the context of displays, the changeable quantity is the amount of light.
White is the name of the largest value, black is the name of the smallest
value.

------
acd
People cannot see 4K resolution at normal couch viewing distances unless they
have really huge displays.
[http://s3.carltonbale.com/resolution_chart.html](http://s3.carltonbale.com/resolution_chart.html)
[http://carltonbale.com/does-4k-resolution-matter/](http://carltonbale.com/does-4k-resolution-matter/)
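
The arithmetic behind those charts is simple enough to sketch in Python,
assuming the usual ~1 arcminute "20/20" visual acuity figure (the function
name and example sizes here are just for illustration):

    import math

    def max_resolving_distance(diagonal_in, horizontal_px, aspect=16 / 9):
        """Farthest distance (inches) at which one pixel still spans ~1 arcminute."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        pixel_pitch = width_in / horizontal_px
        one_arcmin = math.radians(1 / 60)
        return pixel_pitch / math.tan(one_arcmin)

    for diag in (50, 65, 84):
        d1080 = max_resolving_distance(diag, 1920) / 12
        d2160 = max_resolving_distance(diag, 3840) / 12
        print(f'{diag}" set: 1080p blends at ~{d1080:.1f} ft, 4K at ~{d2160:.1f} ft')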

~~~
function_seven
I see this all the time, that people cannot perceive pixels smaller than _x_
degrees (using angular units to cancel out pixel size vs. view distance). And
it's true--the eye cannot resolve smaller than that value.

But just as we cannot perceive individual frames at 24 Hz, but instead see
motion, there's still an emergent effect that is perceived when we juice that
frame rate up to 60, or 120, etc. There exists a frame rate where the
perception changes from "really fast slideshow" to "motion" and that value is
pretty low (something like 12 Hz?). Yet, higher frame rates still deliver a
noticeable change in the quality of the video.

I think the same effect will apply to very high pixel densities. Think of the
moire effects that occur when the news anchor is wearing a striped jacket, for
example. There have got to be other emergent qualities that come out of having
such a tight pixel density, even if no single pixel can be discerned at a 3
meter distance.

(Disclaimer: the above is pure conjecture, I have not compared a 4K display to
an HD one from normal viewing distance)

------
zokier
Video formats can be weird.

I think there is (at least) one factual error in the blog post; 16-235 range
was not chosen because of CRTs:
[http://en.wikipedia.org/wiki/Rec._709#Digital_representation](http://en.wikipedia.org/wiki/Rec._709#Digital_representation)

In that light I'd also claim that the expansion to the full range of values is
probably one of the smallest advantages of xvYCC. The wider gamut is
significantly more important.

Also, if you notice when watching video files that blacks do not look very
black, then you should change your video player. As the aforementioned
Wikipedia article says, value 16 is intended to represent pure black in the
Rec. 709 colorspace.
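
For reference, the limited-to-full-range expansion a well-behaved player does
is just a linear stretch; a minimal sketch (values outside 16-235 are
"blacker-than-black"/"whiter-than-white" headroom and simply get clipped
here):

    def limited_to_full(y):
        """Expand Rec. 709 'video range' luma (16..235) to full range (0..255)."""
        return max(0, min(255, round((y - 16) * 255 / 219)))

    print(limited_to_full(16))   # 0   -> pure black, as the spec intends
    print(limited_to_full(235))  # 255 -> peak white
    print(limited_to_full(128))  # 130 -> mid grey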

~~~
cjensen
4K uses Rec. 2020 [1], which has a larger gamut than Rec. 709.

[1]
[http://en.wikipedia.org/wiki/Rec._2020](http://en.wikipedia.org/wiki/Rec._2020)

~~~
zokier
But Rec. 2020 also only specifies 10-bit and 12-bit color depths, so I'd
assume that 8-bit 4K is not Rec. 2020.

------
alanctgardner2
It's kind of interesting, but damn, this guy cannot communicate effectively.
Was the point that >8 bits of colour in HDMI is a good idea? That modern
displays can't show them (I don't think this is true)? Is his example image
really suffering from being 8-bit colour? It looks like it was exaggerated by
making the quantizer steps really big. There's also no real discussion or
example of what chroma subsampling looks like.

------
shurcooL
Very nice detailed write-up, thanks!

Chroma subsampling goes a long way to explain the issues I was seeing with the
Seiki 4K TV at
[http://hardforum.com/showthread.php?p=1040546819#post1040546819](http://hardforum.com/showthread.php?p=1040546819#post1040546819).

------
seanalltogether
"That’s why we GameDevs prefer formats like PNG that don’t subsample."

Can anyone explain why we have this disconnect between video/monitor standards
and bitmap/png standards? What would be the problem with doing 8/10/12 bit RGB
front to back, or YCbCr front to back?

~~~
bryanlarsen
3840 * 2160 * 36 bits * 60 Hz = 18 GBits/second, which is more than an HDMI
2.0 cable can handle, especially once you add audio and framing.
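
The arithmetic, spelled out in Python (a sketch: "payload" here ignores audio,
blanking, and the encoding overhead the parent alludes to, all of which eat
further into the link rate):

    width, height = 3840, 2160
    bits_per_pixel = 3 * 12   # RGB / 4:4:4 at 12 bits per channel
    fps = 60

    payload = width * height * bits_per_pixel * fps
    print(f"{payload / 1e9:.1f} Gbit/s")   # ~17.9 Gbit/s of pixel data alone
    # HDMI 2.0's headline 18 Gbit/s is the raw link rate; the usable data rate
    # after overhead is lower, so 4K/60 at 12-bit 4:4:4 doesn't fit.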

~~~
jfb
18Gb/sec.

------
mistercow
>All this video signal butchering does beg the question: Which is the better
trade off? More color range per pixel, or more pixels with color channels?

I would tend to say higher resolution is more worthwhile. I obviously don't
have a way to test this directly, but my guess is that 8k at 180 fps and
2-bit, using spatial and temporal dithering, will look better than 4k at 60
fps and 12-bit, even though those two signals would contain the same amount of
information.

The reason is that the dithering allows the same color depth to be presented,
but the discreteness is kept well beyond the boundaries of what your eye can
pick up.

Obviously there are plenty of _technical_ reasons that this isn't exactly
practical.

------
devindotcom
Good thing to point out. Though I think this is, as zokier points out, a less
important point than accuracy and breadth of color gamut. Also, in regard to
"the blacks don’t look very black," a lack of dynamic range is also somewhat
to blame for that. Luckily, OLED displays are here to address both issues!

4K at this point serves little purpose, but the increased resolution will be
nice once the quality catches up.

~~~
zokier
> 4K at this point has little point, but the increased resolution will be nice
> once the quality catches up.

Added resolution has the nice property that it can be traded for color depth
by the use of dithering.
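
A minimal sketch of that trade in Python, using ordered (Bayer-style)
dithering; the 2x2 threshold matrix is the standard one, but treat the rest as
illustrative:

    # Show a 10-bit level on an 8-bit display by spreading the error over a
    # 2x2 block so the block's average approximates the original value.
    BAYER_2X2 = [[0, 2],
                 [3, 1]]

    def dither_to_8bit(value_10bit, x, y):
        threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4   # 0..1
        return min(255, int(value_10bit / 4 + threshold))

    level = 513   # a 10-bit grey that falls between 8-bit codes 128 and 129
    block = [[dither_to_8bit(level, x, y) for x in range(2)] for y in range(2)]
    print(block, sum(sum(row) for row in block) / 4)   # three 128s and one 129,
                                                       # block mean ~128.25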

~~~
cookingrobot
Does anyone do time-based dithering to increase color depth? If a monitor can
only control color to 8 bits, but has 10 or more bits of color data coming in,
it could shift the pixel rapidly between the desired colors. I'm not sure our
eyes would be able to see a rapid one-step color change in any single pixel,
and it should break up the color banding that we can see.

Is this technique used anywhere?

~~~
brigade
Yes, all non-IPS laptop displays, for instance. They're actually 6-bit and use
temporal dithering to display 8-bit color. And indeed, most "10-bit" panels
are 8-bit with FRC (frame rate control).
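
FRC is exactly the time-based dithering the parent asks about. A minimal
sketch of the idea in Python (illustrative; not any panel's actual algorithm):

    # Show an 8-bit value on a 6-bit panel by alternating between the two
    # nearest 6-bit codes over successive frames (frame rate control).
    def frc_frames(value_8bit, n_frames=4):
        low, frac = divmod(value_8bit, 4)   # 6-bit code plus 2-bit remainder
        # Out of every 4 frames, show the higher code 'frac' times.
        return [min(63, low + (1 if f < frac else 0)) for f in range(n_frames)]

    print(frc_frames(130))   # 8-bit 130 -> [33, 33, 32, 32], averaging 32.5
    # Averaged over time by the eye, 32.5/63 is close to 130/255, so the
    # in-between shade is perceived even though the panel drives only 64 levels.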

~~~
lstamour
Ah don't get me started on how I hate being lied to by monitor manufacturers.
I have to start looking for 14-bit LUTs before I begin to trust that the panel
might actually be 10-bit. Then again, I've an 8-bit panel with FRC and can't
tell the difference. They both oversaturate as much as I'd like them to. :)

