
5K iMac Gets 10-Bit Color - tambourine_man
https://www.cinema5d.com/5k-imac-10-bit-color/
======
ryandamm
In some ways, this is Apple pushing things forward... but in other ways, it's
just them catching up. Until this update, there was no way to get 10-bit
displays working with Apple products _at all_, even with third-party monitors
that were 10-bit capable. The OS simply wouldn't support a 10-bit display. I
know you
could get it in Windows, and almost certainly in Linux with the correct
voodoo.

My post guys wanted to stay on the Mac platform, so we dealt with 8-bit
displays (color wasn't a huge part of our workflow -- but 10 bit would've been
nice). But it was one of those head-smacking moments for a platform that was
supposedly media-friendly.

Note, there's some confusion in the comments so far -- 10-bit is a
professional display feature, and it's extremely uncommon. Pro cameras and
software can
usually handle higher bit depths -- 16 for DSLRs, 10-12+ for pro video cameras
-- but you're probably not seeing it unless you've carefully set up your
computer. Those bits are still useful: they're the raw material for generating
your 8-bit final image, so you don't get banding when adjusting color or
exposure. And they're essential for containing a wider range of possible
values, letting pro cameras represent a wider dynamic range than is possible
in 8 bit systems.

And 8 bits is more typical -- it's literally baked into many file formats,
like JPEG. Some crappy screens on consumer electronics can't even represent
the full 8 bits per channel; 6 bits + dithering is sadly really common, even
in screens that advertise themselves as 8/24-bit. Also, color depth can be
reported both in bits-per-channel and total bit depth; a 10 bit-per-channel
display is 30 bits of color information per RGB pixel, 10 each for red, green,
and blue.
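
To make the per-channel vs. total arithmetic concrete, here's a quick
back-of-envelope sketch (my own illustration, just the standard powers of two):

    # levels per channel and total colors for 8-bit vs 10-bit-per-channel displays
    for bits_per_channel in (8, 10):
        levels = 2 ** bits_per_channel       # 256 or 1024 levels per channel
        total_bits = bits_per_channel * 3    # 24-bit or 30-bit "total" depth
        colors = levels ** 3                 # ~16.8 million or ~1.07 billion colors
        print(bits_per_channel, total_bits, levels, colors)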

(Some of the confusion is probably intentional; I had a hardware partner brag
about their '15 bit display,' which sounded very weird to me... until I
realized it was really 5 bits _per channel_, which is roughly bad-ATM-screen
quality.)

~~~
jacobolus
How much software fully supports the 10-bit Eizo or NEC displays on Windows
these days? Do typical GPUs now support 10-bit color, or is it still only
expensive workstation GPUs?

My impression was that only a few niche applications have bothered (notably,
Photoshop has had support for a few years), since 10-bit displays are so
rare/expensive, but I’m not an expert.

As for Linux, as far as I know the stable version of GIMP still only supports
8 bits/channel color, so displaying at higher bit depth is right out. I guess
maybe some specialized video or 3d rendering software could use a 10-bit
display? I doubt typical Linux apps support it.

* * *

Edit: In doing some web sleuthing, there seem to be endless issues with
compatibility. (As you’d expect from a new technology that requires upgrading
every part of the pipeline for support: applications, operating system, video
card drivers, video cards, display interface, and the displays themselves)

I see this on a random web forum: “I got an answer from an engineer at Adobe,
why Photoshop CC cannot display 10bit/color under Windows 8, 8.1 and 10: There
is no OpenGL-path with 10bit/channel like in Windows 7 any more :(”

Adobe’s troubleshooting site says “Note: 30-bit display is not functioning
correctly with current drivers. We are working to address this issue as soon
as possible”

~~~
MichaelGG
A funny note: During Longhorn (PDC 2003), some MS devs indicated that the
color pipeline for everything in Windows was moving to a 128-bit pixel struct.
Some floating point part, I think. Of course that never shipped, but it sounds
cool.

I think Carmack mentioned that in some extreme-but-possible cases, 64-bit
isn't enough to represent the results correctly.

~~~
ryandamm
I have to assume that's 128-bit precision in 3D rendering -- partly because
it's Carmack, but largely because 64-bit color would be... well beyond human
perceptual limits.

This StackExchange post claims ~2.4 million colors for the limit of human
color vision; it cites CIE color science:
[http://photo.stackexchange.com/questions/10208/how-many-
colo...](http://photo.stackexchange.com/questions/10208/how-many-colors-and-
shades-can-the-human-eye-distinguish-in-a-single-scene). 64 bits of total
color would produce ~2e19 different colors; if it's 64 bits per channel it
would produce an even more absurd number like ~6e57. (Besides the ridiculous
magnitudes, I assume the 64 bits are per channel, because 64 isn't divisible
by 3.)
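
For reference, the back-of-envelope math behind those figures (my own
arithmetic, not from the linked post):

    print(2 ** 24)        # 8 bits/channel:   16,777,216 colors
    print(2 ** 30)        # 10 bits/channel:  ~1.07e9 colors
    print(2 ** 64)        # 64 bits total:    ~1.8e19
    print(2 ** (3 * 64))  # 64 bits/channel:  ~6.3e57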

And yes, higher bit precision is really important for drawing 3D graphics.
Look at the price premium you pay for Nvidia's Quadro cards, which can process
at higher bit precision than your stock cards... they're typically 5-10x the
price of a consumer card:
[http://www.nvidia.com/object/quadro.html](http://www.nvidia.com/object/quadro.html)

They're mostly advertised for high-end CAD or VFX, where bit-precision errors
can result in odd display artifacts or incorrect rendering.

~~~
MichaelGG
I think the context was that after doing multiple layers of texture mapping on
a 3D object, a 128-bit color space wasn't enough to be fully accurate. I'll
see if I can find what I'm most likely misremembering.

Thanks for the SE link. 2.4M can't be right, or we wouldn't be able to see
banding all over the place, right? The link says that including luminosity it
might be up to 100M. So 24-bit wouldn't cut it but 30-bit would?

~~~
ryandamm
Like all things perceptual, an exact answer is maddeningly difficult to pin
down (individual-to-individual variation, viewing conditions like ambient
light, etc.).

Even at 10-bit you could in theory see banding if your color discrimination
were good enough, I think. It depends on the material... but the 10-bit
displays I've seen look awfully, awfully good. (And the ability to represent
more dynamic range starts to matter -- you have to have a wide scale to show
off the bit depth, too. A monitor I saw at NAB this year has two backlights
and can show off a lot more light... campfires glowed, headlights looked like
they were bright, very real... that's the real direction for displays, I
think.)

------
mixmastamyk
Why did this support take so long? I remember handling "10 bit" files on SGI
in the late 90's, and while most VFX houses are on Linux these days, I thought
Mac OS was in use there as well.

p.s. I think this was the format:
[https://en.wikipedia.org/wiki/Cineon](https://en.wikipedia.org/wiki/Cineon)

~~~
sjwright
You've been able to handle "10 bit" files on Macintosh or Windows since
forever. What's new here is native video card driver support for allowing
10-bits-per-channel (30 bpp) content to be viewed in real time on the main
display.

Let's be very clear here — being able to view 30 bpp is a very, _very_
marginal benefit for most people most of the time, even if they're working on
high bit depth material. Far more important have been the substantial
improvements in display gamut and accuracy.

For content creators the important thing is for this additional bit depth to
be captured in the first instance and preserved during manipulation.

~~~
MichaelGG
Is it very marginal? Shouldn't it reduce banding by a factor of 4 or so? I
notice banding a lot on all sorts of content.

~~~
miahi
The banding is mostly caused by color compression rather than by a lack of
device support. 99.99% of the content one usually sees is compressed, and
many times it's a second compression on top of something already compressed
(I don't know of any "uncompressed YouTube" or "raw Instagram").

------
DAddYE
From a comment in the article:

    
    
      Hey guys, it appears it’s not just for the 5K iMac, 
      I have tested this on 2013 Mac Pro with an Eizo CS230
      monitor and can confirm that you can get 10bit output.
    

[http://www.lsdigi.com/2015/10/el-capitan-10bit-display-
suppo...](http://www.lsdigi.com/2015/10/el-capitan-10bit-display-support/)

~~~
pilif
Yeah - and it also works with last year's 5K iMac. (Source: I have one.)

~~~
finnh
I also have a 2014 5K iMac with El Cap, and my system report shows 30-bit /
ARGB2101010.

However, when I view the test image from the article's comments
([http://www.imagescience.com.au/kb/getattachment.php?data=MTU...](http://www.imagescience.com.au/kb/getattachment.php?data=MTUyfDEwIGJpdCB0ZXN0IHJhbXAuemlw))
I still see banding. Hmm.

~~~
pilif
What are you viewing it in? It opened in Pixelmator for me, where it was
showing banding. Then I tried again with Preview, where it was OK. So the app
you use to open the file might not have proper 10-bit support.

~~~
finnh
Preview. The banding is pretty odd (the bands have gradients _within_ them,
it's weird), and it's also displaying artifacts from OS X's default window-
transparency settings. Turning those "down" helps a bit, but there's still
banding. I don't see an option to turn it off completely. So I'm not sure
where to point the finger.

------
haberman
On the downside, it appears that the newest iMacs still won't do Target
Display Mode. This is a bummer -- I like the iMac but hate the idea of having
this big beautiful display on your desk that you can't use to give your laptop
more screen real estate when you want to.

------
MrBra
Anyone care to explain, in brief, to someone who reads "10-bit color" and
still thinks, "hey, weren't we at 32-bit color already?"...

~~~
Monory
We are currently at 24-bit colour - 8 bits for each channel. 32-bit is used
for ARGB, with an alpha channel, so it's still 8 bits per colour channel.
This, however, moves us to 30 bits of colour, which is nice.
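
A rough sketch of how the two layouts pack a pixel into 32 bits (my own
illustration; the field order just follows the ARGB8888 / ARGB2101010 names):

    # ARGB8888: 8 bits each for alpha, red, green, blue (32 bits total, 24 bits of color)
    def pack_argb8888(a, r, g, b):
        return (a << 24) | (r << 16) | (g << 8) | b

    # ARGB2101010: 2-bit alpha, 10 bits per color channel (32 bits total, 30 bits of color)
    def pack_argb2101010(a, r, g, b):
        return (a << 30) | (r << 20) | (g << 10) | b

    print(hex(pack_argb8888(255, 255, 255, 255)))      # 0xffffffff -> opaque white
    print(hex(pack_argb2101010(3, 1023, 1023, 1023)))  # 0xffffffff -> opaque white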

~~~
crusso
What I don't understand is why 8-bit color isn't enough. Back when 8-bit-per-
channel color arrived (yes, I'm old), we were told that it provided way more
colors than the human eye could discern - about 16.7 million, as opposed to
the roughly 10 million the eye can distinguish.

What changed?

Is 10 bit color smoke and mirrors or is it something that only a smaller
fraction of the population can appreciate?

~~~
mtw
For seeing pictures and watching videos, 8 bit is good.

It's when you start manipulating images that it doesn't work out. For example,
you have a nice sunset picture, but it's dark and you can't see the people in
front. So you try to increase the saturation, contrast and exposure. The
people's faces are still dark, so you push the exposure more aggressively.
That's when the picture breaks down: the sunset sky starts banding, maybe the
skin blows out to white, etc. If you use an 8-bit display, you wouldn't know
whether that's because of the edits or because of the 8-bit monitor. With a
10-bit display, if it looks like crap, at least you know it's from the picture
you took.

This might sound like an extreme example, but it happens often to
photographers and videographers. It can also happen to consumers if they look
at a picture with lots of minimal variations of the same color. A good example
is a picture of a white sky: it can show banding in a few parts.
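
A rough sketch of why the picture breaks down (illustrative numbers only;
"pushing three stops" here just means multiplying by 8):

    import numpy as np

    # the same dark region captured with 8-bit vs 12-bit precision
    dark_8bit = np.arange(0, 32, dtype=np.float64)               # 32 distinct code values
    dark_12bit = np.arange(0, 32 * 16, dtype=np.float64) / 16.0  # same range, 512 values

    # push exposure three stops, then quantize for an 8-bit display
    pushed_8 = np.unique(np.clip(np.round(dark_8bit * 8), 0, 255))
    pushed_12 = np.unique(np.clip(np.round(dark_12bit * 8), 0, 255))

    print(len(pushed_8))   # 32 levels, 8 codes apart -> visible bands
    print(len(pushed_12))  # 256 levels -> smooth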

------
handonam
Would that mean the RGB values change in the future?

It would go from 0-F to 0-K wouldn't it?

so... #KKK = white? ;(

~~~
huuu
A lot of software already uses normalized RGB:

rgb(1.0, 1.0, 1.0) == white

~~~
StavrosK
And the idea is that you specify as many decimals as you need there?

~~~
huuu
Yes, but that's an interface problem.

Of course you can't provide more decimals than the number of bits used by the
float.

EDIT:

    
    
      0.999 * 255 (8 bit) = 255
      0.9999 * 255 (8 bit) = 255
      0.999 * 4095 (12 bit) = 4091
      0.9999 * 4095 (12 bit) = 4095
    

So for lower color depths the number of decimals isn't that important.
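
Or as runnable code, assuming a simple round-to-nearest-code quantizer (my own
sketch):

    # quantize a normalized color component to an integer code at a given bit depth
    def quantize(value, bits):
        max_code = (1 << bits) - 1     # 255 for 8 bit, 4095 for 12 bit
        return round(value * max_code)

    print(quantize(0.999, 8), quantize(0.9999, 8))    # 255 255   -> same 8-bit code
    print(quantize(0.999, 12), quantize(0.9999, 12))  # 4091 4095 -> different 12-bit codes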

------
therealmarv
And why, why, why can't I buy an external Apple monitor for my MacBook to get
that? I'd need to buy a whole Mac to get it :(

~~~
jacobolus
Give it some time, and you’ll probably need a new MacBook to get an Apple
Retina display working.

Support for Thunderbolt 3 (which should have enough bandwidth to drive a 5K
display at 60 Hz) is coming with Intel Skylake chips, which should be arriving
next year.

[http://blogs.intel.com/technology/2015/06/thunderbolt3/](http://blogs.intel.com/technology/2015/06/thunderbolt3/)

~~~
melling
Thunderbolt 3 uses the USB-C connector. That explains why Apple didn't ship
USB-C on their new iMacs.

------
xyproto
I think it's misleading that the title says "10-bit" instead of "30-bit".

8-bit graphics normally means 256 different colors total (at least for DOS-era
programming). 24-bit graphics is 8 bits per RGB channel.

------
Asbostos
That's a great step, but isn't it still not ideal because at 5k resolution,
you could still get banding on a continuous gradient covering the screen
without dithering?

~~~
sjwright
Theoretically, yes — 10 bits per channel increases the number of distinct
shades from 256 per channel to 1024. So on a 5K display, each shade would be
repeated for five physical pixels.

In reality, no — the differences in brightness between each shade are going to
be too subtle to notice.

Most egregious banding comes from 8-bit source material that has been
manipulated or "colour managed" into an 8-bit output product. If the source
material were 10+ bits, the banding would be far less noticeable, even if the
final output is still 8 bits.
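
The back-of-envelope arithmetic, assuming the 5K panel's 5120-pixel width and
a full-width horizontal ramp:

    print(5120 / 256)   # 8 bit:  each shade repeats for 20 pixels
    print(5120 / 1024)  # 10 bit: each shade repeats for 5 pixels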

~~~
interpol_p
I find that most significant banding comes from a large gradient that spans
between two very similar shades. (E.g., a full screen gradient that goes from
grey to slightly-darker-grey).

I fear that 10 bpp would hide the banding from you, so that when your users
see it on their 8 bpp displays they notice the banding that you didn't see or
get a chance to dither out with noise.
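
If you do get that chance, "dithering out with noise" can be as simple as
adding a little noise before quantizing to 8 bits (a minimal sketch, assuming
you control the quantization step yourself):

    import numpy as np

    def to_8bit_with_dither(image):
        """Quantize a 0..1 float image to 8 bits, adding noise to break up bands."""
        noise = (np.random.random(image.shape) - 0.5) / 255.0  # +/- half an 8-bit code
        return np.clip(np.round((image + noise) * 255), 0, 255).astype(np.uint8)

    # a subtle full-screen ramp from grey to slightly darker grey
    ramp = np.tile(np.linspace(0.52, 0.48, 1920), (1080, 1))
    dithered = to_8bit_with_dither(ramp)   # banding is traded for fine noise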

~~~
Gracana
I notice this a lot with web media. Designers have really nice setups, so they
don't have any problem designing sites with tiny thin gray text on gray
backgrounds with little gray icons etc, not realizing that it looks like
indecipherable mush on other people's hardware.

------
adcsd
My system info says 32-Bit Color. Is this incorrect?
[http://imgur.com/PAk8K8e](http://imgur.com/PAk8K8e)

~~~
jacobolus
ARGB8888 = 8 bits each of alpha, red, green, blue. If you had 10 bits per
channel it would say ARGB2101010
[http://i.imgur.com/ds1t6HV.png](http://i.imgur.com/ds1t6HV.png)

~~~
twsted
This is really confusing.

ARGB8888, and it says Pixel depth: 32-bit Color.

ARGB2101010, and it says Pixel depth: 30-bit Color.

~~~
Rexxar
I'm waiting for the first real 32-bit colour monitor, the one that becomes
transparent when alpha is set to zero.

~~~
PudgePacket
So you'd actually see through the monitor to the internals or out the other
side? That would be interesting :)

------
alkonaut
Interesting. Do APIs support 10- or 12-bit colors? Is it limited to "low
level" rendering, similar to when you render a region of the screen using a 3D
hardware device while the rest of the screen is drawn at 8 bpp using the OS's
graphics APIs?

I guess my question is: has OS X always had graphics APIs that use floats
instead of RGB bytes for everything?

~~~
0x0
At least iOS uses CGFloats for the color and alpha components in its UIColor class:

[https://developer.apple.com/library/ios/documentation/UIKit/...](https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIColor_Class/#//apple_ref/occ/clm/UIColor/colorWithRed:green:blue:alpha):

Going beyond 8 bits would be awesome; if you have large gradients and don't
use dither, you get really heavy banding. Imagine a gradient on a web page
background. If your screen is 1024 pixels high, that's 4 rows of pixels of the
same color even if your gradient goes from 0 to 255 on a channel. If your
range is smaller (dark gray to lighter gray), it gets even worse. Even older
APIs could benefit from this; for example, a CSS gradient specified as
#rrggbb...#rrggbb with 8 bits/channel could be rendered with more than 8 bits
per channel to avoid banding.
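
A sketch of that last point: even with 8-bit endpoints, interpolating at
higher precision gives more distinct output codes to spread across the screen
(illustrative numbers only, not any specific API):

    import numpy as np

    start, end = 100, 110                  # grey endpoints as 8-bit codes (#646464 .. #6e6e6e)
    ramp = np.linspace(start, end, 1024)   # one value per pixel row, computed in float
    as_8bit = np.round(ramp)               # 11 distinct codes  -> ~93-row bands
    as_10bit = np.round(ramp * 4)          # 41 distinct codes  -> ~25-row bands
    print(len(np.unique(as_8bit)), len(np.unique(as_10bit)))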

------
MichaelGG
As much as I dislike Apple, I do appreciate when they push things forward.
They've done the opposite for laptop design (bad keyboards, hot chassis) but
Retina has probably contributed a bit to other OEMs shipping not-totally-crap
panels. Maybe this will push that a bit more?

What I'd pay for an X series ThinkPad with a 10-bit "Retina" display...

~~~
LeoPanthera
I also love Apple, but it's a bit of a stretch to call this "pushing things
forward", when 10-bit color has been supported on Windows since Windows 7.

~~~
MichaelGG
I thought support was sorta so-so, requiring special apps + driver support.
Plus I've not seen any popular PC products shipping with it. If Apple makes
this part of their flagship, it might help. Or maybe PC vendors will keep
shipping low-res, ugly ass displays.

~~~
FireBeyond
Kinda like support on Apple now - "only with certain models, certain versions
of the OS, and only in two apps". I realize this will change, but it's not
much different.

------
LeoPanthera
The "original" article is here, though in German: [http://www.heise.de/mac-
and-i/meldung/iMacs-lernen-10-Bit-Fa...](http://www.heise.de/mac-
and-i/meldung/iMacs-lernen-10-Bit-
Farbwiedergabe-2854496.html#mobile_detect_force_desktop)

------
Keyframe
I was hoping for a push to 10-bit monitors for a while. The first step would
be for Nvidia to unlock 10-bit output on their GeForce series, and then for
monitor manufacturers to lower prices / introduce 10-bit panels... Not to
mention standardizing on a port for 10-bit.

------
shurcooL
Well, I know the first thing I'm checking when I get to work in the morning.

------
shurcooL
10-bit color. Retina-level DPI. 120 Hz refresh rate. External display. Pick...
two?

------
post_break
10 bit color, 5400rpm laptop hard drive. 2 steps forward, right?

~~~
lallysingh
Wait, seriously? Dude.

~~~
post_break
Sorry, I was wrong; the 5400rpm laptop drive is in the 21.5" 4K iMac.

------
pasiaj
When you need that extra push over the cliff...

------
ctstover
tl;dr: too much Mac universe; I'm assuming "10 bit" means 10 bits per channel
instead of 8, so what the rest of the world would perhaps call 30-bit versus
24-bit?

~~~
foolrush
I am wondering why you are getting downvoted when what you said about 30 bit
is precisely correct; 10 bit is 10 bits per color channel, which has
traditionally been called 30 bit, as per AMD[1] and Nvidia[2], and has been
supported since 2006.

[1] (PDF)
[https://www.amd.com/Documents/10-Bit.pdf](https://www.amd.com/Documents/10-Bit.pdf)
[2] (PDF)
[http://www.nvidia.ca/docs/IO/40049/TB-04701-001_v02_new.pdf](http://www.nvidia.ca/docs/IO/40049/TB-04701-001_v02_new.pdf)

~~~
arm
He’s getting downvoted because of the snark. If he read the article, he’d see
that even Apple themselves refer to it as 30-bit:

[https://www.cinema5d.com/wp-
content/uploads/2015/10/iMac-10-...](https://www.cinema5d.com/wp-
content/uploads/2015/10/iMac-10-bit-color_2.jpg)

[http://i1.wp.com/www.LSdigi.com/wp-
content/uploads/2015/10/S...](http://i1.wp.com/www.LSdigi.com/wp-
content/uploads/2015/10/Screen-Shot-2015-10-30-at-05.32.09.png)

