
24-bit color sucks - bhauer
http://tiamat.tsotech.com/24-bit-color-sucks
======
twelvechairs
Missing the wood for the trees. RGB displays are deliberately made to be 'good
enough' rather than 'great'. None of them can display whole areas of the
visible spectrum [1]. Before taking issue with tiny amounts of visible
banding, maybe it would be better if monitors could actually display a
reasonable portion of the visible spectrum...

[1] For a good graphical example, see
<http://en.wikipedia.org/wiki/File:CIExy1931_srgb_gamut.png>, where grey is
the entire visible spectrum and the coloured area is the standard RGB colour
space.

~~~
0x0
24-bit is far from enough to prevent banding, and in many cases it's not just
"a tiny amount" either. For example, a CSS gradient from #333 to #666:

<http://i.imgur.com/9xeDT.png?1>

It's quite obvious that for a gradient from 0x33 to 0x66 (one component),
there are simply not enough distinct values to prevent banding. In this case,
the range covers only 52 distinct values, for a gradient that may span
500-1000 pixels. It really looks quite terrible.
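
A quick back-of-the-envelope check (a Python sketch; the 800 px width is an
assumption for illustration):

    # gradient from 0x33 to 0x66 in one channel
    lo, hi, width_px = 0x33, 0x66, 800
    levels = hi - lo + 1            # 52 distinct 8-bit values in range
    band_width = width_px / levels  # ~15.4 px per flat band
    print(levels, band_width)       # 52 15.38...

Flat bands roughly 15 px wide sit well above the threshold of visibility,
which is exactly what the screenshot shows.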

------
cromwellian
It's overreach to say high DPI displays eliminate the need for anti-aliasing.
You will still get shimmering, pixel popping, and temporal aliasing, it will
just be harder to notice. If you have thin shapes like hanging cables or
chain-link fences being displayed, even at high resolution, you will get
pixels coming in and out of existence. Higher sampled AA tends to add spatial
"stability" so that slight shifts of viewpoint won't trigger this.

~~~
bhauer
Author here. You're right, high-DPI does not actually eliminate aliasing and
I've toned down the wording slightly (added "nearly"). Certainly high-DPI is a
huge step in the right direction even if several problems with aliasing
remain.

I was simply attempting to say "thank you" to Apple for innovating with high-
DPI displays so that I could follow that by saying more or less what you did:
many things remain to address image clarity.

------
ck2
What on earth is with all the motion on the website.

Just because you _can_ do animation doesn't mean you should.

~~~
fuzzix
It made the point about being able to distinguish fine colour gradients
virtually impossible to verify - the background kept changing.

Also, it slowed site scrolling to a crawl on this reasonably beefy desktop
machine.

~~~
snuze
What browser are you using? Not a problem on this mid-grade laptop.

------
1331
> Academics have told us that human eyes can't distinguish between the
> 16,777,216 colors provided by 24-bit depth, so we believe that, even though
> it can't be true.

This is a fallacy. That the human eye cannot distinguish N different colours
does not imply that a given colour space of >N colours contains every colour
that the human eye can distinguish.

To give an equivalent example that is easier to understand, consider a colour
space with 16.7 million shades of red. 16.7 million colours is more than the
human eye can distinguish, but there are clearly many colours that the human
eye can distinguish that are not in that colour space (notably shades of
green, shades of blue, and combinations of red, green, and blue).

~~~
4ad
You are in violent agreement with the author. This is what he says as well.

~~~
mpyne
I think he's disputing the author's assertion that "being unable to
distinguish the millions of colors in 24-bit sRGB can't be true" is
necessarily true given the premise. It's important to get the fine details
right even if you agree with the overall point (though I'm not sure who's
actually right here).

------
Xcelerate
I agree. Displays in general suck. The display on the RMBP is a step in the
right direction, but there's still a lot of work to be done. Anyone who
settles for "good enough" may as well live life like it's the 16th century. It
was "good enough" then too.

My vision for the future? Specifications that fully exceed the human capacity
for discernibility.

2880x1800? Nope, I can still see aliasing (particularly in Terminal when I'm
coding).

IPS? Nope. Move your head slightly up and down and -- while the chromaticity
stays (roughly) the same -- the luminance does not.

Black levels? I can still distinguish a black screen from the bezel, so it
needs some work too.

Color depth? See the provided banding examples. Not to mention that the three
primaries in most LCD panels form a very small triangle in the chromaticity
diagram.

Refresh rates also stink. It's 2012 -- motion should be so fluid it looks real
by now.

Anyway, there's a lot of potential for improvement, but I'm afraid that unless
you're OCD (like me) or a color scientist, you just don't care too much.

~~~
donaldc
While we're at it, let's also add 3D to displays.

~~~
lmm
Mine already has it. Doesn't yours?

------
kevingadd
The number of bytes of memory available on GPUs is hardly the biggest issue
when considering rendering at more than 8 bits per channel of color precision.
Even producing a 30-bit framebuffer (as has been supported for a while on many
desktop GPUs) comes at a performance penalty and is incompatible with many
pieces of software and hardware. Modern GPUs actually allow 16 bits per
channel of precision when rendering, but it comes at a significant cost -
various features no longer function, memory bandwidth is devoured, etc. You
can't simply say 'we can spare the bits' and wave away all the technological
challenges here, especially when the advantage gained from all those costs is
comparatively minuscule.

To be fair, this sort of applies to retina displays as well: The hardware put
into some of the Retina macs can barely handle the bandwidth demands of
realtime rendering at such high resolutions. Until that problem is addressed,
you certainly shouldn't be running around demanding >8bpc color precision.
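
Rough numbers for the bandwidth point (a back-of-the-envelope Python sketch,
assuming the 2880x1800 Retina panel at 60 Hz; real rendering touches each
pixel many times, so these are lower bounds):

    w, h, hz = 2880, 1800, 60
    formats = (("RGBA8 (8bpc)", 4), ("RGB10A2 (30-bit)", 4),
               ("RGBA16F (16bpc)", 8))
    for name, bytes_per_px in formats:
        gb_per_s = w * h * bytes_per_px * hz / 1e9
        print(name, round(gb_per_s, 2), "GB/s for one access per pixel per frame")
    # RGBA8 ~1.24, RGB10A2 ~1.24 (packs into the same 32 bits),
    # RGBA16F ~2.49 -- 16bpc doubles the traffic for every such surface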

Dithering for gradients could certainly make a minor difference in rendered
quality, but I don't think most browser vendors are interested in making
rendering performance _slower_ right now - they're quite busy trying to make
it faster, as evidenced by the fact that the OP's site runs like complete
garbage in even modern browsers. Even if you spend the cycles and the power to
dither, you're basically approximating something like another bit of
precision. Is it really worth the cost just for another bit? You could at
least support the argument for dithering by showing a side-by-side comparison.
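
For reference, the kind of dithering under discussion is simple to sketch (a
minimal ordered-dither example in Python with numpy, applied to the
#333-to-#666 gradient; this is not any browser's actual implementation):

    import numpy as np

    # 2x2 ordered (Bayer) dither: quantize a continuous gradient to 8 bits,
    # using a fixed sub-LSB threshold pattern to break up the flat bands
    bayer = np.array([[0.0, 0.5], [0.75, 0.25]])
    grad = np.linspace(0x33, 0x66, 800)            # ideal, continuous values
    rows = np.arange(4)[:, None] % 2
    cols = np.arange(800)[None, :] % 2
    thresholds = bayer[rows, cols]                 # tiled 2x2 pattern
    dithered = np.floor(grad[None, :] + thresholds).astype(np.uint8)

Neighbouring pixels round in different directions, so small regions average
out close to the ideal value - roughly one extra bit of apparent precision, as
the parent says.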

~~~
snprbob86
> Modern GPUs actually allow 16 bits per channel of precision when rendering,
> but it comes at a significant cost - various features no longer function,
> memory bandwidth is devoured, etc.

I'm not sure that this is true. It's extremely common for games to render
lighting texture maps with HDR render targets. 16-bit-per-channel render
targets are often used to accumulate excess light. The excess light gets
processed by successive full-screen pixel shader passes. These passes
generally include blurring and flattening into a normal 8-bit-per-channel
texture on the swap chain.

Furthermore, multiple render targets have been common since before the release
of the Xbox 360. Deferred shading engines come into and out of popularity on
what seems like a biyearly basis depending on what visual style is popular in
games currently.

It seems to me that modern hardware is quite capable of greater color range,
but the most demanding applications, games, would rather use that extra video
RAM and fill rate for storing and processing surface normals, lighting
scalars, material properties, etc. Desktop applications always lag behind
games dramatically. Hell, browsers are just now starting to see some of the
benefits of modern GPUs.
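
The accumulate-then-flatten flow described above is easy to sketch (a
Python/numpy sketch, using a Reinhard tonemap as a stand-in for whatever
full-screen passes a real engine runs):

    import numpy as np

    # light accumulates in a half-float HDR target; radiance can exceed 1.0
    hdr = np.random.rand(1080, 1920, 3).astype(np.float16) * 8.0

    # a full-screen "flatten" pass: map HDR down to the 8bpc swap chain
    ldr = hdr / (1.0 + hdr)                           # Reinhard operator
    swap_chain = (ldr * 255.0 + 0.5).astype(np.uint8)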

~~~
kevingadd
>8bpc render targets have limited feature sets, especially when talking about
older architectures like the Xbox 360.

It's true that they're used to accumulate lighting and other information, but
that's basically one of the simplest possible use cases for a render target;
you're just writing pixel data.

Multisampling, blending, etc - all things used often in 3D rendering - are not
necessarily supported on a >8bpc render target. Whether you can do them
depends on the drivers and the hardware. The same goes for textures that are
>8bpc (filtering might not work, for example).

I ran into this problem just yesterday in a program I was writing when I tried
to do blending on a high precision render target. :)

------
fusiongyro
Your site design is cool, but the motion is interfering with readability.

~~~
graue
Not only that, it's such a processor hog it makes my MacBook Air fans whir at
full volume.

I disabled JavaScript on the site, hoping that would make it behave, and
got... no blog, only this:

> This blog uses a little JavaScript. Nothing dodgy, though, and nothing
> hosted at third-party sites. Just some jQuery and animation bits. So please,
> if you'd be so kind, ask Noscript to call off the hounds.

Congratulations, those “animation bits”, which you won't let me turn off, make
your site unbearably annoying to read.

~~~
xnxn
> which you won't let me turn off

In the lower right corner of the site there is the word "normal". Hover over
this area and select "stop animation" from the menu that appears.

(Not that this excuses any of it. Just saying it's possible.)

~~~
darklajid
Ahahaha. Hover. On a tablet.

(Didn't see any controls whatsoever, but that site happily slowed my FF to a
crawl.)

And that ignores the issue of arguing for more fidelity while making the
argument hard to read.

------
yen223
Am I the only person who couldn't tell the difference in that horizontal
gradient example?

~~~
jamesaguilar
I could, but barely. Not enough that this website made me care about the
issue.

~~~
mrspeaker
That's his point though, right: you are saying "meh, it's good enough".

~~~
mpyne
It's _always_ only good enough. Who actually has a literally perfect life?

~~~
bhauer
True. Every step in technology's progress is selected because it's "good
enough" to ship the technology out of the lab.

The point I'm making is this: it was good enough in ~1995. Frankly, it was
awesome back then! But is it really still good enough in 2012?

------
lucian1900
I don't see any difference between those two bars, and my eyes are generally
pretty good.

24-bit is fine. It's <200 DPI that has to die.

~~~
w0utert
I don't see it either :-/

I zoomed in until the bars almost filled the screen and still didn't see any
gradient step, despite the text saying it should be obvious. Kind of defeats
the point about 24-bit sucking...

------
devsatish
With all due respect, I don't see the gradient break. Maybe it's because of
that gray animation rotating behind the div? This post reminds me of
yesterday's HN parody.

~~~
mrb
Me neither. I can't see the gradient break. Not even after tweaking the
brightness of my monitor.

Edit: I can't see it because my config does automatic dithering. See my other
posts.

~~~
bhauer
mrb, I am envious of your configuration. If I could turn on system-wide
dithering, I would do so in a heartbeat. You tempt me to ditch Windows even
though I am (embarrassingly enough to admit here) actually fairly happy with
Windows overall.

------
raverbashing
An important item is missing in this discussion.

Several monitors reduce your 8-bit-per-channel image to 6 bits per channel.

~~~
curiousdannii
How would we determine if our screen is doing that?

~~~
ars
If it's an LCD and you didn't pay a lot for it, then it's doing that.

Graphic designers usually pay for good quality monitors, but they would be
well served by also having a bad monitor to check what their work looks like
for most people.

------
brudgers
What problem does 30 bit color depth solve?

The obvious answer (or strawman) might be aesthetics. Yet two bits is
sufficient to create art. Each medium has its limitations. Pen and ink,
watercolor, clay - why should a computer screen be seen as inherently
different?

We tend to view computer art on our own often miscalibrated displays, under
variable lighting conditions, and at a variety of resolutions. Computer art is
generally mass produced. Yes, it is behind glass, but not in the manner of the
_Mona Lisa_.

I'm not saying that "deep color" isn't worthwhile. Only that a coherent case
for its practical advantages wasn't made. The problem wasn't obvious on my
screen, and I am biased toward content over form.

~~~
4ad
Let's only use two bits then!

~~~
mpyne
Are you really trying to suggest that the difference between the 24-bit color
we already can't reliably reproduce on current monitors and the author's
30-bit is of the same order of magnitude as the difference between 24-bit and
2-bit monochrome?

~~~
4ad
No, I am being ironic in order to try to diplomatically say that the comment I
was replying to is not hacker news worthy (especially in its original form).

------
seanalltogether
Green is always the worst offender

<https://dl.dropbox.com/u/1437645/banding.png>

~~~
mrb
I don't see anything wrong with your image. The gradient looks perfect on my
monitor.

Edit: it looks perfect because my sw/hw config (driver Xorg fbdev 0.4.2, AMD
Fusion E-350, Lenovo X120e laptop) does automatic dithering. I can see a tiny
bit of dithering on the white end of the other gradient posted by nhw:
<http://file.st/KWb7XS7x> Other than that, no banding in either image, and the
dithering is invisible on yours. Dithering 24-bit colors really helps.

~~~
lucian1900
Same, on my similar E-350 Lenovo E325. Using the proprietary driver.

------
haberman
I'm missing the part about what awesome thing would be possible to do with
higher-bit displays. The answer for higher-DPI displays is obvious: everything
looks sharper, all the time. While it may be true that 30-bit color would
allow the gradient between those two grays to be gradual instead of a single
step, when does someone actually want to draw a gradient between such similar
shades of gray?

~~~
Keyframe
It's not about gradients between gray shades. It's about more shades, which is
useful in DTP, video and film, digital painting, and so on. If you ever have a
chance, take a look at a high-quality picture or video on a display like the
Eizo CG275W (or the newer 276) connected to a 30-bit-output-capable card
(Quadros have it, GeForces don't). It's a whole level above what you see on
regular monitors, even calibrated ones. OTOH, they are really expensive - I
get by with a calibrator and regular monitors.

------
sukuriant
What's even more fun is when monitors, especially LCD displays, are only 5
bits deep for each color :)

------
lflux
I'm not sure what's with the background, but it turned my laptop into a
hairdryer.

------
georgraphics
"With high-DPI displays, the aliasing problem caused by insufficient pixels
has been extinguished." that's not true. Aliasing is an omnipresent problem in
discretized data every rasterized display is based on. Even though a retina
display samples at a much higher rate, it needs anti aliasing for optimal
results.

I'd like 64bit displays too. The problem is that there's quasi no content and
also no content pipeline. Someone has to take the first step here.

------
achal
Screenshot if your browser crashes/blows up: <http://i.imgur.com/nWnqA.png>

------
Superpig
I don't see the gradient switch either.

And having 2GB of VRAM does not imply that you'll need a 268-million pixel
display to use it all. In modern games, the vast majority of that VRAM is used
by textures (and we need as much of it as we can get!)

------
verroq
Who are these people who only test their sites in Chrome?

------
btrask
The equivalent bug report on Chrome/Chromium:

<http://code.google.com/p/chromium/issues/detail?id=41756> Issue 41756: WebKit
gradients show banding on Chrome

------
mtgx
It's exactly the same thing with "retina displays" that only "need" 300 PPI (a
rule Apple themselves often break, yet they still use the name). The human eye
can discern up to 600 PPI, and even up to 1200 PPI.
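
The 300 figure falls out of simple geometry (a Python sketch; assumes the
usual ~1 arcminute figure for 20/20 acuity, with "discernible" meaning one
pixel per resolvable angle at a given viewing distance):

    import math

    # PPI at which one pixel subtends the eye's resolving angle
    def ppi_limit(distance_inches, arcmin=1.0):
        return 1.0 / (distance_inches * math.tan(math.radians(arcmin / 60.0)))

    print(ppi_limit(12))        # ~286 PPI: Apple's ~300 PPI at ~12 inches
    print(ppi_limit(12, 0.5))   # ~573 PPI for sharper-than-20/20 vision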

------
anjc
This site made my CPU go nuts, looks nice though, if a little distracting.

------
timcederman
On a Retina MacBook Pro using Firefox Nightly 20, I couldn't see the step.

~~~
astrodust
I think it depends on gamma and brightness settings.

------
ekurutepe
I can't see the gradient step on my 13" MBA as hard as I try. Those extra bits
would definitely be wasted on me.

~~~
dsego
If you're on Safari the colors look the same, try Chrome instead.

------
martinced
Oh wow, the nineties called and they want their grayscale-only AA back... To
make a point, he conveniently forgot to talk about RGB decimation / subpixel
rendering entirely. WTF!? Seriously...

If he really thinks that there's only one step between #484848 and #494949,
then I'd suggest reading a bit about MS' ClearType and using a magnifier to
take a closer look at how Windows or OS X fonts are rendered. Hint: it's not
grayscale anti-aliasing.

I honestly don't think that using dithering "because pixels are small" would
provide better results than sub-pixel anti-aliasing.
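
His point can be made concrete (a Python sketch: on an RGB-striped panel,
bumping one channel at a time yields RGB triples between two adjacent grays,
which is roughly the extra resolution subpixel rendering exploits; ClearType's
actual filtering is far more sophisticated):

    # between gray #484848 and #494949 there are intermediate triples;
    # on an RGB-striped panel each one bumps a single subpixel
    lo, hi = 0x48, 0x49
    for r, g, b in [(lo, lo, lo), (hi, lo, lo), (hi, hi, lo), (hi, hi, hi)]:
        print(f"#{r:02x}{g:02x}{b:02x}")
    # -> #484848  #494848  #494948  #494949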

