
What is Color Banding? And what is it not? - deverton
http://simonschreibt.de/gat/colorbanding/
======
bluedino
Very well written and illustrated article.

This is one of those things that's so difficult to explain to a mediocre to
average graphic designer, much less a layperson. Now the trick would be to get
them to read the whole thing...

------
kevin_thibedeau
Dumping color table entries from the last frame of the "256" color GIF:

    gifbuild -d colorBanding_255colorsOn1920pixels.gif | tail -n 200 \
        | grep rgb | sort | uniq | wc -l
    100

What happened here is some sequence of color space conversions with gamma
correction added/removed that ended up compressing the darks.

A true 256-color gray gradient will not display so much visible banding. Just
throwing together a naive 256-color gradient in GIMP reveals some RGB triplets
with mismatched values, suggesting the encoder is doing some odd color space
conversion. Even then, it has much less perceptible banding when scaled up to
1920 with nearest neighbor.

I suspect most of the examples stem more from improper gamma handling than
from any true demonstration of the limitations of 8-bit channels.
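That collapse is easy to reproduce. Here's a sketch of one plausible careless
pipeline (my guess, not necessarily what the article's tooling did): decode
each 8-bit sRGB code to linear light and re-quantize back to 8 bits, and a
large fraction of the dark codes merge:

```python
# Hypothetical careless conversion: decode 8-bit sRGB to linear light,
# then re-quantize to 8 bits without dithering.
def srgb_to_linear8(v):
    c = v / 255.0
    # standard sRGB decoding curve
    lin = c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    return round(lin * 255)

survivors = {srgb_to_linear8(v) for v in range(256)}
print(len(survivors))  # well under 256: distinct dark codes collapse
```

The shallow slope of the decoding curve near black is exactly the
"compressing the darks" effect: many input codes round to the same linear
value.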

------
hfsktr
Every image on the page is broken for me (Chrome, if it matters). I was hoping
they would help me understand what I was reading. The console is full of
errors where each image tried to load.

~~~
simonschreibt
It seems that my site has problems when it's visited from behind a firewall
which doesn't like my SSL proxy, for example. I've heard that sometimes from
people visiting the blog from work, but at home everything was fine...

Are you behind a firewall?

~~~
hfsktr
When I tried it in IE I got loads of popups about SSL/certificate stuff that I
just don't get. So it probably is work messing it up. I will hopefully get a
chance when I get home to revisit the page.

To the others, I did disable ublock and ghostery but that didn't do anything
for me.

------
oflordal
The greyscale gradient animations that are meant to show banding are full of
gif artifacts so they are not actually smooth gradients.

~~~
nitrogen
Is this an example of Poe's Law? GIF's palette size is 256, and there are 256
shades of gray, so no GIF artifacts should be present. GIF's compression is
lossless anyway, so it doesn't add artifacts of its own; those would come from
the dithering/palettization step before compression.

------
mark-r
I was really surprised the first time I saw banding in a 24-bit gradient
image, I spent many years thinking it was impossible. But I'm still not sure
if that's an inherent limitation of 24 bits, or whether my display is not
capable of displaying all 24 bits with full fidelity. The LCD manufacturers
are using tricks to squeeze the last couple of bits out of panels that aren't
up to the task.

------
Aardwolf
Dithering was used so often in 90's 256-color images, but why is it so rarely
used on the ugly banding of 24-bit RGB images?

~~~
bluedino
The alternatives to dithering a 256-color image both end up with undesired
results. You can either create a palette specific to your image or application
(which makes the rest of a multitasking system look like a tie-dye shirt), or
have really, really ugly banding, since you have a mere 256 colors to work
with (minus reserved colors) instead of 2^24.

24-bit images are 'good enough' and dithering would just add another step to
the asset workflow.

I can remember the days with a 1MB video card and Windows 95 - switching
between 16-bit color at 800x600, or 8-bit color at 1024x768. Resolution vs
dithering...
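For a feel of what dithering buys, here's a sketch (my own toy numbers, not
from the article) quantizing a smooth ramp to 8 levels, with and without a
small ordered-dither offset added before rounding:

```python
import itertools

def quantize(v, levels):
    # map a 0..1 value to the nearest of `levels` steps
    return round(v * (levels - 1))

width, levels = 256, 8
thresholds = [0.0, 0.5, 0.25, 0.75]  # tiny Bayer-style ordered pattern

plain = [quantize(x / (width - 1), levels) for x in range(width)]
dithered = [quantize(x / (width - 1)
                     + (thresholds[x % 4] - 0.375) / (levels - 1), levels)
            for x in range(width)]

def longest_run(seq):
    # width of the widest solid band, in pixels
    return max(len(list(g)) for _, g in itertools.groupby(seq))

print(longest_run(plain))     # solid bands dozens of pixels wide
print(longest_run(dithered))  # band edges broken into short mixed runs
```

In the 90's the pattern hid the coarse 256-color steps; on 24-bit output the
same trick hides the much narrower 8-bit-per-channel steps, at the cost of
that extra step in the asset workflow.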

~~~
kuschku
And there is one more alternative: 30-bit (+ alpha) color.

But that’s currently only used by designers.

~~~
exDM69
30-bit color means 10 bits per channel, which equals 1024 discrete values. If
you do a full-screen gradient from, say, red to black on a high-resolution
monitor, there will still be color banding.

It's better than 24-bit color but doesn't solve the problem.
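The arithmetic is simple. Assuming a 4K-wide (3840 px) display as an
illustrative example:

```python
# Band width of a full-screen single-channel gradient (illustrative numbers)
width_px = 3840        # horizontal resolution of a 4K display
levels_8bit = 256      # per-channel values at 24-bit color
levels_10bit = 1024    # per-channel values at 30-bit color

print(width_px / levels_8bit)   # 15.0 px per band: clearly visible
print(width_px / levels_10bit)  # 3.75 px per band: better, still bands
```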

~~~
kuschku
Current movies use 36-bit color – that’s usually enough for 4K.

------
vardump
That video itself has a lot of color banding, a posterized or quantized look.
Chrome on a 2015 MacBook Pro Retina 13"; the same happened in Safari.

I thought this was intentional and expected the author to mention it at the
end of the video, but... nothing.

~~~
kevingadd
The video looks bad because YouTube's encoder butchers video content. Add to
that the reality that video decoders typically apply filters post-decode to
compensate for the artifacts introduced by encoders. Those filters can add
banding, and the encoder has already quantized the video.

It would look better if the video had been uploaded at 1080p60 or something
ridiculous, because that increases the bitrate budget, and at the higher
resolution each macroblock covers a smaller, less important slice of the
image. But that's kind of an awful hack.

You can observe this clearly with YouTube clips of classic games, like those
at TASVideos (
[https://www.youtube.com/channel/UCFKeJVmqdApqOS8onQWznfA](https://www.youtube.com/channel/UCFKeJVmqdApqOS8onQWznfA)
): the source games are 480i or 240p at best, but the quality of the YouTube
480p bitrate is visibly compromised. If you watch in 1080p, it looks much
closer to what you'd see in an emulator (or off the hardware when connected
via RGB/component).

------
qnaal
Simply hack pixman, and then most things that request gradients on Linux
(including Cairo, and therefore Inkscape) will generate flawlessly smooth
color curves.

------
silentplummet
Bit depth is not a measure of quantity, but of dynamic range. In other words,
what matters is not only how many quantization steps you have, but how large
of a space you are trying to map them over.

The examples in the article point to this obvious conclusion, but don't quite
state it explicitly. The relationship between space and quantization effects
is demonstrated for a physical dimension interpretation of space by the
horizontal greyscale bar that stretches and contracts.

But what if you stretch and contract the dynamic range of your monitor itself?
Each bit in the encoding space (naively) offers a doubling of dynamic range in
the natural space, so even your 30-bit encoding can be stretched thin if you
display it on a monitor that intends to output a contrast ratio many times
greater than what we are used to.

For instance, imagine a monitor that could output light perceptually as bright
as the afternoon sun, next to effectively infinite blackness. Will 30 bits be
enough when 'stretched' across these new endpoints of dynamic range, or will
banding (quantization) still be visually evident when examining a narrow slice
of the space?
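One way to put numbers on that, as a sketch with assumed figures (a 10,000:1
contrast display and a plain gamma-2.2 transfer, not any real HDR standard):

```python
# How many 10-bit codes land in the darkest stop (one doubling of
# luminance) of a hypothetical 10,000:1 display?  Assumes a simple
# gamma-2.2 transfer normalized so peak white = 1.0.
def luminance(code, levels, gamma=2.2):
    return (code / (levels - 1)) ** gamma

# black floor = 1e-4 of peak, so the darkest stop spans 1e-4 .. 2e-4
codes_in_darkest_stop = sum(1e-4 <= luminance(c, 1024) < 2e-4
                            for c in range(1024))
print(codes_in_darkest_stop)  # a handful of codes for an entire stop
```

That scarcity of codes in the deep shadows is part of why HDR formats move to
steeper, perception-matched transfer curves rather than just adding bits.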

~~~
corysama
HDR monitors do exist. BrightSide Technologies was showing them at SIGGRAPH
over ten years ago. It looks like consumer OLEDs are starting to get on board
[1]. And they really did want 10 bits per channel, but even that seems
lightweight.

10 bits per channel will carry us for a while. Apparently Dolby bought
BrightSide, and now they are pushing for 12 bits. 16-bit ints will probably be
enough for home use in practice. Internally, most games that do HDR rendering
use 16-bit floats for their intermediate frame buffers; that format is popular
in film production as well. I would be surprised if consumer display tech ever
bothered to go float16-over-DVI. But maybe it will get cheap enough eventually
that we might as well have the best :)

[1] [http://www.avsforum.com/forum/40-oled-technology-flat-
panels...](http://www.avsforum.com/forum/40-oled-technology-flat-panels-
general/1972841-high-dynamic-range-displays-nab-2015-a.html)
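Part of float16's appeal for HDR buffers is that its precision is distributed
non-uniformly: fine steps in the darks and coarse ones in the highlights,
roughly matching perception. A small sketch (using Python's half-precision
round-trip; the example values are my own):

```python
import struct

def f16(x):
    # round-trip a Python float through IEEE 754 half precision
    return struct.unpack('e', struct.pack('e', x))[0]

eps = 2 ** -11
print(f16(1.0 + eps) == 1.0)         # True: step vanishes near white
print(f16(0.01 + eps) == f16(0.01))  # False: darks keep much finer steps
```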

