
HCL: a color model that actually matches our perceptions (2011) - laughinghan
http://vis4.net/blog/posts/avoid-equidistant-hsv-colors/
======
davidjohnstone
This is a fun topic. Here's a Lab and Lch colour picker I once made:
[http://davidjohnstone.net/pages/lch-lab-colour-gradient-
pick...](http://davidjohnstone.net/pages/lch-lab-colour-gradient-picker)

A keyword not mentioned in this article is "perceptual uniformity". It's the
idea that the difference between two colours as measured by the Euclidean
distance (over the L, a and b components) will be equal to the perceived
difference between colours. That said, it turns out that this isn't quite as
true as it was hoped for this colour space, and now there are much more
complicated formulae for this:
[https://en.wikipedia.org/wiki/Color_difference](https://en.wikipedia.org/wiki/Color_difference)
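For the curious, the original CIE76 difference really is just the Euclidean distance in Lab; a minimal sketch (the later CIE94 and CIEDE2000 formulae on that page add correction terms on top of this):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: plain Euclidean distance in Lab space."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Two colours differing only in b*; a delta-E around 2.3 is commonly
# cited as roughly one "just noticeable difference".
print(delta_e_cie76((50.0, 10.0, 10.0), (50.0, 10.0, 12.3)))  # ~2.3
```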

~~~
niccaluim
Wow, the color difference stuff is fascinating. How on earth do they measure
the space of perceptual difference?

~~~
Terr_
Not sure, but maybe you could test by showing people two similar colors and
asking them to decide if they are identical.

With enough samples (and enough individual participants) you establish how
many "statistically-significant distinguishable differences" exist along a
given path through the color-space.

Then combine those and get a sort of topographic grid...

~~~
niccaluim
Yeah, something like that was my best guess—either that or letting subjects
adjust a second color until it matched the first. Turns out that's exactly
what they did: "MacAdam set up an experiment in which a trained observer
viewed two different colors, at a fixed luminance of about 48 cd/m2. One of
the colors (the "test" color) was fixed, but the other was adjustable by the
observer, and the observer was asked to adjust that color until it matched the
test color." - from
[https://en.wikipedia.org/wiki/MacAdam_ellipse](https://en.wikipedia.org/wiki/MacAdam_ellipse)

------
liminal
I wrote a viewer to better understand the HCL color space. By seeing it in 3D
and then looking at different slices you can start to build an intuition about
how the colors relate to each other and which colors are available at
different parameters.

[http://briancort.com/hclviewer/](http://briancort.com/hclviewer/)

~~~
anewhnaccount
That's pretty neat. The 3D shape looks a little odd for some reason to me
though. I think the perspective or projection isn't quite as I'd expect.

EDIT: Maybe if I could tilt the camera to look from above and below that would
be better? Also it would be cool if it worked as a colour picker too. (Okay,
that's enough feature requests for now...)

~~~
liminal
I'm happy to share the code, but realistically I'm unlikely to add features.
There are already so many colour pickers that I didn't see the point in adding
that functionality.

I created the tool while putting together a couple blog posts on how to use
colours effectively in data visualizations. There are a number of links in
those posts to other pickers out there:
[https://briancort.com/?p=146](https://briancort.com/?p=146)
[https://briancort.com/?p=174](https://briancort.com/?p=174)

------
samch
I think that Mike Bostock did a fantastic job with his cubehelix rainbow. It
really does look less visually jarring than HCL.

[https://bl.ocks.org/mbostock/310c99e53880faec2434](https://bl.ocks.org/mbostock/310c99e53880faec2434)
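For anyone curious what's inside it: Green's original cubehelix scheme is only a few lines. A rough sketch using the constants from the 2011 paper (d3 wraps this in a proper colour-space interpolator, so this is not Bostock's exact code):

```python
import math

def cubehelix(t, start=0.5, rotations=-1.5, hue=1.0, gamma=1.0):
    """Map t in [0, 1] to an (r, g, b) triple along a cubehelix ramp.

    Lightness rises monotonically with t while the hue spirals around
    the grey diagonal of the RGB cube, which is why the ramp still reads
    correctly when printed in greyscale.
    """
    l = t ** gamma                       # monotone lightness ramp
    phi = 2 * math.pi * (start / 3 + rotations * t)
    a = hue * l * (1 - l) / 2            # deviation from the grey diagonal
    r = l + a * (-0.14861 * math.cos(phi) + 1.78277 * math.sin(phi))
    g = l + a * (-0.29227 * math.cos(phi) - 0.90649 * math.sin(phi))
    b = l + a * (1.97294 * math.cos(phi))
    return (r, g, b)

print(cubehelix(0.0))   # (0.0, 0.0, 0.0): the ramp starts at black
print(cubehelix(1.0))   # (1.0, 1.0, 1.0): and ends at white
```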

~~~
davidjohnstone
It depends what properties you want. That is a much nicer looking rainbow, but
it drops the perceptual uniformity that often makes HCL useful.

~~~
tacos
This rainbow is interesting precisely because it shows that "perceptual
uniformity" is not as useful as you might think. This is why people are still
futzing with UX around volume controls and light dimmers. It's not just the
math. Given "perception" there's no "uniformity" -- by definition.

~~~
tamana
Huh? The pretty rainbow is a _perceptual mathematical model_. The reason it
looks nicer than the HCL rainbow is that, unlike an HC rainbow at constant L,
it includes luminance in its model of perception.

~~~
tacos
To clarify: commenter above states that the pretty rainbow "drops the
perceptual uniformity that often makes HCL useful." My point is that you
shouldn't claim one is more useful than the other without specifying the
problem domain you're optimizing for. Also the language itself isn't helpful:
"perception" is fuzzy by definition and we have two distinct positions on what
"uniformity" means. You've now added the magic word "model" to the mix.
[https://en.wikipedia.org/wiki/All_models_are_wrong](https://en.wikipedia.org/wiki/All_models_are_wrong)

If any of this were obvious, Adobe wouldn't have stupid color pickers and I
wouldn't have three wildly diverging ways for adjusting audio volume and light
levels in my home.

------
Kristine1975
There's also HUSL: [http://www.husl-colors.org/](http://www.husl-colors.org/)

~~~
mcnamaratw
Wow, for dark colors that one seems even more useful.

------
tgb
Well presented. Interestingly, I thought the RGB interpolation example at the
end looked just as reasonable as the HCL interpolation. Though RGB was a
little bland. But RGB should have been the 'dumbest' colorspace to interpolate
in. Even trying random other colors the RGB always looked fine while the HCL,
well, usually couldn't even come up with all the intermediate colors.

HCL actually seems to do very poorly interpolating towards black. Try #000000
and #AA0000. Apart from the missing color, the apparent hue changes
significantly.

~~~
davidjohnstone
The missing colour in the #a00 to #000 interpolation is because HCL is trying
to create a colour outside of sRGB (the green component is less than zero). As
for the hue change, one difficulty of making a gradient to black in HCL is
that black doesn't have a hue when specified as rgb(0, 0, 0), which seems to
mess up some tools.

Edit: It's interesting to see how the different colour spaces react to the
#000 being changed to #010000 or #000100 or #000001.
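To make the out-of-gamut point concrete, here's a hand-rolled sketch of the sRGB-to-LCh round trip (D65 white point, 4-decimal matrices, so treat the constants as approximate). Anything outside [0, 1] after the trip back is unrepresentable in sRGB, and black's chroma collapses to zero, which is why its hue carries no information:

```python
import math

D65 = (0.95047, 1.00000, 1.08883)  # reference white

def srgb_to_lch(hexcode):
    """'#rrggbb' -> (L, C, h). h is meaningless when C ~ 0 (e.g. black)."""
    rgb = [int(hexcode[i:i + 2], 16) / 255 for i in (1, 3, 5)]
    lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
           for c in rgb]                       # undo the sRGB gamma
    r, g, b = lin
    xyz = (0.4124 * r + 0.3576 * g + 0.1805 * b,
           0.2126 * r + 0.7152 * g + 0.0722 * b,
           0.0193 * r + 0.1192 * g + 0.9505 * b)
    f = [t ** (1 / 3) if t > (6 / 29) ** 3
         else t / (3 * (6 / 29) ** 2) + 4 / 29
         for t in (v / n for v, n in zip(xyz, D65))]
    L = 116 * f[1] - 16
    a, bb = 500 * (f[0] - f[1]), 200 * (f[1] - f[2])
    return (L, math.hypot(a, bb), math.atan2(bb, a))

def lch_to_linear_rgb(L, C, h):
    """(L, C, h) -> linear sRGB; values outside [0, 1] are out of gamut."""
    a, bb = C * math.cos(h), C * math.sin(h)
    fy = (L + 16) / 116
    fx, fz = fy + a / 500, fy - bb / 200
    def finv(ft):
        return ft ** 3 if ft > 6 / 29 else 3 * (6 / 29) ** 2 * (ft - 4 / 29)
    x, y, z = (finv(ft) * n for ft, n in zip((fx, fy, fz), D65))
    return (3.2406 * x - 1.5372 * y - 0.4986 * z,
            -0.9689 * x + 1.8758 * y + 0.0415 * z,
            0.0557 * x - 0.2040 * y + 1.0570 * z)

print(srgb_to_lch("#000000")[1])  # 0.0: black has no usable hue
```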

------
SuperPaintMan
The first example comparing the differing fully saturated hues for the same
value is what led me to create my own color-picker. It takes in RGB hex and
spits out the equivalent Munsell value and vice versa. I can edit in a human-
understandable format and switch it back to standard Hex.

Let's take a dark purple that's too light: #573F63. The tool converts it in
place to 5P 3/4; then we just modify the value (second term) to 5P 1/4 and
convert back to hex (#291333). All we have changed is the brightness: hue and
chroma have stayed constant.

Hex values are horrible to work with; they contain no understandable info. 10Y
8/6 is going to be complementary to 10PB 8/6, without going outside
equidistant chroma levels and having matching value.
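That complementary-hue claim is just arithmetic on the Munsell hue circle: ten families of ten steps each give a 100-step circle, and the complement sits 50 steps away. A toy sketch of the notation handling (hypothetical helper names, no RGB conversion):

```python
# Ten Munsell hue families, each spanning 10 steps of a 100-step circle.
FAMILIES = ["R", "YR", "Y", "GY", "G", "BG", "B", "PB", "P", "RP"]

def split_hue(hue):
    """'10PB' -> (10.0, 'PB'). Assumes well-formed Munsell hue notation."""
    digits = hue.rstrip("ABGPRY")   # strip the trailing family letters
    return float(digits), hue[len(digits):]

def complement(hue):
    """The complementary hue sits 50 steps around the 100-step circle."""
    num, fam = split_hue(hue)
    steps = (FAMILIES.index(fam) * 10 + num + 50) % 100
    fam_i, num2 = divmod(steps, 10)
    if num2 == 0:                   # a multiple of 10 is written as the
        fam_i, num2 = fam_i - 1, 10 # '10' of the preceding family
    return f"{num2:g}{FAMILIES[int(fam_i) % 10]}"

print(complement("10Y"))   # 10PB, matching the pairing above
```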

Check out the demo in the readme.

Edit:
[https://github.com/germ/munsellScript/](https://github.com/germ/munsellScript/)

------
mzs
When I want to compare against a solid background like white or black I just
use Y. When I want to compare colors I have had decent enough results with
this simple formula (conveniently in RGB for most programming tasks):

[http://www.compuphase.com/cmetric.htm](http://www.compuphase.com/cmetric.htm)
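For reference, that "redmean" metric is a one-liner to transcribe; this follows the formula on the compuphase page (floating-point rather than the page's bit-shift version):

```python
import math

def redmean_distance(rgb1, rgb2):
    """Weighted RGB distance from compuphase.com/cmetric.htm.

    The red and blue weights shift with the mean red level, roughly
    tracking how our sensitivity varies across the gamut. Inputs are
    0-255 integer (r, g, b) tuples.
    """
    r1, g1, b1 = rgb1
    r2, g2, b2 = rgb2
    rmean = (r1 + r2) / 2
    dr, dg, db = r1 - r2, g1 - g2, b1 - b2
    return math.sqrt((2 + rmean / 256) * dr * dr
                     + 4 * dg * dg
                     + (2 + (255 - rmean) / 256) * db * db)
```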

I think it's pretty telling that CIE and others have made a number of
revisions to Lab color distance formulas over the years to the point of them
having constants you tweak. There's no perfect formula out there yet, so I use
a good-enough one that is simple to compute.

------
danbolt
The first example of how HSV's hues have different perceived brightness is an
interesting topic.

Some researchers have been looking into better grayscale filters that could
pick out colour distinctions better, too.

PDF, in case you're interested:
[http://www.cs.northwestern.edu/~ago820/color2gray/color2gray...](http://www.cs.northwestern.edu/~ago820/color2gray/color2gray.pdf)

------
SeanLuke
I wonder why he didn't mention YUV.

------
raould42
We do not all see color the same way in the first place!
[http://www.ncbi.nlm.nih.gov/pubmed/16647849](http://www.ncbi.nlm.nih.gov/pubmed/16647849)

------
mark-r
The actual code used on the page needs better handling of out-of-gamut colors.
When I plugged FF0000 into the first value for the gradient comparison, two of
the samples turned pure black.

Other than that, it looks very promising.

------
mjpuser
I can't figure out what this has to do with perception. In our retina, we have
rods for brightness and cones as color sensors. So perception would be farther
up, possibly in ganglion cells, thalamus, or visual cortex.

Side note: this is still an interesting topic though, and there is nothing
wrong with exploring non-human modes of perception.

~~~
tamana
It's black-box modeling of perception by tweaking inputs (light) and measuring
outputs (human statements of how intense/smooth/whatever it looks)

------
flixic
It's quite a bit less practical than other color spaces, because it's far less
reliable. For instance, try their gradient tool with #9C0A0A and #AA2AB5.
Every other space produces good looking results, HCL fails completely.

------
melling
"HCL is a cylindrical transformation of CIE L*a*b*"

What is CIE L*a*b*?

~~~
Aardwolf
[https://en.wikipedia.org/wiki/Lab_color_space](https://en.wikipedia.org/wiki/Lab_color_space)

