
Open source color copy paste from your phone camera to web, sketch, figma plugin - sonnylazuardi
https://colorcopypaste.app/
======
daenz
How do they account for the differences between what the sensor sees, what our
eyes see, and what our screens show? I like the idea, but I want something
that can calibrate to my eyes, so that the color that I capture matches
exactly what I saw.

~~~
Wowfunhappy
> How do they account for the differences between what the sensor sees, what
> our eyes see, and what our screens show?

I mean, obviously they don't; that would be next to impossible, particularly
with a phone camera. It's an approximation, but that doesn't make it useless.

~~~
daenz
Why is it next to impossible?

~~~
Wowfunhappy
Well just for starters, even Rec2020 can display only a fraction of the colors
visible to your eye.

~~~
pureliquidhw
Why can't they show multiple options, similar to the ClearType calibration
tool in Windows? Approximate six different colors, and the user selects the
one that matches their eyes best.

You could even offer a pro account and send out a camera-to-monitor-to-eye
calibration postcard that starts with a known input.

For sure not impossible given that was two minutes of thought; someone closer
to the product already has better ideas, I'm sure.
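The swatch-picker idea above can be sketched in a few lines. This is purely illustrative, not how the app works: the function names and the simple per-channel lightness nudge are assumptions, standing in for whatever perceptual spacing a real tool would use.

```python
# Hypothetical sketch of a ClearType-style picker: generate a handful of
# candidate swatches around the captured color and let the user choose.

def clamp(v: int) -> int:
    """Keep a channel value inside the 0-255 range."""
    return max(0, min(255, v))

def candidate_swatches(rgb, steps=(-20, -10, 0, 10, 20)):
    """Return slightly darkened/lightened variants of a captured RGB color."""
    r, g, b = rgb
    return [(clamp(r + d), clamp(g + d), clamp(b + d)) for d in steps]

# A captured "#444" gray yields five nearby candidates for the user to pick from.
print(candidate_swatches((68, 68, 68)))
```

A real implementation would probably vary hue and saturation as well as lightness, and space the candidates in a perceptual color space rather than raw RGB.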

~~~
Wowfunhappy
So, the real problem here is the "calibrate to your eyes" part. It's true that
with a good camera and a professionally calibrated display, you could "copy"
colors within the Rec2020 color space accurately enough (though not everything
your eye can see!)

But our eyes, on the other hand, are really bad at "seeing" color. The same
color will look wildly different depending on the colors around it. That isn't
something you can calibrate per person; it differs with every scene. You'd
need to read your brainwaves to figure out what your mind's eye is seeing.

~~~
pureliquidhw
But your eyes would perceive the color IRL and on the monitor with
equivalence, so presumably, regardless of how other eyes see it, it would be a
1:1 copy-paste from the camera after calibration.

I refuse to believe we have to measure brainwaves to get something close to or
indistinguishable from perfect.

~~~
Wowfunhappy
> But your eyes would perceive the color IRL and on the monitor with
> equivalence

So, consider this image:
[https://hips.hearstapps.com/cos.h-cdn.co/assets/15/09/480x37...](https://hips.hearstapps.com/cos.h-cdn.co/assets/15/09/480x372/gallery_1425045941-kq2pcqd8jgz7mr0hnz63.png)

Squares A and B are actually the exact same color. I know, it's hard to
believe—feel free to download the image and sample the colors. :)

What "color" does the app copy?
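The point can be demonstrated without the image: a pixel sampler only sees raw channel values, while the surrounding context that drives perception is invisible to it. A minimal sketch, faking a tiny "image" as a plain dict so nothing beyond the standard library is needed:

```python
# Sketch: identical raw pixel values on light vs dark surroundings.
# A sampler (or a color-copying app) returns the same value for both,
# even though a human viewer perceives two different colors.

GRAY = (120, 120, 120)

# Build a 200x100 "image": light left half, dark right half.
image = {}
for x in range(200):
    for y in range(100):
        image[(x, y)] = (230, 230, 230) if x < 100 else (40, 40, 40)

# Paste the identical gray patch onto both halves.
for patch_x in (50, 150):
    for dx in range(10):
        for dy in range(10):
            image[(patch_x + dx, 40 + dy)] = GRAY

def sample(img, x, y):
    """Return the raw RGB value at (x, y) — context-free, like any picker."""
    return img[(x, y)]

# Both samples are identical, though the patch on the dark half would
# look noticeably lighter to a viewer.
print(sample(image, 55, 45) == sample(image, 155, 45))  # True
```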

~~~
pureliquidhw
So the source color is the truth, let's pretend we know it's the paper
equivalent of #333.

The app captures a color and renders it as #444.

The user then says, no wait that's wrong, and the app then provides
alternatives that are close, let's say #111, #222, #333, #555, #666.

In a perfect world, the user will pick #333, and the app will then know the
system shifts each channel by one step, so the next time it can compensate for
that. Those errors could come from image processing, light-sensor calibration,
monitor calibration, backlight color temp, ambient color temp, etc.

If the user doesn't pick #333 but instead picks a "wrong" color, that's OK,
because the app is rendering the color the user wants and perceives as
correct, not what happens to be the truth. That's the difference between a
strict functional requirement ("the app must render the true color") and a
user requirement ("I want to take a picture of a color and have the color I
see render on my screen").
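The feedback loop described above could be sketched as follows. This is a hedged illustration, not the app's actual method: the hex helpers, the `Calibrator` class, and the flat per-channel offset are all assumptions standing in for a real calibration model.

```python
# Sketch: learn a per-channel correction from the user's swatch choice,
# then apply it to future captures.

def hex_to_rgb(h):
    """Parse '#333' or '#333333' into an (r, g, b) tuple."""
    h = h.lstrip("#")
    if len(h) == 3:
        h = "".join(c * 2 for c in h)  # expand shorthand: '333' -> '333333'
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def rgb_to_hex(rgb):
    return "#" + "".join(f"{c:02x}" for c in rgb)

class Calibrator:
    def __init__(self):
        self.offset = (0, 0, 0)  # learned per-channel correction

    def learn(self, rendered_hex, chosen_hex):
        """User said `rendered_hex` was wrong and picked `chosen_hex`."""
        r = hex_to_rgb(rendered_hex)
        c = hex_to_rgb(chosen_hex)
        self.offset = tuple(o + (ci - ri)
                            for o, ri, ci in zip(self.offset, r, c))

    def correct(self, captured_hex):
        """Apply the learned correction to a newly captured color."""
        rgb = hex_to_rgb(captured_hex)
        fixed = tuple(max(0, min(255, v + o))
                      for v, o in zip(rgb, self.offset))
        return rgb_to_hex(fixed)

cal = Calibrator()
cal.learn("#444", "#333")      # the render was one shade too light
print(cal.correct("#444444"))  # -> "#333333"
```

A flat RGB offset is of course the crudest possible model; a real pipeline would calibrate per channel and probably per luminance range, since the error sources listed above (sensor, processing, backlight, ambient light) are not uniform across the gamut.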

------
jcun4128
That's pretty cool, I remember taking a picture of the sky/using an online
inspector/color picker to get the hex value from it...

