
Smartphone cameras struggle to capture San Francisco's orange sky - rurp
https://www.axios.com/san-francisco-orange-sky-smartphone-4008ba2d-c586-4fe2-9ce6-28cdc1aa57d6.html
======
matthewowen
This article is awful. It's about smartphone cameras not being able to
capture the sky, but the three photos included are all ones that at least
somewhat worked. Seems like it would be better to include one that failed?

~~~
blahblah4332
As someone with a professional photography background, this article was
actually super upsetting.

The author is like "look, this gas station one is the only one that worked"..
well yeah: the camera's white balance algorithm saw that the scene was about
80% daylight, so it set the white point at around 3500K, and the rest of the
scene was recorded accurately as a result.

The thesis of the article is "I don't understand white balance, so smartphone
cameras are bad".
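
The failure mode being described can be sketched with the classic gray-world
heuristic (one of the simplest auto-white-balance estimates). This is a toy
illustration, not what any real phone pipeline runs; the pixel values and the
100-pixel "scene" are invented for the example:

```python
# Gray-world white balance: assume the average scene color is neutral gray,
# then scale each channel so the channel averages match. Green is used as
# the reference channel, a common convention.
def gray_world_gains(pixels):
    """pixels: list of (r, g, b) tuples; returns per-channel gains
    normalized so green has gain 1.0."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    return tuple(avg[1] / avg[c] for c in range(3))

def apply_gains(pixel, gains):
    """Apply per-channel gains, clamped to the 8-bit range."""
    return tuple(min(255, round(v * g)) for v, g in zip(pixel, gains))

# An all-orange sky drags the scene average toward orange, so the
# algorithm "corrects" the whole frame toward gray.
orange_sky = [(230, 120, 40)] * 100
gains = gray_world_gains(orange_sky)
print(gains)                              # blue gain >> 1
print(apply_gains(orange_sky[0], gains))  # roughly neutral gray
```

In a mostly-daylight scene (like the gas station shot) the average really is
near neutral, the gains stay near 1, and the orange survives; when the orange
fills the frame, the assumption breaks and the color gets balanced away.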

~~~
SpicyLemonZest
I think you're multiple steps ahead of the curve here. Most people think that
an ideal camera captures things the same way you see them with your eyes, and
any differences reflect a flaw in the camera. (I personally thought that was
true until yesterday!)

~~~
Leherenn
Could you please explain for those not in the know?

I realise that what I see is just one "interpretation" of reality, but it
seems sensible that one possible ideal for cameras would be to perfectly match
my perception of the world. Is it just impossible because the difference in
perception is just so big between individuals that what would be perfect for
me would be way off for someone else?

You can also create (at least theoretically, no idea how they compare
currently) cameras that are better than human eyes at things like resolution,
contrasts and so on, and that seems like another possible ideal; but which one
is more relevant seems very context dependent.

~~~
SpicyLemonZest
The challenge here is that your perception of the world itself performs
contextual color correction. If you've ever seen those optical illusions where
you see two different colors but they're actually the same
([https://www.syfy.com/syfywire/another-brain-frying-optical-illusion-what-color-are-these-spheres](https://www.syfy.com/syfywire/another-brain-frying-optical-illusion-what-color-are-these-spheres)
is my favorite recent example), that's
a demonstration of the effect. So a camera that's trying to match your
perception of the world can't just identify the "true" color; it has to
identify what color it should print on your screen to generate the same
perception as if the contents of the picture were outside your window.

I wouldn't necessarily say it's a bad ideal, though. Smartphone cameras do a
very good job at automatically performing all the right corrections in most
circumstances. It's just nontrivial, and thus unsurprising that the normal
algorithms fail in weird edge cases like "the entire sky is deep red".
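
The "contextual color correction" your visual system performs is roughly what
camera pipelines approximate with a von Kries-style chromatic adaptation:
scale each channel so the scene's white point maps to neutral. Here's a toy
RGB sketch (real pipelines adapt in an LMS cone space, and these white points
are made-up example values):

```python
# Von Kries-style chromatic adaptation: a per-channel diagonal scaling
# that maps colors seen under one illuminant to how they would appear
# under another.
def adapt(rgb, source_white, target_white):
    """Scale each channel by target_white / source_white."""
    return tuple(v * tw / sw
                 for v, sw, tw in zip(rgb, source_white, target_white))

# A 50%-gray card seen under a reddish illuminant reflects reddish light...
reddish_white = (1.0, 0.7, 0.4)
card_under_red = (0.5 * 1.0, 0.5 * 0.7, 0.5 * 0.4)

# ...but adapting to a neutral white recovers the gray card, mimicking
# what your visual system does automatically.
daylight_white = (1.0, 1.0, 1.0)
print(adapt(card_under_red, reddish_white, daylight_white))  # ~(0.5, 0.5, 0.5)
```

The hard part isn't the scaling, it's estimating `source_white` from the image
alone, which is exactly where an all-orange sky misleads the heuristics.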

------
geocrasher
Scott Manley did quite a good video, "How The Red Skies From Fires Are
Related To Blue Sunsets on Mars", in which he has this exact issue with his
phone making the sky white instead of red, which he also discusses. If you
aren't familiar with his channel, check it out:

[https://www.youtube.com/watch?v=wOIBQIpdv10](https://www.youtube.com/watch?v=wOIBQIpdv10)

~~~
mc32
Is this different from when you have to adjust your camera's exposure a few
stops so that snow is not rendered completely (blown out) white?

~~~
majormajor
Yes, this is the camera trying to figure out what "white" should be (e.g. "is
this sunlight, or are there clouds, or is it an old street light, or is it an
indoor fluorescent...") and failing because its heuristics don't account for
this sort of situation.

It has the exposure right, it's just trying to compensate for what it thinks
is "bad lighting."

------
jxcl
One thing that I had to get through my head when I was learning photography
was that there's no such thing as "no filter". Every image you take is an
interpretation of the scene, either through digital signal processing or film
chemistry or retouching.

When I spent time editing photos, my dad would claim I was "lying" by editing
them; he thought any modification to digital photos was wrong because the jpeg
the camera produces is "reality."

But of course, as this article shows, the camera is adding its own
interpretation to the scene. Since our eyes can adjust to many lighting
situations, the camera has to try to compensate and reproduce what it thinks
our eyes would see, but it has no way to know if it's right! So, sometimes
it's necessary to go back to the raw data and recreate the image as you
remember it, along with whatever emotions you want it to elicit, rather than
taking what comes out of the camera as "gospel".

~~~
mrob
I draw the line at object recognition. If the processing involves semantic
meaning (e.g. this is sky, this is a face) instead of geometric meaning (e.g.
this is an edge, this is a Bayer filter artifact) then it's lying. It doesn't
matter if the object recognition is done by AI (e.g. faking depth of field by
blurring background) or by humans applying selective processing (e.g. painting
out red-eye effect, or using a graduated neutral-density filter to enhance
sky). Honest processing has no awareness of what objects are in the scene.

~~~
jxcl
I don't really agree with this. I'm not sure what my definition of lying is
but of course it entirely depends on the purpose of the photo. A
photojournalist cropping out an important piece of context is definitely
lying. Me cropping another tourist out of a photo I took of a friend is not
lying.

If I'm editing photos and someone has a temporary blemish or a pimple, I'm
more than likely to edit it out. We don't remember faces to the same level of
detail as a camera can, so it feels more correct to touch up such blemishes.

~~~
ghaff
>A photojournalist cropping out an important piece of context is definitely
lying. Me cropping another tourist out of a photo I took of a friend is not
lying.

Rules are even stricter than that for photojournalists. Photographers have
lost jobs just for editing out extraneous objects (like a telephone pole
"growing out" of someone's head) to make a photo look better. The thinking is
likely along the lines that it's a slippery slope once you start editing
things out.

~~~
shock-value
Even within journalism there is a world of difference between cropping and
outright manipulation.

------
orev
The way they describe the “correct” image seems to imply that it has something
to do with the car also being orange, but I’m pretty sure this has more to do
with the surrounding parts of the gas station being white. That allows the
phone to pick the correct white balance, which then sets all the colors
correctly.

------
gambler
This is the dark side of "smart" devices that replace high-quality physical
sensors with clever algorithms. When something weird happens, they might
misbehave in the most bizarre fashion. The general pattern will hold for AI.
If you give too much control over things to software that you don't really
understand or control, it will behave unpredictably when you need it the most
and there might not be a way to correct or override it.

~~~
jeffbee
This doesn't have anything to do with the sensor. How sensor outputs are
mapped to display inputs lies 100% within the domain of software.

~~~
sudosysgen
The fact that the sensors and lenses in phones are so small is why the
software has so much less to work with, and why so much more can go wrong.

~~~
jeffbee
No. A bigger lens (more light) would correct for gain errors. A bigger sensor
would improve either spatial resolution or gain error or both. None of that
has _anything_ to do with color balance.

All digital color photography is computational because sensor physics, display
physics, and human biology are distinct processes. Whenever you have multiple
color channels you need to make assumptions about the relationships between
them. This is color balance and it happens in software.

~~~
sudosysgen
That's not how it actually works out in practice. Small lenses mean small
apertures, which means you get diffraction, and you get it pretty fast. With a
1-2mm aperture lens, diffraction limits resolution to around 8-10Mpx.

As for color balance, sure, but the parent wasn't limited to color balance.

------
thelean12
I found that bumping up the saturation and "warmth" got a more accurate color
to what I was seeing. Other colors sucked but it was good enough to
communicate to others what I was seeing.

~~~
nomel
I did the same. I held the phone up to the sky while moving the sliders around
so I could match the color. It worked pretty well, but also made me realize my
eyes are also trying to correct. Standing outside and looking up looks very
different than standing inside and looking out, with some artificial lighting
and white walls acting as a reference.

------
xg15
Looking forward to reading the first conspiracy theories/Facebook memes about
how the wildfire is a hoax, the photos of the orange sky were doctored and the
gray photos are the real ones...

------
jxramos
I went out to shoot with my Alpha99 yesterday. I used the daylight AWB
settings. It did a decent job, but didn't capture the degree of saturation of
the actual color.

I had to really boost the ISO, since the camera thought it was so dark
outside. In fact, the light sensors for the many automatic light controls
around our property, including some street lights, were still detecting that
it was evening.

------
MattGaiser
Does anyone have a failed photo? The article is awful.

~~~
hundchenkatze
[https://twitter.com/ceejay35_/status/1303702328323653632](https://twitter.com/ceejay35_/status/1303702328323653632)

~~~
MattGaiser
Thank you.

------
muststopmyths
My iPhone SE2020 handled the orange just fine, but damn did I miss my Lumia
with the wide range of manual controls on everything.

------
clairity
it's understandable given our relative disempowerment on the matter, but it
feels a little bread-and-circusy to focus on picture-taking rather than a
real-life approximation of an airborne toxic event[0].

in LA, we had a bit of a clearing yesterday, but the bobcat fires (and others)
have turned us hazy and slightly orange again. i can feel it in my airways.
it's more worrisome than the corona everyone else seems to be fretting about,
because it certainly will cause unseen damage to everyone's lungs in varying
amounts. hundreds of thousands die _every year_ from air pollution.

[0] delillo, _white noise_

~~~
thethethethe
I don’t think this article is bread and circusy. I think it highlights how
absurd this situation is. Our denial and assumptions about the world are so
deep, they are encoded in our algorithms to the point where we can’t even
record this event accurately.

~~~
clairity
sure, but that's because you've interpreted what is otherwise a pretty banal
piece. you've said something more interesting in a sentence than that whole
"article".

(i didn't even see the photos because google amp infests the site and i block
it out of principle. not a fan of leading every paragraph with bold in an
effort to be hard-punchingly pithy either.)

------
lxe
Are there no apps for iPhone with the ability to manually adjust white
balance?

~~~
xsmasher
There are many; VSCO and Camera+ for example.

Even in the default camera app you can point at something white, like white
paper lit with white light, and long-press to get an auto focus and auto
exposure lock. Then you can get a picture of the orange sky / landscape and it
will use the locked white balance setting.

------
bla3
If you do statistical modeling, you have to watch your priors. Orange skies
are very rare, so the priors in the camera image processing software are off.
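
A toy way to see the priors point: a MAP-style illuminant estimator with a
strong prior against rare illuminants will keep choosing "daylight" even when
the pixel evidence favors an orange sky. All the numbers here are invented for
illustration:

```python
# Prior beliefs about how likely each illuminant is, before looking at
# the image. "Orange skies are very rare" is baked in.
priors = {"daylight": 0.90, "tungsten": 0.099, "wildfire_orange": 0.001}

def map_illuminant(likelihoods):
    """Pick the illuminant maximizing prior * likelihood (MAP estimate)."""
    return max(likelihoods, key=lambda k: priors[k] * likelihoods[k])

# The orange scene fits "wildfire_orange" 50x better than "daylight",
# but the prior is ~900x against it, so daylight still wins.
likelihoods = {"daylight": 0.01, "tungsten": 0.005, "wildfire_orange": 0.5}
print(map_illuminant(likelihoods))  # daylight
```

Whether real camera firmware is literally Bayesian or not, the effect is the
same: the software discounts evidence for scenes it "believes" can't happen.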

------
numpad0
The Lr Mobile app has a camera feature that lets you calibrate on random gray
pieces and produce very film-like (not as in Lomography-like) photos.

------
mholt
My Pixel can't even capture sunsets because there's no way to turn off the HDR
processing anymore :(

~~~
geocrasher
Try the app called "OpenCamera". It does a lot of things that built-in camera
apps won't.

~~~
doc_gunthrop
It's a great app and open source:
[https://sourceforge.net/p/opencamera/code/ci/master/tree/](https://sourceforge.net/p/opencamera/code/ci/master/tree/)

------
30minAdayHN
> This was a moment for my Canon to prove that, despite its bulk, it can't
> always be replaced by a smartphone.

Such a naive statement. Once those phones' white balance is corrected, they
should be able to capture the orange tinge. Even a DSLR, if the white balance
is set to auto, would not capture the orange tinge. The "cloudy" white balance
preset might be a good starting point.

~~~
dividedbyzero
How would I do this on iOS, if I had just my phone and no CaptureOne on a
laptop or the like? Are there apps that let me use a gray card to set the
white balance of raw photos?

~~~
30minAdayHN
When you click Edit on a photo, you will have a bunch of edit options. The
Warmth and Tint sliders represent white balance. You can increase the warmth
until it matches what your eye perceives.

Essentially you are telling your camera post capture what the temperature of
the source is.
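
A crude sketch of what a warmth slider does under the hood: positive warmth
boosts red and cuts blue, shifting the image toward a lower assumed color
temperature. Real editors use calibrated temperature/tint curves; the 0.4
gain factor here is invented for the example:

```python
# Toy warmth adjustment: warmth in [-1, 1], positive values warm the
# image by raising the red channel and lowering the blue channel.
def apply_warmth(rgb, warmth):
    """Return an 8-bit RGB pixel shifted toward warm (or cool) tones."""
    r, g, b = rgb
    r = min(255, round(r * (1 + 0.4 * warmth)))
    b = max(0, round(b * (1 - 0.4 * warmth)))
    return (r, g, b)

# A washed-out gray sky pixel, warmed back toward orange:
print(apply_warmth((180, 180, 180), 0.8))  # (238, 180, 122)
```

This is the post-capture version of the fix: instead of telling the camera the
illuminant up front, you undo its neutral assumption after the fact.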

