
Low Light and High Dynamic Range photography in the Google Camera App - jmintz
http://googleresearch.blogspot.com/2014/10/hdr-low-light-and-high-dynamic-range.html?
======
ioedward
I've been messing around with the Google Camera app and the new camera API
provided by Android 5. It turns out that you can obtain much higher quality
images by using the DNG (digital negative) of the photo instead of the JPEG.
By "HDRing" images yourself, you can actually outperform Google Camera's HDR+
functionality.

Here's the JPEG, non-HDR+ shot on a Nexus 5:
[http://i.imgur.com/So44muL.jpg](http://i.imgur.com/So44muL.jpg)

Here's a similar image shot with HDR+:
[http://i.imgur.com/QFS3ZYd.jpg](http://i.imgur.com/QFS3ZYd.jpg)

As you can see, the dynamic range is increased greatly; however, there are
strange black spots in the shadows.

Here's the same photo that I took in DNG format, edited in Lightroom:

[http://i.imgur.com/VRFsnf5.jpg](http://i.imgur.com/VRFsnf5.jpg)

And here's my HDR photo, combining 5 DNG exposures using Photoshop's HDR Pro
functionality:

[http://i.imgur.com/RTT6ULz.jpg](http://i.imgur.com/RTT6ULz.jpg)
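
If anyone wants to play with the merge step themselves, it boils down to
something like this (a minimal Python sketch assuming the rawpy and numpy
packages; the file names and exposure times are made up, and a real merge
also needs alignment and rejection of clipped highlights):

    import numpy as np
    import rawpy  # LibRaw wrapper; handles DNG demosaicing

    # Hypothetical bracketed DNGs and their exposure times in seconds
    files = ['shot1.dng', 'shot2.dng', 'shot3.dng', 'shot4.dng', 'shot5.dng']
    times = [1/120, 1/60, 1/30, 1/15, 1/8]

    radiance = None
    for path, t in zip(files, times):
        with rawpy.imread(path) as raw:
            # Demosaic to linear 16-bit RGB: no gamma, no auto-brightening
            rgb = raw.postprocess(gamma=(1, 1), no_auto_bright=True,
                                  output_bps=16).astype(np.float64)
        est = rgb / t  # scale each frame to a common radiance estimate
        radiance = est if radiance is None else radiance + est
    radiance /= len(files)

    # Crude Reinhard-style global tonemap back to 8 bits for display
    tone = radiance / (radiance + radiance.mean())
    out = (255 * tone).clip(0, 255).astype(np.uint8)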

~~~
higherpurpose
Could Google's HDR+ also use DNG instead of JPEG and combine those
automatically? That would make it much slower, though, so they'd have to take
3 shots at most.

~~~
ioedward
There's nothing technical stopping them from doing it. I expect third-party
apps to do something similar once Lollipop has a higher market share.

------
ChuckMcM
Seems like two groups of people come to this experience and walk away with
completely different impressions. One group is people for whom camera phones
have been their only "camera", and they are amazed at how great the pictures
are now. The other group is people who argue the details of phase-detection
autofocus in the Canon DSLR and the 51-point autofocus of the Nikon DSLR and
can't believe that the handset makers are allowed to actually call these
things 'cameras' in their advertisements. Very little input from the folks in
the middle.

~~~
potatolicious
> _" The other group are people who argue the details of a phase detection
> autofocus in the Canon DSLR and the 51 point field autofocus of the Nikon
> DSLR and can't believe that the handset makers are allowed to actually call
> these things 'cameras' in their advertisements."_

This demographic can be safely ignored. The people who get the most up in arms
about the "purity" of photographic tools are also the ones producing the least
work. They're the ones who buy $10,000 worth of bodies and lenses but can
never go beyond photos of their local park, or sharpness test charts in their
basement. This group isn't good for much more than vociferous, highly-
technical religious wars on the Internet. We'll start caring what they think
when they start producing work.

In the meantime there are _many_ passionate photographers out there producing
great work, with a variety of tools, cheap to expensive, simple to complex.

This guy with a crappy old iPhone 3GS has been taking better photos (and
publishing them) than the bulk of people with 5D3's and 70-200 f/2.8's:
[http://boingboing.net/2009/10/29/photographer-takes-p-1.html](http://boingboing.net/2009/10/29/photographer-takes-p-1.html)

~~~
piyush_soni
Really, I don't see anything good in his photos. All these post-processed
photos lack a lot of detail, even by smartphone standards.

~~~
potatolicious
And this is part of the problem - it was never about detail, or more generally
technical perfection. And yet that's where the conversation starts and stops.
The conversation around cameras focuses on noise level, dynamic range, or lens
sharpness - none of which are particularly critical traits for producing great
photography.

The Tank Man[1] photograph would not pass muster even by low-end cell phone
standards today, but it's as powerful now as when it was shot.

The same goes for Cartier-Bresson's famous "leaping man"[2] - more dynamic
range, more sharpness, less noise, less grain would not have made the image
any better.

Even moving into modern times, Bruce Gilden didn't need perfect sharpness or
dynamic range to document the yakuza from the inside[3]. Not only did he not
fuss over focus points and phase-detect vs. contrast-detect autofocus, he
didn't even have autofocus!

Like Weegee said: "f/8 and be there". It's about the picture, not the gear.
You only care about the gear insofar as it enables you - and nearly all
cameras (including cell phone cameras) are _well_ past the point of enabling.

And this is the problem with the "argues about gear on the internet"
demographic - they don't produce. They spend a lot of money and time
acquiring, testing, and verifying the technical perfection of their gear, and
too little time actually photographing. The best you get out of this group is
technically-perfect banality.

[1]
[http://iconicphotos.files.wordpress.com/2009/04/030.jpg](http://iconicphotos.files.wordpress.com/2009/04/030.jpg)

[2] [http://www.dienes-and-dienes.com/Assets/CBManLeaping.jpg](http://www.dienes-and-dienes.com/Assets/CBManLeaping.jpg)

[3]
[http://3.bp.blogspot.com/--vyBAjPjz6U/TkgF6Qla38I/AAAAAAAAJA...](http://3.bp.blogspot.com/--vyBAjPjz6U/TkgF6Qla38I/AAAAAAAAJAo/Csvy5n5uCzQ/s1600/bruce%2Bgilden%2Byakuza%2B2.jpeg)

~~~
vinkelhake
All other things equal, better equipment will allow you to get better shots.
Consider that the iconic Tank Man photograph was shot from half a mile away -
it would have been impossible to shoot with today's camera phones.

I know that when I shoot my banal pictures of banal life, I prefer pictures
where the focus isn't accidentally on the background or where the subject
hasn't half-exited the frame because of a delay between pressing the button
and the shot going off.

Most of us are not in the habit of documenting the Yakuza or happening to be
present in world-changing events. We just want to take pictures that look
good, even if the subject is banal.

Edit (since I'm not allowed to comment on the post):

> This is the part where we disagree. Better knowledge will allow you to get
> better shots

We're not in disagreement. "All other things equal" means just that - all
other things being equal - including knowledge. Of course a bad picture with
perfect focus is still a bad picture. Likewise, a good picture can become even
better if it's technically accomplished. Tank Man is an iconic photograph
solely because of the subject matter. Ansel Adams didn't settle for a dinky
rangefinder when heading out into Yosemite. He carried large, heavy equipment
because it would allow him to capture the detail and sharpness he wanted.

~~~
potatolicious
> _" All other things equal, better equipment will allow you to get better
> shots."_

This is the part where we disagree. _Better knowledge_ will allow you to get
better shots - we passed the point _long_ ago where improving technical
capability made dramatic improvements to how well people can photograph.

Ultimately what makes or breaks a photograph isn't sharpness, or even focus
in particular; it's exposure and composition. A shite picture isn't made any
less shite because the focus is bang-on. Likewise, a well-composed image
survives a great deal of mis-focus, blur, or other technical faults.

If an image is shit because the focus was off, I'd hate to say it, but it
wouldn't have been an excellent image even if the focus was on.

Short of extremely equipment-demanding niches (like macro or sports), the
problem is practically always with the photographer, not their equipment. The
photographer is the most common bottleneck in creating great images - of any
subject, banal or world-changing. Spend money on education, not more gear -
and more importantly, spend _time_.

Technological advancements will give us much-appreciated conveniences, but
they won't make you a good shooter if you weren't one before.

Side note: this is why I'm a fan of things like iPhoneography classes, as
much as people like to mock it. Ultimately putting a camera into everyone's
pockets has been _great_ for photography and expression, and elevating the
quality of this stuff (whether intended as art or just personal enjoyment)
involves education, not gear.

~~~
tripzilch
I'm inclined to agree with you, but a bunch of common counterexamples
immediately spring to mind. Very often, sorting through a huge number of low-
quality phone camera pics from a party or something, I encounter a "happy
accident" that would have made a great photo _if only_ it had been properly
in focus, not blown out by light, or less noisy (especially when you'd
normally crop the pic; phone camera grain is way uglier than film grain) -
any of those things that a medium-quality camera suffers from way less.

Am I wrong? Is the blurry picture possibly just as great (even if you can't
recognize my friends' faces?), or would it have turned out to be a shitty
picture after all, just obscured by the lack of focus?

~~~
vertex-four
Would you have gotten the details right and taken that particular photo at
all if you were paying more attention to each shot? I don't think this has
much to do with the camera - if the picture's blurry, you were likely taking
it in a hurry, and you would've taken it in a hurry even if you had a real
camera.

------
Hopka
The automatic alignment helps prevent ghosts from camera shake. But it would
be interesting to see how they avoid ghosts from moving subjects, like leaves
in the wind, if they are effectively exposing longer (by taking several
photos in a row).

------
boxcardavin
One should remember that writing camera software that is agnostic to the
hardware is technically hard, and nearly impossible if you want gorgeous
photos (a la iPhone). When taking astronomy images, we would characterize the
CCD (mostly for noise) each night, and you need that kind of approach to tune
your software to your hardware.

~~~
xexers
> One should remember that writing camera software that is agnostic to the
> hardware is technically hard

In this case, is it?

Their solution involves using burst mode, then taking those many pictures and
turning them into one high-quality image. Burst mode would simply output a
bunch of .jpg images... couldn't you run the algorithm on those standardized
images?

~~~
boxcardavin
You get what's called dark current (thermal noise) on CMOS and CCD sensors,
and the signal-to-noise ratio goes down as you decrease the exposure time.
You see the effect when taking a low-light photo: the graininess comes from
the SNR being very low, so the noise shows through. You can smooth this out
quite a bit by combining burst photos (see the Cortex Camera app), since the
noise is random, but you always get better results with more light in each
exposure.
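
A toy numpy demonstration of the stacking effect (made-up numbers, pure read
noise for simplicity):

    import numpy as np

    rng = np.random.default_rng(0)
    signal = 10.0  # true signal level per pixel, arbitrary units
    noise = 5.0    # per-frame random noise (read noise, dark current, ...)

    # Simulate a 16-frame burst of a flat grey patch
    frames = signal + noise * rng.standard_normal((16, 500, 500))

    print(frames[0].std())            # ~5.0: one short exposure is grainy
    print(frames.mean(axis=0).std())  # ~1.25: 16 frames gives 5/sqrt(16)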

------
bentcorner
> _However, bracketing is not actually necessary; one can use the same
> exposure time in every shot. By using a short exposure HDR+ avoids blowing
> out highlights, and by combining enough shots it reduces noise in the
> shadows._

Does this mean they're using ISO bracketing instead of exposure bracketing?

~~~
sp332
No, it takes multiple photos with the same settings and just averages out the
noise in the shadows. That wouldn't keep the shadows from lacking detail
though. You'd just get a smoother black!

~~~
vilhelm_s
I think it should work for shadows also? Adding together 10 photos taken at
1/100s should give the same amount of light as one photo taken at 1/10s.

~~~
sp332
But if each photo is just black because it's underexposed, then adding them
together only gives you black.

~~~
vilhelm_s
If the pixels were literally black that would be the case, but even on very
underexposed pictures I don't think that ever happens. There's still photons
on the sensor, it's just that they are swamped by the noise. Like, most DSLRs
nowadays capture 12-bit values, which suggests you would have to underexpose
by something on the order of 12EV to truncate the output to zero.

~~~
CHY872
Yes, and that is very, very possible. In any case, doing that would give you
some relatively painful quantisation error, which is why the HDR approach
works better.
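
A quick toy sketch of both effects (made-up numbers): sensor noise dithers
the quantiser, so stacking can recover a mean well below one step, but with
zero noise every frame would quantise identically and stacking would recover
nothing:

    import numpy as np

    rng = np.random.default_rng(0)
    pedestal = 64  # black-level offset sensors add so noise isn't clipped off
    signal = 0.3   # true level in ADU -- well below one quantisation step
    noise = 1.0    # read noise in ADU; it effectively dithers the quantiser

    # A 100-frame burst of 10k pixels through a 12-bit ADC (round, then clip)
    frames = np.round(pedestal + signal
                      + noise * rng.standard_normal((100, 10000)))
    frames = frames.clip(0, 4095)

    stack = frames.mean(axis=0) - pedestal  # per-pixel stack estimate
    print(abs(frames[0] - pedestal - signal).mean())  # ~0.8 ADU error/pixel
    print(abs(stack - signal).mean())                 # ~0.08: 0.3 recovered
    # With noise = 0.0, every frame would read exactly `pedestal` and the
    # stack would recover nothing -- the truncation case discussed above.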

------
FreakyT
It's unfortunate that HDR+ mode in Google Camera remains unavailable for
phones other than the Nexus 5 & 6 -- I have a Moto X, and the app is really
only useful for photospheres, since normal photos tend to look awful without
HDR.

Has Google discussed any technical reason for this restriction? Seems like
lots of third party apps support HDR on a wider variety of phones...

~~~
adrusi
Google's HDR+ is different from regular HDR, which can be achieved through
burst exposure bracketing and computation alone. HDR+ also performs
stabilization and other tricks to improve the result, and these operations
might depend on additional hints from the hardware to achieve the desired
accuracy. OEMs are free to implement such features in their own apps,
specialized to their own hardware, and most in fact do.
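
For what it's worth, the "regular" kind is only a few lines with OpenCV in
Python (a sketch; the file names are hypothetical):

    import cv2
    import numpy as np

    # A hypothetical bracketed burst: under-, normally and over-exposed JPEGs
    imgs = [cv2.imread(p) for p in ('under.jpg', 'normal.jpg', 'over.jpg')]

    # Coarse alignment to absorb hand shake between the shots
    cv2.createAlignMTB().process(imgs, imgs)

    # Mertens exposure fusion: blends the well-exposed regions of each frame,
    # with no camera response curve or true radiance recovery needed
    fused = cv2.createMergeMertens().process(imgs)
    cv2.imwrite('fused.jpg', np.clip(fused * 255, 0, 255).astype(np.uint8))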

------
davb
It's worth noting that with HDR+ enabled on the Nexus 5, you can no longer
enable the flash or timer. And the processing time can feel frustratingly
long. So leaving it enabled all the time doesn't work for everyone.

~~~
higherpurpose
Not sure if there will be a change in the processing itself (the bar filling
up afterwards), but with Lollipop, taking the picture (the circle loading)
should be a lot faster. They also say it's 0.3-1 sec, which isn't too bad.
Normal photos used to take that long on high-end phones a couple of years
ago.

------
joosters
Does anyone know if this is the same technique that iOS uses for HDR? (I know
the basic principle of multiple exposures is the same.) Are there different
algorithms at play here?

------
ot
I posted this 2 hours ago [1] and both posts are on the front page; the URLs
differ just by a trailing question mark.

How come the duplicate detector didn't trigger?

[1]
[https://news.ycombinator.com/item?id=8515417](https://news.ycombinator.com/item?id=8515417)

~~~
joosters
I think the duplicate detector looks at exact URL matches. Things like a
trailing / or ? will let a new story go through.

~~~
ot
The duplicate detector is usually quite a bit more clever than just exact URL
matching. Why the downvote?

EDIT: In fact, I just tried to re-submit an existing link by adding a question
mark and it didn't go through.

~~~
peteretep

        > The duplicate detector is usually quite more clever than 
        > just exact URL match
    

This doesn't match my experience, but then I usually editorialize my titles if
I'm resubmitting something I know is here already.

------
forrestthewoods
Yikes. That first picture looks awful. The HDR+ version looks like cheap CGI
- very plasticky. Some of the other pics are nice, so that seems like a
really weird one to lead with.

~~~
georgemcbay
As someone who is a pretty serious hobbyist photographer, it is actually the
second picture that looks very unnatural to me.

The first one I can imagine existing in "real life" if there were a soft-box-
enclosed fill light out of frame lighting up the lady's face (it can't be the
sun, since the tree isn't being lit the way it would be if the light hitting
her were point-light-esque), which would make the photo very "posed", but
still possibly something that isn't heavily post-processed.

The second picture, with the two ladies, is much more distracting. The HDR
version keeps a lot more detail than the non-HDR image, but at the expense of
making everything look extremely flat and unnatural as your brain tries to
work out how the lighting works (this may be unique to people who are used to
worrying about light in photography, and a non-issue for normal folks, I
can't say for sure): the scene is clearly midday, but the light across the
entire scene makes it seem like the sun must be very low, which doesn't match
the contents of the scene.

The HDR version of the second photo would look better if the exposure on the
two ladies were bumped up by close to the amount that the background sky was
bumped down, but doing that automatically would be an amazing visual-
detection feat that I wouldn't expect from a phone camera. The lighting still
wouldn't make sense, but at least it wouldn't look so flat.

~~~
driverdan
The interesting thing about the second HDR+ photo is that it seems to have
true high dynamic range instead of the limited range created by different
exposure settings. The subjects are actually darker than in the non-HDR
picture. I'm not saying that it looks great, just that it's interesting from a
technical perspective.

------
piyush_soni
The blog post fails to discuss why the Samsung Galaxy S4/S5's HDR results are
much better than my Nexus 5's. Also, there are limitations to leaving it on.
Besides being slow, it has other restrictions, like the inability to use the
flash while in HDR+ mode. However good it may be by default, flash is
sometimes important not only in the dark, but also when facing sunlight while
taking the shot.

~~~
magicalist
> _The blog post fails to discuss why the Samsung Galaxy S4/S5's HDR results
> are much better than my Nexus 5's_

Well, a $300 price difference can buy you a lot.

Now I get it. Comments like this one must be why Google felt pressured into
making the next Nexus phone so much more expensive :P

~~~
tracker1
Probably has a bit to do with it... I, for one, was really happy with a great
mid-range option that was by far the best value in each release... still on
my N4, and not sure what I'm going to get as an upgrade.

~~~
magicalist
Yep, I'm with you.

I'm more or less fine with the nexus 5's camera (usually if an event is
important to capture with great quality, someone else has a better camera
already there), so I'd much rather the nexuses stick to the price point than
go up in quality for a disproportionately larger increase in price.

------
no_news_is
The blogger wrote SynthCam for iOS first:
[https://sites.google.com/site/marclevoy/](https://sites.google.com/site/marclevoy/)

Some similar technology; my impression is that it's less "automatic" but
offers more control. I haven't seen either in action, so I'm not sure how
they compare.

------
notlisted
I have an Android phone, but what is described in the post sounds a lot like
what the app Average Cam Pro [1] does on my iPad (take multiple exposures and
average them out), with the refinement that the ACP app also allows you to
subtract a "dark image" to combat sensor-specific banding noise, plus the
user can adjust exposure after the shot.
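
In numpy terms, my guess at what it's doing is something like this (a sketch,
not the app's actual code):

    import numpy as np

    def average_with_dark(frames, dark):
        """frames: (N, H, W[, C]) stack of aligned exposures; dark: a frame
        shot with the lens covered at the same settings (ideally itself an
        average of several)."""
        stacked = frames.mean(axis=0)            # random noise averages out
        return np.clip(stacked - dark, 0, None)  # subtracting the dark frame
                                                 # removes fixed-pattern noise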

It takes the cleanest pictures by far, beating out my Canon 7D at times, but
for optimal results it does require the device to be extremely steady. I've
been waiting for this app to appear on Android too, but it hasn't so far.

[1] [https://itunes.apple.com/us/app/average-camera-pro/id4155778...](https://itunes.apple.com/us/app/average-camera-pro/id415577873?mt=8)

~~~
notlisted
PS some sample shots created by another photographer:
[http://blog.instagram.com/post/66187514665/howishootwater](http://blog.instagram.com/post/66187514665/howishootwater)

Edit: I just discovered Flickr is filled with 'em. If you scroll down, there
are some good examples of what happens when the number of shots you select
(anywhere from 4-64, I believe) is less than optimal:
[https://www.flickr.com/search/?q=%23avgcampro](https://www.flickr.com/search/?q=%23avgcampro)

I use Avg Cam Pro in touristy situations with lots of movement. Take 64 shots
and (moving) people simply fade away by the time you're done.
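
(If you want the same trick on a desktop: a per-pixel median over the stack
drops the movers even more cleanly than a plain average, since each pixel
shows background in most frames. A numpy sketch, assuming aligned frames:)

    import numpy as np

    def remove_passersby(frames):
        """frames: (N, H, W, C) uint8 stack of aligned tripod shots."""
        return np.median(frames, axis=0).astype(np.uint8)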

------
fumar
Every picture on my VSCO Grid was taken with the HDR+ photo app. Please note:
I have manipulated most of the images. After purchasing the N5 last year, I
quickly learned that the Nexus 5 camera should always be set to HDR+. Even
though the HDR+ process takes multiple images, many of the pictures on my
VSCO Grid were taken in motion, and they still look focused rather than
blurry.

[http://fantasma.vsco.co/](http://fantasma.vsco.co/)

------
TwoBit
Why can't Canon and Nikon DSLRs provide some basic built-in HDR?

~~~
ghaff
At least some modern DSLRs do. My Canon 5D Mark III supports hand-held HDR for
example. It did take a while to find its way into higher-end models though. I
assume the thinking was in part that the primary audience would be shooting on
tripods and could just combine bracketed photos on the computer.

------
nakedrobot2
Yeah, this is fine. But my Nexus 5 takes half of its pictures out of focus. It
is really, really frustrating. Can't they solve that hard problem first? :-)

------
conradfr
HDR on my Nexus 4 allowed me to take _almost_ in-focus photos at live music
events, until it started to crash the camera / reboot the phone. Oh well.

------
lnanek2
Not really interesting, since other OEMs have had this for years. Good to see
Google trying to keep up with their low-cost alternative offerings, however.

~~~
fumar
I have had HDR+ for over a year. This post is an overview of the technology
and not a product launch.

------
thrownaway2424
Meh. The camera software on Android is so bad that a default-settings picture
from an iPhone looks as good as or better than an HDR picture on Android. People
have demonstrated the quality you can get from the sensor on a Nexus 5 if you
capture raw and process offline[1]. I realize that these Google[x] researchers
are probably not responsible for the mainline camera application, but the
organizational details of Android management don't interest me. What would
interest me would be an Android camera app that captures decent pictures, has
usable auto-exposure and auto-focus algorithms, and doesn't take tens of
seconds to start.

1: [http://imgur.com/a/qQkkR#0](http://imgur.com/a/qQkkR#0)

~~~
heliodor
I have a Nexus 5 at the moment and while the pictures are not amazing, what
boggles my mind is that the camera sensor and the phone's screen don't have
the same aspect ratio!

~~~
simias
It usually makes more sense for sensors to be closer to a circular shape
(i.e. a square aspect ratio), because the image projected by the lens is
going to be circular anyway. So if you have a very wide sensor, you waste a
lot of the image circle's light at the top and bottom in order to cover the
sides correctly.
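
Back-of-envelope, the fraction of the image circle an inscribed sensor can
actually use drops as the aspect ratio stretches:

    import math

    # A rectangle with aspect a:b inscribed in a circle of diameter d has
    # area d^2 * a*b / (a^2 + b^2); the circle's area is pi * d^2 / 4.
    for a, b in [(1, 1), (4, 3), (3, 2), (16, 9)]:
        frac = 4 * a * b / (math.pi * (a * a + b * b))
        print(f'{a}:{b} sensor uses {frac:.0%} of the image circle')
    # 1:1 -> 64%, 4:3 -> 61%, 3:2 -> 59%, 16:9 -> 54%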

