
Experimental Nighttime Photography with Nexus and Pixel - monort
https://research.googleblog.com/2017/04/experimental-nighttime-photography-with.html
======
BeetleB
Amateur photographer here.

This should be the future of DSLRs. Provide some sort of API so that I can
create recipes for my photography project. Bonus if the hardware is powerful
enough for me to process the way _I_ want to on it (as opposed to the
built-in, mostly useless, features).

As a silly example, say I want to take 10 photos. I want the first photo to be
1/30s. The next 1/15s, and so on - doubling the interval each time. I just
want to be able to program this, and assign it to a button/menu item, so it
will do it automatically.

Or I want to do custom focus stacking. It should automatically take N shots at
predefined focal distances and, if powerful enough, stack them.
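
A minimal sketch of what such a recipe could look like (the `camera` object
and both of its methods are hypothetical; this is just the shape of the API
I'd want):

    # A hypothetical camera API, just to sketch the "recipe" idea.
    def exposure_ladder(camera, start=1 / 30, n=10):
        """Take n shots, doubling the shutter time each frame."""
        t = start
        for _ in range(n):
            camera.capture(shutter=t)  # hypothetical call
            t *= 2  # 1/30s, 1/15s, ...

    def focus_stack(camera, distances=(0.5, 1.0, 2.0, 5.0, float("inf"))):
        """Take one shot at each predefined focus distance (meters)."""
        for d in distances:
            camera.set_focus_distance(d)  # hypothetical call
            camera.capture()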

I've never coded Android apps, so I don't know how much control over the
camera is exposed to you, but why can't camera companies provide the same
level of control?

~~~
aortega
Amateur astrophotographer here.

The techniques in that blog post have been known for at least a decade. In
fact they are quite crude, as they don't account for other types of noise like
readout noise, and they don't build a map of individual pixel sensitivity by
taking "flat" frames.

Stacking pictures _is_ the foundation of astrophotography and there are many
free utilities that do this, for example:

[http://deepskystacker.free.fr/](http://deepskystacker.free.fr/)

BTW digital cameras already take a "black" frame and subtract it from the
"light" frame automatically; that's why the camera sometimes takes a while to
show you a long-exposure picture: it's taking a black frame with the same
exposure time.
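
For the curious, the standard calibration is just per-pixel arithmetic. A
minimal numpy sketch (simplified; real pipelines also shoot matching-exposure
darks for the flats):

    import numpy as np

    def calibrate(light, darks, flats):
        """Subtract dark current, divide out per-pixel sensitivity.
        light is one frame; darks/flats are stacks of the same shape."""
        master_dark = np.median(darks, axis=0)         # thermal/bias signal
        master_flat = np.median(flats, axis=0) - master_dark
        master_flat /= master_flat.mean()              # normalized sensitivity map
        return (light - master_dark) / master_flat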

~~~
BeetleB
I'm well aware of all these techniques. However, I'm looking for more
flexibility, like the type I mentioned in my comment.

Something like auto focus stacking. The camera need not do the stacking - just
automatically focus on various portions of the scene and take pictures. I can
then stack on my computer. Currently I have to focus on one spot, take the
picture, then focus on another, take the picture, and so on. What I'd like is
to specify, say, five focus points and have the camera do the rest of the work.

The built-in bracketing capability in cameras is really minimal.

I was looking at Magic Lantern, and it has a time-lapse mode that essentially
maintains constant brightness, so when day transitions into night the camera
auto-adjusts the exposure to make sure the subject does not become darker.

Lots of possibilities.

------
ivanech
Once you know some basic principles of photography, this actually shouldn't be
all that surprising. The Google Nexus 6P uses a 1/2.3 inch type sensor that
measures 6.17 x 4.55 mm. That gives us ~28 square mm of sensor area. The
initial example image was taken with a full-frame camera with a sensor that
measures 24 x 36 mm, yielding 864 square mm. That gives the DSLR ~30x the
sensor area of the Nexus. Then, with the same amount of light per square mm
per second (set by the f-stop of the lens), the Nexus needs to expose the
image 30x longer than the full-frame camera to gather the equivalent amount of
light. It just so happens that this approach used 32 exposures; it makes sense
that the results look comparable to a full-frame camera, because the phone
gathered just as much light.
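
The arithmetic, if you want to check it:

    nexus_area = 6.17 * 4.55     # ~28.1 mm^2 (1/2.3" type sensor)
    ff_area = 24 * 36            # 864 mm^2 (full frame)
    print(ff_area / nexus_area)  # ~30.8, so ~31 equal exposures to match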

------
rb808
My Nexus 6p takes amazingly good photos - I rarely use an SLR now.

The main problem I have is that it shuts down after 10-30 minutes of use, with
no help from Google.
[https://code.google.com/p/android/issues/detail?id=227849](https://code.google.com/p/android/issues/detail?id=227849)
Makes me want an iPhone. (Though see below: Huawei might be fixing it.)

~~~
thatcherc
There's actually a class action lawsuit related to this issue [1]. I first
read about it here [2]. I've got a Nexus 6P too and it's great except for the
battery and shutdown issues. Mine will shut down at around 20%, so it's still
very usable, but it's often at night when I most need my phone (calls,
rides, etc).

[1] - [https://chimicles.com/google-nexus-6p-battery-early-shutoff-bootlooping-class-action-investigation/](https://chimicles.com/google-nexus-6p-battery-early-shutoff-bootlooping-class-action-investigation/)

[2] - [https://www.engadget.com/2017/04/21/lawsuit-takes-aim-at-google-huawei-over-nexus-6p-battery-issues/](https://www.engadget.com/2017/04/21/lawsuit-takes-aim-at-google-huawei-over-nexus-6p-battery-issues/)

~~~
emmelaich
The 5x too. It may well be the same design flaw as the 6p.

[https://issuetracker.google.com/issues/37117345](https://issuetracker.google.com/issues/37117345)

Despite the comments there about software, it is most likely about heat and a
component that degrades over time.

I've had a 5x just go dead. I bought a Pixel, and it died after a week
(probably unrelated to this fault). The replacement is fine though; the phone
is great overall and the camera is fantastic.

------
soylentcola
I've been pretty happy with my Pixel XL's camera (easily keeps up with the
incremental improvements in cell phone cameras over the years) but I also
appreciate how they didn't make this into too much of a puff piece where they
"hand wave" away the amount of work still needed in Photoshop or similar.

The takeaway is that with some effort and intelligent use of other software
tools, you can put together a nice image with all sorts of lower-end cameras.
The bit at the end about hopefully adding some of this functionality to
software available on the phone was a nice touch, as I imagine all of the big
players are always working on that sort of thing.

------
zokier
The photos are impressive. But given the introduction, it would have been
nice to have some actual side-by-side comparisons with a DSLR, and then, just
for fun, the same exposure stacking applied to the DSLR at max ISO.

~~~
twiss
Here are the first and second-to-last images in the article:

DSLR: [https://1.bp.blogspot.com/-JJORxMmYFuw/WPkESrozTuI/AAAAAAAAByU/pkjoul07knIfKWyvM-xmlaCzvODG28FJQCLcB/s1600/image5.jpg](https://1.bp.blogspot.com/-JJORxMmYFuw/WPkESrozTuI/AAAAAAAAByU/pkjoul07knIfKWyvM-xmlaCzvODG28FJQCLcB/s1600/image5.jpg)

Pixel: [https://4.bp.blogspot.com/-yo9hRd3xLLg/WPkERdbOs7I/AAAAAAAAByk/UDE_MAh3bXszzir_Grxs672mpiKY82DlACEw/s1600/image11.jpg](https://4.bp.blogspot.com/-yo9hRd3xLLg/WPkERdbOs7I/AAAAAAAAByk/UDE_MAh3bXszzir_Grxs672mpiKY82DlACEw/s1600/image11.jpg)

~~~
fudged71
The author really should have cut down on the blue saturation in the phone
examples; it's pretty ridiculous IMO. But regardless, it's an incredible
result considering the sensor's capabilities.

------
polyvisual
"The camera cannot handle exposure times longer than two seconds."

What's the reason for this? Is it a hardware restriction? I suppose an
artificial software restriction could be removed by using root / other camera
software.

~~~
quantumet
Mostly hardware.

Shortest explanation: The sensor exposure time register has a maximum value.

Next shortest: But it's actually in units of row readout time, on many
sensors, which is also configurable, so the exposure time can be made longer
at the cost of slower image readout. In normal operation, readout has to
happen at 30fps at least, so extra code is needed to switch to slower readout
for extended exposure values. This code then needs validation, the image
processing tuning tables need to be updated and verified for the new long
exposure durations, and any preview glitches, etc, from resetting base sensor
configurations need to be addressed. So a lot of extra work, for a relatively
niche feature on a smartphone.

Even longer: Many sensors also have an external shutter trigger signal pin,
for unlimited exposure duration. But that needs to be wired to the CPU, and
all the SW considerations above also apply.
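
A back-of-the-envelope sketch of the register math; the register width and
line timing below are made-up but plausible values:

    # Hypothetical but typical numbers: exposure is programmed in units of
    # row readout time, and the register holding it has a fixed width.
    line_time_s = 1 / (30 * 2500)   # row time for a ~2500-row frame at 30fps
    max_register = 0xFFFF           # 16-bit exposure register, in lines

    print(max_register * line_time_s)      # ~0.87 s ceiling at full readout speed
    print(4 * max_register * line_time_s)  # ~3.5 s if readout is slowed 4x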

~~~
ahakki
Couldn't it just be a heat dissipation issue? These sensors pack a lot of
photosites into a tiny package.

~~~
quantumet
To first order, long exposures are probably less power-hungry - much of the
sensor power is burned in the image readout, and longer exposures mean you're
reading images less often.

When collecting light, an image sensor pixel isn't really using up any active
power (each pixel is basically a capacitor collecting electrons generated by
light hitting the silicon).

~~~
jfindley
I don't believe this is entirely correct. DSLR sensors get very hot during
long exposures, to the point where excessively long exposures can introduce
noise into the output from this heat. I don't see why this wouldn't apply even
more strongly to phone sensors, with their incredibly high photosite density.

However, really long exposures are going to suffer from star trails: the
earth rotates relative to the stars, so a long exposure turns each star from a
point into a short line, which _usually_ isn't what you want. On a 35mm camera
with a fairly wide lens you can get away with ~30s of exposure time.

On a Pixel phone I think you'd be able to get away with ~3s of exposure time,
but as the sensor is going to get pretty hot over that time, I'm not sure how
much extra image quality you'd actually end up with.
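
For reference, the usual back-of-the-envelope here is the "500 rule": divide
500 by the full-frame-equivalent focal length to get the longest trail-free
exposure in seconds. A sketch with a guessed phone focal length and crop
factor; in practice the tiny, dense photosites argue for being much stricter
than the rule suggests:

    def max_exposure_500(focal_mm, crop_factor=1.0):
        """'500 rule' heuristic: longest exposure (s) before visible trails."""
        return 500 / (focal_mm * crop_factor)

    print(max_exposure_500(16))         # ~31 s on a wide full-frame lens
    print(max_exposure_500(4.67, 5.9))  # ~18 s by the rule for a phone-like camera,
                                        # though trails show up much sooner in practice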

------
pdelbarba
Reminds me of the low-light video camera that was on here earlier:
[http://kottke.org/17/04/incredible-low-light-camera-turns-night-into-day](http://kottke.org/17/04/incredible-low-light-camera-turns-night-into-day)

Not as fast, but the results look fairly similar.

------
jbarham
FWIW Olympus already provides in-camera shot stacking for night shots in its
Micro Four Thirds cameras via their "live composite" mode. See
[http://www.duford.com/2016/08/explaining-olympus-live-
compos...](http://www.duford.com/2016/08/explaining-olympus-live-composite-
mode/) for example shots.

------
fudged71
Incredible results!! I would really love to be able to do this on a phone.

I've tried with iPhone to take long exposures and crank up the brightness/HDR
to bring out as much signal as possible. This is the best that I could do on a
fully moonlit night: [http://imgur.com/a/km1D9](http://imgur.com/a/km1D9)

~~~
anc84
You could probably get better results by taking multiple independent shots and
merging them later.

~~~
fudged71
The app Hydra does this. It's not specifically meant for super dark photos so
I don't think they extend exposures to the max.

------
HappyTypist
What I don't get is why JPEG is still so commonly used when it introduces
significant artifacting (even at high quality settings), while a more advanced
format like WebP can deliver higher-quality images at the same file size.

~~~
TulliusCicero
What does this have to do with the article?

~~~
spyder
The images in the article are JPEGs with visible compression artifacts, which
isn't good if they want to demonstrate the quality of the photos they can make
with the phone.

------
Theodores
I think the time-lapse-style stars are a feature, not a defect, although they
go against the original challenge.

Although this article is about the challenge - decent photography with a
mobile phone - it does outline how easy it is to layer up lots of images in
Photoshop, median everything out and get a long exposure image. Taking out the
sensor 'median' was clever too.

So you could use this with DSLR images too, to take better long exposure
images whatever the sensor, so long as everything is fully HDR and manual.

I think I might just give it a go, with PHP and ImageMagick so that I can
automate the Photoshop part and tweak settings easily.
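
The "median everything out" step is a one-liner once the frames are loaded; a
minimal numpy/Pillow sketch, with the file names assumed:

    import numpy as np
    from PIL import Image

    # Load a burst of pre-aligned frames (file names assumed) and median-stack.
    frames = np.stack([np.asarray(Image.open(f"frame_{i:02d}.jpg"), dtype=np.float32)
                       for i in range(32)])
    stacked = np.median(frames, axis=0)  # the median rejects transient outliers
    Image.fromarray(stacked.astype(np.uint8)).save("stacked.jpg")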

~~~
semi-extrinsic
By "taking out the median" you're talking about what's known as dark frame
subtraction. I believe most DSLRs already do this internally for long
exposures - try setting your camera for 10s exposure and time how long it
takes before it's ready again; I'll bet you dollars to donuts it's about 20s.
Smartphones can't do this because they lack an internal shutter.

On advanced models (and hacked cameras running Magic Lantern/CHDK), you can
turn this off and do it manually, e.g. shoot a dark frame for only the first
image in a series so you can get ~95% duty cycle rather than ~45%. Especially
useful if you're trying to capture a rare event, e.g. lightning.

What's weird about the technique in TFA is that he takes N bright frames and
then N dark frames, computes both medians, and then subtracts. By interleaving
bright and dark frames, and doing the subtraction before stacking, I'm pretty
sure they'd get better results.

As for image stacking, there are tons of ways to do it, and quite a few
turnkey astrophotography stacking apps as well; e.g. here's a review from last
year of a Mac app doing just this:

[https://petapixel.com/2016/02/20/stack-photos-epic-milky-way-landscapes/](https://petapixel.com/2016/02/20/stack-photos-epic-milky-way-landscapes/)

~~~
gjm11
Doing all the bright frames together and all the dark frames together is
surely _suboptimal_, but it's hardly _weird_: there's an obvious reason for
it. (Namely, that the transition between bright and dark requires covering the
camera lens with black tape, and it's easier to do and undo that _once_ than
_64 times_.)

~~~
username789
If the sensor is stable over the period when the frames are shot, why would
subtracting the mean of the black frames from the mean of the exposures be
worse than subtracting individual black frames from individual exposures?
Aside from rounding errors and possible overflows the two procedures should be
equivalent:

    (e1 + e2 + ... + en)/n - (b1 + b2 + ... + bn)/n
        = ((e1 - b1) + (e2 - b2) + ... + (en - bn))/n

~~~
gjm11
The suggestion isn't about what order you do the arithmetic in, but about what
order you do the captures in. ++++---- versus +-+-+-+-. The former has the
disadvantage that if something that affects the images (temperature, say) is
varying gradually, or changes abruptly at a particular point, it's easier for
it to have substantially different effects on the light and the dark frames.

It's possible that something like Thue-Morse (+--+-++--++-+--+ etc.; one way
to define this is to look at the parity of the number of 1-bits in the binary
representation of the frame number) might be better than alternating. If
whatever disturbances you might worry about are smooth in the right sort of
way, it gives you more exact cancellation than alternating.
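
The popcount-parity definition is a one-liner:

    def thue_morse(n):
        """Parity of the number of 1-bits in n: 0 -> bright frame, 1 -> dark."""
        return bin(n).count("1") % 2

    print("".join("+-"[thue_morse(i)] for i in range(16)))  # +--+-++--++-+--+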

------
LoSboccacc
If this could become an app it'd be great.

~~~
veli_joza
I'm even more interested in the SeeInTheDark app mentioned there (which is
unfortunately not available yet, as it's part of Google research). The video
demo is so impressive. I don't care that much about photography, but decent
sensors and optics combined with innovative real-time processing are really
starting to pick up.

I've already started using my phone's camera app to enhance my vision for
text that's too far away to read. Advances in post-processing will extend our
vision even further, as well as allow for night vision and understanding text
in other languages. We are transitioning into cyborgs by means of the Android
platform...

~~~
Synaesthesia
Seems like quite a cool app and simple to implement as well. Aligning the
images if the phone is not held still could be a problem.

------
locust101
It's great that they are investing so much in the camera. Among modern
smartphones, I don't feel any difference in performance between my Moto Play
and S7 Edge. It's just the camera that keeps me coming back to the S7 Edge,
despite the $300 price difference. It's just much more convenient, and I have
stopped bringing my DSLR on trips since the phone camera works really well.

------
pareshverma91
I think it would be interesting to compare an image shot in the daylight with
the one created with this processing scheme in the same daylight. I expect the
normal exposure shot to be better than the processed one, but it would be
interesting to observe the difference in quality.

------
Jack000
For the stars shot, the next obvious step would be to automatically segment
the image with optical flow. They could also try solving the hand-held problem
with ORB feature matching (rough sketch below).

Given the previous posts from the Google blog, I kind of expected a bit more
algorithmic involvement beyond image stacking.
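
A rough OpenCV sketch of that ORB idea, with guessed parameters: align each
frame to a reference before stacking.

    import cv2
    import numpy as np

    def align(ref_gray, img_gray):
        """Warp img_gray onto ref_gray via ORB matches + RANSAC homography."""
        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(ref_gray, None)
        k2, d2 = orb.detectAndCompute(img_gray, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d2, d1)
        src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return cv2.warpPerspective(img_gray, H, ref_gray.shape[::-1])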

------
th0ma5
I know on the XDA forums, at least for the 5x, there is an app someone threw
together to do long exposure times. I played with it some and got a pixelated
mess, but I didn't try a tripod. Some people on the thread produced some
pretty good pictures.

------
dboreham
Interesting work. However, in order to reproduce the Marin Headlands DSLR
image, the smartphone would also have needed to stitch a panorama from
multiple frames (because the lens used in the DSLR shot has roughly 2x the FOV
angle).

------
tambourine_man
>Still, this may be the lowest-light cellphone photo ever taken.

Not so fast buddy. Some of us have been playing with this stuff for a really
long time :)

------
shaklee3
Fascinating article. Thanks for all the work.

------
carlob
Why take the mean instead of the median of the images? Wouldn't the median be
less sensitive to outliers?

------
Gravityloss
Cameras, like so many things, have become increasingly software-limited in
recent years.

