
Night Sight for Pixel phones - mattbessey
https://www.theverge.com/2018/10/25/18021944/google-night-sight-pixel-3-camera-samples
======
dhruvp
If you're interested in learning more about this stuff, the head of the Pixel
camera team (Marc Levoy, Prof. Emeritus at Stanford) has an entire lecture
series from a class he ran at Google. The lectures are here, along with notes:
[https://sites.google.com/site/marclevoylectures/home](https://sites.google.com/site/marclevoylectures/home)

What's really cool is that you can see him talk about a lot of these ideas
well before they made it into the Pixel phone.

~~~
lawrenceyan
If you're thinking of taking a look, I particularly enjoyed Lecture 7 of this
series. The practical applications of Fourier analysis in image
recognition/filtering/processing are really quite amazing.

Plus, if you're at all curious about the technical details of how exactly
something like Night Sight is implemented on the Pixel, understanding what
Fourier transforms are and how they are used is vital.
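
As a taste of why the Fourier view is so useful: here's a minimal sketch of
frequency-domain low-pass filtering of a grayscale image in Python/NumPy. This
is purely illustrative, not anything from the lectures or from Night Sight
itself; the function name and cutoff parameter are made up for the example.

    import numpy as np

    def lowpass_filter(image, cutoff=0.1):
        # Move to the frequency domain, with the DC term at the center.
        spectrum = np.fft.fftshift(np.fft.fft2(image))
        h, w = image.shape
        y, x = np.ogrid[:h, :w]
        # Keep only frequencies within cutoff * min(h, w) of the center,
        # i.e. the low frequencies; this suppresses fine detail and noise.
        mask = np.hypot(y - h / 2, x - w / 2) <= cutoff * min(h, w)
        return np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real

    # e.g. smoothed = lowpass_filter(noisy.astype(float), cutoff=0.15)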

~~~
paul7986
Yeah, like building this tech into AR glasses: see daylight while it's night,
or at least see a ton more clearly at night.

------
londons_explore
Prior work at Google Research before it made it into the product:

[https://ai.googleblog.com/2017/04/experimental-nighttime-pho...](https://ai.googleblog.com/2017/04/experimental-nighttime-photography-with.html)

And by the original researcher in 2016:

[https://www.youtube.com/watch?v=S7lbnMd56Ys](https://www.youtube.com/watch?v=S7lbnMd56Ys)

~~~
yorwba
There's also this set of slides on the technology behind SeeInTheDark:
[http://graphics.stanford.edu/talks/seeinthedark-public-15sep...](http://graphics.stanford.edu/talks/seeinthedark-public-15sep16.key.pdf)

~~~
trhway
The multiple-frames approach sounds similar to what's done with the Hubble
telescope.

------
nkoren
What the Pixel cameras are doing is staggeringly good. My father is the
founder of [https://www.imatest.com/](https://www.imatest.com/), and has a
substantial collection of top-end cameras. He's probably in the top 0.0001% of
image quality nerds. But most of the time, he's now entirely happy shooting on
his Pixel.

~~~
walrus01
If you don't mind my derailing the conversation a bit, could you ask your
father what he would recommend for 4K/60fps video as competition for the
Panasonic GH5s? In its weight class, it's one of the few 60fps (and 400Mbps
bitrate capable) cameras that can be mounted to a relatively small sized
gimbal, and flown on a drone that doesn't cost $10,000.

Is there anything expected to be released in the next few months that will be
in a similar price, feature set and weight class?

~~~
elijahparker
Personally I would suggest checking out the new Fuji X-T3 -- I just got it and
am very happy with the video quality. It's actually the only other small
camera right now that will do 4k60p 10-bit.

~~~
walrus01
I just spent ten minutes skimming reviews and specs for that - it looks very
nearly ideal for what I have in mind. Should be sub $1500 for just the camera
body, no lens, and is under 500 grams.

------
rdtsc
> Google says that its machine learning detects what objects are in the frame,
> and the camera is smart enough to know what color they are supposed to have.

That is absolutely impressive.

The color and text on the fire extinguishers along with the texture detail
seen in the headphones in the last picture are just stunning. Congratulations
to anyone who worked on this project!

~~~
Johnny555
It's impressive, but it also means that your camera isn't always going to
capture what's there -- it'll capture what it guesses was there. I wonder how
easily it can be fooled into capturing something that's not there?

~~~
simias
I think the article is factually correct but makes it sound a little more
complicated or advanced than it probably is. I mean, depending on how you
interpret it you could think it does basically "hey, this looks like a fire
hydrant, let me paste a fire hydrant in there", which is obviously not
something AI can do reliably today, especially on phone hardware.

I'm _guessing_ that it works similarly to low-budget astrophotography but with
the computer doing all the busywork for you: when you want to photograph stars
or planets and you don't have a fancy tracking mount to compensate for earth's
rotation, you'll have very mediocre results with long exposure. If you expose
long enough to see the object clearly, you get motion blur. If you use a
shorter exposure to reduce the blur, you don't have enough light to get a
clear picture.

One solution is to take a bunch of low-exposure pictures in a row and then add
them together (as in, sum the values of the non-gamma-corrected pixels) in
post, while taking care to move or rotate each picture to line everything up.
This way you simulate a long exposure while at the same time correcting for
the displacement.
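
For the curious, here's a hedged sketch of that align-and-sum idea in
Python/OpenCV, using phase correlation to estimate each frame's shift. Real
pipelines (e.g. Google's HDR+) use more robust tile-based alignment and
merging; this is only the basic principle, and the function name is mine.

    import numpy as np
    import cv2

    def stack_frames(frames):
        """frames: list of linear (non-gamma-corrected) float32 grayscale images."""
        ref = frames[0]
        acc = ref.copy()
        for frame in frames[1:]:
            # Estimate this frame's (dx, dy) translation relative to the reference.
            (dx, dy), _response = cv2.phaseCorrelate(ref, frame)
            # Shift it back into alignment before summing.
            m = np.float32([[1, 0, -dx], [0, 1, -dy]])
            aligned = cv2.warpAffine(frame, m, (frame.shape[1], frame.shape[0]))
            acc += aligned
        return acc / len(frames)  # average; tone-map/gamma-correct afterwards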

Another advantage is that you can effectively do "HDR": suppose you're taking
a panorama with the Milky Way in the sky and some city underneath it. With a
long exposure, the lights of the city would saturate completely. With shorter
exposures you can correct that in post by scaling the intensity of the city
lights as you add pixels (or by summing fewer pictures for those areas). This
way you can effectively have several levels of exposure in the same shot, and
you can tweak all of that in post. In the city/Milky Way example you'll also
need to compensate for the motion in the sky but obviously not on land, which
is also something you can't really do "live".
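
A toy illustration of that selective summing (my own sketch, not Google's
method): weight each pixel's contribution by whether it's below saturation, so
bright city lights stop accumulating while dark sky regions keep gathering
signal.

    import numpy as np

    def merge_weighted(frames, saturation=0.95):
        """frames: list of linear float images scaled to [0, 1]."""
        acc = np.zeros_like(frames[0])
        weight = np.zeros_like(frames[0])
        for frame in frames:
            w = (frame < saturation).astype(frame.dtype)  # skip clipped pixels
            acc += w * frame
            weight += w
        return acc / np.maximum(weight, 1)  # avoid division by zero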

I have a strong suspicion that this is basically what the software is doing:
take a bunch of pictures, do edge/object detection to realign everything
(probably also using the phone's IMU data), fit the result onto some sort of
gamma curve to figure out the correct exposure, then add color correction
based on a model of the sensor's performance under low light (since I'm sure
that by default, under these conditions, the sensor will start breaking down
and favor some colors over others). Then maybe go through a subtle
edge-enhancing filter to sharpen things a bit more and remove any leftover
blurriness.

If I'm right then it's definitely a lot of very clever software but it's not
like it's really "making up" anything.

~~~
usrusr
> which is obviously not exactly something AI can do reliably today,
> especially on phone hardware

But do we know that it runs on phone hardware? If voice interfaces have taught
us anything, it's that we can't ever make that assumption again.

The amount of data you'd have to send to run this off-board would be enormous,
but hey, anything for a jaw-dropping hype feature, right? It just works; those
preview pictures literally made me check the price and size of the Pixel 3,
and I haven't been interested in anything but a Sony Compact since the Z1
Compact came out.

------
londons_explore
I would like super sensitive cameras like this to be used inside fridges to
see the very faint glow of food going off.

Chemical reactions by bacteria breaking down food produce light, enough for
humans to see in only the darkest of places (if you live in a city, you won't
ever encounter dark enough situations).

A camera simulating a 1 hour exposure time in a closed refrigerator ought to
be able to see it pretty easily.

~~~
yorwba
Reading between the lines of [1], bioluminescence is used to detect
bacteria/fungi/other organisms in food, but only by adding luciferase to a
sample and measuring the light emitted when luciferase reacts with ATP.
Because living organisms contain ATP, the ATP content can be used as a proxy
for contamination by microorganisms.

But I didn't find anything on bioluminescence occurring naturally in the kinds
of bacteria you'd want to be warned about. Have you ever personally seen
glowing food?

[1] [http://cdn.intechopen.com/pdfs/27440/InTech-Use_of_atp_biolu...](http://cdn.intechopen.com/pdfs/27440/InTech-Use_of_atp_bioluminescence_for_rapid_detection_and_enumeration_of_contaminants_the_milliflex_rapid_microbiology_detection_and_enumeration_system.pdf)

~~~
somebodynew
Rotting meat definitely glows. I've personally seen decaying mammal carcasses
glowing in the woods at night (always either green like foxfire or an odd
almost monochromatic cyan that must really stimulate the eye's rod cells).

It's very faint and would be difficult to notice without trees to shield it
from moonlight. A camera could pick it up with a long exposure.

~~~
nkurz
Thanks, I hadn't known about this effect. Apparently it's been known for a
long time. In the 1600's, Robert Boyle claimed to have been able to read an
issue of The Philosophical Transactions by the light of a rotting Neck of
Veal: [https://blogs.royalsociety.org/publishing/boyle-and-biolumin...](https://blogs.royalsociety.org/publishing/boyle-and-bioluminescence/)

------
fuddle
This reminds me of a similar project: "Learning to See in the Dark". They used
a fully-convolutional network trained on short-exposure night-time images and
corresponding long-exposure reference images. Their results look quite similar
to the Pixel photos.

[http://cchen156.web.engr.illinois.edu/SID.html](http://cchen156.web.engr.illinois.edu/SID.html)

------
londons_explore
It's notable that this 'accumulation' method effectively lets you have a
near-infinite exposure time, as long as objects in the video frame are
trackable (i.e. there is sufficient light in each frame to see at least
_something_).

I'd be interested to see how night mode performs when objects in the frame are
moving (it should work fine, since it will track the object), or changing (for
example, turning pages of a book - I wouldn't expect it to work in that case).

------
dannyw
Damn. I’ve had an iPhone since the 3G, but this is really tempting me to get a
pixel.

------
whoisjuan
Damn! That's honestly impressive! I started reading thinking it was going to
be a simple brightness-up kind of thing, but it's incredible how they are able
to recreate the whole photograph from an initially dark raw input.

I imagine the sensor is doing an extra but imperceptible long exposure that is
then used to correct the lighting of the dark version.

------
tjr225
This might be a weird criticism, but... making photos taken in the dark look
like they are not actually dark seems kind of like a weird thing to do? I've
struggled with my micro 4/3 camera to capture accurate night photographs, but
the last thing I wanted was for them to be brighter than I perceived them to
be.

That said, the effect of some of these photographs is striking, and I'm sure
the tech is interesting.

~~~
Waterluvian
I hear what you're saying. But I almost always capture photos for data, not
art, and would really benefit from this. As long as it is a setting, we both
win.

------
pavel_lishin
Prepare for a lot of cute sleeping baby photos on your feeds, folks. That's
what I'll be using this for.

------
woolvalley
Now if we could only get this on APS-C & 1" compacts like the Sony RX100 or
Fujifilm XF10, with first-class smartphone integration and networking.

~~~
bscphil
My thoughts exactly. If a little IS work and some great noise reduction like
they've shown here can look this good, can you even imagine what an APS-C or
full frame could pull off?

------
lostmsu
Any hardware reason for it to only work on Pixel phones?

~~~
shittyadmin
Supports a better Camera API - some people have cracked it to work on
non-Pixel devices already though.

See: [https://www.celsoazevedo.com/files/android/google-camera/](https://www.celsoazevedo.com/files/android/google-camera/)

~~~
bscphil
Can you say more precisely what version you would need to get the new night
stuff? I tried installing the latest "recommended" version on my Nexus 5X,
but it refused to even install.

------
Erwin
The Huawei P20 shipped in April with this feature -- I look forward to
DxOMark's analysis of the Pixel 3 compared to the P20, which currently
remains on top: [https://www.dxomark.com/category/mobile-reviews/](https://www.dxomark.com/category/mobile-reviews/)

Upgrading from a 3-year-old Samsung S6, where I could almost watch the battery
percentage drop off percent by percent, the P20 Pro's 4000 mAh battery has
been great (too bad wireless charging didn't appear until the new Mate 20
Pro).

~~~
shaklee3
The Huawei does _not_ have this feature. Look at the review in your own link
and you will see that its low-light images are comparable to the Pixel or
iPhone. If you compare Night Sight to those, it's completely different.

~~~
endorphone
[https://www.youtube.com/watch?v=wBKHnKkNSyw](https://www.youtube.com/watch?v=wBKHnKkNSyw)

Except the Huawei does, and in actual same-setting comparisons the results are
better than the Pixel's.

~~~
shaklee3
You're confusing sensor size with the algorithm again. The Huawei sensor is
twice as large as the Pixel 2's, which is why, with night mode off on both
cameras, the P20 shows a vast improvement. It's also why turning night mode on
is not as great a leap on the P20 as it is on the Pixel.

~~~
endorphone
Incredible.

Did you actually watch the video?

The P20 Pro does exactly what this new night mode does, albeit months earlier.
It does image stacking (which is not a new approach). In direct comparison
testing -- in that video -- it yields better results.

This whole discussion has just been bizarre.

~~~
shaklee3
There's really no point in continuing this since I don't think you understand
what the size of the camera sensor does in low light. I was simply pointing
out, over and over, that it is not the same feature Huawei had. Just as, if a
$50 point-and-shoot with extremely complex software produced the same image
quality as a $5000 Nikon, you couldn't dismiss it by saying "this technology
exists". Huawei has better camera hardware in almost every dimension you can
imagine, and more than half the pictures in the video you sent are worse on
the Huawei despite those advantages. I did not say that Huawei can't take any
pictures that are better than the Pixel's. I said it's not the same
technology, and dismissing this as "it's already been done" is ludicrous. If
the Pixel 2 had the same hardware as the P20, the results would be even more
impressive.

~~~
endorphone
Dear god. Incredible.

You are completely and utterly wrong. Yet you continue. Amazing.

And you're not the first to distract with claims that it is someone else's
ignorance.

The P20 Pro does image stacking. _Period_. That is exactly what this new Pixel
mode does. In actual results the P20 Pro is better.

~~~
bscphil
> That is exactly what this new Pixel mode does. In actual results the P20 Pro
> is better.

I'm not the person you've been talking to, but I don't think I'd agree with
that statement. To take the video you linked earlier as an example, the Pixel
frequently gives better results. For example, this one shot:
[https://youtu.be/wBKHnKkNSyw?t=227](https://youtu.be/wBKHnKkNSyw?t=227)

Note that the Pixel 2 has a much smaller sensor and the exposure time on the
P20 is _18 times longer_, and yet the Pixel generates a much sharper image.
You're correct that the P20 is using some very advanced image stabilization to
get results that good from 6 seconds of data, but the Pixel seems to clearly
offer more advanced software.

~~~
endorphone
The example shot you gave is clearly better on the P20. Also, the P20 has a
slightly larger sensor.

Further, you're reading entirely too much into the exposure times. They are
artificial, and either camera's software can choose to put whatever number it
wants in there: the aggregate time, the average time, the theoretical
equivalent time, etc. It is _not_ the actual time.

~~~
shaklee3
No, it's not better. The P20 is over-exposed, causing the colors to be off.
The level of detail is equivalent, but the colors on the P20 are worse. And
no, it's not _slightly_ larger. The sensor is TWICE as large in area as the
Pixel 2's.

I took actual photos with the Pixel 2, and I know the actual time taken. It's
less than 3 seconds every time. By all accounts and reviews I've seen, the P20
takes 10-25 seconds. Show me a review saying otherwise.

------
gingerbread-man
Kind of a tangent, but it was really cool to see a picture of the author's
Schiit Jotunheim headphone amp in the article. One of the founders wrote an
_amazing_ book on building a hardware startup:
[http://lucasbosch.de/schiit/jason-stoddard-shiit-happened-ta...](http://lucasbosch.de/schiit/jason-stoddard-shiit-happened-tablet-lblb.pdf).

------
bwang29
The biggest technical challenge here is getting gyroscope data to work
together with the stacking algorithm. It's hard to tune the gyro to work well
on any given phone, and a pure software solution that estimates the
perspective transformation between frames would be too slow.
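
For reference, the gyro-to-warp step is usually modeled as a pure-rotation
homography: for a rotation R integrated from gyro samples between two frames
and camera intrinsics K, the image-to-image mapping is H = K * R * inv(K).
A minimal sketch (names and constants are illustrative, not any phone's actual
code):

    import numpy as np
    import cv2

    def warp_from_gyro(frame, rotation, focal_px, center):
        """rotation: 3x3 rotation matrix between the two frames' camera poses."""
        cx, cy = center
        k = np.array([[focal_px, 0.0, cx],
                      [0.0, focal_px, cy],
                      [0.0, 0.0, 1.0]])
        h = k @ rotation @ np.linalg.inv(k)  # pure-rotation homography
        return cv2.warpPerspective(frame, h,
                                   (frame.shape[1], frame.shape[0]))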

~~~
kartickv
Is it hard to tune the gyro because its accuracy is too low, or because it's
accurate to begin with but degrades over the course of the long exposure?

Is a pure software solution even reliable enough under these conditions?
Slowness can be worked around by doing it in the background, and you get a
notification when it's complete. Some people would be okay if that's the only
way to get photos they wouldn't otherwise be able to get, short of buying an
SLR.

------
jakobegger
All those shots look amazing, but they're of stationary objects.

I really want to know how that works for people! 99% of photos I take are of
people, and the lighting is always bad.

Are there any photos of people?

~~~
joshschreuder
There's a photo of the author in the article.

------
polskibus
How does Pixel implementation of low light photography differ from Samsung?
Are they comparable in photo quality?

------
Kagerjay
Wouldn't video still be extremely blurry? This is mostly for still pictures of
things that aren't moving.

I wonder if this technology will eventually supersede military night-vision
goggles. The ability to add color perception at long distances could be useful
for identifying things at night.

~~~
celeritascelery
You are right that this would be useless in its current state for video. But
if it ever got smart enough to start stacking images onto a 3D model of the
world, you could do some incredible stuff.

------
golfer
Why not use the actual article title?

"Google’s Night Sight for Pixel phones will amaze you"

~~~
gpvos
"will amaze you" is an unnecessary clickbaity addition. They could have left
Google in though.

------
hammock
How are you going to do a review of Night Sight and not even go outside? Every
photo just taken in a room with the lights turned off. Come on, man. Tell your
editor he needs to wait until nightfall.

~~~
Yhippa
Huh? There are a bunch in the second half of the article.

~~~
hammock
Ah. An update was posted literally in the last ten minutes. This article came
out yesterday (when I first saw it).

~~~
shaklee3
I loaded this on a Pixel 2. The results are absolutely stunning. It does take
~4 seconds to take a picture, though.

------
yanonymous2
That's great, but we should find a different name for "photos" that change
image information in the process.

------
endorphone
Interesting, but a tad rich with puffery.

Pre-OIS, Google did this with image stacking, which was a ghetto version of a
long exposure (stacking many short-exposure photos and correcting the offsets
via the gyro was necessary to compensate for inevitable camera shake). There
is nothing new or novel about image stacking or long exposures.

What are they doing here? Most likely it's simply enabling OIS and allowing
longer exposures than normal (note the smooth motion blur of moving objects,
which is nothing more than a long exposure), and then doing noise removal.
There are zero camera makers who are flipping their desks over this. It is
usually a hidden "pro" feature because in the real world subjects move during
long exposures and shooters are just unhappy with the result.

The contrived hype around the Pixel's "computational photography" (which seems
more incredible in theory than in the actual world) has reached an absurd
level, and the astroturfing is just as bad.

~~~
mediaman
I can see your point: this basically just looks like long exposures, or
stacked exposures, which is basically the same thing as letting more photons
hit the sensor, aligned using OIS.

Any thoughts on why Apple, as the other leading phone maker with a heavy
emphasis on camera quality, has not implemented anything like it? Not to
discount the difficulty, but OIS-aligned long exposures kind of seem like
low-hanging fruit. Instead, they keep trying to open the aperture more.

~~~
modzu
Well, to be fair, Google hasn't implemented it yet either. There could be
significant reasons for that, like the results being worse than the default
except in specific situations.

