
The iPhone 11 Camera, Part 1: A Completely New Camera - tomduncalf
https://blog.halide.cam/inside-the-iphone-11-camera-part-1-a-completely-new-camera-28ea5d091071
======
londons_explore
I want to see a _real_ teardown of these cameras...

Show me the lenses. Put the coatings under an electron microscope. Cut the
sensor in half, and measure the depth of the electron wells.

Then on the electronics side, hook up a logic analyzer. Find the speed of
those ADC's. Dump the default register values.

And on the software side, feed in manufactured frames to find out exactly
_how_ the HDR algorithm works. Use high speed strobe lights to find out the
exact shutter timings. Figure out if it uses the gyroscope or optical flow for
frame stacking, and how many frames does it even use? Publish the bit depths
and colour spaces of intermediate image pipeline stages.
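Some of that black-box probing is easy to sketch. Since averaging N independent frames cuts noise by roughly √N, you can feed a pipeline synthetic flat frames and estimate the stack depth from the noise ratio. A toy sketch (all numbers made up; a real pipeline's merge is far more complex than a mean):

```python
import random, statistics

def make_frame(width, height, signal=128.0, noise_sigma=8.0):
    """A synthetic flat-grey frame with Gaussian read noise."""
    return [[signal + random.gauss(0, noise_sigma) for _ in range(width)]
            for _ in range(height)]

def stack(frames):
    """Naive frame stacking: per-pixel mean across frames."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / len(frames) for x in range(w)]
            for y in range(h)]

def frame_noise(frame):
    """Std-dev of pixel values; the frame is flat, so this is pure noise."""
    return statistics.pstdev(p for row in frame for p in row)

random.seed(42)
single = make_frame(64, 64)
stacked = stack([make_frame(64, 64) for _ in range(9)])

# Averaging N frames divides noise by sqrt(N), so the number of frames a
# black-box pipeline merged can be estimated from the noise ratio:
estimated_n = (frame_noise(single) / frame_noise(stacked)) ** 2
print(round(estimated_n))  # close to 9 for this synthetic input
```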

These 'technical' reports which really boil down to "we took some photos and
they're quite crisp" really disappoint me.

~~~
OnlineGladiator
If you actually did a tear down like you just mentioned and documented it
accordingly, it's hard to imagine you wouldn't recoup the cost of the phone
several times over just by uploading the video to YouTube.

~~~
justinator
I think if you blended the phone in a blender you could recoup the cost of
the phone on YouTube. And that's probably the rub.

------
gdubs
This was a really thought-out, in-depth article. I really appreciated the
artistic considerations.

The camera has usually been my primary motivation for upgrading, and I'm
tempted. I'm still sporting a 7+ with broken lenses, which has meant mostly
terrible cell-phone pics that occasionally, due to a happy accident, look like
a "Holga" picture.

The upside is that I went back to really actively using my DSLR, and I'm not
sure anything really compares still. However, this article was pretty
compelling.

I think the title is accurate – these pictures no longer look like 'iPhone'
pictures, which was a kind of brand identity more than a technical constraint.
This feels like a refreshing shift. That said, they have a look. They remind
me of Andreas Gursky photos – hyper realistic, super detailed, somewhat matte.

------
chooseaname
I wonder how much of the image processing Apple could backport to iPhone 8/X,
XS, XR if they wanted to? Those phones are extremely powerful still. And Apple
has touted the power of their neural engine ever since the 8/X.

~~~
spike021
They won't, though. How would they upsell customers if they provided them with
the same features that newer phones have?

I wish they would, but that doesn't fit Apple's typical MO.

~~~
coldtea
Err, Apple is far better than typical at supporting older phones with new OS
versions -- and getting them updated to the latest version fast.

Android vendors had (have?) a horrible history on this...

~~~
spike021
In terms of stability usually, but not tentpole features. That's quite
alright, I'm not even complaining. Just saying it would be nice to get some
newer features but that it isn't how Apple normally operates. That's just
fine.

------
ruminasean
I’m a professional photographer, for whatever that’s worth. Mostly it’s
relevant because I have a camera in my hands all day for most of the week. I’m
not a gadget guy, I don’t need the latest and greatest, but when I pick up new
gear I want it to do the thing and do it well and to not need to be replaced
within a few years because it’s completely obsolete.

I picked up an iPhone 11 Pro last week, upgrading from a 7. The camera really
is great. What I keep thinking is that I'm finding it _creepy._ I'm not a
luddite by any means, I'm all for innovation… I shoot with a Sony mirrorless
camera as my everyday main, and that felt to me like a massive change.

I think what gets to me is I don’t know exactly what the iPhone camera is
doing. When I press the “shutter button,” what exactly is happening? How many
shots is it taking? Is it shooting on all three lenses? Is it recording audio
even when I have Live Photo turned off? If I think I’m intentionally leaving
something I don’t want to record out of the frame, how can I be sure that one
of the other lenses isn’t also recording and including it? I just don’t feel
like I’m sure about any of it anymore, with all the processing I hear about it
doing, layering multiple shots together, shooting outside the frame. The tech
is cool and all and I’m not prone to paranoia, but I just keep wondering.

~~~
FreezerburnV
If it makes good photos, and the extra information isn’t being harvested to
track you like Facebook would do, what does it matter what it’s actually doing
under the hood? It probably is doing a lot of that because it can condense all
that extra information into a picture that looks better to humans. But the
tech of how it reaches that really shouldn’t matter or be creepy, it’s just
processing a bunch of pixels to make something look nice. Just like your
mirrorless camera, but with more frames put together to make the final result.

~~~
jobigoud
For example, if the extra data is retained to improve post-processing
capabilities (say, refocusing or a slight change in perspective), then you
could inadvertently share files with more information than you think they
contain. Remember the story of people cropping out their naked bottom half,
only to have it remain in plain view in the EXIF thumbnail.

~~~
ruminasean
This is definitely one example of my thinking.

------
Havoc
The camera is the one thing that has me on the fence about getting a new one.

Everything else...could probably rock my 7+ for another year or two. Maybe do
a battery replacement.

~~~
cheschire
I was in the market for a new DSLR, but their software just hasn’t kept up.

The iPhone 11 Pro blew me away with the night-mode photography, and the
simulated bokeh that can be modified in post because of the depth camera is
downright genius. And the most important feature: it's always with me, unlike
the DSLR.
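The refocus-in-post trick rests on a simple idea: blur each pixel by an amount that grows with its depth-map distance from the chosen focal plane. A toy 1-D sketch of that idea (not Apple's actual algorithm; the function name and numbers are illustrative):

```python
def synthetic_bokeh(pixels, depths, focus_depth, strength=2.0):
    """Depth-driven blur: each output pixel is a box average whose radius
    grows with that pixel's distance from the focal plane. This is the
    general idea behind adjustable portrait-mode blur, nothing more."""
    out = []
    for i, d in enumerate(depths):
        radius = int(abs(d - focus_depth) * strength)
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

# A sharp edge at depth 5 stays sharp when focused there...
pixels = [0, 0, 0, 100, 100, 100]
depths = [5, 5, 5, 5, 5, 5]
print(synthetic_bokeh(pixels, depths, focus_depth=5))  # unchanged
# ...and smears out when we "refocus" in post to depth 1:
print(synthetic_bokeh(pixels, depths, focus_depth=1))  # blurred edge
```

Because the blur is computed from a stored depth map rather than by the lens, the focal plane can be changed after the shot, which is exactly what "modified in post" means here.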

I’ve been using my wife’s hand-me-down phones for a decade now. This was the
first phone I bought for myself, and really it was a camera that has a phone
feature, not the other way around.

~~~
heisenbergs
Same. I actually bought the latest-gen Sony RX100 VII, and the difference in
quality is negligible, other than the Sony having 20 MP and the iPhone only 12.
This means the tiny iPhone 11 Pro sensor can produce similar results to a
top-of-the-line 1-inch compact camera. This is mind-boggling.

After comparing them, I'm convinced that it's only a matter of time until
phones are going to catch up to full frame cameras. Software these days is
more important than the raw hardware it seems...

~~~
stickydink
Do you think, or, is there any hint that, camera manufacturers are developing
this kind of thing?

What could a DSLR with iPhone-level software achieve?

~~~
rodgerd
Olympus and Sony have been the most aggressive about sticking clever software
in their cameras, so they'd be the ones most likely to invest more in
computational photography.

The problems for camera manufacturers around this, though, are:

1. Their market is incredibly, loudly regressive about a lot of this stuff. A
noisy chunk of the photography market is really hostile to workflows that
don't mimic hundred-year-old darkroom processes. Doing it in-camera is an
abomination. Automation is an abomination, etc.

2. Building the compute into the camera is non-trivial. You've already got a
_lot_ of compute power focused on running the complexities of things like
continuous AF tracking (e.g. on a top-end Olympus that's 4 of the 8 cores
available). A lot of compute budget is used for things phone cameras are
rubbish at. At some point things like heat become a problem.

3. Data volumes are hard: Olympus are "only" dealing with a 20 MPix image;
Sony are dealing with 24-60 MPix images. Olympus do that at up to 60
frames/s (20 with AF), and have to read the data within the constraints of
shutter speeds of 1/16000 s. That is... a _lot_ of data compared to the
relatively modest rates on an iPhone. Oh, and while iPhone users are generally
OK with a bit of lag, DSLR users get really pissy if you make them wait.
Latency needs to be very low.

4. Physics is hard: a Sony sensor is a 24 x 35 mm hunk of silicon. Quite
apart from the challenges of the data volume, reading a chunk of CMOS that
large has been a challenge. Sony have done a lot of clever things to work
around those limits, but still... (Olympus have been able to do high frame
rates longer than anyone else in part because their sensors are smaller.)
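The data-volume point (3) is easy to put in rough numbers, assuming a 12-bit raw readout (an assumption here; actual readout depths vary by sensor and mode):

```python
def sensor_rate_gbps(megapixels, bits_per_pixel, fps):
    """Raw sensor readout rate in gigabits per second."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e9

# Olympus-style burst: 20 MPix at 60 frames/s, 12-bit readout (assumed)
olympus = sensor_rate_gbps(20, 12, 60)
# iPhone-style capture: 12 MPix at 30 frames/s, 12-bit readout (assumed)
iphone = sensor_rate_gbps(12, 12, 30)

print(f"{olympus:.1f} Gbit/s vs {iphone:.1f} Gbit/s")  # 14.4 Gbit/s vs 4.3 Gbit/s
```

Even with these rough assumptions, the camera is moving several times the data per second, and has to do so at far lower latency.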

~~~
rodgerd
(I should add to that: pre-shutter capture already exists on Olympus; they
shipped it when they released the E-M1 II a couple of years ago. But DSLR
makers are generally more focused on capture-time optimisations like that,
eye focus, and Sony's facial recognition in some of their cameras, which lets
you register a specific person's face and prioritise autofocus on them.)

------
josephjrobison
That’s a great way to sell an app: three-quarters of the way down, I shelled
out $7 for their two apps.

~~~
LeoPanthera
There are so many third-party camera apps that it is genuinely difficult to
tell whether any of them actually add value or do things that the built-in app
cannot, beyond the ability to apply "Instagram"-style filters.

~~~
ayoisaiah
They do add value. You cannot, for example, shoot RAW or manually control the
camera with the built-in app.

~~~
LeoPanthera
Sure. I get that part. The tricky part is _which_ of the many many apps do
that _well_.

------
notadoc
My iPhone 11 Pro pictures look like they've been put through an artistic
painting filter when you zoom in on them; it looks strange and smudged.

~~~
acchow
Can you show some zoomed-in examples?

~~~
cthalupa
https://spectre.cam/full-res/Full%20Size%20Comparison%20for%20Semantic%20Stuff.jpg

This one from the article is a pretty good example. It looks like a
photorealistic painting when you zoom in, and not just on the sky, like they
point out in the article - the whole thing does.

I'm really impressed with the 11 Pro for stuff that gets viewed on the same
medium it was created on, but the trade-offs are very apparent when you look
at things on a computer monitor.

~~~
coder543
Those photos aren't straight out of the iPhone's image processing pipeline.

From the article:

> If we distort the contrast in an image, we can bring out the ‘watercolor
> artifacts’ in the clouds

I think you missed that sentence. You posted the "distort the contrast"
examples, which were hand edited to exaggerate any artifacts in the photo the
iPhone had output.

This is the photo that the iPhone gave the photographer:
https://miro.medium.com/max/1800/1*oZgBZGHyxwMevHKiGfATiQ.jpeg

~~~
cthalupa
> I think you missed that sentence. You posted the "distort the contrast"
> examples, which were hand edited to exaggerate any artifacts in the photo
> the iPhone had output.

I think you missed part of the subtitle. The one on the left has had contrast
adjusted, the one on the right has not and matches the one you posted. The
watercolor artifacts are just as visible on your link as they are on the right
half of the picture I posted.

I find them incredibly noticeable, even on your link.

------
ballenf
> For a while now, you haven’t been the one taking your photos. That’s not a
> slight at you, dear reader: When your finger touches the shutter button, to
> reduce perceived latency, the iPhone grabs a photo it has already taken
> before you even touched the screen.

I wonder if there’s a line where copyright could attach to the developer who
wrote the code instead of the thumb owner.
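The mechanism the quoted passage describes is commonly called zero shutter lag: the camera continuously fills a small ring buffer with frames and, on a tap, returns one captured just before the touch. A minimal sketch (buffer depth and names are illustrative, not Apple's actual implementation):

```python
from collections import deque
import time

class ZeroShutterLag:
    """Continuously buffer recent frames; a shutter tap returns a frame
    captured just before the tap, hiding touch-to-capture latency."""

    def __init__(self, depth=8):
        self.buffer = deque(maxlen=depth)  # oldest frames fall off the end

    def on_new_frame(self, frame):
        self.buffer.append((time.monotonic(), frame))

    def on_shutter_tap(self):
        tap_time = time.monotonic()
        # Newest frame at or before the tap -- i.e. a photo "already taken
        # before you even touched the screen".
        for ts, frame in reversed(self.buffer):
            if ts <= tap_time:
                return frame
        return None

cam = ZeroShutterLag()
for i in range(12):
    cam.on_new_frame(f"frame-{i}")
print(cam.on_shutter_tap())  # frame-11, the newest pre-tap frame
```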

~~~
mc32
Nah, there was a guy (a photographer) who set up a camera system that was
triggered by passers-by. There was a lawsuit[1] over whether it was legal and
who owned the photos (one of the subjects happened to be very orthodox, and I
guess photos are forbidden). Anyhow, New York decided the work belonged to the
photographer who set up the auto-trigger system; the suit was dismissed
because the work was artistic expression by the photographer.

There was also a photographer who was in Britain for some show. He wanted to
do some photography but wasn’t allowed as he was only there as a tourist and
didn’t have a work permit, so he handed his camera to his daughter. He then
set up an exhibit with her work. I’m not sure who “owns the rights”, but he
assigned them to her so as not to run afoul of his visa.

[1] https://en.m.wikipedia.org/wiki/Nussenzweig_v._DiCorcia#Lawsuit

~~~
lotsofpulp
And there’s this case about a nature photographer:

https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute

------
willis936
Now that multiple sensors are standard, I wish sensor fusion would get some
attention. It isn’t clear to me that there are not significant gains from
capturing from three camera sensors at once. The big tradeoff in image fusion
using only one sensor is loss of temporal precision. You could gain some of
that back while maintaining the gains in spatial accuracy if data from
multiple sensors is used. Apple’s camera app at least demonstrates that
they’re capable of getting multiple camera video streams into main memory
simultaneously by showing the out-of-frame view. Maybe that’s the
preview/lead-in for AppleTech Pro sensor fusion TM 2020.

~~~
OnlineGladiator
> Now that multiple sensors are standard, I wish sensor fusion would get some
> attention.

Sensor fusion has been a hot area of research in robotics for at least a
decade. Why do you feel it gets no attention?

~~~
willis936
Because it doesn’t (in the consumer space). Everyone who needs to use Kalman
filtering and sensor fusion does, but when it isn’t necessary it isn’t used,
even if it could result in better products.
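For what it's worth, the heart of Kalman-style fusion is just inverse-variance weighting: combine two noisy estimates of the same quantity, trusting the less noisy one more. A one-dimensional sketch with made-up numbers:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Fuse two noisy estimates of the same quantity by inverse-variance
    weighting (the measurement-update step of a 1-D Kalman filter)."""
    k = var_a / (var_a + var_b)        # gain: how much to trust sensor B
    est = est_a + k * (est_b - est_a)  # fused estimate
    var = (1 - k) * var_a              # fused variance is always smaller
    return est, var

# e.g. two camera sensors disagree about a scene luminance; the second
# is four times less noisy, so the result leans toward it:
est, var = fuse(100.0, 4.0, 104.0, 1.0)
print(round(est, 2), round(var, 2))  # 103.2 0.8
```

The same update applied per pixel or per feature, across the wide, telephoto, and ultra-wide streams, is the kind of multi-sensor fusion the parent comment is wishing for.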

------
rolltiide
The biggest thing to me seems to be the user experience of applying these
'smart' technologies.

For example, Photoshop's Liquify feature can now directly isolate eyes, noses,
lips, cheekbones, facial structures, etc. without you needing to be good at
using the Liquify brushes.

But iOS is now doing this kind of thing automatically, and integrates the
editing right there, either in the photo by default or in an integrated
editing app.

------
ThouYS
Fantastic article; loved the obvious effort put into it!

------
suyash
Great review of the new iPhone. However, as a normal camera-phone user, I
really don't care about the finer details in pictures. Any modern smartphone
camera is pretty good for me.

------
markdown
OT but I looked at the app they sell (Halide Camera) and as far as I can tell,
it's an app for iPhone 11 and iPhone 11 Pro. Is there a similar app for older
phones?

~~~
sandofsky
Technically we support all the way back to the iPhone 5S, but those devices
don't support RAW, so you're really using it for its interface and access to
manual controls. I mean, I'm biased, but I'd say that's compelling even
without RAW.

As far as technical features, things get really interesting with an iPhone 6S
and later. That's when Apple added RAW support, which requires a third-party
app. On depth capable devices (7 Plus, 8 Plus, X/XS/11), we have detailed
depth visualizations.

Anyways, while we're generally forward-looking, we'd be crazy not to serve the
many folks out there running older phones.

~~~
markdown
Oh ok, thanks. The very first thing you see in the App Store listing is "The
best camera app for iPhone 11 and iPhone 11 Pro", hence my assumption.

------
tiffanyh
Can law enforcement convict someone based on a computational photograph?

Google and Apple digitally alter photos so heavily to “enhance” them that it
could be argued a photo entered as evidence of a crime has been _tampered_
with by computational photography.

~~~
adrr
Wouldn’t every digital camera be guilty of that? White balance, lens
correction profiles, noise reduction, compression.

~~~
bufferoverflow
Not if you shoot raw.

Plus none of these modifications really alter the content of the photo.

