
iPhone SE: The One-Eyed King? - gok
https://blog.halide.cam/iphone-se-the-one-eyed-king-96713d65a3b1
======
leejo
The article alludes to something I've noticed since upgrading from the 5c to
an 11 pro last year: the computational aspect and/or current hardware
limitations mean these phones can shoot images that look great _on the phone_.

I mean, you scroll past images on Instagram, look at your own photos on the
phone you took them with, look at your images on your Instagram feed on other
people's phones or on the Instagram web app, and yeah they look pretty good.

Get them on something bigger though? Look a little closer? Use a decent
resolution monitor? A reasonably sized print? They look bad. I'm not even
talking about pixel peeping here, which is bullshit. I mean they look
overprocessed. Like the machine took a course in photoshop on YouTube and
didn't finish it.

The examples in the article show this: they're full of little mistakes that
don't happen when you have proper hardware - larger aperture lenses on larger
sensors. Proper graduated neutral density filters to prevent halos. Sensors
with large enough area/photosites to give better dynamic range or better high
ISO performance. Real camera movements to tilt/shift/rise/fall/swing.

I mean, most of this is boring technicality stuff that doesn't really matter
in the grand scheme of "is this an interesting photo" type considerations.
It's more a case of computational photography is currently trying to climb out
of its uncanny valley. It'll get there eventually.

For now we're still at the point of "Eventually, anyone that values photos
will carry a standalone camera." I say this as someone who shoots mostly with
an iPhone, but still carries around a 4x5 camera when it matters.

Edit: to expand on this a bit I've just uploaded some images I shot with the
11 pro a couple of weeks after I bought it back in October:
[https://leejo.github.io/2020/04/28/off_season_ibiza/](https://leejo.github.io/2020/04/28/off_season_ibiza/)
the caveat is that I was still figuring the phone/camera out but wanted to
see "what it could do out of the box" - and yeah, the images look good on the
screen with very minor lens corrections and levels in Lightroom, but look
overprocessed even with my attempts to pull that back: the sky in the photos
looks added in, there are halos around parts of the images where HDR is trying
and failing, the sharpening is too aggressive (real life doesn't have "sharp"
edges), and so on. When you view these at larger sizes than seen in the blog
post they look much worse.

~~~
fxtentacle
Just like Instagram has developed its own aesthetic, so has "shot on an
iPhone". I fully agree with you, those images have the typical "surface blur"
look that all tiny lens + backlit camera combinations produce.

I'd say it's even worse with video. My phone can make 4K videos, but watching
them on a 4K TV is painful. They'd probably look better if you downscaled
them to 1080p and then re-upscaled them to 4K. They'll be a bit blurrier,
yes, but they won't have this artificial flat look everywhere.

Memorizing high-frequency patterns and then reproducing them is one of the
very few tasks that AI is actually good at. As such, one can use the raw HDR
exposure data from a phone and the noisy raw image data from the sensor and
produce natural-looking high-res images from it. The way that it works is
that the AI has memorized the small-scale details from millions of good photos
taken with large lenses and proper equipment, and that way it can replace the
noisy phone sensor data with its noise-free counterpart.

However, such an AI currently needs 10GB of GPU RAM and it runs for 2 seconds
on my 1080 TI for a 4K image. So we'll need another 10x in mobile GPU
performance before it becomes feasible to integrate such technology into
camera apps. And then of course another 20x for live video processing.

~~~
dharma1
> However, such an AI currently needs 10GB of GPU RAM and it runs for 2
> seconds on my 1080 TI for a 4K image.

Which process is this? Got a GitHub link? Are you talking about AI up-resing?
I haven't found that it improves noise.

~~~
fxtentacle
Sadly, I don't have a publicly available link for this.

But no, this is not AI upscaling.

Instead, it represents the input image in a feature space trained to
store natural images and then reconstructs the output from that intermediary
representation. If chosen well, the intermediary will store details relevant
for high quality images, but discard those specific to low quality images,
like noise.

It's called a convolutional autoencoder with bottleneck.
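
For the curious, the bottleneck idea can be sketched in a few lines of numpy.
This is a toy with a fixed 4x4 averaging filter standing in for learned
encoder/decoder weights (the real network trains both on millions of photos),
but it shows why a narrow intermediary keeps coarse structure while discarding
fine-grained noise:

```python
import numpy as np

def conv2d(img, kernel, stride=1):
    """Valid-mode 2D convolution for a single-channel image."""
    kh, kw = kernel.shape
    h = (img.shape[0] - kh) // stride + 1
    w = (img.shape[1] - kw) // stride + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = img[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def autoencoder_forward(img):
    """Toy forward pass: downsample to a bottleneck, then upsample back.
    The bottleneck holds 16x fewer values than the input, so per-pixel
    noise cannot survive the round trip -- only coarse structure does."""
    k = np.full((4, 4), 1 / 16.0)            # stand-in for a learned encoder filter
    code = conv2d(img, k, stride=4)          # bottleneck representation
    recon = np.kron(code, np.ones((4, 4)))   # stand-in for a learned decoder
    return code, recon

rng = np.random.default_rng(0)
clean = np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))  # smooth scene
noisy = clean + rng.normal(0, 0.2, clean.shape)                  # sensor noise
code, recon = autoencoder_forward(noisy)
```

The reconstruction ends up closer to the clean image than the noisy input was,
which is the whole trick: the bottleneck acts as a prior over what images
should look like.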

------
mrweasel
I don’t get it, it’s a phone, with a camera. The camera just needs to be good
enough, and for most people it’s been good enough for years.

Most phone reviews these days focus way too much on the camera, because the
reviewers are often people who are also into photography or shooting videos.

For myself, I care solely about privacy and price, so I only have one option if
I need a new smartphone. I still think the iPhone SE needs to come down in
price, but it's the only choice; there are no competitors.

~~~
gregoriol
I very much second this: the camera is like the snakes game on old Nokias,
it's very cool it's there but it's not what I need the most.

~~~
gregoriol
Also, on the iPhones it has actually made the device more annoying because of
a very simple thing: the back is not flat! This creates a very bad experience
when the phone is on a table or any surface. So the ideal phone would not be
an iPhone SE as many suggest this week, it would be an iPhone SE with a flat
back, whatever the camera is.

~~~
nonamenoslogan
I think most people use a case. Not using a case on a glass body $400-1200
device you're going to carry around in your pocket is pretty risky.

~~~
gregoriol
This is not acceptable: having to use a case to fix a "defect", are you Steve
Jobs from antennagate?...

~~~
brewdad
It has nothing to do with Steve Jobs or Apple. I've never owned an iPhone.
Every single Android phone I've owned, I used a case. Same with pretty much
everyone I know who has moved beyond the flip phone. It's just a fact of life
for pretty much everyone.

------
gfiorav
The whole portrait thing is going to plague this decade's pictures. We'll
cringe when we look back at them the same way we cringe at 80's shoulder pads.

~~~
seppin
It's the same as shooting a wide aperture, the blurred background look has
always been a thing.

~~~
gfiorav
It’s the crappiness of the effect... it’s too much! And the borders... ugh

~~~
seppin
I guess 99.9% of people don't notice.

------
thdrdt
Side note: depth of field blur is not a gaussian blur but a bokeh blur. Most
software uses gaussian blur to fake depth of field but this always looks
unnatural.

Bokeh blur looks more like a lot of circles.

If you compare the iPhone 11 blur with the SE blur it looks like the SE is
using bokeh blur which looks much more natural.

~~~
kqr
It's more subtle than this: bokeh blur can look very different depending on
lens design and configuration. It can look like toruses (tori?), various sided
polygons, stars, double-overlaid low-pass circles with resonance at the
edges, ovals shaped and rotated into a full-picture swirly effect, and so on
and so forth.

Depth of field blur is a very organic effect and in many cases part of the
signature look of lens designs. It is not just one, parameterless filter one
can apply to an image.

------
renewiltord
The portrait modes on these are getting really good. The blur is pretty
convincing looking. The only open-source software I know that does similar
stuff is body-pix which does matting, but I don't think it generates a smooth
depth map like this thing. It would be cool because then you can do a clever
background blur for your Zoom backgrounds with v4l2-loopback webcam.

By the way, I decided to also quickly summarize the usual HN threads that have
the trigger word iPhone in it:

- No headphone jack

--- Actually this is good because ecosystem built for it

----- Don't think ecosystem is good. Audio drops out

------- Doesn't happen to me. Maybe bad device.

----- Don't want to be locked in. Want to use own device.

------- That's not Apple philosophy. Don't know why surprised.

--------- I have right to my device

----------- cf. Right to Repair laws

------- Can use own device with dongle.

--------- Don't want dongle. Have to get dongle for everything. Annoying.

----------- Only need one dongle.

------------- If only audio, but now can't charge.

----------- Use dongle purse.

--- Apple quality have drop continuous. Last good Macbook was 2012.

----- Yes. Keyboard is useless now. Have fail. Recalled.

------- I have no problem with keyboard.

--------- Lucky.

------- Also touchpad have fail. Think because Foxconn.

------- Yes. Butterfly? More like butterfly effect. Press key, hurricane form
on screen.

----- Yes. Yes. All Tim Cook. Bean Counter.

----- Yes. Many root security violation these days.

------- All programmers who make security violate must be fired.

--------- Need union so not fired if manager make security violation.

----------- Don't understand why no union.

------------- Because Apple and Google have collude to not poach. See case.

------- Yes. Security violation is evidence of lack of certification in
industry.

--------- Also UIKit no longer correctly propagate event.

--- Phone too big anyway. No one make any small phone anymore.

----- See here, small phone.

------- Too old. Want new small phone. Had iPhone 8. Pinnacle of small
beauty.

------- That's Android. No support more than 2 months.

--------- Actually, support 4 months.

----------- Doesn't matter. iPhone support 24 centuries and still going.
Queen have original.

--------- Yes, and battery on Android small.

--- Will buy this phone anyway. Support small phone.

----- No. This phone is also big. No one care about small hand.

------- Realistically, phone with no SSH shell dumb. I use N900 on Maemo.

--- Who care? This press release. Just advertisement.

----- Can dang remove clickbait. What is one-eye anyway? Meaningless. Phone
no have eye.

--- Also, phone not available in Bielefeld.

--- Phone only have 128 GB? Not enough. Need 129 GB.

----- 64 GB enough for everyone.

------- "640 KB enough for everyone" - Bill Fence, 1923

~~~
nindalf
This appears like frivolous comedy, but if enough people see it, it has the
potential to change discourse slightly on this topic. Most of the people who
make these comments do so in earnest, thinking they're contributing something
valuable and original, but after seeing this they'll realise it's not as
original as they think.

~~~
searchableguy
The same rehashed arguments are repeated for tribal recruitment: making them
familiar to others and pushing the agenda on unfamiliar people so they join
your group.

I guess it feels good to be acknowledged as correct and to lead many others to
the correct path.

------
karolist
My Pixel 2 XL I'm typing this on also has just one camera and does pretty
amazing DoF for portraits; the SE is definitely not the first phone to do this
in software alone.

~~~
renewiltord
No, that device does it using the thing called DPAF
[https://ai.googleblog.com/2017/10/portrait-mode-on-pixel-2-a...](https://ai.googleblog.com/2017/10/portrait-mode-on-pixel-2-and-pixel-2-xl.html)

The iPhone XR technique described in the link does that too. And it'll usually
be better (than the monocular thing).

~~~
CarVac
The Pixel 2 uses true monocular depth estimation for the front camera.

~~~
renewiltord
Completely forgot about the front camera. Thank you.

------
Accacin
For all those bemoaning the lack of a headphone jack: if you like music, do
yourself a favour and get an older iPod and do all your podcast and music
listening on it. The sound is much better and it's nice to have a device where
I can listen to music and not be distracted or tempted to do other things.

~~~
skrebbel
How is the sound better?

~~~
coldtea
Parent means compared to Bluetooth compression codecs...

~~~
skrebbel
Ah! Thanks, that makes sense.

------
f311a
Which iPhone is better if I want a good camera but I don't want to go with
the Pro model?

I stopped following iPhone models a while ago and have no idea about the
differences between 11, XR, and XS.

I need a simple phone with a good camera and swapping SE1 for SE2 does not
sound like a good option.

~~~
klodolph
There’s no simple answer for everyone but my current take on phone cameras is
simple:

Every current production iPhone has a good camera.

The rear-facing cameras (which you use for taking pictures) are within
spitting distance of perfect, barring any revolutions in camera physics. The
front-facing cameras are inferior but they get the job done and are very
respectable.

If you want a noticeably better camera, it’s simply not available in this form
factor. You can play with gimmicks like the 108MP Xiaomi Mi Note, but once you
zoom in, you can see the dirty secret that there is _heavy, heavy_ processing
that makes the resolution possible at all, and from an image quality
standpoint, you’re not any better off.

~~~
jahlove
> The rear-facing cameras (which you use for taking pictures) are within
> spitting distance of perfect, barring any revolutions in camera physics. The
> front-facing cameras are inferior but they get the job done and are very
> respectable.

I disagree. The pics look good on a phone, but when you compare a large print
vs a full frame camera, the difference is obvious.

Also, if you try to photograph the Milky Way with your iPhone, its
limitations are obvious (even on a phone screen).

~~~
klodolph
As far as I can tell, you aren’t disagreeing with what I was saying.

I’m talking about “perfect” relative to what is physically possible within a
mobile phone form factor, under the assumption that we are using traditional
camera technology—using a lens to project a two-dimensional image onto a
sensor, which records the image. From quantum physics we know the resolution
is limited by diffraction through the aperture, so barring radical new
materials, this won’t change much. Also from quantum mechanics we know that
the noise floor will never drop below the shot noise of the actual photons
hitting the sensor.

If you want a better camera, it’s easy to just get a bigger one. However,
camera technology has plateaued, and we no longer see radical improvements in
image quality just by upgrading our cameras to newer models. (We still see
incremental improvements, like the appearance of mirrorless cameras, but
there’s only so many incremental improvements you can make before you run out
of physics, and need to change your assumptions.)

------
Causality1
Sometimes I think I'm cocked in the head because I still completely fail to
see any point at all in bokeh images. It's rather like phone manufacturers
dedicating hardware and bleeding edge software to the production of fish-eye
images.

~~~
michaelt
A keen photographer would point you to an image like [1], with bokeh removing
'clutter' and 'distraction'. It mimics cues you would get from depth
perception when seeing the same thing in person.

Like so many things in art, these are all subjective judgements and a matter
of taste. If you want a more objective reason, crisp-subject-blurred
background is associated with high quality because it's what professional
portrait photographs often look like [2].

It's not the be all and end all of photography, of course - few people would
want their selfie with the Eiffel tower to blur out the Eiffel tower!

[1]
[https://commons.wikimedia.org/wiki/File:Aperture_and_bokeh.j...](https://commons.wikimedia.org/wiki/File:Aperture_and_bokeh.jpg#/media/File:Aperture_and_bokeh.jpg)

[2]
[https://www.google.com/search?q=the+queen&tbm=isch](https://www.google.com/search?q=the+queen&tbm=isch)

~~~
bluedino
Paintings work the same way - [https://paintingportraittips.com/homepage-old/portrait-paint...](https://paintingportraittips.com/homepage-old/portrait-painting-techniques/)

------
ebg13
I think I would much rather they secretly capture multiple frames and compute
structure from motion or use flash-based segmentation to avoid the presented
real world failures. The given "wave their phone in the air for several
seconds" dismissal is a pretty strong exaggeration. For one thing, my hands
are moving anyway, because I, like everyone else, open the app before holding
up the phone to take the shot. And seconds? What is this, 1995?

~~~
sandofsky
Author here. ARKit takes several seconds, and that's with comparatively sparse
feature extraction. Can you speak from a different experience?

------
6gvONxR4sf7o
Regarding depth estimation through motion:

> This is similar to how augmented reality apps sense your location in the
> world. This isn’t a great solution for photographers, because it’s a hassle
> to ask someone to wave their phone in the air for several seconds before
> snapping a photo.

This is interesting because with my shakey-ass hands, that's essentially what
I'm doing unintentionally every time I try to take a nice picture.

------
rchaud
Really interesting read, and I love that there was a small GIF shoutout to Top
Secret!, a funny, campy action film in the spirit of stuff like Hot Shots!
Part Deux.

I'm pretty ignorant about how cameras work in general. Bit embarrassed to
admit that I didn't realize stuff like ML was used to guess depth and know how
to add background blur.

------
safgasCVS
When I got my first iPhone (an XR) I remember the first 3 weeks I kept playing
around with portrait mode. Now a year in, looking at the photos in my iCloud
stream I haven’t used the feature in more than 9 months. I had to read this
article to remind myself the feature still existed. Similar story with Touch
ID.

~~~
syntheticcorp
Do you mean 3D Touch instead of Touch ID? I agree that 3D Touch isn't that
useful or an everyday feature; I believe it's now discontinued.

------
nromiun
I never expected Apple to make something this cheap. But still, in some
countries the price is the same as the OnePlus 8, which has much better
hardware and features. Still, it is the cheapest iPhone right now, so maybe
some people will buy it anyway (despite the ridiculous price difference in
different countries).

~~~
eugeniub
It's debatable whether it has better hardware. The iPhone SE has a faster
processor than any Android phone in the world.

Also, where in the world is an iPhone SE ($399 USD) cheaper than a OnePlus 8
($699 USD)?

~~~
nromiun
I doubt a few percent increase in CPU power can make up for the single camera,
those huge bezels, and the abysmal battery life (Mrwhosetheboss got only 3
hours of SOT).

It starts at exactly the same price in India (about 41k rupees - 590 dollars).
Shocking, isn't it?

~~~
saagarjha
iPhone SE is at least 50-70% faster than top-of-the-line Android phones, and
there's no way that its actual battery life would be 3 hours. Also, the phone
comes with iOS.

~~~
nromiun
The Geekbench 5 (multicore) scores for the SD865 and A13 are 3463 and 3517
respectively. How did you get a 70%(!!!) difference? Can you give a
source?

Of course the actual battery life would be much better than just the SOT. But
still, it gives an indication of how quickly you are going to drain it with
your usage. For me it's unlikely to last more than 6 hours, which is way worse
than any phone I have used in the last 5 years.

And regarding iOS, that's exactly the point I was making in my first comment.
Despite the ridiculous price in some countries people will still buy it
because it is the cheapest iPhone.

~~~
saagarjha
> The Geekbench 5 (multicore) scores for the SD865 and A13 are 3463 and 3517
> respectively. How did you get a 70%(!!!) difference?

Single core. For most tasks I think that’s the better measure, anyways.

> For me it's unlikely to last more than 6 hours.

I don’t know what you’d have to be doing to make it do that; maybe keep the
screen at full brightness with GPS on?

~~~
nromiun
That's the thing though. I can do all that and more on the OnePlus 8 and it
will still give me more than 2 days of battery life.

The only pro of the iPhone SE is that it is the cheapest iPhone. If it were an
Android phone (even with the same processor) nobody would pay more than 200
dollars for it.

------
jayd16
If it's pure software, will Apple add it to iPhoto (or whatever their photo
software is called now)?

~~~
dangus
It’s called Photos.

------
rwmj
Can someone answer this point I'm confused about: Why can't a phone generate
bokeh blur using the same technique that an ordinary camera does, ie. a large
aperture / small f-stop? IOW why is ML or depth sensing needed at all?

~~~
foldr
Two reasons.

First, phone cameras all have a fixed aperture. If you set the aperture wide
enough to get bokeh in a typical portrait shot, then the depth of field would
be too narrow to be practical for general photography.

Second, the depth of field depends not only on the f-stop but also on the
focal length (or, to put it another way, it depends on the absolute diameter
of the aperture). Given the focal length of a typical phone camera, you'd need
a lens with a very wide aperture to get significant background blur for
subjects at portrait distance. Wider apertures increase the complexity and
expense of lens manufacture.

It is interesting to speculate on the use of very wide aperture lenses on
future phone cameras, though. If such lenses could be made cheaply, one can
imagine that the shallow DOF could be overcome via some form of automated
focus stacking.
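
foldr's point about absolute aperture diameter can be made concrete with the
thin-lens blur-circle formula. The focal lengths, f-numbers, and sensor widths
below are representative assumptions, not measured specs of any real camera:

```python
def background_blur_fraction(focal_mm, f_number, subject_mm, sensor_width_mm):
    """Diameter of the blur disc for a background at infinity, as a
    fraction of the sensor width (thin-lens approximation), when the
    lens is focused on a subject at subject_mm."""
    aperture_mm = focal_mm / f_number                  # absolute aperture diameter
    blur_mm = aperture_mm * focal_mm / (subject_mm - focal_mm)
    return blur_mm / sensor_width_mm

# Hypothetical but representative: a ~4.25mm f/1.8 phone camera on a
# ~5.6mm-wide sensor vs a 50mm f/1.8 lens on a 36mm full-frame sensor,
# both focused on a subject 2 m away.
phone = background_blur_fraction(4.25, 1.8, 2000, 5.6)
full_frame = background_blur_fraction(50, 1.8, 2000, 36)
```

With these numbers the full-frame setup blurs the background roughly 20x more,
relative to the frame, at the same f-number - which is exactly why the phone
has to fake it.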

------
ghostpepper
This seems like a cool feature. Is there something comparable on flagship
Android phones?

~~~
haditab
Portrait mode on the Pixel 2 did the same thing and in my opinion worked better
than portrait mode on the dual-camera iPhone 7 Plus (I know the iPhone 7 Plus
is older but that's the only dual-camera iPhone I had access to at the time).

~~~
renewiltord
No, the Pixel 2 and iPhone XR do the same thing. They use DPAF. In the post,
you'll see that the XR's depth map actually smoothly follows the ground into
the background. The Pixel 2 will do that too and it will work for inanimate
objects as well.

------
Angostura
Thinking about this, why can't the SE's software work out depth perception by
using Live Photo-type tech? We all move a little bit while taking photos, so
comparing frames would give you depth info. Tripod shots excepted.
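
The idea here is ordinary stereo triangulation: depth = baseline x focal
length / disparity. A back-of-the-envelope sketch with hypothetical numbers
(the practical catch is that the baseline from hand wobble is both tiny and
unknown, so the depth scale is ambiguous without extra sensors):

```python
def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Classic stereo relation: a feature that shifts by disparity_px
    pixels between two views whose optical centers are baseline_mm
    apart sits at this depth (in mm)."""
    return baseline_mm * focal_px / disparity_px

# Hypothetical values: a ~10mm hand wobble, ~3000px-equivalent focal length.
near = depth_from_disparity(10, 3000, 30)  # large shift -> close subject
far = depth_from_disparity(10, 3000, 3)    # small shift -> distant background
```

Even with a wobble baseline, near and far separate cleanly - the hard parts
are estimating the baseline itself and matching features robustly, which is
why this takes seconds in practice.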

~~~
Traster
Probably because the first thing that would happen is a vlog from MKBHD saying
that it doesn't work whilst mounted on his $17000 tripod and gimbal setup.

In all seriousness though, you can do a lot to imitate two cameras with one,
but this is designed to be a differentiating feature; the iPhone SE is designed
to have a worse camera. Why would you spend time focusing on features exclusive
to a camera that you've already decided shouldn't be that good?

~~~
Angostura
Because it sounds as if they're _already_ focussing on those kinds of features.

------
bborud
Some of the depth blur fakery occasionally fails so badly I get dizzy when
looking at it, because my brain is wired for shooting and editing DSLR images
that have been shot in RAW mode. Since I've shot DSLRs for almost 20 years
now, my workflow is wired to work with the technology and its limitations. And
DSLRs have some significant limitations.

 _Let's forget about the limitations of small sensors and tiny optics for a
bit. It's a phone and not a DSLR or a medium format camera with glass that
costs more than your car. It can only do so much._

When I shoot I aim for data capture and not how the image looks as shot. How
it looks comes later. Which, in overly simplified terms, means: don't blow out
the highlights ever and try to keep the noise level in the shadows as low as
possible. Or in other words: try to get as much as possible within the dynamic
range of the sensor at the lowest possible sensitivity (amplification?)
setting of the sensor. (I'm not fond of large grain - artificial or
otherwise).

The post processing is essentially about taking more data than your output
medium can reasonably render and then offsetting, squeezing and stretching
parts of the dynamic range until you get the tonality you want from the data
you have.
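
The "offset, squeeze and stretch" step can be sketched as a simple tone curve
applied to linear sensor values. The lift, gamma, and knee values below are
arbitrary illustration numbers, not anyone's actual pipeline:

```python
import numpy as np

def tone_curve(x, lift=0.02, gamma=0.45, rolloff=0.9):
    """Toy tone curve: lift (offset) the shadows, apply a gamma to
    stretch the midtones, and soft-clip (squeeze) the highlights
    instead of blowing them out."""
    y = lift + (1 - lift) * np.power(np.clip(x, 0, 1), gamma)
    over = np.maximum(y - rolloff, 0.0)        # amount above the knee
    return np.minimum(y, rolloff) + over / (1 + 4 * over)

linear = np.linspace(0, 1, 11)   # idealized linear sensor values
rendered = tone_curve(linear)    # what actually gets displayed
```

The curve is monotonic (so tonal order is preserved), lifts the shadows above
the noise floor, and never quite reaches pure white - the same shape, in
spirit, that a raw converter's default rendering applies.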

Even the iPhone 6 did an amazing job at automating this. But I can't help but
feel that with each generation, iPhones get a little bit worse. They try to do
more, which means that when they get things wrong, the results can get really
horrible. And I'm not fond of images that have had heavy alterations of
pixels.

Sure, most people aren't going to see this when the image is rendered on a
surface that isn't that much bigger than a baseball card. And sure, for most
pictures it probably isn't going to be as noticeable for the 2 seconds they
spend looking at it.

And as someone else pointed out, if you are going to fake focus blur, you're
going to have to do a hell of a lot more complex stuff than just smearing
pixels with gaussian blur or any other blurring method that doesn't model how
lenses actually work.

If you want a really stark example, take a 85mm Petzval lens, or some other
"bokeh monster", shoot it wide open with a foreground subject that is 2m away
against a background that is 4-5 meters away that contains lots of small
highlights. Then repeat the same with various iPhones.

No the iPhone doesn't have a Petzval lens or any other (relatively) large
lens. So why try to fake that you do? If you want the visual effects of large
glass then shoot with large glass.

As a hobbyist photographer I find it easier when a camera doesn't try to be
clever. When it just does what it does. To me a mobile phone camera is more of
a "documentation camera" - it should try to capture what you are interested
in. It doesn't always have to be pretty. It just needs to be reasonably
accurate. Subtly applied HDR tricks are okay since they help you capture what
you are looking at, but when you start to alter pixels, it becomes less useful
as a tool.

~~~
mschuster91
> So why try to fake that you do? If you want the visual effects of large
> glass then shoot with large glass.

Apple is trying to carve a chunk out of the market of people who would have
bought a real 35mm full-frame system camera - if a phone can "look like it"
for the photos of the latest vacation I send to Grandma or post on Instagram,
why invest the thousands of euros that a good kit costs?

Most people aren't gonna shoot pictures at night/lowlight scenarios or with
extreme zoom... I do though, so I chonked out some money and got a Sony A7S2.
Now, if only Sony's software quality was better... sigh.

------
type0
That just looks strange, I would not subject my dog to such terrible
portraits.

------
diogenescynic
I really wish there was an iPhone SE Plus. This is just too small, but
everything else is perfect.

~~~
kgwgk
This is the iPhone SE Plus and it's too large. In the original SE format it
would be perfect.

~~~
diogenescynic
I meant I want an SE Plus model that follows the “Plus” sizing format of
other iPhones. The iPhone Plus models were 6.2” tall and 3” wide, whereas the
iPhone SE is 5.4” tall and 2.6” wide.

~~~
kgwgk
I know, I know :-)

I was just venting my frustration.

~~~
diogenescynic
I assumed I was wrong and that was why I was being downvoted, so I had to
re-check the specs. All good.

------
Luc
The visual artefacts (white spikes) on the woman's shoulder are pretty bad.
Unacceptably bad, I would say.

~~~
playeren
> Shot on an iPhone XS in 2018

~~~
Luc
So they included it _because_ it looks bad? If so, I didn't catch that!

~~~
playeren
> If so, I didn't catch that!

The reasoning for including the picture is in the actual _text_ of the
article. I do not wish to antagonize people who have trouble reading, but I
also think they shouldn't be surprised when they miss out on material
information.

~~~
Luc
All that and a roasting too! Thanks so much.

------
downerending
Yeah, well, I wish it had one eye, in the form of a headphone jack.

~~~
qubex
I honestly find it hard to believe that people are still bemoaning the ‘loss’
of the headphone jack. I stopped using it well before it was removed, probably
sometime while using an iPhone 6 Plus/6S Plus (so 2015-2016). It was an
absolutely obsolete interface and clearly a weak point for water resistance
and structural integrity.

If you really want to use headphones you can use a pigtail on the Lightning
connector or plug them into a bluetooth adapter.

EDIT: Fine, there are people who still want headphone jacks. I’m not the one
who decided to remove them and I’m glad they did, but I can’t give them back
to you.

~~~
Lio
> If you really want to use headphones you can use a pigtail on the Lightning
> connector or plug them into a bluetooth adapter.

How fast will this wear out the Lightning socket?

Since Apple put the springs in the socket and not the plug (like USB-C), when
that socket wears out the phone is a write-off.

~~~
kayoone
I've been using the adapter in my car for 2 years, twice a day, no problems so
far. Lightning is far more resilient to this than USB-C ports in my Macbook
for example, which have become pretty loose over the years of plugging in
chargers and monitors.

~~~
Lio
With USB-C it's the lead that becomes loose, not the port, so you can change
the cable you use to connect to your monitor and it will be like new again.

------
soylentgraham
The article/advert leaves this information out: is this depth estimation a
model in the OS somewhere, or part of the Vision API or the CoreMedia API
(where the stereo->depth frames are), or just inside the camera app and not
exposed to developers?

~~~
sandofsky
Author here. In the article, we mentioned it's a neural network, and the
results are accessible to developers. We left out further details because this
isn't an implementation guide.

~~~
soylentgraham
Could you give me a hint as to which of their APIs it's using? There are a few
"depth" frame interfaces in iOS for various devices, and googling which apply
to which devices rarely yields much. (I wish the Apple docs would hint at
which devices apply to which APIs.)

------
fulafel
Is this related to Halide, the language that Google also uses to implement
computational photography?

[https://en.wikipedia.org/wiki/Halide_(programming_language)](https://en.wikipedia.org/wiki/Halide_\(programming_language\))

~~~
xuki
No, it's a reference to silver halide in analog photography. Halide is a
camera app on iPhone.

