
The iPhones XS - ihuman
https://daringfireball.net/2018/09/the_iphones_xs
======
nakedrobot2
Hi, camera designer here. There are few issues in this article.

"The other explanation I can think of is that this almost certainly isn’t
Apple’s own sensor." \- Correct, it's a Sony sensor. Apple has been using Sony
sensors since they dumped Omnivision back in the iPhone 4 days.

"Why isn’t Apple touting this larger sensor?" Maybe because the sensor got
_smaller_ in the previous iphones (1/2.5") compared to the iPhones before that
(1/2.3"). I guess they've gone back to 1/2.3" now?

" Android handset makers can’t just buy a “neural engine” chip and stick it in
a phone. " Yes, you absolutely can. [1] This chip is in all the DJI drones,
also in the "google clips" camera. OK, it's not an A12, but it is exactly the
kind of thing that Gruber is describing - it's a neural image processing chip.

[1] [https://www.movidius.com/myriad2](https://www.movidius.com/myriad2)

~~~
saganus
What exactly is a "neural image processing chip"?

A cursory search didn't turn up anything simple to understand.

My uninformed guess is that it's a chip that does neural network stuff to
offload image processing for antialiasing, scaling and such.

But I would guess they do a lot more.

Could you elaborate a bit?

Seems pretty interesting that my phone might already be using "AI" algos for
taking better pictures!

~~~
Steko
These chips, like GPUs, are optimized for parallel processing, but they are
also designed for lower-precision math, which requires less memory and fewer
transistors per operation and hence makes them considerably more power
efficient.
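To make the precision/memory trade-off concrete, here's a rough NumPy sketch (my own toy illustration, not how any shipping neural chip actually works) of quantizing a layer's float32 weights down to 8-bit integers:

```python
import numpy as np

# Toy "layer": a 512x512 weight matrix in full-precision float32
rng = np.random.default_rng(0)
w_fp32 = rng.standard_normal((512, 512)).astype(np.float32)

# Linear quantization to int8 with a single scale factor
scale = float(np.abs(w_fp32).max()) / 127.0
w_int8 = np.round(w_fp32 / scale).astype(np.int8)

# Same number of weights, a quarter of the memory
print(w_fp32.nbytes)  # 1048576 bytes (4 bytes per weight)
print(w_int8.nbytes)  # 262144 bytes (1 byte per weight)

# The dequantized weights stay within one quantization step of the originals
max_err = np.abs(w_int8.astype(np.float32) * scale - w_fp32).max()
print(max_err < scale)  # True
```

Real hardware adds refinements on top (per-channel scales, zero points, wider accumulators), but the 4x saving per weight is the core of the efficiency argument.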

~~~
kartickv
So it's just a GPU optimised for half-floats and/or integers?

I remember reading that it's optimised for matrix calculations. Is that true?
If so, what is it about matrix calculations that makes a GPU not the most
optimised for this task?

~~~
Steko
Yes, it's optimized for 8-bit matrix multiplication.

> If so, what is it about matrix calculations that makes a GPU not the most
> optimised for this task?

Well, this gets into what counts as a GPU, since NVIDIA and others are now
making special tensor GPUs that are also good at this. Most GPUs are optimized
for 32-bit floating point math, though; you're probably familiar with "8-bit
graphics" being synonymous with '80s NES console quality.

~~~
kartickv
I see. When I said GPU, I meant a normal GPU, not a special tensor GPU. BTW,
what makes tensors / matrices hard for GPUs to handle?

~~~
Steko
AFAIK GPUs are OK at matrix math; it's just that they are typically built for
higher precision. AI chips probably have some other advantages, but I'm not the
best person to speak to them. Here's my simplified understanding:

CPUs are optimized for taking a decent amount of data and doing anything with
it, especially when order matters, as is often the case with algorithms.

GPUs are optimized for taking a lot of data and doing the same few things to
all of it (shaders, physics).

AI chips are optimized at taking a massive amount of data and doing a large
number of calculations on all of them. In a neural net each "neuron" and/or
"synapse" can represent a function/weight that needs to be calculated across
many inputs.

In a deep learning model, this could be millions of weights, and the more
weights you have, the more important memory and power efficiency become. When a
NN is being queried (inference, as opposed to the initial training), it's often
in real time (self-driving cars, "hey Siri", etc.), so the demand to finish all
those calculations faster is even higher. OTOH the accuracy is less important,
as neural nets - like people - are good at filtering out noise.
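FWIW, the inference-side trade-off can be sketched in a few lines of NumPy (a toy model of my own, not any specific chip): quantize weights and inputs to int8, multiply-accumulate in int32, rescale once at the end, and the answer comes out slightly noisy but close.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy layer at inference time: 10 output neurons, each a weighted
# sum over 256 inputs
w = rng.standard_normal((10, 256)).astype(np.float32)
x = rng.standard_normal(256).astype(np.float32)

# Reference answer in full precision
y_fp32 = w @ x

# Low-precision path: quantize both operands to int8, do the
# multiply-accumulate in int32 (a wider accumulator avoids overflow),
# then undo the scaling once at the end
sw = float(np.abs(w).max()) / 127.0
sx = float(np.abs(x).max()) / 127.0
wq = np.round(w / sw).astype(np.int8)
xq = np.round(x / sx).astype(np.int8)
y_int8 = (wq.astype(np.int32) @ xq.astype(np.int32)) * (sw * sx)

# The int8 result is a noisy but close approximation of the fp32 one
print(np.max(np.abs(y_int8 - y_fp32)))
```

The rounding errors mostly cancel, so the typical error grows only with the square root of the number of inputs; that's why noise-tolerant tasks like classification can live with 8-bit math.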

------
aktau
One thing that seems suspect about this article is his claim that nobody else
can put a chip like the neural engine in a phone. He says Google does it in the
cloud. I don't think he's tried a Google Pixel (2) device. Put one in airplane
mode and take some (portrait) shots. There are a
number of articles talking about the Pixel 2 Visual Core, which sounds similar
to the neural engine, and predates it: [https://techcrunch.com/2018/02/05/the-pixel-2s-visual-core-photo-processor-now-works-with-instagram-whatsapp-and-snapchat/](https://techcrunch.com/2018/02/05/the-pixel-2s-visual-core-photo-processor-now-works-with-instagram-whatsapp-and-snapchat/)

Some basic research would be nice.

~~~
Steko
> which sounds similar to the neural engine, and predates it

Apple's neural engine shipped in the A11, a few weeks earlier than the Pixel
2.

~~~
alphabettsy
Correct, I believe it wasn’t enabled when the Pixel 2 shipped; it was enabled
sometime later.

Maybe he’s ignorant there, but I think the point should’ve been that outside
of Google, nobody else is doing the same on Android.

------
princekolt
(Following on Gruber's examination of the camera lenses on the XS)

I have a full-size mirrorless camera with interchangeable lenses (Sony
Alpha 5000), and I love it. I usually import all pictures I take with it into
Photos, and you can immediately tell whether a picture was taken with it or
with my current iPhone 7 (no surprise).

But here's the catch: I understand many principles of photography, but I'm no
professional photographer. I try to make adjustments to focal distance,
aperture, manual focus, etc. when taking pictures, but the results are usually
not good (it's all fun though, especially when you get it just right).

Therefore, even when I take my camera with me, I will still snap pictures with
my phone, because the balance between ease of use and the quality of the result
really favours smartphones. And it only gets exponentially harder in difficult
light conditions (overcast skies, moving vehicles, etc.). I've ordered an XS
despite the high price, and I'm really looking forward to using its camera and
comparing it with my Sony camera.

~~~
nobrains
I use something in between a mirrorless interchangeable-lens camera and a
smartphone camera: the Sony RX100. Because it lacks too many manual controls,
it's a good point-and-shoot, and because of the large sensor size, the photos
are better than those from the current best camera smartphones. That's the next
target for smartphones.

------
thanatropism
It's strange that there is an apparently objective, value-free aesthetic ideal
for personal photographs. What is the endgame here? The look of fashion
magazines and advertisements?

Because hey, like Gruber, I'm also an unattractive mid-30s gremlin, nothing
like a fashion model. My wife, whom I love, is not Kate Moss either.

I used to run around with Dianas and Holgas - plastic cameras with plastic
lenses that expose 120 film (medium format). Half of the shots turn out
unusable, which gets really expensive -- but the half that turns out good is
great. Unlike my (cheap SE) iPhone images, they have a quality that's
uncannily "real". Like real people living real lives.

And when I say "strange" I don't necessarily mean "bad"; just that it sure
looks like we will soon be normalizing the hard-to-notice glitches of ML
models and missing them in "normal" photos. With cheap plastic cameras you knew
that the artifacts (light streaks, dust on lenses...) were artifacts.

---

Next up: thanatropism complains about auto-tuners in modern pop music.

------
ariwilson
_> Google does advanced machine learning — including for photos — but they do
it in the cloud._

This is just wrong; Google is doing lots on the Pixel devices in hardware
(HDR+, Portrait, Motion Photos, etc):

[https://ai.google/stories/ai-in-hardware/](https://ai.google/stories/ai-in-hardware/)

------
gammateam
I'm very pleased with the point-cloud technology called "depth cameras". What
was, and is, a gigantic attachment to the Xbox is now smaller than my
fingernail, and there are two of them in the iPhone X series.

What I'd like is greater range and more situations for the algorithmic depth
to kick in (e.g. Instagram's focus camera). My use of interchangeable-lens
cameras in traditional form factors is about to disappear if I can use this for
selective focus on subjects 20 feet away. (Note: my use of cameras is a means
of expression, and although I'm well versed in the technology and discipline I
don't have any interest in the elitism of how an image was captured.)

I don't think this new generation of iPhones achieves that, but it's an area
I'm watching.

The other improvements suggest a 2019 or 2020 model will be amazing.

~~~
dman
What I would like is for a non-processed photo to be stored along with every
processed photo. I have multiple photos on both the Pixel 2 and iPhone where
the computational photography algorithms got the scene very, very wrong. It
would have been great to have a non-processed fallback in these cases.

~~~
gammateam
That would be good; for now there are apps for that. Your Pixel 2 should make
it easy for any third-party camera app to be your main camera app.

------
johnbellone
> Apple describes the glass (front and back) on the iPhone XS as “the most
> durable glass ever in a smartphone”. I asked, and according to Apple, this
> means both crack and scratch resistance.

I feel like I hear this statement every other year. I'm not invested enough to
go back and actually validate my theory, but this definitely isn't the first
time I've heard that from a phone company. They always seem to fail the 6ft
drop test.

~~~
sebazzz
I guess they get a little bit better each year, just a little bit. But not
enough.

~~~
beamatronic
You don’t have to upgrade every year though. Still happy with my 6S.

------
ValentineC
> _… to play with the bokeh depth of field f-stops, you have to use an iPhone
> XS because it depends on the A12’s Neural Engine. An iMac Pro has a more
> powerful CPU than an iPhone XS, but it doesn’t have a Neural Engine, and the
> bokeh effect depends upon it._

This doesn't bode well for anyone who cares about being able to manipulate
their photos without being _too_ tied into Apple's ecosystem.

I hope that Adobe and the developers of other photo editing programs manage to
reverse engineer this.

~~~
kartickv
Agreed. Not just a Mac: any PC should be able to adjust the aperture in
software. Let it be slow; that's better than not being able to do it at all.
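For illustration, software-only "bokeh" is conceptually simple once you have a depth map. Here's a crude layered sketch in NumPy (my own toy approach — Apple hasn't published how its effect works, and names like `fake_bokeh` are made up):

```python
import numpy as np

def box_blur(img, r):
    """Mean filter of radius r via shifted sums (pure NumPy, O(r^2) shifts)."""
    if r == 0:
        return img.copy()
    k = 2 * r + 1
    p = np.pad(img, r, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fake_bokeh(img, depth, focus, strength=3.0):
    """Crude depth-of-field: pick a blur level per pixel from |depth - focus|.
    A layered approximation, nothing like Apple's actual pipeline."""
    radii = [0, 1, 2, 4]                       # blur radii, sharpest first
    layers = [box_blur(img, r) for r in radii]
    defocus = np.clip(np.abs(depth - focus) * strength, 0, len(radii) - 1)
    idx = defocus.astype(int)                  # blur level for each pixel
    out = np.zeros(img.shape, dtype=np.float64)
    for i, layer in enumerate(layers):
        out[idx == i] = layer[idx == i]
    return out

# Demo: a noisy image whose depth increases left to right; focus at depth 0
rng = np.random.default_rng(2)
img = rng.random((32, 32))
depth = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
out = fake_bokeh(img, depth, focus=0.0)
# The in-focus left edge is untouched; the right side is smoothed
```

The hard part real implementations face is occlusion edges, where foreground and background blur levels meet — which is exactly where the phone effects visibly fail, as noted elsewhere in this thread.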

------
enz
> Apple describes the glass (front and back) on the iPhone XS as “the most
> durable glass ever in a smartphone”. I asked, and according to Apple, this
> means both crack and scratch resistance.

This is great news if it is solid enough. I would happily spend $1k+ on a
phone only if I can drop it without causing damage.

------
carloscarnero
I can't be sure, but I think there's an error in the 5th photo: the column in
the background, between his right temple and his right ear, should also be
processed.

~~~
madcowherd
It also looks like part of the outside edge of the ear got blurred mistakenly.

------
mankash666
Why should we consider this fawning opinion piece, by a world-renowned Apple
fanboy, a review?

iPhone cameras haven't been the best on smartphones for years now, let alone
among all camera devices.

~~~
macintux
Those of us with iPhones are interested in how the new generation compares to
the old.

He's not comparing the cameras to other phones, although he does briefly
discuss the device vs cloud processing approaches.

And, shockingly, even fanboys can have useful information to share.

~~~
mankash666
Indeed they can. "Is the iPhone XS the world's greatest phone of the year, or
all time?"

"No science involved, but Apple engineers and I agree that the iPhone XS
camera is the best"

I bet fanboys print out DXOMark as cleaning paper for their iPhone lenses -
"best lens cleanser for the iPhone is DXOMARK ratings of other phones being
better"

~~~
macintux
HN is really not the appropriate forum for this sort of straw man hyperbole.

