iPhone 8 Plus Camera Review (austinmann.com)
136 points by ruang on Sept 25, 2017 | 64 comments



This isn't a review. It is a competent photographer showing off. Which is fine, but you should be aware of it while reading.

1. The fake bokeh looks surprisingly good. I don't think any of my non-technical friends would be able to spot it on their own.

2. These pictures are absolute best cases for the iPhone (or any) camera. For headshots with well-chosen natural light, as in these pics, a picture taken with an iPhone 4 would look almost as good. "Studio lighting" and other tricks won't fix bad light. Neither will a $6,500 Nikon D5 with a $1,000+ lens.

3. Nikon, Canon et al need to try harder. Cellphones have already replaced several categories of "real" cameras, and they keep improving every year.


Nikon and Canon should have gotten into the licensing game with smartphone manufacturers. And by that I don't mean suing their asses off "because patents". They should've started making camera modules and lenses and sensors for smartphones 4-5 years ago. Now it may already be too late. In 5 years, even amateur photographers won't be using DSLRs, at the rate smartphone computational photography is improving.

Just look at these cropped images over a span of 4 years:

https://cdn.dxomark.com/wp-content/uploads/2017/09/asian_old...

They got disrupted and like most incumbents, they failed to capitalize on the disruption "because it could eat their margins" or whatever their reason for not getting into the smartphone market was.


> In 5 years, even amateur photographers won't be using DSLRs, at the rate smartphone computational photography is improving.

I call BS, if by "amateur photographers" you're talking about enthusiasts who care about composing and exposing a great shot, not just taking a nice picture of the kids in front of the Christmas tree. It's like someone in 2012 saying "at the rate smartphone input apps are progressing, in 5 years no-one will buy physical keyboards for their computers."

The reason someone carries a DSLR today is because of the optics, the speed and the control mechanisms. Having shutter speed, aperture, white balance etc. at your fingertips when shooting. Having the ability to switch from a 105mm portrait lens to a 20mm fisheye. Having that rapid autofocus and response time that lets you capture great shots. Having 14-bit RAW files that you can post-process to save that once-in-a-lifetime reflex shot that was underexposed. Having the ability to mount a flash off-camera (or even three strobes). Those are things you can never get on an iPhone or Samsung.

Nikon, Canon etc. have already "bled dry" of customers going to smartphones instead of DSLRs, and at this point their business models look fairly stable. The share price of both companies has also been fairly stable for the past 3-4 years; both are up 30% over the past year, although that (and a lot of their volatility) is tied to the JPY:USD exchange rate.


>The reason someone carries a DSLR today is because of the optics, the speed and the control mechanisms. Having shutter speed, aperture, white balance etc. at your fingertips when shooting. Having the ability to switch from a 105mm portrait lens to a 20mm fisheye.

I can see that in 5 years, smartphone speed coupled with advanced phase-detection AF will make smartphones as fast as today's pro DSLR bodies (the likes of the D5, Canon's 1D line, etc.).

The need to quickly adjust settings (f/stop, shutter, etc.) can be greatly diminished by even more advanced software, like a bokeh/portrait mode where the lens is kept wide open and the phone manages all other settings, including fake bokeh via a depth map.
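The depth-map idea can be sketched in a few lines. This is a toy illustration, not Apple's actual pipeline; `fake_bokeh` and all of its parameters are hypothetical:

```python
import numpy as np

def fake_bokeh(image, depth, focus_depth, threshold=0.1):
    """Toy portrait mode: box-blur pixels whose depth-map value
    is far from the chosen focus plane, keep the rest sharp."""
    # 3x3 box blur of the whole frame (edges clamped).
    padded = np.pad(image, 1, mode="edge")
    h, w = image.shape
    blurred = sum(
        padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    in_focus = np.abs(depth - focus_depth) < threshold
    return np.where(in_focus, image, blurred)

# One bright "subject" pixel; everything at depth 1.0 is off the
# focus plane, so the whole frame gets blurred.
img = np.zeros((3, 3))
img[1, 1] = 9.0
out = fake_bokeh(img, depth=np.ones((3, 3)), focus_depth=0.0)
```

A real portrait mode varies the blur radius with depth and matte-refines the subject edges; the principle of masking by depth is the same.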

Different lens focal lengths are a big differentiator. However, if the sensor has sufficiently high resolution, the software can crop the sensor to an extent that largely simulates telephoto. Granted, the compression effect won't be achieved, but it's largely there...

5 years is a long time.


> if the sensor has sufficiently high resolution, the software can crop the sensor to an extent that largely simulates telephoto

The software will in this case be inventing an image, not capturing one. Don't get me wrong - I'm delighted with the improvement in quality that pocket cameras, embodied in smartphones, have achieved in a mere decade. But physics is still physics, and a 6x5mm sensor can only capture so much light.


A 6x5mm patch of your eye's retina, coupled with your brain, does a pretty good job, so there is no reason we couldn't achieve similar results.

I love using my old film cameras, and as an amateur I actually only use an iPhone and film cameras. I don't really see the point of using a bulky and short-lived DSLR to produce pictures that are neither quick nor great.

As a professional, I guess the issues are different though.


I'm anything but a professional at photography. But the thing about the way our brains interpret visual input is that it's lossy as hell. I don't need my camera to be the same.


But the traditional physical limitations of an image sensor's light-gathering ability are always in the context of the sensor's sensitivity. If the sensor is made even more sensitive, there is more than enough light to create images.


More noise, too, which scales roughly with sensitivity and inversely with pixel size - that is, decreasing pixel size and increasing sensitivity both raise the noise floor.
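A back-of-envelope version of that scaling, considering photon shot noise only (the pixel pitches below are illustrative, not any specific camera's):

```python
import math

def shot_noise_snr(pixel_pitch_um, photons_per_um2):
    """Per-pixel SNR from photon shot noise alone: SNR = sqrt(N).

    The signal N scales with pixel area, so halving the pixel
    pitch quarters the photon count and halves the SNR. Read
    noise and ISO gain (the "sensitivity" part) are ignored here
    and make small pixels worse still.
    """
    photons = photons_per_um2 * pixel_pitch_um ** 2
    return math.sqrt(photons)

# Illustrative pitches: ~1.2 um phone pixel vs ~6 um DSLR pixel,
# same illumination for both.
phone = shot_noise_snr(1.2, 100.0)
dslr = shot_noise_snr(6.0, 100.0)
ratio = dslr / phone  # 5x better per-pixel SNR from geometry alone
```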

This can be filtered to some extent - indeed, modern phone cameras already do so quite heavily, to overcome the limitations of their already small pixels and already high sensitivity. But doing so costs the same detail you'd need to invent a "simulated telephoto" image, which leaves your fake-telephoto process even less to work with than otherwise. I'm guessing you have a neural net in mind here, and while I'm not about to argue that a sufficiently well-trained net won't produce some kind of result given an input of sufficient similarity to its training set, I see no reason to expect that result to bear any particularly photographic similarity to the original input.

I mean, don't get me wrong - what you seem to be suggesting isn't all that dissimilar from how we currently understand the human brain's own optical system to work. But I don't think it is especially likely that many such brains will happily accept another neural network's best-effort guesses as the output of a process that we've all learned to expect will give us representations as precise as is within the capabilities of the devices we use to make them.


The main near- to midterm issue for Canon/Nikon (and, to a lesser degree, the various mirrorless interchangeable lens makers like Fujifilm) IMO is that a fair number of people buy DSLRs who don't really need them. They buy a low to midrange model with a kit lens, set the camera to Auto, and never take the lens off. And only post the pics to Facebook.

Increasingly, there's no reason for that type of user to buy a DSLR. A good phone will handle 90% of their uses for a lot less (incremental) money and effort.

But I agree that there's still a huge gap between people really using their DSLRs as DSLRs and smartphones. And as for cameraphone accessories, the fact that you can stick awkward add-ons onto a phone to make it a better camera doesn't mean that you should, or that it's something most people want to do.


>The reason someone carries a DSLR today is because of the optics..

LOL, I was not sure of which usage of optics was intended here, had to read the whole sentence.


It was unintentional, but I think both usages apply. If I hired a professional photographer for something and they show up with an iPhone I'm going to laugh them to the door and look for someone else.


Which is a perfectly reasonable thing for you to do. A talented professional may be able to produce relatively good results with equipment that would generally be considered sub-standard. And, under the right conditions, the results might even be excellent for most uses.

But, if I'm paying them to do a job, I expect them to show up with gear that's more or less the professional standard for what they've been hired to do.


The only things DSLRs will always have over smartphone cameras are physical sensor size and glass (optics). External lighting control is possible on smartphones but is definitely a fringe option at this point; that's more a function of market size than physics, unlike optics and sensor size.

Everything else is processing and software. Processing on smartphones is progressing WAY more quickly than on DSLRs. It takes Canon a couple of years for each Digic processor generation; there's a new smartphone image processor generation every phone cycle.


I’d really like to agree with you, but you’ve already been proven wrong. For instance, off-camera flash existed for the iPhone back in 2014 ( http://9to5mac.com/2014/09/06/review-nova-wireless-flash-for... ). That’s just a random review I turned up in a search.

And maybe not a fisheye, but 360 degree camera attachments already exist.

My gut says many of the other things either are or will be proven wrong soon.


The Light Camera technology of integrating multiple small lenses and sensors into a single small flat device could significantly erode the DSLR optics advantage. It will never be as good, but perhaps good enough for most customers.

https://light.co/technology


> Those are things you can never get on an iPhone or Samsung.

Why couldn't you have adjustable white balance, rapid autofocus, high response time, high bit depth, external flash, etc on a smartphone camera? These seem like things which should be possible even with today's technology.


I think once the smartphone's internal capabilities are exceeded, a "System" (as in put together from multiple parts) DSLR or Mirrorless has more advantages:

- if you need more parts, e.g. a flash, you go from having one thing in your pocket to carrying a bag anyway

- dedicated UI and buttons for adjustments while shooting without having to look away from your subject or having to change your grip

- much larger sensors mean more light to work with during shooting and post-processing

- dedicated glass that you can't quite yet replicate with light-field tech

The smartphone can do many of the things a dedicated camera can, it's just not as good on almost all fronts, and much worse in some aspects. You can, under more and more conditions, get images that rival DSLRs, but not ALL conditions: if you can control time, light, and subject all at once, a dedicated camera can be matched. If you lose control of even one, grab a dedicated camera.

The tactile UI is one major gripe against smartphone photography for pros and enthusiasts alike, which is why a phone to some extent needs to be smart/automatic. And while today's image sensors beat the pants off their predecessors, physics still poses hard limits on noise and light capture.

Today's image sensors are very close to being able to count individual photons, and making the sensor larger means capturing more of them at a time. Tricks are being worked on to extend dynamic range and lower noise (like double-exposure HDR), so image quality still increases, but the larger image sensors profit from those developments just as much as the small phone ones. The days of small image sensors being good enough to beat a human eye are still far off.

TL;DR: dedicated tactile UI, physical interfaces, and physics can't quite be beat by all the high-tech we can pack into a smartphone package.


>adjustable white balance

You probably eventually will.

>rapid autofocus

Probably physical limitations. DSLRs have lots of space and dedicated hardware for making this fast.

>high bit depth

No clue

>external flash

sure


Mostly I'd guess they don't have the fabrication capability to do this. Nikon, as far as I can tell, doesn't make any of their own sensors. Canon doesn't make the sensors in their cheaper cameras (from what I can tell from googling, and from my own teardowns).

Sony does, and a number of smartphones use Sony sensors.


The article includes some shot-by-shot comparisons between the iPhone 7 and 8 - I found those useful as a comparison point. For more reference, it's also worth checking his previous reviews of the iPhone 7 and iPhone 6 - they give a pretty good idea of the progress made.


A top-performing smartphone "camera" is really just the algorithm that creates a psychologically appealing composition.

Note that AI is not being used to (simply) mimic a better sensor and lens, there is all sorts of stuff going on in the algorithms that a photographer would do in photoshop or in the dark room.

The problem is, there is a specific aesthetic being targeted, and this removes some of the artistry from photography.

I think there is a fundamental difference between a) the camera capturing multiple depths of field, focal points, etc., and then allowing the user to make the final decision in post production and b) the camera computationally simulating lens effects and lighting effects in the way that snapchat filters widen eyes and add animal ears.

Cameras are supposed to capture reality, not create a postcard-like view of whatever was in range or generate a flattering selfie.

These reviews should not be called camera reviews, they should be called "image algorithm reviews".

What's next, phones whose "microphones" make our voices sound more masculine or flirtatious?


> Cameras are supposed to capture reality, not create a postcard-like view of whatever was in range or generate a flattering selfie

I think that's exactly what many, many people want from their camera, and I don't see why it shouldn't be up to them.


OK so in the next release it might make human arms look more buff, teeth whiter, and faces friendlier-looking. Is this really a good thing?


Well, people have always manipulated how they look, through cosmetics, clothing, etc. This is just the latest technique.

Whether this is a good thing or not isn't a new question. My answer is, it's a normal and OK thing. However (as before) it can be taken to extremes or used for fraud. Obviously, these aren't good things.

It's new, too, so it will take some time for us to learn the limits of good taste and good judgement and for norms to develop.


You mean like art directors do with Photoshop for pretty much every magazine with a celebrity on the cover :-)


Haha my camera already does this. Filters! Also Meitu, and their other program Makeup Plus can do all kinds of crazy things with your pictures. The VR on them is really neat imo.

The thing is people can usually tell if it's extreme enough. After all we see each other in person sometimes still. :)


"Cameras are supposed to capture reality"

This seems like an ironic fantasy to me, because I don't think there is such a thing as an objective reality. Human vision is not objective in the slightest. The objective reality we think we see is actually a fiction made up in real-time by the brain (https://www.ndtv.com/offbeat/what-colour-are-the-strawberrie...).


Sure I can use a tiny fish-eye lens and some algorithms to create something that looks good, but if the goal is to start with an accurate representation of reality, then the algorithm does not necessarily help.

Some of the landscape photos and portraits shown are difficult shots that professionals can achieve after understanding lighting. Using a filter to simulate this is fine. I don't judge it. I am not a professional photographer and use filters some of the time.

The issue is calling it a review of a "camera" when it is really a review of filters. Debating over which company's fake bokeh is better is like debating whose animated kitten ear filter is more lifelike.


To a degree I do sort of like it though. Personally, I don't try to take artistic photos with my phone. To me, my phone offers an in-the-moment photo, not something I want to flex my artistic muscle with. For that, I would.. well, flex my artistic muscle. I would use my artistic knowledge and or tools (what little of both I have lol) like lenses, DSLRs, etc - to capture/create what I'm aiming for.

I understand that's not the same for everyone - but I enjoy the idea of my phone making the "pop music" of digital photography. Hell, I can't even zoom in without compromising quality. I just use it to document.


I am really afraid of the direction "camera algorithms" are going. My partner, bless her to death, prefers to take pictures on Snapchat now instead of any other app. Mainly because the filters there make your nose smaller, eyes bigger, and alter some of the other facial dimensions to make you look prettier. We are going to end up in some weird future where everyone looks beautiful digitally, but, uhh, "normal" in reality.


> We are going to end up in some weird future

I worry about that too. It may also end up being a future where we are no longer fooled by fake effects and they start to feel inauthentic.

Chances are, when plastics were new, people remarked at how similar chrome-painted plastic was to actual metal. These days we can easily tell them apart.


MySpace angles were a thing, glamour shots are still a thing, I believe.

Glamour Shots - the Analog algorithms of beauty.


You can still do this with your phone - these machine learning images are all controlled as a completely separate tab in the photo app. It's the consumer's choice to (easily) turn it off and on. In my own personal tests, I found that I could better capture totally backlit faces that would have made terrible photos with portrait mode and the outcomes were great. When your objective is to capture a fleeting moment in the best way possible, this is a great advance and I welcome it.

If I want to shoot Sony Alpha SLR and edit the raws, I'll do that. But most of the time I just want to share what my family's doing with the rest of my family and this makes it really easy to do it well.


> separate tab in the photo app

Where is this? Which OS? I believe the composition optimization is not something you can turn off.

What's the difference between optimizing the landscape composition and modifying a composition of a face to make it look more friendly, or modifying a picture of arms to make them look more buff?


TL;DR: This is little more than an advertisement for luxury tour operator Ker and Downey and the luxury hotel brand The Taj Group masquerading as an Apple hardware review.


These photos all have a very "Shot on iPhone" feel. Can't tell if it's because iPhones excel at one type of picture or because Apple/this reviewer feels the need to go all the way to India to test out a phone camera.


I got the 8 Plus (upgrade from the 6S Plus), mainly for the camera. Lugging my Nikon D610 has become a pain in the ass.

I wish iOS would allow native DNG (RAW) captures with their camera app. They added HEIC but not DNG? It'd be so much faster to snap a pic and capture DNG with the native app, rather than firing up LR Mobile / VSCO, etc.


The native Camera app doesn’t have it but iOS 10 included the ability for apps to get the raw sensor data. I imagine there are third-party apps that provide it.


While I agree, I speculate that Apple doesn't allow this simply because of iCloud Photo Library backup considerations. HEIC/HEVC save backup space; backing up RAW/DNG would be data-costly. I realize there are ways around this, but it's not Apple's style.


There are apps like ProCam that take RAW images and save them to the camera roll, they sync over iCloud Photo Library without issue. I think the normal user is just fine with JPEG or HEIF and the space savings are more important, so Apple has kept it out of Camera.app.


I'm not sure why they couldn't, though. They provide several different format options for recording video, like 1080p vs 4K.


Yeah, but at the end of the day the different video settings still use "standard" formats that a normal consumer can open on their desktop or laptop. HEIF is the real outlier; changing any of the settings buried in Settings.app for the built-in Camera.app doesn't affect a normal user's ability to view their content on a current operating system (Windows 10 or macOS High Sierra). RAW images change all of that. Windows 10 has really basic support for some RAW formats, and there are workarounds like RAW+JPEG in ProCam, but that results in two separate files combined as one "image" in your iCloud Photo Library; syncing this to a Windows PC produces two files and just confuses users (macOS works around it by hiding everything in the Photo Library, invisible to most users).


Check out the app "Manual"


That is some of the best SEO/content marketing I have seen all year. Nice camera review too. Have already passed the link to a number of friends.


There are lots of reviews popping up about iPhone 8 vs iPhone 7. What I'd like to see is a side-by-side of Portrait Mode on: iPhone 7 Plus running iOS 10, iPhone 7 Plus running iOS 11, and iPhone 8 Plus.


Isn't it possible to offer the "Slow Sync" feature on older models like 6/6s? I would assume this would be software controlled... Maybe another camera app has this already?


The marshmallow comparison was spot-on at the end :)


Never been a fan of the iPhone camera, but this is next level. By iPhone standards, at least.


In Today's news: A newer model iPhone has a better camera than the previous model.

Neat new features, though, like that "slow sync". Why don't older models get "slow sync", though? It seems like something that is controlled by software.


> In Today's news: A newer model iPhone has a better camera than the previous model.

Except that the point of a review is to go into details - in this case, to outline the ways in which it has improved, and how much each of those has improved.


All of these except the "smarter sensor" seem like software features -- I wonder if it's technically impossible to get them on the iPhone 7 or if it's just a matter of differentiation.


Stage lighting uses custom circuitry in the CPU. Some other effects are only possible due to the ISP of the new camera hardware. They’ve talked about this in interviews. So no, they couldn’t just backport everything to older hardware.


I expect you could but perhaps not with a very good experience. I think a lot of the lighting modes rely heavily on doing that work in real time, rather than in post processing. It could be that it's just too laggy without the new processor.

That, and Apple makes their money from selling this hardware, so having software that only runs on new devices helps sell them.


An honest iPhone 8 launch event would say "we improved the camera hardware but also had some fantastic breakthroughs in the camera software, and we'll be backporting those to all supported models in iOS 11". The pessimist in me says that nobody would buy an iPhone 8 if they did this. Wait a second, now that I've thought it through it's obvious that this is exactly why they didn't backport the software improvements to previous iPhones. :)


They said a lot of that comes from the new ISP, so no.


> The pessimist in me says that nobody would buy an iPhone 8 if they did this

People with an iPhone 4, 5 or 6 have more reason to upgrade to an 8 rather than a 7 than just the camera.


I think his argument is that they are not backporting other software features including those that are unrelated to the camera.


The improvements aren't just to the software. There are non-software reasons for, say, an iPhone 5 user to upgrade to an 8 over a 7.


The ISP (image signal processor) is a dedicated circuit.

Sort of like a GPU, you could emulate what it does in software, but it'd be prohibitively slow.


Like Apple says in every single event they do: "It's the most powerful iPhone ever with the most amazing camera ever."

Of course it is.. it was announced today.. :|


It's still amazing that we see visible progress every year. Apart from zoom, phone cameras are now, in many situations, on par with many dedicated cameras. We shouldn't take for granted that Apple can produce such improvements despite the space constraints.


I agree, and I am happy for the progress every single year. But this marketing makes me cringe every time. Can't they think of some other adjective to describe the changes for each new thing/update?


Why is space a critical variable? The sensor in question is tiny; the rest is software and the phone's CPU.

It's not as if designing the iPhone 8 to be 2x thicker would have let Apple do much more with the camera quality.



