Apple cheats because they failed to build a truly density-independent UI toolkit that actually works. Thus you end up with hacks like this, where the UI is in a permanent state of filtered hell because the toolkit is just broken.
Android has proved this can work, and work fantastically. The article tries to spin it as being broken somehow, but clearly they never actually worked with Android much because this all works so, so well.
You don't need whole multipliers when you work in DIPs instead of PX.
The point of the article is that the iPhone 6+ renders to a larger offscreen buffer and then downscales to a 1080p panel. This is the same as the higher resolution modes on Retina Macbook Pros.
The output will be slightly worse, but at 400 PPI you probably won't be able to tell without a magnifying glass. The current Retina MBPs do this in their higher res modes and it's pretty hard to notice.
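To make the tradeoff concrete, here's a quick sketch of the 6+'s render path using the widely reported figures (414×736 points, @3x buffer, 1080p panel):

```python
# iPhone 6 Plus render path, per the commonly reported iOS 8 figures:
logical = (414, 736)                      # points, the coordinate space apps see
render = tuple(3 * d for d in logical)    # @3x offscreen buffer
panel = (1080, 1920)                      # physical pixels of the 1080p panel

scale = panel[1] / render[1]              # downsampling factor applied on output
print(render)                             # (1242, 2208)
print(round(scale, 4))                    # 0.8696
```

So everything gets squeezed by about 13% on the way to the panel, which is the filtering the article is talking about.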
On battery life: perhaps the power savings of a 1080p panel outweigh the cost of the scaling.
Also it leaves the door open to switch to a full resolution native panel in an iPhone 7+ with no hit to existing apps or developers.
I do hope there is some sort of programmable setting or API for enabling native-resolution output, so that games built with this option render at 1080p and display at 1080p, and 1080p movies aren't needlessly stretched and then downsampled. For legacy apps and non-graphically-intensive stuff it mostly makes sense to just render at 3x and scale down for simplicity's sake, but if the option isn't otherwise there, that seems pretty silly.
Furthermore, have you used Auto Layout in Xcode 6? I'm not sure what about it doesn't work. I've got an app in review right now that's ready to support the iPhone 6 and 6+ without any extra work because of Autolayout. I'm not saying it's perfect, but I'm just not sure how you can claim that it plain doesn't work. I'm also not sure why that has anything to do with why they're scaling @3x down to 1080p.
I've used a couple of Android devices with latest-ish OS versions, and "fantastically" is not a word I'd use in relation to their "density independent" operation, or their GUIs in general.
>You don't need whole multipliers when you work in DIPs instead of PX.
You don't work in PX on iOS either. Logical units have been enforced since before Android even had a hi-DPI device available...
This is more about image (bitmap) assets apps have to use, and how they best adapt.
Care to state why?
As for iOS's @1x vs. @2x images, think of them as your drawable-*dpi resources on Android.
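The Android side of that mapping is just the dp formula: px = dp × (dpi / 160), with mdpi (160 dpi) as the 1x baseline. A minimal sketch (the function name is mine):

```python
# Android converts density-independent pixels (dp) to physical pixels with
# px = dp * (dpi / 160); mdpi at 160 dpi is the 1x baseline.
def dp_to_px(dp, dpi):
    return round(dp * dpi / 160)

print(dp_to_px(48, 160))  # 48  -> mdpi, 1x (roughly iOS @1x)
print(dp_to_px(48, 320))  # 96  -> xhdpi, 2x (roughly iOS @2x)
print(dp_to_px(48, 480))  # 144 -> xxhdpi, 3x (roughly iOS @3x)
```

Same idea as @1x/@2x/@3x assets, just with more buckets in between.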
For transition purposes, apps that haven't been updated yet will simply be scaled up (everything on screen will be slightly bigger). This is similar to when the iPhone 5 introduced a lengthened screen and apps were letterboxed until they submitted an update confirming that they were compatible with the larger screen.
The introduction of the taller iPhone 5 screen, and Apple's subsequent rejection of "letterboxed" apps (ones that don't make any use of the additional screen space) mean that most developers will almost certainly have switched to autolayout.
I can open up an Android layout file and almost instantly understand how it works and what I have to do to change it.
Since discovering it I've fallen in love with Autolayout and its abilities — where I used to dislike it before.
It's a bit like imperative vs. functional programming. In the former model, I have an idea in my head and I need to nest containers step by step to get to the final form. In the latter I'm simply building up a ruleset.
But like I said, that was my personal preference. I guess we'll have to differ on opinions ;)
And iOS does have a density-independent UI toolkit. It just had only two densities until now, whole multiples of one another, which made it a lot easier for developers to update a million apps in short order.
Now there's one more. But the "cheat" here of keeping it a whole multiple from the dev's perspective is undoubtedly an intentional choice to make it easy for developers to adopt. Which is smart when your platform has a lot of apps and you want them all to work perfectly on all devices.
Compare that to six dpi levels for Android. It can theoretically accommodate a greater diversity of devices but developer support for them is scattershot.
But let's face it, that's one less thing we can mock Android devs about. Pixel-perfect WYSIWYG UI building in Interface Builder has one foot in the grave.
You should learn Auto Layout using Interface Builder; otherwise it's a nightmare, because you're not getting the feedback you need to learn it right in the first place.
You need to use priorities correctly: especially when you're using >= or <= constraints, you also need to add a low-priority == constraint. It's not that hard, but you will be really frustrated if you keep clinging to your coded views.
Try to chop up your Storyboards if you work in teams. They shouldn't be so big that you have trouble following the flow on one monitor.
Also it's much better documentation for the person who's going to maintain your screens later on.
But the whole point is that Auto Layout constraints from IB can also be controlled from code, so you don't have to write them entirely by hand. Writing Auto Layout purely in code is laborious and it's easier to make mistakes.
So if you "park" a low priority constraint for the end point or even the points in between you can let them take over by changing priorities only. Also you can animate most other properties as well if you need it.
If you don't believe me, look at this blog article that measures it: http://floriankugler.com/blog/2013/4/21/auto-layout-performa...
P.S. At least on the iPhone. The iPad mini (sporting the same resolution as its physically bigger brothers) will produce smaller UI elements.
As for "downsampling" of new 3x assets, the main image-processing artifact to look out for is moiré, which happens with images that have small repeated details. There are techniques to mitigate these artifacts, but it would have to be a very smart algorithm to avoid artifacts in all cases.
OTOH, Android seems to have no problem handling small resizings of images within their size buckets system.
My grok of the article is that there will be artifacts; it's just that they're difficult to notice at this pixel density.
When working, I keep mine set to smallest/most-information setting available by default (dubbed "Looks like 1920x1200"), but when relaxing I often give my tired 40-year-old eyes a break by bumping it down to "Looks like 1680 x 1050" and making everything uniformly bigger and easier to read.
When you look closely, of course it doesn't look as good as an exact match to the display native resolution does, but it looks pretty great. I think it will look even better on the 401ppi iPhone 6+ display, and the increasingly high-res screens of the future.
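For reference, the rMBP scaled modes work the same way as the 6+ trick. A sketch using the 15" panel's figures (2880×1800 physical, "Looks like 1680x1050" rendered @2x):

```python
# Retina MacBook Pro 15" "scaled" mode: render the "looks like" size at @2x
# offscreen, then downsample to the physical panel.
panel = (2880, 1800)                        # physical pixels
looks_like = (1680, 1050)                   # logical "looks like" resolution
buffer = tuple(2 * d for d in looks_like)   # @2x offscreen buffer
scale = panel[1] / buffer[1]                # downsampling factor to the panel

print(buffer)            # (3360, 2100)
print(round(scale, 3))   # 0.857
```

A roughly 14% squeeze, quite close to the 6+'s ~13%, which is why the comparison holds up.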
Tying logical display resolution to actual physical pixel resolution is pretty backwards at this point.
That's what Windows 8.1 does when you have two screens with very different (physical) DPI. When I connect my Yoga Pro 2 (with "retina"-like resolution) to my external 23" screen, it renders most applications at the high resolution, and then downscales them so they have the same size on both displays. The biggest problem with this approach is that crisp fonts on Windows rely heavily on RGB sub-pixel rendering. If your application doesn't paint directly to individual pixels, it can't control the subpixels, and can only do greyscale antialiasing, which looks very blurred (especially next to my Yoga Pro, where text looks printed).
You can tell windows "I know what I'm doing, give me the raw pixels", but that comes with a whole lot of other problems, and I've never seen a single app get that right.
(What I'd really like as a developer would be to develop in (real) cm, inches, and em. Have them exactly match the physical sizes, for mobile and desktop screens (projectors would pretend to be a desktop). Have an option to "snap" lines to physical pixels. And on non-hidpi screens, render everything with subpixel-awareness, which triples your horizontal resolution.)
"Reasonable" and "not-overdesigned" are rarely concerns that last into the future. And I don't think they even hold up given today's (high-end) technology.
Why on earth would font size be the arbiter of anything other than font size? Some programs don't even display text.
Logically, it makes a lot more sense to scale the entire display (or, sure, subsets of it). That's still technically challenging for some of today's systems, but not all of them. And it will be a vanishingly small number of them as we move into the future.
An iOS developer has to support eight different screens these days to cover all devices that will run iOS8:
* iPad 2
* iPad mini (the last non-Retina devices; same resolution, different DPI)
* iPhone 4S
* iPhone 5/5C/5S/iPod Touch 5th gen
* iPhone 6
* iPhone 6 Plus
* iPad 3/4/Air
* iPad Mini Retina (again, same resolution, different DPI)
Many devs will ignore the DPI differences between the iPad and the iPad mini, but that isn't ideal.
I think the iPhone 6 Plus should have had a 1600x900 display to keep the DPI in line with the previous phones, however Apple was probably afraid to be compared unfavorably to the Android phones that have 1080p panels.
* iPad 2 = iPad Mini = iPad 3 = iPad 4 = iPad Air = iPad Mini Retina
* iPhone 5 = iPhone 5C = iPhone 5S = iPod Touch 5th gen
* iPhone 4S is a "sort of" outlier. If you're using the auto-layout stuff that's available it "just works" on a 5* phone. The exception is apps that aren't widget-based, like games.
* iPhone 6 and 6+ seem to have different logical resolutions, so there's that. But again, auto-layout helps here.
The "different DPI" thing is irrelevant. Get the UI elements to have the same size proportionally and you're set.
Pretty much, you only have to support 6, which comes quite close to how many you have to support on iOS right now (you mentioned 5):
ldpi (low) ~120dpi
mdpi (medium) ~160dpi
hdpi (high) ~240dpi
xhdpi (extra-high) ~320dpi
xxhdpi (extra-extra-high) ~480dpi
xxxhdpi (extra-extra-extra-high) ~640dpi
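In practice a device just gets assigned the nearest bucket and pulls assets from the matching `drawable-*dpi` directory. A rough sketch (the nearest-bucket rule here is an illustrative simplification, not the exact platform logic):

```python
# Android's six density buckets and their nominal dpi values.
BUCKETS = [("ldpi", 120), ("mdpi", 160), ("hdpi", 240),
           ("xhdpi", 320), ("xxhdpi", 480), ("xxxhdpi", 640)]

def nearest_bucket(dpi):
    """Pick the bucket whose nominal dpi is closest to the panel's dpi."""
    return min(BUCKETS, key=lambda b: abs(b[1] - dpi))[0]

print(nearest_bucket(441))  # "xxhdpi" -- e.g. a 5" 1080p panel
print(nearest_bucket(326))  # "xhdpi"  -- Retina-iPhone-class density
```

Devs then supply one bitmap per bucket (or just the top few) and let the platform scale the rest.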
Contrast that with iOS. The latest version adoption rates are something absurd like 90% within the first month. You can actually take advantage of those spiffy new APIs without having to worry about limiting your audience. The limited set of resolutions is icing on the cake... far less to worry about.
I remember at my old job, we had an issue with a game being shipped because a handful of phones couldn't get past the first screen after hitting play. IIRC the issue was the processor (or something hardware-related) not being able to render certain things from Unity. It affected about 5 phones, but the publisher deemed it big enough to not be shippable.
API/Platform fragmentation is the big problem. Supporting multiple APIs and gracefully degrading to older versions is much harder than being able to scale your UI. Apple has been pretty good with keeping customers devices as current as possible. Android, not so much.
>Many devs will ignore the DPI differences between the iPad and the iPad mini
That's because iOS-only developers were spoiled and got lazy. If your app breaks because it now has to work with a slightly different aspect ratio, or doesn't take advantage of a higher-DPI screen, that's your failing.
This is highly dependent on which kind of APIs you are interacting with. Most API families are part of Google Play Services now which means they are generally available back to 2.3. Framework API differences between platforms are generally small and haven't been much trouble to work with, although some types of hardware interaction changed significantly between some platform levels (e.g. audio).
Yes, it is fragmentation. Just because it is "solved" doesn't mean it doesn't create extra work, extra bugs, and, from a test-matrix perspective, more compromises.
Again, iOS developers got spoiled by not having to deal with this stuff for the last few years so maybe it seems like it's a big deal. The reality is, once you stop expecting a specific screen properties, it really isn't that big of a deal. It's just reality.
And no, it's not fragmentation. There is absolutely no reason to freeze out certain users because of their screen-size, or not provide certain features.
Android has 5000 devices, differing OS versions (half of the devices out there use a 2 to 3 year old OS), and differing hardware capabilities (including different qualities in stuff like sensors, some of them even being nearly useless).
And Apple is losing to that because it has "8 different" screens?
For one, you can support most of them (all iPads for example) with the exact same code plus 2 sets of bitmap assets: normal and 2x.
As for the iPhones, they transitioned without a hitch to the 5's taller display, and the same will happen with the 6/6+.
That stat is pretty out of date by now. Android <= 2 now only accounts for 12% of devices, and most app devs aren't targeting it any more.
And Android might have 5000 devices, but it doesn't have 5000 screen sizes. Still more than Apple has, but hardly 4992 more.
Of course, operating system fragmentation has become less of an issue generally since ICS/JB became dominant.
Most modern phones are 16:9 (720, 960, 1080). There are still a few stragglers (Motorola in particular), but they're becoming a thing of the past.
You basically have three aspect ratios to worry about right now: 16:9, 5:3, and Motorola's 427:240.
With iOS, you have 3:2, 4:3, 71:40 (iPhone 5), and now these new iPhone 6 internal aspect ratios that get hardware resampled.
So, it all boils down to: 4:3 (all iPads) and roughly 16:9, i.e. ~1.775 (5, 5S, 5C, 6, 6+).
3:2 is just a legacy ratio that will go away going forward as the 4S gets older. It's not a ratio they have produced iPhones with for several years now.
However, as you point out, this is not true for the iPad.
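For the curious, the odd-looking "71:40" is just 1136:640 reduced to lowest terms. A quick sketch using the well-known pixel dimensions:

```python
from fractions import Fraction

# Reduce a few iOS screen resolutions to lowest-terms aspect ratios.
screens = {"iPad (all)": (1024, 768),
           "iPhone 4S": (960, 640),
           "iPhone 5":  (1136, 640),
           "iPhone 6":  (1334, 750)}

for name, (w, h) in screens.items():
    r = Fraction(w, h)  # automatically reduced to lowest terms
    print(f"{name}: {r.numerator}:{r.denominator} ({w / h:.3f})")
```

The iPhone 5 comes out to 71:40 (1.775) and the 6 to 667:375 (1.779): both a hair off true 16:9, which is why nobody rounds them the same way twice.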
* Designing for iPhone 4/4S and 5/5c/5s is almost the same, at least in portrait orientation. Most layouts are a little flexible vertically - and they have to be, because users might be in a call, or have the personal hotspot active, in which case the OS reserves some space for an additional status bar. Apps have to be flexible vertically, period.
* It is much harder to support all the larger text sizes that are available in iOS 7. Much of the iOS UI is built on UITableViews, and these "tables" are terrible when it comes to line breaks. I have increased the text size and many apps can't handle it, including some of Apple's own.
* If you are extra masochistic, enable "Button Shapes" and "Reduce White Point" in the Accessibility options in a custom-styled iOS 7 app. For example, this will darken tint colours, which can make buttons harder to read if they are placed over an image (which the OS will not darken).
Is it even possible to tell the difference between the iPad and iPad Mini in software? When it first came out, it wasn't, on purpose.
Most developers do ignore this, and Apple encourages ignoring it. Incidentally, this is why tap target sizes (button sizes and such) are a bit bigger on a full-sized iPad than on an iPhone; to allow for the scale-down.
The iPhone 4 and iPad 3 both suffered seriously from having 4x the pixels of their predecessors but only 2x the GPU. Mostly games suffered, but it's still an issue elsewhere with fancy animations, transparency, etc.
But this is all speculative theorycrafting at this point.
I think the 6+ will, without question, have worse per-pixel performance than the 6. It would be even worse if the screen were even higher-res.
That's not enough to make or break anything. But it's got to be another consideration.
I do appreciate the technical explanation, but as Apple has done this before, for a quick explanation it's sufficient to point to the rMBP precedent.
I don't know that. Unless you're talking about scaled modes, which do resampling, the retina models render at the native resolution. They will scale apps by 2x that are not capable of rendering at the native resolution (not many).
I wanted to share pxcalc for anyone doing DPI/PPI calculations; please be aware it exists (and it's awesome).
Gee -- what's wrong with y = √(w² + h²), i.e. y = sqrt(w^2 + h^2)? It's much easier to interpret.
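That formula is just the panel's diagonal in pixels; divide by the diagonal in inches and you have PPI. A minimal sketch (the function name is mine):

```python
import math

# PPI = diagonal pixel count / diagonal length in inches
#     = sqrt(w^2 + h^2) / diagonal_inches
def ppi(w, h, diagonal_inches):
    return math.hypot(w, h) / diagonal_inches

print(round(ppi(1080, 1920, 5.5)))  # 401 -- the iPhone 6 Plus panel
```

Which matches the 401 ppi figure quoted for the 6+ elsewhere in this thread.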
Raster-based pixel manipulation applications suck in this brave new hidpi world!
I've heard 1x pixels called "device-independent pixels", "CSS pixels", and "points"; but I've never seriously heard the term "device-pixels-per-inch" until this article.
In my mind DPI == "dots per inch", PPI == "pixels per inch", and they measure the same thing (although some people frown upon using DPI in a digital context). The author redefines them to mean "device pixels per inch" and "physical pixels per inch", where PPI / devicePixelRatio == DPI. I wasn't sure if that's an OP-thing or an iOS thing.
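Under that reading, the relationship is a simple division. A sketch using the familiar Retina iPhone figures (the helper name is mine, and the "DPI" definition here is the author's, not a standard one):

```python
# Per the article's (nonstandard) definitions:
#   PPI = physical pixels per inch
#   DPI = logical (device-independent) pixels per inch
#   DPI = PPI / devicePixelRatio
def logical_dpi(physical_ppi, device_pixel_ratio):
    return physical_ppi / device_pixel_ratio

print(logical_dpi(326, 2))  # 163.0 -- the classic Retina iPhone point density
```

326 physical pixels per inch at @2x gives the 163 points-per-inch baseline iOS has used since the original iPhone.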
I don't think smoother text (and pictures) is anything to be trivialized. I mean it actually looks much much nicer and (imho) comforting - I experience joy anytime I look at anything that's above the 100 or so ppi of average 1080p monitors.
People spend huge sums of money on the looks of things they'll hardly ever look at. Considering this, spending money on a screen (i.e. something they'll literally be staring at for long periods of time) is actually a borderline-rational choice in comparison.
I experience joy when I can go online and check my emails when I'm in the bus or outside.
What I'm getting at is, priorities may vary. You might care about the portability of checking email outside; I'd rather have crisp text because I'm working with it all day inside.
If there's more resolution than the display (or the observer's eye) can support, the extra resolution can be used to make diagonal lines look better (as one example), through a sophisticated drawing method that eliminates jaggies. I'm not saying this is always true, but it's on the list of options to exploit the extra resolution.
> I mean it increases battery usage, it costs more, and all you get is smoother text and smoother pictures on a less than 5 inch screen.
That's pretty much it. But consider that people who buy Apple phones know they're not buying the least expensive, most cost-effective devices, so maybe they expect to get something not present in the bargain phones, like resolution that no one can possibly notice. Sort of like having a refrigerator whose light stays on when the door is closed.
There's no brilliant magic golden ratio formula behind Apple choosing 1080p. They simply wanted to choose a "regular" resolution for once, while still being bigger than 326 PPI.
If 720p had been more than 326 PPI on the 4.7" one, they would have chosen that, too. The reason they didn't is that 720p is less than 326 PPI, and they didn't want a thousand headlines out there saying "iPhone 6 has lower pixel density than previous iPhones".
In my opinion they should've done it anyway, and put an end to the odd resolutions Apple keeps using just because they didn't have the foresight to add resolution scaling to iOS early on; they never imagined they'd be building a mobile device that isn't 3.5" in size. They messed up the "perfect scaling" for developers with 1080p on the Plus anyway. They could've done it on the smaller one too. It's silly that they didn't.