
Retina display Macs, iPads, and HiDPI: Doing the Math - tuhin
http://www.tuaw.com/2012/03/01/retina-display-macs-ipads-and-hidpi-doing-the-math/
======
ebbv
Mostly a good article and I agree with his main points, but I have two big
disagreements.

\- Viewing distances. I only sit about 16-18" away from my 13" MBP screen and
only about 24" from my 24" display. This varies obviously as I don't sit in a
locked position all day, but I think he's erring a bit too high on estimated
view distance, which means his necessary resolution to reach "retina" level is
too low.

\- Screen size. Right now the 13" MBP I'm staring at has a very significant
bezel that I would like to see mostly go away in an upcoming model refresh.
The iPad's bezel makes sense since it's meant to be held in the hand. The MBP
only needs enough bezel to fit the camera up top and needs none on the sides
or bottom of the screen.

But yes, his overall point that Apple does not need to go so far as screen
doubling on laptops and desktops to achieve pixels that are indistinguishable
to the human eye is correct. I just think the resolution at which that point
is reached on laptops and desktops is a bit higher than what he's calculated.
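For what it's worth, the arithmetic behind this disagreement is easy to check: a display reaches "retina" territory once a single pixel subtends less than about one arcminute of visual angle, so the required pixel density is purely a function of viewing distance. A rough Python sketch (the one-arcminute threshold is the usual rule of thumb, not a figure from the article):

```python
import math

def retina_ppi(distance_inches):
    """Pixel density at which one pixel subtends 1 arcminute at this distance."""
    one_arcminute = math.radians(1 / 60)                 # ~0.000291 rad
    pixel_size = distance_inches * math.tan(one_arcminute)
    return 1 / pixel_size

# Closer viewing distances demand noticeably higher densities:
print(round(retina_ppi(18)))  # 13" MBP viewed at 16-18" -> ~191 ppi
print(round(retina_ppi(24)))  # 24" display viewed at 24" -> ~143 ppi
```

Shrinking the assumed distance from 24" to 18" pushes the required density up by roughly a third, which is exactly why the estimated viewing distance dominates the whole calculation.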

~~~
penllawen
Hi ebbv, I'm the author of the post.

I've had quite a bit of feedback that I've moved the viewing distances too
far out. I measured from my own experience, but I guess I must be atypical.
It might be because I use dual monitors on the desktop (I have a 27" and a
26", so I sit back to reduce head turning) and slump when I have a laptop in
my lap.

Anyway, I've expanded the spreadsheet that goes with the post to include some
extra settings for closer distances and for your specific distances and
devices.

Hope this helps!

Here's the spreadsheet:
[https://docs.google.com/spreadsheet/pub?key=0Aq8W2-V7OXqfdGV...](https://docs.google.com/spreadsheet/pub?key=0Aq8W2-V7OXqfdGV3OFJ5R1RxOHJjMFRfYW5VbThORXc&output=html)


------
robomartin
The landscape is getting a little messy for app developers. Not saying this is
bad. It's just a fact.

Today you have to deliver .png, @2x.png and ~ipad.png image sets with your
app. And there is no off-the-shelf way to reuse @2x images on the iPad, even
though in most cases they'd work just fine. You can, but it requires creative
coding.

Still, this results in app packages bloated with image assets in triplicate,
and soon a fourth version will be added.

If you build a universal app it seems that even someone downloading your app
onto an iPod Touch is going to end-up with @2X, ~ipad and ~ipad2X (or
whatever) images that the app will never use.

Maybe this is the beginning of the end of the universal app?
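For context, the suffix scheme in play here composes a scale modifier and a device modifier into the filename. A toy Python model of that naming convention (the resolver function is mine for illustration; only the @2x/~ipad suffixes are Apple's):

```python
def asset_filename(base, scale=1, idiom="iphone"):
    """Compose an iOS-style image filename: scale modifier, then device modifier."""
    name = base
    if scale == 2:
        name += "@2x"      # high-resolution variant
    if idiom == "ipad":
        name += "~ipad"    # device-specific variant
    return name + ".png"

# The variants a universal app ends up shipping for every single image:
for scale in (1, 2):
    for idiom in ("iphone", "ipad"):
        print(asset_filename("icon", scale, idiom))
```

Every cell of that scale-by-idiom matrix is a separate file in the bundle, which is where the bloat comes from.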

~~~
AceJohnny2
Maybe it marks a new rise in using vector image formats? (note: I am not an
iOS developer and do not know the level of support iOS provides natively for
vector formats)

~~~
coob
You can render PDFs as UIImages[1], there are categories around that make this
easy[2].

However, this is never going to be as cheap as loading a pre-converted PNG
(which Apple's modified pngcrush produces for you). I think a lot of devs have
the draw/vector vs. precomposed bitmap tradeoff the wrong way round.

Drawing all of your gradated UIButtons with CoreGraphics methods is a false
economy compared to just loading a stretchable PNG. Almost all of Apple's UI
system imagery is bitmap based, and for a good reason.

[1] <http://mattgemmell.com/2012/02/10/using-pdf-images-in-ios-apps/>

[2] <https://github.com/mindbrix/UIImage-PDF>

------
seanalltogether
"Retina Display" should have a very clear definition. It should refer to
resolution at which anti-aliasing becomes unnecessary. Anti-aliasing is
rightly classified as a hack placed on top of modern drawing systems, and one
of the reasons that many games don't support it out of the box. With a high
enough resolution, anti-aliasing technology will become irrelevant.

This is going to be different for each resolution depending on the distance
you view it at. I built a quick image you can test this with:
<http://dl.dropbox.com/u/1437645/alias.html>. Put it on your phone or desktop
and see how far you have to step back before the aliasing effect disappears.

------
alok-g
From OP: "makes a solid argument for why an iPad retina display must be pixel-
doubled -- i.e. 2048×1536 -- and not some intermediate resolution (just as was
the case for the iPhone 4 before it). Anything else means every single
existing app either has to re-scale art assets -- resulting in a fuzzy display
-- or let them appear at a different size on-screen -- resulting in usability
problems as the tap targets are resized. This is because every single existing
iPad app is hard-coded to run full screen in 1024×768."

Doesn't this suggest that Apple is getting bitten by backward compatibility
with its mass of pre-existing apps, just as Microsoft got stuck with the mass
of existing software running on Windows (and, for that matter, with users who
get too used to existing UIs/UX)?

~~~
lloeki
Not exactly. This theoretically only pertains to bitmap UI elements (mostly
icons). Sadly, bitmaps are everywhere, notably because they're so cheap to
create and render compared to vectors, and more computation implies more
energy use. I suppose one could generate a cache of rasterized vector UI
elements to cut down on subsequent rendering.

Apple has tried to bring resolution independence to Mac OS X for quite some
time, and in all honesty it worked well... for vector stuff. It broke in
varying ways across iterations every time a bitmap was involved; those were at
best either blurry or unscaled. There's no miracle here. Unless you generate
bitmaps at numerous sizes (as in icns files, where they range from 16x16 to
512x512 and are downsampled when scaling is needed, like on the Dock),
initially small bitmaps will just look bad unless you use a 2x factor. And
with a 2x factor you at best get no improvement (but no loss either) over a
non-2x screen, or an uncanny effect when a 'fat pixel' bitmap stands next to a
'thin pixel' vector curve. Anyway, as robomartin noted[0], things are
sufficiently bloated already without including full-scale 16->512 bitmap sets.

What's more, 2x is computationally way simpler and much less costly for
everyone. The only non-hackish, seriously viable alternative is to go all the
way vectorized. A typical case of 'less is more'/'worse is better', if you ask
me.

[0] <http://news.ycombinator.com/item?id=3658369>
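The icns approach mentioned above — store several sizes, then pick the nearest stored bitmap at or above the requested size and downsample — could be sketched like this (the selection function is illustrative, not Apple's actual code):

```python
ICNS_SIZES = [16, 32, 128, 256, 512]  # sizes typically stored in an icns file

def best_source_size(requested, available=ICNS_SIZES):
    """Pick the smallest stored bitmap at or above the requested size, so
    scaling is always a downsample; fall back to the largest stored size."""
    candidates = [s for s in available if s >= requested]
    return min(candidates) if candidates else max(available)

print(best_source_size(48))   # -> 128, downsampled to 48 (looks fine)
print(best_source_size(512))  # -> 512, used as-is
print(best_source_size(600))  # -> 512, upscaled (the bad, blurry case)
```

Downsampling a larger source is forgiving; the only case that degrades is asking for more pixels than the largest stored bitmap has, which is exactly the situation a higher-density screen creates.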

------
spitfire
This is a very good post, and one I've thought about in a different context
before: how much bandwidth/resolution do we really need?

The human senses have an upper limit of resolution; once we reach that limit,
further progress is irrelevant. So once everyone is streaming Netflix at limit
x2, where does further bandwidth/storage demand come from? Growing
populations? There's a limit to that growth. "Big data"? Hardly.

We're rapidly approaching the point where individuals' need for further
storage is exhausted. I think it'll be somewhere in the 10-100PB range. Which
is pretty damn close.

~~~
OpieCunningham
How much data does a 2 hour immersive holographic "film" consume?

~~~
a-priori
Assuming the holographic system uses voxels, it would need to be capable of
displaying 108,900 (330x330) voxels per cubic inch to match the resolution of
an iPhone 4 along a flat surface. This would allow Retina-quality images when
viewed from at least 11 inches away.

Assuming also that one voxel is 32 bits and not compressed, then the hologram
would be 435,600 bytes per frame per cubic inch. At 24fps (you did say "film")
that's 10,454,400 bytes per second per cubic inch.

Let's say it projects a hologram to fill a room the dimensions of a Star Trek-
style holodeck, a cube of maybe 10 metres on a side. That's about 400 inches
on a side, or 64,000,000 cubic inches. That means that a holographic film
would be 669,081,600,000,000 bytes (608.5 terabytes) per second.

So, to answer your question, a 2 hour holographic film would be
4,817,387,520,000,000,000 bytes (4.178 exabytes) in size.
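The arithmetic holds up under those stated assumptions; spelled out in Python:

```python
# Same assumptions as above: 330x330 voxels per cubic inch, 32-bit
# uncompressed voxels, 24 fps, a 400-inch (~10 m) cube, 2-hour film.
voxels_per_in3 = 330 * 330                       # 108,900
bytes_per_frame_in3 = voxels_per_in3 * 4         # 435,600 B/frame per cubic inch
bytes_per_sec_in3 = bytes_per_frame_in3 * 24     # 10,454,400 B/s per cubic inch
volume_in3 = 400 ** 3                            # 64,000,000 cubic inches
bytes_per_sec = bytes_per_sec_in3 * volume_in3   # 669,081,600,000,000 B/s
film_bytes = bytes_per_sec * 2 * 60 * 60         # 2-hour film

print(f"{bytes_per_sec:,} B/s")
print(f"{film_bytes:,} bytes = {film_bytes / 2**60:.3f} EiB")
```

(The 608.5 "terabytes" and 4.178 "exabytes" figures are binary units, TiB and EiB; in decimal terms it's 669 TB/s and 4.8 EB.)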

~~~
swalsh
Fortunately, I'd be willing to bet you could use some kind of occlusion
culling to compress that stream quite a bit.

~~~
a-priori
Yup, there's surely a ton of ways to compress that. Occlusion culling would be
one, since you'd never see the insides of objects. For another, modulo
atmospheric effects like fog and smoke, there'd be a lot of empty space
between objects that even run-length encoding could compact quite a bit. I'm
sure that even with lossless compression you could get it down to the petabyte
range.
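As a toy illustration of how far run-length encoding alone gets you on mostly-empty space (the frame data is made up, purely to show the idea):

```python
from itertools import groupby

def rle(stream):
    """Run-length encode a flat voxel stream into (value, count) pairs."""
    return [(value, len(list(run))) for value, run in groupby(stream)]

# A scanline that is mostly empty space around one small object:
scanline = [0] * 1000 + [7, 7, 7] + [0] * 500
encoded = rle(scanline)
print(encoded)                                    # [(0, 1000), (7, 3), (0, 500)]
print(len(scanline), "->", len(encoded) * 2)      # 1503 values -> 6 values
```

Three runs replace 1,503 raw voxels; real volumetric codecs would do far better with octrees and inter-frame prediction, but even this naive pass collapses empty space almost entirely.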

------
joejohnson
This is a great post. If an iPad 3 has the "retina" display with resolution
2048×1536, I still think that a similarly high-resolution display will follow
soon after for the soon-to-be-unified MacBook Pro/MacBook Air lines. If these
high-resolution screens are available at 9.7" for the iPad, it's hardly a
stretch to manufacture them at 13" with a slightly reduced DPI.
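The DPI arithmetic supports this: the same 2048x1536 pixel count stretched to a 13.3" diagonal still lands around 190 ppi (my numbers, not from the post):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2048, 1536, 9.7)))   # iPad-sized panel: ~264 ppi
print(round(ppi(2048, 1536, 13.3)))  # 13" laptop-sized panel: ~192 ppi
```

192 ppi at a laptop's longer viewing distance is comfortably in the range the article calls retina-grade, which is the "slightly reduced DPI" being referred to.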

------
wmf
There's an elephant hiding behind "UI elements would look proportionally
larger": such a non-doubled Retina display _would show less information_ than
the old non-Retina one. This makes perfect sense in the Lion (and Metro and
Unity) single-window world, but not for people who do work.

------
crgt
Is anyone else wondering about speed/performance issues when scaling up iPad
graphics like this? Apple must have a hot new (A6?) chip up its sleeve that
will enable performance on par with what you get from an iPad 2 now while also
smoothly handling the larger images necessary for an iPad retina display. If
not, what's the point outside of HD movies? Who cares about retina display if
graphically intensive apps all chug? Or if devs have to use non-high-res
images to preserve performance? It will be interesting to see how it shakes
out.

Here's hoping for an awesome chip that will make all of the graphics
production rework worth it..

~~~
Drbble
Well, static images need hi-res more than dynamic video does, so animations
could run at lower resolution and be scaled up with no degradation of the user
experience.

------
ewanmcteagle
This can't be correct when it comes to the larger screens. I found the text on
the iPhone 3G to be fuzzy but not on the iPhone 4. I don't like Apple's
antialiasing but on the iPhone 4 it doesn't matter anymore. Antialiasing
becomes irrelevant. However, my 17-inch MacBook still has text that looks
fuzzy to me, and the antialiasing is still bothersome. I'm really hoping that
there is some increase in resolution for the larger displays coming.

