I inspected the 3rd gen iPad image, trying to figure out roughly how much gap there is between rows, because it looks unexpectedly large. It looks like the pixel-to-gap ratio is about 9:5. This would seem to indicate that the screen is 35.7% black.
Obviously, the eye doesn't perceive this large of a gap at no magnification, but still, I was surprised at how much of a gap there is between rows.
(Contrast this to the iPhone 4S, which is about 9:3.6, or 28.6% black.)
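For anyone who wants to check the arithmetic, here's a quick sketch. It assumes the ratios above are measured along a single axis, so the black fraction is simply gap / (pixel + gap):

```python
def black_fraction(pixel: float, gap: float) -> float:
    """Fraction of the row pitch occupied by the black gap."""
    return gap / (pixel + gap)

# iPad 3: pixel-to-gap ratio of about 9:5
print(round(black_fraction(9, 5) * 100, 1))    # 35.7
# iPhone 4S: about 9:3.6
print(round(black_fraction(9, 3.6) * 100, 1))  # 28.6
```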
But it's also a power efficiency issue. The black areas cover up backlight that would otherwise show through as useful illumination. There was a discussion about this yesterday in the iPad teardown thread: I was surprised by the apparent power consumption, and this was posited as one of the reasons.
It's painful to see that some people have OpenPandoras, and use them to take reference images to show differences in LCD (and OLED) screen properties.
Me, I just want to hold it. Smell it. Feel it. I'm sure I will too, in another 2 months from now.
(Explanation: the OpenPandora (http://openpandora.org/) is a custom portable QWERTY Linux clamshell ultra-console. Or it was, three years ago when I ordered mine. Still waiting for delivery, it's a very small operation and they've had some hiccups along the way.)
Man, this guy's got a lot of nice toys. What are the gaps between pixels for? Circuitry? Or does it have something to do with the amount of light that needs to be transmitted vs. the horizontal/vertical resolution of the eye needed to represent a square pixel?
Each subpixel needs a transistor (hence TFT: thin-film transistor) to drive it, along with addressing conductors. I suspect the transistors have to have a certain minimum size to be reliable enough, so their size relative to the whole pixel grows as pixel size decreases. I'm sure they'd make them smaller if they could, since the black areas absorb a lot of precious backlight energy. I also recall reading about progress toward making the transistors (which are typically silicon-on-glass) transparent, but I guess that tech isn't ready yet.
Although, I can still perceive the individual pixels if I look closely enough. This is what I expected, because I often hold the iPad at about the same distance as an iPhone, and the iPad still has a lower resolution (264 ppi vs. the iPhone's 326). It's not yet this "device that hides its technology in plain sight beyond any human capability and therefore appears magical". But it's very, very beautiful anyway.
Here's hoping that Apple will someday produce a 3x display with 396ppi :-D
So I've been comparing my ipad3 to my ipad1. (I didn't get an ipad2 because I only wanted the higher-density screen.)
Back in 1999 I saw a technology demonstrator at IBM's Almaden facility in San Jose where they showed a 200 ppi LCD display. It blew me away. The scientist who worked on it talked about eye strain, and how 200 ppi was the 'magic' spot where more than 75% of people stopped noticing the strain of making lines out of pixels.
So after unpacking my ipad3, one of the first things I did was download the Economist app (my favorite, btw) and bring up the latest issue on both the ipad1 and ipad3. Side by side, the difference is very visible; I can absolutely distinguish between the ipad1 and 3 with p < 0.01 :-) More interestingly, comparing the ipad3 version to the print magazine, it's clear that the ipad3 is pretty much identical from my eye's perspective (disclaimer: I do wear glasses, no 20/10 vision here).
Now if I could convince folks that send me catalogs to offer an ipad version I'll be in good shape.
There's a big spread in visual acuity. There are two big factors: receptor density on the retina and near-field focus ability. The former doesn't vary too much, I'm told, but the latter can be 3-4x better in some people than in others. I can focus reliably on a high-contrast object at about 6" or so. Some people can apparently do it at 3-4". And of course it gets worse as we age; far-sighted adults have trouble reading even at arm's length.
I can certainly believe that. I'm a long way from perfectly sighted, but I'm apparently close to average once corrected. The fact remains, though, that the pixel density is comparable to phones released a couple of years ago, and the distance between 'retina' and 'non-retina' is mere inches.
It's the old audiophile argument, I guess: we'll never know for sure without double-blind testing. I'd like to give that a trial: sit people 10" from a black-masked screen with a small rectangle in it, display a pure white image at the lowest brightness, and see what they report.
.. Apple’s definition of a “Retina Display” is actually for 20/20 Vision (defined as 1 arc-minute visual acuity). However, 20/20 Vision is just the legal definition of “Normal Vision,” which is at the lower end of true normal vision. There are in fact lots of people with much better than 20/20 Vision, and for almost everyone visual acuity is actually limited by blurring due to imperfections of the lens in the eye. The best human vision is about 20/10 Vision, twice as good as 20/20 Vision, and that is what corresponds to the true acuity of the Retina. So to be an actual “True Retina Display” a screen needs at least 573 ppi at 12 inches viewing distance or 458 ppi at 15 inches. The 326 ppi iPhone 4 is a 20/20 Vision display if it is viewed from 10.5 inches or more. ..
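The geometry behind those numbers is easy to reproduce: a display qualifies at a given acuity and distance when one pixel subtends no more than that visual angle. A small-angle sketch (viewing distance in inches, acuity in arc-minutes):

```python
import math

def required_ppi(distance_in: float, acuity_arcmin: float = 1.0) -> float:
    """PPI at which one pixel subtends the given visual angle at that distance."""
    theta = math.radians(acuity_arcmin / 60.0)  # arc-minutes -> radians
    return 1.0 / (distance_in * math.tan(theta))

print(round(required_ppi(10.5, 1.0)))  # 327: ~iPhone 4 at 10.5", 20/20 vision
print(round(required_ppi(12.0, 0.5)))  # 573: 20/10 vision at 12"
print(round(required_ppi(15.0, 0.5)))  # 458: 20/10 vision at 15"
```

This matches the figures in the quote, so the quoted claim is at least internally consistent.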
Apple doesn't do that kind of stuff. Most people will not be able to see the pixels (at their comfortable viewing distances). That’s good enough for them (also considering they are streets ahead of everyone else in the pixels department).
3x would cause antialiasing artifacts for apps with images optimized for 2x retina displays. Apple decided to wait for pixels to be 4x smaller before jumping to a higher resolution, so they must think these artifacts are unacceptable. Therefore, the logical next step is 4x displays with 528ppi :)
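A toy illustration of why 2x-to-3x is ugly while 2x-to-4x isn't (the numbers are just illustrative): look at where the edges of the asset's pixels land on the device's pixel grid after scaling.

```python
def edge_positions(asset_px: int, factor: float) -> list:
    """Device-pixel coordinates of each asset-pixel boundary after scaling."""
    return [i * factor for i in range(asset_px + 1)]

# A 2x asset drawn on a 3x screen: every asset pixel is scaled by 3/2, so half
# the edges land between device pixels and must be resampled (i.e. blurred).
print(edge_positions(4, 3 / 2))  # [0.0, 1.5, 3.0, 4.5, 6.0]
# The same asset on a 4x screen: factor 2, every edge on an integer boundary.
print(edge_positions(4, 4 / 2))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```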
I think there will always be room for e-ink. This afternoon, I took an hour off to sit on the balcony and read a bit. It was so bright outside that I was barely able to read anything on the iPad. Switched to the Kindle, worked perfectly well.
I think it's a technology that fills a niche that LCDs or OLEDs won't fill for a very long time, if ever.
I don't think so. Tablets are for people that want to get the most possibilities out of their $300. e-ink readers are for people with $300 that want to read books with the least eyestrain. I read on my phone and it just doesn't feel like reading. When I use my Kindle, it does.
(I think the problem is actually with page turns rather than the pixels; turning pages on the Kindle Android app is very flaky. On the physical hardware, it works much better.)
It would affect subpixel rendering of fonts, but I think the iPad doesn't do any subpixel rendering. Not 100% sure though.
It does create another strange effect, though. Since the screen effectively redraws sideways if you hold the iPad in portrait, scrolling vertically quickly makes stuff look a tiny bit distorted; horizontal lines look a bit slanted (like taking a picture out of a car using a cellphone camera, but much less so).
You're right about not doing subpixel anti-aliasing. It would need to change the direction of the subpixel anti-aliasing on a switch from landscape to portrait, which would presumably lead to strange visual effects.
It's not tearing, it's just a slightly slanted picture, because the lines on the left (or right, depending on how you hold the iPad) redraw a tiny bit earlier than the ones on the other side of the screen.
I find it fascinating that the human eye can pick up on that kind of effect, considering the screen is driven at 60 Hz (16.7 ms per frame). I haven't noticed it on the first-gen iPad, but it suddenly becomes apparent when turning some desktop screens into portrait mode. There must be something about horizontal movement that the eye picks up on. Either that, or we're completely desensitised to vertical refresh because all the screens we grew up with refresh that way.
I can only assume the new iPad's LCD is more inert in some way? Smaller pixels, slower crystals?
I think one of the reasons why we don't notice it on desktops is because there are so few vertical lines. Basically, it's the window edges where it might be noticeable, when dragging windows sideways.
On the iPad, on the other hand, there are tons of horizontal lines in a table view, for example.
Also, I thought about the 60 Hz thing, and I think high frame rate is a bit of a red herring. After all, the lines actually are slanted; the higher the frame rate, the fewer degrees they're slanted at the same scroll speed, but whether we notice the slanting is independent of the frame rate.
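A rough sketch of that slant, assuming the scan sweeps the full panel width in one refresh interval; the 1536 px width and 1000 px/s scroll speed below are just illustrative numbers, not measurements:

```python
import math

def slant_degrees(scroll_px_per_s: float, width_px: int, refresh_hz: float = 60.0) -> float:
    """Apparent slant of a horizontal line when the scan sweeps sideways across
    width_px during one refresh while the content scrolls vertically."""
    frame_time = 1.0 / refresh_hz
    vertical_shift = scroll_px_per_s * frame_time  # px the content moves per sweep
    return math.degrees(math.atan2(vertical_shift, width_px))

# e.g. scrolling at 1000 px/s on a 1536-px-wide panel at 60 Hz
print(round(slant_degrees(1000, 1536), 2))  # 0.62 degrees
```

Which also shows the frame-rate dependence: doubling the refresh rate halves the per-sweep shift, and so roughly halves the slant at the same scroll speed.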
You're right, I'm able to make out the effect on my desktop screen when dragging windows from side to side. I guess we only scroll vertically most of the time, not horizontally, so it doesn't come up. That suddenly is no longer true when you rotate to portrait orientation.
Any chance you have a Motorola Droid (1st one) hanging around? I'd love to see how that screen compares. It was near-retina, at least by the look of it, and I think it was AMOLED (since you mentioned the Nexus One).
The microscope only shows a sharp picture at a specific distance; as soon as you tilt something, most of the picture becomes so blurred that you can't see much anymore. And there's not enough space between the lens and the focus point to really tilt the device much, anyway. Sorry!
That's not true at all. Maybe for the given shape of the pixels on these displays, but as a general statement, this is false.
If you have red, green, and blue subpixels that have exactly a 3:1 height-to-width ratio -- thus forming a perfect 1:1 square -- then you could have zero black space on all sides, and the pixels would appear perfectly square.
What's funny about this is just how distorted the pixels are. Ever since I've understood what displays were (and CRTs before that), I've always imagined an idealized array of perfectly round, very tightly packed red, green, and blue circles as the pixels. Even looking close (not under a microscope), they look like a grid of squares... not these rectangular things with lots of space between them.
They seem to be getting more distorted over time. Maybe when they're so small it's less of an issue. I'm surprised at how the red and blue subpixels on the Kindle Fire are so much bigger than the green. The eye is most sensitive to green... yet on the PSP Vita the blue seems to be the smallest of the subpixels (meaning the blue subpixels will have to put out more light to compensate, or the screen will have a green/red color cast).
The OpenPandora shot doesn't look sharp because it has an anti-glare coating. Some of the others may be a bit too bright for the cheap microscope I'm using, so they're a bit blurry. But mostly, the pixels really do look that way; most of them aren't rectangular at all.
Thanks for the pics! Since you have these devices and can simply count up the number of pixels and divide by the size of the display, would you please consider photoshopping in scale bars to reflect the length of some unit of measure?
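The conversion for such a scale bar is straightforward; here's a sketch (the 264 ppi panel and the photo dimensions below are hypothetical examples, not taken from the actual images):

```python
def scale_bar_px(ppi: float, display_px_in_photo: float,
                 photo_width_px: int, bar_um: float = 100.0) -> float:
    """Length in photo pixels of a scale bar that is bar_um micrometres long.

    ppi: the device's pixel density (count pixels, divide by screen size)
    display_px_in_photo: how many display pixels span the photo's width
    """
    pitch_um = 25400.0 / ppi                       # one display pixel in um
    um_per_photo_px = display_px_in_photo * pitch_um / photo_width_px
    return bar_um / um_per_photo_px

# e.g. a 264-ppi iPad panel with 10 display pixels across a 1000-px-wide photo:
print(round(scale_bar_px(264.0, 10, 1000)))  # a 100 um bar is ~104 photo pixels
```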
No, that's how PenTile screens work. There are twice as many green subpixels as red or blue ones (one "visual" pixel is made up of a green and a red subpixel, or a green and a blue subpixel, so there are two subpixels per pixel). Thus, the green subpixels are smaller.
Have you actually seen a Galaxy Nexus display in person? It has none of the problems described in the "horrible PenTile OLEDs" link. I've seen an iPhone 4 display, and I can tell you it is worse than the one on the Galaxy Nexus.
I just picked one up and I'm kind of disappointed. There are 3-4 specks of dirt under the glass, and with the exception of the amazing display, I'm not sure it's worth the upgrade from the iPad 2. Probably going to return it, and either get my money back or get a device that has a clean display.