This article reminds me just how much Apple has driven technology forward.
For clarification for Android lovers... This doesn't mean that Apple invented all of this. It's a recognition that without someone driving expectations of the smartphone industry, it stagnates - Apple is the driver.
For clarification for Apple lovers... This doesn't mean that Apple invented all of this. It's a recognition that without someone driving expectations of the smartphone industry, it stagnates - Apple is the driver.
The upcoming Nokia 920 is expected to have a display even better than this. At least that is what they claim. It will be interesting to see it validated by Anand.
I'm not sure I crave sRGB gamut and perfect color accuracy in a smartphone. Isn't it more important that it has eye-popping contrast? When was the last time you needed sRGB? Compare that to how often you look at the phone in bright ambient light or with reflections. It seems that Nokia is much more focused on the relevant innovation here:
I believe you are right that contrast is more important than both gamut and color accuracy, but a higher gamut will make images feel more alive and give them more "pop", especially in the reds and yellows (as the article discusses).
A high contrast ratio helps prevent banding artifacts in dark shades of gray.
A high gamut does the same for color.
With a high-contrast display that has a low gamut (a display that can't output a very wide range of colors), you might even see banding in reds or yellows. The brain will compensate for a lower gamut, but the image will look more washed out. As an example, a color photo printed in a newspaper is clearly less vibrant than the same image in a glossy magazine or online.
Also, color accuracy and gamut are not necessarily the same thing. The typical human eye can see a far wider range of colors - a wider gamut - than is possible in any display or print technology. Displays are a compromise. sRGB covers far less than the human eye can see, which is why image editors work in the wider ProPhoto RGB or Adobe RGB color spaces and then, after editing, cleverly compress down to sRGB (or, for particular applications, the exact color space of a particular printer/paper combination or of a particular display).
A low gamut means some colors are missing and get compressed to colors that can be shown - so the richest red becomes an orangish red. While inaccurate, this isn't quite the same thing as color-shifting all the colors in an image, such as the yellow cast you might see under indoor lighting or the blue cast in outdoor shadows. All displays introduce color-shifting inaccuracies as well. High-gamut displays have a wider color palette and are generally tuned better at the factory.
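To make that compression concrete, here's a rough sketch in Python/NumPy (the color values are invented for illustration, and real gamut mapping is smarter than a plain clamp): two different out-of-gamut "super reds", written in sRGB coordinates, collapse to the same displayable color once each channel is clamped into [0, 1] - which is the banding in saturated reds described above.

    import numpy as np

    def clip_to_gamut(rgb):
        # Naive gamut mapping: clamp each channel into the displayable [0, 1] range.
        return np.clip(rgb, 0.0, 1.0)

    # Two visibly different saturated reds, expressed in sRGB coordinates.
    # The negative green values mean "redder than sRGB's red primary" -
    # no sRGB display can actually produce them.
    super_reds = np.array([[1.05, -0.12, 0.02],
                           [1.10, -0.20, 0.02]])

    print(clip_to_gamut(super_reds))
    # Both rows become [1.0, 0.0, 0.02]: two distinct source colors map to a
    # single displayable color, and the clipped red also shifts slightly
    # toward orange relative to what was intended.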
I expect high-gamut displays will just "feel" better in the same way retina / high-res displays do.
Does anyone know if there is a way in iOS to apply a color profile to the display? Even if the displays are well-calibrated for accuracy out of the box, there are other reasons to mess with the color rendering (colorblind users, etc.).
What I'd like to see is this exact analysis done on the S III, even just to have some no-bullshit data when the 'my phone is better than your phone' argument comes up again.
Talking purely from a consumer perspective: you want to argue that one phone reproduces the full RGB spectrum accurately and another blows it out of proportion, and that makes one phone better? Isn't that a matter of subjective taste? How does that make one phone better than another? For example, I may like to turn the color all the way up on my TV. That doesn't mean my TV's picture is objectively bad, or the other way around.
From my perspective, the issue with selling a phone that is over-saturated versus one that is accurate is that in one case a company is just shipping something that meets the standard, and in the other it is making a subjective choice about what looks better for you.
With monitors and displays this isn't as big a deal, since you can adjust them yourself, but smartphones and tablets offer few or no adjustments, so if you receive a display that is purposely over-saturated, you are stuck with it. Perhaps it catches eyeballs in the store and wins more sales, but you are still stuck with it. If a company isn't going to provide a way to adjust the display (such as including presets that are calibrated, vivid, and so on), then I'd prefer they just ship something that tries to conform to the standards that exist (sRGB or Adobe RGB in these cases).
Color accuracy is certainly not the defining aspect of a product's usability, but it is a far better metric than something like clock speed - which almost every manufacturer feels compelled to mention when comparing their phone to Apple's newest incarnation.
You can look at the results at different saturation levels for the iPhone 5 and see that all the intermediate saturations are almost spot on. It's not over-saturated (which the Galaxy S III appears to be), but accurately saturated, so that what you see on the screen reflects what is in the source content.
Cranking up saturation in software is likely to introduce artifacts due to the precision of the data being sent to the panel. For example, most framebuffers are still represented with at most 24 (color) bits per pixel, and the signals being sent from the GPU to the display may not have much more precision than that either (how much precision they have depends on a variety of factors). The immediate consequences of this:
User-mode software is unlikely to be able to increase saturation without hardware assistance, because it ultimately has to send 24bpp pixel data to the GPU. The dynamic range possible in 24bpp is pretty limited when you factor in what happens to it after it goes through the rest of the pipeline.
Kernel-mode code, drivers, and firmware on a GPU can potentially adjust saturation more accurately by sending higher-precision color values out to the display, if it supports it. Not many displays today support higher than 24bpp precision, but such displays do exist. Unfortunately, the inputs from user-mode software were 24bpp, so some accuracy has already been lost and any lost accuracy will be compounded by saturation changes produced at this level.
Finally, the display itself may actually be reducing the quality of the input signal - many panels have less than 8 bits per channel (24bpp across 3 channels) of precision and represent higher-accuracy signals with dithering and other techniques. This can make the result of adjusting saturation in software even worse.
So, in practice, adjusting saturation in software will produce visible artifacts for images that need high dynamic range or high precision.
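As a rough illustration of that last point, here is a minimal Python/NumPy sketch (not how any actual GPU, driver, or panel does it) of boosting saturation on 8-bit-per-channel data: the dominant channel clips at 255 and everything is re-quantized back to 256 levels, so detail in already-saturated regions gets flattened.

    import numpy as np

    def boost_saturation(rgb_u8, factor):
        # Push each pixel away from its gray value by `factor`, then force the
        # result back into 8 bits per channel - the only precision available.
        rgb = rgb_u8.astype(np.float32)
        gray = rgb.mean(axis=-1, keepdims=True)    # crude luma stand-in
        boosted = gray + (rgb - gray) * factor     # scale the chroma away from gray
        return np.clip(np.rint(boosted), 0, 255).astype(np.uint8)  # clip, round, re-quantize

    # Two distinct, already-saturated reds...
    pixels = np.array([[200, 40, 40],
                       [210, 35, 35]], dtype=np.uint8)

    print(boost_saturation(pixels, 1.6))
    # -> [[255, 8, 8], [255, 0, 0]]: the red channel clips at 255 in both
    # pixels, so the gradient in the brightest channel is gone - visible as
    # flat patches or banding in saturated areas of an image.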
And therefore no one will find it interesting? I'm not that interested, so I didn't upvote the article, but I don't see any problem with others voting it up and me seeing a link to it on HN.