It will take into account how close to the monitor you are, which is very relevant. A low resolution monitor at a distance can look as good as a high resolution one close up.
Given that there is a PPD limit to human vision, I think this is useful for proving to yourself whether a given TV or monitor upgrade will actually be noticeable.
I bet a lot of people will be shocked to find that they’re not seeing any benefit of 4K over 1080, and almost no one will have a setup putting them close enough to a big enough TV to justify 8K.
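As a rough sanity check, here is a back-of-the-envelope Python sketch (the 65" size, 8 foot distance, 16:9 aspect, and the roughly 60 PPD 20/20 benchmark are all just illustrative assumptions, not anyone's actual setup). It puts 1080p right around the acuity limit for that setup and 4K well beyond it:

    import math

    def avg_ppd(h_pixels, diag_in, dist_in, aspect=16/9):
        # screen width from the diagonal and aspect ratio
        width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)
        # horizontal field of view in degrees, then average pixels per degree
        fov_deg = 2 * math.degrees(math.atan(width_in / (2 * dist_in)))
        return h_pixels / fov_deg

    # hypothetical living room: 65" 16:9 TV viewed from 8 feet (96 inches)
    print(round(avg_ppd(1920, 65, 96)))  # ~58 PPD, right around a ~60 PPD 20/20 benchmark
    print(round(avg_ppd(3840, 65, 96)))  # ~117 PPD, well past what most eyes resolve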
That is only true, and just barely, for video. For any high-contrast, detailed elements like text, there is a significant difference between 1080 and 4K.
Anyone doubting this need only run a given display rotated 90 degrees.
Regarding video, age plays a big role too.
A user under 30 may claim to benefit from 8K video, and that is assuming the source material even delivers that resolution in the first place.
Viewers over 30 generally get less out of 8K the older they get.
And even assuming high acuity, there is still the question of whether they enjoy the video more.
For me, though I am aged now, I learned a long time ago that my overall enjoyment peaks at standard HD resolution. It looks more than good enough.
Frankly, higher resolution material is not always better. A display can be too good.
Sometimes a movie needs to just be a movie. That is harder to do at very high resolutions. Viewers are taken out of the story when they see effects as effects.
The "soap opera" effect can be dramatic! High resolution and high frame rates trigger it more and more easily.
Human eye resolution is in units of pixels per degree, not PPI or DPI. Pixels per angle is constant regardless of viewing distance, whereas DPI has to be converted in order to understand whether a display is high or low resolution. And DPI doesn't just change depending on distance; it's also not constant across a flat display surface.
I believe you flipped things: PPD does change based on distance, DPI does not. That change with distance is a feature, since being really close to a screen does lower the visual clarity.
I guess I wasn't clear. Maybe this is one of those cases where it's hard to get the terminology right in a way that isn't prone to being misread. Both our comments, “Pixels per angle is constant regardless of viewing distance” and “PPD does change based on distance”, are making two different unstated assumptions and need clarification.
Maybe it would help to think of PPD as a measure attached to the viewer, while DPI is a measure attached to the display device or screen.
When a projector shows something at 100 PPD on a curved screen, and you make the screen bigger by moving it further away, it's still 100 PPD to the observer sitting near the projector, even though the DPI on the screen is lower. The observed resolution stays constant regardless of distance, because that's what pixels per degree is measuring: observed resolution.
When a flat monitor screen is 100 DPI, that says nothing about the observed resolution. Moving closer to or further away from a constant DPI display changes the observed resolution. I wouldn't necessarily call changing based on viewing distance a "feature". It's a byproduct of that unit of measure, but the answer to the top comment's question is that PPD is an improvement over DPI when the question is what resolution the viewer actually sees. DPI is mainly used because it's easier to measure, and it's good enough in practice if the display in question has a narrow range of sizes and typical viewing distances, like monitors often do.
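To put numbers on that, a small sketch (the 100 DPI panel and the 24 and 36 inch distances are made-up examples) showing how the PPD seen at the center of a fixed-DPI screen moves with viewing distance:

    import math

    def center_ppd(dpi, dist_in):
        # inches of screen that span one degree at the center, times pixels per inch
        inches_per_degree = 2 * dist_in * math.tan(math.radians(0.5))
        return dpi * inches_per_degree

    print(round(center_ppd(100, 24)))  # ~42 PPD at 24 inches
    print(round(center_ppd(100, 36)))  # ~63 PPD at 36 inches, same panel, same DPI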
Yes, exactly. Both are useful depending on what question is being asked. Either one will need additional conversion if you ask a different question than the measure was designed to answer.
This unit is most useful in the context of VR headsets/displays. In that context, widening the FOV through the lenses while keeping the same screen lowers the effective resolution, which shows up directly in the PPD number.
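A toy example (the 2000-pixel panel width and the FOV values are invented, not any particular headset): stretching the same panel over a wider FOV drops the PPD even though nothing about the screen changed.

    # hypothetical panel: 2000 horizontal pixels per eye, spread over different FOVs
    h_pixels = 2000
    for fov_deg in (90, 100, 110):
        print(fov_deg, round(h_pixels / fov_deg, 1))  # ~22.2, 20.0, ~18.2 PPD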
I am not anti or critical here. Really, I love units and want to understand this one better.