Let's assume someone has a 65in (165cm) TV and sits 6.5ft (~2m) away. This is a larger TV than most people have, and a closer viewing distance than most people sit. Let's also assume that the viewer has 20/12.5 (6/3.75) vision. 20/20 (6/6) is the low end of normal, and it isn't uncommon to have slightly better vision. My optometrist (my fiancée) can correct me to about 20/15, but 20/12.5 is unusually good.
The screen width is 56.65in. At this distance, that subtends an angle of about 40 degrees:
>>> import math
>>> math.degrees(2 * math.atan(56.65475464133182 / (2 * 6.5 * 12)))
39.9191...
I chose the inputs to get to that result, but my point is that viewers would need to have unusually good vision, an unusually large TV, and be sitting unusually close to hit the limits of 4k. And that's only for very high contrast images, like black text on a white background. Our eyes are less sensitive to low contrast transitions, as found in most videos. Try reading an eye chart with the letters printed in gray on a lighter gray background. It's more difficult!
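To make the arithmetic explicit, here's a rough sketch of the whole calculation. The 30 cycles/degree figure for 20/20 vision is from the source linked below; scaling it linearly for 20/12.5 vision and counting 2 pixels per cycle are my own assumptions.

    import math

    # Setup from above: 65" 16:9 TV (width ~56.65"), viewed from 6.5 ft away.
    width_in = 65 * 16 / math.hypot(16, 9)      # ~56.65 inches
    distance_in = 6.5 * 12                      # 78 inches
    angle = math.degrees(2 * math.atan(width_in / (2 * distance_in)))  # ~39.9 degrees

    # 20/20 vision resolves ~30 cycles/degree; assume acuity scales linearly,
    # so 20/12.5 gives 30 * (20 / 12.5) = 48 cycles/degree, and each cycle
    # needs 2 pixels to reproduce.
    pixels_per_degree = 2 * 30 * (20 / 12.5)    # 96

    print(round(angle * pixels_per_degree))     # ~3830, essentially UHD's 3840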
I think there are much better things we can use our limited bandwidth for than increasing resolution: we could use better compression codecs at higher bitrates, reduce or eliminate chroma subsampling, increase color depth and/or gamut for HDR content, or use higher framerates for content where that is appropriate.
EDIT: Here's a source for 20/20 = 30 cycles per degree, and 20/12.5 vision being unusually good: https://www.opt.uh.edu/onlinecoursematerials/stevenson-5320/...
The angle of the eyes may be a different matter, and I'm more skeptical that we can really tell whether people are looking at us off-angle or from a considerable distance.
On the one hand, we're certainly hyperaware of faces both in central and peripheral vision, as part of our brain is dedicated to face detection, and eyes have high contrast (black on white). And looking dead-on at someone's eyes we can infer where they're looking even from pretty far away. All of this of course serves a social / evolutionary / survival purpose.
On the other, it may be that peripherally all we can tell is that someone is facing us and has open eyes, and this leads us to have a premonition that someone is looking at us. When we look up to confirm, this may or may not be true, but we'd be more likely to remember/encode the times that they were, thus making that sense err on the side of assuming people are looking at us. Add to that our own sudden shift in the angle of our face/eyes, which may cause them to react and look our way. We also don't get confirmation of whether people far away are looking at us, or if they're looking at someone next to us. Thus we don't really know if they are.
For me, both of these situations (thinking they are looking at me and finding they aren't, and seeing people look at me in response to my looking their way) happen frequently. I'm not sure our senses are that good at off-angle 'who's staring at me' though; instead I think we're just really well optimized to assume that just in case. Our brain is phenomenal at filling in the blanks visually, and making inferences based on important survival signals like where others are paying attention.
I mean, the blur there is very slight, but still!
The error in reducing the fps to that of the human eye is that there is no guarantee the two are in sync. To guarantee I will actually see 72 fps (or x fps) as intended, the manufacturer needs to take into account the amount of time each frame is displayed, as well as the human response to changing frames on the screen during one "frame" of the eye. How many times should a pixel refresh during one observation to create a lifelike scene for the human eye?
Something analogous can be said for dpi, and the fact that we can't see a specific pixel doesn't mean that each of our eyes' "units of resolution" correspond perfectly to where a pixel is on the screen. If we are seeing the intersection of four pixels at once but only seeing subpixels of each one that don't correspond to the intended color when taken together, theoretically all the color is falsified. If we are instead seeing a number of complete pixels in each of our own visual pixels, there is less chance of this. Further, the spacing between pixels varies by manufacturer, and this can have effects on the required resolution.
The same is true of visual resolution. You only have high-resolution sight in a tiny portion of your vision; the rest is much lower resolution, and your brain uses a combination of rapid eye movements and spatial memory to fill in the blanks.
Additionally, one of the issues with perceiving a pixel is that your eyes have nothing like a pixel grid; the arrangement is pretty random. You can see this if you use a digital camera to take a photo of a screen: you get interference (moiré) patterns as the grids align and unalign.
This makes the whole equation very complicated, certainly nothing I can answer. I guess I’ll have to find an 8k/120hz display and find out.
You can also perceive very high-frequency flicker via the strobe effect. For example, if you wave your hands in front of a PWM LED or a bad fluorescent light, you can quickly tell. By moving my eyes quickly from left to right, I can easily spot the order in which my office projector flashes the colour components of its image.
It was pretty easy to set up a blind trial with 5 monitors by putting cardboard around the monitor bezel and stand, and asking people which one looks better.
Everyone, even nontechnical people such as HR and sales, can identify the 4k monitors, super reliably, at a distance waaaay beyond what optometrists claim should matter.
Of course you could argue that the experiment was confounded by backlights or color calibration or monitor age or whatever. But IMHO the result is so stark that it would easily survive even if you repeated the experiment with a bunch of identical 4k monitors and set some of them to crappy low resolutions.
I do think there are some possible edge cases where 4K might not be good enough, even in the parent's worst-case reference example. But we could have 5K or 6K. Going to 8K takes a ridiculous amount of bandwidth and processing power for... little to no gain.
I am more interested in the colour space: does this mean we get Rec 2020?
Also, frankly, the last thing I worry about every day is whether I can turn off anti-aliasing yet. Text has been comfortable on computer monitors for a long time now.
Rtings has close-up pictures of the layout of OLED screens. Those are TVs but I would think monitors are the same.
I think only phones sometimes use pentile, probably because of the pixel density needed to fit 4k on a phone-sized screen.
> Galaxy S8 5.8" Quad HD+ Super AMOLED (2960x1440) 570 ppi
"for complex objects stereo acuity is similar to visual acuity. But for vertical rods it can apparently be as low as 2 arc seconds (1/30th of an arc minute). In other words, 30x as strong as typical visual acuity."
Assuming that the author of the blog post picked 30x as representative of the most extreme cases, and if we take the root commenter's word that 4K is approximately 1x visual acuity, and if 8K is double the width and height of 4K (is that correct?), then that suggests that by the time we reach 128K resolution (32x visual acuity, >8 billion pixels) we'll have finally mastered screen resolution, which, alas, will be of little comfort to laptop users, whose screens will inevitably still be 1920x1080. :P
Plugging that into your formula we get ~5780 horizontal pixels, or about 75% of 7680 provided by 8K.
In the future it's easy to imagine going even more extreme if you look back to the past where the first mass-produced TV looked like this:
The other thing to point out is progress. I used to think 60 inch was massive but of course it’s not that big. Our field of view is incredible, and if we’re going to admit we like watching movies we may as well do the best we can to make the experience immersive.
So I'd hope the advent of 8k will also bring about 90-inch, 120-inch screens, and beyond. I think in a decade or so this will be the norm for most new TVs, so this calculation also ignores technical progress, which is good, as we certainly aren't near the maximum of an ideal cinematic experience.
I agree; I think these screen sizes must be where the industry is going. As someone who also likes movies and owns a 65" TV, I would love it if it were literally twice the size and mounted flat on/in the wall. This kind of mounting is definitely where the industry is heading, too: the highest-end TVs from LG and Samsung are ultra-thin and have the screen detached from the processing unit.
Except that the part of our field of vision our brain can actually focus on is relatively small, and cinema favors pictorial representations where it's appropriate to grasp the "whole image". What you're describing is VR, and VR tech isn't TV.
Don't forget to blow it up to 8k to see the full effect.
Bear in mind, if you're on a 65" screen you have to be at 2 feet to appreciate the image. Unrealistic for most purposes, I understand, but at 1080p you should be at 10 feet, which is about normal. 4k should be about 8 feet, which is the best viewing distance in my opinion for a theater-type room. It's how my own theater room is laid out.
8k, in my opinion, is probably as much resolution as any home theater screen will ever need. I would not turn up my nose at being able to pause the movie and step up to the screen to see some details. I already do that now on my screen.
Maybe you don’t care about the difference in quality, and if so that’s totally fine, but it is definitely noticeable.
There may be some learned preferences in here. I will choose 720p or lower given the option. 4k and high fps just look weird for movies. I'm sure the next generation won't understand me here.
55" 4k TV here and even 1080p is clearly blurry. Watch some movies or documentaries in 4k and it's obvious how much better they look. I recommend Planet Earth II on Bluray.
...but the textures become astounding. Also, long shots of landscapes are so rich with detail as your eye can move around.
So for interior shots in an arthouse film, or exteriors in pretty much any film, it's a world apart. For talking heads on the news of course, who cares.
I'm so used to 1080p minimum for movies and TV shows that standard definition makes me think there's something wrong with my eyes; it's just blurry. Like I want to focus but it just won't.
I have a 55" 4k OLED and now I find 720p looks very average. Maybe on a smaller screen it doesn't matter but for a larger screen, those pixels matter.
Obscene? That's a bit dramatic. I expect my next TV will be 77" or larger. Why not enjoy what you're watching?
You can increase bitrate without increasing the resolution.
Basically, could higher resolution help in some edge cases?
> we could use better compression codecs at higher bitrates, reduce or eliminate chroma subsampling, increase color depth and/or gamut for HDR content
Those are likely to be rolled out in tandem with new resolution standards
OLED-on-glass-substrate is allowing TVs to get lighter making them more installable. OLED-on-plastic-substrate TVs could be shipped rolled up. They may not be that far away (<10 years).
It will become completely indistinguishable when the error caused by that imperfection is less than some threshold, and for that you'd need higher density.
Of course, the positive effect of density diminishes as it goes up, and only makes sense if you want absolute realism of the picture, which would also require sensory-element perfect 3D mapping.
For media consumption at longer viewing distances it is clearly pointless.
This isn't a troll. I don't know, so I'm asking whether it can be a factor.
However, one thing that 8K would be really good at is zooming content: even if zoomed at 10x, the image would still look good.
Admittedly, I have no idea why someone would want to zoom video content.
Where video games strictly benefit from a higher framerate, adding more frames does weird things to movies and television. As you may know, most movies are shot at 24 frames per second, which is low considering humans can see far more fps than that.
This lower FPS means movies are actually impressionist paintings of reality. The framerate blurs what is being filmed so it looks less like a set and more like what we've been trained to think of as a movie.
Movies that experiment with high framerate like Billy Lynn's Long Halftime Walk (120 fps) find it makes selling the fiction harder. Viewers can see the cracks in everything a lot easier. This is one reason people disliked Peter Jackson's 48 FPS version of The Hobbit. Instead of seeing a group of dwarves, high frame rates made them look like a bunch of guys in funny costumes.
All this is to say it's interesting they are making high frame rate a part of their video future when its future within movies and TV is undecided.
Weird how we are trained to dislike what is a better picture quality.
So I'm not sure that when we're talking about broadcast television it's that surprising to have them pushing for higher frame rates.
And it's entirely possible that like other things pushing modern broadcast TV advancements, the enhanced 'reality' and better motion fidelity is advantageous to television because of the one thing people still care a lot about watching live: Sports.
It's obvious that consumers want higher framerates. Just look at any high-end panel from the last few years: they all do frame interpolation.
Something is not bad by virtue of being uncommon.
It wasn't for lack of trying either - they went to ridiculous lengths with film grain and motion blur.
I can tell the difference between HD and 4K, so my eyes aren't completely shot. But I think we're getting to the point of resolution for resolution's sake. Like having a speedometer that goes to 200 MPH, when the legal limit in your state is 85, and the car simply isn't capable of 200.
I remember back when the Amiga came out. My friend who had one said it showed 4,096 colors because that was the limit of human perception. Now I know he was wrong, but it made sense at the time.
What is needed is better display technology to exploit the 8K.
Give me 120fps and HDR before 4K, let alone 8k
Picture of the Costco inventory card: http://i.imgur.com/hNWgFt7.jpg
(This is the card: https://www.amazon.com/EVGA-GeForce-Support-Graphics-08G-P4-... )
From that day on every time I see a 4K film in theatres the low resolution bugs me. 8K should get us much closer to the old 35mm experience.
An 8k 360-degree immersive environment might not be quite enough resolution to get fine detail if you were looking at it through a VR headset.
One rule of thumb I recall hearing is that the TV diagonal should be about 1/3 to 1/2 of the viewing distance. I'd need to brush up on my trig to get degrees from that.
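Here's the trig, for what it's worth; a rough sketch assuming a 16:9 screen. That rule of thumb works out to roughly a 17-25 degree horizontal field of view:

    import math

    def viewing_angle_deg(diagonal, distance, aspect=(16, 9)):
        # Horizontal angle subtended by a screen of the given diagonal,
        # viewed from the given distance (same units for both).
        w, h = aspect
        width = diagonal * w / math.hypot(w, h)
        return math.degrees(2 * math.atan(width / (2 * distance)))

    print(viewing_angle_deg(1/3, 1))   # diagonal = 1/3 of distance -> ~16.5 degrees
    print(viewing_angle_deg(1/2, 1))   # diagonal = 1/2 of distance -> ~24.6 degrees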
Discussions of resolution against screen size are meaningless without consideration of viewing distance.
what if it's going down a steep hill?
My mother-in-law in the back seat was not happy.
I was in the backseat of a car driving like that in high school. Driver lost control while changing lanes, we swerved across multiple lanes of traffic, exited the interstate and flipped five times. The two people in the car not wearing seatbelts were thrown from the vehicle as we rolled.
Nobody died, but by rights any or all of us should have.
No, because of the flimsiness of the car.
You're going to kill someone driving like that on public roads.
Ordinarily, I'm on your side with that. But this was a one-off passing a Winnebago on a downhill in a part of the country where you can see the road is clear for 20 miles ahead through the valley up into the next mountain pass.
It looks like 5k is the sweet spot for 27", and 8k for 32".
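Back-of-the-envelope, assuming the usual 16:9 panels (5120x2880 for 5k, 7680x4320 for 8k): in both cases you land on exactly double the density of a common monitor at the same size, which is convenient for integer scaling.

    import math

    def ppi(h_px, v_px, diagonal_in):
        return math.hypot(h_px, v_px) / diagonal_in

    print(round(ppi(5120, 2880, 27)))   # ~218 ppi, i.e. 2x a 2560x1440 27" panel
    print(round(ppi(7680, 4320, 32)))   # ~275 ppi, i.e. 2x a 3840x2160 32" panel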
Not to mention VR needs 8k to provide 1080p-like resolution over a large field of view.
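Rough pixels-per-degree numbers, with my own assumptions (a ~110 degree per-eye field of view for the headset, and a 1080p TV filling ~30 degrees of your view):

    print(1920 / 30)     # 1080p TV at a comfortable distance: ~64 pixels/degree
    print(7680 / 110)    # 8k spread across a VR field of view: ~70 pixels/degree
    print(3840 / 110)    # 4k in the same headset: ~35 pixels/degree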
1. set top box is by default sending only 720p to the TV
2. HDMI input on the TV is by default in "video mode", where it rescales the picture even when fed native 1080p
It took me months to eventually figure this out, and now I have a sharp, pixel-perfect FullHD picture... that makes compression artifacts pop up :(
STBs from Comcast over the past few years support 1080p and the ones released over the past year support 4K. Also most newer Android TV based STBs support 4k. You’re probably saying “yeah, but the content doesn’t”, oh, but it does.
Companies like Comcast and OTT providers like DirecTV Now support 4K linear broadcast, as most transcoders from the past five years support 4K streams and most broadcasters are supplying these providers with 4K streams. Even the recent move to HEVC is making these providers re-encode their VOD libraries, and titles are being re-encoded at 4K.
The transition to 8k won't be as bad as when people had 1080p TVs and 480-line content.
Some channels in some markets may be at 720p and some older STBs top out at 1080i, but recent boxes like the XG1, any boxes that are IP, like the Xi series, and all of their apps take the top bitrate profiles, so channels like your local affiliates or ESPN are received in at least 1080p.
There may be some 4K or 1080p on-demand or special event content (i.e. Olympics), but it is very rare.
Where streaming services do fall down for me is audio stream handling. If you stream high quality on most services with most dedicated streaming devices, you get 5.1/7.1 by default, with awful, unconfigurable downmixing to 2.0/2.1 that makes you crank the volume to make dialogue audible, only for it to be immediately drowned out by the slightest peripheral foley effect. Amazon Video is the worst about this.
I still haven't seen any streaming service look nearly as good as a Blu-ray and this is on a mid-range TV, from farther away than I should be with probably-should-wear glasses vision.
I personally don't see the point in anything higher than 4k. I'd much prefer if manufacturers focused on better colour reproduction, better blacks, better audio and better network and media functionality.
I don't even have bad hearing, and I had to get a 5.1 system just so I finally had a proper center channel.
People back then were paying serious money for a HiVision set, but I remember one case in particular. Some rich dude from Moscow flew all the way to Vladivostok to buy a HiVision set from my father.
First, he paid a ton of money and happily flew away to Moscow; then my father got an angry call: "the TV set is broken!" A few months later he returned with the set, my father asked him to turn it on to see what had happened, and it "miraculously" started working.
The rich dude wasn't able to hide his embarrassment and disbelief, and flew back again. As you can guess, he wasn't away for long, and that time my father answered, "if you're that rich, maybe you should buy your own satellite," and pointed him to the fine print in the contract.
How many displays are actually capable of not just displaying 120hz, but actually accepting a 120 fps input (as opposed to just interpolating)?
I'm one of those weirdos who thinks movies aesthetically look better at 24 frames for the "dreamlike" quality. Racing/Action Sports, yes, please more frames.
I always disable motion smoothing on 120Hz TVs. I think high frame rates are really only good for video games and live sports. For fictional cinema/tv, it just looks weird and unnatural.
Similarly, just because you can technically put the entire scene into focus all the time doesn't mean you necessarily want to!
Sometimes you technically can't put the entire scene into focus. Even if you have unlimited lighting, there's only so far you can stop down the aperture before diffraction wipes out your depth-of-field improvement. Citizen Kane famously used composite shots to achieve extreme deep focus that would have been impossible as a single shot.
I'd love to see another attempt at a photorealistic 100% CG movie like Final Fantasy: The Spirits Within. We'd be free from the limitations of real world optics, and could produce a "hyperrealistic" style even better than what you can see with the naked eye. I find shallow depth of field almost as annoying as low frame rate.
Isn't that what Disney's Lion King remake is doing?
I'm okay with playing video games at 60fps, but my brain keeps rejecting photorealistic content above 30fps
This was quite noticeable to a trained eye in the finale of David Lynch's Twin Peaks: The Return.
As I was watching it, somewhere mid episode where the story moves between two realms of reality, I noticed the image suddenly had a much more real (or “cheap”) appearance.
I was in awe. That was a super nice touch done at a technical level, which really helped elevate the story itself.
I also expect the effect to be lesser among younger people who are used to e.g. video games, although I have no actual evidence (anecdotal or otherwise) that this is the case.
And although I remember 60fps being very noticeable at first, I don't think I notice at all any more.
That’s not weird, that’s what the majority of people prefer. Otherwise they wouldn’t keep doing it.
It would be more weird if you said you preferred films at 60fps
No, you are not weirdos. Perfectly normal, with great taste in movie frame rates and sports/action frame rates.
I'm wondering if the 120Hz is just frame-multiplied 24Hz video. Watching a 24Hz movie would certainly be a better experience at 120 fps than at 60, since there won't be any judder: 24 divides evenly into 120.
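A tiny sketch of the cadence arithmetic (hypothetical helper, just for illustration):

    def frame_cadence(source_fps, refresh_hz, n=4):
        # How many refresh cycles each of the first n source frames is held for,
        # assuming each new frame appears on the nearest earlier refresh tick.
        ticks = [int(i * refresh_hz / source_fps) for i in range(n + 1)]
        return [b - a for a, b in zip(ticks, ticks[1:])]

    print(frame_cadence(24, 60))    # [2, 3, 2, 3] -> uneven pulldown, judder
    print(frame_cadence(24, 120))   # [5, 5, 5, 5] -> every frame held equally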
Regarding displays capable of accepting 120 Hz input, there are plenty of computer monitors that can do that at least.
Yep, just not televisions! That's a big difference, since most people don't want to put monitors in their living rooms.
This is only going to be for sports bars.
About 10% of American households have a satellite dish, and about 20% of western European households. I think it's much higher in parts of Africa and western Asia, where there isn't the infrastructure for terrestrial broadcasts, or where satellite receivers get around state censorship.
You can argue that because you look at the screen from a distance this criterion is irrelevant, but there are all sorts of discussions and arguments about the perception of clarity and whatnot, which makes calculations based on distance hard to defend properly. The only way to be sure is to be unable to perceive pixels even when the screen is right in front of your eyeball. Like actual things in life.
I'm convinced resolution will improve until this limit is reached, as it should.
I think it's quite the opposite.
I would say the minimal reasonable distance is the minimal focus distance of a single eye, a few centimeters.
As I understand it, in the anime world most airing shows are still being animated at 720p or at slightly higher resolution, with a handful of them occasionally being animated at 1080p. Although the best studios typically clean things up and make a huge number of improvements for their 1080p BD release. Movies are generally animated at 1080p.
Older cell-drawn anime are being scanned at 4K / 5K to produce new 4K releases, and I'm told the level of detail in those is incredible, which makes sense given that they're hand-drawn with extreme care. Currently the best known release is probably Ghost in the Shell (1995).
Even the incredibly popular movie Your Name was animated at 1080p. The special edition is upscaled 4K with HDR.
The high cost of 4K releases also creates a barrier. A standard 4K Blu-ray movie seems to go for around ~$80. If you want both the English and Japanese versions of Your Name at 4K, you're looking at paying around ~$195, though admittedly that's for the 5-disc collector's edition. Compare that with a typical ~$20 1080p Blu-ray release.
Being unable to prepare a personal backup is also a problem. Ideally I'd dump the movie right away so the physical medium can be stored in as pristine conditions as possible. It would feel very frustrating to spend that much money, only to have it casually damaged.
I used to pirate a ton of movies, but with the deals these days on 4K + Dolby Vision + Atmos titles, it's worth it just to buy.
I only know about this practically in regards to 10-bit anime being more compressible than 8-bit, but I assume the same logic applies.
But the equivalent in spatial resolution is encoding near-horizontal lines... which isn't a common situation (unlike gentle color gradients) and isn't a problem in the first place.
Banding can refer to all sorts of things that form bands, like fixed pattern noise from a camera sensor.
I cannot imagine how this could ever be true. Have you experienced it actually happening for the same content at the same encoding settings?
"bitrate" = file size / duration. There is no reason why increasing the resolution should ever make a video more compressible. This is completely different from the 10-bit situation you described, which takes place because you're storing data in a different format.
You could choose to encode a higher resolution copy of a video at a lower bitrate, but it's nothing an encoder will do by default, and it should always result in the higher resolution version looking worse.
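To make the bitrate definition above concrete, a trivial sketch with made-up numbers; the resolution never enters into it:

    def bitrate_mbps(file_size_bytes, duration_seconds):
        return file_size_bytes * 8 / duration_seconds / 1e6

    # A hypothetical 4.5 GB file that runs two hours works out to ~5 Mbit/s,
    # whether it's 1080p or 8k.
    print(round(bitrate_mbps(4.5e9, 2 * 3600), 1))   # 5.0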
It's the point where I think we can pretty much say, okay, we are now finished with resolution increases. There's nowhere else to go. We've 100%'d that part of the tech tree. Weirdly satisfying.
Seeing the film grain detail on older movies is strangely satisfying to me. Even newer movies shot digitally are done so at 2K or 4K.
"640k ought to be enough for anybody."
Don't Bank So Close to Me ... Just a Thief Ya Know ... 50 Ways to Loot Another ...
I think it has something to do with having full vision of the screen at any time, so it's centered and fills a particular percentage of your field of vision. I couldn't find the study, which was about sports and movie watching, but I bet such a standard is even more important for gaming.
Basically, relatively speaking, your screen should be the same size in your field of view, but the fact that it's physically bigger allows for better detail.
4000px from 5 feet doesn't look as good as 8000px from 10 feet, even though they fill up the same area in your field of view.
I use a UHD/4k 40" TV as a monitor, running at native resolution. It's essentially four 1080p 20" screens in one. The pixel density is pretty good, but not "retina"/high DPI level.
Things can be a bit too small as I sit further from the screen than a regular monitor.
With 8k, it'll be four 4k 20" screens in one.
I'd be able to go to a larger screen without sacrificing much on pixel density. I would no longer have to zoom in on certain websites and can just get a larger screen.
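Rough numbers, assuming 16:9 panels: an 8k screen could grow to roughly 80 inches before dropping below the ~110 ppi of a 40" 4k panel.

    import math

    current_ppi = math.hypot(3840, 2160) / 40           # 40" 4k: ~110 ppi
    max_8k_diag = math.hypot(7680, 4320) / current_ppi  # ~80 inches
    print(round(current_ppi), round(max_8k_diag))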
I'm running at native resolution, so not doing the "sitting far away" option.
"More screen space than you can actually see" - This is somewhat true. There's an upper limit to the size where it becomes impractical for most use cases. I'd say that limit is 40"~55", and mitigated somewhat by curved screens.
It's not too different from having a large quantity of screens you'd see at say, a trading desk, but all conveniently in one screen.
Why is all this high-resolution media walled behind decoder boxes and DRM? 4k Blu-rays have some fairly intrusive DRM as far as I understand, and then there's 4k streaming, like on Netflix, which requires hardware DRM for playback on a PC.
Consumers aren't told about DRM. The spec sheets don't mention what restrictions are in place. You only discover after the purchase that some capability you had assumed you'd have doesn't exist.
E.g. I've had this happen twice: trying to view an HD movie over a component cable (nope, not allowed), and my daughter discovering she can't use music from a streaming music service in a home video. Both occurred well after the initial purchases.
Can you say more about where the DRM applied in this case? I would expect it to kick in, say, if you upload a home video to YouTube that incorporates copyrighted media, but I would be surprised if something prevented you from making a video (say on your phone) that included whatever music playing over speakers.
I need 4k for programming but I've yet to appreciate 4k for video. I see it, but I just don't really care.
It's hard to describe the difference, except that I have two monitors side by side, one 4k and one 1440p, and the difference is noticeable. Even when I scale the 4k monitor so that the window size is the same, the text is still noticeably easier to read.
4k monitor for two panes of code
4k monitor for a browser + debugger or maybe two browsers: one mobile one desktop.
1080p monitor for docs or maybe Netflix as I work depending on the kind of work.
When I'm doing robotics or server or more general programming work I basically just use my main monitor.
Reading code on a 1080p monitor is almost intolerable now that I'm used to 4k. I can size it way down and get two full panes of code side by side.
I’m still not convinced that 4K is a viable format. They’re still cranking out 480p DVDs, a technology that was supposed to be supplanted 12 years ago by Blu Ray.
Genuine question: does it display movies in a "natural light" the way a plasma TV does, or under a "fake light" the way LED screens do (even the 4K ones)? I'm asking because I'm a movie buff who's really attached to his plasma TV, but looking at the LED options on the market I don't see anything that satisfies me. Yes, the image is "clearer" on the latest 4K LED screens, but it also looks "fake-ish" when it comes to watching movies.
And sadly it’s been designed to break sometime around the 7-10 year mark.
E.g. using a minimal amount of solder on connections that will eventually crack after repeated warm/cold cycles from use.