Oh, I am so glad this is on the HN front page so I get to ask the question. Why 8K?
I think people should try this at home. The THX-recommended viewing angle for a cinema experience is 40 degrees. That is like sitting in the first few rows of a cinema, which makes me sick watching movies. The maths for a 16:9 display works out so that a 55" set should be 5.5 feet away and a 60" set 6 feet: divide the diagonal in inches by 10 and change the unit to feet.
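A quick sketch to check that rule of thumb (Python; the 40-degree angle and 16:9 aspect ratio are the figures above):

    import math

    def viewing_distance_in(diagonal_in, angle_deg, aspect=(16, 9)):
        """Distance at which a display of the given diagonal fills
        the given horizontal viewing angle."""
        w, h = aspect
        width = diagonal_in * w / math.hypot(w, h)
        return width / 2 / math.tan(math.radians(angle_deg / 2))

    for size in (50, 55, 60):
        d = viewing_distance_in(size, 40)
        print(f'{size}" at 40 deg: {d:.0f} in = {d / 12:.1f} ft')
    # 50" -> 60 in (5.0 ft), 55" -> 66 in (5.5 ft), 60" -> 72 in (6.0 ft)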
And assuming you are even somehow comfortable with that distance in everyday use, you won't notice the pixels on a 4K TV with 20/20 vision, which is what Apple calls "Retina". For a 50" 4K TV, it becomes "Retina" at a viewing distance of 39 inches.
Some would argue 20/20 visual acuity is antiquated. Fine: with 20/15, Retina starts at 52 inches. That is still below the THX-recommended viewing distance of 5 feet, or 60", for a 50" display.
What about 20/10? Well, that would be 78 inches, so yes, they could see those pixels. But that is only at a 40-degree viewing angle. If you view at what most would consider a normal 30 degrees, you would be sitting ~6.8 feet (~82 inches) away from a 50" TV. I.e., you still won't see those pixels.
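To check the acuity numbers above, the same kind of sketch: the "Retina" distance is where one pixel subtends less than the eye's resolving angle (1 arcminute for 20/20; 0.75 and 0.5 arcminute are my assumed equivalents for 20/15 and 20/10):

    import math

    def retina_distance_in(diagonal_in, h_pixels, arcmin=1.0, aspect=(16, 9)):
        """Distance beyond which one pixel subtends less than `arcmin`."""
        w, h = aspect
        width = diagonal_in * w / math.hypot(w, h)
        pitch = width / h_pixels  # pixel pitch in inches
        return pitch / math.tan(math.radians(arcmin / 60))

    for label, arcmin in (("20/20", 1.0), ("20/15", 0.75), ("20/10", 0.5)):
        print(label, round(retina_distance_in(50, 3840, arcmin)), "in")
    # 20/20 -> 39 in, 20/15 -> 52 in, 20/10 -> 78 in (for a 50" 4K panel)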
Also note that we don't test moving objects in visual acuity tests. You are watching "moving pictures" on a TV, not staring at a single pixel.
And if, in any case, we need to optimise for the 5% with 20/10 vision who prefer to sit in the front row of the cinema, we could have gone with 5K! Despite "only 1K more" sounding very small, that is going from ~8M pixels to ~15M pixels. We would have room to spare even for that remaining 5% use case. And if we are so paranoid that we need even more headroom, even 6K would do.
Instead we went for 8K: ~33M pixels. Think of the bandwidth, computational power, compression, and storage. We could have used those for higher frame rates where needed, 10-bit colour by default, or higher-quality colour profiles. Instead we spent it all on pixels.
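For scale (assuming the usual 16:9 pixel grids; the 5K and 6K entries are hypothetical TV formats, since neither shipped as a broadcast standard):

    resolutions = {"4K": (3840, 2160), "5K": (5120, 2880),
                   "6K": (6144, 3456), "8K": (7680, 4320)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} M pixels")
    # 4K: 8.3, 5K: 14.7, 6K: 21.2, 8K: 33.2
    # 8K is 4x the raw pixels of 4K; for comparison, doubling the frame
    # rate and going from 8-bit to 10-bit at 4K costs only ~2.5x.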
And before someone jumps in saying they can tell the difference between 4K and 8K: remember that the current marketing term "8K" requires mandatory support for HDR and a minimum brightness level, something "4K" could do without. What you think of as 8K looking better may have nothing to do with pixel density at all.
So apart from the sales and marketing departments pushing for it (along with the Japanese government pushing it for the Olympics), like the days of Intel pushing CPU MHz: as someone with no expert knowledge on the subject, merely an outsider, let me ask the question again. Why 8K?
Modern TV sales demo videos are chosen for exactly the kind of thing you start to see when the resolution gets higher: footage of ocean waves, schools of fish, flocks of birds.
Moire effects and screen-door artifacts are way larger than any one pixel. They make ripples and lumps in the image.
You get used to the way hair looks on any given generation of TV, but it's nothing like an "imperceptible" distortion.
You can stand dozens of feet from an 8K, a 4K, and a 1080p TV and tell the difference when the image is of many tiny objects moving across each other.
Grass. Leaves. Sand. A tiled floor viewed from an angle. You know, rare stuff like that.
Blurring absolutely reduces moire, at the expense of distorting detail.
So now you just have another thing that's larger than a pixel (the blur area) reducing the effective resolution of your display, and you can't just say "the pixels are too small to be seen at this distance."
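A minimal 1-D sketch of that tradeoff (numpy; the grating frequency and box blur are illustrative assumptions, not any TV's actual filter):

    import numpy as np

    x = np.arange(2000)
    grating = np.sin(2 * np.pi * x / 4.7)  # detail finer than the sample grid

    # Sampling every 4th point puts the grating above Nyquist, so it
    # aliases into a coarse moire beat (~27 units per ripple).
    naive = grating[::4]

    # Pre-blurring with a 9-tap box filter suppresses the moire -- and
    # the detail along with it.
    blurred = np.convolve(grating, np.ones(9) / 9, mode="same")[::4]

    print("naive std:  ", naive.std())    # ~0.71: strong aliased ripple
    print("blurred std:", blurred.std())  # ~0.03: moire gone, detail too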
There will be a point where increased resolution isn't any use. When that happens, they'll stop making higher-resolution TVs, because human beings won't walk into the store, look at the TV from far away, and say "holy shit, that has so much more detail from this distance than anything I've seen before."
It hasn't happened yet. We still have further to go before we hit the human visual sampling limit even for far-away viewing, let alone close-up detail like reading a page of text in a scene or accurately rendering a houndstooth jacket.
While I agree that 8K likely represents the limit of needed resolution for traditional TV screens, VR could use improvements beyond 8K. That is one use case for 8K+ screens.
Also, to address your storage/bandwidth concerns, I expect 8K to be almost solely an "upscaling" technology (outside some niche applications): almost nobody in practice will stream or store 8K; instead they will upscale locally.