8K is now being broadcast in Japan (newsshooter.com)
258 points by kungfudoi 13 days ago | 349 comments





There are a lot of comments debating whether 8K is worthwhile vs 4K, and whether we can see a difference. I decided to do some math.

Let's assume someone has a 65in (165cm) TV and sits 6.5ft (~2m) away. This is a larger TV than most people have, and a closer viewing distance than most people sit. Let's also assume that the viewer has 20/12.5 (6/3.75) vision. 20/20 (6/6) is the low end of normal, and it isn't uncommon to have slightly better vision. My optometrist (my fiancée) can correct me to about 20/15, but 20/12.5 is unusually good.

The screen width is 56.65in. At this distance, that subtends an angle of about 40 degrees:

    >>> math.degrees(2 * math.atan(56.65475464133182 / (2 * 6.5 * 12)))
    39.9191582380095
20/12.5 vision corresponds to 48 cycles per degree, or 96 distinguishable pixels per degree (20/20 is generally accepted to be 30 cycles per degree). Therefore, the viewer can distinguish 96 * 40 = 3840 pixels horizontally, which is UHD 4K resolution.

I chose the inputs to get to that result, but my point is that viewers would need to have unusually good vision, an unusually large TV, and be sitting unusually close to hit the limits of 4K. And that's only for very high contrast images, like black text on a white background. Our eyes are less sensitive to low contrast transitions, as found in most videos. Try reading an eye chart with the letters printed in gray on a lighter gray background. It's more difficult!

I think there are much better things we can use our limited bandwidth for than increasing resolution: we could use better compression codecs at higher bitrates, reduce or eliminate chroma subsampling, increase color depth and/or gamut for HDR content, or use higher framerates for content where that is appropriate.

EDIT: Here's a source for 20/20 = 30 cycles per degree, and 20/12.5 vision being unusually good: https://www.opt.uh.edu/onlinecoursematerials/stevenson-5320/...
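
(For anyone who wants to plug in their own numbers, here's a minimal sketch generalizing the calculation above; it assumes a flat screen viewed on-axis, and the 96 and 60 pixels-per-degree figures are the 20/12.5 and 20/20 values from the source linked in the edit.)

    import math

    def distinguishable_pixels(screen_width_in, distance_in, pixels_per_degree=96):
        """Horizontal pixels a viewer can resolve across a flat screen.

        pixels_per_degree: 96 ~ 20/12.5 vision (48 cycles/degree),
                           60 ~ 20/20 vision (30 cycles/degree).
        """
        angle_deg = math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))
        return angle_deg * pixels_per_degree

    # 65" 16:9 TV (56.65" wide) viewed from 6.5 ft:
    print(distinguishable_pixels(56.65, 6.5 * 12))   # ~3832, i.e. roughly the UHD 4K width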


Is being able to distinguish a single pixel the same thing as noticing that text is sharper? I've noticed that you can't stop doing subpixel font rendering on displays until they get far above a density that optometrists say shouldn't matter anymore.

This. People always forget that our brain is definitely capable of some amount of superresolution calculation, especially for very specific types of structures. One aspect I've always been fascinated by is how far away someone can be while I can still tell that they're looking at me and not the person next to me. Most of the time I definitely can't read any text (even at 16 pt) held at that distance, but I can subconsciously tell the subtle differences in the angle their eyeballs are at. Same thing with 60fps vs 120fps. People keep saying you can't tell them apart, but I sure as hell can, at least in some instances.

Eyes can definitely tell more than 60 Hz, especially the rods in your peripheral vision. From what I remember in cog sci classes, about 72 Hz was the just noticeable difference (JND), but I think this may be a bit higher for rods, and may differ slightly by gender. One way to tell is to look at a CRT off-angle and adjust the refresh rate until the flicker is no longer perceptible.

The angle of the eyes may be different, but I'm more skeptical of whether we can tell if people really are looking at us off-angle or from a considerable distance.

On the one hand, we're certainly hyperaware of faces both in central and peripheral vision, as part of our brain is dedicated to face detection, and eyes have high contrast (black on white). And looking dead-on at someone's eyes we can infer where they're looking even from pretty far away. All of this of course serves a social / evolutionary / survival purpose.

On the other, it may be that peripherally all we can tell is that someone is facing us and has open eyes, and this leads us to have a premonition that someone is looking at us. When we look up to confirm, this may or may not be true, but we'd be more likely to remember/encode that they were, thus making that sense err on the side of assuming people are looking at us. Add to that our own sudden shift in the angle of our face/eyes, which may cause them to react and look our way. We also don't get confirmation of whether people far away are looking at us, or if they're looking at someone next to us. Thus we don't really know if they are.

For me, both of these situations (thinking they are looking at me and finding they aren't, and seeing people look at me in response to my looking their way) happen frequently. I'm not sure our senses are that good at off-angle 'who's staring at me' though; instead I think we're just really well optimized to assume that just in case. Our brain is phenomenal at filling in the blanks visually, and making inferences based on important survival signals like where others are paying attention.


Re frame rate: I believe 10kHz is considered a stroboscopic-artifacts detection threshold. Michael Abrash mentioned 300-1000Hz is required for low resolution VR (1080p). Higher rates for higher resolution. So yeah, the current 240Hz displays don't quite cut it.

https://www.blurbusters.com/blur-busters-law-amazing-journey...


In fact, this is a great video demonstrating motion blur even at 1700Hz: https://youtu.be/PmAhJO6Vp3Y

I mean, the blur there is very slight, but still!


It's also annoying when they say you shouldn't be able to tell that PWM headlights/taillights are flickering but I sure as hell can, at least!

LED Christmas lights are the worst at this. I can definitely see the 60hz flicker, especially if there's movement.

Definitely noticeable when my eyes are moving quickly, glancing from one location to another.

Like when you're driving? :)

Yeah, and when they say Wi-Fi shouldn't make my brain tingle

You're being downvoted, and I have no idea if it would actually be possible to be sensitive to wifi, but I can hear my 2014 MacBook Pro process certain things, and I don't mean the fan. No idea what triggers the processing "buzz" because it's not constant, but certain CPU-intensive operations make a buzz I can hear, so I can certainly believe a similar buzz could come from a router's processor.

That's called coil whine! It could happen in a router, but it seems unlikely in such a low power device.

I’ve used a 5-port ethernet switch (even lower power than a router) with very substantial coil whine. Like, “can’t tolerate it without headphones” substantial.

Tplink sg105e by any chance? I can hear mine from 3 meters away.

Just checked. That's exactly the one.

My graphics card had it terribly, well, at least until it died two weeks ago. It most often made the noise at an audible frequency during low power usage. When it got hit hard it stopped being a problem, but it could have just been pushing it outside of my audible range.

FPS matters a great deal in video games -- 120 is fantastically better than 30. FPS really doesn't matter that much in movies, much less TV shows, even action movies.

Please excuse my lack of technical biological vocabulary. Here is my take on the issue:

The error in reducing the fps to that of the human eye is that there is no guarantee the two are in sync. To guarantee I will actually see 72 fps (or x fps) as intended, the manufacturer needs to take into account the amount of time each frame is displayed, as well as the human response to changing frames on the screen during one "frame" of the eye. How many times should a pixel refresh during one observation to create a lifelike scene for the human eye?

Something analogous can be said for dpi, and the fact that we can't see a specific pixel doesn't mean that each of our eyes' "units of resolution" correspond perfectly to where a pixel is on the screen. If we are seeing the intersection of four pixels at once but only seeing subpixels of each one that don't correspond to the intended color when taken together, theoretically all the color is falsified. If we are instead seeing a number of complete pixels in each of our own visual pixels, there is less chance of this. Further, the spacing between pixels varies by manufacturer, and this can have effects on the required resolution.


That’s an interesting idea. The human eye (and the brain in general) isn’t a clocked system, different parts work at different speeds. You can see this yourself if you have an old fluorescent lamp that works on the mains frequency. You won’t perceive the flicker in the main part of your vision because the colour receptors aren’t that fast. However the edges of your vision where the rods dominate can perceive a 50/60hz flicker.

The same is true of visual resolution. You only have high resolution sight in a tiny portion of your vision; the rest is much lower resolution, and your brain uses a combination of rapid eye movements and spatial memory to fill in the blanks.

Additionally one of the issues with perception of a pixel is your eyes have nothing like a pixel grid, it’s pretty random. You can see this if you use a digital camera to take a photo of a screen, you can see interference patterns as the grids align and unalign.

This makes the whole equation very complicated, certainly nothing I can answer. I guess I’ll have to find an 8k/120hz display and find out.


Eye response frequency depends on the light intensity. We can track higher frequencies when the lights are brighter. Dim theatres played a role in preserving the illusion of motion when movie frame rates were very low.

You can also perceive very high frequency lights by the strobe effect. For example if you wave your hands in front of a PWM LED or a bad fluorescent light you can quickly tell. By moving my eyes quickly from left to right, I can easily spot the order in which my office projector flashes the colour components of its image.


It's easier to just get the result empirically. I once literally did this experiment in an office where we had a bunch of similar-looking Dell displays from different eras, with a range of different native resolutions:

It was pretty easy to set up a blind trial with 5 monitors by putting cardboard around the monitor bezel and stand, and asking people which one looks better.

Everyone, even nontechnical people such as HR and sales, could identify the 4k monitors, super reliably, at a distance waaaay beyond what optometrists claim should matter.

Of course you could argue that the experiment was confounded by backlights or color calibration or monitor age, whatever. But IMHO the result is so stark that it would easily survive even if you repeated the experiment with a bunch of identical 4k monitors and set some of them to crappy low resolutions.


I agree completely. The problem is obvious to anyone who cares to pay attention, but I am not sure the interface between biology and technical implementation is being studied, or at least popularized, enough to create a market for hardware that actually does what consumers think much of current hardware does (which is to have an imperceptible/unobtrusive implementation).

Fluorescent light flicker is especially obvious if you get your eyes dilated. I thought I was tripping out or something after an eye appointment because of it.

Who is saying you can't tell apart 60fps from 120fps? Ludicrous statement.

Agree, and yes, but that is also assuming you want higher resolution for stationary text and images, which in 99% of cases isn't true on a TV. Hence 4K really is enough in most cases. The same could not be said about a computer screen: if you use a 4K screen as a large monitor, you will notice how the quality might not be "perfect" from where you usually sit watching TV, but it's perfectly fine when the screen is playing moving pictures. The same goes for phones: despite Retina being good enough, you can tell that a 400 PPI LCD (iPhone Plus) still looks better than 326 PPI (iPhone "number") in stationary usage. (OLED uses a different subpixel layout, so the advertised PPI is quite a lot higher than it effectively is; the blue and red colour dots in the iPhone X / Xs OLED only amount to a Retina-like 326 PPI.)

I do think there are some possible edge cases where 4K might not be good enough, even in the parent's reference example of a possible worst case scenario. But we could have 5K or 6K. Going 8K takes a ridiculous amount of bandwidth and processing power for... virtually little to no gain.

I am more interested in colour space: does it mean we get Rec. 2020?


Similarly, I can get acceptable framerates gaming at 4K on a 27" monitor with an RX 480 for e-sports titles (Assetto Corsa, Project Cars 2, etc.) by disabling AA ... because at that resolution you don't really need it any more.

Text? But we're talking about moving pictures. You're not gonna notice.

Also, frankly, the last thing I worry about every day is whether I can turn off anti-aliasing yet. Text has been comfortable on computer monitors for a long time now.


I don't have good eyesight (far- and short-sighted), so reading can sometimes be difficult for me. I was surprised how much easier the iPhone OLED screen (458 dpi) is for reading than the 326 dpi LCD. The screen resolution difference alone made the iPhone XS worth it for me. Also, while I cannot "see" pixels on a 1x screen, a retina screen (e.g. iMac 5k) is much easier for reading too.

The ppi don't tell the full story with OLED vs LED, since OLED has a different sub pixel arrangement (pentile) that isn't as sharp / legible. On the other hand, OLED has better contrast. It may not be pixels that matter in this case.

Is this universally true for all OLED screens? I thought it was just a cost-saving measure by some manufacturers.

No, that's not true. Eg https://www.rtings.com/tv/tools/compare/sony-a9f-vs-lg-c8/68...

Rtings has close-up pictures of the layout of OLED screens. Those are TVs but I would think monitors are the same.

I think only phones sometimes use pentile, probably because of the pixel density needed to fit 4k on a phone-sized screen.


Pretty much. It’s also got to do with the fact that blue OLEDs are less efficient and you need more of them.

Mine has 570 PPI and I can easily see that text is pixelated although I cannot see individual pixels.

Because if you discount the green dots, the red and blue only have 400 PPI.

wow, which phone is that?

I'm not the person you replied to but the Samsung S8 has that PPI

https://www.samsung.com/global/galaxy/galaxy-s8/specs/

> Galaxy S8 5.8" Quad HD+ Super AMOLED (2960x1440) 570 ppi


Why focus on text? Video is moving pictures for the most part. You probably can't see any difference at all when there is enough movement.

Your calculations are all based on the questionable assumption that details smaller than someone can reliably distinguish in a vision test do nothing to improve the perception of clarity to the viewer.

Specifically, here's a good blog post explaining hyperacuity, and why 8k is at least marginally better: http://filmicworlds.com/blog/visual-acuity-is-not-what-the-e...

Interesting, thanks.

"for complex objects stereo acuity is similar to visual acuity. But for vertical rods it can apparently be as low as 2 arc seconds (1/30th of an arc minute). In other words, 30x as strong as typical visual acuity."

Assuming that the author of the blog post picked 30x as representative of the most extreme cases, and if we take the root commenter's word that 4K is approximately 1x visual acuity, and if 8K is double the width and height of 4K (is that correct?), then that suggests that by the time we reach 128K resolution (32x visual acuity, >8 billion pixels) we'll have finally mastered screen resolution, which, alas, will be of little comfort to laptop users, whose screens will inevitably still be 1920x1080. :P


That's only for detecting depth. The worst case for a flat surface is 'retina' multiplied by 5-10.

My current setup involves a projector, where the screen width is just over 118 inches, and I sit ~ 8.5 feet away.

Plugging that into your formula we get ~5780 horizontal pixels, or about 75% of 7680 provided by 8K.
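
(For reference, that's the same one-liner as the top comment with these numbers, still assuming the 96 pixels-per-degree figure:)

    >>> import math
    >>> math.degrees(2 * math.atan(118 / (2 * 8.5 * 12))) * 96   # ~5770 horizontal pixels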

In the future it's easy to imagine going even more extreme if you look back to the past where the first mass-produced TV looked like this:

https://en.wikipedia.org/wiki/Television#/media/File:RCA_630...


The sibling comments point out correctly that our brain likely can use the extra sharpness.

The other thing to point out is progress. I used to think 60 inch was massive but of course it’s not that big. Our field of view is incredible, and if we’re going to admit we like watching movies we may as well do the best we can to make the experience immersive.

So I’d hope the advent of 8k will also bring about 90, 120, and beyond. I think in a decade or so this will be the norm for most new TVs, and so this calculation also ignores technical progress which is good as we certainly aren’t near the maximum of an ideal cinematic experience.


> So I’d hope the advent of 8k will also bring about 90, 120, and beyond. I think in a decade or so this will be the norm for most new TVs, and so this calculation also ignores technical progress which is good as we certainly aren’t near the maximum of an ideal cinematic experience.

I agree, I think these screen sizes must be where the industry is going. As someone who also likes movies, and owns a 65" TV, I would love it if it were literally twice the size and mounted flat on/in the wall. This kind of mounting is definitely where the industry is heading, too: the highest-end TVs from LG and Samsung are ultra-thin and have the screen detached from the processing unit.


Past a certain size, I think I'd just go the projector route despite some of the disadvantages. I have a 78" TV and a projector shooting onto a wall roughly equivalent to a 110" television; so my threshold is somewhere between 78" and 110"... I'll also say that although I love TV and movies, there's something really nice about not having a television staring you in the face all the time.

> Our field of view is incredible, and if we’re going to admit we like watching movies we may as well do the best we can to make the experience immersive.

Except that the actual part of our field of vision our brain can focus on is relatively small and cinema favors pictorial representations where it's appropriate to grasp the "whole image". What you're describing is VR, whose tech isn't TV.


I'm sitting about 1.8m away from a 720p television showing a standard definition UK terrestrial signal (which is 576p) and I'm honestly baffled how anyone could want any higher res. In fact the only issue I notice is the poorer picture quality on some of the lower bitrate channels, the major channels are as clear and sharp as I'd ever want.

Because details matter. Here's an 8k image on one side and a 576p screen on the other side.

https://imgur.com/a/yqMu5VU

Don't forget to blow it up to 8k to see the full effect.

Bear in mind, if you're on a 65" screen you have to be at 2 feet to appreciate the image. Unrealistic for most purposes I understand, but at 1080p, you should be at 10 feet which is about normal. 4k should be about 8 feet which is the best viewing distance in my opinion for a theater type room. It is how my theater room is personally laid out.

8k in my opinion is probably as large as any home theater screen will ever need to be. I would not turn up my nose at being able to pause the movie and step up to the screen to see some details. I already do that now on my screen.
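
(Out of curiosity, here are roughly the horizontal viewing angles those suggested distances correspond to on a 65" 16:9 screen, using the same geometry as the top comment:)

    import math

    def viewing_angle_deg(diag_in, distance_ft, aspect=16/9):
        width_in = diag_in * aspect / math.hypot(aspect, 1)
        return math.degrees(2 * math.atan(width_in / (2 * distance_ft * 12)))

    for d_ft in (2, 8, 10):   # the distances suggested above for 8K, 4K and 1080p
        print(d_ft, "ft ->", round(viewing_angle_deg(65, d_ft)), "degrees")   # ~99, ~33, ~27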


I think the parent's point was that there is currently insufficient bandwidth to even make 576p look good for many channels. Given even a high bandwidth slot, 4K with low compression might look better than 8K with high compression. Specifying just "8K" is no guarantee of a quality picture.

Be it with video or music, any level of resolution/quality is satisfying as long as you never got used to better.

This! There was an experiment where monkeys received 2 grapes every day. Then for a time they increased it to 3 grapes. Then one day they gave them only 2 grapes again, and the monkeys threw poop at them.

There is a point where resolutions as high as 8K begin to look like you're staring right through a window into another reality. It is a mind-bogglingly huge difference.

https://www.youtube.com/results?search_query=8k+ultrawide


Have you tried Freeview HD? It honestly looks a lot better, even on a 720p screen.

Maybe you don’t care about the difference in quality, and if so that’s totally fine, but it is definitely noticeable.


Although my tele's Freeview decoder is SD, I do get to see it in all its 'full' 720p glory when using the Chromecast or Blu-ray player, and okay, if pushed I can tell the difference, but I really do not miss it at all when it's not there.

> I'm honestly baffled how anyone could want any higher res.

There may be some learned preferences in here. I will choose 720p or lower given the option. 4k and high fps look just weird for movies. I'm sure the next generation won't understand me here.


4K TVs have some bad image processing details that might be why you don't like them. Motion smoothing in particular is jarring to me.

https://www.consumerreports.org/cro/lcdtvs/turn-off-these-3-...


I think it's more the issue of too much detail. Like I'm interested in what the people look like and the story, but once I can count the hairs in their nose, that's just distracting rather than adding to the experience :-)

Either you have a tiny screen or you should visit an optometrist.

55" 4k TV here and even 1080p is clearly blurry. Watch some movies or documentaries in 4k and it's obvious how much better they look. I recommend Planet Earth II on Bluray.


I mean, it won't add anything to the story...

...but the textures become astounding. Also, long shots of landscapes are so rich with detail as your eye can move around.

So for interior shots in an arthouse film, or exteriors in pretty much any film, it's a world apart. For talking heads on the news of course, who cares.

I'm so used to 1080p minimum for movies and TV shows that standard definition makes me think there's something wrong with my eyes; it's just blurry. Like I want to focus but it just won't.


I agree. 720p is good enough. And a 65" screen in a living room is obscene.

Good enough for what/who?

I have a 55" 4k OLED and now I find 720p looks very average. Maybe on a smaller screen it doesn't matter but for a larger screen, those pixels matter.

Obscene? That's a bit dramatic. I expect my next TV will be 77" or larger. Why not enjoy what you're watching?


I mean, depends where you live. In the midwest we have bigger homes and more room for stuff like that.

What about in a home theater setup?

It's better to have 4k projectors because you can really see the difference, but it's pretty much only Sony that makes consumer ones and they are still super pricey.

Whenever I see these kinds of calculations I just roll my eyes, because video is compressed. There is a very good practical reason why you want 8K: it's the online streaming bitrate! Today if you watch 4K video on Netflix they use 20 Mbps, give or take. That in my opinion is NOT good enough for 4K (UHD). The reason why I am looking forward to 8K is that then they will finally have a better video bitrate. So the 8K bitrate would probably be equivalent to a reasonable 4K video bitrate.

You're not necessarily wrong per se, but it's really, really stupid that we need to buy higher res screens as an end-around to getting streaming content providers to give us higher bitrate videos.

You can increase bitrate without increasing the resolution.


Is there a difference between visual acuity for static and moving images? What about the square pixel grid, I've read that a hexagonal one would be better in principle for our visual system.

Basically, could higher resolution help in some edge cases?

> we could use better compression codecs at higher bitrates, reduce or eliminate chroma subsampling, increase color depth and/or gamut for HDR content

Those are likely to be rolled out in tandem with new resolution standards[0]

[0] https://i.imgur.com/phONmPZ.jpg


What really matters there is not the resolution but the refresh rate, which is already good with modern TVs, so what really matters is directors upgrading to HFR (48 frames per sec) or 60FPS. It's coming :)

That would be nice, but is it really coming? The first hobbit movie was a big introduction to hfr movies and the reviews were very negative. There hasn't been much interest in 60fps movies since then.

Same for color and sound when they came out. It'll take some time for people to adjust but it is definitely coming.

I'm not an expert, but I think people in the comments missed a use case: you could have 1 image @ 8k, or 2 images @ 4k each, which could then be used to broadcast VR? Or 3D 4k video?

Lowest-common denominator will win again with yet another failed 3D format. :(

These are good calculations. I think many people would like to go to a significantly larger field of view than 40 degrees. 40 degrees horizontally is not that much: the human visual field is over 200 degrees.

OLED-on-glass-substrate is allowing TVs to get lighter making them more installable. OLED-on-plastic-substrate TVs could be shipped rolled up. They may not be that far away (<10 years).


Or, you could just sit closer. My day-to-day monitor is a curved 49" screen. I typically sit 2-3 feet away with about an 80 degree FOV. No special technology required.

One thing, while "minor", is that with higher dpi you get a lot fewer aliasing issues. With sufficiently high dpi (I've read somewhere around 800), you don't get aliasing anymore. Is that worth the bandwidth? No, of course not.

Yes, sufficiently high DPI corrects for bad mastering. If they low-pass (i.e. blur) the source before sampling then there are no aliasing artifacts.
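
(A toy numpy sketch of that point: point-sampling a fine pattern aliases into a false coarse pattern, while box-filtering before sampling does not. The pattern and downscale factor here are arbitrary.)

    import numpy as np

    n, factor = 512, 8
    y, x = np.mgrid[0:n, 0:n]
    src = (np.sin((x + y) * 0.9) > 0).astype(float)   # fine stripes, well above the target's Nyquist limit

    naive = src[::factor, ::factor]                   # sample without low-passing: false coarse bands
    lowpassed = src.reshape(n // factor, factor,
                            n // factor, factor).mean(axis=(1, 3))   # box filter, then sample

    print(naive.std(), lowpassed.std())   # the aliased copy keeps spurious high contrast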

For that calculation to work, the pixels of the screen must have a 1:1 match with the eye's sensory elements, and must be positioned perfectly on the (sensory element - lens center - pixel) line. Since that's not the case, you will see the difference.

It will become completely indistinguishable when the error caused by that imperfection is less than some threshold, for which you'd need higher density.

Of course, the positive effect of density diminishes as it goes up, and only makes sense if you want absolute realism of the picture, which would also require sensory-element perfect 3D mapping.


Now having 4k at 4:2:0 is equal to having 4:4:4 1080p. There is hope that having 8k at 4:2:0 will be like a proper 4:4:4 4k ;-)
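
(The same point in numbers: chroma samples per plane, assuming the standard subsampling grids.)

    def chroma_samples(width, height, subsampling):
        fx, fy = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}[subsampling]
        return (width // fx) * (height // fy)

    print(chroma_samples(3840, 2160, "4:2:0") == chroma_samples(1920, 1080, "4:4:4"))  # True
    print(chroma_samples(7680, 4320, "4:2:0") == chroma_samples(3840, 2160, "4:4:4"))  # True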

It will make for some amazing interactive display screens. The latest video slot machines with vertical curved screens are already impressive. Imagine having an immersive touchscreen console for something more productive than gambling.

For media consumption at longer viewing distances it is clearly pointless.


What happens when you throw the motion of somebody's head and/or eyes into the mix? When you are not viewing from an absolutely fixed position?

This isn't a troll. I don't know, so I'm asking whether it can be a factor.


Reading small text is probably where you will see the biggest difference. Higher resolution will also make a bad (but capable) screen look better (eg 4K JPEG upsampled to 8K on a good display vs 8K on a worse display)

You are assuming that people keep their heads static whilst they are watching which is not true. If you move your head you'll see any subset of that 8k picture which makes it more "real" to the viewer.

How does moving your head increase the maximum resolution of your fovea?

Some speculate that microsaccades actually increase resolution and perceivable detail in static scenes by basically offsetting the retina a tiny bit. The concept definitely works and is used in practice to boost the resolution of digital image sensors ("superresolution").

It does not, but you'll see a different image depending on your eye position when looking at a higher resolution image. Think of it as an 8k image resampled to 4k but at different offsets. You'll effectively see more detail when moving your head than when looking at a 4k image.

Why the downvotes? It is essentially the same thing as being able to read plates from a low resolution movie: on a single frame you see an indecipherable jumble of pixels, but if you combine the pixels from multiple frames and recover the car's offset between frames, you can compose a higher resolution picture of the plates and effectively read them.
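
(A toy numpy sketch of that principle, under the idealized assumption that each low-resolution frame point-samples the scene at a known sub-pixel offset; real superresolution also has to estimate the offsets and undo the sensor's area averaging.)

    import numpy as np

    hi = np.random.default_rng(0).random((64, 64))   # stand-in for the true high-res scene
    f = 4                                            # resolution factor

    recon = np.zeros_like(hi)
    for dy in range(f):
        for dx in range(f):
            lo = hi[dy::f, dx::f]      # one 16x16 low-res "frame", offset by (dy, dx)
            recon[dy::f, dx::f] = lo   # place its samples back on the fine grid

    print(np.array_equal(recon, hi))   # True: 16 shifted low-res frames contain the full image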

True.

However, one thing that 8K would be really good at is zooming content: even if zoomed at 10x, the image would still look good.

Admittedly, I have no idea why someone would want to zoom video content.


It will be nice for watching TV shows on YouTube that are a little postage stamp in the corner to fool Content ID.

As a movie nerd I was interested by the bit about the broadcast being in 120hz. This is... unusual for most visual content.

Where video games strictly benefit from a higher framerate, adding more frames does weird things to movies and television. As you may know, most movies are shot at 24 frames per second, which is low considering humans can see far more fps than that.

This lower FPS means movies are actually impressionist paintings of reality. The framerate blurs what is being filmed so it looks less like a set and more like what we've been trained to think of as a movie.

Movies that experiment with high framerate like Billy Lynn's Long Halftime Walk (120 fps) find it makes selling the fiction harder. Viewers can see the cracks in everything a lot easier. This is one reason people disliked Peter Jackson's 48 FPS version of The Hobbit. Instead of seeing a group of dwarves, high frame rates made them look like a bunch of guys in funny costumes.

All this is to say it's interesting they are making high frame rate a part of their video future when its future within movies and TV is undecided.


Personally, I'm very disappointed that the higher frame rate of The Hobbit has been interpreted as bad. For 3D, in my opinion, it was a much better experience than using a lower frame rate for 3D. In my opinion, using a lower frame rate for 3D has been a bug ever since.

Luckily, 120/24 = 5, so movie content can be broadcast at 120Hz without any weird effects: each film frame is simply shown five times.
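
(A quick sketch of why the even ratio matters: how long each film frame is held when 24 fps is mapped onto a 60 Hz vs. a 120 Hz display.)

    import math

    def cadence(film_fps, display_hz, frames=8):
        """Number of display refreshes each source frame is held for."""
        return [math.floor(n * display_hz / film_fps)
                - math.floor((n - 1) * display_hz / film_fps)
                for n in range(1, frames + 1)]

    print(cadence(24, 60))    # [2, 3, 2, 3, ...] -> uneven hold times (3:2 pulldown judder)
    print(cadence(24, 120))   # [5, 5, 5, 5, ...] -> every frame held equally long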

Exactly right. 24fps has that cinematic feel we love. I spent some time when living in Europe tweaking Samsung TVs to make them stop interpolating extra frames, which made movies look like a cheap soap opera.

Weird how we are trained to dislike what is a better picture quality.


High framerate is very helpful for sports... and the 2020 Summer Olympics is in Tokyo.

Not all broadcast television is fictional. Allowing the spec to support 120Hz allows 120Hz programming for news and sports, which do benefit from the additional framerate. Fictional works can still be broadcast at their original framerate.

To be fair, TV has always had a bit more of a 'real' look anyway, ever since it moved to largely being filmed on video instead of film. It's also always been a higher frame rate (30 frames/60 fields, in that weird way where the fields still had motion relative to each other), and both CRTs and LCDs have more image persistence than film projectors.

So I'm not sure that when we're talking about broadcast television it's that surprising to have them pushing for higher frame rates.

And it's entirely possible that like other things pushing modern broadcast TV advancements, the enhanced 'reality' and better motion fidelity is advantageous to television because of the one thing people still care a lot about watching live: Sports.


I'm so tired of the majority of content being produced at 24/30fps.

It's obvious that consumers want higher framerates. Just look at any high end panel from the last few years: they all do frame interpolation.

Something is not bad by virtue of being uncommon.


>Instead of seeing a group of dwarves, high frame rates made them look like a bunch of guys in funny costumes.

It wasn't for lack of trying either - they went to ridiculous lengths with film grain and motion blur.


I imagine similar arguments were made when movies went from b/w to color

I saw 8K demos at Yodobashi Camera a few years ago, and I couldn't tell the difference between 4K and 8K.

I can tell the difference between HD and 4K, so my eyes aren't completely shot. But I think we're getting to the point of resolution for resolution's sake. Like having a speedometer that goes to 200 MPH, when the legal limit in your state is 85, and the car simply isn't capable of 200.

I remember back when the Amiga came out. My friend who had one said it showed 4,096 colors because that was the limit of human perception. Now I know he was wrong, but it made sense at the time.


What you saw was the limitation of display technology, not the potential in 8K. 70 mm IMAX is equivalent to something like 12K (5-perf) or 18K (15-perf).

What is needed is better display technology to exploit the 8K.


Or giant screens. The problem is density not resolution.

But your eyes can't see the whole screen. Even at 40 degrees you lose detail at the edges.

Give me 120fps and HDR before 4K, let alone 8k


Anecdata, I know, but I am much happier with my 165Hz, 2560x1440 monitor than my same-sized 60Hz 3840x2160 screen sitting next to it. The extra fluidity given to mouse motions, text/window scrolling, etc is a much bigger improvement to usability than I would have initially guessed it would be.

Do you mind sharing what model that 165hz monitor is?

It's the Acer Predator G-Sync monitor from a couple of years ago that I picked up during a Costco sale. At the time I paid ~$500, not sure what they're going for now. It's been great. Wish I'd gotten two of them.

[edit] picture of the Costco inventory card, http://i.imgur.com/hNWgFt7.jpg


What card(s) are you driving it with?

Plain-jane EVGA GTX 1080SC. So chosen because it was the only 1080 I could find at the time that only needed a single additional power header, I was building a mini-ITX based workstation, and space for wires was at a premium. No complaints, though, it's held up like a champ.

(This is the card: https://www.amazon.com/EVGA-GeForce-Support-Graphics-08G-P4-... )


When you're stretching a movie across an IMAX-sized screen, you really want the pixels. I can't fit an IMAX screen in my living room though, so I can't really see the benefit.

And unless the movie was designed for that IMAX field of view, it's going to look some combination of ridiculous and terrible.

I remember sitting a bit up front in a theatre and thinking "I can see the pixels". And worse, it was slightly out of focus (maybe intentionally, to hide them).

From that day on every time I see a 4K film in theatres the low resolution bugs me. 8K should get us much closer to the old 35mm experience.


I hate it so much. Not only is the resolution totally shot now that we have fully digital projection systems, but the compression artifacts in some movies I've been to are worse than if I had just pirated the damn thing at home.

It took me multiple movies watching over time to start seeing the difference between HD and 4K. I certainly couldn't care about 8K at a single demo.

Really? The day I got my 4k TV I could see the difference the moment I put up some 4k content - I thought to myself - holy shit, this is too good! Now I can see the difference when going back to HD the moment it happens :(

Consider this: 4K DCP, which is just a little larger than consumer 4K (4096x2160 vs 3840x2160), is good enough to project on 40 foot movie theater screens. 8K has FOUR TIMES the pixels of consumer 4K. The benefit of that resolution on a 65 inch screen is, uh, negligible.

Right, but the size of the screen is irrelevant if you don't also take into account the distance you sit from that screen.

An 8k 360 degree immersive environment might not be quite enough resolution to get fine detail if you were looking at it through a VR headset.


I think the short way to say this is viewing angle. If you're seated where the angle between one edge and the opposite edge is more than a typical situation, then you'll want more than typical DPI and thus resolution.

One rule of thumb I recall hearing is that the TV diagonal size should be about 1/3 to 1/2 the viewing distance. I'd need to brush up on my trig to get degrees from that.
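
(Doing that trig, assuming a 16:9 panel, the rule of thumb works out to roughly a 17-25 degree horizontal viewing angle, well under the 40 degrees in the top comment.)

    import math

    def angle_from_ratio(diag_over_distance, aspect=16/9):
        """Horizontal viewing angle when the diagonal is this fraction of the viewing distance."""
        width = diag_over_distance * aspect / math.hypot(aspect, 1)
        return math.degrees(2 * math.atan(width / 2))

    print(angle_from_ratio(1/3))   # ~16.5 degrees
    print(angle_from_ratio(1/2))   # ~24.6 degrees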


But have you ever seen a film captured and projected in true 70mm IMAX? The difference between that and 4K DCP is immense.

Yes in Movie Theatre. But not at home.

Counter-intuitively perhaps, many of the benefits of 8K become readily apparent on tiny screens intended for use directly in front of one’s eyes, such as VR/AR headsets.

Discussions of resolution against screen size are meaningless without consideration of viewing distance.


For VR/AR the most important feature is pixel density.

I thought for VR it was refresh rate?

The refresh rate of the current generation of headsets is already enough for what we need. The next bottlenecks on the road to a human-vision-like experience are peripheral field of view, pixel density and the vergence-accommodation conflict.

I'm gonna need a bigger living room.

"car simply isn't capable of 200"

what if it's going down a steep hill?


If it's my car, the wheels will fall off before it reaches the bottom. Mine starts to shake around 90, then smooths out until about 120, then starts to shake again.

My mother-in-law in the back seat was not happy.


And you think it's because of the bumpiness of the ride? You're going to kill someone driving like that on public roads.

I was in the backseat of a car driving like that in high school. Driver lost control while changing lanes, we swerved across multiple lanes of traffic, exited the interstate and flipped five times. The two people in the car not wearing seatbelts were thrown from the vehicle as we rolled.

Nobody died, but by rights any or all of us should have.

Seriously, stop.


It is absolutely common to drive that fast on German roads, with fewer accidents per mile than in America. It needs good driver education, good roads and good cars, and then it's fine.

It's 200 MPH, not KMH. Going ~320 KMH is not absolutely common on German roads

Even seeing cars capable of going 200mph is a rarity. Very few cars can go 200mph and they're all rather expensive.

The parent was talking about 120 mph, which is close to 200 kmh.

> And you think it's because of the bumpiness of the ride?

No, because of the flimsiness of the car.

> You're going to kill someone driving like that on public roads.

Ordinarily, I'm on your side with that. But this was a one-off passing a Winnebago on a downhill in a part of the country where you can see the road is clear for 20 miles ahead through the valley up into the next mountain pass.


Most cars are drag limited well before 200mph. Terminal falling velocity is only about 120mph for a human, cars are less aerodynamic and a straight vertical drop isn't usually considered a hill.

There are other forces acting on the car besides gravity.

Most certainly no. Your best chance is putting the gearbox in neutral and even then an absolute free fall wouldn't even do it. You'd hit terminal velocity well before that.

aww :/

Good luck getting tires rated to not fail at 200 mph even if your engine and transmission can propel your car to that speed.

Tires have a maximum speed rating. For my tires this is only like 120 mph.

How far away were you standing and how big was the TV?

This is certainly not the case for gaming. I use a 4k 24" at a desktop and still need to turn AA on to prevent noticeable aliasing.

It looks like 5k is the sweet spot for 27", and 8k for 32".

Not to mention VR needs 8k to provide 1080p-like resolution over a large field of view.
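
(For comparison, the pixel densities those combinations work out to, assuming 16:9 panels.)

    import math

    def ppi(h_pixels, v_pixels, diag_in):
        return math.hypot(h_pixels, v_pixels) / diag_in

    print(round(ppi(3840, 2160, 24)))   # ~184 (4K at 24")
    print(round(ppi(5120, 2880, 27)))   # ~218 (5K at 27")
    print(round(ppi(7680, 4320, 32)))   # ~275 (8K at 32")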


Extrapolating from the current consumer situation, we can expect huge-ass 8K TVs which display a cringingly poor picture, and users won't care. The usual contemporary digital TV setup is:

1. set top box is by default sending only 720p to the TV

2. HDMI input on the TV is by default in "video mode", where it rescales the picture even when fed native 1080p

It took me months to eventually figure this out, and now I have sharp pixel-perfect FullHD picture...that makes compression artifacts pop up :(


It won’t be that bad.

STBs from Comcast over the past few years support 1080p and the ones released over the past year support 4K. Also most newer Android TV based STBs support 4k. You’re probably saying “yeah, but the content doesn’t”, oh, but it does.

Companies like Comcast and OTT providers like DirecTV Now support 4K linear broadcast, as most transcoders from the past five years support 4K streams and most broadcasters are supplying these providers with 4K streams. Even the recent move to HEVC is making these providers re-encode their VOD libraries, and titles are being re-encoded at 4K.

The transition to 8k won't be as bad as when people had 1080p TVs with 480i content.


Comcast sends everything to the home as 720p, regardless of what the box can do.

This is false.

Some channels in some markets may be at 720p and some older STBs top out at 1080i, but recent boxes like the XG1, any boxes that are IP, like the Xi series, and all of their apps take top bitrate profiles, so channels like your local affiliates or ESPN are received at 1080p or better.


Either that or 1080i, depending on the channel. Disney/ABC/ESPN networks all use 720p, as do Fox networks. CBS/NBC and all their networks produce in 1080i.

There may be some 4K or 1080p on-demand or special event content (i.e. Olympics), but it is very rare.


With the exception of local access, overseas channels, and some really small, locally produced content for local affiliates, it's 1080p. A lot of companies transitioned from 720/1080i in 2012-13. If you have an older STB, you might be seeing content scaled down. Fox's animation I thought was produced in 720 but broadcast in 1080i; you see that with Cartoon Network and Nickelodeon too.

CBS/NBC get converted to 720p by comcast before you see it.

This is still not true. I've personally worked on some of these transcoders; unless your affiliate has something running on a Pentium 4 in their back closet from 2003, they're not broadcasting something that is down-converted.

There are complaints all over their forums about this and CS agents confirming it. They started to do it about a year ago. Also, it’s very obvious when watching.

https://forums.xfinity.com/t5/Non-X1-Service/1080i-channels-...


The overcompression/poor compression of streaming video is scandalous. Netflix cannot even offer a decent DVD quality stream; it seems perverse for them to even offer 4K.

I have far, far worse compression artifacts in cable broadcast (apartment provided, can't opt out) than from Netflix, and I can only pick out artifacts in Netflix 4K if I'm actively trying.

Where streaming services do fall down for me is audio stream handling. If you stream high quality in most services on most dedicated streaming devices you get 5.1/7.1 by default, with awful, unconfigurable downmixing to 2.0/2.1 that makes you crank the volume to make dialogue audible but is still immediately drowned out by the slightest peripheral foley effect. Amazon video is the worst about this.


When I watch '4K' content on Amazon Prime Video, I can see huge compression artifacts. Now, I don't really expect to see great quality 4K streaming video on a 25 Mbps connection, but it'd be nice to have an option for higher bitrate 1080p instead of mediocre 4K.

I still haven't seen any streaming service look nearly as good as a Blu-ray and this is on a mid-range TV, from farther away than I should be with probably-should-wear glasses vision.


For years digital camera manufacturers scrambled to have the camera with the highest number of megapixels, as that was what the majority of consumers looked for. It feels like that's where we're at with TVs these days.

I personally don't see the point in anything higher than 4k. I'd much prefer if manufacturers focused on better colour reproduction, better blacks, better audio and better network and media functionality.


I'd be happy with just proper mixing that doesn't make vocals a second or third-class citizen to orchestral swells or explosions.

I don't even have bad hearing, and I had to get a 5.1 system just so I finally had a proper center channel.


You didn't even mention the horror of the level of color reproduction on the cheap models that most people are buying. Or how some TVs don't even seem to have a way to fully disable motion interpolation.

We used to have a HiVision signal in Vladivostok in the early nineties; the difference between blurry, coffee-muck Soviet TV and the Japanese TV sets was dazzling.

People back then were paying serious money for a HiVision set, but I remember one case in particular. Some rich dude from Moscow flew all the way to Vladivostok to buy a HiVision set from my father.

First he paid a ton of money and happily flew away to Moscow; then my father got an angry call: "the TV set is broken!" A few months later he returned with the set, my father asked him to turn it on to see what had happened, and it "miraculously" started working.

The rich dude couldn't hide his embarrassment and disbelief, and flew back. As you can guess, he wasn't away for long, and that time my father answered, "if you're that rich, maybe you should buy your own satellite," and pointed him to the fine print in the contract.


I don't have cable; I watch Netflix and stream from my PC in 4k all the time, and it looks amazing. Netflix 4k is limited but they are adding more all the time.

The article says they're not just broadcasting in 8K, but 8K at 120 fps!

How many displays are actually capable of not just displaying 120hz, but actually accepting a 120 fps input (as opposed to just interpolating)?


120fps is the real news here. Most people will see no difference between 4K and 8K, but 120fps is an immediately obvious improvement. People will probably attribute the sharper image to the higher spatial resolution, but it will really be the result of reduced sample-and-hold blur. I'd take 1080p120 over 8Kp60 any day.

> is an immediately obvious improvement.

Technically, yes.

I'm one of those weirdos who thinks movies aesthetically look better at 24 frames for the "dreamlike" quality. Racing/Action Sports, yes, please more frames.


I hate 120Hz, especially TVs that try to add in the missing frames giving you the soap opera effect. Even The Hobbit films in HFR (and I think those were 60fps?) looked weird and unnatural. It was like being on the set where you could just tell how fake it was.

I always disable motion smoothing on 120Hz TVs. I think high frame rates are really only good for video games and live sports. For fictional cinema/tv, it just looks weird and unnatural.


24 fps is an art medium with certain features and incentives. Its moving shots are automatically blurry, and the audience is used to that, so you can use them to, e.g., give the audience's attention a rest between two important still shots.

Similarly, just because you can technically put the entire scene into focus all the time doesn't mean you necessarily want to!


>Similarly, just because you can technically put the entire scene into focus all the time doesn't mean you necessarily want to!

Sometimes you technically can't put the entire scene into focus. Even if you have unlimited lighting, there's only so far you can stop down the aperture before diffraction wipes out your depth-of-field improvement. Citizen Kane famously used composite shots to achieve extreme deep focus that would have been impossible as a single shot.

I'd love to see another attempt at a photorealistic 100% CG movie like Final Fantasy: The Spirits Within. We'd be free from the limitations of real world optics, and could produce a "hyperrealistic" style even better than what you can see with the naked eye. I find shallow depth of field almost as annoying as low frame rate.


> I'd love to see another attempt at a photorealistic 100% CG movie

Isn't that what Disney's Lion King remake is doing?


I watched the trailer, and it has some impressive CG, but I think it's cheating to use animals. It's easier to spot flaws in CG human characters. I still count FF:TSW as the most ambitious CG animated movie ever made.

I hadn't seen it since it was in the theater, so I just watched a trailer. The CG holds up surprisingly well (modulo the uncanny valley) considering it will be old enough to vote next year.

I'm with you. I remember watching the Hobbit at the theater and being creeped out by how surreal all the movement was.

I'm okay with playing video games at 60fps, but my brain keeps rejecting photorealistic content above 30fps


As someone who has watched a lot of Let's Plays, recordings of games done above 30 FPS kind of weird me out. I don't usually have an issue playing them myself, but if I watch a recording of someone else's session (which is interesting to me from a game design perspective -- where did they get stuck, what did they breeze through with no challenge, etc.), I get that same surreal "something's not quite right here" feeling as higher frame rate television.

Huh, I get the opposite (for games I've seen running at 60) - footage recorded at 30fps looks a little off to me.

A movie at a frame rate higher than 24 looks strangely “cheap” to some of us. I believe it’s because on a subliminal level, the higher frame rate reminds us of a television broadcast.


This is definitely real.

This was quite noticeable to a trained eye in the finale episode of David Lynch's Twin Peaks: The Return.

As I was watching it, somewhere mid episode where the story moves between two realms of reality, I noticed the image suddenly had a much more real (or “cheap”) appearance.

I was in awe. That was a super nice touch done at a technical level which really helped elevate the story itself.


I expect that if good high frame rate content was more common, this effect would go away. It's a conditioned response, not an innate one.

I also expect the effect to be lesser among younger people who are used to e.g. video games, although I have no actual evidence (anecdotal or otherwise) that this is the case.


That sounds plausible. I grew up without a TV, and I've probably watched <200 hours of live TV total. 60fps looked immediately and obviously superior to 24fps the first time I saw it, and I find it incredibly weird that anybody could prefer 24fps. 24fps isn't even close to realistic.

That's why some refer to it as the "soap opera effect".

And although I remember 60fps being very noticeable at first, I don't think I notice at all any more.


Same here, I was weirded out by it at first, but over time got very used to it and it felt natural.

I never thought about that. Now it makes sense why they still use film to shoot some shows, even though it adds a lot of extra cost (the media is more expensive, it is time consuming, it needs better lighting, etc.).

> I'm one of those weirdos who thinks movies aesthetically look better at 24 frames for the "dreamlike" quality.

That’s not weird, that’s what the majority of people prefer. Otherwise they wouldn’t keep doing it.

It would be more weird if you said you preferred films at 60fps


Add me to the list! I hate the way movies look at higher frame rates but the broad category of sports (traditional/e/motor/etc) all look much better.

>I'm one of those weirdos who thinks movies aesthetically look better at 24 frames for the "dreamlike" quality. Racing/Action Sports, yes, please more frames.

No, you are not a weirdo. Perfectly normal, with great taste in movie frame rates and sports/action frame rates.


I notice they mentioned broadcasts in 59, 60, and 120 Hz.

I’m wondering if the 120Hz is just frame-multiplied 24Hz video. Watching a 24Hz movie would certainly be a better experience at 120 fps than at 60, since there won’t be the experience of judder as 120 and 24 divide evenly.

Regarding displays capable of accepting 120 Hz input, there are plenty of computer monitors that can do that at least.


> Regarding displays capable of accepting 120 Hz input, there are plenty of computer monitors that can do that at least.

Yep, just not televisions! That's a big difference, since most people don't want to put monitors in their living rooms.


How much bandwidth would you need to show that type of video? I feel like my broadband network couldn't even handle that.

I haven’t even bothered with a 4K tv yet because my internet couldn’t handle it. I think it will be a very long time before most Americans can get good 4K streaming, let alone 8k.

The bitrate of 8k is high enough that they can only broadcast it via satellite.

This is only going to be for sports bars.


It's fairly common to receive TV broadcast by satellite.

About 10% of American households have a satellite dish, and about 20% of western European households. I think it's much higher in parts of Africa and western Asia, where there isn't the infrastructure for terrestrial broadcasts, or where satellite receivers get around state censorship.

https://en.wikipedia.org/wiki/Satellite_television_by_region


To the question of how much resolution is enough, my answer is: can you see the pixels when you put your eyeball right up to the screen? If yes, then resolution can be improved. If no, then there is no need for more pixels. On a modern big television (e.g. > 45in), the resolution required to pass this limit is well, well beyond 8K.

You can argue that because you look at the screen from a distance this criterion is irrelevant, but there are all sorts of discussions and arguments about perception of clarity and whatnot which make calculations based on distance hard to defend properly. The only way to be sure is to be unable to perceive pixels even when the screen is right in front of your eyeball. Like actual things in life.

I'm convinced resolution will improve until this limit is reached, as it should.


No, it should not. No one puts their eyeball to the screen under any normal circumstance. And each additional pixel has implications for bandwidth. Bandwidth is a limited commodity. I don't want my internet bandwidth reduced by your desire to watch TV from less than an inch away.

Do you think your internet would be faster right now if no one had ever managed to make video streaming work?

I think it's quite the opposite.


We need smaller pixels because the next gen tv will be a millimeter away from your eyeballs

Yes, we need smaller pixels for that, but that doesn't mean we need more than 4k.

Eyeballs close to the screen everyday: VR.

Now we're talking about the angle each pixel subtends, so the size of each pixel given different screen placement distances. This does not mean we need more than 4k, just that we need smaller pixels for screens that are meant to be held closer to the eyes.

That's essentially what Apple's "retina display" definition is, isn't it? Assuming some reasonable distance, a minimum ppi that doesn't allow seeing individual pixels gives you the Apple line's retina screens (I forget the actual numbers they use).

What GP is saying is that there is no minimal reasonable distance, just like in real life.

I would say the minimal reasonable distance is the minimal focus distance of a single eye, a few centimeters.


Wow, and I haven't even jumped up to 4K yet. It seems like technology might be advancing faster than many producers can keep up.

As I understand it, in the anime world most airing shows are still being animated at 720p or at slightly higher resolution, with a handful of them occasionally being animated at 1080p. Although the best studios typically clean things up and make a huge number of improvements for their 1080p BD release. Movies are generally animated at 1080p.

Older cel-drawn anime are being scanned at 4K / 5K to produce new 4K releases, and I'm told the level of detail in those is incredible, which makes sense given that they're hand-drawn with extreme care. Currently the best known release is probably Ghost in the Shell (1995).

Even the incredibly popular movie Your Name was animated at 1080p. The special edition is upscaled 4K with HDR.

The high cost of 4K releases also creates a barrier. A standard 4K Blu-ray movie seems to go for around ~$80. If you want both the English and Japanese versions of Your Name at 4K you're looking to pay around ~$195, admittedly that's for the 5-disk collector's edition. Compare that with a typical ~$20 1080p Blu-ray release.

Being unable to prepare a personal backup is also a problem. Ideally I'd dump the movie right away so the physical medium can be stored in as pristine conditions as possible. It would feel very frustrating to spend that much money, only to have it casually damaged.


Can't speak for anime, but for the normal cinema world, iTunes has daily 4K sales for $5. Picked up a bunch on Black Friday. You can also get disc + digital UHD deals on Amazon for less than $10.

I used to pirate a ton of movies, but with the deals these days on 4K + Dolby Vision + Atmos titles it's worth it just to buy.


Storing 8K media is going to be very expensive for the common man.

The common man will stream 8K. Discs are slowly on their way out, and local media servers are for people on HN rolling their own and rich people buying things like the Stratos S.

I saw nothing about bitrate. I can't help thinking that the same bitrate would look better at 4K, or even 1080p.

Sometimes higher resolution can even reduce the bitrate, since jagged edges become smooth transitions. Basically, the codec has the freedom to reduce the resolution if that's what will maintain the most quality.

I only know about this practically in regard to 10-bit anime being more compressible than 8-bit, but I assume the same logic applies.


I don't think so. 10-bit improves compression because posterization leads to noisy patterns that must be encoded. By smoothing the posterization out, encoding a gradient, and having the decoder produce the noise, you can save bitrate.

But the equivalent in spatial resolution is encoding near-horizontal lines... which isn't a common situation (unlike gentle color gradients) and isn't a problem in the first place.
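
To make the posterization point concrete, here's a toy numpy sketch (my own illustration, not what any particular encoder does internally): quantizing a shallow gradient to 8 bits collapses it into a handful of visible bands, dithering hides the bands at the cost of high-frequency noise the encoder then has to spend bits on, and 10 bits simply has enough levels to keep the gradient smooth.

    import numpy as np

    # A shallow gradient spanning only a few 8-bit code values, as in a dark sky or a vignette
    x = np.linspace(0.47, 0.50, 1920)

    q8  = np.round(x * 255) / 255     # 8-bit: the gradient collapses into visible bands
    q10 = np.round(x * 1023) / 1023   # 10-bit: four times as many levels, much finer steps

    dither = (np.random.rand(x.size) - 0.5) / 255
    q8_dith = np.round((x + dither) * 255) / 255   # bands hidden, but now there's noise to encode

    print("8-bit levels: ", np.unique(q8).size)        # ~9 bands
    print("10-bit levels:", np.unique(q10).size)       # ~32 much finer steps
    print("dither noise (std):", (q8_dith - x).std())  # extra signal an encoder must spend bits on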


How does posterization differ from banding? I know that when I encode videos from an 8-bit color space, I sometimes have to generate an additional layer of noise so that I don't end up with those horrible macro-blocking blotches you get with banding.

Posterization is the technical term for this sort of banding.

Banding can refer to all sorts of things that form bands, like fixed pattern noise from a camera sensor.


> Sometimes higher resolution can even reduce the bitrate

I cannot imagine how this could ever be true. Have you experienced it actually happening for the same content at the same encoding settings?

"bitrate" = file size / duration. There is no reason why increasing the resolution should ever make a video more compressible. This is completely different from the 10-bit situation you described, which takes place because you're storing data in a different format.

You could choose to encode a higher resolution copy of a video at a lower bitrate, but it's nothing an encoder will do by default, and it should always result in the higher resolution version looking worse.
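
For a sense of scale, a back-of-the-envelope sketch (my own illustrative figures; the 0.06 bits-per-pixel budget is a hypothetical ballpark, not a measured encode): at a fixed bits-per-pixel budget, bitrate scales linearly with pixel count, so 8K needs roughly four times the bits of 4K for the same per-pixel quality.

    def bitrate_mbps(width, height, fps, bits_per_pixel):
        # Encoded bitrate implied by a fixed bits-per-pixel budget
        return width * height * fps * bits_per_pixel / 1e6

    for name, w, h in (("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)):
        print(name, round(bitrate_mbps(w, h, 60, 0.06), 1), "Mbps")
    # 1080p ~7.5 Mbps, 4K ~29.9 Mbps, 8K ~119.4 Mbps at the same bits per pixel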


This is what's most important. A lot can be said about internet streaming, but some satellite services have 4x the bitrate using a comparable format.

My thought too. The 4K streams still don't look as nice as most, if not all, of the Blu-rays I have.

I remember the original StarCraft: 256 colors at 640×480, and it was a feast for my eyes. It will always be about the content, not the medium.

HDTV was when content stopped being an eyesore to me. Those DVDs were just blurry on a good screen. But I still don't think I really need 4k.

Honestly my completionist instincts make me like 4K/8K.

It's the point where I think we can pretty much say, okay, we are now finished with resolution increases. There's nowhere else to go. We've 100%'d that part of the tech tree. Weirdly satisfying.


Agreed. An 8K monitor will be great: the equivalent of a 300 DPI colour laser printer, genuinely letter-quality text. The quality is good at 4K, but there's still room for visible improvement. I can still easily distinguish individual pixels, so we're not yet down to the Rayleigh limit. To be fair, there will still be room for improvement above and beyond 8K, but the gains will likely be marginal, not to mention the image itself getting prohibitively expensive to render and send to the display.

That's also why I want HD audio. I'm not saying I can hear the difference above 16-bit 48kHz, but I feel more comfortable with a 200-400% buffer. And maybe someday I'll get bionic ears.

Considering that with proper dithering, 12-14 bits is the threshold, you already have at least a 400% noise-floor buffer.
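
For reference, a small sketch using the standard ideal-PCM formula (roughly 6 dB of dynamic range per bit; nothing specific to any particular DAC):

    def pcm_dynamic_range_db(bits):
        # Theoretical SNR of ideal PCM quantization: 6.02 * N + 1.76 dB
        return 6.02 * bits + 1.76

    for bits in (12, 14, 16, 24):
        print(bits, "bits ->", round(pcm_dynamic_range_db(bits), 1), "dB")
    # 12 -> 74.0 dB, 14 -> 86.0 dB, 16 -> 98.1 dB, 24 -> 146.2 dB

So 16-bit playback already sits a comfortable 12-24 dB above that 12-14 bit threshold.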

The good news for you is that this is also going to be broadcast with 22.2-channel audio, so you can live in a room full of speakers and watch the Olympics as if you're in the pool yourself.

Apart from IMAX/70mm film sources...it is satisfying to watch a movie on a 4K HDR TV and know that you’re getting almost all (if not all) the visual information the director captured on the medium. Assuming the transfer was handled well (which most are).

Seeing the film grain detail on older movies is strangely satisfying to me. Even newer movies shot digitally are captured at 2K or 4K.


Except when movies shoot in 8K and digitally downsample to 4K after editing.

https://www.red.com/news/gotg-vol-2-to-shoot-on-8k-weapon


I'm not sure 8K will completely suffice for the VR experiences of the 2030s.

>> "We've 100%'d that part of the tech tree."

"640k ought to be enough for anybody."


SC1, Ultima Online and Mario N64 are three truly beautiful games that stood out to me personally as not just games, but art, and not just a visual medium. All in an era before the current billion-dollar industry and tech. The Ultima Online and SC soundtracks are actually wonderful.

All these years later and the UO Parody songs still make me smile:

https://www.youtube.com/playlist?list=PL2D6D575572C1A7C6

Don't Bank So Close to Me ... Just a Thief Ya Know ... 50 Ways to Loot Another ...


Thanks for that, it made me smile as well. I remember receiving links to those on ICQ. Sadly, I lost my old ICQ number, and all those friends as well.

You're absolutely right about that. I always thought getting a bigger screen would dramatically increase my viewing pleasure, but I don't think it even counts for 5%. Content is still king.

There is actually a standard distance from the screen that maximizes viewing pleasure. I think it's 10ft, but that's a rule of thumb; it changes with screen size.

https://www.crutchfield.com/S-2baTDHKQBkY/learn/learningcent...

I think it has something to do with keeping the full screen in view at all times, so that it's centered and fills a particular percentage of your field of vision. I couldn't find the study, which was about sports and movie watching, but I bet such a standard is even more important for gaming.

Basically, relatively speaking, your screen should be the same size in your field of view, but the fact that it's physically bigger allows for better detail.

4000px from 5 feet doesn't look as good as 8000px from 10 feet, even though they fill up the same area in your field of view.
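
A quick sketch of that claim (my own numbers: a hypothetical 56in-wide screen at 5ft versus a 112in-wide screen at 10ft, so the field of view is identical): the angular size is the same, but the pixel density per degree doubles.

    import math

    def pixels_per_degree(h_pixels, screen_width_in, distance_in):
        # Horizontal pixels divided by the horizontal angle the screen subtends
        fov_deg = math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))
        return h_pixels / fov_deg

    print(round(pixels_per_degree(4000, 56, 5 * 12)))    # ~80 px/deg at 5ft
    print(round(pixels_per_degree(8000, 112, 10 * 12)))  # ~160 px/deg at 10ft, same ~50 degree field of view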


What really improved it for me was installing a backlight à la Philips Ambilight, just DIY.

To be more pedantic, it's always about how the content transforms, how many attributes it has, and how you can interact with it.

Been waiting and looking forward to 8k. It would make a significant difference for me.

I use a UHD/4k 40" TV as a monitor, running at native resolution. It's essentially four 1080p 20" screens in one. The pixel density is pretty good, but not "retina"/high DPI level.

Things can be a bit too small as I sit further from the screen than a regular monitor.

With 8k, it'll be four 4k 20" screens in one.

I'd be able to go to a larger screen without sacrificing much on pixel density. I would no longer have to zoom in on certain websites and can just get a larger screen.
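
For reference, a quick density sketch (plain diagonal geometry; the 55in figure is just a hypothetical larger panel, not a specific product):

    import math

    def ppi(h_pixels, v_pixels, diagonal_in):
        # Pixels per inch along the diagonal
        return math.hypot(h_pixels, v_pixels) / diagonal_in

    print(round(ppi(3840, 2160, 40)))  # ~110 ppi: 4K at 40in
    print(round(ppi(7680, 4320, 40)))  # ~220 ppi: 8K at the same 40in size
    print(round(ppi(7680, 4320, 55)))  # ~160 ppi: 8K at a hypothetical 55in panel

So even at 55in, an 8K panel would still be noticeably denser than today's 40in 4K screen.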


I've seen setups like this but have never really understood the point. You're either left sitting far away and scaling everything up (reducing the effective resolution), which defeats the purpose of having a larger display, or you're exposing your eyes to the constant light of more screen space than you can actually see and putting up with the eye strain of looking at a 1:1 virtual-to-physical pixel ratio. It seems to embody the stereotype of American consumer logic, i.e. bigger and more is always better.

The setup is fantastic for me, and would be even better with 8K screens.

I'm running at native resolution, so not doing the "sitting far away" option.

"More screen space than you can actually see" - This is somewhat true. There's an upper limit to the size where it becomes impractical for most use cases. I'd say that limit is 40"~55", and mitigated somewhat by curved screens.

It's not too different from having a large quantity of screens you'd see at say, a trading desk, but all conveniently in one screen.


When will we finally have 4 8k 20" screens in one?

To be able to watch 8K you not only need an 8K-capable television

Why is all this high-resolution media walled behind decoder boxes and DRM? 4K Blu-rays have some fairly intrusive DRM as far as I understand, and then there's 4K streaming like on Netflix, which requires hardware DRM for playback on a PC.


Content owners would have loved for people to adopt DRM as standard way back when VCRs first came out. Higher resolution is just the carrot they are using to get consumers to adopt their DRM standards.

It's because the rights holders in Japan (many of whom are broadcasters themselves) have broadcasters by the balls and demand it, and the Japanese Government goes along with it. In particular it's music labels, but various sporting events also explicitly require it. HD broadcasts in Japan (including free-to-air terrestrial) are encrypted with B-CAS, a system that has been broken a few times over the years. The 4K/8K transmissions over satellite use a new system called A-CAS that, instead of being smart-card based, is an ASIC in the device that apparently receives keys and firmware out of band from the video mux. These kinds of systems have been the status quo in that market for over a decade, so any new format coming along will almost certainly have to implement _some_ form of conditional access, at least until the value of the content on it becomes a minority stake on the balance sheets.

I'm honestly not sure what the DRM situation is on 4K blu-rays—I don't own an "official" 4K blu-ray player—but it should be noted their efforts amounted to basically nothing. With the right drive, you can rip 4K BluRays easily. I know because I have done it.

Consumers need to speak with their wallets if they want to effect change in this arena.

That isn’t quite fair.

Consumers aren’t told about DRM. The spec sheets don’t say what restrictions are in place. You only discover after the purchase that some capability you had assumed you’d have doesn’t exist.

E.g. for me, I’ve had this twice: trying to view an HD movie over a component cable—nope, not allowed. And second, my daughter discovering she can’t use music from a streaming music service in a home video. Both occurred well after the initial purchases.


"my daughter discovering she can’t use music from a streaming music service in a home video"

Can you say more about where the DRM applied in this case? I would expect it to kick in, say, if you upload a home video to YouTube that incorporates copyrighted media, but I would be surprised if something prevented you from making a video (say on your phone) that included whatever music playing over speakers.


I imagine the OP was more referring to trying to whip something up quickly in iMovie (or similar). If you own the track, this is trivial. If you're used to Spotify, not so much.

As if consumers were given any choice in the matter.

The environmental impact of all the TV churn... :/

Agreed, I'm still not convinced I really need to upgrade to 4K yet, let alone 8K.

You'll probably notice a difference but there's really no promise you'll find any value in it.

I need 4k for programming but I've yet to appreciate 4k for video. I see it, but I just don't really care.


Out of curiosity, what do you mean when you say you need 4K for programming?

At 4k resolution, a 30-40 inch monitor gives noticeably improved text quality (even when I could only get it at 30hz). Part of the reason for this is that we're looking at the screen from 2.5-3 feet away, rather than the ~6ft viewing distance that is common for TVs.

It's hard to describe the difference, except that I have two monitors side by side -- one 4K and one 1440p -- and the difference is noticeable. Even when I scale the 4K monitor so that the window size is the same, the text is still noticeably easier to read.


For web app development my setup is as follows:

4k monitor for two panes of code

4k monitor for a browser + debugger or maybe two browsers: one mobile one desktop.

1080p monitor for docs or maybe Netflix as I work depending on the kind of work.

When I'm doing robotics or server or more general programming work I basically just use my main monitor.

Reading code on a 1080p monitor is almost intolerable now that I'm used to 4k. I can size it way down and get two full panes of code side by side.


Interesting. Thank you for sharing.

Same. We have a nice 1080p OLED and most things we watch look beautiful in 1080p. Plus these new 4K panels come with tons of weird processing and effects like motion blur, etc.

I’m still not convinced that 4K is a viable format. They’re still cranking out 480p DVDs, a technology that was supposed to be supplanted 12 years ago by Blu Ray.


If you're happy with your 1080p TV, you'll probably continue to be so, unless you get a 4K OLED TV, which may blow your mind a bit.

> 4K OLED TV, which may blow your mind a bit.

Genuine question: does it display movies in a "natural light" the way a plasma TV does, or does it display them under a "fake light" the way LED screens do (even the 4K ones)? I'm asking because I'm a movie buff who's really attached to his plasma TV, but looking at the available LED options on the market I don't see anything that satisfies me. Yes, the image is "clearer" on the latest LED 4K screens, but it also looks "fake-ish" when it comes to watching movies.


OLED is not backlit, so it probably looks better than plasma once you account for the higher pixel density. I haven’t done a side-by-side comparison though.

It's as natural and balanced as you want it to be. I recently watched "The Revenant" in 4K HDR on an LG OLED panel; the movie was apparently filmed only with natural light (not in the sense you meant, just that there was no artificial lighting for the scenes). It's absolutely stunning to watch, as everything feels spot on.

OLED is not the same thing as what they advertise as LED, which is really LED-backlit LCD. OLED screens have no backlighting, and so when properly tuned, the picture is 100% perfect. I'm with you though, I love my plasma TV from 2009 and don't plan to give it up any time soon.

HDR alone will make your plasma look old. I went from plasma to 4K, and then to HDR. Huge improvement.

It's the OLED part that blows the mind, not the 4K part.

The difference in resolution from 1080p is very noticeable and a definite improvement if you prefer more detail in your movies.

I will replace my 1080 TV when it breaks. It has already failed once; the LED backlight replacement cost 1/5 of the price of a new TV. Next time it goes to waste (the bulk of it is mostly aluminum anyway).

> will replace my 1080 TV when it breaks.

And sadly it’s been designed to break sometime around the 7-10 year mark.

E.g. using a minimal amount of solder on connections that will eventually crack after repeated warm/cold cycles from use.


Hah, solder cracks. LG panels' backlight completely shuts off if one LED dies (for, uh, "safety" reasons, they couldn’t design a proper fucking power supply). Which they do, a lot. Now, you can barely notice one LED gone. Replacing the strip costs 1/4-1/3 of a new TV. This fucking world...