Unfortunately this exhibits the usual confusion between pixel pitch and resolution, which makes it completely wrong.
> Studies show that someone with 20/20 vision (or 6/6 in Europe) can distinguish something 1/60 of a degree apart. This means 60 pixels per degree [...]
NOOOOO!
"Resolution" is defined as the closest distance two dots can be while still appearing as two distinct dots. If you have two dots shown on adjacent pixels then you just see one big dot. You need the two dots separated by a pixel in order to see that there are two dots, so the resolution is half the pixel pitch. Worse yet, the two dots might be diagonal, so you need an extra factor of sqrt(2) in your resolution calculations. This means that the pixel pitch of a TV screen is 2 * 1.41 ~= 3 times its resolution. You need 180 pixels per degree if you have 20/20 vision, not 60.
And it gets worse. Everyone knows that 20/20 vision is "normal". Well, not quite. 20/20 vision is the average without correction. Once you include glasses, 75% of people can actually see better than that, and some quite a lot better. So to be safe, let's bump the pixel density up by another third, to 240 pixels per degree. That gives 7,200 pixels across a 30 degree screen.
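For anyone who wants to poke at the arithmetic, here's a minimal sketch in Python (the factor of 2, the sqrt(2) penalty, and the acuity values all come from the reasoning above; treat the exact outputs as illustrative):

```python
import math

def pixels_per_degree(acuity_arcmin=1.0):
    """Pixels per degree needed so two dots remain distinguishable.

    Two lit pixels need a dark pixel between them (factor of 2), plus a
    sqrt(2) penalty for diagonally adjacent pixels. acuity_arcmin is the
    smallest angle the eye can separate (1.0 arcminute for 20/20 vision).
    """
    pixels_per_resolvable_gap = 2 * math.sqrt(2)   # ~2.83, rounded up to 3 above
    return pixels_per_resolvable_gap / (acuity_arcmin / 60.0)

print(round(pixels_per_degree(1.0)))    # ~170 px/deg; the comment above rounds this to 180
print(round(pixels_per_degree(0.75)))   # ~226 px/deg for better-than-20/20 eyes
print(240 * 30)                         # 7200 px across a 30-degree-wide screen at 240 px/deg
```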
So does this essentially mean that all those people who say things like "You don't need/can't distinguish 4K if your nose isn't touching the screen" are wrong?
I have always thought it was a strange thing to say. I can definitely see the difference between a 4K stream and a 1080p one, even when the screen is much farther away than the recommended distance...
To be fair, the difference in bitrate between 1080p and 4k streams is probably (in my opinion) a bigger factor than the pixel count; uncompressed 1080p looks pretty damn good.
But yeah, the "the eye can't see >20fps" claim, or its equivalent for some particular DPI, is definitely BS.
Glad to see this get some exposure. It always kills me when people buy expensive 4k, curved, HDR, etc. 50" TVs and sit over 10 feet away from them (127 cm diagonal and over 3 m away, for Europe). You just threw your money away.
Another thing is when you plan your sofa and TV furniture. Sitting over 10 ft/3 m away (a common occurrence in homes) means you will need a 100 inch screen for a cinematic experience, which means having lots of money or a projector, which brings a lot of problems itself.
My personal setup is a 65 inch 1080p Panasonic Viera plasma from 2007 at about 6.5 feet (~2 m). Everybody who hears about this thinks it's way too big for that close. But when guests ask for the specs of what they find an impressive picture in person, they are always shocked to find it's "obsolete" technology.
EDIT: as people dispute the real distance at which you can notice 4k vs 1080p, you don't need to trust any graph. Get close enough to your TV to notice individual pixels, then back away a little bit until you can't. That's the distance, and I bet you will find it a lot closer than you would think.
Also 3), bringing your couch and TV closer to each other will probably result in some wasted space around them that you can't use in any way (like 0.5m behind the couch and the TV stand). Not quite a worthwhile compromise.
Not everything is about optimizing for the numbers. You don't have to plan your TV watching experience to the mm in order to enjoy it. You can have the TV at a slightly non-optimal distance, use the integrated speakers, or view content that's not high bitrate 4K or FHD.
Reminds me of how audio engineers have to test their masters on old Toyota Corolla stereos, because that's how a lot of people will hear their work for the first time.
1) You're (outside of some relatively small models) very right here, but TV manufacturers are starting to push 8K sets with negligible benefits for most people's scenarios and little to no native resolution content.
2) Absolutely. One of my eternal frustrations is many people's hyperfocus on a single metric when comparing products (resolution for TVs, megapixels for cameras, clock speed for processors, etc.). Your average consumer would be much better off focusing on other image quality improvements like the things RTINGS includes in their reviews.
HDR you want regardless, and good luck finding anything less than 4k that's cheaper than the 4k model anyway... but curved was always a joke. The curve was always overdone, to the point that it produced worse distortion than a flat screen unless you sat at an unrealistically close and precise location. Sit anywhere else and every pro of curved turns into a con. Even when you did sit in the right place, manufacturing limitations meant it was only half curved anyway: the vertical axis was still flat.
Thankfully the curve fad is gone in TVs and the cheapest TVs come with 4k HDR anyways.
Resolution isn’t necessarily about having the ability to stop and see individual pixels though.
If nothing else, there is a difference between 1080p and 4K when it comes to aliasing. The absolute best antialiasing is to increase resolution. The effect of resolution on all edges that aren’t perfectly vertical or horizontal is noticeable.
There are also particle effect differences where your particles have a 4x boost in definition.
I’m not saying it’s a big deal, but there are factors that go beyond sitting distance. I personally think these distance calculators are made to sell bigger TVs, but I won’t immediately discount 4k as just pointless.
This reminds me of the misunderstandings around frame rate. It used to be that people would think "you can't make out individual frames of video at 24fps, so there's no reason to go higher than that".
And, yeah, I don't see a fast-moving slideshow at 24fps, but there is still a significant visible difference between 24, 30, 48, 60, 144, etc.
Just because you can't make out the building blocks of the thing, doesn't mean you can't see the effect of increasing the resolution (whether spatial or temporal).
Latency went through a similar dark age until recently. People would argue that human reaction times are at best around 100ms, but the feedback loop formed by the eyes and ears, through the hands, into the keyboard/mouse/musical instrument, and back out to the ears/eyes, is much more sensitive than that.
Rhythm games are proof of that. Despite the 80ms or so "inherent latency" that we have, we can still be pretty exact in terms of synchronization with external stimuli.
Our perception abilities are not that simple; just because you can't discern individual pixels at a particular distance doesn't mean you won't notice fine details in an image.
Also, it is fine to be too far away to notice every detail in 4k, when the alternative is going down to 2k or 1080p where you then might be sitting way too close. Real world situations tend to involve compromise.
Another factor is that typical viewing is done on streaming services. That 4k streaming video is far more detailed than a 1080 stream, but the difference between 4k streaming vs 1080 uncompressed/Blu-Ray is not quite as significant. So if you are pixel peeping or using a calculator, it seems reasonable to factor this in and feel ok sitting a little farther back than some ideal.
Cinema means you want 30-40 degrees, but once you've fixed your screen size and your distance, you can then work out if 4k is worthwhile over 2k (or 8k over 4k).
Of course this all falls down because it then comes down to availability, other features (number of HDMI ports, "smart" options, power use), and the fact that "4k" tends to also mean things like HDR, which is worthwhile (with the right source) no matter the screen size.
Recently upgraded from a 1080p 40" TV to a 4k 55". According to this, at ~12ft away, we should have bought an 85" TV?!
That's crazy and dumb. Or rather, a misrepresentation.
This article states that we _could_ get away with up to an 85" TV. It is only at that limit that we'd start to see "imperfections" as they call it.
I don't want to see imperfections. I also don't need a TV that will take up my whole field of view while bathing me in light.
The 55" we have already feels quite large, much sharper, and is brighter.
This article may be interesting from an academic point of view, but people shouldn't misconstrue it to think they should be getting a bigger TV. Get a TV that fits your space and feels comfortable. Bigger is not always better.
I think it's useful outside of an academic thought experiment.
It hasn't encouraged me to get a bigger tv, quite the opposite. My takeaway isn't that I need a bigger tv, it's that I can get by with a smaller screen and lower resolution.
Except that at the very top is a slider that gives you a recommendation of screen distance to screen size. I think a lot of folks will encounter this and stop there.
> I also don't need a TV that will take up my whole field of view while bathing me in light.
To have an immersive experience, isn't taking up your whole field of view kind of the point? I used to have a 9' projector screen in my living room, and I definitely noticed cinematography and directing a lot more with the more immersive experience.
Except I don't always want that. I can always sit closer to my TV, but I can't get farther away than the opposite wall.
I would love to have a room designed for a cinema experience with comfy couches and a giant screen, (I am working on it), but my living room is not the place for it. It would just be an eye sore (figuratively and literally).
What is the point then? I feel as though lots of people will read this and say "My couch is 10+ feet from my tv. This states that I _need_ to get a giant tv".
That exact question is addressed almost word-for-word in TFA:
> You are probably now thinking something along the lines of "My couch is 10' away from my TV, which according to the chart means I need a 75 inch TV. This is insane!". Yes, if you want to take advantage of the full capacity of higher resolutions, this is the ideal size. This brings us to the main limitation for most people: the budget.
> The price of a TV is exponential to its size, as shown in the chart. The chart shows the price range of 2016 LED TVs by their size. As you can see, the jump to a 70 inch TV is quite a big one. For example, check out the price of our picks for the best 70"-75" TVs and the best 80-82-85" TVs.
> Conclusion
> We recommend an angle of vision of 30 degrees for a mixed usage. In general, we also recommend getting a 4k TV since 1080p choices have become quite limited and lack modern features such as HDR. To easily find out what size you should buy, you can divide your TV viewing distance (in inches) by 1.6 (or use our TV size calculator above) which roughly equals to a 30 degrees angle. If the best size is outside your budget, just get the biggest TV you can afford.
Except that, though I can afford a TV bigger than 55", I'm still not going to buy one. And the 85" they recommend is _enormous_. It might make for a cinematic experience, but it poses additional problems beyond price:
1) You can never sit closer to the TV. I always sit ~12' from my TV, but walking by the tv at close range (between two doors on either side) is going to be jarring for passers-by. If I have friends over for a movie night, we would have to huddle at the far end of the room.
2) It doesn't fit the space, both physically and design wise. I couldn't put that big a tv in the space if I wanted. Even if I did, it's going to be ugly as sin unless the space is very specifically designed for cinema experiences.
3) That's going to be an obnoxious way to watch non-cinema content.
Having a TV at the other end of a 12 foot room is not an uncommon arrangement. Having a room where an 85" screen is appropriate is another matter altogether.
I feel like the article should have a big disclaimer at the top "This is the maximum size you can enjoy, not the maximum size you should necessarily buy".
It really isn't. This quote is just saying that you probably can't afford the size of TV they're going to recommend, so you should "just get the biggest TV you can afford."
As for wasting your money on a 4K set vs. a full HD set, it almost isn't even a choice anymore. The "budget" 4K recommendation at RTINGS is about $500.
Also, although it is mentioned in the text, keep in mind that not all content is high-bitrate 4K:
Cable TV? Probably compressed 1080p or 720p, depending on the channel. If it's Comcast/Xfinity, they may even squeeze three channels into the bandwidth of two.
Streaming? Compressed 1080p, with the occasional 4K.
Old (but not that old) Blu-rays? 1080p.
If you're sitting in your leather club chair, Oppo UHD Blu-ray remote in one hand and a glass of '92 Screaming Eagle Cabernet in the other, then sure--slide your chair into the sweet spot at the optimal distance per these charts.
Don't get me wrong: when I replace my ten year-old Sony, it's going to be with the biggest LG CX or C1 OLED my wife lets me get. Just realize that trigonometry isn't the whole story.
About a week ago I heard about NextGen TV (ATSC 3.0) for the first time. It's starting to roll out across the country and will bring 4k content over the air. Not sure how compressed it will be, but I remember how revolutionary it was to switch to the digital OTA channels with a converter box many years ago.
Oh I am so glad this is on HN front page so I get to ask the question. Why 8K?
I think people should try this at home. The THX recommended viewing angle for a cinema experience is 40 degrees. That is like sitting in the first few rows of a cinema, which makes me sick when watching movies. The maths works out so that for a 16:9 display, 55" means sitting 5.5 feet away and 60" means 6 feet: divide the diagonal in inches by 10 and change the unit to feet.
And assuming you are even somehow comfortable with this distance in everyday usage, you won't notice the pixels on a 4K TV with 20/20 vision, which is what Apple calls "Retina". A 50" 4K TV becomes "Retina" at a viewing distance of 39 inches.
Some would argue 20/20 visual acuity is an antique benchmark. With 20/15, Retina starts at 52 inches. That is still below the THX recommended viewing distance of 5 feet, or 60", for a 50" display.
What about 20/10? Well, that would be 78 inches, so yes, they could see those pixels. But that is only when viewing at a 40 degree angle. If you view at what most would consider a normal 30 degrees, you would be sitting 6.8 feet or ~82 inches away from the TV, i.e. you still won't see those pixels.
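If anyone wants to check these numbers, here's a rough sketch (assuming a 16:9 panel and treating 20/20, 20/15 and 20/10 as 1.0, 0.75 and 0.5 arcminutes; the helper names are my own):

```python
import math

def retina_distance_in(diag_in, horiz_px, acuity_arcmin=1.0, aspect=16/9):
    """Distance at which one pixel subtends the eye's resolvable angle."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)   # 50" 16:9 -> ~43.6" wide
    pitch_in = width_in / horiz_px                        # size of one pixel
    return pitch_in / math.tan(math.radians(acuity_arcmin / 60.0))

print(round(retina_distance_in(50, 3840, 1.0)))    # ~39"  (20/20)
print(round(retina_distance_in(50, 3840, 0.75)))   # ~52"  (20/15)
print(round(retina_distance_in(50, 3840, 0.5)))    # ~78"  (20/10)

def thx_40deg_distance_in(diag_in, aspect=16/9):
    """Viewing distance for a 40-degree horizontal angle (~diagonal/10 in feet)."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    return width_in / (2 * math.tan(math.radians(20)))

print(round(thx_40deg_distance_in(55)))   # ~66" = 5.5 feet
```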
Also note we don't test moving objects in visual acuity tests. You are watching "moving pictures" on a TV, not staring at a single pixel.
And if, in any case, we needed to optimise for the 5% with 20/10 vision who prefer to sit in the front row of the cinema, we could have gone with 5K! The jump from 4K to 5K sounds small, but it takes you from ~8M pixels to ~15M pixels. We would have room to spare even for that remaining 5% use case. And if we are so paranoid that we need even more headroom, even 6K would do.
Instead we went for 8K: ~33M pixels. Think of the bandwidth, computational power, compression, and storage that requires. We could have used those for higher frame rates where needed, 10-bit colour by default, or higher quality colour profiles. Instead we spent it all on pixels.
And before someone jumps in saying they can tell the difference between 4K and 8K: remember that the current 8K marketing label mandates HDR support and certain brightness levels, something 4K sets can ship without. What you perceive as 8K being better may have nothing to do with pixel density at all.
So apart from sales and marketing departments pushing for it (along with the Japanese government pushing it for the Olympics), much like the days of Intel pushing CPU MHz: for someone who has no expert knowledge on the subject and is merely an outsider, let me ask the question again. Why 8K?
Modern TV sales demo videos are chosen for the exact kind of thing you start to see when the resolution gets higher. Footage of the ocean waves, schools of fish, flocks of birds.
Moire effects and screengating are way larger than any one pixel. They make ripples and lumps in the image.
You get used to the way hair looks on any generation of TV but it's nothing like "imperceptible" distortion.
You can stand dozens of feet from an 8K, a 4K, and a 1080 TV and tell the difference when the image is of many tiny objects moving across each other.
Grass. Leaves. Sand. A tiled floor viewed from an angle. You know, rare stuff like that.
Blurring it absolutely reduces moire at the expense of distorting detail.
So now you just have another thing that's larger than a pixel (the blur area) that's reducing the effective resolution of your display, and you can't just say "the pixels are too small to be seen at this distance."
There will be a point where increased resolution isn't any use. When that happens they'll stop making higher-resolution TVs because human beings won't go into the store, look at the TV from far away, and say "holy shit, that has so much more detail from this distance than I've seen before."
Hasn't happened yet. We still have further to go before we hit the human visual sampling limit even for far away viewing, let alone close-up detail like reading a page of text in a scene or accurately representing a houndstooth jacket.
While I agree that 8k likely represents the limit of needed resolution for traditional TV screens, VR could use improvements beyond 8k. That is one use case for 8k+ screens.
Also, to address your storage/bandwidth concerns, I expect 8k to be almost solely an "upscaling" technology (outside some niche applications). Almost nobody in practice will stream or store 8k; rather, they will use local upscaling.
I got a mount that swings down from the wall. If you are going for a 40-degree view angle making the TV a foot closer is the same as making it 10" larger. It also has the advantage of going above furniture when not in use, but down at eye-level to the couch when in use.
When it comes to visual acuity for TV watching, doing calculations based on the suggested '20/20 vision' = 1/60 of a degree is probably wrong (on average):
"At 0.8′ (arcminutes) across, the crater alone hovers at the naked-eye limit"
Which seems to back the c. 1/60th idea.
Judging the resolution of an eye is a subjective process, not least because of the relative sensitivity and density of rods and cones at different parts of the eye.
I recently tried to appropriately set the FOV of first person shooters to match the FOV that the screen actually took up. I found that most games would not let me set my FOV that low, even when I was 1-2 feet away from a 24” monitor.
It seems strange that it's taken as a given that people want a wider FOV than is natural. It gives a competitive advantage in the majority of scenarios, but I’m surprised more people don’t stop and think “this looks off”.
The wider FOV makes me nauseous. Setting the FOV as narrow as possible lets me play FPS games. It took me a while to figure out why some games made me sick and others didn't. Turned out to be FOV.
Interesting. I think I have the opposite issue. I try to get the FOV into the 85-110 range. Otherwise I feel sick (or I need to sit far back from the monitor, which I can't really do on a PC).
I usually play on PC. When I play games on my sofa (very narrow real life FOV) the issue is so pronounced that I feel like I’m on a treadmill. It pulls me out of the experience so much that I’d rather just play a different type of game.
I have and use VR. It is not a casual experience. I set aside a good portion of a day off when I use it. Most of the time when I game, I’m tired after work and just want to unwind with some exploration or creation.
> The chart also shows that a 4k upgrade is not worth it if you are sitting more than 6' away and have a 50" TV. Your eyes won't be able to tell the difference.
Doesn't the chart say the exact opposite? The threshold for 50" 1080p is 6.3', so if your viewing distance is less than that (like 6.0'), then 4k is worth it.
This has been on my mind a lot with the whole '4k gaming' push from the most recent consoles.
4k gaming just doesn't make sense for a lot of people. I have a 55" TV and sit 8.5' away from it. This TV is huge and dominates the room. I cannot reasonably have a larger TV or sit closer without ruining the usability of my living room.
Most people simply sit too far away from too small a TV to be able to tell 4k from 1080p. When people protest that they can absolutely see a difference, what they're usually seeing is the better color and contrast of newer TVs, not the resolution.
Even when it comes to PC gaming, 4k is often not all it's cracked up to be. According to OSHA, most people sit about 30" away from their monitor. This lines up nicely with the visual acuity distance for a 27" 1440p monitor. To reach the visual acuity limit for 4k, you'd have to sit 20" away. That's a lot closer than most people are comfortable with. Even a 32" model only expands it out to 25". That's within the realm of plausibility but another thing to keep in mind is viewing angles. The main theatre certification companies recommend anywhere from a 35-55 degree viewing angle. The visual acuity distance for 4k puts you at a whopping 60 degrees which is considered high for just consuming content, much less playing a game where you're expected to see everything happening on screen and react to it.
TLDR: The hype of 4k for gaming is mostly just that. Most people would be far better served going for lower resolutions and higher refresh rates. The only people who really benefit from 4k are using it for productivity, coding, video editing, etc.
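Here's roughly where those monitor numbers come from (16:9 panels, 1 arcminute for 20/20; the function names are mine, and the outputs land close to, not exactly on, the rounded figures above):

```python
import math

def width_in(diag_in, aspect=16/9):
    """Horizontal width of a 16:9 panel from its diagonal."""
    return diag_in * aspect / math.hypot(aspect, 1)

def acuity_distance_in(diag_in, horiz_px, acuity_arcmin=1.0):
    """Closest distance at which a 20/20 eye stops resolving individual pixels."""
    pitch = width_in(diag_in) / horiz_px
    return pitch / math.tan(math.radians(acuity_arcmin / 60))

def viewing_angle_deg(diag_in, distance_in):
    """Horizontal angle the screen subtends from a given distance."""
    return math.degrees(2 * math.atan(width_in(diag_in) / 2 / distance_in))

print(round(acuity_distance_in(27, 2560)))   # ~32"  (27" 1440p)
print(round(acuity_distance_in(27, 3840)))   # ~21"  (27" 4k)
print(round(acuity_distance_in(32, 3840)))   # ~25"  (32" 4k)
print(round(viewing_angle_deg(27, 21)))      # ~59 degrees at that 4k distance
```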
> a whopping 60 degrees which is considered high for just consuming content, much less playing a game where you're expected to see everything happening on screen and react to it.
Consider that the game is probably rendering at least 90 degrees to squish down into that 60.
Personally I sit back for content but I love to sit close for games, with the FOV set to max. The really important stuff fits in the middle of the screen, and outside of the middle I get near-peripheral vision instead of bezel and wall.
32" 4k monitor at a normal viewing distance (~30" seems like a good estimate without bringing out a measuring tape) is pretty comfortable for work at 100% scaling. At that distance 1080p video looks significantly worse when compared to 4k, although a large chunk of that may be due to bitrate.
Right, the visual acuity distance for 1080p is something like 3.5' for a 24" monitor. Most people will be able to see a difference between 4k and 1080p, but I'd imagine 4k vs 1440p would be negligible.
There’s no formula to spit out a handy chart for that. Center of the screen should be eye level for an adult sitting on your couch, ballpark 3’/1m for most setups.
I know that the value here is that it saves people effort to pre-digest this knowledge, but oh man this is the stuff that high school math definitely taught us to solve.
Heck, maybe this makes it a useful interview question scenario?
When reading about these considerations I always wonder if it really matters at which point I could discern single pixels from one another. I think there is more to it when we are talking about what we can see in the image. Maybe we won't benefit in the sense of seeing less of the pixel raster, but I feel like having more detail will benefit image clarity nonetheless. Even if it is impossible to effectively discern the pixels, the combination of them might still affect what I can see...
There are two things this page shows: one is the size of screen you need relative to the distance you sit from it, based on how much of your view you want it to fill; the other is what resolution the screen needs to be before you see no further improvement.
First, decide what you want:
30 degrees means you'll be able to focus on the whole picture while it fills your focus area; that's what SMPTE recommends.
40 degrees is the "immersive" angle recommended by THX; you may lose some detail at the edge of the screen depending on where you're looking.
Then it's just a bit of trig to work out the size you need for the distance you are sitting (or vice versa) to match your preferred "fullness".
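For example, a quick sketch of that size-from-angle trig (16:9 assumed; the function name is mine):

```python
import math

def diagonal_for_angle_in(distance_in, angle_deg, aspect=16/9):
    """Screen diagonal that fills a given horizontal viewing angle at a given distance."""
    width = 2 * distance_in * math.tan(math.radians(angle_deg / 2))
    return width * math.hypot(aspect, 1) / aspect

print(round(diagonal_for_angle_in(120, 30)))   # ~74" diagonal for SMPTE's 30 degrees at 10 feet
print(round(diagonal_for_angle_in(120, 40)))   # ~100" for THX's 40 degrees
```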
Second is whether you'll benefit from more pixels. They base this on having 20/20 vision and your optical resolution being 1 arc-minute (1/60th of a degree). For comparison, the moon is about 30 arc minutes wide.
Seems reasonable to me: if I take a picture of the moon and shrink it to 30 px by 30 px, I feel it's on par with what I can see in real life.
Then it's just more trig to work out how many pixels you need for a given size/distance combination. With a 20" screen that's 10 feet away, you won't be able to tell the difference between SD and HD; with a 40 inch monitor 2 feet away, you'll benefit from going higher than 4k. (This assumes a 16:9 aspect ratio.)
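And a sketch of the pixel side, using the 60 pixels per degree implied by the 1 arc-minute figure:

```python
import math

def horizontal_pixels_needed(diag_in, distance_in, acuity_arcmin=1.0, aspect=16/9):
    """Horizontal pixel count beyond which a 1-arcminute eye sees no improvement."""
    width = diag_in * aspect / math.hypot(aspect, 1)
    angle_deg = math.degrees(2 * math.atan(width / 2 / distance_in))
    return angle_deg * 60 / acuity_arcmin    # 60 px per degree at 1 arc-minute

print(round(horizontal_pixels_needed(20, 120)))   # ~500 px: SD vs HD is moot at 10 feet
print(round(horizontal_pixels_needed(40, 24)))    # ~4300 px: beyond 4k at 2 feet
```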
Which states "At 6 metres or 20 feet, a human eye with that performance is able to separate contours that are approximately 1.75 mm apart"
Which I assume comes out at 1/60th of a degree.
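(Quick check: 1.75 mm at 6 m subtends atan(1.75/6000) ≈ 0.0167°, which is almost exactly 1 arcminute, i.e. 1/60th of a degree.)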
You could argue that the measurement is in fact wrong, or that there's something happening subconsciously, but their point is that, based on the accepted view that someone with 20/20 vision can't distinguish two contours ~1.75 mm apart from 6 metres, you don't need to display them.
NHK by the way put a lot of money into developing 8k technologies, I saw it about 10 years ago. I was more impressed with the 120fps stuff than the higher resolution. YMMV.
>Do you think it just a coincidence that the best 35mm film also seems to top out around '12k' ?
Part of the reason (AFAIK; not an expert) for the very high quality of 35mm is that it leaves enough headroom to crop the filmed image significantly without a noticeable (or at least an obtrusive) drop in image quality. That provided the very useful ability to apply some "analogue digital zoom" in post-production.
When retina came out, Jobs said that you needed 300ppi at 12 inches. Soneira said it was 477ppi.
1 inch at 12 inches is 4.76 degrees, so Jobs is 63 pixels per degree. Soneira is 100 pixels per degree.
Assuming a 16:9 display:
If you are at 1.5m/60 inch from a 50 inch display, to meet Jobs's criteria you need 1455 lines -- so 4k (2160 lines). You need 2309 lines for Soneira's criteria. You get 40 degree coverage which is the upper bounds of THX.
If you are at 1.7m/66" from a 50" display, to meet Jobs's criteria you need 1326 lines -- so 4k (2160 lines). You need 2104 lines for Soneira's criteria. You get 36 degree coverage width wise which is the "perfect" bounds of THX.
If you are at 2m/80" from a 50" display, to meet Jobs's criteria you need 1098 lines -- so 1080p is fine. You need 1742 lines for Soneira. You get 30 degree coverage.
If you are at 3m/120" from a 50" display, to meet Jobs's criteria you need 735 lines -- so 1080p is fine. You need 1166 lines for Soneira, so you're only just out of that criteria. You get 20 degree coverage.
If you are 30cm/12" from a 7.4" display, you need 1083 lines (Jobs) and 1719 (Soneira), and get 30 degree coverage.
My own TV is c. 3m and 42", which I don't think is abnormally small from what people really have for living room TVs. That's 17 degrees, and needs 981 lines for Soneira's requirement, which 1080p does just fine.
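A small sketch of the line-count calculation behind those numbers (16:9 assumed, 63 and 100 pixels per degree for the Jobs and Soneira criteria; the function is my own paraphrase):

```python
import math

def lines_needed(diag_in, distance_in, px_per_degree, aspect=16/9):
    """Vertical lines needed to hit a pixels-per-degree target at a given distance."""
    height = diag_in / math.hypot(aspect, 1)      # 50" 16:9 -> ~24.5" tall
    v_angle = math.degrees(2 * math.atan(height / 2 / distance_in))
    return v_angle * px_per_degree

JOBS, SONEIRA = 63, 100   # px/degree, i.e. 300 ppi vs 477 ppi at 12"
print(round(lines_needed(50, 60, JOBS)), round(lines_needed(50, 60, SONEIRA)))     # ~1455 / ~2309
print(round(lines_needed(50, 120, JOBS)), round(lines_needed(50, 120, SONEIRA)))   # ~735 / ~1166
```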
If you have a large screen or sit very close to it (so 30 degrees), and are looking for a better-than-retina display, then sure, 4k is for you. For a home theatre where you're aiming for 36 degrees, you'll certainly want 4k.
For the average person with an average TV, though, I don't see the extra pixels of 4k being that important compared with a good viewing position, good lighting, and a good setup of the screen.
Like the megapixel wars in cameras a decade ago, we're at the point where more pixels doesn't necessarily mean better. 4k, sure, it wouldn't be my prime consideration, but as almost all new TVs of living room size are 4k it's a given. I'd rather a 1080 TV with 6 HDMI inputs than a 4k TV with two inputs though.
I had the misfortune of watching the final season of Game of Thrones on some streaming platform from Sky (UK), at about 4 or 5 Mbit/s. It was awful, and that was nothing to do with the resolution.
I don't know who Soneira is, and I trust NHK's research more than Jobs's marketing (especially since he seems to use the same flawed arcminute criterion), particularly where TVs are concerned.
Quick point of note regarding TV resolution requirements:
I've got an old Samsung 45" screen that's only 1080p. Usually films and other media look great on it and I was pretty convinced I didn't need a higher resolution.
Last night I tried playing back some filmed-in 4k drone footage which ended up looking impressively awful. It was sharp, but lots of pixel level brightness noise changing from frame to frame, which looked grainy and was extremely distracting.
I can only assume it's the way VLC or my graphics card are downsampling to display it at 1080, but it was annoying enough I'm now thinking of a new 4k TV at some point in the future.
TLDR: Your source media might make a big difference.