
Netflix is now doing per-shot encoding for UHD content - mmcclure
https://netflixtechblog.com/optimized-shot-based-encodes-for-4k-now-streaming-47b516b10bbb
======
bscphil
It's really too bad that they haven't rolled this out for 1080p too. Their
1080p video quality is god-awful compared to Amazon. They may even have made
it worse: from the "ladder" comparisons, it looks like they jump to higher
resolutions at lower average bitrates now, so if you're stuck at 1080p you're
going to be getting worse quality after these changes than before.

Also, something that's kind of weird about all this is that encoders (x264,
x265, etc) already have their own rate control algorithms that decide on a
frame or scene basis how many bits to use. Netflix taking an approach like
this is equivalent to claiming that they're capable of doing a better job than
these codecs in an automated way - so why not just contribute the code needed
to achieve these improvements back to the projects they're using?

Last, it's very weird to select individual frames from an encode rather than a
range of frames. The comparisons purport to show that the lower bitrate
encode is better than the higher bitrate encode, but in fact (if I understand
how they're using rate control correctly) what they are showing is _a single
frame_ in the lower bitrate encode that uses _more bits_ than the same frame
in the higher bitrate encode. So it's arguably not a fair comparison. And even
with that, depending on how sensitive you are to artifacts, some of the
"optimized ladder" encodes still look worse.

Edit: here is an example of what I mean when I said that they may have made
the quality at 1080p worse. If what you end up watching is the highest bitrate
1080p available, you would get higher quality on the old ladder than on the
newer one. [https://i.imgur.com/rzPR7Sh.jpg](https://i.imgur.com/rzPR7Sh.jpg)
Actually, on some of the examples, even 720p on the older ladder is better
than 1080p on the newer one!

~~~
crazygringo
> _It's really too bad that they haven't rolled this out for 1080p too_

Maybe they will? But it matters most for 4K performance, so that's where they
started?

> _encoders (x264, x265, etc) already have their own rate control algorithms_

But they simply operate on a stream without taking future frames into account.
This does a first pass over the entire film holistically to determine where
keyframes should go and what settings to use per shot. It can't be backported
to codecs because they work linearly.

> _And even with that, depending on how sensitive you are to artifacts, some
> of the "optimized ladder" encodes still look worse._

The artifacts look worse only because it's zoomed in so crazy far. On a cinema
screen, the additional sharpness will be clear, the artifacts not so much.

~~~
bscphil
> This does a first pass over the entire film holistically to determine where
> keyframes should go and what settings to use per shot. It can't be
> backported to codecs because they work linearly.

Given that x264 already has a two-pass mode, I don't see why that is
necessarily the case. Even CRF mode uses mbtree by default, which is a pretty
complicated rate control algorithm. x264 also has pretty intelligent keyframe
determination; I almost always see I-frames on scene transitions.
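
For reference, this is roughly the existing machinery I mean, driven through
ffmpeg's libx264 wrapper (just a sketch in Python; the filename and 6 Mbps
target are placeholders, and it assumes an ffmpeg build with libx264):

    import subprocess

    src = "movie.mkv"  # placeholder input

    # Pass 1: analysis only. x264's lookahead and mbtree already spread bits
    # according to how much later frames depend on each frame.
    subprocess.run([
        "ffmpeg", "-y", "-i", src, "-c:v", "libx264",
        "-b:v", "6M", "-pass", "1", "-an", "-f", "null", "/dev/null",
    ], check=True)

    # Pass 2: the real encode. scenecut=40 is x264's default scene-change
    # threshold, so I-frames already land on most shot boundaries.
    subprocess.run([
        "ffmpeg", "-y", "-i", src, "-c:v", "libx264",
        "-b:v", "6M", "-pass", "2",
        "-x264-params", "scenecut=40:rc-lookahead=60",
        "out_1080p.mp4",
    ], check=True)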

> On a cinema screen, the additional sharpness will be clear, the artifacts
> not so much.

I think it's probably the other way around. The closer you are to a screen,
the more likely you are to notice increased resolution. But if there are large
patches of the image full of artifacts, you'll be likely to see that even far
away.

~~~
zbobet2012
> Given that x264 already has a two-pass mode, I don't see why that is
> necessarily the case. Even CRF mode uses mbtree by default, which is a
> pretty complicated rate control algorithm. x264 also has pretty intelligent
> keyframe determination; I almost always see I-frames on scene transitions.

x264 will not align keyframes across resolutions and encodes. In addition,
the per-shot encodes optimize more than just IDR frame placement; they also
optimize other encoder parameters such as AQ (adaptive quantization).
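
To make "aligned keyframes" concrete, forcing alignment by hand looks roughly
like this (a sketch only; the ladder resolutions and bitrates are made up, and
it assumes ffmpeg with libx264): every rendition gets an IDR on the same
2-second grid and x264's own scene-cut keyframes are disabled.

    import subprocess

    # Made-up example ladder: (resolution, target bitrate).
    ladder = [("1920x1080", "6M"), ("1280x720", "3M"), ("960x540", "1500k")]

    for res, rate in ladder:
        w, h = res.split("x")
        subprocess.run([
            "ffmpeg", "-y", "-i", "movie.mkv",      # placeholder source
            "-vf", f"scale={w}:{h}",
            "-c:v", "libx264", "-b:v", rate,
            "-force_key_frames", "expr:gte(t,n_forced*2)",  # IDR every 2 s
            "-x264-params", "scenecut=0",           # no extra scene-cut IDRs
            f"out_{res}.mp4",
        ], check=True)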

> I think it's probably the other way around. The closer you are to a screen,
> the more likely you are to notice increased resolution. But if there are
> large patches of the image full of artifacts, you'll be likely to see that
> even far away.

That isn't how this works. It's not a strict tradeoff between more artifacts
and less sharpness by changing resolution. Downscaling -> Upscaling is simply
another form of lossy compression. It may look better or worse than using
those bits in another spot.

~~~
bscphil
> x264 will not align keyframes across resolutions and encodes. In addition,
> the per-shot encodes optimize more than just IDR frame placement; they also
> optimize other encoder parameters such as AQ (adaptive quantization).

Sure, that's true. (Though I don't know why it matters that keyframes aren't
aligned.) But at the end of the day the point is that Netflix has a better
rate control algorithm, and this _could_ be built into x264, even if it might
require a significant amount of work. (Which I'm sure the x264 developers
would be willing to do for a substantial quality improvement.)

> That isn't how this works. It's not a strict tradeoff between more artifacts
> and less sharpness by changing resolution.

Of course it's not a _strict_ tradeoff. It's a loose one. And yes, it may look
better or worse. That's really my only point in that section of the comment:
that bumping up the resolution earlier in the ladder as they're doing is not a
pure win, and it may look worse to some people depending on their viewing
conditions.

~~~
zbobet2012
> Sure, that's true. (Though I don't know why it matters that keyframes aren't
> aligned.) But at the end of the day the point is that Netflix has a better
> rate control algorithm, and this could be built into x264, even if it might
> require a significant amount of work. (Which I'm sure the x264 developers
> would be willing to do for a substantial quality improvement.)

You can't ABR adapt without aligned GOP boundaries.

> Of course it's not a strict tradeoff. It's a loose one. And yes, it may look
> better or worse. That's really my only point in that section of the comment:
> that bumping up the resolution earlier in the ladder as they're doing is not
> a pure win, and it may look worse to some people depending on their viewing
> conditions.

Of course not; that's why they perform an analysis of both options and select
the better one. That's what the algorithm does...

~~~
bscphil
> You can't ABR adapt without aligned GOP boundaries.

Yes, that's true of course, but not really relevant to whether you could port
Netflix's work on scene-adaptive rate control to x264. Maybe you'd lose
aligned GOPs... but for a lot of purposes (offline?) that doesn't matter.

> Of course not; that's why they perform an analysis of both options and
> select the better one. That's what the algorithm does...

The _only_ point I've ever tried to make on this subject is that in some cases
this approach fails. "Perform an analysis" is such a high level description
that it misses the fact that this is being done according to some objective
metric that may disagree with an individual viewer's personal preferences or
viewing environment. In fact, just because an objective metric says 4k > 1080p
doesn't mean the difference will be noticeable at the viewing distance the
viewer is at, whereas the additional artifacts introduced by moving from 1080
-> 4k without a significant bitrate increase may very well be visible!

------
radley
> Sometimes we ingest a title that would need more bits at the highest end of
> the quality spectrum — even higher than the 16 Mbps limit of the fixed-
> bitrate ladder.

From just a cursory inspection using my Apple TV 4K, 12 Mbps is their target
bitrate but their ceiling was ~16 Mbps.

That's still only a third of what AppleTV+ offers (36/48) and only 2/3 of
Disney+ (18/24). Netflix is still higher than Prime Video (10/14). HBO Max was
the lowest quality, allowing only 8/10 Mbps for their HD streams (which
_generously_ doubled HBO Now's infamously low 4/5 Mbps bitrate).

~~~
ProfessorLayton
I've noticed that ATV+ looks the best, which is much appreciated.

However, what has been driving me insane with _all_ these services is how the
bitrate is completely inconsistent throughout, depending on network
congestion. Every service I subscribe to will "automagically" lower the
bitrate if the network can't handle it.

Which is fine, I get it. The thing is, it's the _only_ way to watch anything,
and depending on the content, it can absolutely ruin it. I'm on Comcast, and
Netflix will constantly ping-pong between 480p and 1080p, which is very
unpleasant to watch. It's ok for some shows, but certain ones like Planet
Earth 2 become unwatchable (to me).

I really wish there was a setting to add a bit of wait time before playing to
avoid this. ATV+ buffers a bit more than the rest, but has the same issues.

Edit: To clarify, this issue happens both wired and wireless. The main
contributor to the inconsistency is the time of day I decide to watch.

~~~
snailmailman
On my phone it lets me download Apple TV+ content. The files are shockingly
large compared to what Amazon Video downloads are. A 1 hr episode was maybe
800 MB on Prime, but a few gigabytes from Apple. (I don’t have any on my phone
atm, but a quick search is telling me episodes of “For All Mankind” are ~4 GB
for an hour.) It looks noticeably better, a massive step up that is completely
worth the extra bytes.

On my desktop I always rent stuff through iTunes just cause I can download it
in advance, and don’t have to deal with buffering or reduced quality.

~~~
xzel
There is a setting in Prime Video to download the highest bitrate which is
often around 4GB for a 2 hour movie, iirc. Totally agree with you on the
download in advance rentals. With good enough internet it makes the experience
of rentals so much more enjoyable.

------
arendtio
Looks amazing and while I would like to enjoy such quality and am willing to
pay for the premium plan (in fact I do already), I still can't watch the 4K
content even in a world where I accept DRM modules in my browser, because some
DRM plugins seem to be more equal than others.

~~~
rblatz
I am shocked at how many people watch Netflix in their browser. I have
multiple different ways (AppleTV, Xbox, TV Smart App) to watch Netflix on my
tv, and at least 2 of them support 4k.

~~~
saxonww
I don't have a TV, so if I watch Netflix it's in a browser.

Most people don't realize this but you can only watch Netflix in 4K in a
browser if you're using Microsoft Edge on Windows or Safari in macOS 11. In
fact, you can only get _1080p_ if you're running Chrome in Chrome OS, or IE,
or Safari. All other browsers, including Chrome and FF on macOS, Linux, and
Windows, are stuck at 720p.

Their blog post[0] from 2017 sounded hopeful for higher resolution video on
Linux - they wrote "We... look forward to high-resolution video being
available on more platforms soon" almost immediately after announcing FF on
Linux was supported - but in hindsight it's clear that what they really meant
was what you point out - support for things like Apple TV, Xbox, and TV Smart
Apps.

[0]: https://netflixtechblog.com/update-on-html5-video-for-netflix-fbb57e7d7ca0

~~~
exikyut
What differentiates Edge from Chromium on Windows?

Edge hasn't been around as long as Netflix has provided in-browser service;
I'd have thought Chrome w/ Widevine would have been part of Netflix's success.

~~~
saxonww
Edge supports PlayReady in addition to Widevine. PlayReady is what gates the
higher quality stream. No idea what the technical reason is for this.

~~~
rblatz
Honestly the whole thing is stupid. 4k videos of everything on Netflix are
available on the high seas. What’s the point?

~~~
dimensi0nal
Right now stripping Netflix's 4K DRM is possible but it requires buying a new
Nvidia Shield TV for every release you do. If you could strip the DRM without
buying expensive hardware every time people might be slightly quicker to
release the entire 4K Netflix catalog.

~~~
rblatz
That is interesting. How does that work, such that you have to sacrifice a
Shield per release?

------
charrondev
That seems like a pretty impressive achievement!

Question for anyone here:

What do you use to play 4K/HDR? I have an Apple TV 4K, which can do 4K Dolby
Vision playback and looks ok, but the Apple TV tends to have some jittering
when streaming certain shows (very noticeable in panning shots of animation).
The sound quality is also noticeably worse on my set with it (it doesn’t seem
to be able to do direct pass through to my AV unit, always PCM/decoded on the
Apple TV).

On the other hand I have a Shield TV that does direct pass through of audio
and sounds much better, and also seems to do video playback without that
occasional jittering. It does not seem to support Dolby Vision though.
Additionally the UI is very, very, very laggy after recent updates.

Does anyone use anything that doesn’t have either of these issues?

~~~
teilo
I use the native apps in my Samsung Q70 TV. They are really good, and I rarely
find a need to use my 4K Apple TV anymore. Even the Samsung Apple TV app and
the AirPlay 2 support are excellent.

As it is, Apple has screwed up HDR on the 4K so badly that it's very difficult
to make it work smoothly without also screwing up HDR on other devices.

~~~
charrondev
The difficulty for me is that my TV doesn’t have enough HDMI inputs (LG B7). I
run 7 inputs through my Yamaha receiver, but have struggled to get ARC (audio
from the TV through the sound system) to co-exist with also having everything
else go through the receiver.

~~~
teilo
I have a Yamaha receiver also. Occasionally I need to power-cycle the receiver
or the TV to get ARC to work again, but other than that, I've had no problems
with it. I run everything through the Yamaha.

------
mappu
What exactly is shot-based encoding doing here?

Even old XviD would reliably always insert a new I-frame / start a new GOP on
a scenecut, and perform global rate optimization based on scene complexity
within the target ABR parameters.

~~~
zerocrates
Their older articles talk more about shot-based encoding specifically (and
before that, per-title encoding) as they applied it to their non-4K content:

https://netflixtechblog.com/dynamic-optimizer-a-perceptual-video-encoding-optimization-framework-e19f1e3a277f

https://netflixtechblog.com/per-title-encode-optimization-7e99442b62a2

A few relevant points:

\- Their previous systems used fixed keyframes, so wouldn't be using scene-
change detection at all (I presume this was to allow predictable chunking
across different codecs)

\- Since reliable streaming performance is a pretty big deal for Netflix, they
probably have quite tight restrictions on VBR modes, which make them not work
as effectively since they have less "room" to work with
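
As a rough illustration of that last point (the numbers are arbitrary and it
assumes ffmpeg with libx264), capped CRF is the usual way those restrictions
show up: the quality target is the same, but bitrate peaks are clamped.

    import subprocess

    # Unconstrained CRF: the encoder spends whatever a hard shot asks for.
    subprocess.run([
        "ffmpeg", "-y", "-i", "movie.mkv", "-c:v", "libx264",
        "-crf", "20", "unconstrained.mp4",
    ], check=True)

    # Capped VBR: same quality target, but peaks are clamped so the stream
    # stays deliverable -- this is the "less room to work with" part.
    subprocess.run([
        "ffmpeg", "-y", "-i", "movie.mkv", "-c:v", "libx264",
        "-crf", "20", "-maxrate", "6M", "-bufsize", "12M", "capped.mp4",
    ], check=True)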

------
hinkley
The best (worst?) example I know of this going horribly wrong is the title
sequence for the Big Bang Theory, which speeds up as it goes along and then
cuts to the cast.

Math, science, history, unraveling the mystery <as the pixels start to
unravel>, it all started with a Big Bang. BANG (Bang, your Screen is a riot of
random pixels. Oh wait, here’s the cast instead!)

Then I caught an episode on another service, maybe Netflix? The transition
wasn’t awesome, but it wasn’t awful either. It only glitched for a frame or
two instead of a few seconds. Clearly better heuristics were in play.

------
FullyFunctional
That’s great. I’d love to see how they fare on this glitter which seems to
destroy all compressors:
[https://m.youtube.com/watch?feature=youtu.be&v=MG_Lyg74UlU&t...](https://m.youtube.com/watch?feature=youtu.be&v=MG_Lyg74UlU&t=130)

(I saw a similar problem with a famous monarch butterfly footage that I don’t
have time to chase down)

~~~
joshschreuder
Wow, that really was awful. Does this sort of thing require a specific
optimisation to fix, or is there a general technique that would fix this and
other cases like it?

~~~
bscphil
Codecs with a better understanding of what a complex scene looks like (and
that allocate more bitrate accordingly) will do better, and codecs that are
better at blurring away details instead of blocking will do better as well.

Keep in mind this is on Youtube and uploaded by some shitty clip site, so it's
a very biased example. In this case there are four lossy encode steps
involved: master -> bluray -> clipsite's master -> Youtube encode. Youtube in
particular has absolutely horrific, just inexcusably bad bitrate even at the
highest quality they'll give you. I'd have to see what the original Bluray
looks like in this case to see if there's really a problem worth worrying
about here.

~~~
FullyFunctional
Actually I first saw it on a plane and wondered how we have gotten to this
stage where obviously garbled video is OK to give to paying customers. Yes,
more bits will solve this, but it’s still a great example that breaks encoding
assumptions.

------
pedrocr
And if they would start providing 4K or at least 1080p to Linux users I might
consider subscribing again. Streaming services are popping up left and right
while at the same time turning into their own specific ghettos of supported
platforms and features. Makes you almost value the role of the traditional
network aggregator. Maybe the market will eventually consolidate or organize
itself so you can buy a full service from a single provider that knows what
it's doing technically.

------
tpmx
Still getting crappy 2.5 MBit/s visuals on an "HD" plan on a gigabit
connection in Sweden. Not impressed with their real-world image quality. I
don't think you should need to buy a 4k plan to get proper 1080p with a decent
bitrate/image quality.

------
TazeTSchnitzel
It sounds like Netflix might not be doing the horrible thing YouTube and
Twitter do where they treat low-resolution (e.g. standard definition) content
as being less deserving of bitrate with only higher resolutions allowed to
climb the bitrate ladder? For those sites I have to upscale SD content to HD
resolutions when uploading just for it to be allowed a less paltry bitrate.

I fear people who grow up in the age of streaming might not realise that DVDs
had good video quality because streaming services seem to hate SD content.

> As a side note, we do have some additional points, not shown in the plots,
> that are used in resolution limited scenarios — such as a streaming session
> limited to 720p or 1080p highest encoding resolution. Such points lie under
> (or to the right of) the convex hull main ladder curve but allow quality to
> ramp up in resolution limited scenarios.

------
WalterBright
Since the credits can run for 5 minutes or more these days, those can be
optimized by converting to a font rather than a bitmap of the screen. Transmit
the font, then transmit the text. Should be able to get gigantic compression
this way!

------
munificent
If only they would spend this much effort optimizing the quality of their
screenwriting.

~~~
riyadparvez
Are snarky comments like this really necessary or relevant to what's being
discussed here?

~~~
munificent
No, they aren't. I usually resist the urge, but it's been a long week.

------
httpsterio
Without understanding all of the technical details, am I the only one who
finds Netflix's quality on Android insufferable? Technically, I guess it's
Full HD, but the compression is so aggressive that any scene with even
slightly darker sections will instantly turn blocky and pixellated. Is there
any way around it? Honestly it's not worth paying for, as it's almost
unwatchable.

------
lvs
Not supported in Linux, presumably, since you can't even get anything higher
than 540p or 720p in Linux due to Widevine DRM restrictions.

~~~
Filligree
Well, you _can_. Just not from Netflix.

------
jjj123
Do other people have the same issues I have with streaming dark content? I
watch a lot of horror, and the compression on Netflix and Shudder (and
probably HBO + Hulu) causes really horrible banding in dark scenes.

Literally if they could just fix this I would have no problems with streaming
quality.

------
ponker
Any idea why none of these services supports queueing some movies you’d like
to watch, downloading them in the middle of the night in astounding bitrates,
and allowing you to watch on your TV? As far as I can tell only the mobile
apps support this, not the “streaming boxes.”

~~~
wmf
The streaming boxes probably don't have enough storage to hold much. There's
Kaleidescape but it's very expensive.

~~~
ponker
Why not give them enough storage? 32 GB of flash is more than enough and now
you get to sell a whole new generation of boxes.

------
eisa01
Now they only need to offer the 4K streams on an affordable plan for single
household customers. It’s unfair to pay for the 4-stream family plan to get 4K
for double the price of the single stream plan!

~~~
Mindwipe
That's not going to happen. Netflix needs to push their ARPU up quite
significantly in the next few years given how much cash they're burning (and
they explicitly promised shareholders last year that they would stop).

------
superkuh
Uhh, so we're supposed to be happy about a variable bitrate and quality
because it saves them some bandwidth? This isn't a feature for users it's a
spin on a feature for the corporation.

~~~
chii
> so we're supposed to be happy about a variable bitrate and quality because
> it saves them some bandwidth?

yes, because it saves you money - netflix could be increasing its subscription
cost, or it could keep it down as the subscriber base grows, by using
technique like this.

As long as you don't notice the difference, what's wrong with them saving some
bandwidth?

~~~
Lammy
Why is raising their subscription price the alternative? Wouldn't Netflix's
capability to negotiate peering/transit deals grow along with their subscriber
base?

~~~
chii
As bandwidth grows, people's choice is to keep upping their fidelity. If the
subscriber base was OK with 480p video, they would've seen it. And negotiating
peering/transit deals is fickle and isn't guaranteed to work, as cable
companies may be playing politics/business games to squeeze Netflix.

Therefore, a technical solution is the next best option.

------
sbahr001
Wow! That is crazy. Love how they use "The Dirt" to show it lol. Guess there
are a few rockers @ Netflix lol.

------
heroprotagonist
Yet they still can't properly render 21:9 content on a 21:9 display. You get a
shrunk picture with black bars on all sides. Even their "Originals" or
N-Series or whatever they call it now have this problem.

Here's hoping someone from Netflix is in the comments and can act on it,
because their support system hasn't done anything in the 2 years since I
brought it to their attention.

edit: Actually, I tried to track down the film used in the new encoding (I
think it's The Dirt from the signs and dates seen in the frame captures) so I
could screenshot for comparison. It actually worked in 21:9 fullscreen.

Was it the new encoding? I saw this problem up until this past week, most
recently on Maniac but I checked that show and it's no longer an issue.

From the post, I suspect the old method was adding extra black bars at the top
and bottom of 21:9 content:

> with fixed 4K resolution bitrates — 8, 10, 12 and 16 Mbps — regardless of
> content characteristics

But I truly don't know what it was. I just know that, since sometime last
week, video no longer ignores 40% of my screen real estate when I watch on a
21:9 monitor.

THANK YOU! That has been annoying me for years.

But Please add 3440x1440 to your testing. It's not shown in those charts.

------
Wistar
Maybe I missed something, but scene-by-scene encoding--a model similar to
scene-by-scene color correction--for feature films destined for digital
release has been a thing since DVDs first arrived in the late 90s.

------
matthewhartmans
This is incredible!

also, what is the movie or show sampled in that blog?

------
gen3
Wow! Those optimized encodings look really good.

------
uladzislau
How many titles on Netflix are available in 4K? I did a quick search and it’s
either incorrect or there’s not that many, mostly original Netflix content.

------
ComodoHacker
TL;DR: More 4K adoption led to higher traffic costs for us, so we have to
optimize that.

------
HenryKissinger
ELI5?

Edit: Why the downvotes?

~~~
2bitencryption
Instead of one optimization profile for a whole movie, Netflix is detecting
when the shot of a movie changes (the camera "cuts" from one shot to another)
and beginning a new optimization profile specifically for the content of that
shot.

At least, that is my understanding.
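
A toy sketch of that mental model, with entirely made-up numbers (this is not
Netflix's actual pipeline, just the shape of the idea): try a few encode
settings per shot and keep whichever gives the best quality within a rate cap.

    from dataclasses import dataclass

    @dataclass
    class Trial:
        resolution: str
        bitrate_kbps: int
        quality: float  # pretend this is a VMAF-like score

    def best_per_shot(trials, max_kbps):
        """Pick the highest-quality trial for one shot under a rate cap."""
        affordable = [t for t in trials if t.bitrate_kbps <= max_kbps]
        return max(affordable, key=lambda t: t.quality) if affordable else None

    # Fake numbers: a static, talky shot vs. a noisy action shot.
    talky  = [Trial("1080p", 1500, 92.0), Trial("4K", 4000, 93.5)]
    action = [Trial("1080p", 4000, 85.0), Trial("4K", 9000, 94.0)]

    for name, shot in [("talky", talky), ("action", action)]:
        pick = best_per_shot(shot, max_kbps=8000)
        print(name, "->", pick.resolution, pick.bitrate_kbps, "kbps")

The point is that the talky shot ends up at a different operating point than
the action shot, instead of both inheriting one film-wide setting.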

~~~
purerandomness
That sounds so intuitive that I want to follow up with the question: Why isn't
this the default? Is detecting a cut particularly difficult?

~~~
Groxx
Coarsely detecting cuts is relatively simple - look for large frame-to-frame
differences (e.g. encode however -> find large frames surrounded by smaller
ones -> done, it's as accurate as your perceptual compression is). There are a
number of ffmpeg-using tools out there doing this and other "cut to / from
black" detection and it's _pretty_ good. Not good enough for a human to say
"yeah, these are all scenes", but probably good enough for picking things to
re-encode like this.
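
For example, a coarse version of this with stock ffmpeg (assuming it's
installed; the 0.4 threshold and the filename are just placeholders) can look
something like:

    import re
    import subprocess

    def rough_cuts(path, threshold=0.4):
        """Return approximate shot-change timestamps (seconds)."""
        proc = subprocess.run(
            ["ffmpeg", "-i", path, "-vf",
             f"select='gt(scene,{threshold})',showinfo", "-f", "null", "-"],
            stderr=subprocess.PIPE, text=True,
        )
        # showinfo logs one line per selected frame; grab its pts_time.
        return [float(t) for t in re.findall(r"pts_time:([\d.]+)", proc.stderr)]

    print(rough_cuts("movie.mkv"))  # placeholder filename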

The harder part is the significantly increased compute use due to re-encoding
things multiple times, to detect these cuts and to try to find the best
encoding. Heuristics there can be arbitrarily complex and re-calculate any
number of times. I imagine it hasn't been done earlier just due to cost,
though maybe they've recently achieved a better heuristic.

edit: ah, great, they link to a "dynamic optimizer" post that goes into this
in some detail: https://netflixtechblog.com/dynamic-optimizer-a-perceptual-video-encoding-optimization-framework-e19f1e3a277f

~~~
myself248
Seems to me that if the frame-to-frame difference isn't big enough to detect
that way, it's not likely to benefit from a new I-frame, yeah?

~~~
Groxx
That's the basic idea, yeah. It falls apart in a couple places, e.g. when the
cut or fast-fade goes to a _very_ cheap frame like a mostly solid color, and
it may not detect stuff like whip-cuts (since a whole chunk of frames are
expensive), but so many scenes in so much of media have single-frame cuts that
it's well within that "good enough" range.

And for dynamic encoding like this: when it's wrong, it's not visually worse
in that scene than choosing that sub-par encoding _for the entire movie_, which
which has the same "choose the best encoding" problem as individual chunks
have. I assume it'd be relatively rare for it to result in anything worse than
a one-shot strategy.

\---

ffmpeg will let you easily do frame-to-frame-diff logic that lets you chop
videos into scenes, for example:
[https://video.stackexchange.com/a/30701](https://video.stackexchange.com/a/30701)
I'm not sure how much it handles compressed-frame differences, but it
shouldn't be too hard to build around it. Just might be a bit beyond bash-
friendly.

------
paulmendoza
Will The Office look better is all that matters

~~~
hombre_fatal
Never really understood this meme. Imagine liking a show so much that it's one
of the only things you care to watch, yet depending on streaming services to
watch it.

------
paulmendoza
I wish I knew what any of this meant but it sounds awesome

------
wtallis
A lot of those graphs seem to show that video quality at 1920x1080 is going
down significantly. Does that mean you now need to be on their most expensive
plan and using one of their approved devices/platforms to continue receiving
the same video quality, now only obtainable with 2560x1440 and higher
resolutions?

~~~
charrondev
How’d you get that out of the article? I read it as: across the board, you’re
getting better quality at a lower bit-rate. They talked about being able to
serve 1080p at the same bitrate they were previously serving 720p (where they
talked about mobile devices).

~~~
jamie_ca
All 4 of the Bitrate vs Video Quality graphs have two 1920x1080 data points on
each line. All four graphs show the data points lower on the Video Quality
axis under the optimized ladder.

Sure, the first graph shows 1080p at higher quality and lower bitrate compared
to the 720p on the optimized ladder, but if I'm resolution-limited (on a phone)
and Netflix isn't streaming higher quality and downsampling on the device, the
changes here show that my perception of quality will suffer. (I feel for folks
stuck with a 720p screen; if they're just streaming native resolution, it looks
like they'll be getting worse than the previous 480p quality levels.)

