Netflix is now doing per-shot encoding for UHD content (netflixtechblog.com)
348 points by mmcclure on Aug 28, 2020 | 196 comments



It's really too bad that they haven't rolled this out for 1080p too. Their 1080p video quality is god-awful compared to Amazon. They may even have made it worse: from the "ladder" comparisons, it looks like they jump to higher resolutions at lower average bitrates now, so if you're stuck at 1080p you're going to be getting worse quality after these changes than before.

Also, something that's kind of weird about all this is that encoders (x264, x265, etc) already have their own rate control algorithms that decide on a frame or scene basis how many bits to use. Netflix taking an approach like this is equivalent to claiming that they're capable of doing a better job than these codecs in an automated way - so why not just contribute the code needed to achieve these improvements back to the projects they're using?

Last, it's very weird to select individual frames from an encode and not a selection of frames. The comparisons purport to show that the lower bitrate encode is better than the higher bitrate encode, but in fact (if I understand how they're using rate control correctly) what they are showing is a single frame in the lower bitrate encode that uses more bits than the same frame in the higher bitrate encode. So it's arguably not a fair comparison. And even with that, depending on how sensitive you are to artifacts, some of the "optimized ladder" encodes still look worse.

Edit: here is an example of what I mean when I said that they may have made the quality at 1080p worse. If what you end up watching is the highest bitrate 1080p available, you would get higher quality on the old ladder than on the newer one. https://i.imgur.com/rzPR7Sh.jpg Actually, on some of the examples, even 720p on the older ladder is better than 1080p on the newer one!
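For reference, the ladder construction being debated here amounts to taking the convex hull over per-resolution trial encodes. A rough sketch of the idea (not Netflix's code; the candidate points are invented, and a real pipeline would score trial encodes with a metric such as VMAF):

```python
# Hypothetical sketch of convex-hull bitrate-ladder selection.
# The candidate points below are invented; in a real pipeline each one
# would be a trial encode scored with an objective metric such as VMAF.

def convex_hull_ladder(points):
    """Keep only (bitrate, quality, res) points on the upper convex hull,
    i.e. the encodes where quality per bit is best."""
    pts = sorted(points)
    hull = []
    for b, q, res in pts:
        # Skip points dominated by the current hull (no quality gain).
        if hull and q <= hull[-1][1]:
            continue
        # Enforce concavity: drop a previous point that falls on or below
        # the chord from its predecessor to the new point.
        while len(hull) >= 2:
            (b1, q1, _), (b2, q2, _) = hull[-2], hull[-1]
            if (q2 - q1) * (b - b1) <= (q - q1) * (b2 - b1):
                hull.pop()
            else:
                break
        hull.append((b, q, res))
    return hull

# Candidate encodes: (avg kbps, quality score, resolution) -- all made up.
candidates = [
    (2000, 60, "720p"),  (4000, 75, "720p"),  (8000, 82, "720p"),
    (4000, 70, "1080p"), (8000, 86, "1080p"), (16000, 92, "1080p"),
    (8000, 80, "2160p"), (16000, 94, "2160p"),
]
ladder = convex_hull_ladder(candidates)
# Each rung uses whichever resolution wins at that bitrate, which is
# exactly the "jump to higher resolution earlier" behavior in the charts.
```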


> They may even have made it worse: from the "ladder" comparisons, it looks like they jump to higher resolutions at lower average bitrates now, so if you're stuck at 1080p you're going to be getting worse quality after these changes than before.

No, they have not made it worse. 1080p encoded at a lower bitrate can look better than 720p at a higher bitrate, or vice versa, even to the trained eye. It depends on the scene dynamics.* In the ladders they posted (called an RD chart) we can clearly see they have made a better selection of the resolution to encode at a given target bitrate, increasing the video quality. They have posted the BD rates for this stuff; the gains are substantial (30% or more visual quality at the same bitrate).

> Also, something that's kind of weird about all this is that encoders (x264, x265, etc) already have their own rate control algorithms that decide on a frame or scene basis how many bits to use. Netflix taking an approach like this is equivalent to claiming that they're capable of doing a better job than these codecs in an automated way - so why not just contribute the code needed to achieve these improvements back to the projects they're using?

Because they've built a split and stitch encoder on top of the codecs. That means the code is built around x264, not into it.
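A split-and-stitch pipeline looks roughly like this in toy form (the shot detector and per-shot "encoder" here are stand-ins, not real x264 bindings):

```python
# Toy split-and-stitch: cut the source at shot boundaries, encode each
# shot independently (in practice in parallel, each with its own
# settings), then concatenate the resulting bitstreams.

def encode_shot(frames, crf):
    # Stand-in for invoking x264 on one shot with per-shot settings.
    return f"[{len(frames)}f@crf{crf}]".encode()

def split_and_stitch(frames, shot_boundaries, crf_per_shot):
    segments, start = [], 0
    for end, crf in zip(shot_boundaries + [len(frames)], crf_per_shot):
        segments.append(encode_shot(frames[start:end], crf))
        start = end
    return b"".join(segments)  # the "stitch" step

frames = list(range(10))  # 10 dummy frames
# Shots at frames [0,4), [4,7), [7,10), each with its own quality target.
stream = split_and_stitch(frames, [4, 7], [20, 24, 18])
```

The point is that all the per-shot decision-making lives in the orchestration layer around the encoder, not inside it.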

> Last, it's very weird to select individual frames from an encode and not a selection of frames. The comparisons purport to show that the lower bitrate encode is better than the higher bitrate encode, but in fact (if I understand how they're using rate control correctly) what they are showing is a single frame in the lower bitrate encode that uses more bits than the same frame in the higher bitrate encode. So it's arguably not a fair comparison. And even with that, depending on how sensitive you are to artifacts, some of the "optimized ladder" encodes still look worse.

This is extremely true.

* High action scenes tend to benefit from lower resolution with more bits spent on inter compression while slower scenes benefit from tighter intra compression.


> No, they have not made it worse. 1080p encoded at a lower bitrate can look better than 720p at a higher bitrate, or vice versa, even to the trained eye. It depends on the scene dynamics.*

Well, of course that's correct. Take 1080p encoded at 100 Mbps and 720p encoded at 300 Mbps, I guarantee you that 1080p wins every time. :-)

> In the ladders they posted (called an RD chart) we can clearly see they have made a better selection of the resolution to encode at a given target bitrate, increasing the video quality.

I understand how an RD chart works, but I disagree with your conclusions. Reread my original post carefully - my point is that if you're stuck with 1080p as the maximum resolution, then the quality you can get with the revised ladder is strictly worse, because Netflix will want to jump up to UHD resolutions, but your device won't support it. (And even assuming that the UHD will be better based on this chart is kind of a leap, because it depends on what artifacts you're most sensitive to. Objective metrics ≠ subjective experience. In several of their comparisons, I prefer the lower res version because the new one is simply too artifacted.)

> Because they've built a split and stitch encoder on top of the codecs. That means the code is built around x264, not into it.

I understand the technical difference, but I'm still not convinced. Fundamentally what you get at the end of the day is a single continuous h.264 stream. The quantizer is chosen based on some rate control algorithm, and IPB frame choices are made by another algorithm. Netflix's improvement amounts to having a better rate control, that understands scene differences and uses lower quantizers on more complex scenes. There's fundamentally no reason why you couldn't add such a rate control algorithm to x264. It might require using a 2-pass mode, but it could be done.


> if you're stuck with 1080p as the maximum resolution, then the quality you can get with the revised ladder is strictly worse, because Netflix will want to jump up to UHD resolutions, but your device won't support it.

"As a side note, we do have some additional points, not shown in the plots, that are used in resolution limited scenarios — such as a streaming session limited to 720p or 1080p highest encoding resolution. Such points lie under (or to the right of) the convex hull main ladder curve but allow quality to ramp up in resolution limited scenarios."


> Reread my original post carefully - my point is that if you're stuck with 1080p as the maximum resolution, then the quality you can get with the revised ladder is strictly worse, because Netflix will want to jump up to UHD resolutions, but your device won't support it.

These are for their HEVC encodes. Apple (and most others) require H264 to have a separate ladder, which will be 1080p-limited and generated from a different encode set with a different maximum video quality.* Almost any HEVC-capable device can go to 4k, even if the screen is only 1080p. The 4k stream will be used, then downscaled.

> (And even assuming that the UHD will be better based on this chart is kind of a leap, because it depends on what artifacts you're most sensitive to. Objective metrics ≠ subjective experience. In several of their comparisons, I prefer the lower res version because the new one is simply too artifacted.)

Certainly objective and subjective metrics have differences, but VMAF does correlate pretty tightly with subjective metrics. This is a very detailed subject, I summed up many of my thoughts on the differences in my demuxed talk last year here: https://www.youtube.com/watch?v=nCUsXhSPyyw

* https://developer.apple.com/documentation/http_live_streamin... 1.23 iirc


> These are for their HEVC encodes.

Really? I wasn't aware Netflix even offered anything other than H.264 at <= 1080p. I know Youtube doesn't. If this is all HEVC maybe you're right and the resolution issue isn't such a big deal. If this is actually the case I wish they would have said so explicitly in the blog post.

> VMAF does correlate pretty tightly with subjective metrics. This is a very detailed subject, I summed up many of my thoughts on the differences in my demuxed talk last year here: https://www.youtube.com/watch?v=nCUsXhSPyyw

Thanks, I'll check that out when I have the time. I'd still emphasize that it's quite possible for objective metrics to be fooled (that's why codecs have psy tunings!), and I do prefer several "old-ladder" images over the newer versions in Netflix comparisons here. I suppose I've become rather cynical about claims made on the basis of metrics ("now 50% less bitrate for the same quality!"), when to my eye, web video has always been very bad and is getting worse as companies keep reducing the bitrate.


Youtube offers various codecs, including VP9 and AV1, at all resolution/framerate settings H.264 is offered at (not all formats will always be available). For example, here's the youtube-dl format list for a well-known video:

  [youtube] dQw4w9WgXcQ: Downloading webpage
  [info] Available formats for dQw4w9WgXcQ:
  format code  extension  resolution note
  249          webm       audio only tiny   49k , opus @ 50k (48000Hz), 1.18MiB
  250          webm       audio only tiny   65k , opus @ 70k (48000Hz), 1.55MiB
  140          m4a        audio only tiny  130k , m4a_dash container, mp4a.40.2@128k (44100Hz), 3.27MiB
  251          webm       audio only tiny  136k , opus @160k (48000Hz), 3.28MiB
  394          mp4        256x144    144p   73k , av01.0.00M.08, 25fps, video only, 1.72MiB
  278          webm       256x144    144p   97k , webm container, vp9, 25fps, video only, 2.25MiB
  160          mp4        256x144    144p  107k , avc1.4d400c, 25fps, video only, 2.05MiB
  395          mp4        426x240    240p  159k , av01.0.00M.08, 25fps, video only, 3.42MiB
  242          webm       426x240    240p  217k , vp9, 25fps, video only, 4.04MiB
  133          mp4        426x240    240p  290k , avc1.4d4015, 25fps, video only, 4.48MiB
  396          mp4        640x360    360p  340k , av01.0.01M.08, 25fps, video only, 6.68MiB
  243          webm       640x360    360p  396k , vp9, 25fps, video only, 6.96MiB
  134          mp4        640x360    360p  484k , avc1.4d401e, 25fps, video only, 8.27MiB
  244          webm       854x480    480p  586k , vp9, 25fps, video only, 10.03MiB
  397          mp4        854x480    480p  603k , av01.0.04M.08, 25fps, video only, 11.37MiB
  135          mp4        854x480    480p  741k , avc1.4d401e, 25fps, video only, 11.56MiB
  247          webm       1280x720   720p 1035k , vp9, 25fps, video only, 17.67MiB
  136          mp4        1280x720   720p 1077k , avc1.4d401f, 25fps, video only, 16.72MiB
  398          mp4        1280x720   720p 1133k , av01.0.05M.08, 25fps, video only, 22.07MiB
  399          mp4        1920x1080  1080p 2106k , av01.0.08M.08, 25fps, video only, 40.74MiB
  248          webm       1920x1080  1080p 2666k , vp9, 25fps, video only, 58.46MiB
  137          mp4        1920x1080  1080p 4640k , avc1.640028, 25fps, video only, 78.96MiB
  18           mp4        640x360    360p  601k , avc1.42001E, 25fps, mp4a.40.2@ 96k (44100Hz), 15.19MiB (best)


Netflix has a debug screen that will show you the codec info. It varies depending on platform but you can Google it to find the one for your platform. For the Windows app, it's Ctrl-Alt-Shift-D. That shows H264 on my device since I don't have hardware that can decode H265. On Android, I've seen VP9 streams.


Youtube mostly uses vp9 these days (I think it might fallback to h264 depending on the device?)

You can confirm this by loading a video, right clicking somewhere in the player and opening 'stats for nerds'



Safari on macOS Catalina is always h264


Because Apple will invent (as in start supporting) vp9 with Big Sur.


> Well, of course that's correct. Take 1080p encoded at 100 Mbps and 720p encoded at 300 Mbps, I guarantee you that 1080p wins every time. :-)

No it doesn't. Depending on the distance from the screen, and indeed the screen resolution, you won't benefit from the extra 1080p resolution, but you could benefit from the extra bits. 1080p60 is about 3.7 Gbit/s uncompressed with 10-bit 4:4:4, so that's 37:1 compression. 720p60 is about 1.7 Gbit/s, or 5.5:1 compression, so far fewer compression artifacts.
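The arithmetic checks out; as a quick sanity-check sketch (3 full-resolution planes because of 4:4:4):

```python
# Verify the compression-ratio arithmetic from the comment above.
def uncompressed_bps(w, h, fps, bit_depth=10, planes=3):
    # 4:4:4 means all three planes are at full resolution.
    return w * h * planes * bit_depth * fps

ratio_1080 = uncompressed_bps(1920, 1080, 60) / 100e6  # vs 100 Mbps
ratio_720 = uncompressed_bps(1280, 720, 60) / 300e6    # vs 300 Mbps
# ratio_1080 is ~37:1 and ratio_720 is ~5.5:1, matching the figures above.
```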


This post is more or less all about them extending things they already do for lower resolutions to 4K.

And the graph you posted isn't quite showing what you're saying it is: these show the "4K" curves. When you're on a 1080-max device but have a strong connection, there are higher-bitrate 1080 streams available; they're just not on this chart because that's not what it's showing.

In other words, what the curves are showing is just that they're switching to higher resolutions at lower bitrates than before, when your device supports them. Really the way to read it would be drawing lines vertically: given a fixed bandwidth, the "optimized" encode is always giving higher quality and often higher resolution.
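The "read it vertically" point can be made concrete with made-up ladders (all numbers invented for illustration):

```python
# At a fixed bandwidth, the player picks the best rung whose bitrate fits.
# Both ladders below are invented to illustrate the "read vertically" idea.

def best_at_bandwidth(ladder, budget_kbps):
    fits = [p for p in ladder if p[0] <= budget_kbps]
    return max(fits, key=lambda p: p[1]) if fits else None

old = [(2000, 58, "720p"), (4000, 68, "1080p"), (8000, 78, "1080p"), (16000, 88, "2160p")]
new = [(2000, 62, "720p"), (4000, 74, "1080p"), (8000, 84, "2160p"), (16000, 93, "2160p")]

# Same 8 Mbps budget: the optimized ladder serves higher quality,
# and here also a higher resolution.
best_old = best_at_bandwidth(old, 8000)
best_new = best_at_bandwidth(new, 8000)
```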


> It's really too bad that they haven't rolled this out for 1080p too

Maybe they will? But it matters most for 4K performance, so that's where they started?

> encoders (x264, x265, etc) already have their own rate control algorithms

But they simply operate on a stream without taking future frames into account. This does a first entire pass on the film holistically to determine where keyframes should go and settings per-shot. It can't be backported to codecs because they work linearly.

> And even with that, depending on how sensitive you are to artifacts, some of the "optimized ladder" encodes still look worse.

The artifacts look worse only because it's zoomed in so crazy far. On a cinema screen, the additional sharpness will be clear, the artifacts not so much.


> This does a first entire pass on the film holistically to determine where keyframes should go and settings per-shot. It can't be backported to codecs because they work linearly.

Given that x264 already has a 2 pass mode, I don't see why that is necessarily the case. Even CRF mode uses mbtree by default, which is a pretty complicated rate control algorithm. x264 also has pretty intelligent keyframe determination, I almost always see I frames on scene transitions.

> On a cinema screen, the additional sharpness will be clear, the artifacts not so much.

I think it's probably the other way around. The closer you are to a screen, the more likely you are to notice increased resolution. But if there are large patches of the image full of artifacts, you'll be likely to see that even far away.


> Given that x264 already has a 2 pass mode, I don't see why that is necessarily the case. Even CRF mode uses mbtree by default, which is a pretty complicated rate control algorithm. x264 also has pretty intelligent keyframe determination, I almost always see I frames on scene transitions.

x264 will not align keyframes across resolutions and encodes. In addition, the per-shot encodes optimize more than just IDR frame placement; they also optimize other encoder parameters such as AQ.

> I think it's probably the other way around. The closer you are to a screen, the more likely you are to notice increased resolution. But if there are large patches of the image full of artifacts, you'll be likely to see that even far away.

That isn't how this works. It's not a strict tradeoff between more artifacts and less sharpness by changing resolution. Downscaling -> Upscaling is simply another form of lossy compression. It may look better or worse than using those bits in another spot.


> x264 will not align keyframes across resolutions and encodes. In addition, the per-shot encodes optimize more than just IDR frame placement; they also optimize other encoder parameters such as AQ.

Sure, that's true. (Though I don't know why it matters that keyframes aren't aligned.) But at the end of the day the point is that Netflix has a better rate control algorithm, and this could be built into x264, even if it might require a significant amount of work. (Which I'm sure the x264 developers would be willing to do for a substantial quality improvement.)

> That isn't how this works. It's not a strict tradeoff between more artifacts and less sharpness by changing resolution.

Of course it's not a strict tradeoff. It's a loose one. And yes, it may look better or worse. That's really my only point in that section of the comment: that bumping up the resolution earlier in the ladder as they're doing is not a pure win, and it may look worse to some people depending on their viewing conditions.


> Sure, that's true. (Though I don't know why it matters that keyframes aren't aligned.) But at the end of the day the point is that Netflix has a better rate control algorithm, and this could be built into x264, even if it might require a significant amount of work. (Which I'm sure the x264 developers would be willing to do for a substantial quality improvement.)

You can't ABR adapt without aligned GOP boundaries.
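Sketched out, the reason is that a player can only jump into another rendition at one of that rendition's IDR/segment boundaries, so usable switch points are effectively the intersection of the two boundary sets (illustrative only, times in seconds):

```python
# A player switching renditions can only join the target stream at one of
# its IDR/segment boundaries, and with segmented delivery both streams'
# boundaries need to line up. Purely illustrative sketch.

def next_switch_point(t, cur_boundaries, target_boundaries):
    shared = sorted(set(cur_boundaries) & set(target_boundaries))
    return next((s for s in shared if s >= t), None)

aligned_a = [0, 2, 4, 6, 8]
aligned_b = [0, 2, 4, 6, 8]
misaligned_b = [0, 3, 6]  # encoder placed IDRs on its own schedule

# With aligned GOPs the player can switch almost immediately; with
# misaligned GOPs it must wait much longer for a clean switch point.
soon = next_switch_point(3, aligned_a, aligned_b)
late = next_switch_point(3, aligned_a, misaligned_b)
```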

> Of course it's not a strict tradeoff. It's a loose one. And yes, it may look better or worse. That's really my only point in that section of the comment: that bumping up the resolution earlier in the ladder as they're doing is not a pure win, and it may look worse to some people depending on their viewing conditions.

Of course not, that's why they perform an analysis of both options and select the better one. That's what the algorithm does...


> You can't ABR adapt without aligned GOP boundaries.

Yes, that's true of course, but not really relevant to whether you could port Netflix's work on scene-adaptive rate control to x264. Maybe you'd lose aligned GOPs... but for a lot of purposes (offline?) that doesn't matter.

> Of course not, that's why they perform an analysis of both options and select the better one. That's what the algorithm does...

The only point I've ever tried to make on this subject is that in some cases this approach fails. "Perform an analysis" is such a high level description that it misses the fact that this is being done according to some objective metric that may disagree with an individual viewer's personal preferences or viewing environment. In fact, just because an objective metric says 4k > 1080p doesn't mean the difference will be noticeable at the viewing distance the viewer is at, whereas the additional artifacts introduced by moving from 1080 -> 4k without a significant bitrate increase may very well be visible!


Existing encoders have supported this for ages using multipass. In the first pass of multipass encoding, the input data from the source clip is analyzed and stored in a log file. In the second pass, the collected data from the first pass is used to achieve the best encoding quality. In video encoding, two-pass encoding is usually controlled by the average bitrate setting or by the bitrate range setting (minimal and maximal allowed bitrate) or by the target video file size setting.

The best way to understand why this is used is to think of a movie — when there are shots that are totally, absolutely black, like scene changes, normal 1-pass CBR encoding uses the exact same amount of data to that part as it uses for complex action scene. But by using VBR and multipass, encoder “knows” that this piece is OK with lower bitrate and that bitrate can be then used for more complex scenes, thus creating better quality for those scenes that require more bitrate.
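The bit-shuffling described above amounts to allocating a fixed budget in proportion to measured scene complexity. A minimal sketch (the complexity numbers are invented; a real first pass derives them from encoder statistics):

```python
# Toy two-pass allocation: pass 1 measures per-scene complexity, pass 2
# spends a fixed total budget proportionally. All numbers are invented.

def allocate_bits(complexities, total_bits):
    total = sum(complexities)
    return [total_bits * c / total for c in complexities]

# Pass-1 complexity estimates: black scene, action scene, dialogue scene.
complexities = [0.1, 5.0, 1.5]
budget = 66_000_000  # total bits available for the three scenes

alloc = allocate_bits(complexities, budget)
# The black scene gets almost nothing; the saved bits go to the action
# scene, which is exactly the VBR behavior described above.
```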


My point is that if Netflix thinks they can do better, it would be great to see their code contributed to the public, since they're using an open source program to do all their encodes.

Also, you're making a slight mistake. About a decade ago (?), x264 was changed so that even 2-pass mode is really just using CRF under the hood. It's not "achieving the best quality", it's just figuring out what quantizers you need to hit your average bitrate target exactly. Some more information about that here: https://trac.ffmpeg.org/wiki/Encode/H.264

> normal 1-pass CBR encoding

You may be confusing constant bitrate (which no one should ever use) with CRF (constant rate factor), which varies the bitrate from frame to frame and scene to scene, without having to do a 2 pass encode.


> My point is that if Netflix thinks they can do better, it would be great to see their code contributed to the public, since they're using an open source program to do all their encodes.

This really is their core business and somewhere they can have a competitive edge, though. Even if you think it's unethical or that it might make business sense to contribute back to the upstream projects, you can see why they might want to keep some of that secret sauce in-house.


Fair enough, I'm not arguing that they're obligated to or anything, I just think it would be nice. I do suspect that catalogue matters much more at this point than any tiny competitive advantage they get through better encoding. And they've certainly been willing to share advances (like VMAF) in the past - that's why this tech blog exists, after all!


Bandwidth is a huge cost for these services and if they share encoding techniques then their competitors can cut prices at the margins.


>since they're using an open source program to do all their encodes.

Is that still the case? They switched from x265 to Beamr for HEVC. Not sure if they are still on x264 for recent encodes.


It's gotten so I use my Netflix subscription as a catalog more than a streaming service. If I find I really like something I pirate a higher-quality version of it.


> Sometimes we ingest a title that would need more bits at the highest end of the quality spectrum — even higher than the 16 Mbps limit of the fixed-bitrate ladder.

From just a cursory inspection using my Apple TV 4K, 12 Mbps is their target bitrate but their ceiling was ~16 Mbps.

That's still only a third of what AppleTV+ offers (36/48) and only 2/3 of Disney+ (18/24). Netflix is still higher than Prime Video (10/14). HBO Max was the lowest quality, allowing only 8/10 Mbps for their HD streams (which generously doubled HBO Now's infamously low 4/5 Mbps bitrate).


I've noticed that ATV+ looks the best, which is much appreciated.

However, what has been driving me insane with all these services is how the bitrate is completely inconsistent throughout, depending on network congestion. Every service I subscribe to will "automagically" lower the bitrate if the network can't handle it.

Which is fine, I get it. The thing is, it's the only way to watch anything, and depending on the content, it can absolutely ruin it. I'm on Comcast, and Netflix will constantly ping pong between 480p and 1080p, and is very unpleasant to watch. It's ok for some shows, but certain ones like Planet Earth 2 become unwatchable (to me).

I really wish there was a setting to add a bit of wait time before playing to avoid this. ATV+ buffers a bit more than the rest, but has the same issues.

Edit: To clarify, this issue happens both wired and wireless. The main contributor to the inconsistency is the time of day I decide to watch.


Under your profile settings you should be able to force HD. It will cause it to buffer instead of downgrade if your internet gets too slow, but it will also do a bigger buffer (depending on the device and how much memory the device makes available to Netflix for the video buffer).


In case you are using Google/Cloudflare DNS try switching back to Comcast's DNS servers so you get directed to a Netflix Open Connect box within their network.


Google DNS supports EDNS Client Subnet, and should work just as well as Comcast's DNS for Netflix. Cloudflare won't, though.


Do you know if this used to be an issue that is now resolved? The poor streaming quality I experienced was watching Netflix over Comcast cable internet with Google DNS several years ago. It was night and day different after changing DNS resolvers back to Comcast, though I guess it could have been a coincidence.


It should not have been an issue in the last few years. I do not know if this was a problem early on, but Google themselves use it for the CDN boxes they offer ISPs, which offload what's likely mostly YT videos and popular Android APKs.


> switching back to Comcast's DNS servers

A scary thought (they hijack NXDOMAIN, MITM http to "bring you important messages", etc.)


That's when you set up dnsmasq or something similar and use your favorite dns server for most of the sites but a specific one for sites that will work better with it.


On my phone it lets me download Apple TV+ content. The files are shockingly large compared to what Amazon Video downloads are. A 1 hr clip was maybe 800mb in prime, but a few gigabytes from apple. (I don’t have any on my phone atm, but a quick search is telling me episodes of “For all mankind” are ~4GB and an hour long) It looks noticeably better, a massive step up that is completely worth the extra bytes.

On my desktop I always rent stuff through iTunes just cause I can download it in advance, and don’t have to deal with buffering or reduced quality.


There is a setting in Prime Video to download the highest bitrate which is often around 4GB for a 2 hour movie, iirc. Totally agree with you on the download in advance rentals. With good enough internet it makes the experience of rentals so much more enjoyable.


> A 1 hr clip was maybe 800mb in prime, but a few gigabytes from apple.

Assuming x264 encoding, that sounds about right for 720p vs 1080p.


>It looks noticeably better, a massive step up that is completely worth the extra bytes.

I keep wondering if their reason was to push sales of higher-capacity iPhones.


And this is why I still get discs by mail. No dropouts, no unexplained drops in resolution, and it has a deeper catalog.


Scratched discs.


You'd be surprised just how resilient optical media can be. Damage to the metallic substrate is usually a bigger problem than most scratches.

Plus, with optical media you can easily make bit-perfect copies!


Actually, hasn't happened all that often. I've had more of a problem with discs arriving broken.


They almost always play nicely.

I still use optical media myself.


Does a speed test report that you have severe bufferbloat issues? I don't experience this issue with Netflix on Comcast at 50/5 (down/up), but I do at 150/15, unless I install a shaper (e.g. IQrouter, Eero SQM). I ask because if you have too-high latency, such as the 1200ms I've seen on Comcast with bufferbloat due to upstream overcommit, Netflix might detect that as a loss of bandwidth.


Try doing "offline" downloads (with "highest" quality of course) in the app then casting to your TV.

For true videophiles there's Kaleidescape but it's less valuable now that there are so many exclusive shows.


Airplay is limited to 1080p, and I don't think it includes HDR. Definitely not what I would call the pinnacle of quality.

I'm surprised that Apple TV apps don't allow you to download content, considering it is based on essentially iOS, but it is clearly a conscious choice.


Are you streaming over wifi?


I just stopped caring and bought a Blu-ray drive, which works fine under Linux nowadays, and can enjoy 30 Mbps FHD encodings where I can just hit pause and have a photo-like still (everything sharp, no blocky shadows). Oh, and yes, I can also buy 2 discs per month of actually good movies I really want to watch (who has time for more than that anyway?) for what Netflix would like to charge me...


I personally have way more than ~4 hours of movie watching time per month. I usually spend about an hour (often more) per day watching movies and TV shows on Netflix, HBO, Hulu, etc.

I've never cared too much about the quality though. The old-school DVD rips that were compressed to 700 MB to fit on a CD were usually fine by me.


Yes, if you like to do that and find stuff, it's great. Especially if you're a fan of their comedy stuff (which is about the same level as public TV here... just a little more international). If you'd like to watch a nice movie, you get about 1 good one per 2 months.


You can also get Blu-rays from Netflix if you watch more. The real costly thing for this approach is tv though, isn't it?


I'm in Europe, so I can get blu-rays from some alternative source, but not from Netflix.


As far as I know, with netflix the quality depends on things like the platform you're watching. Even with the highest subscription plan, if you watch it in Firefox or Chrome you're limited to 720p and you now get about 500 kbit/s, which looks awful. The older content seems to be at 1 or 2 mbit/s, and the 2 mbit/s looks good. But as you can see in the figures, they no longer seem to offer good quality 720p.


Seems to be the case on Linux for both Netflix and Amazon Prime video for me, I’d cancel both but my mum uses the hell out of them and I watch an iPad in bed occasionally.

It’s kinda annoying though I get the business reasons and all that.

It’s just that it used to be better and has gotten worse over time, which really annoys me.


I'm assuming that when you're at as large of a scale as Netflix, if you were to provide bitrates like Apple does, it would probably collapse most of the internet backbone. I think Netflix and Youtube already use something like half of the internet's bandwidth. Now imagine if they just tripled that overnight.

Though I guess there's a difference between allowing the user to switch to it vs making it the default. Youtube lowered the default to 480p to help with congestion during the pandemic, but you can still switch manually. I assume 99% of users use the default, which is why they can get away with it.

But it would be kinda silly for Netflix to provide 48mbps but only to those who know to enable it.


Why not provide it for people who will pay for it? I know I would.


> From just a cursory inspection using my Apple TV 4K

I'm confused, how are you getting this information? Does Apple TV have a way to display the bitrate ceiling?

Also, do you find the bitrate differences noticeable? On my 4k tv, I find Netflix looks pretty good, but Amazon looks pretty bad and HBO Now is awful of course (no Max yet, since I use a Roku.) This might be enough to get me to check out AppleTV+.


> how are you getting this information?

Does your router report its current throughput?


Not Apple TV specific, but Netflix (and I think Prime) have test videos that also show the bitrate as it’s streaming. I used them a while back when setting up my network and access point location.


Apple TV has a Developer overlay with video metadata (requires Xcode & a dev account).

Yeah, there's a clear difference, especially on Disney & Apple. Apple is very close to 4K Blu-ray (there are some YouTube vids comparing them).

I'm using a low end 4K projector, so it's more apparent.


Most routers can log enough information to calculate the bit rate. You’ll need enterprise equipment to show a real time feed per MAC but getting a “total bytes transferred” now and an hour from now is very easy.
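The back-of-the-envelope version of that calculation (sketch, with invented byte counts):

```python
# Average bitrate from two byte-counter readings, as described above.
def avg_bitrate_mbps(bytes_start, bytes_end, seconds):
    return (bytes_end - bytes_start) * 8 / seconds / 1e6

# e.g. a (hypothetical) 7.2 GB transferred over one hour of playback:
rate = avg_bitrate_mbps(0, 7_200_000_000, 3600)  # 16.0 Mbps average
```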


> only a third of what AppleTV+ offers (36/48) and only 2/3 of Disney+ (18/24)

I'm very suspicious of these numbers, I think they might be the maximum transient bitrate and not the average bitrate. Do you have an example of an AppleTV movie or show that actually has an average bitrate that high?


I didn't do anything fancy. I simply turned on the Developer overlay in my Apple TV 4K and played a few 4K movies (if available) in each service. The bitrates were consistent within each platform.

I was very surprised by Apple's numbers (and HBO), so I did some Googling to confirm.


I know this is very unscientific, but I regularly watch a part of a 4K movie on the Apple TV app on my 4K Samsung, and then in 4K on Netflix as well.

Netflix is superior in every regard (easy to browse, quick to buffer, if I rewind it’s instant) but even to a naked eye Apple’s quality is way higher. Less smudgy, more definition.


I watch a lot of anime, and Netflix consistently looks the worst. Recently, Devilman Crybaby in particular looked so bad I thought I was experiencing a technical issue but nope, I was watching the anime at Netflix’s top quality. The banding artifacts in many scenes were intolerable. It’s hard to get excited about Netflix saving bandwidth when their top bitrates consistently have huge artifacting problems.

It’s so bad that I will pay Apple $3 per episode rather than watch whatever garbage bitrate Netflix has decided is suitable for anime.

To my eyes, Apple TV+ consistently offers the highest quality images, with a 23 minute anime episode using 700MB+ total instead of 70-170MB I see on Netflix. Even Amazon is a huge drop in quality compared to Apple TV+, although it is better than Netflix.
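Back-of-the-envelope, those per-episode sizes translate into average bitrates like this (sizes taken from the comment above, assuming a 23-minute episode; purely illustrative):

```python
# Convert a per-episode download size into an average bitrate.
def size_to_mbps(size_bytes: float, minutes: float) -> float:
    return size_bytes * 8 / (minutes * 60) / 1_000_000

# Sizes quoted above for a 23-minute episode.
for service, size in [("Apple TV+", 700e6), ("Netflix low", 70e6), ("Netflix high", 170e6)]:
    print(f"{service}: {size_to_mbps(size, 23):.1f} Mbit/s")
```

That puts Apple TV+ around 4 Mbit/s average versus well under 1 Mbit/s at the low end quoted for Netflix, which is consistent with the visible quality gap.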


Is crunchyroll even higher quality than Apple TV? It seems like every episode is 1.3GB at 1080p.


Wow, you may be right. I haven’t run any tests personally, but subjectively I’ve always been satisfied with Crunchyroll quality. Funimation, too.


I often wonder why. Both Apple and Netflix have their own Edge Cache. Which is just an Appliance sitting inside the ISP network. Bandwidth is essentially cheap / free.


It looks amazing, and while I would like to enjoy such quality and am willing to pay for the premium plan (in fact I already do), I still can't watch the 4K content even in a world where I accept DRM modules in my browser, because some DRM plugins seem to be more equal than others.


Meanwhile, pirates get the content DRM free and they don't pay a cent for it. Yet another instance where DRM hurts the consumer more than the pirates.


Do we still need an Intel Kaby Lake or later CPU for 4K DRM decoding? (And Windows... forgot about that.)

If so, Netflix is on the wrong side of the CPU war with AMD ascending.


The arguments Netflix made for this highly model-specific DRM never made sense to me. There are 4K rips of pretty much every Netflix show available for download on various bays related to piracy.

So why bother locking your paying customers out of 4K content? What's the upside?

Also: I'm reasonably convinced that NetFlix is violating the law, at least in Australia, by advertising that their content is 4K and then arbitrarily blocking access to the 4K streams. This is called "bait & switch", and the fines are eyewatering, even for large corporations.


>Also: I'm reasonably convinced that NetFlix is violating the law, at least in Australia, by advertising that their content is 4K and then arbitrarily blocking access to the 4K streams. This is called "bait & switch", and the fines are eyewatering, even for large corporations.

Nah, they have their asses covered. I went to the website and tried to sign up, and at the first mention of 4k (when you're choosing a plan), there's a disclaimer of:

>HD and Ultra HD availability subject to your Internet service and device capabilities. Not all content available in HD or Ultra HD. See Terms of Use for more details.

If you go to the terms of use, it says:

>4.7. The quality of the display of the Netflix content may vary from device to device, and may be affected by a variety of factors, such as your location, the bandwidth available through and/or speed of your Internet connection. HD, Ultra HD and HDR availability is subject to your Internet service and device capabilities


One could make the argument that this is purposefully deceptive.

The phrasing makes it sound like external conditions, outside of Netflix's control are the reason 4K may not be available.

The reality is they sell two subscription levels: 4K and a HD-only one, but then after you've given them your money, they outright block access to 4K streams on a device that is physically capable of playing the stream.

I can play 3D games in 4K 60fps. I can stream YouTube at 8K. I can watch blu-ray at 4K with 2% CPU utilisation. For crying out loud, I have an NVIDIA RTX 2080 Ti and gigabit fibre!

But no. Sorry sir, your CPU is the wrong model, and Netflix cannot trust it. No 4K for you.


>The phrasing makes it sound like external conditions, outside of Netflix's control are the reason 4K may not be available.

I agree that the phrasing isn't the clearest, but it's not like the actual requirements aren't available. A search for "netflix 4k" turns up this page as the first result: https://help.netflix.com/en/node/13444, which clearly lists what the requirements are.


I could rephrase that entire page with "The compatible devices are the ones that are compatible" and it would provide about as much useful information.

Nowhere on that page does it mention HDCP. There is no such thing as a firmware update for 99.9% of monitors out there.

I meet 100% of the stated requirements on this page, yet NetFlix refuses to play 4K on my high-end computer. It does this silently, without explanation. It simply shows everything as HD, basically gaslighting me.

This kind of thing infuriates me.

I paid for this. I feel like a sucker.


I suspect the arguments are made to make sense to media executives not to the consumer. After all, DRM is an anti-feature for the consumer.


In a lot of cases the decode is happening on the GPU anyway, so requiring a specific CPU is doubly pointless.

I figured it was some backdoor Intel scheme where they paid Netflix to be exclusive.


No. At the time, that CPU family was the only one that had a GPU with support for the necessary DRM schemes and 10-bit HEVC decode.


They also block access if you have old drivers or an older display panel that doesn't support HDCP 2.2.

Which is nuts, because you can buy HDCP stripper boxes for a few tens of dollars from several Chinese suppliers.


They needed Kaby Lake for the GPU not CPU.

And according to https://help.netflix.com/en/node/23931 you can use any CPU if you have an NVIDIA GPU, starting with the 3 GB version of the 1050.


On windows the DRM needed for 4k Netflix is PlayReady 3.0, which is supported on some cards after driver release 19.8.1: https://www.amd.com/en/support/kb/release-notes/rn-rad-win-1...


I am shocked at how many people watch Netflix in their browser. I have multiple different ways (AppleTV, Xbox, TV Smart App) to watch Netflix on my tv, and at least 2 of them support 4k.


I don't have a TV, so if I watch Netflix it's in a browser.

Most people don't realize this but you can only watch Netflix in 4K in a browser if you're using Microsoft Edge on Windows or Safari in macOS 11. In fact, you can only get 1080p if you're running Chrome in Chrome OS, or IE, or Safari. All other browsers, including Chrome and FF on macOS, Linux, and Windows, are stuck at 720p.

Their blog post[0] from 2017 sounded hopeful for higher resolution video on Linux - they wrote "We... look forward to high-resolution video being available on more platforms soon" almost immediately after announcing FF on Linux was supported - but in hindsight it's clear that what they really meant was what you point out - support for things like Apple TV, Xbox, and TV Smart Apps.

[0]: https://netflixtechblog.com/update-on-html5-video-for-netfli...


If you are watching on a windows computer the Netflix windows store app can play 4k and 4k hdr titles.


What differentiates Edge from Chromium on Windows?

Edge hasn't been around as long as Netflix has provided in-browser service, I'd have thought Chrome w/ Widevine would have been part of Netflix' success.


Edge supports PlayReady in addition to Widevine. PlayReady is what gates the higher quality stream. No idea what the technical reason is for this.


Honestly the whole thing is stupid. 4k videos of everything on Netflix are available on the high seas. What’s the point?


Right now stripping Netflix's 4K DRM is possible but it requires buying a new Nvidia Shield TV for every release you do. If you could strip the DRM without buying expensive hardware every time people might be slightly quicker to release the entire 4K Netflix catalog.


That is interesting. How does that work, where you have to sacrifice a Shield per release?


I've tried to do a bit of searching around to learn more about how this works, but failed. Got any citations?


It's not too surprising. Media players (chromecast, appletv, etc.) seem pointless if you already have a laptop and you're not too into movies/tv. Not everyone owns a game console, either because they don't game or they play on PC. Finally, some people (especially hn users) think that smart tvs are a privacy/security nightmare, so use them as dumb tvs.


I agree on the smart tv thing, I don’t use it, but it’s available. But even with numerous laptops and desktops I’m just not very likely to watch on one. It’s harder to watch with my wife or friends on a laptop, or desktop. And even though I have a nice setup I don’t want to sit in my desk chair all night after sitting in it all day. I’m more likely to watch on a tablet while laying in bed or on the couch than an actual computer. But still use the Apple TV 90%+ of the time.


I have an appletv connected to a projector but sometimes I just want to throw on something on the computer when I’m eating/waiting for something else.

And of course because I’m on Linux I can’t get even 1080p encoded at a shitty bitrate unless I force it with a Firefox extension that they can gimp at any time (in a way, many thanks to the engineers who made the extension possible by relaxing checks on the browser).

I don’t understand why most platforms hate their paying users like this - I can and will pirate a Blu-ray quality rip if I can’t pay for it easily.


> I am shocked at how many people watch Netflix in their browser.

They don't. They're just very noisy.


That seems like a pretty impressive achievement!

Question for anyone here:

What do you use to play 4K/HDR? I have an Apple TV 4K, which can do 4K Dolby Vision playback and looks ok, but the Apple TV tends to have some jittering when streaming certain shows (very noticeable in panning shots of animation). The sound quality is also noticeably worse on my set with it (it doesn’t seem to be able to do direct pass through to my AV unit, always PCM/decoded on the Apple TV).

On the other hand I have a Shield TV that does direct pass through of audio and sounds much better, and also seems to do video playback without that occasional jittering. It does not seem to support Dolby Vision though. Additionally the UI is very, very, very laggy after recent updates.

Does anyone use anything that doesn’t have either of these issues?


I use the native apps in my Samsung Q70 TV. They are really good, and I rarely find a need to use my 4K Apple TV anymore. Even the Samsung Apple TV app and the AirPlay 2 support are excellent.

As it is, Apple has screwed up HDR on the 4K so badly that it's very difficult to make it work smoothly without also screwing up HDR on other devices.


Agreed. I have a Sony Android TV, and with Amazon Prime 4K HDR the content has really good quality. The problem is that after some time it starts to buffer/jitter, and I think it's due to Android and the bad processors Sony used in the TVs.

My Apple TV 4K obviously runs much snappier, but I find the HDR quality really dark and lower than on native Android TV.


The difficulty for me is that my TV doesn’t have enough HDMI inputs (LG B7). I use 7 through my Yamaha receiver, but have struggled to get ARC (audio from the TV through the sound system) to co-exist with also having everything else go through the receiver.


I have a Yamaha receiver also. Occasionally I need to power-cycle the receiver or the TV to get ARC to work again, but other than that, I've had no problems with it. I run everything through the Yamaha.


The jittering is caused by the Apple TV defaulting to 60p frame rate for all content. But if you turn on “match frame rate” the jittering goes away.


I can’t understand why it wouldn’t default to match the content’s frame rate. How can anyone watch content with a frame rate that doesn’t match (or is a multiple of) the source material? Maybe modern TVs can hide those issues with interpolation, but I wonder how this isn’t a bigger and more widely discussed topic. It’s my understanding that the Apple TV even handles this better than other devices, which sometimes simply don’t seem to bother to offer this option at all.
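The mismatch is easy to quantify: on a fixed-rate output, each content frame has to occupy a whole number of display refreshes, so 24 fps content on a 60 Hz output falls into a 3:2 cadence and 25 fps content into something even less regular, which is exactly the judder visible in pans. A rough illustrative sketch (not how any real device implements cadence):

```python
def refreshes_per_frame(content_fps: int, display_hz: int, frames: int):
    """How many display refreshes each content frame occupies on a
    fixed-rate output that can't match the content's frame rate."""
    counts, prev = [], 0
    for i in range(1, frames + 1):
        end = int(i * display_hz / content_fps)  # refresh where frame i ends
        counts.append(end - prev)
        prev = end
    return counts

print(refreshes_per_frame(30, 60, 6))  # [2, 2, 2, 2, 2, 2] -- even, no judder
print(refreshes_per_frame(24, 60, 8))  # [2, 3, 2, 3, 2, 3, 2, 3] -- 3:2 cadence
print(refreshes_per_frame(25, 60, 8))  # [2, 2, 3, 2, 3, 2, 2, 3] -- worse still
```

Matching the output rate to the content (24 fps on a 24 Hz mode) makes every frame last the same number of refreshes, which is why "match frame rate" fixes it.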


Before I turned that on, watching BBC documentaries in 25 FPS the judder was very noticeable on panning shots. Definitely worth enabling if your device supports it.


At least Apple TV supports this feature, even if it's off by default.

Google simply "decided" for Chromecast users that 60 Hz is fine and that nobody needs 24 fps or 50 fps content.


I used to use Linux, but the DRM makes that really hard. I'm restricted to HD from Netflix, so some media I access 'differently' for the media format of my choice.


I use an Apple TV 4k into a Denon AVR-X4500 which then outputs in to my 2018 LG 65” TV. Match frame rate is on. All supports 4K Dolby Vision and Dolby Atmos (I have ceiling height speakers as well as normal surround). Looks and sounds great.

I also have a second setup which is similar to the above in a dedicated cinema room with projector as well, but with even more speakers.

If you have a receiver, it should be where your video playback devices attach. Only the receiver attaches to the TV.


HDR10 is fine right now. I don’t know of any TV that has a 12-bit capable panel or can produce over 1,000 nits of brightness. HDR10 supports 4,000 and Dolby Vision supports 10,000. TVs need to get a lot better before Dolby Vision makes a difference.


The per-scene dynamic metadata of Dolby Vision is what makes the most difference in my opinion, not the theoretical max nits / bits. I can see a pretty dramatic difference on my LG C9 OLED when watching HDR10 vs DV.


I wasn’t aware! I have an LG B7 and its maximum brightness is like 800 nits, and even that seems too bright a lot of the time, so I turn on “power save” mode which dims it by about half. No wonder I never noticed any difference.


Yes, unless you have a new TV that supports eARC, it won’t pass Dolby Atmos to your receiver from the TV. There is a solution to this which is currently on preorder. https://www.hdfury.com/product/4k-arcana-18gbps/


So if I understand correctly, using this device you can e.g. hook up a HDMI 2.1 device directly to a HDMI 2.1 TV, then pass the audio through an eARC port to the device, which then outputs a normal HDMI signal with full audio support (including Dolby Atmos) you can plug into a HDMI 2.0 receiver that has no eARC or HDMI 2.1 support?

If that’s how it works that would be awesome as I bought an expensive 11.2 receiver not long ago, but when the new consoles are out I want to have VRR and ALLM, which means HDMI 2.1, which means it has to go straight into the TV, which means I would lose Dolby Atmos (assuming PS5 will support it of course, but that seems pretty likely)


Of course it's an HDFury product <3


I have a Vizio Quantum X and a PS4 Pro; both work great. YouTube videos show the occasional dropped frame on the Vizio app, and generally 0 dropped frames on the PS4.

I don't have the best speakers or room acoustics, but I can't complain about the sound.


I have a PS4 which won’t do the trick, but was planning on a PS5 when it comes out so maybe that’ll be the answer.


I was using a FireTV Cube and recently switched to the latest Shield. Both of them were playing content fine until I started playing with receiver settings and discovered that I didn't have HDR enabled on that HDMI port. After enabling it, I started to have a lot of jitter and blackouts (though while it worked, the picture was better than before).

The solution was to buy "premium" HDMI cables that are rated for 4K HDR.


I'm using a 2018 model-year Samsung 4k smart TV. I'm pretty sure it's running Tizen OS and it's surprisingly good. I'm doing sound out via ARC to a Sonos Beam in a 5.0 with the IKEA/Sonos speakers as rears. So far my only complaint is that the Tizen Plex app doesn't play nice with DTS audio even though the TV has a native DTS decoder.


"I have an Apple TV 4K, which can do 4K Dolby Vision playback and looks ok, but the Apple TV tends to have some jittering when streaming certain shows (very noticeable in panning shots of animation)."

How don't more people complain about this? I avoid streaming on the Apple TV because it does some sort of bizarre framerate thunking that is just brutal for panning. Do so few people use the product that it just goes unnoticed?

My LG 4K TV has fantastic Netflix, Prime, and Disney+ clients. HDR, 4K, etc.


I believe most people just don't notice the judder, and many of the rest don't care. E.g. it took a lot of convincing to get Google to add "use 50p output" to Chromecast settings years ago (and that didn't fix 24p of course, just European 25p/50p which is more severely affected).

Also, most high-end TVs are able to recover the original 24p (or other) frame rate and thus remove the judder by using specific settings: https://www.rtings.com/tv/reviews/lg/c9-oled/settings#judder

I tested with a high-speed camera that with correct settings my old Samsung UE75H6475 was able to recover the original frame rate perfectly from quite a few framerate mangling combinations. Haven't done similar testing with my current C9, though.


"I believe most people just don't notice the judder, and many of the rest don't care"

This is probably the case. DAZN took over NFL streaming in Canada and for the first two years seemed to use their existing European soccer processing chain (they might still --- I gave it a try two years straight and then gave up). So the 60/30 NFL stream was re-encoded to 25/50, and then on playback on my set would be displayed at 30/60. It was brutal, and even if displayed at 25 or 50 FPS it was still brutal because they were seriously corrupting the NFL stream.

I tried it across a number of devices -- AppleTV, Chromecast, different TVs, pads, laptops -- and it was just unbelievably intolerable to me. Every panning pass was a horrendous juddering mess. Yet somehow no one seemed to have a problem with this! In discussions it seemed to be a non-issue.


I also have a 4K LG TV and “frame rate thunking” is the perfect term to describe it.

I wish it would just give the direct output (video/audio) to the TV and receiver. No idea why they need to process it all on-device.


Have you tried turning “match frame rate” on?


I had tried it a while back but stopped because anytime you go into or out of video streaming it would black out the whole screen for a second or two before starting to buffer the content. It’s particularly annoying if trying to search for a particular episode of something.

That’s probably the ticket for it though. I find the lack of audio pass through to be the more annoying piece though. I have a machine that costs significantly more than the Apple TV and has knowledge of all of the attached speakers to do that decoding.


> I had tried it a while back but stopped because anytime you go into or out of video streaming it would black out the whole screen for a second or two before starting to buffer the content. It’s particularly annoying if trying to search for a particular episode of something.

That’s the correct behavior though and not specific to the Apple TV. Any device that actually does switch and match the content’s frame rate will cause the output display to resync/adjust with a short black screen. The alternative of not adjusting frame rate is much worse, and it’s such an underrated problem in video playback in my opinion. The Apple TV handles this better than most devices.


What exactly is shot-based encoding doing here?

Even old XviD would reliably always insert a new I-frame / start a new GOP on a scenecut, and perform global rate optimization based on scene complexity within the target ABR parameters.


Their older articles talk more about shot-based encoding specifically (and before that, per-title encoding) as they applied it to their non-4K content:

https://netflixtechblog.com/dynamic-optimizer-a-perceptual-v...

https://netflixtechblog.com/per-title-encode-optimization-7e...

A few relevant points:

- Their previous systems used fixed keyframes, so wouldn't be using scene-change detection at all (I presume this was to allow predictable chunking across different codecs)

- Since reliable streaming performance is a pretty big deal for Netflix, they probably have quite tight restrictions on VBR modes, which make them not work as effectively since they have less "room" to work with


It sounds like the link between GOP and "shot" (single camera shot, presumably) is being done explicitly. I would've expected this to happen automatically through their production pipeline but I guess it wasn't.


The best (worst?) example I know of this going horribly wrong is the title sequence for the Big Bang Theory, which speeds up as it goes along and then cuts to the cast.

Math, science, history, unraveling the mystery <as the pixels start to unravel>, it all started with a Big Bang. BANG (Bang, your screen is a riot of random pixels. Oh wait, here’s the cast instead!)

Then I caught an episode on another service, maybe Netflix? The transition wasn’t awesome, but it wasn’t awful either. It only glitched for a frame or two instead of a few seconds. Clearly better heuristics were in play.


That’s great. I’d love to see how they fare on this glitter, which seems to destroy all compressors: https://m.youtube.com/watch?feature=youtu.be&v=MG_Lyg74UlU&t...

(I saw a similar problem with a famous monarch butterfly footage that I don’t have time to chase down)


The HBO intro’s white noise background is pretty bad too. Quite blocky and degraded.


Wow, that really was awful. Does this sort of thing require a specific optimisation to fix, or is there a general technique that would fix this and other similar content?


Codecs with a better understanding of what a complex scene looks like (and that allocate more bitrate accordingly) will do better, and codecs that are better at blurring away details instead of blocking will do better as well.

Keep in mind this is on Youtube and uploaded by some shitty clip site, so it's a very biased example. In this case there are four lossy encode steps involved: master -> bluray -> clipsite's master -> Youtube encode. Youtube in particular has absolutely horrific, just inexcusably bad bitrate even at the highest quality they'll give you. I'd have to see what the original Bluray looks like in this case to see if there's really a problem worth worrying about here.


Actually, I first saw it on a plane and wondered how we have gotten to this stage where obviously garbled video is OK to give to paying customers. Yes, more bits will solve this, but it’s still a great example that breaks encoding assumptions.


There isn't necessarily a fix. All compression algorithms expand the size of the data on most inputs in order to make certain inputs with sufficiently regular patterns a lot smaller; it just so happens that most of the images/video/audio/text/etc. people are interested in are fairly regular.

With a lossy compression algorithm that is required to hit a certain compression ratio to keep the bitrate under some threshold, the only way to guarantee that ratio on incompressible input is to introduce regularity/remove randomness, and in the case of video the easiest and least visually distorting way to do so is to pixelate it. This particular video is basically random noise, which is incompressible by definition, which is why the result is so bad. If it was 1080p going in and the output size is required to be reduced by 16x, reducing the resolution to 480x270 is about as good as you can do.
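That pigeonhole argument is easy to demonstrate with a lossless compressor (zlib here, but the principle is the same for any codec):

```python
import os
import zlib

regular = bytes(range(256)) * 4096   # 1 MiB of a repeating 256-byte pattern
noise = os.urandom(1024 * 1024)      # 1 MiB of random bytes

# The regular input shrinks to a tiny fraction of its size; the noise
# doesn't shrink at all (it actually grows slightly from stream overhead).
print(len(zlib.compress(regular)) / len(regular))
print(len(zlib.compress(noise)) / len(noise))
```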


And if they would start providing 4K or at least 1080p to Linux users I might consider subscribing again. Streaming services are popping up left and right while at the same time turning into their own specific ghettos of supported platforms and features. It makes you almost value the role of the traditional network aggregator. Maybe the market will eventually consolidate or organize itself so you can buy a full service from a single provider that knows what it's doing technically.


Still getting crappy 2.5 MBit/s visuals on an "HD" plan on a gigabit connection in Sweden. Not impressed with their real-world image quality. I don't think you should need to buy a 4k plan to get proper 1080p with a decent bitrate/image quality.


It sounds like Netflix might not be doing the horrible thing YouTube and Twitter do where they treat low-resolution (e.g. standard definition) content as being less deserving of bitrate with only higher resolutions allowed to climb the bitrate ladder? For those sites I have to upscale SD content to HD resolutions when uploading just for it to be allowed a less paltry bitrate.

I fear people who grow up in the age of streaming might not realise that DVDs had good video quality because streaming services seem to hate SD content.

> As a side note, we do have some additional points, not shown in the plots, that are used in resolution limited scenarios — such as a streaming session limited to 720p or 1080p highest encoding resolution. Such points lie under (or to the right of) the convex hull main ladder curve but allow quality to ramp up in resolution limited scenarios.
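For reference, the "convex hull" selection the article describes can be sketched in a few lines. This is my own toy illustration, not Netflix's actual algorithm, and the candidate numbers are invented: from candidate encodes at several resolutions, keep only the (bitrate, quality) points that no straight-line mix of cheaper and pricier points can beat.

```python
# Toy sketch of convex-hull ladder selection over candidate encodes.
def convex_hull_ladder(points):
    """points: (bitrate_kbps, quality, label) tuples. Returns the hull,
    sorted by bitrate, with dominated and concave points removed."""
    hull = []
    for p in sorted(points):
        # skip points that don't improve quality over a cheaper one
        if hull and p[1] <= hull[-1][1]:
            continue
        # keep the hull concave: drop the last point if it lies on or
        # below the chord from its predecessor to p
        while len(hull) >= 2:
            (b1, q1, _), (b2, q2, _) = hull[-2], hull[-1]
            if (q2 - q1) * (p[0] - b1) <= (p[1] - q1) * (b2 - b1):
                hull.pop()
            else:
                break
        hull.append(p)
    return hull

# Invented candidate encodes: (kbps, quality score, resolution).
candidates = [
    (1500, 60, "720p"), (3000, 70, "720p"), (4000, 81, "720p"),
    (2500, 72, "1080p"), (3500, 80, "1080p"),
    (5000, 86, "4K"), (8000, 90, "4K"),
]
for bitrate, quality, res in convex_hull_ladder(candidates):
    print(bitrate, quality, res)
```

Here the 3000 kbps 720p point is dropped because the 2500 kbps 1080p point beats it outright, and the 4000 kbps 720p point is dropped because a blend of the 3500 and 5000 kbps points would beat it; the "resolution limited" extra points the article mentions are exactly the ones this filter would discard.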


Since the credits can run for 5 minutes or more these days, those can be optimized by converting to a font rather than a bitmap of the screen. Transmit the font, then transmit the text. Should be able to get gigantic compression this way!


If only they would spend this much effort optimizing the quality of their screenwriting.


Are snarky comments like this really necessary or relevant to what's being discussed here?


No, they aren't. I usually resist the urge, but it's been a long week.


Without understanding all of the technical details, am I the only one who finds Netflix's quality on Android insufferable? Technically, I guess it's Full HD, but the compression is so aggressive that any scene with slightly darker sections will instantly become blocky and pixelated. Is there any way around it? Honestly it's not worth paying for, as it's almost unwatchable.


Not supported in Linux, presumably, since you can't even get anything higher than 540p or 720p in Linux due to Widevine DRM restrictions.


Well, you can. Just not from Netflix.


Do other people have the same issues I have with streaming dark content? I watch a lot of horror, and the compression on Netflix and Shudder (and probably HBO + Hulu) causes really horrible banding in dark scenes.

Literally if they could just fix this I would have no problems with streaming quality.


Any idea why none of these services supports queueing some movies you’d like to watch, downloading them in the middle of the night in astounding bitrates, and allowing you to watch on your TV? As far as I can tell only the mobile apps support this, not the “streaming boxes.”


Take a wild guess.

(it's digital rights management and "Woo, PCs are scary!")


The streaming boxes probably don't have enough storage to hold much. There's Kaleidescape but it's very expensive.


Why not give them enough storage? 32 GB of flash is more than enough and now you get to sell a whole new generation of boxes.


Now they only need to offer the 4K streams on an affordable plan for single household customers. It’s unfair to pay for the 4-stream family plan to get 4K for double the price of the single stream plan!


That's not going to happen; Netflix needs to push its ARPU up quite significantly in the next few years given how much cash it's burning (and it explicitly promised shareholders it would stop doing that last year).


Uhh, so we're supposed to be happy about a variable bitrate and quality because it saves them some bandwidth? This isn't a feature for users it's a spin on a feature for the corporation.


I don't know where you live or what your internet setup is like, but I can't stream 4k at ridiculously high bitrates consistently, and apparently many of Netflix's customers can't either: "The number of rebuffers per hour go down by over 65%; members also experience fewer quality drops while streaming." Additionally, "members who were limited by their network to 720p can now be served 1080p or higher resolution instead."


I'd guess it means that the average user gets to watch a higher quality video with less buffering. If you also watch netflix on your phone or any data plan, you save money.


Also helps the consumer. My ISP doesn't offer unlimited bandwidth. Covid-19 has me hitting caps from being home so much.


The article shows higher quality/sharpness even at lower bitrates. This also helps people on metered or slower connections, which would include mobile consumption.


> so we're supposed to be happy about a variable bitrate and quality because it saves them some bandwidth?

Yes, because it saves you money - Netflix could be increasing its subscription cost, or it could keep it down as the subscriber base grows, by using techniques like this.

As long as you don't notice the difference, what's wrong with them saving some bandwidth?


Why is raising their subscription price the alternative? Wouldn't Netflix's capability to negotiate peering/transit deals grow along with their subscriber base?


As bandwidth grows, people choose to keep upping the fidelity. If the subscriber base were OK with 480p video, that's what they would be served. And negotiating peering/transit deals is fickle and isn't guaranteed to work, as cable companies may be playing politics/business games to squeeze Netflix.

Therefore, a technical solution is the next best option.


Wow! That is crazy. Love how they use "The Dirt" to show it lol. Guess there are a few rockers @ Netflix lol.


Yet they still can't properly render 21:9 content on a 21:9 display. You get a shrunk picture with black bars on all sides. Even their "Originals" or N-Series or whatever they call it now have this problem.

Here's hoping someone from Netflix is in the comments and can act on it, because their support system hasn't done anything in the 2 years since I brought it to their attention.

edit: Actually, I tried to track down the film used in the new encoding (I think it's The Dirt from the signs and dates seen in the frame captures) so I could screenshot for comparison. It actually worked in 21:9 fullscreen.

Was it the new encoding? I saw this problem up until this past week, most recently on Maniac but I checked that show and it's no longer an issue.

From the post, I suspect the old method was what added the extra black bars at the top and bottom of 21:9 content:

> with fixed 4K resolution bitrates — 8, 10, 12 and 16 Mbps — regardless of content characteristics

But I truly don't know what it was. I just know that video no longer ignores 40% of my screen real estate by watching on a 21:9 monitor since sometime last week.

THANK YOU! That has been annoying me for years.

But please add 3440x1440 to your testing. It's not shown in those charts.


Maybe I missed something, but scene-by-scene encoding--a model similar to scene-by-scene color correction--for feature films destined for digital release has been a thing since DVDs first arrived in the late 90s.


This is incredible!

also, what is the movie or show sampled in that blog?


Wow! Those optimized encodings look really good.


How many titles on Netflix are available in 4K? I did a quick search and it’s either incorrect or there’s not that many, mostly original Netflix content.


TL;DR: More 4K adoption lead to higher traffic costs for us, so we have to optimize that.


ELI5?

Edit: Why the downvotes?


Instead of one optimization profile for a whole movie, Netflix is detecting when the shot of a movie changes (the camera "cuts" from one shot to another) and beginning a new optimization profile specifically for the content of that shot.

At least, that is my understanding.


That sounds so intuitive that I want to follow up with the question: Why isn't this the default? Is detecting a cut particularly difficult?


Coarsely detecting cuts is relatively simple - look for large frame-to-frame differences (e.g. encode however -> find large frames surrounded by smaller ones -> done, it's as accurate as your perceptual compression is). There are a number of ffmpeg-using tools out there doing this and other "cut to / from black" detection and it's pretty good. Not good enough for a human to say "yeah, these are all scenes", but probably good enough for picking things to re-encode like this.
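The "look for large frame-to-frame differences" idea can be sketched in a few lines. This is only a toy illustration of the intuition, not any real tool's implementation; the frames are flat pixel lists and the threshold is invented:

```python
# Toy cut detector: flag a cut when the mean absolute difference
# between consecutive frames spikes above a threshold. A real tool
# would work on decoded video frames or compressed frame sizes.

def mean_abs_diff(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def detect_cuts(frames, threshold=50.0):
    cuts = []
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i - 1], frames[i]) > threshold:
            cuts.append(i)
    return cuts

# Two "shots": dark frames, then bright frames.
shot_a = [[10, 12, 11, 10]] * 5
shot_b = [[200, 198, 201, 199]] * 5
print(detect_cuts(shot_a + shot_b))  # -> [5]
```

The hard part isn't this loop, it's picking a threshold that survives fades, flashes, and fast motion.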

The harder part is the significantly increased compute use due to re-encoding things multiple times, to detect these cuts and to try to find the best encoding. Heuristics there can be arbitrarily complex and re-calculate any number of times. I imagine it hasn't been done earlier just due to cost, though maybe they've recently achieved a better heuristic.

edit: ah, great, they link to a "dynamic optimizer" post that goes into this in some detail: https://netflixtechblog.com/dynamic-optimizer-a-perceptual-v...


Interesting that it makes sense now because the economics of it have shifted, as the shift away from broadcast and towards individual stream has happened.


Seems to me that if the frame-to-frame difference isn't big enough to detect that way, it's not likely to benefit from a new I-frame, yeah?


That's the basic idea, yeah. It falls apart in a couple places, e.g. when the cut or fast-fade goes to a very cheap frame like a mostly solid color, and it may not detect stuff like whip-cuts (since a whole chunk of frames are expensive), but so many scenes in so much media have single-frame cuts that it's well within that "good enough" range.

And for dynamic encoding like this: when it's wrong, it's not visually worse in that scene than choosing that sub-par encoding for the entire movie, which has the same "choose the best encoding" problem as individual chunks have. I assume it'd be relatively rare for it to result in anything worse than a one-shot strategy.

---

ffmpeg will let you easily do frame-to-frame-diff logic that lets you chop videos into scenes, for example: https://video.stackexchange.com/a/30701 I'm not sure how much it handles compressed-frame differences, but it shouldn't be too hard to build around it. Just might be a bit beyond bash-friendly.


Imagine if the compression for the movie "Hero" couldn't exploit the color themes in each act of the movie. The result would be much bigger.


"Variable bitrate" encoding is a thing, which allows chunks of a file to be encoded at a higher or lower bitrate depending on how much is happening. I don't know exactly how it happens, but I assume that each chunk is determined by duration or size, while Netflix's new method determines each chunk by the content.

In order for this to be the default, you'd need either humans or pattern recognition algorithms to identify the chunks. You also need to quantify by how much each chunk's encoding parameters need to be tweaked.
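A toy version of that "quantify how much each chunk needs" step, with invented complexity scores (this is just the intuition behind rate control, not any encoder's actual algorithm):

```python
# Toy rate control: split a total bit budget across chunks in
# proportion to each chunk's measured complexity (motion, detail),
# which is roughly what variable-bitrate encoders do internally.

def allocate_bits(total_bits, complexities):
    total_c = sum(complexities)
    return [total_bits * c / total_c for c in complexities]

# A talky scene, an action scene, and a static credits shot.
budget = allocate_bits(10_000_000, [1.0, 4.0, 0.5])
print([round(b) for b in budget])  # -> [1818182, 7272727, 909091]
```

The interesting problem is upstream of this: measuring "complexity" in a way that matches human perception, which is where metrics like VMAF come in.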

Monetary costs aside, that's increasing complexity of your pipeline with relatively small gains. I'd bet that more companies will start looking at similar approaches now that 4K HDR (and 8K) are becoming more common. Probably not worth the R&D for 1080p, but we'll see tricks like this start to trickle down, I'm sure.

It's highly unlikely we'll see anything similar in FOSS tools in the near future.


Right, these gains are mostly relevant when you have a relatively small library compared to how many views each item gets. For example, this would probably not be worth it for Youtube, except maybe on individual hyper-popular videos. Every kb that Netflix can shave off of a popular movie means terabytes of bandwidth saved.


Even if you have to pay somebody to tag the scene changes manually that cost would be pretty insignificant to Netflix.


Netflix should be able to get the EDL, which would probably take care of most of the work.


Traditionally, you just encode the video with a new key frame every X frames, regardless of shots.

It might sound trivial, but it is still extra computational work to figure out when the shot changes.


You expect a video to play at exactly 1 second per second. But compression or network bandwidth can screw that up. What do you do?

Ages ago they solved this by making some frames more important than others. Key frames being the top priority. If you are falling behind you can drop everything you are working on and skip to the next key frame. Then they added other priority levels so you can dump part of the work instead of all of it when things are only a little choppy.

With a zip file if you don’t have the middle of the file you can’t figure out the end. So clearly video is doing something different if they can remove chunks and still work.

Many of our compression techniques for video “hang” off the key frames. All the pictures around them are described as a set of changes to the key frame. Much smaller, but also why your screen occasionally looks like a horror show. Something went wrong and the changes were shown using the wrong starting point.

If the key frame is right before the scene changes completely, then you will struggle to compress the transition. There’s not enough space to describe all the changes. But if you have a program look for the transitions, you can delay or advance the key frame a bit so it lands exactly on the camera change. Bang. New picture. Then everything looks right.
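The "delay or advance the key frame so it lands on the cut" step might look like this toy scheduler (frame numbers and interval are made up; real encoders expose this as scene-cut detection plus a max keyframe interval):

```python
# Toy keyframe placement: enforce a maximum keyframe interval, but
# when a detected cut arrives first, snap the keyframe to the cut.

def place_keyframes(n_frames, cuts, max_interval=250):
    keys, last = [0], 0
    for f in range(1, n_frames):
        if f in cuts or f - last >= max_interval:
            keys.append(f)
            last = f
    return keys

# 600 frames with cuts at 120 and 400, default interval 250.
print(place_keyframes(600, {120, 400}))  # -> [0, 120, 370, 400]
```

Note how the cut at 120 resets the interval clock, so the next forced keyframe lands at 370 instead of 250.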

Today it’s gone a lot farther. There are hints and patterns that span a whole movie or a chunk of it that for instance clues in that the whole movie is shot in sepia tone or a dark cave so you can guess the next scene will be sepia too. They’re talking about lining those attributes up with big shifts in the camera work, like someone coming out of a cave into bright light.


Isn't this... VBR?


Maybe VBR just works on constant-sized chunks, whereas this works on chunks that are computed to be optimal.


I mean, not really: adaptive scene detection for VBR is a long-standing thing.


Will The Office look better is all that matters


Never really understood this meme. Imagine liking a show so much that it's one of the only things you care to watch, yet depending on streaming services to watch it.


I wish I knew what any of this meant but it sounds awesome


A lot of those graphs seem to show that video quality at 1920x1080 is going down significantly. Does that mean you now need to be on their most expensive plan and using one of their approved devices/platforms to continue receiving the same video quality, now only obtainable with 2560x1440 and higher resolutions?


How’d you get that out of the article? I read it as, across the board, you’re getting better quality at a lower bitrate. They talked about being able to serve 1080p at the same bitrate they were serving 720p (where they talked about mobile devices).


All 4 of the Bitrate vs Video Quality graphs have two 1920x1080 data points on each line. All four graphs show the data points lower on the Video Quality axis under the optimized ladder.

Sure, the first graph shows 1080p at higher quality for lower bitrate to the 720p on the optimized ladder, but if I'm resolution-limited (on a phone) and Netflix isn't streaming higher quality and downsampling on the device, the changes here show that my perception of quality will suffer. (I feel for folks stuck with a 720p screen, if they're just streaming native resolution it looks like they'll be getting worse than previous 480p quality levels.)


There are two dimensions to these graphs. Yes, they've drastically decreased bitrates, but if the cheaper plans still cut off those curves at 720p or 1080p, then you're not necessarily gaining video quality.


The chart does not seem to show every bitrate for every target resolution, but rather shows where the 4K stream would have to drop to a lower resolution given insufficient bandwidth.


https://miro.medium.com/max/2000/1*c28F7YXjNo-GpmB9bBmbrA.pn...

This chart clearly shows multiple data points per resolution under the old fixed-ladder scheme, and under the new scheme the new data point for 1080p is lower on the video quality scale than any of the old 720p encodes. That's a pretty significant difference, and my question about where these new encodes fit into their existing price structure remains valid.


> As a side note, we do have some additional points, not shown in the plots, that are used in resolution limited scenarios — such as a streaming session limited to 720p or 1080p highest encoding resolution. Such points lie under (or to the right of) the convex hull main ladder curve but allow quality to ramp up in resolution limited scenarios.

They're showing the curve that is used for connections that want to be streaming 4k. So if you're on the 1080 stream in the charted scenario, it's because you're bandwidth-limited. They have higher-quality, higher-bitrate encoding curves for lower-resolution connections. However, it doesn't make sense to use that 1080 encoding for a 4k connection. If you could afford the bitrate, you'd rather be streaming the higher resolution from the displayed curve.
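The convex hull construction behind that ladder can be sketched like this. The (bitrate, quality) candidate points here are invented, and the real selection over per-shot VMAF scores is far more involved:

```python
# Toy ladder construction: from candidate (bitrate, quality) encode
# points, keep only the upper convex hull, i.e. encodes that no mix
# of the others can beat at the same bitrate. Numbers are made up.

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def ladder(points):
    hull = []
    for p in sorted(points):
        # Pop points that fall on or under the chord to the new point.
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()
        hull.append(p)
    return hull

candidates = [(2000, 60), (4000, 75), (6000, 78), (8000, 88), (16000, 92)]
print(ladder(candidates))  # the (6000, 78) encode gets dropped
```

Points "under the hull", like that dropped 6000 kbps encode, are exactly the ones the article says get kept around separately for resolution-limited sessions.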


The article addresses this point:

> As a side note, we do have some additional points, not shown in the plots, that are used in resolution limited scenarios — such as a streaming session limited to 720p or 1080p highest encoding resolution. Such points lie under (or to the right of) the convex hull main ladder curve but allow quality to ramp up in resolution limited scenarios.


Thanks for pointing that bit out. It would still be useful to see where those other points lie. It seems counter-intuitive that they would need a higher bitrate for 1080p than 1440p when both are encoded to the same quality. Maybe their quality metric is slightly biased in favor of having more pixels and is more concerned with accurate reproduction of high-frequency components of the image than having an artifact-free low noise reproduction at a lower resolution?

Edit: Looking at their animated GIF comparisons, the newer encodes definitely give a much sharper image, but also have a lot more in the way of ugly artifacts:

On the left, under the tree branch: https://miro.medium.com/max/2000/1*SQQkYltbVC-HT-l8GpN-7Q.gi...

The right side of the frame: https://miro.medium.com/max/2000/1*A37BBzK4Ap8JPhmiUiFKcw.gi...

Many of the areas that should be fairly flat looking: https://miro.medium.com/max/2000/1*qI3EY7HDp3p0gwK9YGqchQ.gi...

There's definitely a lot more ringing and similar artifacts in the newer, lower-bitrate samples despite their overall increase in effective resolution.



