What the new video compression strategy from Netflix means for Apple and Amazon (donmelton.com)
158 points by ca98am79 549 days ago | hide | past | web | 97 comments | favorite



I wonder at his premise that consumers are choosing things based on wanting larger file sizes and higher bit rates. Most of my friends literally could not tell you anything about their mp3 or movie collection in terms of bit rate. For half of them, you'd need to explain what "bit rate" means before even asking the question. I only think about it for my DJ music collection (and VBR is fine in that context; I'm just ruling out CBR stuff below ~192kbps, because sometimes it sounds a little harsh on the high end over the big speakers); I never worry about it for video or music streaming. If it's HD and looks/sounds OK, I don't think about it at all. Netflix and Amazon both have acceptable quality, so I don't think about it, I just consume it.

I think the success of Spotify, Pandora, and Rhapsody is proof that consumers don't care about quality. I don't know exactly what bit rate they're streaming at, but it sounds pretty bad on mobile, so I assume it's something quite low. But even though I recognize the crappiness of it, sometimes I listen to them in the car (my truck has a crap stereo anyway, so no big deal there).

In short: Cool article, but the suggestion that consumers will stop it because they want bigger files seems weird.


I think you're missing the point. Consumers don't care, until some listicle website or advertising firm points out that Amazon/Apple short you on your mp3s by not even offering 128kbps all the way through.

Or imagine Amazon goes with pure VBR, then Apple makes an ad claiming their sound quality is "better" because their bitrate never dips below 128kbps. It's bullshit, but how is an average consumer supposed to figure this out? They'll probably err on the side of caution and buy the CBR version, since "it can't be any worse than the VBR one, but I don't lose bits and it's the same price!"

The whole article was talking about streaming vs. downloading. Streaming is _fine_ and Netflix will probably get away with their compression, but will Amazon/Apple be able to do that with downloads? He doesn't think so. People are fine with Spotify/Pandora because there is no perceived ownership of the songs they're streaming. People who actually buy and download audio or video have money in the game, so they want "the best", and any loss of that is viewed as Amazon/Apple screwing them over.


Has that ever happened? I mean, have there been consumer revolts over bit rate that have cost Apple or Amazon customers? Pono and Tidal don't seem to be killing the existing players, but perhaps I'm just not up to speed on the state of the industry.


The opposite has happened. Steve Jobs introduced H.264 as providing significantly higher quality at the same or lower bitrate. He shows a still from an existing MPEG-4 video and then a much higher-resolution still from an H.264-encoded MPEG-4 video and claims that they require the same bitrate. He brings out Frank Casanova, who goes on for 5 minutes or so about quality at lower bitrates. [0]

IIRC, Apple also sold AAC as requiring lower bandwidth for the same quality, but I don't recall when that was.

0 - https://youtu.be/dPCNUExWR6I?t=3271


It is a bit disingenuous to compare video codec quality using stills. One can simply choose a key frame (= no temporal compression) to compare, and they'd likely be identical regardless of how well the codecs compress motion in non-key frames.

(Not sure if the video clarifies this point since I cannot watch it right now.)


There is no actual "revolt" because both Apple and Amazon make sure it doesn't happen by using CBR. You're right in that consumers don't care about it all that much.

But let's say Amazon decides to go with VBR to save space and download speed, now there's an easy way for Apple to attack Amazon. "We never dip below a certain quality. Amazon does. We care. Amazon doesn't"

Maybe that ad/slogan works, or maybe it doesn't. But if you're Amazon would you be willing to risk some weird consumer backlash over it? And if you're Apple it is an easy point to attack, and if it doesn't work, no harm no foul (and then secretly also switch to VBR and announce it at the next Apple conference!)


And the VBR company can claim that their codec goes higher than the others when it needs to, giving better definition to the complex parts of the sound. It's not a hard sell, and most people won't care.


Except I'm reasonably intelligent and still get CBRs, because irrationally I don't trust VBR files (especially VBRs with DJ software).


Have you noticed a difference in handling of VBRs in your DJ software? I use Mixxx, and that's not something I've noticed. Someone I regularly tag-team DJ with uses VirtualDJ, and he's never complained about me sharing VBR files with him.

I have noticed that older pre-2.0 versions of Mixxx would sometimes get glitchy (warbly) when using the pitch-correction feature (key lock, I think, is the name in the UI), but I don't think that's related to VBR vs CBR, though I never thought to check.

The only time I've noticed an audible sound quality problem when DJing was with low bit rate (usually 128kbps) files.


Mixxx developer here -- can confirm our VBR decoding code is slightly less reliable than CBR! It's easier when you know exactly how many samples you're going to get out of a frame. We rewrote a bunch of this in 2.1 (coming in Q1 2016) so hopefully a few odd VBR quirks we had got fixed.

The warbling definitely sounds like our pre-2.0 time stretching algorithm to me. The new one in 2.0 is much better.


I love Mixxx! The 2.0 beta is really fantastic. It's so much better and more reliable than 1.11 that I'm confused why y'all haven't done a stable release from that branch (but it's good to hear it's coming). I haven't noticed the warbling since upgrading to the beta, and a number of other big annoyances and crashers have also gone away. The additional capabilities (I upgraded specifically to be able to use the mic/mixer inputs on my controller, which 1.11 can't do at all) and the nicer scratching and such are just great. In short: I see you've got a call out for testers of the RC; but honestly, versions of 2.0 from months ago were already more reliable and capable than 1.11. Anything that delays a release of 2.0 just leaves lower-quality software in the wild for longer than necessary.

Anyway, I really appreciate your work on Mixxx. I don't get to do audio or music work much these days, but I do occasional DJ gigs (mostly for free for non-profit orgs and such, but sometimes they're paying gigs), and it's been a whole lot of fun.


Glad you like it :). We're working on the last-mile stuff right now (updating the manual, website, etc.). It will definitely be out before New Year's!


Not specifically. I play around with Mixxx but use Traktor in the club. It's more about software stability than anything inherent in the sound. I've had a handful of sets go south (which ultimately is enough) due to freezes etc., so I'm wary (unnecessarily so; I'm sure they are fine, but it's more about simplifying variables in arguably a super complex system).


So...um...Apple and Amazon have been shipping VBR files for years.


I am not sure if there's confusion or what, but this is literally discussed in the article, so I am not sure what you're arguing exactly. The article discusses how the VBR files have been encoded with minimum bitrate constraints, for fear that someone will make a big deal out of it if the bitrate dips "too low".


VBR with a minimum bit rate != CBR.

And, to be clear, the article is making a guess that Amazon or Apple are imposing a minimum bit rate to ensure some lower bound on file size. I don't think there is really any solid evidence that Amazon or Apple are making decisions based on trying to make file sizes bigger to convince consumers they're getting "more value". I took exception to the premise, which is why I commented above. I don't believe Apple and Amazon are making decisions based on trying to increase file sizes, and I find it weird that the article suggests they are. I believe they are trying to maximize audio quality at smaller file sizes. The evidence seems to indicate that is what is happening.

And, we've come full circle to the point of my initial comment. I don't think the argument he's making about maintaining large file sizes is backed by evidence or a particularly good understanding of consumer behavior/preference. I do think his guesses about the Netflix algorithm are interesting, but his digression into consumer behavior is less so, IMHO.


Amazon has been selling what appear to be insanely high-bitrate V0 MP3s for years now. Of course, there's no way to know for sure. This is what they claim: "Where possible, we encode our MP3 files using variable bit rates for optimal audio quality and file sizes, aiming at an average of 256 kilobits per second (kbps)". But the bit rates suggest V0 to me.


The quality of recorded music today has much more to do with mastering than bit rates and codecs. I certainly would prefer a 128 kbps version of a twenty-year-old mastering to a 192 kbps version of the modern remaster.

That has led to a consumer revolt of sorts, for people who care about audio. That's not a large enough group to make a difference to the big record companies, but it is large enough that catering to them is a good business model in itself.


The two are unrelated issues.

128kbps mp3 files do sound pretty bad, especially on the high end and during very dynamic parts. It is audible to me, regardless of the quality of the original recording and its mastering.

And, while we're talking, I'll point out that the loudness wars began roughly 20 years ago, so you may need to go back to an even earlier mastering to get the really good stuff. Certainly it was more pronounced through the late 90s and 00s, but I remember the first few records I bought that suffered badly from over-compression, and they were released around 1994-1996. The multi-band compression that has killed many great recordings began to see usage around that time as well, though it peaked much later (it was expensive to start with).


> Has that ever happened?

I'm not sure if it has ever happened with bitrates in audio files, but the megapixel war comes to mind. 36 MP in a point-n-shoot? Must be better than the 12 MP in a Canon 5D DSLR since 36 is way more than 12.


It depends on the reason why you're buying the track. You can find all the great electronic music on iTunes, but the sound quality isn't as good as on Juno or Beatport. DJs shop at stores other than iTunes because the sound quality is a consistent 320 kbps (or WAV) and a track won't sound "bad" just because it's a slightly lower quality played in between two higher quality tracks.

You can tell if you play them next to each other, on an expensive club sound system. Other than that, you probably can't.


A club system is hardly something that I would call high fidelity. Most PA systems are designed for injecting high-SPL music/audio in the frequency range of roughly 40Hz to 10kHz.


Club systems vary wildly. I usually provide my own gear, and it's quite high end (JBL SRX700 series subs, and PRX 625 mains), and is more accurate than most home stereos, while also being dangerously loud. But, I can't hear the difference between a ~200kbps mp3 and a FLAC through that system. I probably couldn't consistently hear the difference on my headphones (also high end) or my studio monitors (again, quite high end), either.

That said, a 128kbps mp3 through a really loud system tends to sound harsh. It is definitely audible, even to untrained ears. But, anything above 192kbps, or so, is fine and any flaws that might be audible in an ideal listening environment are lost in the general noise of the club environment.


I don't think consumers care, full stop. Amazon, Apple and Netflix compete on content, not quality. The quality has been "good enough" for a long time now.


> Consumers don't care, until some listicle website or advertising firm points out that Amazon/Apple short you on your mp3s by not even offering 128kbps all the way through.

A 128kbps threshold is a holdover from the days of MP3s, WinAMP, ripping your CDs, and technically savvy early-adopter consumers. The average consumer is content to stream and doesn't know or care about bitrate, and the phrase "not even 128kbps" would be meaningless to most readers.


> I don't know exactly what bit rate they're streaming at, but, it sounds pretty bad on mobile

Free Pandora, maybe, but Spotify doesn't. Spotify's bitrates are pretty good for most content, and very good if you have a paid account with high bitrates turned on.

Some content is poorly recorded, mixed, and/or mastered which will sound bad at any bitrate.


I checked their site, out of curiosity, because Spotify sounds, um, spotty to me on mobile. And it is at 96kbps on mobile in "normal" mode, which even for Ogg Vorbis (which does sound better at lower bit rates than mp3) is a bit low. But I learned there is a higher-bit-rate mode (160kbps) available; I just need to turn it on, which is cool. It's too bad Spotify isn't as good at predicting my preferences as Pandora.


Indeed - Spotify premium goes up to 320k (Ogg Vorbis). I don't claim to be an expert, but it certainly sounds great to me on my Yamaha DAC and Sony 7506 monitor headphones.


This is exactly what I thought while reading this. I don't think today's consumers care about that stuff. Maybe they did somewhat more when we all had the original iPods or Napster or whatever, but those days are long since past. These days most consumers are back to focusing on picture and audio quality without regard to "geeky" numbers. I'm an engineer who has spent quite a bit of time dealing with video encoding in my career, and to be honest, I've never really bothered looking much into what Netflix, Google Play, iTunes, etc. send my way. That's because the quality is always good enough.


I run a Plex server and share with 12 people. A few months ago I asked everyone to watch the same shows in SD and 720p and say whether the 720p version was worth the extra bandwidth and the drastic reduction in the size of my library. Three people picked the 720p option. But they were the three who watch very little.

Movies were different. I did the same thing but with 720p and 1080p versions of movies. The majority wanted 1080p even if it meant a smaller library.


> I think the success of Spotify and Pandora and Rhapsody are proof that consumers don't care about [absolute] quality

That is, after a certain threshold, quality is no longer a concern. As ears typically discern less fidelity than eyes do, that threshold is higher for video than for audio.

As an aside, this http://test.tidalhifi.com/intro is an interesting little test.


Author seems to think that they'll choose one file as the winner - that's not the case. Each video needs to be available not just in a few bitrates but also in different resolutions, codecs and profiles for broad compatibility with the various player apps many of which aren't updated any more.

While I haven't kept that abreast of video encoding tech, I'm surprised people are saying animated shows will take less bandwidth. Typically, video codecs use DCTs, which are great at roughly approximating complex patterns but do a bad job with solid blocks of color and sharp contrast. The result is ugly artifacts, especially on edges. Perhaps things have greatly improved with H.264.


"Author seems to think that they'll choose one file as the winner"

I suggest reading it again.


"To do that, Netflix will transcode every one of their videos a bazillion times at different resolutions and at different bitrates, finally selecting the smallest one for a particular title that doesn’t suck visually."
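The search the quote describes can be sketched in a few lines of Python. Everything here is hypothetical: the candidate encodes, the quality scores, and the quality floor are made-up stand-ins for whatever trial transcodes and perceptual metric Netflix actually uses.

```python
# Hypothetical per-title search: given trial encodes as
# (bitrate_kbps, quality_score) pairs, pick the cheapest one
# whose quality clears the floor ("doesn't suck visually").

def pick_encode(candidates, quality_floor):
    """candidates: list of (bitrate_kbps, quality_score) per trial encode."""
    acceptable = [c for c in candidates if c[1] >= quality_floor]
    return min(acceptable, key=lambda c: c[0]) if acceptable else None

# Made-up trial encodes of an animated title: quality saturates
# early, so a low bitrate wins.
trials = [(1000, 88), (1750, 93), (2350, 96), (3000, 96.5), (4300, 97)]
print(pick_encode(trials, quality_floor=93))  # -> (1750, 93)
```

The real pipeline would repeat this per resolution and per title; the point is just that the "winner" is the smallest bitrate over a quality threshold, not a single global setting.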


Aren't they also saying that 5800kb/s is not enough for some bits of some movies?

I've definitely noticed this: on full 1080p Blu-ray files, the bitrate swings enormously, from 2mbit/sec up to 50mbit/sec (so much so that my Chromecast's wifi can't keep up in high-action scenes).

So I definitely think they will be able to make improvements to their 1080p content (and definitely their 4k content; I don't think 25mbit/sec CBR is enough).


I'm waiting for something like Daala + Opus to start being used by huge services like Netflix. YouTube already uses Opus.

Apple? It will take them another 50 years to start using free codecs.

UPDATE: A post on IETF blog about standardized free video codec effort: https://www.ietf.org/blog/2015/09/aiming-for-a-standardized-...


I'm somewhat confused: is this a different effort from the royalty-free codec being developed by Google, Amazon, Netflix, Microsoft, Mozilla, Cisco, and Intel under the name 'Alliance for Open Media'?


Regarding the video codec, it's the same thing, except the one in the IETF is the actual engineering group, and AOM is more of an administrative one that probably synchronizes all the bureaucratic stuff (legal as well, I guess).

UPDATE: See here: http://xiphmont.livejournal.com/67752.html


Ah, thanks, makes sense.


"But I suspect that was a problem.

You see, it would probably be difficult to sell those VBR files — some of which were quite a bit lower than 256 Kbps and a few even lower than 128 Kbps — because customers might perceive a loss of value."

The vast majority of customers do not care about the actual Kbps, as long as the sound quality remains above certain standards. Just market the different quality levels at different prices and most people would never think twice about it (and most would choose the cheapest version).


Not only do customers not care, in blind testing young people prefer the sound of MP3 artifacts.

It's this generation's "warm sound".


I've never heard this before. Source?

(I'm very skeptical given that the higher base level of MP3/AAC encoding that the big music stores use these days has eliminated most of the obvious artifacts in lossy music. In fact, I don't think I've heard any artifacts in retail music since the Napster days!)


Jonathan Berger's research is typically cited for this[1] but I haven't found the actual publications.

[1] http://www.audioholics.com/news/kids-prefer-poor-quality-mp3


While I haven't heard of "warm sound" before, I do know that we unconsciously associate 60fps+ video with "home videos" and 24fps with movies and films.


Mp3 artifacts?


That stuff that sounds like tinny tambourines.


I've always thought of it as a "swirling" sound. It's really really bad on SiriusXM, especially with percussion.

(Why the hell am I still paying for XM?)


If enough "audiophiles" repeat the claim that Apple or Amazon has worse audio quality, it could seriously hurt their brand's reputation.


Audiophiles have and will say that regardless, since all of the discussed options are lossy. (I'm not ignoring Apple's lossless codec, just don't think it's relevant for this conversation)


Would it? It might harm their brand's reputation with audiophiles but I don't see the average customer caring.


Isn't Apple + Beats enough to convince people that Apple is not about audio quality?


Isn't the popularity of Beats (not to mention the default iPhone headphones) proof that Good Enough is fine for most people?


Major question about this comment.

> They all have the same server farms. Owned by Amazon, no doubt. And there aren’t any technical hurdles. It’s just more computation.

Does Apple use Amazon's servers? I thought Apple ran its own hardware/data centers. I've definitely heard war stories of Apple towing trucks full of racks into the desert so they could bump their capacity for cheap.


http://www.apple.com/business/docs/iOS_Security_Guide.pdf

Page 41, Second paragraph: "The keys, and the file’s metadata, are stored by Apple in the user’s iCloud account. The encrypted chunks of the file are stored, without any user-identifying information, using third-party storage services, such as Amazon S3 and Windows Azure."


They did about two years ago. http://images.apple.com/iphone/business/docs/iOS_Security_Fe... (emphasis added):

"Each file is broken into chunks and encrypted by iCloud using AES-128 and a key derived from each chunk’s contents that utilizes SHA-256. The keys, and the file’s metadata, are stored by Apple in the user’s iCloud account. The encrypted chunks of the file are stored, without any user-identifying information, using third-party storage services, such as Amazon S3 and Windows Azure."


Apple have some of their own data centres, and they also use lots of other people's. They're rather diversified.


Apple's contemplating building a server farm not terribly far from where I live, so no, I don't think they're using Amazon. I think that comment is supposed to mean that they all have access to big clusters, not that it's the same clusters.


All they need is a half dozen Power Mac supercomputers.

But seriously, could they not be using Amazon until they finish building out their own farm?


There used to be a group of people who cared about bitrate. They wanted 64kbps audio that sounded better than 128kbps MP3, which to this day still isn't possible, be it with AAC, HE-AAC, Vorbis, or the new Opus, despite the hype every time a new codec arrived.

There used to be a group of people who wanted a codec that matched lossless quality at 256 to 320Kbps. Personally, I think MPC (Musepack) accomplished it, and it is patentless as well, since it is based on MP2. But the codec never caught on in the hardware world. Meanwhile, AAC does about as well @256kbps despite being more complex.

That was in the Napster-to-iTunes download era. Then time flew, and both groups of people lost interest, mainly for the same reason: both groups wanted to store as many music downloads as possible. The 1st group didn't mind a little quality loss; the 2nd group wanted near-perfect quality @256Kbps. But HDD prices dropped to the point where the 1st group doesn't mind storing files at ~256kbps, and the 2nd group will simply store them as lossless FLAC.

Then we come to the age of streaming, and that doesn't necessarily mean only Apple Music, Spotify, and the like; the largest music streaming service is probably YouTube. People don't download anymore; they just click and play on YouTube.

It turns out, I think, that we have reached the stage of "good enough", whether it is audio or video. With video, we can get a huge improvement if we smooth out the noise/grain details. Our broadband speeds continue to improve; we will have G.fast and VDSL2, the next generation of DSL broadband tech. Most kids or youngsters of this generation don't care about audio/video quality as much; they'd rather have instant, easy access.


OK, everyone is stuck on the bitrate argument and consumer choice. However, I'm more focused on whether this plan is even possible.

Would Netflix really want to throw so much money at transcoding so many ways? Are there tricks to do this at a reasonable cost, like grabbing a random 10 seconds at 15 points in a movie and trying just that? Working on the top N most popular titles first? Sorting by the biggest existing files?


Doing the transcodes is pretty much a drop in the bucket for Netflix. Even transcoding the same movie 1000x at full length is pretty meaningless to them, since it is a fixed cost against their library size.

What they really care about is things that are multiplied by the number of users they have. Saving 5% of bandwidth on 100k users is very meaningful to them. So doing extra transcodes to figure out what to send is very valuable.

They've come and given talks at cloud dev meetups. Just an unexpected bump in bandwidth, or a delay in server response time causing traffic to back up, is enough to knock over their servers, so they do things no one else would even think of, like having their clients upload code to their servers to batch requests together in the optimal format for the clients, just to reduce bandwidth and load on their network.


I was running a small project that encoded videos with a SaaS in multiple formats/sizes (~10) and then distributed them over a CDN (no major traffic). The CDN price was about 99% of project costs; transcoding and storage on S3 were a fraction of the cost. I was also trying every trick in the book to make files smaller, just to cut down on traffic.


The transcodes are not where the money is spent; it's primarily streaming costs. Every company wants to maximize quality and reduce the cost to stream. Right now everyone wants 4k streams, and the size and cost are becoming ridiculous. It's worth it in the end to use better encoders, as well as to do what Netflix is doing.


Netflix's internal instance spot market means they can pretty easily throw unused EC2 instances at transcoding tasks.


  I would bet money that Amazon ran into this same
  conundrum with the unconstrained VBR mode of the
  LAME MP3 encoder which they use.
LAME has always had a way to set minimum and maximum bounds on the VBR bit rates. I would bet that Amazon has at least one employee who knows this. I used to use this a lot with a hardware player that couldn't handle VBR above 224 kbps.
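For reference, LAME's VBR mode exposes those bounds directly: `-V` picks the VBR quality preset, while `-b` and `-B` set the minimum and maximum bitrates. A small sketch that just assembles such a command line (the file names are placeholders, and this only builds the argument list rather than running LAME):

```python
# Sketch: assemble a LAME invocation with VBR quality -V0 but a
# bitrate floor, so no frame dips below min_kbps. File names are
# placeholders; nothing is executed here.
def lame_vbr_cmd(src, dst, vbr_quality=0, min_kbps=128, max_kbps=320):
    return [
        "lame",
        f"-V{vbr_quality}",   # VBR quality preset (0 = highest quality)
        "-b", str(min_kbps),  # minimum bitrate in VBR mode
        "-B", str(max_kbps),  # maximum bitrate
        src, dst,
    ]

print(" ".join(lame_vbr_cmd("track.wav", "track.mp3")))
# lame -V0 -b 128 -B 320 track.wav track.mp3
```

With a floor like `-b 128`, the encoder never emits a frame below 128kbps, which is exactly the kind of constrained VBR the article speculates Amazon uses.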


Why would Apple et al. follow?

Unless there is a noticeable effect on quality or ease of streaming, consumers won't care.

You have to remember that most people can't or won't tell the difference between Blu-ray and DVD, let alone a bitrate change. More importantly, TVs have lots of silly effects that actively fuck with the picture (sharpening, aspect-ratio stretching, oversaturation, active motion, and other horrid "enhancements").

The only reason Netflix et al. are a thing is the content, not the platform (just look at how shit iTunes is to use). You can make the most wonderful interface in the world, but if there is no content, there is no point.

Streaming does cost money, but that's not the main cost of business. Most cost comes from licensing the content in the first place. (Then paying all your staff to do fancy things)

Seriously, bandwidth is pretty cheap compared to the cost of buying the license to broadcast a top-rated movie (a high-ranking movie is easily a few million dollars; a custom TV series is anywhere between $1 million and $30+ million for a season).


The article seems to focus too much on the consumer side of the bandwidth equation. I think the real win for Netflix is the aggregate egress bandwidth savings from their DCs. If Netflix can halve their bandwidth (as the article seems to claim) without any appreciable loss in quality, they've just saved substantially in the infrastructure and peering contracts needed to deliver their content. I have no idea how much money Netflix currently spends on bandwidth/CDN, but I'd guess it's certainly in the 100s of millions. I can imagine that Amazon and Apple would be very interested in emulating those savings.


One of the big things that drew me to Amazon's platform was their 'ASAP' streaming. It used spare bandwidth to begin streaming all the stuff I might be about to open, and as a result it seemed seamless.

If Netflix can make my film stream at a high resolution faster and buffer less, over time I'll notice the buffering more on other platforms and just use Netflix instead.


I think Apple already does that for the content delivered by iTunes.


It's completely normal for Netflix to work on that, and end consumers won't see a difference.

That's exactly what's been done by illegal release groups (pirated content), which are very picky when it comes to time to release (encoding/sharing) and do their best to encode quickly while maintaining good viewable quality.

When it comes to encoding, even for a library as large as Netflix's, the time to encode is always less than the time/bandwidth saved while sharing/streaming.

As of now, Netflix 1080p content (not re-transcoded) is delivered at a bitrate of 5200-5900kbps, with no difference between content types (animated or live action; here I compared BoJack Horseman to The Ridiculous 6). Meanwhile, many (or even all) high-quality release groups encode animated Blu-rays at around 3000kbps (1080p) (around 5500kbps for very high quality), while live action is encoded at around 11.0Mbps. The same difference applies when the content is capped from TV and then encoded.


This also varies by the target audience.

Stuff that is released via publicly tracked torrents (i.e. for mass-market audiences) often targets around 1.5-2 GB for a feature-length movie in 1080p. In contrast, releases aimed at Usenet (the technical crowd) are often 6-7 GB, and sometimes as large as 11-13 GB for the same movie.
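Those release sizes map directly to average bitrate. A back-of-the-envelope sketch, assuming a typical ~110-minute feature and ignoring audio and container overhead (both assumptions mine, not from the comment):

```python
# Back-of-the-envelope: average bitrate implied by a release size.
# Assumes a ~110-minute feature; audio and container overhead ignored.
def avg_kbps(size_gb, minutes=110):
    bits = size_gb * 8 * 1000**3          # decimal GB -> bits
    return round(bits / (minutes * 60) / 1000)

print(avg_kbps(2))   # mass-market torrent: ~2424 kbps
print(avg_kbps(7))   # Usenet release:      ~8485 kbps
```

So a 2 GB 1080p release averages roughly 2.4Mbps while a 7 GB one averages roughly 8.5Mbps, which lines up with the torrent-vs-Usenet gap described above.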

On the other hand, the situation is much more even for audio. Lossless audio torrents are pretty common even in the torrent world, and due to the typically greater number of torrents available, the overall selection of lossless files is probably at least as large as on Usenet.

I would assume that private trackers tend towards higher-quality releases.


As more video moves to higher resolutions (such as 4K), the amount of bandwidth wasted by inefficient encoding grows dramatically.

While consumers only care if the quality is "acceptable", it's pretty easy to tell the difference between a crisp 4K picture and a 1080p picture, and also easier to see encoding or bitrate artifacts.

So I think this is probably an attempt to improve the margins a bit on content delivery costs without sacrificing quality.

Netflix has also embraced 4K content with its original series, so it is in a unique position to leverage the shift to higher resolution content for maximum profitability.


I see it more as an attempt to offer better quality to people with crappy ISPs as Netflix expands to more countries. Previously, those people would have seen a low-res BoJack, but now that Netflix can decide that a full-HD encode of BoJack fits in a smaller bitrate, they will see a high-res BoJack. It might even be related to the T-Mobile announcement that lets people watch Netflix without eating into their data caps, as long as the bitrate is capped.


1080p Blu-ray looks better than Netflix/Amazon Prime at 4K. All else being equal, bitrate matters: the 1080p quality of Netflix is close to 720p HDTV rips at this point.


"...it's pretty easy to tell the difference between a crisp 4K picture and a 1080p picture..."

Or not: http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupi...


People get really excited about this revolutionary "quality based" encoding, and to me it just sounds like -crf in x264. For trying to hit a constant bitrate, I've heard of people performing a -crf pass, then taking the average bitrate of the result and using that as the target for a bitrate-constrained encode.

With this method, you let the encoder figure out the bitrate for the quality target you want to hit; then you can use that bitrate as the target of a constrained-bitrate encode if you like.
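That probe-then-encode workflow can be sketched as follows. `probe_average_kbps` is a hypothetical stand-in for actually running a `--crf` pass and reading back the output's average bitrate (here faked with a fixed table); the x264 flags themselves (`--bitrate`, `--pass`, `-o`) are real.

```python
# Sketch of the CRF-probe workflow: probe a title's natural bitrate
# with a constant-quality pass, then build a two-pass encode that
# targets that bitrate. probe_average_kbps is a stand-in; the fake
# table below replaces a real x264 --crf run.
def probe_average_kbps(title):
    fake_results = {"bojack_s01e01": 2100, "action_movie": 7400}
    return fake_results[title]

def two_pass_cmds(title, src, out):
    kbps = probe_average_kbps(title)
    common = ["x264", "--bitrate", str(kbps)]
    return [
        common + ["--pass", "1", "-o", "/dev/null", src],
        common + ["--pass", "2", "-o", out, src],
    ]

for cmd in two_pass_cmds("bojack_s01e01", "in.y4m", "out.264"):
    print(" ".join(cmd))
```

The two-pass encode then hits the probed bitrate with better allocation than a single pass, which is roughly what "constant quality first, constant bitrate second" buys you.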


The author's point is weak. Users of Netflix aren't concerned with knowing what bitrate the video is; it just needs to have acceptable quality.


> users of Netflix aren't concerned with knowing what bitrate the video is

That's part of the author's point: if Netflix can halve their bandwidth without users noticing, they will earn massive savings. Apple and Amazon will also want in on those savings (being competitors and all).


That must have been a nuanced point; he put much more emphasis on the strategy of differentiation via bitrate.


Consumers don't understand bitrate for videos. They only care whether they can watch 1080p on their 1080p TV. It doesn't occur to them that resolution is only a minor factor in digital quality.


Let's assume that consumers even know about or care about bit rate. Apple and Amazon could offer two downloads, VBR and CBR.


Really neat stuff. Can't wait to see it in FOSS software so we can save some space in our video collections. :)


OSS encoders (x264, Theora, VP9, Daala, Vorbis) already tend to have a constant-quality rate-control mode that they'd very much prefer you use, unless you have an actual reason to need a specific bitrate. I'm always surprised at how many people try to reinvent it via ABR...

Netflix (and streaming services in general) on the other hand needs known bitrates with known constraints so their bandwidth estimation can work correctly without hiccups. Your personal video collection does not.


So, it only affects bandwidth and not storage space?


No - the point is that true constant quality has (almost) no constraints on local bitrate, so one section of a movie might be twenty times the bitrate of another section. Online streaming services continually estimate the current available bandwidth, then choose from a selection of pre-encoded streams to download. For this to work well, the selection must know the maximum local bitrate of each stream to match the estimation. If this local maximum isn't known or constrained, you get buffering because you're attempting to download a stream that's actually currently twenty times more than the available bandwidth.

Whereas your personal video collection probably isn't being streamed over any link slower than several hundred MBit/s, which is more than enough for anything short of intermediate codec bitrates, plus significant buffering doesn't count against any data caps.
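The selection step described above is easy to sketch: given a ladder of pre-encoded streams with known peak bitrates, pick the best rung that fits the current bandwidth estimate. The ladder values and safety margin here are made up for illustration:

```python
# Sketch of adaptive stream selection: choose the highest-quality
# rung of a pre-encoded ladder whose known peak bitrate fits the
# current bandwidth estimate. Ladder values are made up.
LADDER = [  # (label, peak_kbps), sorted ascending
    ("240p",   400),
    ("480p",  1200),
    ("720p",  3000),
    ("1080p", 5800),
]

def choose_stream(estimated_kbps, headroom=0.8):
    budget = estimated_kbps * headroom  # margin for estimate error
    fitting = [s for s in LADDER if s[1] <= budget]
    return fitting[-1] if fitting else LADDER[0]

print(choose_stream(5000))  # -> ('720p', 3000), since 5800 > 5000 * 0.8
```

This is why each rung's peak bitrate must be known and constrained: if the 1080p stream could spike to twenty times its nominal rate, the budget comparison above would be meaningless and the player would stall.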


Makes sense. Thanks for the detailed explanation.


You can already achieve the same result using custom scripts and settings in x264. Most serious encoding groups have been doing this for years.


Didn't know that. You got a link that makes it easy for people to learn how to do that in VLC player?


Not VLC player, but equally good: https://handbrake.fr/


But VLC can play the more clever encoding it generates, right?


VLC can play anything!

More seriously: the bitrate does not affect a player's compatibility with the container or video/audio format. For instance, if VLC can play H.263, it doesn't matter whether you use Handbrake to output 5100kbps or 100kbps.


Higher bitrates need more CPU to decode, but it doesn't really matter. You'd run out of wifi bandwidth in your house streaming Blu-ray images, though.


Sweet! I'll try it out at some point then.


I'm not sure about that. You'd need to research or test to determine that.


Aight. Thanks for the link anyway.


Love the closing sentence from this article:

I don’t know. It’s hard to predict because consumers… well… we’re fucking stupid.


Are they just using H.265/HEVC?



