
Most likely you're watching in 720p.

In Chrome, Firefox and Opera you're getting 720p max.

To get 1080p you need to be watching in Internet Explorer, Safari or on a Chromebook.

To get 4k you need to be watching in Edge and have a 7xxx or 8xxx Intel CPU and HDCP 2.2 enabled display.

Source: https://help.netflix.com/en/node/23742

If that’s true, that’s pretty unfortunate. YouTube does 1440p@60fps now.

Yep, but YouTube doesn't require full DRM support.

Requiring protocol level DRM to be included in video/music streaming technologies has always baffled me.

What's the point? Even with theoretically perfect protocol level DRM, the consumer eventually has to be able to see/hear the protected content. If the frames of the video are displayed on screen, and the audio played through the speakers, the output can be recorded and preserved, period.

Do the people in charge of making these decisions not realize that whatever convoluted DRM technologies they pay to be developed and implemented will always be defeated by a $30 camcorder from the 90s?

After working in the broadcasting industry for a few years... the answer is that the management of companies in the chain simply doesn't care. DRM absolves them of any blame for piracy, and they happily pile blame on people who push back against DRM. That's the whole story - actual results never enter the reasoning.

I had always considered this the most likely explanation.

The tendency for humans to implement elaborate but ineffective security theaters in an attempt to convince people they're protected from fundamentally unprotectable threats is as old as society itself.

As a consequence, Chrome can't watch Netflix videos at full quality, anybody who flies in the US has to remove their shoes, and my child has to wear a clear backpack to school. I wish we'd stop playing these silly games of pretend which degrade the quality of life of everyone.

> Requiring protocol level DRM to be included in video/music streaming technologies has always baffled me.

> What's the point?

It's the same reason you put a lock on the door to your house. You know the lock is easily bypassed by tools. The windows are easily broken. The door can probably be easily forced open. But you still lock the door when you leave in the morning.

Locks are about keeping casual thieves as honest people. DRM is about keeping casual pirates as honest customers. It's about making it just difficult enough to copy that most people will consider it not worth the bother. It's about saying, "You must be this determined to break the law."

> making it just difficult enough to copy that most people will consider it not worth the bother

I don't think that the analogy holds up. The deterrence only applies to uploaders and not consumers, because the processes for removing DRM and distributing videos are independent. It just takes one person to make a video available to the entire world. It does seem like torrents are available for most of the Netflix catalogue, so I'm skeptical that DRM is useful for popular shows.

The analogy is good, but the reasoning is a bit off. Let me fix it - you still lock your door in the morning because you know that your insurance company will not pay out your theft insurance if anything gets stolen. DRM is mandated all the way up the chain (and no one really cares if it works).

They realize, of course. But that's not the true reason for DRM, which is squashing competition.[1]

[1] https://boingboing.net/2017/09/18/antifeatures-for-all.html

Here's the real rub: the videos have already been pirated at full quality, and are available online for free. They're doing a poor job of defending something that has already been breached.

That all said, camcorders don't get you anywhere: the goal isn't ripping the video, it's ripping the high quality video. DRM can theoretically defend against that, but you'd need to control the whole stack, incl. hardware, incl. the monitor and speakers.

>but you'd need to control the whole stack, incl. hardware, incl. the monitor

Even then, you could tear the monitor apart and grab the LVDS signals to the panel.

DRM works on multiple fronts, causal piracy is one. The other is the control over player production, what features it will have, who can and who cannot make it, etc. Those who can make good deals, will get an advantage.

Someone above linked a help page that says, that for 4K Netflix you need Edge and Intel Kaby Lake or newer. Do you think that it was free for Microsoft or Intel, or some good deal sweetened that?

> causal piracy

Not sure if you mean causal or casual. Casual is going to piratebay and downloading.

I would agree that DRM and other anti-consumer things (unskippable segments on DVDs, adverts accusing you of pirating the DVD you've actually bought, etc.) do cause piracy though.

Recording the analog output yields lower quality than the original stream, and matters less than you might think, for three main reasons:

1. Marketing and psychology: Viewers want to believe they are viewing the original, not a degraded copy.

2. Unfaithful copy: Analog output and analog input introduce errors. LCDs use a variety of tricks to improve resolution, such as spatial and temporal dithering. Also, you can't use a normal camera to record a monitor because of aliasing (of pixels and non-genlocked frame rates).

3. Encoding noise. The encoding of the original is based on the higher quality original, and carefully optimised for the least visual artifacts. Any re-encoding also has to deal with noise introduced by the copying process, and with the noise introduced by the original encoding. This noise noticeably reduces the quality of a copy.

If anyone is wondering, this whole concept is referred to as the analog hole. [1]

[1] https://en.wikipedia.org/wiki/Analog_hole

There are many answers to your question but for starters:

- There is no perfect security. There is a notion of raising the expense of piracy to a level that it effectively does not matter.

- IIRC, for instance, rooted Android loses support for... Widevine? So you can't really use Netflix on a rooted device where you could easily steal frames from the video buffer. Yeah, you can rig up a nice camera system and record analog off the display. Nothing they can do about that. They also may insert watermarks to let them know who recorded it.

I bought an HDMI cloner box the same day Netflix announced they were adding this DRM.

I actually haven't even taken it out of the box yet. But it just feels good to know their DRM is pointless.

They don't care because once the signal hits analog the quality will be much lower.

But why should I, as a viewer, care about that?

Because content providers are making you care about that. They _demand_ that you're prevented from seeing high quality TV if your platform doesn't fully lock you out.

Do netflix limit their own shows?

Yes, they apply full DRM to their own content as well.

The answer is probably "because your content is locked on that provider", but that's a less and less valuable point.

Obviously, BitTorrent was the big reason in the past, but now the reality is that there is a lot of competition in the video space - you aren't just competing with films and tv shows, but youtube videos, twitch streams, etc...

It most likely does for paid offerings. I doubt you'll be able to watch a paid 4K movie on youtube/google movies unless you satisfy their highest widevine requirements. I think youtube's 1080p requirements may be more relaxed than netflix though. Regular yt has no DRM of course.

You can actually watch 8k video off of youtube:


Wow! Even at my monitor's 1024x768, the 8k video looks amazing! So much better than 1080p!

Something something nyquist something

I'm not 100% sure if you tried to be ironic or if you really reported that the video was better in 8k than FHD.

Because actually, it can be.

Although 8k is overkill - 4k would be enough, and 1440p nearly enough - on your old 1024x768 monitor. Typically video encoding does some subsampling on some color components. If you play 4k content on a FHD screen, the quality can be better because you will have no subsampling on your FHD screen, compared to mere FHD encoding (in most cases).

It was a stab at irony.

True, but the video is already subsampled. That's how it was able to be uploaded at 1080p at all, since the source video is 8k. So 8k vs 1080p shouldn't make any difference on monitors less than M-by-1080 resolution.

The video is typically subsampled at encoding at the capture resolution, but it is also subsampled at the other encoding resolutions: subsampling is part of the encoding itself, and the encoding doesn't vary depending on whether the source was downscaled or not.

So video codecs most of the time work with subsampled chroma components. Your encoded 1080p might only be able to render, after decoding, e.g. 540 lines of those components, while with the 4k stream it might be 2160/2 => back to 1080.

Edit: but to be clear, I'm not advocating for people to choose 2x stream and start watching 4k on FHD screens in general, that would be insane. Chroma subsampling is used because the eye is less sensitive to those colors.
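A minimal sketch of the arithmetic behind this point: the plane dimensions below follow the standard chroma subsampling ratios (4:4:4, 4:2:2, 4:2:0), and the function name is my own, just for illustration.

```python
# Luma vs chroma plane sizes under common chroma subsampling schemes.
# Shows why a 4:2:0 4K stream downscaled to a FHD screen can carry
# full-resolution chroma, while a native 4:2:0 1080p stream cannot.

def plane_sizes(width, height, subsampling="4:2:0"):
    """Return (luma, chroma) plane dimensions for a given subsampling mode."""
    # Horizontal and vertical chroma decimation factors per scheme.
    factors = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
    fx, fy = factors[subsampling]
    return (width, height), (width // fx, height // fy)

luma_1080, chroma_1080 = plane_sizes(1920, 1080)  # chroma plane: 960x540
luma_4k, chroma_4k = plane_sizes(3840, 2160)      # chroma plane: 1920x1080
```

So a 4:2:0 4K stream's chroma planes are 1920x1080 - exactly the resolution of a FHD screen - which is the "2160/2 => back to 1080" point above.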

I would be interested in a double blind experiment confirming that their specific implementation of chroma subsampling is even detectable. The eye is much less sensitive to colors than intensity, as you point out. If it were perceptible, I think the codec designers wouldn't feel it was an acceptable tradeoff.

So your encoded 1080p might be able to render after decoding only e.g. 540 lines of those components, while with the 4k stream it might be: 2160/2 => back to 1080.

I'm not sure that's accurate -- whatever downscaling process was used to convert from 8k to 1080p on Google's servers is probably the same process to convert from 8k to 1080p in the youtube player, isn't it? At least perceptually.

I would agree that if they convert from 8k (compressed) to 4k (compressed), then 4k to 1080p (compressed), that would introduce perceptible differences. But in general re-encoding video multiple times is fail, so that would be a bug in the encoding process server side. They should be going from the source material directly to 1080p, which would give the encoder a chance to employ precisely the situation you mention.

Either way, you should totes email me or shoot me a keybase message. It's not every day that I find someone to debate human perceptual differences caused by esoteric encoding minutiae.

It's not just that the eye is less sensitive to chroma.

Although your 4:2:0 subsampled 1080p video only has 960x540 pixels with chroma information, the decoder should be doing chroma upsampling, and unless it's a super simple algorithm it should be doing basic edge detection and fixing the blurry edges chroma subsampling is known to cause. I posit that even with training, without getting very very close to your screen you wouldn't be able to tell if the source material was subsampled 4:2:0, 4:2:2, or 4:4:4.

The truth is that generally people DO subjectively prefer high resolution source material that has been downscaled. Downscaling can clean up aliasing and soften over-sharp edges.

People who watch anime sometimes upscale video beyond their screen size with a neural-network-based algorithm, and then downscale to their screen size, in order to achieve subjectively better image quality. This is even considering that almost all 1080p anime is produced in 720p and then upscaled in post-processing!

It will make a difference if the encoding compression is different. Not all 1080p streams are equal. A 1080p FHD Blu-ray is around 30 Mbps. I've read that 20 Mbps H.264 is almost indistinguishable from a 30 Mbps Blu-ray. In my own tests using some Star Wars Blu-rays, 10 Mbps looks pretty good compared to the Blu-ray. On YouTube I've seen anywhere from 2-4 Mbps used for 1080p and 7+ Mbps used for 4k.

A 4k or 8k stream coming into your computer at 10+ Mbps and being downsampled to 1080p can easily contain more information than a lower quality 1080p stream coming into your computer at 4 Mbps, even after downsampling.

Even ignoring chroma, video compressed at streaming bitrates is nowhere near the Nyquist limit for a given resolution.

In addition, YouTube generally encodes 4k at like 5-6x the bitrate of their 1080p encodes (codec for codec), rather than merely 3-4x higher which would be closer to the same quality per pixel.

So yeah, YouTube's 4k is better on a 1080p screen than their 1080p stream.
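A quick bits-per-pixel comparison illustrates the claim. The bitrates are rough figures in line with the numbers quoted above (a few Mbps at 1080p, several times that at 4K), not official YouTube specs, and 30 fps is assumed:

```python
# Back-of-the-envelope "quality per pixel" comparison of two streams.
# If the 4K stream gets 5-6x the bitrate for 4x the pixels, it has more
# bits per pixel - so downscaled to 1080p it can look better than the
# native 1080p stream.

def bits_per_pixel(bitrate_bps, width, height, fps=30):
    """Average encoded bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

bpp_1080 = bits_per_pixel(3_000_000, 1920, 1080)    # ~0.048 bits/pixel
bpp_4k = bits_per_pixel(16_000_000, 3840, 2160)     # ~0.064 bits/pixel
```

Under these assumed numbers the 4K stream spends roughly a third more bits on each pixel, before you even account for the downscaling cleanup mentioned above.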

There is already an 8k display on the market.

SMPTE recommend a viewing angle of 30 degrees, which matches THX (26-36).

Assuming you have 20/20 vision, you won't be able to tell the difference between 4K and higher unless your screen fills more than about 40 degrees, in which case you are losing detail at the edges.

An 8K monitor on your desk may make sense -- if you're say 3' away from it and it's say 60", you'll start noticing a difference between 4k and 8k, however you will be focused on one area of the screen, rather than the entire screen.

Even with 4K, for most people watching television/films the main (only) benefits are HDR and increased temporal resolution (60p vs 60i/30p)

All the 8K stuff I've seen comes with 22.2 sound to help direct your vision to the area of the screen wanted. It certainly has applications in room sized screens where there are multiple events going on, and you can choose what to focus on (sport for example).

If you were to buy a 32" 8K screen - say the UP3218K, about 28" wide, to get the benefit of going above 4K you would need to be sat within about 30 inches. At 30" you would have the screen filling about 50 degrees of vision. Even an immersive film should only be 40 degrees.
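The geometry behind those numbers can be sketched quickly. This assumes the usual rule of thumb that 20/20 vision resolves about 1 arcminute; the function names are mine:

```python
import math

def viewing_angle_deg(screen_width_in, distance_in):
    """Horizontal angle (degrees) the screen subtends at a given distance."""
    return math.degrees(2 * math.atan((screen_width_in / 2) / distance_in))

def max_useful_distance_in(screen_width_in, horizontal_pixels):
    """Distance beyond which a 1-arcminute eye can't resolve single pixels."""
    pixel_pitch = screen_width_in / horizontal_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / (2 * math.tan(one_arcmin / 2))

# A ~28"-wide screen (like the UP3218K) viewed from 30 inches:
angle = viewing_angle_deg(28, 30)            # ~50 degrees, as stated above

# Distance at which 4K-sized pixels (3840 across 28") become unresolvable,
# i.e. roughly where going above 4K stops helping: about 25 inches.
limit = max_useful_distance_in(28, 3840)
```

The computed ~25" limit is in the same ballpark as the "within about 30 inches" figure above; the exact crossover depends on the acuity threshold you assume.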

It's even more complicated in my experience. You get up to 1080p in Edge compared to only up to 720p in Chrome or Firefox, but you still don't get 1080p guaranteed. For me it varied depending on the specific movie or TV series, and I often still didn't get 1080p in Edge, and even only 480p sometimes (as confirmed with the debug overlay). Only switching to the Netflix Windows 10 App and later to a Smart TV actually fixed this and gave me consistent 1080p content.

Do they offer any reasons to these limitations?

In particular, is it due to DRM requirements, or pure performance? I suspect it's the former.

It's 100% a DRM thing. Desktops are pretty much the most powerful platform to watch Netflix on yet the most restricted in terms of available video quality due to their open nature.

For example, my desktop with an i7-2600k (that's a Sandy Bridge CPU from 2011) has zero issues playing 4K60 VP9 footage on YouTube in Chrome with CPU decoding, yet on Netflix with the same Chrome I'm arbitrarily restricted to 720p H.264 video.

It's DRM. The Widevine configuration they use means they are decrypting and decoding in software when you use Chrome or Firefox. When you use Edge you use a different DRM scheme that allows decrypting, decoding and rendering in hardware, so Netflix offered content up to 4K in Edge with recent Intel CPUs. (Last time I checked, Ryzen had only just come out with no onboard GPU, but support for recent Nvidia GPUs was promised; it's been a while, so the landscape may well have changed.) If you didn't have the latest Intel CPU it fell back to an older version of PlayReady (pretty sure that's the brand name of MS's DRM - on my phone and a bit lazy to look it up) that still supported 1080p.

See, in Widevine there are a number of "levels", the highest being when it can decrypt, decode and push to the frame buffer all in a secure zone. This cannot be achieved (at the moment, or at least at the time of my research into the matter) with Widevine on desktop, so in such a setup Widevine will only decrypt up to 720p content.

When running on Android and ARM this is possible and you can get 1080p, which is why cheap Android-based TV sticks (even the old Amazon Fire TV sticks) supported 1080p but your gaming rig and Chrome could not.

I don't work for Widevine, Google, Netflix or anyone else for that matter. Just a nerd with too much time on my hands, so I look into this stuff. Any corrections welcomed :-D

No longer able to edit so replying to myself: taken from another post of mine about Widevine 7 months ago where we were discussing why the Raspberry Pi couldn't support 1080p Netflix (https://news.ycombinator.com/item?id=15594460 and a link to the comment chain to make it easier for anyone reading - https://news.ycombinator.com/item?id=15586844)

> As far as I understand it there are 3 security levels to widevine Level1 being the highest and 3 being the lowest.

> Level 1 is where the decrypt and decode are all done within a trusted execution environment (As far as I understand it Google work with chipset vendors such as broadcom, qualcomm, etc to implement this) and then sent directly to the screen.

> Level 2 is where widevine decrypts the content within the TEE and passes the content back to the application for decoding which could then be decoded with hardware or software.

> Level 3 (I believe) is where widevine decrypts and decodes the content within the lib itself (it can use a hardware cryptographic engine but the rpi doesn't have one).

> Android/ChromeOS support either Level1 or Level3 depending on the hardware and Chrome on desktops only seems to support Level 3. Kodi is using the browser implementation (at least when kodi is not running on Android) of widevine which seems to only support Level 3 (So decrypt & decode in software) and therefore can not support hardware decoding. But that doesn't mean that hardware decoding of widevine protected content can not be supported on any mobile SoC. Sorry if I gave that impression.

> When a license for media is requested the security level it will be decrypted/decoded with is also sent and the returned license will restrict the widevine CDM to that security level.

> I believe Netflix only supports Level 1 and Level 3. That is why for a while the max resolution you could get watching Netflix in a desktop browser with Chrome was 720p - I believe that was the max resolution Netflix offered at Level 3 - and we had to use Edge/IE (iirc) to watch at 1080p, as it used a different DRM system (PlayReady). It's also why desktop 4K Netflix is currently only supported on Edge using (iirc) Intel gen7+ processors and Nvidia Pascal GPUs. (I don't know if AMD supports PlayReady 3.0 on their GPUs as I don't have one, so I haven't really had the desire to investigate; I'm guessing that current Ryzen CPUs do not, as they currently don't have integrated GPUs.)
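The level descriptions above can be condensed into a small table. This is an informal summary of the thread's claims, not official Widevine documentation, and the resolution caps are the ones reported here:

```python
# Widevine security levels as summarized in this thread:
# where decrypt/decode happen, and the max resolution Netflix
# reportedly serves at each level.

WIDEVINE_LEVELS = {
    1: {"decrypt": "TEE", "decode": "TEE",
        "netflix_max": "4K (with further CPU/GPU/HDCP requirements)"},
    2: {"decrypt": "TEE", "decode": "application (hw or sw)",
        "netflix_max": "reportedly not used by Netflix"},
    3: {"decrypt": "software (CDM lib)", "decode": "software",
        "netflix_max": "720p"},
}

def netflix_max_resolution(level):
    """Reported max Netflix resolution for a given Widevine level."""
    return WIDEVINE_LEVELS[level]["netflix_max"]
```

Desktop Chrome/Firefox only reaching Level 3 is then exactly the 720p cap the thread started with.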

I'd wager the 720/1080 split is due to the former being their limit for browsers doing software decode. 4K being restricted to Edge sounds like Microsoft is the only one supporting the required DRM (PlayReady with HDCP).

It could also be a lot worse than 720p. At least for me, most bigger (not netflix-original) titles are limited to 480p with maximum bitrates even below 1000 kbps when using a "partially" supported browser like Chrome.

But even using Edge is not a silver bullet for all content, as some seems to be limited to that low bitrate 480p on all browsers, even if higher quality is available on a TV app.

Control+Alt+Shift+S and override the bitrate used.

Yes, I'm using that to see what bitrates are available. For the mentioned content, they range from very low to low (up to around 1000 kbps).

>In Chrome, Firefox and Opera you're getting 720p max.

You can actually get some content in 1080p in Chrome and Firefox with a browser extension. It is somewhat unreliable however and some videos still get capped at 720p.

Chrome extension (with explanation of how it works): https://github.com/truedread/netflix-1080p

Firefox extension (unfortunately doesn't seem to work at the moment): https://github.com/vladikoff/netflix-1080p-firefox

Do note that it's possible to watch in 1080p with this addon.[1] 4K is probably not possible to spoof.

1: https://addons.mozilla.org/en-US/firefox/addon/force-1080p-n...

Wow that is ridiculous. I thought Netflix owned most of their content and wouldn't need to kowtow to ridiculous DRM demands like this? Or is this their own doing?

Even then, 720p from a torrent looks much better than what I get from Netflix. And I've tried Chrome, Edge and even the Win10 UWP app.

What source is your torrent from?

A WEB-DL is different from a (1080i) HDTV capture is different from a Blu Ray rip.

Netflix are optimising for bandwidth over quality - hell, the audio still seems to be 96 kbit AAC.
