In Chrome, Firefox and Opera you're getting 720p max.
To get 1080p you need to be watching in Internet Explorer, Safari or on a Chromebook.
To get 4k you need to be watching in Edge and have a 7xxx or 8xxx Intel CPU and HDCP 2.2 enabled display.
What's the point? Even with theoretically perfect protocol level DRM, the consumer eventually has to be able to see/hear the protected content. If the frames of the video are displayed on screen, and the audio played through the speakers, the output can be recorded and preserved, period.
Do the people in charge of making these decisions not realize that whatever convoluted DRM technologies they pay to be developed and implemented will always be defeated by a $30 camcorder from the 90s?
The tendency for humans to implement elaborate but ineffective security theater in an attempt to convince people they're protected from fundamentally unprotectable threats is as old as society itself.
As a consequence, Chrome can't play Netflix videos at full quality, anybody who flies in the US has to remove their shoes, and my child has to wear a clear backpack to school. I wish we'd stop playing these silly games of pretend, which degrade the quality of life of everyone.
> What's the point?
It's the same reason you put a lock on the door to your house. You know the lock is easily bypassed by tools. The windows are easily broken. The door can probably be easily forced open. But you still lock the door when you leave in the morning.
Locks are about keeping casual thieves honest. DRM is about keeping casual pirates as honest customers. It's about making it just difficult enough to copy that most people will consider it not worth the bother. It's about saying, "You must be this determined to break the law."
I don't think that the analogy holds up. The deterrence only applies to uploaders and not consumers, because the processes for removing DRM and distributing videos are independent. It just takes one person to make a video available to the entire world. It does seem like torrents are available for most of the Netflix catalogue, so I'm skeptical that DRM is useful for popular shows.
That all said, camcorders don't get you anywhere: the goal isn't ripping the video, it's ripping the high quality video. DRM can theoretically defend against that, but you'd need to control the whole stack, incl. hardware, incl. the monitor and speakers.
Even then, you could tear the monitor apart and grab the LVDS signals going to the panel.
Someone above linked a help page saying that for 4K Netflix you need Edge and an Intel Kaby Lake CPU or newer. Do you think that was free for Microsoft or Intel, or did some good deal sweeten it?
Not sure if you mean causal or casual. Casual is going to piratebay and downloading.
I would agree that DRM and other anti-consumer measures (unskippable segments on DVDs, adverts accusing you of pirating the DVD you've actually bought, etc.) do cause piracy, though.
1. Marketing and psychology: Viewers want to believe they are viewing the original, not a degraded copy.
2. Unfaithful copy: Analog output and analog input introduce errors. LCDs use a variety of tricks to improve resolution, such as spatial and temporal dithering. Also, you can't use a normal camera to record a monitor because of aliasing (of pixels, and of non-genlocked frame rates).
3. Encoding noise. The encoding of the original is based on the higher quality original, and carefully optimised for the least visual artifacts. Any re-encoding also has to deal with noise introduced by the copying process, and with the noise introduced by the original encoding. This noise noticeably reduces the quality of a copy.
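Point 3 can be illustrated with a toy model (purely illustrative numbers, not a real codec): re-quantizing a signal on a grid that doesn't line up with the previous encode's grid accumulates error instead of cancelling it, which is roughly what happens each time a copy is re-encoded.

```python
import numpy as np

# Toy model of generational loss: each "encode" quantizes the signal on a
# slightly different grid (the offset models a copy that has been shifted
# or rescaled), so the errors accumulate instead of cancelling out.
def encode(signal, step, offset):
    return np.round((signal - offset) / step) * step + offset

rng = np.random.default_rng(0)
original = rng.uniform(0, 255, size=10_000)

copy = original
for generation in range(3):
    # Each re-encode quantizes on a grid misaligned with the previous one.
    copy = encode(copy, step=8.0, offset=generation * 2.7)

first_gen = encode(original, step=8.0, offset=0.0)
print(np.abs(first_gen - original).mean())  # error after one encode
print(np.abs(copy - original).mean())       # larger error after three
```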
- There is no perfect security. There is a notion of raising the expense of piracy to a level that it effectively does not matter.
- IIRC, for instance, rooted Android loses support for... Widevine? So you can't really use Netflix on a rooted device where you could easily steal frames from the video buffer. Yeah, you can rig up a nice camera system and record analog off the display. Nothing they can do about that. They also may insert watermarks to let them know who recorded it.
I actually haven't even taken it out of the box yet. But it just feels good to know their DRM is pointless.
Obviously, BitTorrent was the big reason in the past, but now the reality is that there is a lot of competition in the video space - you aren't just competing with films and tv shows, but youtube videos, twitch streams, etc...
Something something nyquist something
Because actually, it can be.
8K is overkill, but 4K will be enough, and even 1440p is nearly OK on an old 1024x768 monitor. Video encoding typically subsamples some colour components, so if you play 4K content on an FHD screen the quality can actually be better: compared to a native FHD encode, you end up (in most cases) with no effective subsampling on your FHD screen.
True, but the video is already subsampled. That's how it was able to be uploaded at 1080p at all, since the source video is 8k. So 8k vs 1080p shouldn't make any difference on monitors less than M-by-1080 resolution.
So video codecs most of the time work with some subsampled chroma components. So your encoded 1080p might be able to render after decoding only e.g. 540 lines of those components, while with the 4k stream it might be: 2160/2 => back to 1080.
Edit: but to be clear, I'm not advocating for people to choose 2x stream and start watching 4k on FHD screens in general, that would be insane. Chroma subsampling is used because the eye is less sensitive to those colors.
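The arithmetic here is simple enough to sketch (assuming 4:2:0 subsampling, the common case for streaming video):

```python
# Chroma plane size under 4:2:0 subsampling: each chroma channel is
# stored at half the luma resolution in both dimensions.
def chroma_plane(width, height):
    return width // 2, height // 2

# A native 1080p 4:2:0 stream carries 960x540 of chroma detail:
print(chroma_plane(1920, 1080))   # (960, 540)

# A 4K 4:2:0 stream carries 1920x1080 of chroma detail -- enough for
# full-resolution colour once it's downscaled to a 1080p display:
print(chroma_plane(3840, 2160))   # (1920, 1080)
```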
> So your encoded 1080p might be able to render after decoding only e.g. 540 lines of those components, while with the 4k stream it might be: 2160/2 => back to 1080.
I'm not sure that's accurate -- whatever downscaling process was used to convert from 8k to 1080p on Google's servers is probably the same process to convert from 8k to 1080p in the youtube player, isn't it? At least perceptually.
I would agree that if they convert from 8k (compressed) to 4k (compressed), then 4k to 1080p (compressed), then that would introduce perceptible differences. But in general reencoding video multiple times is fail, so that would be a bug in the encoding process server side. They should be going from the source material directly to 1080p, which would give the encoder a chance to employ precisely the situation you mention.
Either way, you should totes email me or shoot me a keybase message. It's not every day that I find someone to debate human perceptual differences caused by esoteric encoding minutiae.
Although your 4:2:0 subsampled 1080p video only has 960x540 pixels with chroma information, the decoder should be doing chroma upsampling, and unless it's a super simple algorithm it should be doing basic edge detection and fixing the blurry edges chroma subsampling is known to cause. I posit that even with training, without getting very very close to your screen you wouldn't be able to tell if the source material was subsampled 4:2:0, 4:2:2, or 4:4:4.
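To make the artifact concrete, here's a minimal numpy sketch using nearest-neighbour upsampling only (far cruder than a real decoder's chroma filter): an edge that isn't aligned to the 2x subsampling grid comes back shifted by a pixel after the round trip.

```python
import numpy as np

# Round-trip a tiny chroma channel through 4:2:0-style 2x subsampling and
# a naive nearest-neighbour upsample. An edge that doesn't land on the
# 2x grid gets shifted by a pixel -- the misplaced-colour artifact a
# smarter decoder tries to hide.
chroma = np.zeros((8, 8))
chroma[:, 3:] = 255.0            # sharp colour edge at an odd column

sub = chroma[::2, ::2]                          # keep every other sample
up = sub.repeat(2, axis=0).repeat(2, axis=1)    # nearest-neighbour upsample

print(np.abs(up - chroma).max())  # 255.0 -- the edge column came back wrong
```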
The truth is that generally people DO subjectively prefer high resolution source material that has been downscaled. Downscaling can clean up aliasing and soften over-sharp edges.
People who watch anime sometimes upscale video beyond their screen size with a neural-network-based algorithm, and then downscale to their screen size, in order to achieve subjectively better image quality. This is even considering that almost all 1080p anime is produced in 720p and then upscaled in post-processing!
A 4K or 8K stream is coming into your computer at 10+ Mbps; even after being downsampled to 1080p, it can easily contain more information than a lower-quality 1080p stream coming in at 4 Mbps.
In addition, YouTube generally encodes 4k at like 5-6x the bitrate of their 1080p encodes (codec for codec), rather than merely 3-4x higher which would be closer to the same quality per pixel.
So yeah, YouTube's 4k is better on a 1080p screen than their 1080p stream.
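A rough back-of-the-envelope for the bitrate argument (the Mbps figures are illustrative guesses, not YouTube's actual encoding ladder):

```python
# Back-of-the-envelope bits-per-pixel for streams watched on a 1080p
# screen. The bitrates here are illustrative, not YouTube's real ladder.
def bits_per_pixel(bitrate_mbps, width, height, fps=30):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(bits_per_pixel(4, 1920, 1080))    # native 1080p stream at ~4 Mbps
print(bits_per_pixel(20, 3840, 2160))   # 4K stream at ~20 Mbps

# After downscaling, the 4K stream's 20 Mbps is spent on the same
# 1920x1080 pixels your screen actually shows:
print(bits_per_pixel(20, 1920, 1080))
```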
Assuming you have 20/20 vision, you won't be able to tell the difference between 4K and anything higher unless your screen fills more than about 40 degrees of your vision, in which case you are losing detail at the edges.
An 8K monitor on your desk may make sense -- if you're say 3' away from it and it's say 60", you'll start noticing a difference between 4K and 8K; however, you'll be focused on one area of the screen rather than the entire screen.
Even with 4K, for most people watching television/films the main (only) benefits are HDR and increased temporal resolution (60p vs 60i/30p)
All the 8K stuff I've seen comes with 22.2 sound to help direct your vision to the intended area of the screen. It certainly has applications in room-sized screens where there are multiple events going on and you can choose what to focus on (sport, for example).
If you were to buy a 32" 8K screen - say the UP3218K, about 28" wide, to get the benefit of going above 4K you would need to be sat within about 30 inches. At 30" you would have the screen filling about 50 degrees of vision. Even an immersive film should only be 40 degrees.
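The viewing-distance arithmetic can be sanity-checked with a little geometry, assuming the common rule of thumb that 20/20 vision resolves about 1 arcminute, i.e. roughly 60 pixels per degree:

```python
import math

# How many pixels of a given panel fit into one degree of vision at a
# given viewing distance. Above ~60 px/deg, 20/20 vision stops resolving
# individual pixels (rule-of-thumb assumption, not a hard limit).
def pixels_per_degree(screen_width_in, horizontal_pixels, distance_in):
    pixel_width = screen_width_in / horizontal_pixels
    # Angle one pixel subtends at the eye, in degrees.
    angle = math.degrees(2 * math.atan(pixel_width / (2 * distance_in)))
    return 1 / angle

# The UP3218K mentioned above: roughly 28" wide, 7680 pixels across.
print(pixels_per_degree(28, 7680, 30))  # 8K panel viewed from 30 inches
print(pixels_per_degree(28, 3840, 30))  # same panel if driven at 4K
```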
In particular, is it due to DRM requirements, or pure performance? I suspect it's the former.
For example, my desktop with an i7-2600k (that's a Sandy Bridge CPU from 2011) has zero issues playing 4K60 VP9 footage on YouTube in Chrome with CPU decoding, yet on Netflix with the same Chrome I'm arbitrarily restricted to 720p H.264 video.
See, in Widevine there are a number of “levels”, the highest being when it can decode, decrypt, and push to the frame buffer all within a secure zone. This can not be achieved with Widevine on desktop (at the moment, or at least at the time of my research into the matter), so in such a setup Widevine will only decrypt up to 720p content.
When running on Android and ARM this is possible and you can get 1080p, which is why cheap Android-based TV sticks (even the old Amazon Fire TV sticks) supported 1080p while your gaming rig running Chrome could not.
I don’t work for Widevine, Google, Netflix or anyone else for that matter. Just a nerd with too much time on my hands, so I look into this stuff. Any corrections welcomed :-D
> As far as I understand it there are 3 security levels to Widevine, Level 1 being the highest and Level 3 the lowest.
> Level 1 is where the decrypt and decode are all done within a trusted execution environment (As far as I understand it Google work with chipset vendors such as broadcom, qualcomm, etc to implement this) and then sent directly to the screen.
> Level 2 is where widevine decrypts the content within the TEE and passes the content back to the application for decoding which could then be decoded with hardware or software.
> Level 3 (I believe) is where widevine decrypts and decodes the content within the lib itself (it can use a hardware cryptographic engine but the rpi doesn't have one).
> Android/ChromeOS support either Level 1 or Level 3 depending on the hardware, and Chrome on desktops only seems to support Level 3. Kodi uses the browser implementation of Widevine (at least when Kodi is not running on Android), which seems to only support Level 3 (so decrypt & decode in software) and therefore can not support hardware decoding. But that doesn't mean hardware decoding of Widevine-protected content can't be supported on any mobile SoC. Sorry if I gave that impression.
> When a license for media is requested the security level it will be decrypted/decoded with is also sent and the returned license will restrict the widevine CDM to that security level.
> I believe Netflix only supports Level 1 and Level 3. That's why, for a while, the max resolution you could get watching Netflix in Chrome on a desktop was 720p, as I believe that was the max resolution Netflix offered at Level 3, and we had to use Edge/IE (iirc) to watch at 1080p because they used a different DRM system (PlayReady). It's also why, at the moment, desktop 4K Netflix is only supported on Edge using (iirc) Intel gen7+ processors and NVIDIA Pascal GPUs. (I don't know if AMD supports PlayReady 3.0 on their GPUs, as I don't have one and haven't really had the desire to investigate; I'm guessing current Ryzen CPUs do not, as they don't have integrated GPUs.)
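The license policy described in the quote above amounts to a lookup from security level to resolution cap. The caps below are assumptions pieced together from this thread, not an official Netflix or Widevine table:

```python
# Sketch of the license policy: the server ties the maximum resolution
# to the client's reported Widevine security level. Caps are assumed
# from this discussion, NOT an official table.
ASSUMED_MAX_HEIGHT = {
    1: 2160,  # L1: decrypt + decode inside a TEE, protected output path
    2: 1080,  # L2: decrypt in the TEE, decode outside (a guess; the
              #     thread suggests Netflix doesn't use L2 at all)
    3: 720,   # L3: decrypt and decode entirely in software
}

def max_resolution(security_level):
    return ASSUMED_MAX_HEIGHT[security_level]

print(max_resolution(3))  # desktop Chrome's software CDM -> 720
print(max_resolution(1))  # TEE-backed Android/ChromeOS device -> 2160
```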
But even using Edge is not a silver bullet for all content, as some seems to be limited to that low bitrate 480p on all browsers, even if higher quality is available on a TV app.
You can actually get some content in 1080p in Chrome and Firefox with a browser extension. It is somewhat unreliable however and some videos still get capped at 720p.
Chrome extension (with explanation of how it works): https://github.com/truedread/netflix-1080p
Firefox extension (unfortunately doesn't seem to work at the moment): https://github.com/vladikoff/netflix-1080p-firefox
A WEB-DL is different from a (1080i) HDTV capture is different from a Blu Ray rip.
Netflix are optimising for bandwidth over quality - hell, the audio still seems to be 96 kbit AAC.
On youtube, you can right click > stats for nerds for some similar info.
I tend to find the quality of HD Netflix streams pretty great over my PS4. Certainly never a cause to download a torrent instead.
I believe fast.com is supposed to test against actual Netflix video delivery servers, just to detect this kind of ISP fuckery.
EDIT: Looking at some stuff, it seems like Netflix might "trust" a first-party browser to select the highest-quality stream that it has hardware video decode support for. In comparison, it sounds like there are extensions that enable 1080p in Chrome by pushing it into the list of playlist options, but it can cause a serious performance hit by decoding on the CPU.