
YouTube no longer supports 4K video playback in Safari - clouddrover
https://9to5mac.com/2017/01/12/youtube%E2%80%A4com-no-longer-supports-4k-video-playback-in-safari/
======
AaronFriel
Asking here because this might have some visibility:

What's the status of hardware acceleration of next-generation video standards
(h.265, VP9, something else)?

It's my understanding that after VP8 was pushed and then superseded rapidly,
hardware manufacturers are now leery of implementing anything other than
h.264.

And as a follow-on: what makes generic GPU hardware and software not
sufficient for hardware acceleration of video decoding? What is it that makes
my GPU stupendously good at pushing pixels and training neural nets and
physics calculations and so many other things, but not better than my CPU at
video decoding? Is there a reason h.265 and such aren't implemented in CUDA?

I don't know much about video encoding, not even enough to be dangerous, so to
speak.

~~~
dogma1138
CUDA has supported VP9 decoding since Kepler, IIRC.

~~~
jsheard
VP9 was added in VDPAU Feature Set F, which was introduced with the later
Maxwell chips (950/960 cards have it but 970/980/980ti cards don't).

~~~
dogma1138
VP9 decoding has been supported since Feature Set E, using a combination of a
native ASIC decoder and a set of shaders running on the GPU. So while it
wasn't as power-efficient as a dedicated PureVideo hardware block, it was
still GPU-accelerated.

~~~
jsheard
I'm curious, do you have a source for that? I know that Feature Set E cards
used a hybrid ASIC/GPU approach for H.265 but I can't find a mention of hybrid
VP9.

~~~
dogma1138
It's done via DXVA (2.0). If you query DX on a Maxwell card (or even Kepler)
you should get something like VP9_VLD_Profile0: DXVA2/D3D11, SD / HD / FHD /
4K. (You can either do it manually, or there is a tool called DXVA Checker or
tester or something along those lines.)

LAV Filters (an open-source implementation of DirectShow/DXVA filters) has
working VP9 support on Maxwell cards for sure. I have a 780 Ti somewhere so I
can check Kepler as well, but IIRC it should work.

That said, this works for many video players that use DXVA or support
external DS filters, e.g. VLC/MPC-HC, or proprietary players like Splash, but
it won't work in a browser. Chrome, for example, doesn't allow external
filters IIRC, so you'll see a pretty big CPU jump. For some reason VP9 1080p
videos seem fine on a Maxwell card (CPU at 2-3%), but at 4K it jumps to
70-80%. I have a feeling that since Maxwell has partial support in the
PureVideo block for HEVC and VP9, it might be able to decode 1080p with it
and fall back to full CPU decoding for 4K; I wouldn't expect CPU decoding of
1080p to be so resource-efficient otherwise.

~~~
dogma1138
Just some more info: this is the DXVA GPU-accelerated decoder
[http://imgur.com/a/9MPsp](http://imgur.com/a/9MPsp)
[http://imgur.com/a/q8SfN](http://imgur.com/a/q8SfN). It basically decodes a
VP9 stream with a shader and outputs it in an image format including NV12,
which means you can use PureVideo for direct output.

This one can work with either a dedicated media player or with Edge. Chrome
does its own thing and I'm not entirely sure what they do, and I have no clue
about Safari/Opera/Firefox, so if someone wants to fill in about those that
would be nifty.

Whilst technically DXVA is a software decoder
[https://en.wikipedia.org/wiki/DirectX_Video_Acceleration](https://en.wikipedia.org/wiki/DirectX_Video_Acceleration),
it does offload the heavy lifting to the GPU via shaders, so you can still
somewhat classify it as "hardware decoding", since software decoders/encoders
are usually defined as CPU-only.

------
TheAceOfHearts
Personally, I've stopped watching video directly on YouTube.

Instead, I installed mpv [0], and since it integrates youtube-dl [1] you can
play any video with the native UI by running:

    mpv "youtube_url"
[0] [https://mpv.io/](https://mpv.io/)

[1] [https://rg3.github.io/youtube-dl/](https://rg3.github.io/youtube-dl/)

~~~
qqii
What is your process for browsing for content, then? Do you still browse the
YouTube website?

~~~
rasz_pl
A small Tampermonkey script replacing all YT embeds with a custom URI, plus a
custom URI handler calling mpv "youtube_url" (or in my case extracting the
direct 720p stream link
[https://xxxx.googlevideo.com/videoplayback..](https://xxxx.googlevideo.com/videoplayback..)
and passing it to SMPlayer).

This lets you watch every single YT clip using the player of your choosing.
The result is smooth video on 10-year-old laptops (1.8 GHz Core 2) where
Flash/browser built-in codecs are barely able to play 480p.
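The handler half of this setup can be sketched in a few lines of Python. This is a hypothetical sketch, not the actual script linked below: the `mpv://` scheme name and the percent-encoding convention are assumptions, the OS-level registration of the URI scheme is omitted, and mpv must be installed and on the PATH.

```python
import subprocess
import sys
from urllib.parse import unquote

def custom_uri_to_youtube_url(uri: str) -> str:
    """Convert a hypothetical mpv:// URI (as produced by the userscript)
    back into the original percent-encoded https YouTube URL."""
    prefix = "mpv://"
    if not uri.startswith(prefix):
        raise ValueError("not an mpv:// URI")
    return unquote(uri[len(prefix):])

def play(uri: str) -> None:
    # mpv hands the https URL to youtube-dl internally, so the native
    # player UI works on any clip the site can serve.
    url = custom_uri_to_youtube_url(uri)
    subprocess.run(["mpv", url], check=True)

# The OS invokes the registered handler with the custom URI as an argument.
if __name__ == "__main__" and len(sys.argv) > 1:
    play(sys.argv[1])
```

Swapping `mpv` for `smplayer` in the `subprocess.run` call gives the SMPlayer variant described above.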

~~~
vinay427
Would you consider publishing the script somewhere? I can't speak for others,
but I would personally be inclined to use it as I'm looking for a YouTube
client alternative.

~~~
rasz_pl
[https://github.com/raszpl/smplayer4YT](https://github.com/raszpl/smplayer4YT)

It's a mess, but ~works :)

------
std_throwaway
Mac users have long known that using Safari extends their battery life. This
change by YouTube probably improves it further.

~~~
BoorishBears
I always hear about Safari being described as "the new IE", but after
switching back to a Macbook I've tried to force myself to stay on Safari as a
default browser and it's pretty great from a user perspective.

It saves an incredible amount of battery life over Chrome (which makes me
that much more wary of Electron apps eating up battery and memory at idle).
Is there any specific architectural decision in Safari that enables those
battery savings?

~~~
greggman
AFAIK the difference is that Safari is not as secure as Chrome. Chrome's
multi-process security comes at a cost, in that pretty much everything a
webpage wants or needs to do has to be shuttled between processes. All
network requests, all disk I/O, all graphics happen in other processes. The
communication overhead between processes is the difference in CPU usage. It's
also why Chrome has 10x fewer code-execution bugs than Safari. Note: Chrome
doesn't have 10x fewer bugs overall; it has the same number of bugs. It's
just that, broken down by category, it has 10x fewer code-execution bugs.

[http://www.cvedetails.com/product/2935/Apple-
Safari.html?ven...](http://www.cvedetails.com/product/2935/Apple-
Safari.html?vendor_id=49)

[http://www.cvedetails.com/product/15031/Google-
Chrome.html?v...](http://www.cvedetails.com/product/15031/Google-
Chrome.html?vendor_id=1224)

I'm only guessing that as Firefox goes multi-process for security reasons,
the same thing will happen: their CPU usage will go up because of the
overhead of cross-process communication, but their code-execution bug
percentage will go down.
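The cost of shuttling every request across a process boundary can be illustrated with a toy broker. This is a minimal Python sketch, not Chrome's actual architecture; the doubling "work" is a stand-in for a real network/disk/graphics request.

```python
import time
from multiprocessing import Pipe, Process

def handle(request: int) -> int:
    # Trivial stand-in for servicing one request.
    return request * 2

def worker(conn):
    # Broker process: services requests arriving over the pipe until
    # it receives the None sentinel.
    while True:
        req = conn.recv()
        if req is None:
            break
        conn.send(handle(req))

def run_in_process(n: int) -> list:
    # Direct calls, no process boundary.
    return [handle(i) for i in range(n)]

def run_cross_process(n: int) -> list:
    # Every request makes a round trip over an OS pipe to another process.
    parent, child = Pipe()
    p = Process(target=worker, args=(child,))
    p.start()
    results = []
    for i in range(n):
        parent.send(i)
        results.append(parent.recv())
    parent.send(None)
    p.join()
    return results

if __name__ == "__main__":
    n = 2000
    t0 = time.perf_counter(); a = run_in_process(n); t1 = time.perf_counter()
    b = run_cross_process(n); t2 = time.perf_counter()
    assert a == b
    print(f"in-process:    {t1 - t0:.4f}s")
    print(f"cross-process: {t2 - t1:.4f}s")  # typically far slower per request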

~~~
LeoNatan25
WebKit2 behaves in the same way with regards to the points mentioned above.
The sandboxed content process(es) are responsible for networking, I/O,
layout, etc. The UI process (normally, the browser process) serves only as a
broker for decision making and the final drawing of the laid-out content.

See
[https://trac.webkit.org/wiki/WebKit2#ProcessArchitecture](https://trac.webkit.org/wiki/WebKit2#ProcessArchitecture)

~~~
greggman
AFAIK WebKit2 does not use a separate process for graphics. Graphics calls go
directly into whatever API is appropriate (CoreGraphics, OpenGL).

Whereas in Chrome that's not the case: all OS/driver-level graphics happen in
the GPU process. That means all data has to be shuttled from the process that
wants to display it to the GPU process. Even video, for example, gets decoded
in a sandboxed process (because there might be exploitable bugs in the
codecs), then that data has to be shuttled to the GPU process so it can be
composited with the page. The compositing directives happen in the render
process (the webpage) but eventually have to be translated into graphics
commands that execute in the GPU process.

None of this is true in WebKit2 AFAIK, and it's one of the many reasons it
has so many more code-execution bugs (15x actually for 2016).

------
nrjdhsbsid
Good. Screw MPEG LA for hampering the development of AV codecs and making free
software using audio/video a pain in the ass for a decade.

------
kartickv
As someone who has to live with a 4 Mbps connection, if VP9 produces better
quality video, I'd want Apple to use it.

I use an iMac, so battery life is not a concern. Even otherwise, I'd take
video quality over battery life.

------
celsoazevedo
Maybe it has something to do with costs? VP8 and VP9 are free, while
distributing h.264 and h.265 content can cost millions in fees per year. It
makes sense to use free codecs instead of h.26x.

~~~
kuschku
> VP8 and VP9 are free

Unless Google ever uses one of your patents, and you have to sue Google over
that.

VP8 and VP9 come with a no-litigation clause that basically requires you to
share all your patents with Google if you use VP8 and VP9 – which is a shame.

~~~
lern_too_spel
No. Here's the license.
[https://www.webmproject.org/license/additional/](https://www.webmproject.org/license/additional/)

You only lose the right to use WebM patents if you file litigation against
any user of WebM (not just Google) over any implementation of WebM. You can
sue Google about anything else and not suffer WebM license consequences.

You're probably thinking of Facebook's open source patent rider.
[https://github.com/facebook/react/blob/master/PATENTS](https://github.com/facebook/react/blob/master/PATENTS)

~~~
kuschku
Indeed, it turns out I was wrong on that. Now I wonder: if I own MPEG
patents, could I sue Google for misuse of them while licensing WebM? There's
quite some overlap between the two.

And yes, Facebook and Tesla’s patent licenses are completely ridiculous.

~~~
lern_too_spel
You can sue Google about misuse of MPEG patents and keep your protections
under the WebM license, so long as you do not sue Google or anybody else about
WebM's patents.

------
shmerl
Good. No one stops Apple from supporting free codecs. Google should have done
something of the sort a long time ago. But they didn't, and proliferated H.264
usage.

~~~
akjainaj
VP9 uses a retarded amount of CPU compared to h264, so I can't blame Apple for
not wanting to implement it.

~~~
phkahler
>> VP9 uses a retarded amount of CPU compared to h264, so I can't blame Apple
for not wanting to implement it.

h264 uses a retarded amount of bandwidth compared to VP9, so I can't blame
Google for not wanting to implement it.

Google is still providing a choice here, except for 4K, where it costs them
twice as much. Apple, on the other hand, has no excuse for not implementing
both. There is also a free low-power hardware implementation of VP9 which
Apple (who make their own SoCs) could choose to use but hasn't.
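The "twice as much" bandwidth point is easy to put in numbers. A rough sketch, where both bitrates are illustrative assumptions rather than YouTube's actual encoding ladder:

```python
def gb_per_hour(bitrate_mbps: float) -> float:
    """Data transferred in GB for one hour of video at a given bitrate."""
    return bitrate_mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB/hour -> GB/hour

# Hypothetical ballpark bitrates for comparable 4K quality.
h264_4k_mbps = 40.0  # assumed H.264 bitrate
vp9_4k_mbps = 20.0   # assumed VP9 bitrate (~2x smaller at similar quality)

print(f"H.264 4K: {gb_per_hour(h264_4k_mbps):.1f} GB/hour")  # 18.0 GB/hour
print(f"VP9   4K: {gb_per_hour(vp9_4k_mbps):.1f} GB/hour")   # 9.0 GB/hour
```

At YouTube's scale, halving the per-hour transfer for every 4K view is the kind of saving that plausibly drives the codec choice.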

~~~
ebbv
> which Apple (who makes their own SoC) could choose to use but hasn't.

They don't make their own SoCs for their laptop and desktop machines, which
is what this primarily affects. 4K on iPhones doesn't matter since they only
have a 2560x1440 resolution.

~~~
LeoNatan25
iPhone 7 Plus, the latest and largest phone, has a 1080x1920 display.

~~~
ebbv
Yeah, you're right. I got mixed up about which of my displays has the same
resolution as my phone. :P

------
callesgg
What a waste of battery power.

Most modern computers have hardware ASICs capable of decoding h.264. It is a
total waste not to use them.

~~~
johnnydoe9
For most people, net speed is a bigger problem than battery drain.

~~~
callesgg
I see your point. In this specific circumstance it would probably not be an
issue, as one could say that 4K video and network-speed issues don't go hand
in hand.

------
drivingmenuts
No big loss for me, no matter how you slice it. I don't watch that much video
on YouTube and I prefer to d/l it with youtube-dl if I'm going to be involved
with it longer than 5 minutes.

