> I discovered that putting a man-in-the-middle proxy between my Apple TV and the world lets me decrypt HTTPS traffic
This surprised me quite a bit because normally that shouldn't work, but then that surprise was exchanged for a different one when I learned further down that you can add CAs to the certificate store of an Apple TV.
Nice and thorough writeup, thanks for sharing. A good carousel through the entire stack involved.
If I had to guess why Apple supports adding certificates, it’s probably to allow Apple TVs to work as AirPlay boxes in corporate/educational environments while playing nice with the IT/device management stuff that entails.
For instance, when I was in college, getting something on the college WiFi either required allow-listing its MAC address or installing a certificate.
Unfortunately Google can trivially block this by checking which CA signed their SSL certificate in the YouTube app. I don’t know if they will - doing so might break YouTube within a lot of corporate environments. But it would be unfortunately easy.
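For illustration, the check itself is cheap: the client just looks at who issued the leaf certificate it was handed. A rough Python sketch of the idea (the expected issuer name is my assumption, not anything taken from the app):

```python
import socket, ssl

def issuer_organization(host: str, port: int = 443) -> str:
    # Complete a normal TLS handshake and read the issuer of the leaf certificate.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            issuer = dict(rdn[0] for rdn in tls.getpeercert()["issuer"])
    return issuer.get("organizationName", "")

org = issuer_organization("www.youtube.com")
if org != "Google Trust Services":   # assumed issuer; a MITM proxy's CA would differ
    print("Unexpected issuer:", org)
```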
Of course Google can do this. And more. They could, if they wanted, embed ads into the video stream itself with no way to distinguish them from the actual video content.
But they do not do it. They have had so much time and so many opportunities to do that over the years. And yet, they did not do it.
I am not going to speculate why. But I suppose it is safe to assume that it is their intention to not do it.
yet. They are moving forward with measures. The YT web page's player.js no longer fetches individual video/audio stream URLs; it fetches a single bundle pre-packaged on the server. It's a POST request now, with only one URL parameter changing, &rn=x, where x increments with every request, and a ~2000-byte binary-encoded body.
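If anyone wants to watch that behaviour themselves with the same kind of MITM setup the article uses, a tiny mitmproxy addon is enough to log those requests (the filter on the rn parameter is just my guess at what's interesting):

```python
# rn_logger.py -- run with: mitmproxy -s rn_logger.py
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # Log POST requests whose query string carries the incrementing rn counter.
    if flow.request.method == "POST" and "rn" in flow.request.query:
        print(flow.request.pretty_url, "body:", len(flow.request.content), "bytes")
```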
Does anyone do that? The average developer likely would not think to do this because it is too computationally intensive to splice things into A/V streams on the fly.
A more clever developer could splice the ad into the video at an I-frame, but then the ad's length needs to be a whole number of GOPs (an I-frame plus the frames that follow it up to the next I-frame). This would also mess with the metadata on the length of the video, which would need to be adjusted in advance. It is doable, but you give up flexibility and your HTTP sessions cease to be stateless. Then there is the need to handle splicing into the audio, and I do not know offhand whether there is a cheap way of doing that at the server like you can with video through I-frame splicing.
It seems to me that they have lower server costs by doing things the current way.
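To make the I-frame constraint above concrete: with stream copy, ffmpeg's segment muxer can only cut at keyframes. A rough sketch (the file names and the 6-second target are made up):

```python
import subprocess

# Split a video into segments without re-encoding; cuts snap to the next I-frame,
# so actual segment lengths are multiples of whole GOPs, not exactly 6 seconds.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c", "copy",
    "-f", "segment",
    "-segment_time", "6",
    "-reset_timestamps", "1",
    "segment_%03d.mp4",
], check=True)
```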
SSAI (server side ad insertion) is not uncommon for premium streaming video; Twitch and Hulu have had the technology in use for years. It's also practically just a checkbox option to enable the feature for all major ad serving tech platforms, including Google's DoubleClick.
They're not using it simply because it increases server and bandwidth costs. YouTube is still positioned as part of Google's "moat", driving down video ad prices so no one else can build an ad empire off video, rather than being a profit-generating division on its own.
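Manifest-level SSAI is mostly playlist editing: the server lists ad segments inside the media playlist, bracketed by discontinuity tags, and the player fetches them like any other part of the video. A toy sketch of the idea (segment names and durations are invented):

```python
AD_SEGMENTS = ["ads/ad0.ts", "ads/ad1.ts"]   # hypothetical ad break

def insert_ad_break(playlist_lines, after_segment, ad_duration=4.0):
    """Splice an ad break into an HLS media playlist after the Nth segment URI."""
    out, seen = [], 0
    for line in playlist_lines:
        out.append(line)
        if line.startswith("#EXTINF"):
            seen += 1
        elif line and not line.startswith("#") and seen == after_segment:
            out.append("#EXT-X-DISCONTINUITY")   # timeline break into the ad
            for seg in AD_SEGMENTS:
                out.append(f"#EXTINF:{ad_duration},")
                out.append(seg)
            out.append("#EXT-X-DISCONTINUITY")   # and back into the content
            seen = -1                            # splice only once
    return out
```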
YouTube does its own re-encoding on upload to different quality levels, so they could theoretically hook that step to make sure there are suitable splice points and record them in the metadata.
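Since they control the encode, recording splice points could be as simple as dumping keyframe timestamps at transcode time; something along the lines of this ffprobe sketch (the file name is just an example):

```python
import subprocess

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "packet=pts_time,flags", "-of", "csv=p=0", "input.mp4"],
    capture_output=True, text=True, check=True,
).stdout

# Keep only packets flagged as keyframes ("K"); these are candidate splice points.
splice_points = [
    float(pts)
    for pts, flags in (line.split(",")[:2] for line in out.splitlines() if "," in line)
    if "K" in flags and pts not in ("", "N/A")
]
print(splice_points[:10])
```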
> your HTTP sessions cease to be stateless
There's already pretty heavy magic around preventing people from simply grabbing all the HLS blocks, I think? All the work that yt-dlp does.
YouTube videos are a stream, not a file you download. I'm not sure what the major technical hurdle is in injecting ads directly into the stream. Also, H.264 typically has key frames a few seconds apart anyway.
I think that it should be sufficient to create content identifiers of all unitary parts of the video, e.g. parts between keyframes, and skip over the ones which are not supposed to be there.
These identifiers could be collected automatically by plugins like SponsorBlock in a community effort and then combined together to identify parts which are common for every viewer, i.e. the ones representing the original video content.
In other words, it seems to me that even putting ads directly into a video stream would not prevent people from being able to block these ads.
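A sketch of what that filter might look like (everything here is hypothetical, including where the community hash set would come from): hash each unitary part and drop anything other viewers haven't reported for that video.

```python
import hashlib

def strip_unknown_parts(parts, community_hashes):
    """Keep only parts whose content hash was reported by other viewers;
    a server-spliced ad would not appear in that shared set."""
    kept = []
    for part in parts:   # part: raw bytes of one unit, e.g. between two keyframes
        if hashlib.sha256(part).hexdigest() in community_hashes:
            kept.append(part)
    return kept
```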
Most devices allow you to add CAs, but almost all apps nowadays use certificate pinning which means the system certificate store is ignored. I find it extremely surprising that YouTube doesn’t do that.
That sounds like you've just made it so your app doesn't work behind a corporate SSL proxy. I really need people to stop rolling their own SSL stores (looking at you, python, java and nodejs). I spend way too much of my time getting things running on my work laptop that should just use the CA store IT pre-installed.
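For the Python case at least, there's a reasonably painless way out these days: the truststore package makes the ssl module defer to the OS certificate store, which is where IT's proxy CA already lives. A minimal sketch (Python 3.10+, `pip install truststore`):

```python
import truststore
truststore.inject_into_ssl()   # ssl now validates against the OS trust store

import requests                # anything built on ssl picks this up
print(requests.get("https://example.com").status_code)
```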
Is that a problem? What segment of Google's Apple TV revenue comes from people behind shitty middleboxes?
YouTube won't work on Chromecast if you're trying to MitM it, so clearly Google doesn't think this situation is worth making an exception for in their logic.
I can't help but wonder if any apps have tried doing TLS-in-TLS, with the outer TLS not caring about MITM, and the inner TLS doing certificate pinning?
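I haven't seen it in the wild either, but the plumbing exists: run the inner handshake through memory BIOs so its records travel inside the outer TLS stream, then pin the inner certificate yourself. A rough, untested sketch (hosts and the expected fingerprint are placeholders):

```python
import hashlib, socket, ssl

OUTER_HOST = "edge.example.com"        # placeholder; a proxy may MITM this layer
INNER_HOST = "api.example.com"         # placeholder inner endpoint
EXPECTED_SHA256 = "..."                # placeholder pin of the inner certificate

# Outer TLS: a normal wrapped socket, validated against whatever store the OS has.
outer = ssl.create_default_context().wrap_socket(
    socket.create_connection((OUTER_HOST, 443)), server_hostname=OUTER_HOST)

# Inner TLS: driven through memory BIOs so its bytes go through the outer socket.
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
inner = ssl.create_default_context().wrap_bio(incoming, outgoing,
                                              server_hostname=INNER_HOST)
while True:
    try:
        inner.do_handshake()
        break
    except ssl.SSLWantReadError:
        outer.sendall(outgoing.read())          # ship inner handshake bytes out
        incoming.write(outer.recv(16384))       # feed the reply back in
outer.sendall(outgoing.read())                  # flush any remaining bytes

# Pin the inner certificate: the outer layer can be intercepted, this one cannot.
fingerprint = hashlib.sha256(inner.getpeercert(binary_form=True)).hexdigest()
assert fingerprint == EXPECTED_SHA256, "inner certificate is not the pinned one"
```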
> but almost all apps nowadays use certificate pinning which means the system certificate store is ignored
Certificate pinning (or rather, public key pinning) is technically obsolete and browsers themselves removed support for it in 2018. [1] Are there many apps still really using this?
HPKP, yes. Certificate pinning in apps is the norm.
The difference between HPKP and certificate pinning is that HPKP can pin certificates on the fly, whereas certificate pinning in apps is done by configuring the HTTPS client in the native application.
Apps like Facebook won't work on TLS MitM setups without using tools like Frida to kill the validation logic.
It's gotten less popular over the years as people keep asking "wait, what are we doing this for again?"; but it's still very popular in certain kinds of apps (anything banking related will almost certainly have it, along with easily broken and bypassed jailbreak detections, etc).
Most personal banking apps I’ve used still do this. Otherwise the bank would be liable for your lost funds if your corporate IT department didn’t secure the MITM solution properly.
(The end customer isn’t liable for the bank’s inability to properly secure their app from MITM attacks…)
I don't have any numbers, but I think this is still pretty common. On iOS for example Alamofire which is a popular network stack, still offers this as a feature. I think the use case is a bit different for apps and web sites, especially for closed ecosystems like Apple's where reverse engineering is not as easy/straightforward.
> I find it extremely surprising that YouTube doesn’t do that.
Not surprising to me - it used to be only banks where it was required (sometimes by law) that any and all communication be intercepted and logged, but this crap (which by definition breaks certificate pinning) is now getting rolled out to even small businesses as part of some cyber-insurance-mandated endpoint/whatever security solution.
And Youtube is obviously of the opinion that while bankers aren't enough of a target market to annoy with certificate pinning breaking their background music, ordinary F500 employees are a significant enough target market.
A regime can now force you to install their "root certificate" (and force organizations under their rule, e.g. national banks, to use a certificate issued by them), and these certificates would also be able to MITM your connection to e.g. Google. (1)
Looking forward to Americans being forced to install the DOGE-CA, X-CA or Truth-CA or whatever...
Ironically enough Android TV (at least version 7.X) does not let you do that, which I found out the hard way when trying to work around untrusted Let's Encrypt certificates.
Starting with Android 7, apps have to opt into user-installed certificates. Browsers often do (Firefox is an annoying exception, you need to turn it on in the dev settings and it doesn't work in the official release version of the browser), but apps usually don't even know that the setting exists.
Aside from that, Android has a very easy certificate pinning API where you can just assign a fingerprint to a domain name in the XML config files and it'll pin a certificate to that domain. Easy to bypass if you modify the APK file, but then you miss out on updates and other mechanisms could check if the signature has been tampered with.
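For reference, the value you drop into that XML config is just the base64 SHA-256 of the certificate's SubjectPublicKeyInfo; a sketch like this computes it (the file name is an example, and this assumes the `cryptography` package):

```python
import base64, hashlib
from cryptography import x509
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# Load the server certificate and hash its SubjectPublicKeyInfo.
cert = x509.load_pem_x509_certificate(open("server.pem", "rb").read())
spki = cert.public_key().public_bytes(Encoding.DER, PublicFormat.SubjectPublicKeyInfo)
print(base64.b64encode(hashlib.sha256(spki).digest()).decode())  # the pin value
```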
With root access (shouldn't be too hard to gain on an Android device still running 7) you can add your certificate to the root certificate folder on the system partition. This will make Let's Encrypt work in all apps. It doesn't bypass certificate pinning, of course, but you don't need that for Let's Encrypt.
I remember people promising a rogue CA would not work anymore due to certificate transparency requiring certificates to be published in order to be valid, but it is quite obvious here that certificate transparency was not even needed. A private CA is different from a rogue CA, but if the private CA was not forced to do certificate transparency, I wonder what is supposedly forcing the public CAs to do it for their certificates to be “valid”.
> I wonder what is supposedly forcing the public CAs to do it for their certificates to be “valid”.
The power of browsers and operating systems including the cert in the default store distributed to everyone. Participating in cert transparency is a requirement.
How is that enforced? It was billed as "certificates are not valid unless published", yet certificates from a private CA are obviously not being published and are still treated as valid.
As I understand it there are various interested parties who monitor to make sure that the default CAs in the root store publish to the certificate transparency logs before they sign anything. A violation would be grounds for immediate removal. None of this applies to a private CA you add to your own store.
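One way to see the difference for yourself: publicly trusted certificates carry embedded SCTs (the proof of CT logging), while a private CA's certificates normally don't. A small sketch using the `cryptography` package:

```python
import socket, ssl
from cryptography import x509

def has_embedded_scts(host: str, port: int = 443) -> bool:
    # Grab the leaf certificate and look for the embedded SCT list extension.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der)
    try:
        cert.extensions.get_extension_for_class(
            x509.PrecertificateSignedCertificateTimestamps)
        return True
    except x509.ExtensionNotFound:
        return False

print(has_embedded_scts("www.google.com"))   # a public CA: expect True
```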