Frankly--as someone at the front line of all of this stuff--none of these DMCA related arguments have much "practical" importance in the medium-to-long term :/. (This is not at all to say it is a waste of time or anything: the right to examine products and discuss what you learn is really really important and is the basis of security research, so these lawsuits are extremely important and this DMCA law--which likely will only ever die due to first amendment issues because it is locked in via international treaty--is extremely harmful.)
The reality is that tech companies are winning this war, technologically, because asymmetric cryptography works--so there isn't some secret key embedded in the devices that you can merely extract and use--and the industry is getting better about not causing regressions. The big battle over unlocking cellphones from 5-10 years ago? That was nothing more than a gambit (which worked, btw) to change the process rules, because no one is actually using technological techniques to supply unlock codes anyway (it is now all via, shall we say... "light corporate espionage" ;P).
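To make concrete why asymmetric cryptography closes this door, here is a toy illustration (textbook RSA with tiny numbers, purely for exposition and not remotely secure): the device ships with only the public half of the key pair, so there is no secret embedded in it that extraction could recover.

```python
# Toy RSA signing (classic small-number example; a real device would use
# something like RSA-2048 or Ed25519 in a secure boot ROM).

# Key pair the manufacturer generates; the private half never leaves them.
p, q = 61, 53
n = p * q   # modulus -- shipped on the device
e = 17      # public exponent -- shipped on the device
d = 2753    # private exponent -- stays on the manufacturer's signing server

def sign(message_hash: int) -> int:
    """Manufacturer-side: requires the private exponent d."""
    return pow(message_hash, d, n)

def device_verify(message_hash: int, signature: int) -> bool:
    """Device-side: requires only the public values (e, n).
    Dumping the entire device firmware yields nothing that lets you forge."""
    return pow(signature, e, n) == message_hash

h = 1234 % n
sig = sign(h)
assert device_verify(h, sig)          # manufacturer-signed image boots
assert not device_verify(h, sig + 1)  # tampered image is rejected
```

The asymmetry is the whole point: everything stored on the device can be public, so "extract the key and use it" stops being a viable attack.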
In a world where I am allowed to make a jailbreak, that doesn't mean I can make a jailbreak, as we can see in the war with Apple: the reason there aren't jailbreaks for modern iPhones right now isn't because of DMCA fears. We live in an awkward time where bugs can be features :(. The real "practical" changes are thereby going to have to come from proactive regulation that prohibits certain forms of protection or corporate behavior, not attempts to defend the rights of users to publish what they can find and build. I would more watch Right to Repair, or the various antitrust lawsuits (including my related Apple one).
(Note: I know I didn't directly tackle whether anything anyone is doing might help you back up your streaming licensed content, but I also appreciate you didn't directly ask me anyway ;P. I have not heard of such, but I don't pay as much attention to that kind of content. The issue in Green and Huang's cases, though, is about being able to publish information and tools that might help you rip such content, and the reality is you can likely already do that; to the extent that people start only letting locked-down devices with encrypted cables access content, I think you will see companies win that in the next 5-10 years, and the legality of publishing a workaround won't matter, as there won't be any workarounds to publish. And whether it is legal to do that backup for content no one even pretended you can "buy" seems orthogonal to the efforts I have seen or been involved with.)
> The real "practical" changes are thereby going to have to come from proactive regulation that prohibits certain forms of protection or corporate behavior
Indeed. Of course, trying to regulate crypto will bring everyone who cares about (individual) privacy and freedom out in opposition, and forcing companies not to tie functionality to their private keys, or forcing them to make those keys public, would also make you an enemy of the "security" industry.
> We live in an awkward time where bugs can be features
As the saying goes, "insecurity is freedom". We only got into this position because of corporate interests that oppose those of the user.
> Of course, trying to regulate crypto will bring everyone who cares about (individual) privacy and freedom out in opposition, and forcing companies not to tie functionality to their private keys, or forcing them to make those keys public, would also make you an enemy of the "security" industry.
This seems easy to solve. Just mandate that the end-user/purchaser of the hardware have the same level of control over code execution as the manufacturer does after the sale. Whatever the manufacturer can do to your own unit after it's in your possession, you should be able to do as well.
No need to get into restricting crypto. After the aforementioned regulations are passed, it will be up to the companies to figure out how to implement them.
>Just mandate that the end-user/purchaser of the hardware have the same level of control over code execution as the manufacturer does after the sale. Whatever the manufacturer can do to your own unit after it's in your possession, you should be able to do as well.
I don't see how this changes much, except to force Apple or any other hardware manufacturer to leave privilege escalation bugs in their software for the lifetime of the device.
Also, updates are generally user-initiated. If the unadvertised "functionality" of unauthorized code execution is patched, the user has only himself to blame, not the manufacturer.
The point of the regulation would be to force manufacturers to officially give users the same level of code-execution control over sold devices that the manufacturers themselves retain.
If Apple, via software updates, can control what software runs on a user's iPhone, then the user should also have that control: the ability to install whatever OS/bootloader they desire, the same way Apple can.
You're shifting goalposts. You were arguing for the same level of code execution after sale of the device, not at point of compilation. By the time every iPhone leaves the factories in China and India, they're already locked down. The consumer only has as much ability to execute code as any bugs allow in the initial OS release.
Also, Apple doesn't have the ability to install just any OS/bootloader, as the hardware is specifically tailored to one OS. Even if you had an Apple-sanctioned root mode, it's likely the hardware won't run AOSP images or an Android-compatible bootloader, as it's only guaranteed to work with iOS. The same can be said of game consoles.
The code execution is locked down with a private key that only Apple has.
Apple can therefore sign any executable for any iDevice that exists, and it will run without issues. They could make a completely new bootloader/OS combo from scratch while maintaining compatibility with the hardware.
The bar is then: "Is it technically and officially possible for Apple to install any OS/bootloader that's compatible with the iPhone hardware?" The answer to that is yes, it is.
So, if it is possible for Apple to do so (by them having the private key used to sign the OS images) even after they sell it to me, then it should be legally mandated that I have the same level of official possibility to do the same via the same means.
> The code execution is locked down with a private key that only Apple has.
And digital signatures for the various drivers that only work in iOS.
> Apple can therefore sign any executable for any iDevice that exists, and it will run without issues. They could make a completely new bootloader/OS combo from scratch while maintaining compatibility with the hardware.
That Apple can does not mean it should be compelled to. In addition, building an entirely new OS is separate from the issue of being regulated to provide equivalent access after sale.
> The bar is then: "Is it technically and officially possible for Apple to install any OS/bootloader that's compatible with the iPhone hardware?" The answer to that is yes, it is.
Then this is no longer about consumers having the right to do as they wish with a device they have purchased. Instead, this is about compelling Apple to provide protected information or forcing the company to design an open OS. This runs afoul of many constitutional protections, not the least of which is compelled speech.
> So, if it is possible for Apple to do so (by them having the private key used to sign the OS images) even after they sell it to me, then it should be legally mandated that I have the same level of official possibility to do the same via the same means.
That they have a private key does not mean they're obligated to share it, nor should they. I fret about the precedent this would set. Arguing for control of your own device, which you can root yourself, is not the same as forcing a manufacturer to allow arbitrary access to the OS at first swipe.
Apple shouldn't be the one doing the signing for every user or giving out any private keys of theirs. What should happen is that they should be forced to design their devices in such a way as to allow an authorized user to change the public key used for signature verification.
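One hypothetical way such a design could work (the names here are purely illustrative and not any real vendor's API; this is a sketch of a user-settable root of trust, assuming a device that stores only a hash of the trusted signing key): an owner with physical access replaces that hash, and no OEM private key is ever shared.

```python
import hashlib

class Device:
    """Toy model of a boot ROM with an owner-changeable root of trust."""

    def __init__(self, oem_pubkey: bytes):
        # At the factory, the trusted key hash is the OEM's.
        self.trusted_key_hash = hashlib.sha256(oem_pubkey).digest()

    def enroll_owner_key(self, new_pubkey: bytes, physical_presence: bool):
        # The regulation-style requirement: the owner, with physical access,
        # can replace the root of trust -- no OEM private key involved.
        if not physical_presence:
            raise PermissionError("owner key change requires physical presence")
        self.trusted_key_hash = hashlib.sha256(new_pubkey).digest()

    def will_boot(self, image_signer_pubkey: bytes) -> bool:
        # Boot ROM accepts only images signed by the currently trusted key.
        return hashlib.sha256(image_signer_pubkey).digest() == self.trusted_key_hash

d = Device(oem_pubkey=b"OEM-KEY")
assert d.will_boot(b"OEM-KEY")        # stock OS boots
assert not d.will_boot(b"OWNER-KEY")  # third-party OS rejected

d.enroll_owner_key(b"OWNER-KEY", physical_presence=True)
assert d.will_boot(b"OWNER-KEY")      # owner's OS now boots
assert not d.will_boot(b"OEM-KEY")    # and the OEM key no longer does
```

The physical-presence gate is one plausible guard against remote attackers silently swapping the root of trust; real designs (e.g., user-settable verified boot on some Android devices) layer more protections on top, but the principle is the same.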
All this is a digression, however. The point is that the law should simply mandate that manufacturers design their devices such that whatever the OEM can do to an already-sold device, in terms of code execution and control, is also possible for the new owner to do.
I am not entirely pessimistic about the landscape of device-level exploits, particularly as low-level hardware glitching attacks seem to be the next research frontier while reliable, persistent software exploits get harder and harder to find.
Worth noting that certain people probably have the means to dump HD Netflix streaming content but keep those methods private, presumably because Netflix can change things around to break them. So the cat-and-mouse game continues.
>the right to examine products and discuss what you learn is really really important and is the basis of security research
I hate to push back on you (of all people) on this, but no, it's actually not. You are talking too much with your hacker hat on and not enough with your legal hat on. There is quite a difference between accredited security firms doing responsible security research and random unaffiliated parties, with shifting and conflicting motivations, doing "security research". That angle is a losing one, and not because these companies did anything: it is a fallacious and bad meme that has propagated around hacker circles forever, seemingly for no other reason than that it is fun to think about.
In my opinion, if you follow that "right" to its legal and technical conclusion, you will end up with the "right" of corporate security firms to do research. That's it. I don't mean to be all doom and gloom though. You are right that the idea of "right to repair" is a much broader thing that makes a much more compelling case for any kind of consumer protection angle.
I'm not sure what you are arguing :(... is your premise that it is sufficient for only "accredited" (is that even a thing? I didn't know that was a thing) security research firms that are, I guess, hired by the company that is selling the product for the world to be safe? As that definitely doesn't seem to be true in practice, and puts a LOT of power in some extremely biased hands :(. It also doesn't, from my understanding, match the intention of the laws either... the weakest part of the Green case (which is what was referenced and which is almost annoyingly-narrowly about security research being published in book form, and so sidesteps any confusion we might be having here with respect to my personal agendas that involve "jailbreaking")--as far as I can tell, as a non-lawyer who spends way too much time talking to the lawyers--is that the DOJ actually came out during the hearing to say they don't see anything wrong with the activity in the first place ;P. I'm thereby really confused that you seem to think this is somehow, I guess, illegal currently? Cause like, AFAIK, it isn't: the issue at hand is whether there is a chilling effect being caused by Section 1201's anti-trafficking provisions on someone's first amendment right to explain not only that something is insecure but in exactly what way it is insecure (as I, for example, often do in my post-mortems: see my articles on Optimism or Master Key, etc.) when those exploits happen to affect an "effective" (lol: I hate that wording) technological measure protecting someone's copyright, as, in the US, we tend to be pretty adamant about reserving the right to publish information.