
>As we have stated in the past, there is no effective way to weaken encryption for some use cases such as law enforcement while keeping it strong for others.

I've never been fully satisfied by this assertion.

Apple has the ability to push whatever code it wants to whatever device it wants. They can make a version of iOS that bypasses encryption and restrict it to run only on devices identified by a warrant. They could post that build on GitHub and it wouldn't matter, because iPhones would refuse to run the code if somebody other than Apple had modified it to run on arbitrary devices. You would need Apple's internal code-signing keys to actually do any damage.
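
For what it's worth, Apple's update signing is already personalized: the signature covers a device-unique identifier, so a build approved for one phone won't validate on another. A toy sketch of that kind of check in Swift (hypothetical names, not Apple's actual boot code):

    import CryptoKit
    import Foundation

    // Hypothetical sketch: a "personalized" build is valid only if the
    // vendor's signature covers both the build hash and this specific
    // device's ID, so a leaked warrant build is useless on other phones.
    struct SignedBuild {
        let buildHash: Data   // hash of the OS image
        let deviceID: Data    // device the build is pinned to
        let signature: Data   // vendor signature over (buildHash + deviceID)
    }

    func shouldBoot(_ build: SignedBuild,
                    myDeviceID: Data,
                    vendorKey: Curve25519.Signing.PublicKey) -> Bool {
        // Refuse builds pinned to some other device.
        guard build.deviceID == myDeviceID else { return false }
        // Refuse builds not signed over exactly this (build, device) pairing.
        return vendorKey.isValidSignature(build.signature,
                                          for: build.buildHash + build.deviceID)
    }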

Currently, the security of your iPhone is dependent on Apple's internal security. If that gets compromised then your phone can be compromised. In a world where law enforcement can get a warrant to force Apple to unlock a specific iPhone, the security of your phone is still dependent on Apple's internal security. Nothing changes for anyone not targeted by a warrant.

You don't need to give law enforcement a universal backdoor key to enable them to execute warrants on devices. What am I missing?




> They can make a version of iOS that bypasses encryption and restrict it to only run on devices identified by a warrant.

Of course. Apple could absolutely do this, and undermine any trust people have in them in seconds, if they wanted to.

Didn't Australia pass a law requiring corporations to do exactly what you describe? I remember reading news here about several corporations moving out of Australia as a result of the inherent untrustworthiness of any system where the government can compel any party you're doing business with to ship you malware.


In the current world Apple is not actually able to compromise your iPhone, warrant or not, by design (remember the whole FBI debacle?). Encryption is done through separate hardware on the phone that they can’t remotely bypass. In the world you’re proposing they would have to be legally compelled to engineer a weaker system. That’s the problem.


I could be wrong, but I don't remember Apple claiming it was literally impossible for them to open an iPhone. If that were true then Apple wouldn't have had to fight them at all. The government can't force you to do impossible things.

Apple objected to being compelled to create the tooling to bypass an iPhone's security. That implies they can do it whenever they want, they just choose not to.


They could create a build with the unlock attempt limits removed, making it possible to brute-force weaker passcodes. That's what they didn't want to do, because if they created something like that they would lose a lot of customers.


Are unlock attempt limits not built into the security chip?


I wasn't sure about that, so I erred on the side of caution, but I do think that's how it works, yes.
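
For what it's worth, Apple's platform security guide does describe the Secure Enclave enforcing escalating delays between failed passcode attempts. A toy sketch of that kind of counter (hypothetical names and structure, not Apple's firmware):

    import Foundation

    // Toy sketch of a hardware-enforced attempt counter with
    // escalating delays (not Apple's actual Secure Enclave code).
    struct AttemptLimiter {
        private(set) var failedAttempts = 0
        let maxAttempts = 10   // e.g. optionally wipe the device after this

        // Delay (seconds) required before the next attempt is accepted.
        var requiredDelay: TimeInterval {
            switch failedAttempts {
            case 0..<5: return 0
            case 5:     return 60
            case 6:     return 5 * 60
            case 7, 8:  return 15 * 60
            default:    return 60 * 60
            }
        }

        mutating func recordFailure() -> Bool {
            failedAttempts += 1
            return failedAttempts < maxAttempts  // false => lock or wipe
        }
    }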


You already can't sideload Signal on an iPhone. The App Store is a single point of failure that's being ignored by too many.


great idea. while we're at it, we should also require that the police have a key to your house and vehicle, as well as your debit card PIN, your email password, and your bank password.

it's necessary security, so you agree with this, correct?


They effectively have all of those things already. You don't need a key to enter a house or a car, and they can get anything they want from your bank and email provider with a subpoena.

I agree that giving law enforcement a universal key that can defeat all encryption would be a monumentally stupid idea, but it's not actually necessary to enable them to bypass encryption on specific devices. As far as I can tell, privacy activists have just made up the claim that law enforcement wants a universal key because it's easier to argue against that than to argue in favor of their actual position.


They already have the capability to remotely access most modern vehicles; this is built into cars manufactured in the past few years, along with telemetry that tracks everywhere you drive.

Access to bank account details and emails is also routine in police investigations. And they can simply smash your door in if they need access to your house.


To the degree that Apple can do this, it's only because they own both the hardware and the software, and the two are tightly integrated.

You need hardware that can securely hold secrets; you need software to detect tampering and tell the hardware about it; you need software that the hardware trusts to communicate with Apple's servers. Without the whole integrated pipeline from local secrets cache to Apple's servers, protected because it's all owned and managed by the same secret-keeper, what Apple does can't be done.
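
A minimal sketch of the core idea, heavily simplified and with hypothetical names: the hardware releases its sealed secret only to software whose measurement matches what the secret was sealed against.

    import CryptoKit
    import Foundation

    // Hypothetical, heavily simplified: the hardware hands over its secret
    // only if the running software hashes to the approved measurement.
    struct SecretStore {
        let sealedSecret: Data
        let trustedMeasurement: SHA256.Digest  // hash of the approved image

        func unseal(currentSoftwareImage: Data) -> Data? {
            // Tampered software produces a different measurement,
            // so the hardware refuses to release the secret.
            guard SHA256.hash(data: currentSoftwareImage) == trustedMeasurement else {
                return nil
            }
            return sealedSecret
        }
    }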


> What am I missing?

The fact that this is not how security on iPhones works at all.

There are unencrypted partitions of the storage, yes, which means the device can boot into iOS without any user password. But the partitions containing user data are encrypted, and an OS update can at best remove the limit on how fast or how many times you can attempt to unlock the user data (or rather the Secure Enclave, which stores the actual key for the user data).

Sure, if you have a 4-6 digit PIN this is quickly unlocked in the scenario you're talking about, but if you have an alphanumeric password you can make that attack completely infeasible.
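
To put rough numbers on that (assuming the roughly 80 ms per-guess key-derivation time Apple's security documentation describes, and no attempt limits):

    import Foundation

    let secondsPerGuess = 0.08              // ~80 ms per attempt, per Apple's docs
    let sixDigitPIN  = pow(10.0, 6)         // 1,000,000 possible PINs
    let tenCharAlnum = pow(62.0, 10)        // a-z, A-Z, 0-9, length 10

    print(sixDigitPIN  * secondsPerGuess / 3600)                // ~22 hours
    print(tenCharAlnum * secondsPerGuess / (3600 * 24 * 365.0)) // ~2.1 billion years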

Bottom line: there is no way for Apple to make a version of iOS that completely bypasses encryption. Encryption doesn't work like that.

They could, I guess, make a version which silently checks whether its host device's serial number is in a certain list provided by e.g. law enforcement, and then silently removes the SEP encryption IF the user unlocks the device AFTER that software is installed. But they would have to either secretly add this code to the normal iOS releases, which would severely compromise their customers' data en masse, or create a way to push a specific build over the air to a specific device (which actually doesn't sound that far-fetched, now that I'm thinking about it), without alerting the user that they are installing a backdoor for law enforcement.
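
To make that mechanism concrete (entirely hypothetical, including every name here; nothing suggests any shipping release contains this), the gate itself would be trivial:

    import Foundation

    // Entirely hypothetical sketch of the mechanism described above.
    func onUserUnlock(deviceSerial: String,
                      targetList: Set<String>,    // e.g. serials named in warrants
                      unwrappedDataKey: Data,
                      escrow: (Data) -> Void) {   // e.g. upload the key somewhere
        // Only devices on the list are affected, and only on an unlock
        // that happens after this build is installed.
        if targetList.contains(deviceSerial) {
            escrow(unwrappedDataKey)
        }
    }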

I don't think that's feasible either way, because it would quickly come out that they've done it, and once the cat's out of the bag they lose big bucks, and would-be criminals will simply stop installing updates on their iPhones.


> an OS update can at best remove the limit on how fast or how many times you can attempt to unlock the user data

Wouldn't an OS update be able to store the user password in a plaintext file on the unencrypted partitions? I don't think those partitions are hardwired to be read-only until the rest of the system is unlocked?


Absolutely, but only if that code is running when the user supplies the password. I mentioned this possibility as well.



