Hacker News new | past | comments | ask | show | jobs | submit login

That's the closest non-digital analogy I have seen yet. To make it closer, the safe designer has a code to disable the autodestruct mechanism that nobody else has.



> ...the safe designer has a code to disable the autodestruct mechanism that nobody else has.

No, not quite. The safe designer can more easily figure out how to bypass his autodestruct mechanisms than anyone else, but that doesn't mean that he "has a code" that bypasses them. In this analogy, it's still nontrivial for the safe designer to perform the work, but it's (probably) easier for him than for anyone else.


It takes a day of work to use this code to disable the autodestruct, but that doesn't really change anything. The fact that they have this code (that nobody else has) at all is the main reason a judge can compel the safe maker to assist with the investigation.


> That they have this code...

They don't have any bypass code. They have detailed plans for the autodestruct mechanism. That's it. They are probably the entity on the planet with the best understanding of the autodestruct system, but the autodestruct system does not have an "off" switch.

To reverse the analogy: The safe designer's bypass "code" is analogous to a back door already built into iOS that opens when presented with some information held in secret by Apple. (Think CALEA wiretapping infrastructure built into US-destined telecom infrastructure equipment.)

Because Apple has to write a special version of iOS to fulfil FedGov's request, we can reasonably assume that there is no pre-built back door in iOS that sits around, awaiting the secret information.


The code in Apple's case is the release build key. Only Apple has it, and it takes a day to use it to disable the autodestruct. Ergo, the analogy fits perfectly.


> The code in Apple's case is the release build key.

No, the "code" [0] in Apple's case is the yet-to-be-written version of iOS that can be loaded at boot time to bypass the PIN retry count and retry-delay functions, and that -to a sufficiently high degree of certainty- can only run on the intended iPhone.
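The "only runs on the intended iPhone" property can be sketched in code. This is a toy model, not Apple's actual scheme: real iOS personalization involves Apple's signing servers, public-key verification in the boot ROM, and the device's ECID; the names and the symmetric MAC below are stand-ins for illustration only.

```python
import hashlib
import hmac

# Hypothetical stand-in for the manufacturer's release signing key.
# (Real boot ROMs verify a public-key signature; a symmetric MAC is
# used here only to keep the sketch self-contained.)
SIGNING_KEY = b"manufacturer-secret"

def sign_build(image: bytes, device_id: bytes) -> bytes:
    # Fold a unique device identifier into the signed payload, so the
    # signature is valid for exactly one device.
    return hmac.new(SIGNING_KEY, image + device_id, hashlib.sha256).digest()

def boot_rom_accepts(image: bytes, signature: bytes, my_device_id: bytes) -> bool:
    # The boot ROM recomputes the check with its *own* identifier; a
    # build signed for a different device fails and is refused.
    expected = hmac.new(SIGNING_KEY, image + my_device_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

bypass_build = b"ios-without-retry-limits"
sig = sign_build(bypass_build, device_id=b"ECID-TARGET")

print(boot_rom_accepts(bypass_build, sig, b"ECID-TARGET"))  # True
print(boot_rom_accepts(bypass_build, sig, b"ECID-OTHER"))   # False
```

The point of the sketch: even if the signed image leaks, it does no attacker any good against a phone whose identifier doesn't match, which is why device-binding is part of the "bypass procedure" rather than a simple "code".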

Ergo, your analogy is -as I said- not quite right.

[0] And it's not a "code", but a bypass procedure.


I won't continue this conversation if you're not going to put in the effort to understand what I'm writing. The code we've been talking about is the safe manufacturer's code that only the safe manufacturer has and that takes a day of effort for the safe manufacturer to use. The equivalent for Apple is the release build key, which has exactly the same properties.

You then made further mistakes. The signed build does not need to be limited to a single device, so it has no place in the analogy. The FBI allows Apple to keep the build themselves and delete it after use.

Please try to keep up, or you'll continue to tilt at windmills, and you won't learn anything.


> Please try to keep up... The signed build does not need to be limited to a single device... The FBI allows Apple to keep the build themselves and delete it after use.

You misunderstand what business this safe manufacturer is in. He's in the business of high-security safes. Any bypass technique he comes up with must be as specific as possible, and defended in every way possible, lest he fail his customers [0] and destroy his reputation (and -thus- his entire business) as a high-security safe manufacturer.

So, yes, the technique developed must be limited -as far as is possible- to a single safe (or phone). Anyone who reads HN knows that leaks happen. The more valuable a piece of information, the more likely someone with the ability to snatch it will be targeting it. And -as we should all know by now- it's tremendously difficult to prevent a state-level actor from acquiring data that's not very securely locked away.

Again, if you're in the business of providing high-security services to your clients, you are going to ensure that any bypass mechanism you're forced to make is as difficult to create, and as close to one-time-use, as possible.

[0] And -in some jurisdictions- aid in the "legal" torture, imprisonment, or death of his customers.


The way to make it difficult to reuse is to keep it at Apple (where nobody can extract it from the device) and dispose of it after use, not to try to limit it to a single device and let the build leak. The signing key is a much more valuable target than this build, and the build only needs to be secured for a very limited time.



