
A Technical Perspective on the Apple iPhone Case - blubg
https://www.eff.org/deeplinks/2016/02/technical-perspective-apple-iphone-case
======
natch
Snowden has been tweeting that skepticism is warranted, especially skepticism
of the FBI's statements, since law enforcement is legally allowed to lie (at
least out of court) and mislead in order to further its cases.

Some questions to ask:

\- Has the FBI stated that they do not already have the PIN for the device?
They did state that the device is locked, but I didn't see it stated that they
don't already have the PIN. So do they? It's not a crazy question. They have a
good reason to pretend they don't have it.

\- Has the FBI stated that the device was ever in the physical possession of
the suspect during the time since the last backup? Or are we all just assuming
this? This information is not going to be given voluntarily.

\- If one FBI agent has provided testimony that he and some Apple engineers
could find no feasible alternative, that should be taken as the findings of
just that one agent, not the entire government. What about the rest of the
FBI? What about the NSA? If these questions aren't asked, the FBI has no
reason to answer.

------
ysv2
Am I missing something here, or is there no reason the FBI couldn't desolder
the 5C's Toshiba NAND flash chip, read its encrypted contents, and perform the
desired offline brute-force attack themselves?

The key derivation function is known, right?

~~~
paulannesley
That's been covered by most of the articles on the topic, but not very clearly
in this article.

Removing the storage chips from the device would leave the FBI attacking a
very strong key, perhaps 128-bit AES, which is not feasible to brute-force
offline.

That strong key is derived from the PIN combined with a unique device ID which
cannot feasibly be extracted from the processor. So an offline attack has to
crack full AES, while an online attack, running modified OS code on the device
itself, only has to attack the weak PIN (just 10,000 distinct combinations,
roughly equivalent to a 13- or 14-bit key).
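The asymmetry can be sketched roughly (an illustrative toy, not Apple's actual
derivation; the KDF choice, iteration count, and UID value here are all
assumptions made for the sake of the example):

```python
import hashlib
import math

def derive_key(pin: str, device_uid: bytes) -> bytes:
    """Toy KDF: entangle the PIN with a device-unique secret.
    Apple's real derivation is hardware-bound and iterated differently."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_uid, 100_000)

# Online attack: the UID never leaves the chip, so the attacker can only
# guess the PIN -- a 4-digit space of 10,000 values.
pin_space_bits = math.log2(10_000)

# Offline attack: without the UID, the attacker faces the full derived
# key space -- e.g. 128 bits for AES-128.
offline_bits = 128

print(f"online search: ~{pin_space_bits:.1f} bits")   # ~13.3 bits
print(f"offline search: {offline_bits} bits")
```

The 13.3-bit figure is where the "13 or 14 bit key" comparison above comes
from: log2(10,000) is about 13.3.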

~~~
abalone
Perhaps the key could be extracted by physically analyzing the chip, e.g.
grinding it down and using microscopic tools to detect state?

~~~
akiselev
Secure chips that store private keys generally keep them on a part of the
silicon die that can't be analyzed like the rest of the chip. Any attempt to
open the chip package (take off the black plastic/epoxy covering the die)
destroys the secure region, and the methods for reading state in
semiconductors (electron microscopy) require you to somehow expose the
silicon holding the private key.

~~~
tedd4u
Apparently this iPhone 5C pre-dates the "Secure Enclave." So the key is
somewhere else. Possibly a place vulnerable to a physical readout, possibly
not.

------
tzs
> This key is generated by combining the user's passcode with a key baked in
> to the hardware in a way that is designed to be difficult to extract.

Difficult doesn't necessarily mean impossible:
https://www.technologyreview.com/s/519201/tamper-proof-chips-with-some-work-might-give-up-their-secrets/

Is it publicly known how the key is physically stored in the chip and if there
is active tamper resistance?

It might be a lot of work to break out the tiny probes and the tunneling
microscopes or whatever and get the key that way, but at the current level of
terrorist attacks in the US the FBI should be afforded the resources for that.

~~~
Zigurd
AFAIK having to grind off the chip package and probe the chip would apply to
the iPhone 6, but the iPhone 5C does not have hardware protection for keys.

~~~
Crito
The 5C does not have a secure enclave, _however_ the 5C still has the device
UID burned into hardware.

As I understand it, for an offline attack on the encrypted contents of the
flash, you would need that UID.

~~~
Zigurd
I should have been more specific: the iPhone 5C does require signed system
software, and it is the system software that enforces the safeguards on key
access. In contrast, on the iPhone 6, an "evil update" of the system software
may not be enough to subvert the protections on keys already stored in the
hardware secure enclave, and you would have to try to probe the hardware
instead.

------
abalone
Would it be possible to physically "scan" the secure enclave chip to determine
the secrets contained therein such as the burned-in unique device key and PIN?

Like, is there some kind of chip microscope or reverse engineering process
that can not just look at circuitry, but also detect flash memory state?

UPDATE: answered earlier by tzs:
https://www.technologyreview.com/s/519201/tamper-proof-chips-with-some-work-might-give-up-their-secrets/
It's expensive, but seems well within reach of governments for targeted
investigations.

------
d4rkph1b3r
Does anyone know, would other agencies have different capabilities when it
came to recovering data on the iPhone in question?

------
empressplay
This article is silly. We're talking about making some changes to an existing
codebase and loading the result onto a device. The remote code-entry thing,
yes, is a little complicated, I agree, but disabling the "wipe after 10 bad
tries" function is probably nothing more than commenting out a few lines of
code. It would take them a few days to have someone type in all 10,000
possible PINs, but they'd still get there.
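As a back-of-envelope check on the "few days" claim (the 15-second entry rate
is purely an assumption, and this ignores the escalating retry delays, which
are among the very features the FBI wants disabled):

```python
pin_space = 10_000          # 4-digit PINs: 0000 through 9999
seconds_per_attempt = 15    # assumed manual entry rate, no retry delays

total_hours = pin_space * seconds_per_attempt / 3600
print(f"worst case: about {total_hours:.0f} hours of typing")
```

That works out to roughly 42 hours of continuous typing in the worst case, so
"a few days" of shift work is at least in the right ballpark under these
assumptions.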

~~~
rblatz
Have you ever shipped large, complicated software? Even simple changes take a
lot of time.

For example, with the recent error 53 bug, in theory all they had to do was
roll back a commit, but it still took weeks to get the fix out.

~~~
empressplay
For publicly-shipped software sure, but we're not talking about that. We're
talking about what is effectively a test build on a device that would be in
Apple's possession. Different kettle of fish, isn't it?

Edit: Another developer here thinks there's probably a developer soft-switch
to disable the 10-strikes feature anyway; if that's true then it's even more
trivial?

~~~
zaroth
Because if Apple somehow is compelled to create the functionality, but then
fucks it up and the device gets wiped, that'll go down just wonderfully. So
yeah, they'll just load up a quick build off a development machine and start
entering PINs and hope it works!

The creation of this backdoor would likely rank as one of the highest pressure
coding events in the security engineering groups lives. Yeah, they'll just
whip that up in no time.

Anyone who codes these secure systems would reasonably estimate the perfectly
safe removal of all these security features, combined with adding an automated
PIN-entry system (and let's not forget the creation of a secure environment to
isolate this code from any possible misuse), at somewhere on the order of
_man-years_ to develop, test, document, and deploy. Much can be done in
parallel, so it's on the order of a team of 6 to 10 engineers working for at
least two to three months straight.

~~~
empressplay
1) Obviously they'd test it on other devices first. Also I would expect the
government to waive Apple's liability should things "fuck up". But somehow, I
think you underestimate Apple's competency.

2) I agree the automated PIN entry system makes things more difficult and if
Apple pointed out that it would be far faster in terms of combined "hacking"
\+ development time to just have people do the PIN entry the FBI would
probably back down on that request.

3) Oh, you're absolutely right, it would take a great deal of additional
effort to secure that code in the leaky sieve that's Apple's internal server
infrastructure. I wouldn't want it to end up in the wild like the iOS, OSX and
all the pro-app source code did (what, what?)

~~~
zaroth
You may recall the PIN counter code was buggy the first time they wrote it. It
took a patch release to ensure the counter would be incremented even if power
was cut immediately after the invalid result was returned. This is part of
what leads me to believe the security system which ensures the counter is
incremented is non-trivial and therefore its removal may also be non-trivial
due to cross-dependencies.
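The increment-before-verify pattern zaroth is describing could be sketched
like this (a toy model, not Apple's code; the class, names, and in-memory
"persistence" are invented for illustration, where a real implementation
would write the counter to durable storage before checking):

```python
class PasscodeGate:
    """Toy model of a power-cut-safe failed-attempt counter."""

    MAX_ATTEMPTS = 10

    def __init__(self, correct_pin: str):
        self._correct_pin = correct_pin
        self.failed_attempts = 0  # stand-in for a durably persisted counter

    def try_pin(self, pin: str) -> bool:
        if self.failed_attempts >= self.MAX_ATTEMPTS:
            raise RuntimeError("device wiped")
        # Persist the increment BEFORE returning any result, so that
        # cutting power right after a failed check cannot roll it back.
        self.failed_attempts += 1
        if pin == self._correct_pin:
            self.failed_attempts = 0  # success resets the counter
            return True
        return False

gate = PasscodeGate("1234")
print(gate.try_pin("0000"))  # False, counter is now 1
print(gate.try_pin("1234"))  # True, counter reset to 0
```

The bug zaroth recalls is the mirror image: incrementing only after the
result is returned leaves a window where pulling power erases the failed
attempt.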

You really think Tim and Dan are just going to let an engineer whip up a build
with these features and sign it willy nilly with their production code-signing
key, deploy it to the subject phone, and just start punching out PINs? If
Apple hasn't _already_ spent several man-years of engineering time on just
investigation and tech support for this particular case I would be shocked.

You claimed creating this build should be easier than a public release. From
my perspective it's an order-of-magnitude harder, because it's a new process
which must be developed and scrutinized from top to bottom and not a well-
oiled machine. I've worked inside Apple, and this is not a negative reflection
on the competency of their developers in any way. For Pete's sake, this
backdoor isn't getting developed as a late-night hackathon!

Of course we're just throwing speculative arguments past each other so it's
not so productive. I think I know what I'm talking about, and so do you, but
it doesn't really matter much either way :-/

~~~
empressplay
I suspect that like most things the reality is somewhere in the middle.

------
lucio
The article is heavily biased. It is not balanced in any way. They start with
a desired conclusion, and then work backwards to construct the answer to the
questions.

~~~
clort
I also agree with this, though that is to be expected. This whole case is
massively problematic because the guy was guilty beyond doubt, and in a nasty
way. The law should not be punitive, though; we want justice, not _revenge_.
We should not compromise our own standards to pursue potentially guilty
parties.

    
    
      Summary
    
      EFF supports Apple's stand against creating special software to crack
      their own devices. As the FBI's motion concedes, the All Writs Act requires
      that the technical assistance requested not be "unduly burdensome," but
      as outlined above creating this software would indeed be burdensome,
      risky, and go against modern security engineering practices.
    

So the FBI concedes the law requires that the assistance not be "unduly
burdensome". The EFF's response is to argue that it is burdensome, but they
never address _unduly_, so that argument won't pass. Then they suggest it is
_risky_ and _goes against modern security engineering practices_, both of
which are irrelevant to the FBI and the law.

------
jordigh
What times these are. The EFF is supporting a security-by-partial-obscurity
company who loves to control what its customers can do with their own devices
just so that the FBI can't set a terrible legal precedent, possibly worldwide.
Enemy of our enemy is our friend this time, I suppose.

~~~
harryh
I dunno why you're getting downvoted here. There is definitely some major
nerd cognitive dissonance going on. The ability to have full control over
one's own hardware is supposed to be a core tenet of hacker-dom. Amazing how
quickly it's been thrown out in this case.

~~~
empressplay
I don't get it either. A black box that prevents you from running your own
code on it = bad. A black box that encrypts your data and keeps it away from
the evil gubbermint (let's be clear here, that _is_ what the community is
saying) = good ? It's confusing.

~~~
MCRed
The iPhone has never prevented you from running your own code on it. It
shipped with a JavaScript (web app) SDK, and within a year Apple opened it up
to Objective-C programs. They charge you $99 a year for the certificate-
signing service, but it's really no big deal; if you can afford an iPhone you
can afford $99 a year. (Hell, I think these days you don't even need to be a
paid developer; free developer accounts can run their code on their phones.)

~~~
rimantas
You no longer need to pay $99 to run code on your phone. You pay only if you
want to distribute it through AppStore.

