
Apple can comply with the FBI court order - admiralpumpkin
http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/
======
zaroth
Crucially, this is software which doesn't currently exist in the world and
which Apple has no intention of voluntarily writing. There is no specific law
or regulation (like CALEA) which requires Apple to provide this functionality.

What the FBI is attempting is to use the 'All Writs Act' of 1789, which
authorizes Federal courts to issue "all writs necessary or appropriate in aid
of their respective jurisdictions and agreeable to the usages and principles
of law." Are there limits to what a judge can order a person, or a company, to
provide?

A warrant describes "the place to be searched, and the persons or things to be
seized." I would not expect that a judge could draft a warrant for something
which doesn't actually exist, and then force someone to create it.

This is not about providing physical access, or about producing documents
which are in your possession. This is whether the government can usurp your
workforce to make you create something that only you are capable of creating,
against your will, not because there's actually a law which says you have to
provide that capability, but simply because some investigator has probable
cause that given such a tool they could use it to find evidence of a crime!

If 'All Writs' somehow does give the government the ability to enslave
software developers into creating this particular backdoor, what is there to
legally differentiate this request from, for example, one for a backdoor that
would function over WiFi or LTE remotely?

There's been a lot of discussion about the 'secure enclave' and how this
particular attack isn't possible on the iPhone 6. I think that's missing the
point. If 'All Writs' can force Apple to open a black-hat lab responsible
for developing backdoor firmware for the 5C, then it can do the same for the
6. For example, why not force Apple to provide remote access to a suspect's
device over LTE while the device is unlocked / in use? While we're at it, the
iPhone has perfectly good cameras and microphones, let's force Apple to
provide real-time feeds.

Think about the sheer quantity of networked devices which exist (or will
exist) in an average home which could be used in the course of an
investigation. If they can force Apple to create a 5C backdoor, I can't see
any reason they can't apply the same logic to WiFi cameras, Xbox Kinects, or
even your car's OnStar. Heck, even TV remotes come with microphones and
Bluetooth now... And don't get me started on Amazon Echo!

Fundamentally, the question is can you force a device manufacturer to
implement backdoors into their products to be used against their own
customers? Notably, _service providers_ have already lost that battle, they
are required to architect their systems to be able to spy on their users and
provide that data to law enforcement, often through specially designed real-time
dashboards. At least in that case it is based on duly enacted legislation with
that specific intent.

But this is something really quite shocking -- can investigators, simply
through obtaining a warrant, force companies to re-design the personal devices
that we own and keep with us almost every moment of the day to spy on us? I
truly hope not.

~~~
tptacek
Under extraordinary circumstances, compelled expert witness testimony has lots
of precedents, including situations where experts are required to expend
resources to develop that testimony.

People on the Internet tend to believe that technology poses confounding
problems for the law, but the law has been dealing with technical challenges
for centuries. See, for instance, any case involving a complicated medical
issue.

~~~
legalbeagle
Yes, you can compel expert witness testimony, but this is different. Testimony
is answering questions. Here, they are seeking to compel engineering/coding
work. I've never seen that compelled by a subpoena.

~~~
tptacek
If you read old-ish legal journal articles about expert witness compensation,
you find that the requirement to do up-front work in order to generate the
knowledge required to handle questions is a dividing line for whether (or, at
least, whether in the 1960s) expert testimony must be compensated. From that,
I gather that this kind of request isn't unprecedented.

------
pilif
I know my opinion is probably not popular, but if there were a way for Apple
to physically install firmware on a single device to allow brute-forcing, and
if Apple did that in response to a direct court order, then that is probably
the best compromise we can get.

I really believe that there should be a way for law-enforcement to get access
to specific devices in response to a court order as long as the solution
doesn't involve weakening the encryption for everybody else.

I'm absolutely against backdoors, secret* keys, or similar crap. But
physically accessing a single device in order to make brute-forcing it
possible seems acceptable to me, as that won't affect any other device.

That would be similar to a court order allowing law enforcement to enter your
premises and take out the safe in order to pry it open at some other location
where specialised equipment is available.

If this is all law enforcement wants, then maybe it's time to hand this over
before law enforcement wants even more which will doubtless pave the way for
mass surveillance of devices.

* until they leak. Then everybody has access.

~~~
bjacobel
> maybe it's time to hand this over before law enforcement wants even more
> which will doubtless pave the way for mass surveillance of devices.

The FBI is already paving that way with this case. They don't overly care
about access to this particular iPhone. They're taking this case through the
courts so that they can establish a precedent that allows them to force
manufacturer cooperation to unlock _any_ phone.

Edit: If they really cared about access to this individual phone, they
wouldn't be going through the courts to get it; they'd be talking to NSA
TAO or another LEO with advanced forensic capability. As several people have
pointed out, this iPhone 5C does not have a Secure Enclave and probably does
not present a significant challenge to forensically analyze for people who
know what they're doing. They're going through the courts on this so they can
get carte blanche to access iPhones 5S and above, which no LEO currently has
the capability to inspect.

Further edit: This is Farook's work phone. His main, personal phone was found
destroyed in a dumpster near the site of the attacks. I find it incredibly
unlikely that the FBI really cares much about the contents of this individual
phone; they just want a high-profile test case to expand their surveillance
capabilities.

~~~
Amezarak
> They don't overly care about access to this particular iPhone. They're
> taking this case through the courts so that they can establish a precedent
> that allows them to force manufacturer cooperation to unlock any phone.

This is an analysis, not an objective and demonstrable fact.

I could just as well argue that yes, the FBI really does care a lot about this
particular iPhone, and that's why the asked-for update is to be keyed to this
iPhone and only this iPhone.

At the same time, _even assuming that is true_ , we're talking about the FBI
going through a legal process, reviewed by a judge, to get the data off one
phone at a time. If that's how it works every time, I don't see a problem;
that is how the system is _supposed_ to work. I am kind of baffled as to why
we're cheerleading the fact that Apple is refusing to perform what appears to
be a perfectly reasonable request that is being made in accordance with the
law. If you are operating under the presumption that the government is always
a bad-faith actor, then we have much, much bigger problems.

Also, apparently this 'precedent' has already been set; according to a link in
the article, Apple had previously offered custom firmware images to law
enforcement after a court order that bypassed the lock screen on earlier
iPhones.

[http://www.cnet.com/news/how-apple-and-google-help-police-by...](http://www.cnet.com/news/how-apple-and-google-help-police-bypass-iphone-android-lock-screens/)

~~~
rconti
Every comment I've read so far has said that Apple should help in this
instance, so I don't see the cheerleading-- yet. Except now I may provide it.
I just read Apple's letter to customers, and now I agree with _them_ that the
very creation of backdoor software -- even if it's only meant to help in
specific instances -- is a dangerous thing. Applying specialized knowledge
that Apple has about iOS and iPhones, plus Apple's engineers, to creating an
innovative backdoor that does not exist today, means that it can never be un-
designed. It will never have fewer people aware of it, unless you kill them
after they create the software. The knowledge will only spread. The software
can only leak. The engineers can only get conveniently hired by a competitor
or foreign government or our own government. I agree, it is troubling.

~~~
sliverstorm
_The knowledge will only spread. The software can only leak._

Then why don't we have Apple's private keys yet?

Plenty of companies keep a lot of things very secret, including things like
powerful debug modes, for a long time. At least long enough that everybody
forgets the details and the software has long since rotted away.

~~~
dottedmag
Because it's Apple who keeps them, not FBI.

Nobody in the FBI would give a damn about leaking the patched OS image: it's
Apple's reputation at stake, not the FBI's.

~~~
pilif
But the FBI doesn't want the keys in this case. They don't even want a build
that works on any phone but the one in question.

There is nothing of value for the FBI to leak.

This is the huge difference between this order (which I can live with) and
blanket encryption backdoors using key escrow or other crap (which I'm
absolutely, vehemently against and willing to fight tooth and nail).

~~~
st3v3r
"They don't even want a build that works on any phone but the one in
question."

That is completely not true. There is no way to make such a thing that can
only work on one particular phone. There will be some point at which the
compromised firmware image checks to see if it's that device, at which point
it would be possible to change that to whatever device you want.

"This is the huge difference between this order (which I can live with) and
blanket encryption backdoors using key escrow or other crap (which I'm
absolutely vehemently against and willing to fight to the teeth)"

No, there is absolutely no difference between those two.

~~~
otterley
> That is completely not true. There is no way to make such a thing that can
> only work on one particular phone

The technique that makes this possible is described in Apple's iOS Security
White paper, page 6 ("System Software Authorization"):
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

This mechanism explains why you can't take an old release of iOS off a
different phone and copy it to yours.
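
The personalization the white paper describes can be sketched in a toy model.
This is illustration only: HMAC stands in for Apple's actual public-key
signatures, and every name here (`APPLE_SIGNING_KEY`, `personalize_firmware`,
`boot_rom_accepts`) is invented.

```python
import hashlib, hmac, os

# Stand-in for Apple's private signing key (the real scheme uses
# public-key signatures verified by the boot ROM, not a shared HMAC key).
APPLE_SIGNING_KEY = os.urandom(32)

def personalize_firmware(firmware: bytes, ecid: int, nonce: bytes) -> bytes:
    # The signing server binds the firmware hash to one device's unique
    # ECID plus a fresh nonce, so the resulting ticket is useless on any
    # other device and cannot be replayed later.
    blob = hashlib.sha256(firmware).digest() + ecid.to_bytes(8, "big") + nonce
    return hmac.new(APPLE_SIGNING_KEY, blob, hashlib.sha256).digest()

def boot_rom_accepts(firmware: bytes, ticket: bytes,
                     my_ecid: int, my_nonce: bytes) -> bool:
    # The device recomputes what a valid ticket for *itself* would look like
    # and accepts the firmware only if the ticket matches.
    expected = personalize_firmware(firmware, my_ecid, my_nonce)
    return hmac.compare_digest(ticket, expected)

fw = b"patched iOS image"
ticket = personalize_firmware(fw, ecid=0xAABBCCDD, nonce=b"boot-nonce-1")
ok_on_target = boot_rom_accepts(fw, ticket, 0xAABBCCDD, b"boot-nonce-1")
ok_elsewhere = boot_rom_accepts(fw, ticket, 0x11223344, b"boot-nonce-1")
```

Because the signature covers the device identifier and a boot-time nonce, an
image signed for one phone is inert on every other phone, which is the basis
of the "keyed to this iPhone and only this iPhone" claim elsewhere in the
thread.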

~~~
st3v3r
You've missed the point: By doing this, they've shown that it's possible, and
that they already have the tools. Meaning that next time, it's going to be
almost impossible to say no.

------
nindalf
Indeed, this is precisely why Apple is writing an open letter. If this were an
iPhone 6, they would have simply told the judge "no" and that would have been
the end of the story. But since it's a 5C, it is possible, and Apple doesn't
want to do it, to avoid setting a precedent. If they cooperate with law
enforcement to backdoor this phone, then they would face much more pressure to
comply with any future laws that require backdoors to be built in.

~~~
cubano
In the end, they will have no choice but to comply.

Do people think this a game? Apple doesn't run things, the federal government
does, and will, in the end, use its full power to get what it desires.

~~~
atirip
Why are you saying this? There is always a choice. TC may choose to go to
jail. His successor may also choose to go to jail. To the point that Apple
shareholders may choose to let Apple go bust and burn the servers with the
source code of iOS. You can argue that this is silly, idiotic and what not.
But there is always a choice. Some brave people have chosen to go to jail
before.

~~~
curun1r
Apple also has a form of resistance not available to you or me. They don't
have to fall on their sword. A sword as big as theirs can be used to
decapitate (or, more aptly, recapitate) the FBI. With a 12-figure bank
account, you can
get a lot of people elected...people who will be more than willing to replace
those in charge at the FBI.

What's more, they don't even have to actually do it; they just need to make
the FBI believe that they would if the FBI presses the issue.

~~~
0942v8653
Right, but the FBI knows they wouldn't. This whole situation is generating
good PR for Apple. Getting people elected who could change things would be a
waste of their money: if it became known, it would be very bad PR (the public
is generally against lobbying like this), and the good PR created by this
"fight" would end immediately.

~~~
curun1r
Right, I was just suggesting that Apple has better "nuclear" options than Tim
Cook going to jail out of principle and Apple going out of business.

------
citizensixteen
This is so far the clearest and easiest to understand explanation of the
Apple/FBI case I have come across. Great read.

------
spdustin
I wonder about the relevance of the case styled "United States v. Hubbell"
(530 U.S. 27)†

Specifically ( _emphasis mine_ ):

> ...the Self-Incrimination Clause ... may be asserted only to resist
> compelled explicit or implicit disclosures of incriminating information.
> Historically, the privilege was intended to _prevent the use of legal
> compulsion_ to extract from the accused a _sworn communication of facts_
> which would incriminate him.

-and-

> ...the act of producing documents in response to a subpoena may have a
> compelled testimonial aspect. We have held that “the act of production”
> itself may implicitly communicate “statements of fact.” By “producing
> documents in compliance with a subpoena, the witness would admit that the
> papers existed, were in his possession or control, and were authentic.”

-and-

> Compelled testimony that _communicates information that may “lead to
> incriminating evidence”_ is privileged even if the information itself is not
> inculpatory.

†
[https://www.law.cornell.edu/supct/html/99-166.ZO.html](https://www.law.cornell.edu/supct/html/99-166.ZO.html)

EDIT: Wikipedia summary of the case here:
[https://en.wikipedia.org/wiki/United_States_v._Hubbell](https://en.wikipedia.org/wiki/United_States_v._Hubbell)

~~~
riskable
This is all well and good, but the Fifth Amendment doesn't apply to
corporations, so United States v. Hubbell is irrelevant.

------
caf
Makes you wonder why the FBI bothered with the bit about submitting PIN
guesses electronically, and didn't just ask for a firmware that looped over
all 10,000 PINs until it found the right one.
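
The on-device loop being described is only a few lines once the OS no longer
enforces delays or wipes. A sketch, where the `try_unlock` callback is an
invented stand-in for the device's actual passcode check:

```python
def brute_force_pin(try_unlock, digits: int = 4):
    # Exhaust the whole space: 10**4 = 10,000 candidates for a 4-digit PIN.
    for candidate in range(10 ** digits):
        pin = f"{candidate:0{digits}d}"   # zero-padded, e.g. "0042"
        if try_unlock(pin):
            return pin
    return None  # exhausted the space without a match

# Toy stand-in for the passcode check; the real one is inside the OS.
found = brute_force_pin(lambda pin: pin == "7391")
```

The engineering effort in the order is not this loop; it is producing a signed
firmware image in which `try_unlock` can be called without triggering the
escalating delays or the 10-attempt wipe.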

~~~
bryanlarsen
Perhaps the phone in question has a six-digit PIN, so it's useful to try
guesses non-sequentially.

------
jeffehobbs
Great piece. Get thee to a "Secure Enclave" supported device, everyone.

~~~
venomsnake
Or any rooted Android. Good luck defeating LUKS; no custom firmware will
help them.

~~~
feld
A rooted Android is probably the easiest thing to own over the air with a push
notification, so yeah, that's a great idea! _NOT_

~~~
venomsnake
A powered down device rarely has that vulnerability.

~~~
paraxisi
A powered down device isn't exactly terribly useful.

~~~
16bytes
You can't send a powered down phone a push notification for post-hoc analysis.

You would have had to know the target and push a vulnerability beforehand,
which wouldn't have helped in this case.

------
eridius
Despite the assertion of this article, it doesn't actually give any evidence
to support the claim that Apple is capable of writing this backdoor. The
important question is whether it's possible for Apple to update the OS on the
phone (or to load a program into memory that runs on the phone) via DFU mode
or something similar without triggering a wipe of the phone. And this article
doesn't even acknowledge that question, it makes the blind assumption that
this is possible. But is it? As far as I know, nobody has ever updated an
iPhone over DFU mode without erasing the phone. It's plausible that Apple has
the know-how to do that, but it's also plausible that the device firmware may
have been written to trigger a wipe the moment any modification is made via
DFU mode.

As a side note, the author mentions that Apple has updated the Secure Enclave
with increased delays in the past without wiping data, though they state that
only Apple knows how it really works. I just want to put forth the theory that
maybe the Secure Enclave allows its firmware to be updated if and only if the
user's passcode is provided at the time the OS tells the Secure Enclave to
prepare for a firmware update. That would be a reasonable way to ensure the
Secure Enclave can't be subverted.
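
That theory can be made concrete with a toy model. To be clear, this is
entirely hypothetical: the real Secure Enclave's update policy is
undocumented, and every name below is invented.

```python
import hashlib

class SecureEnclaveModel:
    """Toy model of the hypothesized policy: the enclave accepts new
    firmware only when the user's passcode is presented alongside it."""

    def __init__(self, passcode: str):
        self._passcode_hash = hashlib.sha256(passcode.encode()).hexdigest()
        self.firmware_version = 1

    def apply_update(self, new_version: int, passcode_attempt: str) -> None:
        attempt = hashlib.sha256(passcode_attempt.encode()).hexdigest()
        if attempt != self._passcode_hash:
            # Without the passcode, even a validly signed update is refused,
            # so forced firmware can't strip the brute-force protections.
            raise PermissionError("update refused: correct passcode required")
        self.firmware_version = new_version

se = SecureEnclaveModel("1234")
se.apply_update(2, "1234")  # user-authorized update succeeds
```

Under such a policy, Apple could still ship updates to cooperating users, but
could not be compelled to weaken an enclave whose owner (or whose passcode) is
unavailable.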

------
lowbloodsugar
One would hope that the Secure Enclave only allows itself to be updated after
the PIN has been entered successfully.

------
growlix
The security architecture described here seems pretty clever. Is this degree
of security unique across mobile devices? If the phone in question was from a
different manufacturer or ran a different OS, would the FBI have to ask its
creator for help?

~~~
pas
Chrome OS was already doing chain-of-trust booting when it was first
announced/revealed/open-sourced, though with a hardware switch to enable
flashing other loaders. (That has probably been removed since.)

There are Android full disk encryption schemes, and of course phones with
signed bootloaders.

[https://nerdland.net/unstumping-the-internet/pattern-unlock-...](https://nerdland.net/unstumping-the-internet/pattern-unlock-after-android-full-device-encryption/)

[http://www.extremetech.com/mobile/216560-android-6-0-marshma...](http://www.extremetech.com/mobile/216560-android-6-0-marshmallow-makes-full-disk-encryption-mandatory-for-most-new-devices)

What Apple did that was so valuable is provide a very clear, almost abstract
implementation, from scratch, hitting every point along the way: randomized
device private keys, a read-and-execute-only Secure Enclave, signed loaders,
proper AES(-XTS?) full disk encryption, probably also requiring a strong
password, and full lock after ~48 hours (though it'd be good if that could be
customized to something lower).

------
jkxyz
Would setting this precedent to enable brute-forcing the PIN on a less secure
model really be that dangerous, then? This isn't a request to circumvent
encryption on all models. It would be worrying if the government were asking
for a backdoor into the Secure Enclave feature to retrieve the encryption
keys, but that's nowhere near what this request is actually detailing.

I remain opposed to any kind of circumvention that reduces security, which
this definitely does. I'm just questioning whether it's something to be so
shocked about, since it's not far removed from the kinds of requests they have
complied with in the past.

------
geographomics
If Apple can update the firmware of the Secure Enclave, could they not also
write one that leaks out all the data contained within it?

Presumably this could then be used for offline attacks against the image
dumped from the phone's flash memory.

------
sandycheeks
The part that I find most interesting, as a former enterprise systems
administrator from the '90s, is that the employer owns the device but does not
have a passcode for it. Is this the normal IT policy for these kinds of
devices?

~~~
crystalmeph
It's bad security to tell your manager your passcode/passwords, even for
accounts they can get into through admin channels. The risk isn't the manager,
the risk is that someone overhears you telling them the password, finds the
piece of paper it's written on, etc.

------
Fando
Obviously the government should not be allowed such power, simply because it
cannot be trusted. Everyone knows it would use this backdoor against everyone,
without limit, and everyone knows the lengths to which the US will go to spy
on its own people and everyone else. The fact that this request is being made
at all is further evidence of the government's intentional malevolence. This
behavior should not come as a surprise anymore; any reasonable person by now
knows what to expect from the US government.

------
StreamBright
I am wondering if there is no other way to get in. There are lots of security
researchers out there who can hack into iOS or any other mobile OS. Wouldn't
it be simpler to hire somebody who could do it? There are probably many ways
to get access to this device other than guessing the PIN.

Update: In the meantime I talked to my security engineer peers, and it is not
feasible to carry out an attack this way. The user partition remains unmounted
until the PIN is provided after boot.

~~~
jonknee
> There are probably many ways to get access to this device other than
> guessing the pin.

It's encrypted data using the PIN (and a key embedded in the phone). There's
not another way in.

~~~
j_jochem
So I've been wondering: since the PIN is obviously not a strong cryptographic
secret, the way the encryption works is basically security by obscurity.

All an attacker would have to do is clone the contents of the device's SSD and
somehow read the secret key that is embedded somewhere else. I'm not sure how
feasible the latter part is, but surely this shouldn't be beyond the
capabilities of US three-letter agencies?

~~~
jonknee
It's much more complicated than that (which is why the FBI needs help). The
encryption uses the PIN _and_ a key that is in the phone. If you take the
image and try all the PIN combinations you will fail because you don't have
the embedded key.

[http://www.darthnull.org/2014/10/06/ios-encryption](http://www.darthnull.org/2014/10/06/ios-encryption)

> The UID key is used to create a key called “key0x89b.” Key0x89b is used in
> encrypting the device’s flash disk. Because this key is unique to the
> device, and cannot be extracted from the device, it is impossible to remove
> the flash memory from one iPhone and transfer it to another, or to read it
> offline. (And when I say “Impossible,” what I really mean is “Really damned
> hard because you’d have to brute force a 256-bit AES key.”)

Newer phones also include a secure enclave that introduces another key and
hardware restrictions on timing. The FBI's request wouldn't make sense for a
modern iPhone.
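
The UID entanglement the quote describes can be sketched as a key derivation
that mixes the PIN with a hardware secret. PBKDF2 and the round count below
are stand-ins for Apple's actual hardware AES "tangling"; the names are
invented for illustration.

```python
import hashlib, os

DEVICE_UID = os.urandom(32)  # fused into the silicon; cannot be read out

def derive_key(pin: str, uid: bytes) -> bytes:
    # Every guess must be entangled with the hardware UID, so brute forcing
    # has to happen *on the device*, under its delay/wipe policies.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), uid, 50_000)

on_device_key = derive_key("1234", DEVICE_UID)

# Offline attacker with a copied flash image but no UID: even the correct
# PIN yields the wrong key.
offline_guess = derive_key("1234", b"\x00" * 32)
```

This is why imaging the flash doesn't help: the correct PIN still produces the
wrong key unless the derivation runs on the original silicon.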

~~~
j_jochem
How does "cannot be extracted" work? There must be some physical
representation of the key inside the phone, so surely it should be possible to
retrieve it somehow (e.g. using a scanning tunneling microscope or whatever)?

~~~
RandomBK
I don't know if this is implemented in the iPhone's Secure Enclave, but many
modern HSMs are designed so that physical tampering (such as extracting the
chip for analysis) damages/destroys the data.

------
PascLeRasc
What I don't understand is why Apple can't just unlock the phone the same way
they would for any typical customer who forgot their passcode. I've never
owned an iPhone; what is the recovery process like for a typical customer who
can't get into their phone?

------
tkinom
What would happen when other governments, their court, their agencies make
similar demand?

The Pandora's box is opened?

------
markyc
FBiOS :)

------
ericmuyser
Technically feasible, sure. But with a higher risk of introducing security
vulnerabilities than most features.

------
jlebrech
Why can't they just use a PIN-code robot?
[https://www.youtube.com/watch?v=k_n5W69OdKM](https://www.youtube.com/watch?v=k_n5W69OdKM)

~~~
LordKano
Because they don't want to spend the time.

After too many incorrect PINs, there's a time delay before another attempt can
be made.

The FBI is also thinking about the next time. They want to be able to take a
phone, plug it in and brute force the PIN to gain access.
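
A back-of-envelope calculation shows why the delays matter. The schedule below
is illustrative, loosely modeled on iOS's escalating lockouts, not an exact
specification:

```python
def brute_force_hours(pin_space: int = 10_000) -> float:
    # Assumed schedule: first four attempts free, then escalating lockouts
    # (1 min, 5 min, 15 min, 15 min), capped at 1 hour per attempt after that.
    early_delays = [0, 0, 0, 0, 60, 300, 900, 900, 3600]  # seconds
    remaining = max(0, pin_space - len(early_delays))
    total_seconds = sum(early_delays[:pin_space]) + remaining * 3600
    return total_seconds / 3600

# Exhausting a 4-digit space at one guess per lockout comes to roughly
# 10,000 hours, i.e. over a year of wall-clock time per phone.
```

That is before counting the 10-attempt auto-wipe, which is what makes the
delays enforceable in the first place.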

~~~
centizen
I have a feeling it's more about setting the precedent, as others have said.
There are strategies that can bypass the timeout periods and automatic wiping
while brute-forcing, and I'd be surprised if the FBI didn't have at least one
rig set up to do it.

------
exabrial
We need to create devices that RESIST (warrantless, unconstitutional) mass
surveillance and hide all of our data and metadata from everyone. Obama set a
terrible precedent by warrantlessly scanning 'metadata' about who we contact.
We're innocent by default, period.

I agree with the first post. We need to be creative and find a way to resist
government surveillance, and the piece where the engineering seems impossible
is allowing an occasional breach of security for extreme circumstances.

What's extreme? Well, first, physical possession of the device should be
required. Second, it should take resources only a nation-state could afford.
Want to decrypt an iPhone? It's going to cost more than $5 million in
processing power. Any criminal would move on.
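
That threshold can be made concrete with rough arithmetic. All rates here are
assumptions chosen only to illustrate the idea of pricing an attack at
nation-state scale, not measured figures:

```python
def attack_cost_usd(keyspace_bits: float, seconds_per_guess: float,
                    cpu_hour_cost: float = 0.05) -> float:
    # Expected guesses: half the keyspace on average.
    expected_guesses = 2 ** (keyspace_bits - 1)
    cpu_hours = expected_guesses * seconds_per_guess / 3600
    return cpu_hours * cpu_hour_cost

# A passphrase worth ~40 bits behind a 1-second key-derivation function
# already prices an exhaustive attack in the millions of dollars at these
# (assumed) compute rates; each extra bit of entropy doubles the cost.
cost = attack_cost_usd(40, 1.0)
```

The knob that tunes the cost is the per-guess work factor of the key
derivation, which is exactly what hardware entanglement and slow KDFs provide.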

------
retube
The FBI's position seems entirely defensible: the phone data may yield
important information - accomplices, contacts, etc.

It also seems pretty disingenuous/hypocritical for Apple to plead "customer
privacy" when the ENTIRE BUSINESS MODEL of much of the smartphone and app
industry (from which Apple directly benefits via a 30% commission) is
predicated on abusing customer privacy.

~~~
feld
Apple does not use customer data as part of their business model. This is a
unique Google/Android feature.

~~~
el_duderino
Sources to back up your claim?

~~~
amdavidson
How about this?

"...we never sell your data."

[http://www.apple.com/privacy/approach-to-
privacy/#personaliz...](http://www.apple.com/privacy/approach-to-
privacy/#personalization)

