
Apple ordered to bypass auto-erase on San Bernardino shooter's iPhone - bgentry
https://www.techdirt.com/articles/20160216/17393733617/no-judge-did-not-just-order-apple-to-break-encryption-san-bernardino-shooters-iphone-to-create-new-backdoor.shtml
======
philip1209
A thought experiment: Let's say the government creates a hardware encryption
standard, in the style of FedRAMP, that sets requirements for preventing
tampering by foreign governments. Then imagine that a consumer electronics
company voluntarily makes all of its devices comply with this standard. Could a
court attempt to compel the company to defeat the very standard which the
government set as tamper-proof against governments?

A second: What happens if Apple states that it will take a 50-person team, with
an average annual labor cost of $200K/person, approximately 5 weeks to fix the
problem, with a 50% chance of success? Can Apple bill the court a million
dollars to try to fix the issue?

A third: Apple open-sources their encryption modules and firmware. They no
longer have proprietary information for how to unlock the phone. Are they
legally required to be the ones who defeat a system to which they hold no
proprietary information?

A fourth: The small team that built the system no longer works for Apple.
Perhaps their visa was revoked and they left the country, perhaps they were
poached by a competitor, or perhaps they retired in the years since this
module was published. Who is responsible for complying with the order?

A fifth: The data is actually corrupted. Apple presents this conclusion under
penalty of perjury after a thousand hours spent on the project, which it
requests be compensated.

A sixth: Apple requests that trading of its stock be frozen for one month
while it expends considerable resources on complying with an unexpected court
order relevant to national security.

~~~
ars
1: Yes.

2: Yes.

3: No, but they will probably be the ones asked anyway, and then yes, they
would be legally required.

4: Apple.

5: What's the question? Is the question whether they will be compensated? Then yes.

6: They can't. They don't own their stock. Bad PR is not a good enough reason.

You are treating the court like a mathematical proof and finding edge cases. I
used to as well. But courts don't work that way at all - they don't care in
the slightest about your proof. They analyze things on a human level, not a
mathematical level.

~~~
Lawtonfogle
>They analyze things on a human level

This (and the job prospects) is the reason I chose computer science/software
engineering over being a lawyer, even though most people who weighed in on what
I should be said I should be a lawyer. The law requires not just thinking like
a human, but accepting such thought as valid. To me the logic is exactly the
same as that of a racist who makes special exceptions for all the minority
people they know while still holding their view in general. In effect the
system is built on a mix of emotion and logic, and in doing so it is insanely
unjust to those who fall on the unpopular side. A President who admits to
smoking pot leading a government that ruins the lives of people who use pot...
it is just as bad as some older men I know who admit to having tried pot but
who think anyone currently trying it deserves to be put in prison.

Analyzing something on the human level is always a horrible standard when
viewed outside the context of our own emotional fallacies.

~~~
DickingAround
This is an amazing meta-point here; the people on the other side of this
debate seem to have no need for logical cohesion.

I might extend that a bit to ask: what is the consequence of such a lack of
cohesion? Perhaps first that getting agreement on anything has to happen on a
nearly case-by-case basis, since there is weaker adherence to standards. This
can introduce favoritism (e.g. some people get hit for pot and others don't).
Another consequence might be lower up-front costs: not needing to think too
deeply about edge cases and whatnot clearly costs less time when running
first-time-seen cases. This in turn means people won't hesitate as much in
setting new precedent, because they won't notice when they do. Another
consequence would be an inability to get consistency across an entire system,
since not everything can be debated by everyone and there are no clear rules.

------
tptacek
Remember, this is an iPhone 5C, which doesn't have Touch ID or the Secure
Enclave; the security model for this phone is significantly different from
that of more recent iPhones.

On phones with a Secure Enclave, the wipe-on-failures state is managed in the
coprocessor (which runs L4), and is not straightforwardly backdoor-able.

If you're worried about the police brute-forcing your phone, enable Touch ID
and set a passcode that is approximately as complex as the one on your
computer.

~~~
randyrand
Even with Touch ID, it would be of no use. Touch ID requires the passcode after
48 hours, or after the device restarts.

Which is interesting. If you happen to use Touch ID, is your best bet to hope a
court will not be able to compel you to unlock it within 48 hours of arrest?
That sounds very probable.

~~~
ggreer
After five failed fingerprint attempts, your password is required to unlock
the phone. That seems pretty safe to me. If you're ever ordered to unlock the
phone, just touch an unregistered finger to it. Fingerprint sensors aren't
foolproof. It'd be hard to prove you deliberately sabotaged the effort.

Though, one feature I'd like would be to register a distress fingerprint. Then
I could touch say... my left index finger to require a password unlock.

~~~
gst
If you do this on purpose after being asked to unlock your phone, you will
probably be charged with destruction of evidence or something like that.

However, while a court is (afaik) able to ask you to put your finger on the
fingerprint reader, you do not need to tell them which of your fingers is the
correct one. So instead of purposely using a wrong finger, I'd ask the court to
explicitly tell me which of my fingers I should use to unlock the phone.

~~~
azernik
I think the court would similarly consider that obstruction and contempt. If
they tell you to unlock your phone and you try to play some "first you have to
guess which finger's the right one!" game, the judge will slap you with either
contempt of court or refusal to comply with a subpoena.

~~~
gst
IANAL, but I don't think there's much of a difference between asking someone
to reveal the correct password and asking someone to reveal the correct
finger. In both cases you would be asked to incriminate yourself.

If it would be lawful for a court to ask you to "unlock the phone with the
correct finger", then they might as well also ask you to "unlock this hard disk
with the correct keyboard keys pressed in the correct order (as a password)".

~~~
rhizome
_I don't think there's much of a difference between asking someone to reveal
the correct password and asking someone to reveal the correct finger._

There's a huge difference. Authorities can force you to give up your
fingerprint, but not your password[1].

1\.
[http://jolt.law.harvard.edu/digest/telecommunications/court-...](http://jolt.law.harvard.edu/digest/telecommunications/court-
rules-police-may-compel-suspects-to-unlock-fingerprint-protected-smartphones)

~~~
schrodinger
It's the "which finger" that becomes similar to a password, not the
fingerprint (ianal)

~~~
rhizome
All the court cares about is for the person to supply the one that unlocks the
phone; they're not going to play guessing games.

~~~
schrodinger
Courts care about precise distinctions of law (that's their purpose!). Seems
clear that fingerprints aren't protected, basically the same thing as your
face in terms of privacy given a good enough camera.

But they would effectively be asking you the question "which finger did you
use to lock this phone" to which you may plead the 5th.

~~~
rhizome
It'll be contempt and possibly more if you don't unlock the device with your
fingerprint.

It's not hard: the "precise distinction of law" is "unlock this with your
finger, whichever one does it." I don't know what complicated back-and-forth
you're imagining, but it's never occurred in any case that I've heard of.

 _they would effectively be asking you the question "which finger did you use
to lock this phone" to which you may plead the 5th._

We already covered this in the link above: the 5th Amendment covers passcodes,
not fingerprints.

~~~
Someone
But, as _Schrodinger_ says, this is not about your fingerprint; it is about a
bit of information you have that the government does not have: which of your
finger(s) this device knows about.

 _" A bit of information you have that the government does not have"_ is a
password.

~~~
rhizome
I really don't understand this line of thinking. What is the scenario you
imagine where the question of "which finger?" is relevant? I'll lay out the
beginning:

\- Police want to get into your phone for some reason
\- You refuse to help them based on the 5th Amendment or admiralty law or whatever
\- They go to court for an order compelling you to operate the touch lock to open the phone
\- You receive the order
\- ?

Please lay out the "?" part, if you don't mind. I'm highly curious.

------
matt_wulfeck
If you read the iOS security guide you'll know Apple built the phone in such a
way as to wash its hands of these types of requests. They'll say it's
impossible and they won't be lying. Nothing is ever impossible, but it will be
very impractical. The hardware and software are built to ensure this.

I think the real game here is to compel Apple to build a backdoor into future
models. I expect to see a lot of rhetoric around this fact, until something
forces Apple's hand.

~~~
rkangel
That is possibly true for current models of the iPhone. It is significantly
less true for the 5c in question, which has less robust security features. See
other answers referring to the Secure Enclave.

------
tzs
The article at Errata Security [1] is better. There is an HN submission for it
[2], but it hasn't drawn any attention.

In particular, it addresses technical issues not covered in the Techdirt
article that are relevant to many of the existing comments here on HN.

[1] [http://blog.erratasec.com/2016/02/some-notes-on-apple-
decryp...](http://blog.erratasec.com/2016/02/some-notes-on-apple-decryption-
san.html#.VsPXQ_krIdU)

[2]
[https://news.ycombinator.com/item?id=11115251](https://news.ycombinator.com/item?id=11115251)

------
whatwhatwhat999
Unfortunately, there's no good outcome here.

If Apple can decrypt the phone, it will prove to everyone that backdoors
exist. If they can't, and they tell the FBI as much, it will just give
politicians more reasons to sound off about how we have to have backdoors,
because this shooter was a "terrorist" after all, and we just have to suck it
up and do whatever is necessary to go after people like that.

Either way, we end up with backdoors.

~~~
tannhauser23
Did you read the article? The court didn't order Apple to decrypt the phone.
Instead, Apple has to disable the phone's feature that automatically wipes the
data after 10 failed passcode attempts. This is so that the FBI can
brute-force its way into the data.

~~~
anonydsfsfs
This could be an example of parallel construction[1]. They may already have
decrypted it via a backdoor, but they wouldn't be able to use anything they
find as evidence in court because they'd have to reveal the backdoor. If they
can plausibly show they brute-forced it instead, they keep the backdoor
hidden.

[1]
[https://en.wikipedia.org/wiki/Parallel_construction](https://en.wikipedia.org/wiki/Parallel_construction)

~~~
Godel_unicode
"it could be parallel construction" is true in literally every instance since
it's impossible to prove the negative case.

This is becoming my cue to stop reading the comments; when parallel
construction is the most obvious argument, you've read the interesting ideas
up thread.

~~~
kevin_b_er
However, the US government is known to have gathered hidden evidence in drug
cases and then used parallel construction to hide the violation. So the
government's presentation of evidence should always be considered in question:
since it has shown dishonesty in the gathering and presentation of evidence,
and there's nothing to say it has changed its unethical ways, how can it be
trusted to present only legally gathered evidence?

[https://www.washingtonpost.com/news/the-
switch/wp/2013/08/05...](https://www.washingtonpost.com/news/the-
switch/wp/2013/08/05/the-nsa-is-giving-your-phone-records-to-the-dea-and-the-
dea-is-covering-it-up/)

------
headmelted
For me, the most interesting question is absent from the article.

The court is basically ordering Apple to produce new firmware that doesn't
block brute forcing. If Apple were to comply, who keeps this firmware after
the fact?

There's no mention of this at all, but if the firmware image stays with the
FBI then the implications are much more profound with regard to privacy.

~~~
chopin
>The SIF will be coded by Apple with a unique identifier of the phone so that
the SIF would only load and execute on the SUBJECT DEVICE.

If I understand the cited order correctly, the firmware is ordered to be
constructed in such a way that it runs only on the target phone.

~~~
headmelted
I do wonder, though: had Apple not predicted this exact scenario ahead of
time (likely), how would they control this?

It's unlikely they can rely on hardware protections to provide this device
locking, so is it the case that they would build the unique identifier into
the image?

Optimistically some obfuscation could help but are the FBI/CIA/NSA really more
than a few hops away from opening the binary image in a hex editor and
changing it by hand?

If Apple firmware images for the iPhone are signed per-device then fine, but
is that the case?

I don't know this but it seems unlikely to me that a custom device-signed
build of iOS happens for every iDevice, and if that's not the case, I can't
see how Apple can reliably restrict this with confidence.
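
One way such device locking could plausibly be implemented is to make the
device's unique ID part of the signed data, so a firmware image personalized
for one phone won't verify on another and can't be retargeted in a hex editor
without invalidating the signature. A minimal sketch of the idea (the
identifiers are made up, and HMAC stands in for the asymmetric signature a
real scheme would use):

    # Illustrative sketch only; names are hypothetical and HMAC is a stand-in
    # for a real asymmetric signature scheme.
    import hashlib
    import hmac

    SIGNING_KEY = b"stand-in for the vendor's secret signing key"

    def sign_image(image: bytes, target_ecid: int) -> bytes:
        # The signature covers both the firmware image and the target device ID.
        message = image + target_ecid.to_bytes(8, "big")
        return hmac.new(SIGNING_KEY, message, hashlib.sha256).digest()

    def device_accepts(image: bytes, signature: bytes, my_ecid: int) -> bool:
        # The device verifies against its *own* ID, so a signature issued for
        # a different device (or a patched image) fails verification.
        expected = hmac.new(SIGNING_KEY, image + my_ecid.to_bytes(8, "big"),
                            hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

    sig = sign_image(b"custom SIF", target_ecid=0x1111)
    assert device_accepts(b"custom SIF", sig, my_ecid=0x1111)
    assert not device_accepts(b"custom SIF", sig, my_ecid=0x2222)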

~~~
chopin
I agree with you that this will be difficult or maybe impossible to implement.
However, the court has foreseen the upthread argument, as the order shows.

Like many here, I believe that once this backdoor exists it will somehow be
exploited (at the very least by further orders).

------
ars
> Apple ... will probably have little time to debug or test it overall,
> meaning that this feature it is being ordered to build will almost certainly
> put more users at risk.

Eh? They are not being asked to push it to the public at large, just to install
it on one phone.

Of all the reasons to object, this one makes little sense.

~~~
tptacek
That's true. In fact, if it's possible for Apple to accomplish what DOJ is
demanding of it, the best outcome would be for DOJ to succeed, and do so
publicly:

* There is an authentic need to get at the data on that phone

* There's no likelihood at all that other users will be impacted by the backdoor

* We'll all be on the same page about how secure these phones are versus the USG.

It's possible that they can prevail against the 5C but not against the 5S or
later, since the security architecture of the 5S is very different from that
of the 5C.

~~~
jbapple
> There is an authentic need to get at the data on that phone

What is the authentic need? The shooters are dead. Do we have reason to
believe that there is evidence of any pending crimes or any old unsolved
crimes on the phone?

~~~
BWStearns
If you shoot a bunch of people while declaring allegiance to an organized
group known for shooting bunches of people then I think that pretty clearly
demonstrates that reading your communications has a pretty high likelihood of
turning up something useful in preventing future incidents. If this doesn't
clear your hurdle for reasonable search then what would?

To be clear, I don't think the order to Apple is necessarily altogether a good
idea or is even going to produce the desired results, but your complaint seems
to be with the fact that this data is being pursued at all.

Edit to reply:

> Evidence of a conspiracy would help. You said they declared allegiance to an
> organized group. When they did that, did they say or hint that they had been
> in contact with that group, other than, say, watching public YouTube videos?

The woman in the couple declared it right before the shooting[0]. Do you want
a notarized letter from the deceased?

> Would you agree that "high likelihood" is too low a bar for justifying
> searching the phones of people who live in high-crime neighborhoods?

I'm pretty sure neither "high likelihood" nor "authentic need" were being used
as a term of art here, but I would bet that any judge would view the
commission of murder declaredly for an organized militant group to be probable
cause that there is information pertaining to more criminal activity by that
group on these two's phones and in their communications.

Do you really view this as a government overreach or are you just trolling?
Under what circumstances, if any, would you see as justified a search of
someone's email? phone? house? So far you've equivocated between living in a
bad neighborhood and committing murder-suicide.

[0] [http://www.nytimes.com/2015/12/05/us/tashfeen-malik-
islamic-...](http://www.nytimes.com/2015/12/05/us/tashfeen-malik-islamic-
state.html?_r=0)

~~~
jbapple
In response to your (first?) edit:

> The woman in the couple declared it right before the shooting[0].

I'm not questioning that she declared allegiance. I'm asking if she was in
private contact with anyone. If you were responding to that, can you show me
where that is in the NYT article you linked? I don't see it.

> Do you want a notarized letter from the deceased?

Let's try to keep this civil, please.

> Do you really view this as a government overreach or are you just trolling?

I actually believe the things I am saying. I am not saying them to anger or
upset you or anyone else. Please do not let the fact that we disagree about
the scope of the 4th Amendment cause you emotional suffering.

I am not ready to declare it overreach, because I do not know all of the
evidence yet. This is why I have been saying things like "Do we have reason to
believe that there is evidence of any pending crimes or any old unsolved
crimes on the phone?" and "did they say or hint that they had been in contact
with that group" and "I have not followed the news on this shooting, so I
would not be shocked if the answer were 'yes, there is some evidence of a
conspiracy'."

If there is no such evidence, I do think it is overreach, but my opinions on
policy are not fixed in stone, and I sometimes change my mind about them when
presented with new arguments, ideas, or philosophies.

> Under what circumstances, if any, would you see as justified a search of
> someone's email? phone? house?

I doubt anyone has a complete enumeration of all circumstances under which
they feel a search is justified. I would feel torn if there was lousy
circumstantial evidence that the phone would solve or prevent crimes, I would
be in support of a warrant if there was strong evidence, and I am opposed to a
warrant with no evidence. One thing I would call strong evidence is a shooter
having announced that he or she was part of a terrorist cell in the US.

I will no longer be reading or responding to your edits that are "edited to
reply". If you want to discuss this with me further, please reply by using the
"reply" button. I will not be editing any of my posts to "edit to reply".

~~~
BWStearns
You keep switching between legal and normative requirements. We disagree on
the 4th amendment in the same way that scientists and climate change deniers
disagree about global warming. You have a fringe understanding of it with no
support from the relevant literature and your arguments about it are poorly
structured, deny evidence, and rely on intentionally misunderstanding context
and terms of art.

The legality of searching for evidence is pretty open and shut because you
need probable cause. The point of a search is to gather evidence, requiring
the evidence that would be the result of a search is obviously a non-starter
as a system.

Shooting a bunch of people and saying you're with ISIS is plenty of probable
cause for a search. I don't see how you're waiting for "all the evidence" here
since all the relevant facts are in and they're sufficient. Whether or not she
was conversing privately with ISIS counterparts would be the resulting
information of the search.

> One thing I would call strong evidence is a shooter having announced that he
> or she was part of a terrorist cell in the US.

The only way to read this in light of our previous discussion is that saying
"I'm in ISIS!" and then shooting up a bunch of civilians is insufficient to
prompt a post-mortem search of the attackers' affairs, instead they need to
say "I'm in ISIS and there are a bunch of us!" and then shoot a bunch of
civilians.

Bravo sir, I have been well and properly trolled.

~~~
jbapple
> You keep switching between legal and normative requirements.

If I did so, it was a mistake. My reference to the 4th Amendment, for
instance, should have said "how the 4th Amendment ought to protect us". I did
not mean to imply that I am trying to predict what warrants the justice system
will or will not grant.

> You have a fringe understanding of it

I think I mentioned the 4th amendment just the once. I have been trying to
stick to normative arguments.

> The point of a search is to gather evidence, requiring the evidence that
> would be the result of a search is obviously a non-starter as a system.

I think this is a point where we truly disagree. I think a system can function
in which some evidence that a search will yield results is required before the
search is conducted. I do not think that the evidence must be airtight. Note
that I am speaking about what I think is possible and just and right, not what
the law says now or the justice system does now.

> The only way to read this in light of our previous discussion is that saying
> "I'm in ISIS!" and then shooting up a bunch of civilians is insufficient to
> prompt a post-mortem search of the attackers' affairs

Did the shooter say she was "in ISIS", or that she pledged allegiance to the
leader? There might be a difference in this case. I have read that there is
religious significance to a pledge of allegiance in ISIS's theology that might
make a pledge indicative of ideological alignment and a membership "in ISIS"
indicative of being in actual conversations with ISIS.

> Bravo sir, I have been well and properly trolled.

Please, let's try to be civil.

~~~
thinkpad20
> Did the shooter say she was "in ISIS", or that she pledged allegiance to the
> leader?

Either one would seem to constitute probable cause for an association. Of
course we don't know if she was actually in ISIS, or just agreed with their
beliefs. But how would we know without conducting further investigation? You
seem to be demanding a somewhat unreasonably large burden of proof, when all
that is needed in this case is probable cause. Frankly, even if she hadn't
verbally declared allegiance to ISIS, I don't think it's a stretch to say
there's probable cause for connection to other terrorist groups. The fact that
she did say that makes it a slam dunk.

> I think a system can function in which some evidence that a search will
> yield results is required before the search is conducted. I do not think
> that the evidence must be airtight.

We do have such a system. The evidence you're describing is called probable
cause, and that's the whole point. I'm not sure of any reasonable definition
of probable cause that this situation wouldn't satisfy. Moreover, your
objections seem to be in the form of vague misgivings rather than concrete
arguments. You haven't precisely described what would constitute sufficient
evidence for an investigation, but instead seem to just be saying "there's not
enough right now." I think this is what's behind GPs frustrations responding
to your posts.

------
rburhum
So if I get this right, they want to (1) disable the wipe feature after X
retries (therefore enabling unlimited retries) and (2) enable submitting
attempts via the physical connector, Wi-Fi, or Bluetooth (therefore enabling a
brute-force approach). What good is an encrypted filesystem in that scenario?

~~~
danielweber
Plenty of good if you have a reasonable passphrase and the vendor hasn't been
compelled to assist.

"Can only try 10 times" isn't anything guaranteed by encryption. My laptop has
an encrypted partition, but an attacker can brute-force it at will. Even if I
had software to say "only let it happen 10 times, then erase the partition"
the whole drive could just be cloned. That's why I have a 20+ character
passphrase.
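
For a sense of scale, here's a back-of-the-envelope sketch of offline brute
force against a cloned, encrypted drive; the guess rate and character-set size
are assumptions for illustration, not measurements:

    # Rough worst-case time to exhaust all passphrases of a given length.
    GUESSES_PER_SECOND = 1e9   # assumed attacker throughput
    CHARSET_SIZE = 70          # assumed: upper/lower case, digits, symbols

    def years_to_exhaust(length: int) -> float:
        keyspace = CHARSET_SIZE ** length
        return keyspace / GUESSES_PER_SECOND / (3600 * 24 * 365)

    for n in (8, 12, 20):
        print(f"{n:2d} chars: ~{years_to_exhaust(n):.2e} years")
    # At this rate 8 characters falls within days, while 20+ characters is far
    # beyond any realistic attacker.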

~~~
tptacek
Apple goes _way_ out of their way to avoid scenarios where they can be
compelled to subvert iOS security. For instance, see pg44+ of the iOS security
white paper:

[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

... the HSMs that manage the escrow scheme for credentials stored in iCloud
are themselves rigged to blow up on 10 failed tries, and, not only that, but
the code that implements that process is burned into the HSMs and the keys
Apple would need to change that logic have been destroyed.

~~~
niij
Thank you for the link. I was unaware how seriously Apple takes their
security. Self-destructing HSMs to avoid brute-forcing is extremely
impressive. THIS is a model of how to implement proper key escrow.

------
cant_kant
The key parts of the Federal order:

"Apple's reasonable technical assistance shall accomplish the following three
important functions:

(1) it will bypass or disable the auto-erase function whether or not it has
been enabled;

(2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for
testing electronically via the physical device port, Bluetooth, Wi-Fi, or
other protocol available on the SUBJECT DEVICE and

(3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE,
software running on the device will not purposefully introduce any additional
delay between passcode attempts beyond what is incurred by Apple hardware.

Apple's reasonable technical assistance may include, but is not limited to:
providing the FBI with a signed iPhone Software file, recovery bundle, or
other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE.

The SIF will load and run from Random Access Memory and will not modify the
iOS on the actual phone, the user data partition or system partition on the
device's flash memory. The SIF will be coded by Apple with a unique identifier
of the phone so that the SIF would only load and execute on the SUBJECT
DEVICE.

The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery
mode, or other applicable mode available to the FBI. Once active on the
SUBJECT DEVICE, the SIF will accomplish the three functions specified in
paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a
government facility, or alternatively, at an Apple facility; if the latter,
Apple shall provide the government with remote access to the SUBJECT DEVICE
through a computer allowing the government to conduct passcode recovery
analysis.

If Apple determines that it can achieve the three functions stated above in
paragraph 2, as well as the functionality set forth in paragraph 3, using an
alternate technological means from that recommended by the government, and the
government concurs, Apple may comply with this Order in that way."

------
hardmath123
Robert Graham (Errata Security)'s notes on this:
[http://blog.erratasec.com/2016/02/some-notes-on-apple-
decryp...](http://blog.erratasec.com/2016/02/some-notes-on-apple-decryption-
san.html#.VsPuiFIq1Ec)

------
kirykl
The implications are quite important for future technologies. Neural implants
for example. Neural implants are currently used for prosthetics and paralysis.
A forced backdoor would kill all research to develop a co-processor directly
linked to the brain. Who would want a government backdoor directly to the
brain

------
zekevermillion
If Apple is capable of compromising security on its devices (by using its root
key to sign a custom version of iOS, or through some other method), then I see
no way that they will avoid eventually being subject to a court order in some
jurisdiction that compels this action. If that's true, then device security is
_already_ compromised and Apple knows this. Let's say the facts of the case
were slightly different, that the FBI "knows" a terrorist attack is about to
occur, and Jack Bauer-style demands that Apple assist in compromising a
specific device that has the top seekrit plans on it. In that instance, do you
think Apple would comply with a warrantless request for cooperation? Hm...

Reading Tim Cook's announcement in light of this thought experiment, methinks
he doth protest too much! Apple does not have any objection to compromising
user security at the root level, and in fact has already done so by creating a
device that has some limited vulnerability to malicious action by the
manufacturer signed with its root key. (By the way, no doubt every other
manufacturer has done worse, so this is not to deprecate Apple vs. any other
big company.)

I would speculate that Tim Cook's goals with this announcement are largely PR-
based, and that the goal of Apple's legal strategy is not to avoid cooperation
but rather to retain the ability to decide _whether_ to cooperate, and/or to
impose a higher perceived cost on the government for such requests. No doubt
Apple is correct to say that once a precedent is established, then it will be
widely used by law enforcement even in routine cases.

At the end of the day, I am not optimistic that we can avoid a world in which
large device manufacturers are compelled (legally and practically) to build
security flaws into their devices. Perhaps not the flaw of a back-doored
crypto implementation, but other flaws such as those that have been identified
in current iOS devices that allow the government (with commitment of
sufficient resources) to chip away at some of the more superficial
protections.

------
Myrmornis
Why the worry about auto-wiping? Is it not possible to make a copy of the
encrypted data and then play around with it as much as you want?

~~~
scottcanoni
Can someone answer this? Raw read the memory to an external device and then
brute force that shit using super computers until it cries.

~~~
X-Istence
If you can break AES... then the NSA would love to have a word with you :P

The FBI is going after the lowest-hanging fruit: the user's password that was
used to create the crypto key.

~~~
superuser2
The user's password is _not_ used to create the crypto key; it is randomly
generated and burned in at the factory.

~~~
X-Istence
It is used to create the crypto key, via a password-based key derivation
function: the user's password is fed into the PBKDF, and the output is the key
used for encryption/decryption.

The user's device key is mixed into that PBKDF as well. Without both parts of
the equation, you have nothing.

For your reading enjoyment:
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

Specifically page 11 the diagram at the bottom.
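
A minimal sketch of that idea in Python; the salt, iteration count, and the way
the device key is mixed in are illustrative assumptions rather than Apple's
actual parameters (on the device, the hardware UID is tangled in by the AES
engine and never leaves the silicon, which is why the derivation can only run
on the phone itself):

    # Sketch: stretch a short passcode into a 256-bit key, tied to this device.
    # Parameters are placeholders, not Apple's real values.
    import hashlib

    def derive_key(passcode: str, device_key: bytes, iterations: int = 100_000) -> bytes:
        # Here the per-device key simply serves as the PBKDF2 salt.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_key,
                                   iterations, dklen=32)

    key = derive_key("123456", device_key=b"per-device secret burned into silicon")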

------
blakecaldwell
Does Apple get to bill the FBI for the time that their engineers and legal
department will be busy on this request?

~~~
savoytruffle
Yes. In the court order section 5 (on line 8 of page 3) "Apple shall advise
the government of the reasonable cost of providing this service."

Perhaps only engineering resources. (it's always gonna be a big legal hassle)

~~~
bb88
A reasonable cost may still be $150 to $200 per hour. Engineering time charged
to a third party is never cheap.

------
elgenie
"""To the extent that Apple believes that compliance with this Order would be
unreasonably burdensome, it may make an application to this Court for relief
within five business days of receipt of the Order."""

If what Apple's security guides claim is true, "unreasonably burdensome"
should be an easy standard to meet on practical technical feasibility grounds.
The issue is whether they'll want to challenge this on non-technical grounds.

------
SideburnsOfDoom
So, Apple says that "the FBI wants us to make a new version of the iPhone
operating system, circumventing several important security features, and
install it on an iPhone recovered during the investigation."

[https://www.apple.com/customer-letter/](https://www.apple.com/customer-
letter/)

If it's possible to make such a "backdoored" build of iOS, then there are
state actors who will be throwing $Millions at doing it already, with or
without any willing help from Apple.

~~~
ascorbic
I'm guessing this would need to involve stealing Apple's private code signing
key.

~~~
SideburnsOfDoom
This is mentioned here:
[https://news.ycombinator.com/item?id=11116390](https://news.ycombinator.com/item?id=11116390)

------
kevinastone
Why wouldn't the FBI just clone the phone disk contents and crack the
encryption on more dedicated systems?

~~~
Mandatum
Are there tools to dump and resume iPhone/Android states? Could easily dump
state, null "tryCounter++", and resume cracking?

~~~
superuser2
The PIN is an input to the HSM, where the key actually lives. HSMs are
designed from the silicon up to resist inspection or modification of internal
state. There are certainly no baked-in debugging/JTAG interfaces, and the
hardware is designed to blow away the key if the chip is under physical or
logical attack.

------
nanocyber
I thought this was an excellent write-up regarding how the iOS security
platform (recent iPhone models) works from someone obviously in the know, as
posted in the forums of Apple Insider. (Source:
[http://forums.appleinsider.com/discussion/191851](http://forums.appleinsider.com/discussion/191851))

" Apple uses a dedicated chip to store and process the encryption. They call
this the Secure Enclave. The secure enclave stores a full 256-bit AES
encryption key.

Within the secure enclave itself, you have the device's Unique ID (UID). The
only place this information is stored is within the secure enclave. It can't
be queried or accessed from any other part of the device or OS. Within the
phone's processor you also have the device's Group ID (GID). Both of these
numbers combine to create 1/2 of the encryption key. These are numbers that
are burned into the silicon, aren't accessible outside of the chips
themselves, and aren't recorded anywhere once they are burned into the
silicon. Apple doesn't keep records of these numbers. Since these two
different pieces of hardware combine together to make 1/2 of the encryption
key, you can't separate the secure enclave from its paired processor.

The second half of the encryption key is generated using a random number
generator chip. It creates entropy using the various sensors on the iPhone
itself during boot (microphone, accelerometer, camera, etc.) This part of the
key is stored within the Secure Enclave as well, where it resides and doesn't
leave. This storage is tamper resistant and can't be accessed outside of the
encryption system. Even if the UID and GID components of the encryption key
are compromised on Apple's end, it still wouldn't be possible to decrypt an
iPhone since that's only 1/2 of the key.

The secure enclave is part of an overall hardware based encryption system that
completely encrypts all of the user storage. It will only decrypt content if
provided with the unlock code. The unlock code itself is entangled with the
device's UDID so that all attempts to decrypt the storage must be done on the
device itself. You must have all 3 pieces present: The specific secure
enclave, the specific processor of the iphone, and the flash memory that you
are trying to decrypt. Basically, you can't pull the device apart to attack an
individual piece of the encryption or get around parts of the encryption
storage process. You can't run the decryption or brute forcing of the unlock
code in an emulator. It requires that the actual hardware components are
present and can only be done on the specific device itself.

The secure enclave also has hardware enforced time-delays and key-destruction.
You can set the phone to wipe the encryption key (and all the data contained
on the phone) after 10 failed attempts. If you have the data-wipe turned on,
then the secure enclave will nuke the key that it stores after 10 failed
attempts, effectively erasing all the data on the device. Whether the device-
wipe feature is turned on or not, the secure enclave still has a hardware-
enforced delay between attempts at entering the code: Attempts 1-4 have no
delay, Attempt 5 has a delay of 1 minute. Attempt 6 has a delay of 5 minutes.
Attempts 7 and 8 have a delay of 15 minutes. And attempts 9 or more have a
delay of 1 hour. This delay is enforced by the secure enclave and can not be
bypassed, even if you completely replace the operating system of the phone
itself. If you have a 6-digit pin code, it will take, on average, nearly 6
years to brute-force the code. A 4-digit PIN will take almost a year. If you
have an alpha-numeric password the amount of time required could extend beyond
the heat-death of the universe. Key destruction is turned on by default.

Even if you pull the flash storage out of the device, image it, and attempt to
get around key destruction that way it won't be successful. The key isn't
stored in the flash itself, it's only stored within the secure enclave itself
which you can't remove the storage from or image it.

Each boot, the secure enclave creates its own temporary encryption key, based
on its own UID and random number generator with proper entropy, that it uses
to store the full device encryption key in ram. Since the encryption key is
also stored in ram encrypted, it can't simply be read out of the system memory
by reading the RAM bus.

The only way I can possibly see to potentially unlock the phone without the
unlock code is to use an electron microscope to read the encryption key from
the secure enclave's own storage. This would take considerable time and
expense (likely millions of dollars and several months) to accomplish. This
also assumes that the secure enclave chip itself isn't built to be resistant
to this kind of attack. The chip could be physically designed such that the
very act of exposing the silicon to read it with an electron microscope could
itself be destructive."

~~~
rogeryu
Thanks for this explanation. Does Android have anything like this?

~~~
interpol_p
There are probably Android devices which make use of ARM's TrustZone [1].
Apple's Secure Enclave is a bit more thorough, though, because it actually
uses a physically separate co-processor running a custom L4-based microkernel
with a secure boot process. It is hardware isolated from the rest of the
system, and uses a secure mailbox and hardware interrupts to communicate.
Whereas ARM TrustZone appears to be implementable entirely on a single CPU.

[1]
[http://www.arm.com/products/processors/technologies/trustzon...](http://www.arm.com/products/processors/technologies/trustzone/)

------
mchahn
Why can't they just pull the flash memory and work on it directly?

~~~
X-Istence
AES crypto does not lend itself to easy cracking...

------
dschiptsov
This is wrong. Engineers who make encrypted devices should do their best to
make them undecipherable. This is a universal standard - to do your best.

If encryption cannot be broken it means it has been done right, and the
engineers should be held in the highest respect.

Govts, on the other hand, should use appropriate policies, not orders or force
or backdoors.

------
DyslexicAtheist
>> _A magistrate judge, an Apple employee, and an FBI agent agree to meet at a
local bar. Only the Apple employee makes it. Why? Because the bar didn 't have
a back door._

------
tomohawk
The phone in question was not owned by the shooter, but by his employer, who
has consented to the search. This seems like a poor basis on which to contest
the search on privacy grounds.

------
gizi
Big problem. People will generally stop using phones that they suspect are
backdoored. At the same time, it would be a hopeless endeavour for law
enforcement to get a swarm of (Chinese or other Asian) companies to help with
unlocking their phones. They would literally not even answer the phone.
Therefore, this may very well spell the end of highly centralized, Apple-style
companies that can be effectively pressured and browbeaten into "compliance".

------
mariuolo
Do I sound conspiratorial if I suspect this is all PR and the NSA already has
ways to get around the encryption?

It would entice $EVILDOERS to use a compromised platform.

------
fab13n
Fortunately for the future, this kind of attack can be thwarted through key
stretching (making each attempt intrinsically long to perform, by making it
computationally expensive).

I expect to see an optional, configurable key stretching setup in future
phones, for those whose privacy is worth a couple of seconds' delay when
unlocking their phones.
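
A sketch of what the configurable part could look like: raise the work factor
until a single derivation costs roughly the unlock delay the user is willing to
pay. The function name and numbers are assumptions, and a real design would
likely use a memory-hard KDF rather than plain PBKDF2:

    # Tune a PBKDF2 iteration count to a target per-attempt delay on this hardware.
    import hashlib
    import time

    def calibrate_iterations(target_seconds: float, step: int = 100_000) -> int:
        iterations = step
        while True:
            start = time.perf_counter()
            hashlib.pbkdf2_hmac("sha256", b"passcode", b"salt", iterations, dklen=32)
            if time.perf_counter() - start >= target_seconds:
                return iterations
            iterations += step

    # e.g. a user who accepts roughly a two-second delay when unlocking:
    # iterations = calibrate_iterations(2.0)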

------
frogpelt
Who goes to jail if Apple flat out refuses?

~~~
bgentry
Nobody would have to go to jail. They'd hold Apple in contempt and charge them
a non-trivial sum of money for every day they refuse. If they continue
refusing, the amount increases exponentially until the company is threatened
with bankruptcy.

In short: they can't refuse without a justified explanation of why they are
unable to comply.

~~~
gambiting
In which case the best way to proceed would be to say that they will try to do
it, and after a few months of investigation say the data became corrupted
before they even got the phone. Who's going to verify it? No one.

------
briankwest
The headlines keep changing: is it crypto, is it auto-erase, is it a purple
unicorn? Sheesh, let's muddy up the actual issue with incomplete or
inconsistent data. It's the American way!

------
curryhowardiso
Literally the one time the west coast would have needed Scalia...

------
venomsnake
Why is no one attacking this at the hardware level? Cut open the processor to
get the GID and UID, dump the flash, pregenerate rainbow tables over the PIN
space, power the flash chip externally and feed in the codes...

Yeah, it is expensive, but I would not be surprised if there are labs that
could provide such a service. Why does the FBI go through such pains?

~~~
BinaryIdiot
While I'm also curious why this isn't possible (as per everyone else's
comments), the phone in question doesn't even have the same level of security
so does it even have a dedicated chip with the GID and UID?

------
tosseraccount
It's not his phone. He worked for the government. It belongs to the government.

------
j1vms
It's at times like these they're surely knocking on the door of every company
whose R&D in quantum computing, information theory and algorithms they've been
funding for at least the past 2 or so odd decades. "So, is it ready yet?"

~~~
dmak
I'm assuming you are referring to quantum computing for its speed of
computation? That wouldn't make a difference here. They have only X number of
tries before the phone locks them out. It is the number of tries that is the
issue here.

~~~
mmel
From my understanding, if you had for example a 128-qubit quantum computer, it
would be able to crack any 16 character password in a single operation.

~~~
BinaryIdiot
It's not really a single operation though. How would that quantum computer
test against an iPhone when each test is a mark against the 10-attempt limit?
An iPhone is still based in standard silicon using bits; you can't exactly pop
a processor that uses qubits into it and expect it to work.

------
empressplay
It bothers me that Tim Cook lied: he stated in his open letter that if they
provided the modified OS it could be used on other phones, but the court order
specifically says Apple should make the software only work on the specific
phone in question.

~~~
cp9
the second software like this exists, signed with Apple's release key, is the
second they can no longer control it

------
dplgk
The 5th Amendment protects evidence inside the brain of the accused. As
devices become more and more an extension of the brain, the more I think
we'll need to adjust the rules of the 5th Amendment to cover things outside of
the brain.

~~~
BWStearns
Think about what we want as a society though. If you did commit a crime, we
want to convict you. The reason the 5th protects you from compelled self-
incrimination is to prevent unjust interrogation and investigation techniques
from the cops[0], not to make the overall likelihood of conviction lower,
although it incidentally has this effect.

From a policy standpoint the ideal world would be one where all criminals are
convicted and no innocents are convicted. In our system we _try_ [see 0 again]
to err on the side of letting criminals go rather than convicting innocents.

One of the interesting things about the new normal of everything being
recorded always and everywhere is how this might change our laws. Laws are
supposed to accomplish something, speeding law is supposed to improve road
safety for example. If automatic enforcement instead imposes massive taxes on
everyone while not improving road safety, society (should) modify the law
since it didn't get them to where they wanted to go.

Sorry, long-winded way of saying: our electronic artifacts aren't really all
that different from letters. If I wrote an email to my friend demonstrating
premeditation in murdering your puppy, you and everyone else have an interest
in that email being valid evidence. That's a wholly separate discussion from
the encryption debate. I definitely think you're right that we need to think
about it, but I just wanted to play devil's advocate here for why we might want
to think twice.

[0] Yes, shit still happens, but the system is SUPPOSED to limit these abuses.

~~~
Lawtonfogle
>If you did commit a crime, we want to convict you.

There are too many bad laws out there. If I meet a bad guy, I can defend
myself. If I cross the government, I'm doomed. I'd much rather support things
that increase my likelihood of meeting a bad guy I can defend myself against
while reducing my likelihood of crossing the government and being doomed
without recourse. This is the whole reason I hold that it is better to let
99 guilty go free than to jail 1 innocent (assuming the government doesn't
also take away my ability to defend myself, which it does seem intent on
doing).

~~~
BWStearns
It is unjust to jail Rick for jay walking and ignore that Morty jay walks. A
system of partial enforcement is not intrinsically more just because we
haven't jailed Morty. It's actually unjust because if Morty is free to jay
walk then how can we justify locking up Rick if jay walking is the crime? We
probably just like Rick less.

Partial enforcement is a practical reality because we're willing to accept the
injustice of partial enforcement rather than live in an Orwellian police state
(a decision I'm thrilled with!).

> There are too many bad laws out there.

That's the point! If bad laws stay on the books and aren't generally enforced
then they can be enforced capriciously as punitive weapons by government
officials. If bad laws are fully enforced, they won't stay on the books for
very long.

This is why it is in the best interest of a society to have as high an
enforcement rate as possible before increasing false positives. It discourages
bad laws from existing in the first place and surviving in the second.

~~~
Lawtonfogle
>Partial enforcement is a practical reality because we're willing to accept
the injustice of partial enforcement rather than live in an Orwellian police
state (a decision I'm thrilled with!).

False dichotomy. If all laws are fully enforced, bad laws will soon be done
away with by popular demand. If the government won't do so willingly, then
they will be forced to do so by the people.

~~~
BWStearns
Can you clarify? I just said that bad laws would go away with full
enforcement, which is the core of why it is good for a society to enforce its
laws evenly and thoroughly.

I don't think there's any false dichotomy produced by simultaneously noting
that on a practical level you will never achieve literally 100% enforcement of
laws without some serious damage to civil liberties which is why we err on the
side of partial enforcement as the lesser of two evils.

Edit: Is the issue the use of the phrase "accept the injustice of partial
enforcement"? I was trying to communicate that there is some injustice in not
fully applying the law (the fact that some murderers go untried is unjust for
instance). However we're willing to accept that because getting literally 100%
enforcement would require things we don't want. When I say full enforcement as
something to aspire to I'm not suggesting literally 100%, I'm referring to the
way we roughly fully enforce murder laws and do not fully enforce drug laws.
One of those crimes (more or less) gets treated as a crime regardless of who
commits it, the other does not. I don't think we'd tolerate drug laws as they
are if they were enforced with the same degree of universality as murder laws.

~~~
Lawtonfogle
>I don't think there's any false dichotomy produced by simultaneously noting
that on a practical level you will never achieve literally 100% enforcement of
laws without some serious damage to civil liberties which is why we err on the
side of partial enforcement as the lesser of two evils.

My point being that there is a third option: the laws will be removed. And it
seems like in part of your post you agree with this, but in another part you
act like this isn't an option. I'm kinda confused in that regard.

~~~
BWStearns
Sorry, I was specifically pointing to the Orwellian nightmare as the false
option. It's like "Cake or Death?" Cake is really the only thing people are
going to pick.

------
EGreg
I always wondered why more people don't go around bricking iPhones by entering
the wrong pin several times. Same goes for any other lockout. Why not do this
to someone famous by constantly logging in as them from a botnet?

~~~
X-Istence
Because you need physical access to the device to put in a pin?

------
gcb0
this is a smoke screen. Purely.

they can already desolder the flash memory chips and brute-force the data,
programmatically no less, all they want.

~~~
tlrobinson
Are you suggesting they can brute force AES-256?

iOS's security is quite sophisticated:
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

~~~
gcb0
well, now that you mention...

i was commenting only on the article that says they wanted apple to remove the
"wipe data after X tries"

~~~
gcb0
actually, no. most people use 5-digit PINs. why break anything other than
that?

------
samfisher83
Can't they use the dead guy's fingerprint?

~~~
savoytruffle
The phone in question is an iPhone 5c that doesn't have a fingerprint sensor.

Also even if it did, simply rebooting the device will require a complex
passcode instead of a stored fingerprint.

------
exabrial
Ok someone murders a bunch of defenseless people... Why is Apple dragging
their feet? This is tasteless. I'm NOT for backdoors, but this is ridiculous.

~~~
bbatsell
> I'm NOT for backdoors

I think you've just learned that you are, in fact, for backdoors. A backdoor
doesn't become any less of a backdoor when it's only used against bad people.

~~~
exabrial
No, don't put words into my mouth. I think Apple should go to some
extraordinary means to assist here. Any system is hackable if you have physical
possession and control over the input.

~~~
Natsu
It's true that we haven't seen perfection yet, but there are tamper-resistant
devices where the above is not trivially true. If each device were protecting
the same keys or we just needed to break the platform, I might be more
inclined to agree, but given that they have individual keys and a failed break
could leave the data unobtainable, I'm not so confident.

That said, I think this particular phone doesn't have the secure enclave, so
it may be breakable here.

------
doggydogs94
If Apple can jailbreak the phone in its current state, Apple (or the NSA) may
be able to help.

------
ghettoimp
This order says Apple must...
\- bypass the auto-erase feature
\- enable the FBI to "submit" passcodes
\- not purposefully introduce additional delays

I don't see that this requires Apple to do anything in particular with
whatever passcodes the FBI submits.

bool tryPasscode (string passcode) { return false; }

Reasonable cost of service: $5?

~~~
ncallaway
Judges are not strict compilers. They will interpret "submit passcodes" to
mean what a lay person would read it to mean: if the submitted passcode is
correct, the phone will be unlocked.

Games like this will get you contempt of court. I don't know if an obstruction
of justice charge could come out of this, but I wouldn't test it myself...

------
yarou
They're effectively asking for a backdoor, plain and simple. I'd be highly
surprised if Apple complied with this court order.

Even if they removed said feature, the only way to decrypt the FS would be if
the owner had a weak passcode.

Can somebody explain to me how this warrant is not a direct violation of this
individual's 4th amendment rights?

This seems like yet another case where the rights guaranteed by the
Constitution are selectively applied based on your skin color.

~~~
jack9
> Can somebody explain to me how this warrant is not a direct violation of
> this individual's 4th amendment rights?

The person is dead, the request is reasonable, and the 4th isn't even
applicable (depending on the interpretation of "make a backdoor in a product"
as opposed to "we're looking at someone's data they own on a device they own").
Even a very liberal interpretation of the 4th doesn't apply here, since the
rights violation would be the act of accessing the data, not merely asking to
have a way to look at it.

Techdirt is full of fiction by design, so I'm not surprised by the confusion.

