
Answers to your questions about Apple and security - mrexroad
http://www.apple.com/customer-letter/answers/
======
panarky
This FAQ has more than 1000 words, but these are the words that matter:

    
    
      The only way to guarantee that such a powerful tool isn’t abused
      and doesn’t fall into the wrong hands is to never create it.
    

Can a court force a company or an individual to create something that does not
exist?

China required Google to actively censor search results about sensitive
topics, and Google quit China. (They may now be heading back [1].)

Bing stayed in China and silently replaced their organic results with
government-approved propaganda [2].

The best way to prevent governments from oppressing their citizens is to
refuse to create tools that enable oppression.

[1] [http://www.theatlantic.com/technology/archive/2016/01/why-
go...](http://www.theatlantic.com/technology/archive/2016/01/why-google-quit-
china-and-why-its-heading-back/424482/)

[2] [http://www.theguardian.com/technology/2014/feb/11/bing-
censo...](http://www.theguardian.com/technology/2014/feb/11/bing-censors-
chinese-language-search-results)

~~~
zeveb
> This FAQ has more than 1000 words, but these are the words that matter:

> > The only way to guarantee that such a powerful tool isn’t abused and
> doesn’t fall into the wrong hands is to never create it.

And of course, that's also wrong. The only way to guarantee that such a
powerful tool isn't abused and doesn't fall into the wrong hands is to make
it impossible for such a tool to exist, not to refuse to create it.

Right now, today, Apple has the ability to create such a tool. Some finite
number of human beings at Apple have the ability to create such a tool on
their own initiative, e.g. were they disgruntled.

It should be _impossible_ for Apple, or disgruntled Apple employees, or any
nation state, to create such a tool.

Otherwise, it will eventually be created, because if something is possible,
then it is eventually probable.

~~~
Spooky23
The real answer is to do a Putin -- buy a typewriter.

If you're doing something that will attract negative political attention, mail
a letter.

~~~
cmurf
USPS logs all mail, i.e. metadata.
[http://www.nytimes.com/2013/07/04/us/monitoring-of-snail-
mai...](http://www.nytimes.com/2013/07/04/us/monitoring-of-snail-mail.html)

~~~
Spooky23
So what?

It's pretty trivial to obscure that metadata, and frankly the content of your
correspondence is more important than the metadata.

Your mail has fairly robust legal protection, and you can spend more money
(i.e. Registered mail) to provide a higher level of tamper evidence and
accountability in transit.

------
af16090
One thing that I don't think has been covered enough in this whole debate
about forcing Apple to unlock the iPhone is that "Farook and his wife
destroyed their personal iPhones, and the hard drive from their computer was
removed and has never been found"[1]. The iPhone the FBI is after is one that
was issued to him by his employer. It seems to me to be very unlikely that
Farook and his wife would go to the trouble to destroy all their other
electronics but somehow forget to destroy his work phone (assuming his work
phone had incriminating information on it in the first place).

[1]: [http://www.usatoday.com/story/opinion/2016/02/18/apple-
court...](http://www.usatoday.com/story/opinion/2016/02/18/apple-court-order-
iphone-fbi-syed-farook-editorials-debates/80572492/)

~~~
peter303
I am guessing they have the call records from the phone company and found
something interesting enough to go through all this effort.

We don't know what they found in the lake.

~~~
jonlucc
What would they get? Best case, there is a contact name associated with a
phone number they think is suspicious? Wouldn't that have been backed up on
iCloud?

Also, what's the endgame? They want to convict the dead guy? They want to
convict the neighbor who sold them guns, maybe?

------
dkopi
"Law enforcement agents around the country have already said they have
hundreds of iPhones they want Apple to unlock if the FBI wins this case. In
the physical world, it would be the equivalent of a master key, capable of
opening hundreds of millions of locks. "

Actually, the master key, the backdoor, already exists. The master key is
Apple's ability to sign a new version of iOS and update the software on a
locked phone.

The Federal government isn't asking Apple to create a backdoor. They're asking
Apple to use the backdoor that already exists.

~~~
WireWrap
> The Federal government isn't asking Apple to create a backdoor. They're
> asking Apple to use the backdoor that already exists.

Basically. Unfortunately, most of the reporting is focused on the payload
Apple is being asked to create and doesn't draw enough attention to that
"existing backdoor" that will allow such a payload to be successfully
installed.

Eliminating that "existing backdoor" should be a priority. I see some, here,
expressing the thought that Apple might be working on that. I think Apple
needs to be pressed, hard, on that very subject.

~~~
icebraining
Supposedly, the backdoor has already been fixed by the Secure Enclave, which
was included in newer models of the iPhone.

~~~
mikeash
Supposedly, and also supposedly the Secure Enclave doesn't defend against
this, depending on who you ask and how they're speculating. I've yet to see
anything authoritative on this from someone in a position to know for sure.

The good news is that if you have a good password, not just a simple numeric
passcode, you should be safe against this sort of thing regardless, unless the
authorities can coerce or trick you into revealing your password.

------
mikeash
The question I'm most interested in is whether newer iPhones are still subject
to this attack. Unless I missed it, no answer to that question is presented
here.

It sure feels to me like Apple is dancing around that issue. I'm betting that
newer iPhones are still vulnerable, and Apple is a bit embarrassed at dropping
the ball there. (The Secure Enclave stuff doesn't necessarily protect against
this attack, it depends on how it's implemented and the official documentation
doesn't quite say.)

If nothing sooner, it's going to be interesting to see what happens in the
fall when iOS 10 and the iPhone 7 presumably will ship, along with a new
version of Apple's iOS security guide. Diffing that with the 2015 edition
could prove quite educational.

~~~
ilyanep
I am not a security expert, but my understanding is that the main problems for
the FBI are: a) There is a timed delay for trying new passwords after enough
unsuccessful attempts and b) The phone will wipe itself after 10 unsuccessful
attempts.

It's pretty much sufficient to have (a) alone, since the delays will make it
take years to guess the password by brute force.
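For a sense of how much time (a) alone buys, here's a back-of-envelope sketch assuming the escalating delay schedule described in Apple's iOS Security Guide (1 minute after the 5th failed attempt, 5 minutes after the 6th, 15 minutes after the 7th and 8th, 1 hour from the 9th onward); it ignores the per-guess key-derivation time, so treat it as a lower bound:

```python
# Escalating lockout delays (seconds) after the Nth failed attempt,
# per Apple's published schedule; attempts 9 and later wait 1 hour each.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 900}

def worst_case_days(keyspace: int) -> float:
    """Worst-case wall-clock time to try every passcode in the keyspace."""
    total_seconds = 0
    for attempt in range(1, keyspace + 1):
        total_seconds += 3600 if attempt >= 9 else DELAYS.get(attempt, 0)
    return total_seconds / 86400

print(round(worst_case_days(10_000)))           # → 416 days for 4 digits
print(round(worst_case_days(1_000_000) / 365))  # → 114 years for 6 digits
```

So even a 4-digit code takes over a year if the delay schedule can't be disabled, which is exactly what the FBI wants removed.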

I just ctrl-f'd for "delay" in the security guidelines[0], which claim that
the secure enclave is the one that enforces the timed delay[1], so I guess the
only attack vector would be if you could somehow backdoor code onto the chip.
I can't find anything in the guide from a quick skim, but I'd suspect the code
is on a ROM chip or is somehow prevented from an upgrade without an unlock?

[0]
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)
[1] Page 12

~~~
mikeash
> I can't find anything in the guide from a quick skim, but I'd suspect the
> code is on a ROM chip or is somehow prevented from an upgrade without an
> unlock?

This is the big unanswered question. I suspect the same, but so far have not
been able to find anything that actually says so. The code is not in ROM, as
it can be updated, but it could wipe data if updated without an unlock. But I
can't find anything saying whether it actually _is_. The mere fact that it's
possible isn't enough, and their silence is a bit odd.

~~~
ilyanep
Seems like it'd be a pretty huge oversight if that vector were open, but I
agree with you that it's not very confidence-inspiring if nothing about this
is out there.

~~~
mikeash
Depends on your threat model. It's possible that the Secure Enclave was just
intended to be a defense in depth against malware and common criminals, not
something that could keep Apple themselves out. If so, I'm sure they're re-
evaluating that now.

------
jonpaine
Can someone ELI5 a typical opinion in support of the Government's case? I've
read through various comments and I haven't seen a concise opinion in favor
and am genuinely curious.

Does it boil down to (1) trust that the Government won't abuse the existence
of the tool and (2) trust that the tool will never be leaked?

Or is it more fundamental - that the target data is so valuable that the ends
justify the means?

I know it's more nuanced than that, but I think, in particular, someone's
view on the All Writs component just follows their view on the above in most
cases.

[edit]: I'm considering this a research sub-thread, not a debate sub-thread.
Trying to understand, not convince. So forgive me for not responding one way
or the other.

~~~
grecy
I'm staying with a friend who is a retired LEO, and he strongly believes
Apple should/must comply. He wants big jail time if they don't.

He thinks it's a matter of safety because of terrorism. 14 people were killed,
therefore the ends justify the means, as you say.

We got into it the other night, and I think the case boils down to whether we
should be allowed to have secrets from the government, secrets the government
can never unlock, no matter what. He feels strongly that we should not be
permitted to do so, because when a legal warrant is written, the government
must have access to everything.

He also doesn't believe the Snowden leaks, and thinks that when it comes to
the pursuit of justice, the government should be trusted with everything.

~~~
vlod
A case I heard recently: if an undercover CIA operative were captured in,
e.g., China (with an iPhone), should Apple be compelled to break the security
if demanded by the Chinese government?

~~~
dopamean
I think people would argue that since Apple is not a Chinese company they'd be
ok with Apple ignoring the request.

~~~
jameshart
I'm sure Apple could just set up its iPhone production lines somewhere else in
no time at all, right?

------
kingnothing
In all likelihood, the NSA has already created this hacked version of iOS that
does exactly what the FBI is requesting. The government can probably already
get the data off of the phone if that's all they're after. But it probably
isn't. I imagine all of this is so the government can try to set precedent in
order to publicly use that technology in courtrooms and investigations instead
of having to keep it hidden from the world.

~~~
philip1209
What would the NSA's motivation be for creating this? They are mainly SIGINT.
This software is only useful when you are in physical possession of a phone.

~~~
sesutton
Some of the Snowden leaks showed that the NSA was intercepting servers in
transit and installing firmware with a backdoor.

------
pilif
> _The government says your objection appears to be based on concern for your
> business model and marketing strategy. Is that true?_

> _Absolutely not. Nothing could be further from the truth. This is and always
> has been about our customers._

so it _is_ about marketing :-)

All joking aside though, I agree strongly with this document and I'm both a
bit surprised and very happy about their detailed arguments and about the
passion they put into this issue.

As a customer I'm happy to see that they are really fighting for me and not
giving in, even to the point of refusing a comparatively reasonable request
out of fear of producing a precedent.

~~~
cthulhujr
>so it is about marketing

I read that line the same way. Underneath everything, their business isn't
privacy; it's making money. They just happen to take a strong public stance on
customer privacy. That seems really difficult to explain to readers without
being patronizing.

------
ericfrederich
> Is it technically possible to do what the government has ordered?

>>Yes, it is certainly possible to create an entirely new operating system to
undermine our security features as the government wants.

So... something to fix in the next release. Apple could be doing this all
along. Maybe they've already done it in the past via a FISA warrant.

My point is, if you rely on software for security, and that software can be
"upgraded" at any time by the manufacturer, it's a problem. This is the
definition of a back door. They could design their OS so that it has to be
unlocked to "upgrade", but they didn't...

~~~
endergen
What you said. They do not address this in this letter.

------
interpol_p
This letter comes off very strongly. It's as if they treat the government as
just another customer. The way they describe their process for assisting law
enforcement almost reads like their process for providing developer support:

> _We also provide guidelines on our website for law enforcement agencies so
> they know exactly what we are able to access and what legal authority we
> need to see before we can help them._

The concerning portion of this letter is:

> _Yes, it is certainly possible to create an entirely new operating system to
> undermine our security features as the government wants._

I have a feeling Apple is currently working on preventing updates to the
Secure Enclave ROM from happening while a phone is locked (or at least
ensuring the keys are wiped if it does happen while a phone is locked).

~~~
saryant
> We also provide guidelines on our website for law enforcement agencies so
> they know exactly what we are able to access and what legal authority we
> need to see before we can help them.

Why is that sentence objectionable? Every tech company has guidelines like
that on their website.

~~~
kogus
He didn't say it was objectionable. He just said it treats the government like
a customer. I, for one, would like to live in a country where the government
is a servant, not a master.

------
atourgates
One thing I don't understand is, what prevents the FBI (or Apple or anyone
else) from duplicating the contents of the iPhone in question to a virtual
machine, then trying the 10,000 possible 4-digit unlock combinations on
virtual machines until they find the correct one?

I assume that since this seems like a fairly easy solution that it's not
possible, but what makes it not possible?

~~~
vlod
Not sure if 100% applies to the iPhone in question, but the secure enclave was
designed to prevent this sort of thing. Here's an intro to it:

[https://www.mikeash.com/pyblog/friday-qa-2016-02-19-what-
is-...](https://www.mikeash.com/pyblog/friday-qa-2016-02-19-what-is-the-
secure-enclave.html)

~~~
jml7c5
This applies to later models of iPhone; the 5c doesn't have a secure enclave.
All password attempt limiting and erasure of data is implemented by the
operating system.

~~~
mikeash
The 5c still has the secret UID baked into the chip.

The escalating artificial delays are implemented by the OS and can be
circumvented, but the secret UID is designed to make it impossible to extract.

Without that UID, you're brute-forcing a 256-bit AES key, not a 4-6 digit
passcode. Practically, the brute forcing can only be done on the actual
iPhone.
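A toy sketch of why the UID forces brute forcing onto the device itself. PBKDF2 here is only a stand-in for the hardware tangling function Apple actually uses, and the passcode, iteration count, and variable names are all made up for illustration:

```python
import hashlib
import os

# Stand-in for the 256-bit UID fused into the chip at manufacture: the
# AES engine can use it, but no software can ever read it out.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    # Apple tangles the passcode with the UID in hardware; PBKDF2 with an
    # arbitrary iteration count is just an illustrative substitute.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 1_000)

# On the device (with the UID available), a 4-digit passcode is only
# 10,000 candidate keys:
target = derive_key("1337")
found = next(p for p in (f"{i:04d}" for i in range(10_000))
             if derive_key(p) == target)
print(found)  # → 1337

# Off the device (without the UID), you face the full 256-bit key space:
print(f"{2**256:.1e}")  # → 1.2e+77 candidate keys instead of 10,000
```

That gap between 10<sup>4</sup> and ~10<sup>77</sup> is why copying the encrypted flash to a VM, as suggested upthread, doesn't help.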

------
madez
When Apple says that they’ve never unlocked a device for the authorities one
should keep in mind that the authorities have and use the power to force
entities to respond this way.

~~~
sillysaurus3
They have the power to force people to keep quiet, but not to lie.

~~~
shostack
But lies of omission are fairly easy to construct, and the government seems
particularly adept at coming up with them.

In this case, I'd feel much better if there were some sort of ToS canary or
something. Does Apple have anything like that?

------
frogpelt
One question for the FBI: why so much investigation over a crime committed by
someone who is dead?

Based on the nature of the crime, I'm guessing there weren't many accomplices.
The one guy who helped him get the guns is probably where they need to be
focusing their investigation and interrogation. If ISIS helped him, how? He
took a gun to work and killed people. That's not an elaborate scheme.

It does not benefit the FBI to lose goodwill with the American people over
this case.

~~~
niels_olson
Something about law enforcement causes practitioners to stop seeing themselves
as peers of the citizenry. It is probably similar to the paternalism
physicians exhibit toward patients despite the fact that they recognize the
problem and train themselves against it.

~~~
user8341116
Also see: Stanford prison experiment.

------
johnrob
Could Apple alter the iCloud account in question such that a login succeeds
with any password or session key? I wonder if there is some variant of that
which would allow the remote backup strategy to work.

~~~
dkokelley
I imagine the issue is that the iPhone has non-backed up information on it.
From what I've read Apple has already turned over the latest backup from ~6
weeks before the shooting. Getting a current backup would require the phone's
hardware to cooperate.

------
specialist
I was surprised to learn that iOS 4 - 8 permits updating the firmware without
first entering the PIN.

Device Firmware Update mode
[https://www.theiphonewiki.com/wiki/DFU_Mode](https://www.theiphonewiki.com/wiki/DFU_Mode)

This technical factoid is relevant to the current discussion; until now, I
did not understand how creating a custom firmware would be useful.

To further ground the discussion, I found this informative:

Legal Process Guidelines U.S. Law Enforcement

[https://ssl.apple.com/privacy/docs/legal-process-
guidelines-...](https://ssl.apple.com/privacy/docs/legal-process-guidelines-
us.pdf)

Whereas before I was firmly against Apple helping to crack the San Bernardino
iPhone, I'm now merely mostly against.

I don't understand how this action can help the FBI. What additional, unique
information could possibly be on this phone that they couldn't discover by
alternate means?

I don't understand how an error by the FBI obligates Apple to clean up their
mess.

My understanding is that iOS 9 changed things so that this kind of forensics
backdoor is no longer possible, mooting this discussion.

FBI should take their lumps and learn from their mistakes.

~~~
ascagnel_
> My understanding is that iOS 9 changed things so that this kind of forensics
> backdoor is no longer possible, mooting this discussion.

The question Apple would like to avoid is if, in the future, creating such a
backdoor-resistant OS will be illegal. iOS 9.2.1 (the latest pre-ruling
release) is legal; however, any future releases of the OS with this capability
may be considered a circumvention of or non-compliance with the ruling.

~~~
specialist
Aha.

I just reread the letter. I would like them to use your direct language.

The use of encryption is a human right (to privacy). I utterly oppose any
measure to curtail my right to privacy.

Arguing about the technical whatnots, and who did what when, as I've done
here, is a trap, to better obscure the fundamentals. And I fell for it.

So thank you. I know it feels like restating the obvious. But I needed to be
reminded of what's at stake here.

------
cballard
> Yes, it is certainly possible to create an entirely new operating system to
> undermine our security features as the government wants. But it’s something
> we believe is too dangerous to do. The only way to guarantee that such a
> powerful tool isn’t abused and doesn’t fall into the wrong hands is to never
> create it.

Is this just security through obscurity, then?

~~~
patch_cable
No? Not if it requires a key to sign the software.

~~~
mattb314
I think the argument here is that stealing a private key isn't fundamentally
more difficult than stealing this iOS backdoor (were Apple to create it).
Under this model, Apple's refusal to create a backdoor is in some sense an
appeal to security through obscurity because it implies that engineering
knowledge is the main obstacle to adversaries, whereas the true obstacle is
the ability to steal protected information from Apple (the key or the
backdoor). If the key and backdoor were equivalently useful, the existence of
a backdoor wouldn't impact the safety of customers because the key already
exists.

Of course, whether stealing a backdoor is actually as hard as stealing a key
is a legitimate question, but (I thought) Apple had the option to unlock the
phone in-house, which would at least keep the backdoor out of FBI hands (the
legal precedent, however, could still pose a real threat to user security).

------
zeveb
They contradict themselves:

> Is it technically possible to do what the government has ordered? … Yes, it
> is certainly possible to create an entirely new operating system to
> undermine our security features as the government wants.

And yet:

> We have done everything that’s both within our power and within the law to
> help in this case.

If it's possible for them to do it, then it's within their power, and it's
perfectly within the law for Apple to write a custom OS and deploy it onto a
device with the device's owner's permission (in this case, the owner is the
County of San Bernardino).

They don't want to do it. Heck, _I_ don't want them to be able to do it. But
they can, because they designed a system which they can backdoor.

~~~
hahainternet
> Heck, I don't want them to be able to do it.

Why? Having talked to a lot of people who hold this position, the only
underlying thought I've been able to discern is that you can't conceive of any
time someone else holding encrypted data could hurt you.

If this phone instead belonged to a living rapist or paedophile instead of a
terrorist, and was the only evidence proving their guilt, would you feel the
same?

~~~
snowwrestler
Let's not pretend that this is a new issue for American society. We have
almost 250 years of making these sorts of decisions.

For example, we know that bad guys use guns to commit crimes, yet we're very
very reluctant to ban guns. Bad guys use cars, but we don't require cars to
come with remote kill switches.

Bad guys use encryption. Shall we therefore ban it? Or destroy its
effectiveness? Do any good guys use encryption? And should we take those uses
in to consideration as we think about public policy?

The President of the U.S. uses an iPad. Millions of federal employees,
including FBI agents, use iPhones. Let's think about the implications of
punching a hole in the security of those devices.

It's very easy to make decisions when one ignores the broader context and
consequences. That doesn't mean it's the right way to make decisions.

~~~
grecy
(To play devils advocate, coming from the perspective of my retired LEO
friend)

Legal warrants can be written to seize guns, which can then be tested and
information extracted from them.

Legal warrants can be written to search and seize cars, so they can be
searched and extensively examined to extract information/evidence from them.

The argument goes that right now, warrants can be written to gain access to
basically everything a criminal has/owns/has been in contact with so that it
can all be gobbled up and analyzed.

People supporting the government in this believe the same is true for digital
data - they don't care that it's on a phone or laptop or "online", they just
think a warrant should let the government access it. If Apple can do it, then
they must.

(Note I don't personally agree with that, but I understand it)

~~~
zeveb
> If Apple can do it, then they must.

Which is why they ought to ensure that they _cannot_ do it. And it's why we
should resist any law mandating that they be forced to include a pre-emptive
backdoor.

------
nebulous1
They could have/should have circumvented all this by not allowing firmware to
be forced onto a locked phone without it wiping its own key store.

Admittedly users could also solve the issue for themselves by using much
longer passwords instead of short passcodes.

~~~
ajross
That would have eliminated their ability to fix bugs in the firmware, though.
They literally did this last week with the "Error 53" patch.

There are tricks to allow this, but broadly no: what we're talking about here
is Apple engineering a snoop-proof architecture that remains resistant when
the _attacker is Apple itself_.

And that's just not going to happen in any practical way. Eventually, if the
government wants to compel a backdoor to iOS encryption, there will be a
backdoor to iOS encryption. Arguing otherwise is just fooling ourselves.

And it's a silly issue anyway. If you want snoop-proof encryption on your
personal device, install Linux, select "encrypt my drive", and memorize a
secure pass phrase. Done. Relying on a third party hardware vendor to do it
for you won't ever work.

~~~
nebulous1
Well, it would have to wipe the keys to update the firmware. Alternatively the
update process could ask for the password/passcode.

I agree with you, but even if you install something open source you're still
trusting the hardware, so at the moment there's basically no practical method
of not trusting any hardware vendor at all. Obviously when you get all your
hardware and software from the same vendor then it makes a move on the
government's behalf much more practical for them.

~~~
ajross
> even if you install something open source you're still trusting the hardware

Not for the encryption. That's done in software. A seized linux laptop with an
encrypted partition using a strong key is effectively snoop proof by the
definition we're using here.

It's true that hardware could have other attack vectors: a key logger to
intercept the pass phrase would be an obvious one. But again, that's just my
point: Apple is in no privileged position here. If they get compelled to
backdoor the iPhone then _no amount of security architecture_ along the lines
you posit is going to help us, because they can just be compelled to defeat
it.

~~~
nebulous1
> Not for the encryption. That's done in software. A seized linux laptop with
> an encrypted partition using a strong key is effectively snoop proof by the
> definition we're using here.

You can't run the software without hardware, so it has to be trusted. Don't
misunderstand me, this is obviously considerably more far-fetched than Apple
attacking their own software/hardware combo. However, assuming (a big
assumption) that we trust what Apple are telling us at the moment, a seized
iPhone with a strong password would currently be just as snoop proof. In fact,
this includes the phone that has spawned this conversation.

They can be compelled to defeat their own security if you're accepting
continued updates to your phone. Under the security architecture I've
(loosely) described they can't attack it without the user accepting an update.
Of course, you're totally correct in practice because you're most likely just
going to have to trust Apple updates as they come out.

------
ikeboy
>The digital world is very different from the physical world. In the physical
world you can destroy something and it’s gone. But in the digital world, the
technique, once created, could be used over and over again, on any number of
devices.

They leave out the fact that Apple would need to sign (literally, using their
private key) every time it is used.

>Unfortunately, we learned that while the attacker’s iPhone was in FBI custody
the Apple ID password associated with the phone was changed. Changing this
password meant the phone could no longer access iCloud services.

Why can't Apple change the password back, reset the flag that says the
password was changed, and have them turn it on again?

~~~
stormbrew
They don't really leave it out of the equation, though they don't explicitly
state how it factors in. The issue is that a high profile case that people
want to be prosecuted to the fullest extent is being used as a wedge to force
a large undertaking (producing the unsecure firmware) to be made. Future
requests from law enforcement on much smaller cases would then _only_ require
the smaller undertaking (signing the existing firmware), and resisting that
court order would be significantly more difficult.

~~~
ikeboy
If the court decides that they're obligated to comply, then that should apply
in all similar cases. Resisting that order would be more difficult because
they already lost the battle, not because they already created the tool.

~~~
stormbrew
My point is that it would also apply in _smaller_ cases for a different
reason than simple precedent: it would require no significant work not already
undertaken.

This is the crux.

~~~
vectorjohn
But why would that be bad? It _shouldn't_ require significant additional
work, even in smaller cases. If the FBI is getting warrants to access things
without actual probable cause, then we need to be outraged about _that_ issue,
not how they get access.

------
cpt1138
My primary concern here is that there are 14 people dead, and the powers that
be asking for this obviously failed to protect them. To me, the arguments
people are making for Apple to comply are like abstinence education. Despite
all facts pointing to the conclusion that, with everything they ALREADY have
on us, they are completely useless at "protecting the American public", why
are they using that as an argument here? Obviously because people are still
buying it, but at some point shouldn't we point back to the fact that they
were unable to stop 14 people being massacred?

------
exodust
> _" hundreds of iPhones they want Apple to unlock if the FBI wins this
> case."_

Only hundreds?

It's not "unlocking" anyway, it's exposing the phones to an opportunity of
cracking them open. A non-trivial time-consuming task, hardly the domain of
opportunistic hackers with stolen iPhones.

> _" government-ordered backdoor._"

"Backdoor" is not what is being asked for, so they shouldn't use the word
"backdoor".

Encryption is not under threat by this request. If raw computing power can
break the encryption, then Apple should improve their encryption. Use more
bits, more salt or whatever. Make it so a computer needs 50 years to crack a
password, even with electronic brute force. Then it wouldn't matter whether
the self-destruct kicked in or not.

And no mention of the compromise offer for Apple to keep the alternative OS on
their premises and destroy it after.

Apple are trying just a bit too hard to "not put customers at risk". The risk
is almost zero.

If Apple's security was as good as they claim, then not even Apple, no matter
what they did to help, could crack the phone. That's where we want to be. At
the point where it simply doesn't matter what the FBI asks for, the phone is
uncrackable. Sounds to me like we're not there yet. Apple helping crack this
phone will help us get there. And that's why I don't agree with Apple's
position here. Let's see this phone cracked open, and then evolve the security
to a point where a similar request would be impossible to achieve no matter
what Apple or anyone did.

~~~
kstenerud
> It's not "unlocking" anyway, it's exposing the phones to an opportunity of
> cracking them open. A non-trivial time-consuming task, hardly the domain of
> opportunistic hackers with stolen iPhones.

Semantic nit picking aside, it takes a trivial amount of time for a computer
electronically submitting passwords to crack a 4 digit code.

> "Backdoor" is not what is being asked for, so they shouldn't use the word
> "backdoor".

More semantic nit picking. They're asking for Apple to write software to
disable the very security features that make the 4 digit passcode secure.

> Make it so a computer needs 50 years to crack a password, even with
> electronic brute force. Then it wouldn't matter whether the self-destruct
> kicked in or not.

Says the armchair security expert.

> Apple helping crack this phone will help us get there.

No, it won't. All it will do is open a new security hole where one didn't
exist previously.

~~~
exodust
Are you suggesting a 4 digit passcode is secure? News flash: it's not.

Any 4 digit passcode is insecure by its very nature. Its security depends on
another, unrelated system. This is an inherent weakness.

If you're stamping your foot demanding that your 4 digit passcode not be
compromised by the FBI or anyone else, may I suggest a smarter option: choose
a longer passcode.

Choose 11 digits and then it doesn't matter if someone at Apple or FBI or Mr
Robot goes bananas and decides to write some software that compromises your
phone's passcode retry limit. 11 digits is not trivial to crack via electronic
means.
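The arithmetic behind this claim can be sketched quickly. Assuming roughly 80 ms per guess (the figure Apple has cited for on-device key derivation) and no retry throttling or wipe-after-10 protections, worst-case brute-force time scales as 10^n for an n-digit passcode:

```python
# Rough worst-case brute-force time for numeric passcodes, assuming a
# hardware-enforced ~80 ms key-derivation delay per guess and no other
# protections (throttling, wipe). The 80 ms figure is an assumption.
PER_GUESS_SECONDS = 0.08

def worst_case(digits: int) -> float:
    """Seconds to try every passcode of the given length."""
    return (10 ** digits) * PER_GUESS_SECONDS

for n in (4, 6, 11):
    secs = worst_case(n)
    print(f"{n} digits: {secs / 86400:.2f} days "
          f"({secs / 31_557_600:.1f} years)")
```

Under these assumptions, 4 digits falls in minutes, 6 digits in about a day, and 11 digits takes centuries, which is the point being made above.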

If you're relying on a combination of a 4 digit passcode and faith in your
phone's manufacturer that the retry and self-destruct methods can't be hacked,
well that's your own irresponsible position. Keep it if you like, but it
sounds like an Apple passcode fanboy party waiting to be crashed.

As long as encryption remains solid without back doors, that's all that
matters. The rest is a matter for Apple to choose whether to cooperate with an
investigation into serious crime. They choose not to, citing "back doors" and
"protecting users". No surprises there.

------
spiralpolitik
Isn't the solution for Apple to have the phone also require the user's
signature on the firmware update, so that the user has to enter their passcode
to accept the update and co-sign it with their own key?

If the firmware isn't signed by both keys (the user's public key being stored
in the secure enclave) then the phone should refuse to boot.

That way, even if Apple is compelled to sign a rogue firmware, the user must
also be compelled to accept it.
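The dual-signature scheme described above can be sketched like this. All names are illustrative, and a trivial string check stands in for a real signature verification (e.g. ECDSA); this is not Apple's actual update mechanism:

```python
# Sketch: the device accepts a firmware image only if it carries a valid
# Apple signature AND a valid signature from the user's own key.
from dataclasses import dataclass

@dataclass
class Firmware:
    image: bytes
    apple_sig: bytes
    user_sig: bytes

def verify(sig: bytes, data: bytes, pubkey: bytes) -> bool:
    # Stand-in for a real public-key signature check.
    return sig == b"signed:" + pubkey + b":" + data

def accept_update(fw: Firmware, apple_pub: bytes, user_pub: bytes) -> bool:
    """Refuse the update unless both parties signed the same image."""
    return (verify(fw.apple_sig, fw.image, apple_pub)
            and verify(fw.user_sig, fw.image, user_pub))
```

With this structure, an Apple-only signature (compelled or leaked) is not sufficient: the check fails unless the user's key also signed the exact image.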

~~~
wvenable
It would then be impossible to fix boot issues like the recent Error 53
problem.

~~~
spiralpolitik
In the specific case of the Error 53 problem I don't see an issue. If the
fingerprint sensor has been replaced with a non-standard version, I really
want the phone not to boot, because there is a fair chance it's been
compromised.

(That said, I think Apple's handling of the issue was terrible: it should have
given a much more specific error, and Apple should have been much less
douchebaggy about replacing the sensor with an official version.)

~~~
wvenable
That error was a factory test error code -- it was never meant for end users.
It was a bug, that's all. Any other kind of bug could exist that prevents a
device from booting (like the recent Jan 1, 1970 issue).

------
euroclydon
I would like to know the technical limitation law enforcement is facing when
trying to decrypt data in iOS 8.

I assume the data is encrypted using a key derived from the user's passcode,
and that that key is purged from device memory after an idle period. Brute
force attempts to guess the passcode are throttled, and too many attempts
cause the device to delete the encrypted data.

Can someone confirm I'm on track so far?
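The scheme described is roughly right, with one important addition: the key is derived from the passcode mixed with a per-device secret fused into the hardware, so guesses must run on the device itself. A minimal sketch, using PBKDF2 as a stand-in for Apple's actual derivation (the UID value and iteration count are illustrative assumptions):

```python
# Sketch: data-protection key derived from the passcode entangled with a
# per-device hardware secret, so brute force can't be moved off-device.
import hashlib

DEVICE_UID = b"per-device-secret-fused-into-silicon"  # assumption

def derive_key(passcode: str) -> bytes:
    """Derive a 32-byte key from the passcode and the device secret."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               salt=DEVICE_UID, iterations=100_000)
```

Because `DEVICE_UID` never leaves the chip, an attacker can't extract the encrypted data and brute-force the passcode on a fast external machine; they're stuck with the on-device guess rate and whatever throttling the device enforces.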

Then, law enforcement would be limited to trying to circumvent the passcode
entering throttling logic on the device, which Apple has physically engineered
to be a destructive operation, thus it's outside the capabilities of even the
most sophisticated technology labs in the US government?

Am I still on track?

~~~
tommyd
Although focussed on the Secure Enclave, this may be of interest:
[https://www.mikeash.com/pyblog/friday-qa-2016-02-19-what-
is-...](https://www.mikeash.com/pyblog/friday-qa-2016-02-19-what-is-the-
secure-enclave.html)

------
coldcode
To me this post explained a lot of the legal side of the issue:
[http://www.zdziarski.com/blog/?p=5645](http://www.zdziarski.com/blog/?p=5645)
I think it may have been on HN earlier.

------
uptown
If corporations are people, would requiring a corporation to produce work
product constitute a form of slavery? Or if writing code is a form of speech,
would forcing them to write code violate their constitutional right to free
speech?

------
ryanmarsh
Our forthright discussion on this issue on HN, while valuable, will not help
the broader problem of educating the voting public. It's kind of like John
Oliver's interview with Edward Snowden[1]. This issue needs to be translated
into "pics of my junk" and become a sort of meme. Is the issue being framed
wrong more of a problem than it not being framed at all?

1\.
[https://m.youtube.com/watch?v=XEVlyP4_11M](https://m.youtube.com/watch?v=XEVlyP4_11M)

------
hipaulshi
Maybe I should start offering 1M USD for a leak of said software. O.k., I am
just kidding, but just imagine how much some organizations would be willing to
offer to get their hands on this. Will you ever trust Apple again if it's
known this software has leaked? What if such software is leaked to a corporate
competitor? What if such software is leaked to an enemy spy agency? Apple
would be doomed.

~~~
vectorjohn
That's a lot of money to pay for a piece of software that only works on one
phone that you don't even have access to.

------
glasz
i want to believe. but i'm having a hard time.

according to the new york times, for whatever they are still worth

> Apple had asked the F.B.I. to issue its application for the tool under seal.
> But the government made it public, prompting Mr. Cook to go into bunker mode
> to draft a response, according to people privy to the discussions, who spoke
> on condition of anonymity.

[http://www.nytimes.com/2016/02/19/technology/how-tim-cook-
be...](http://www.nytimes.com/2016/02/19/technology/how-tim-cook-became-a-
bulwark-for-digital-privacy.html)

is the nyt reporting correct? i know they tend to side with natsec bullshit. i
fear this being a really big pr stunt.

also, the nsa siding with apple is just an expression of rivalry, no?

------
allemagne
The only two reactions to this news seem to be cynicism and praise. I think
both voices in response to Apple's letters are valid and useful for improving
user security in the future, and yet incomplete by themselves.

Sure, Apple should be praised for refusing to give government agencies the
ability to unlock an iPhone, but a significant part of their motivation is not
altruistic. It's in Apple's self-interest to make a stand in this case, but we
can't always trust corporations to prioritize customer privacy over caving to
government pressure.

Similarly, Apple has already admitted that a backdoor exists for all iPhones.
In my opinion, this is an inexcusable security hole at best, and at worst an
implication that Apple intended at some point to comply with government
requests for encrypted information. However, the fact that the FBI has made
this request in the first place, and that Apple is in a position to decline
(at least initially) and make it public, is a good sign that the three-letter
agencies may not be as all-knowing as some may fear.

~~~
mindslight
> _and at worst an implication that Apple intended at some point to comply
> with government requests for encrypted information_

Apple's backdoor is for straightforward business reasons - they want to retain
digital ownership of people's devices to take a cut from app distribution.
Plus, keeping control is _harder_ than giving up control. Keeping control
allows total flexibility, like the flexibility to undermine security that's
under discussion. Removing control requires careful planning to avoid later
problems with no flexibility to fix them.

~~~
allemagne
I'm not sure I see what you're arguing. Implementing a more secure system is
never convenient, but that obviously has nothing to do with whether it's
justified.

As for my quoted statement, how can you really disagree? Having a system where
"only the good guys" (i.e. Apple, right?) can break the security of any device
is _precisely_ what law enforcement has been asking for for years, and what HN
users and the tech-savvy in general have been railing against. Now that Apple
has completely admitted that this system exists, users are downvoted here for
pointing it out?

~~~
mindslight
I'm merely disagreeing on Apple's motives. Yes, the voyeurs want backdoors,
and yes Apple has built a backdoor. But I don't think Apple has created this
backdoor _for_ law enforcement. I think it's simply due to them, like any
other company, having a hard time giving up control and creating open
platforms.

Don't sweat the downvotes. There seem to be a lot of Apple customers that
don't have a technical clue about what guarantees other systems actually
provide or what capabilities are theoretically possible. I surmise they view
computing systems solely in terms of productized offerings from companies, and
thus Apple is the leader of the pack for privacy and this case is key to
preserving that privacy.

------
staunch
If it wasn't for Tim Cook...all the other tech giants are failing ethically.

------
geekrax
Thing that bugs me most is the url `/customer-letter`. Is this the one and
only "Customer Letter" they're ever going to write?

------
jmount
Certainly claims to be a different Apple than the one that released the
iPhone 3G, which claimed it encrypted Exchange 2007 data (when it did not).

------
edibleEnergy
Can they not just clone the disk and bring it up in a vm? Is there anything
that would prevent them from building some tooling for that?

------
gboudrias
I dislike Apple as a matter of principle, but I really love them when they are
kicking and screaming for the right cause.

------
an_account
Why can't the FBI just create a modified operating system?

I haven't seen this question answered anywhere.

~~~
theandrewbailey
It needs to be signed by Apple in order for it to be loaded.

~~~
an_account
Why couldn't the FBI subpoena Apple's signing key?

That way the FBI isn't asking Apple to create any software, it's only asking
for something that already exists.

Obviously that would be bad for Apple and computer security in general, but is
there precedent that prevents it?

------
rtpg
Disclaimer: kind of glad that Apple's making noise, but kind of frustrated
that it's about this specific case.

The answers to these questions have some pretty deceitful phrasing....

>First, the government would have us write an entirely new operating system
for their use.

Only "new" in the sense of not being exactly the same as the current one.
Implies much more work than we know to be the case.

>Law enforcement agents around the country have already said they have
hundreds of iPhones they want Apple to unlock if the FBI wins this case. In
the physical world, it would be the equivalent of a master key, capable of
opening hundreds of millions of locks.

The master key analogy falls apart because the order specifically calls for
making a version that only works on a targeted phone. At best it would be the
equivalent of Apple being asked to make many individual keys. Unless, of
course, they want to make a version of iOS with the exploit that would work on
any iPhone.
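The per-device binding this argument relies on can be sketched concretely: the signed blob covers the target device's unique chip ID, so the image validates on that one phone and no other. A minimal sketch with illustrative names, using HMAC as a stand-in for Apple's real public-key signing and personalization format:

```python
# Sketch: firmware signature bound to a specific device's unique ID,
# so the signed image is useless on any other phone.
import hashlib
import hmac

SIGNING_KEY = b"stand-in-for-apple-private-key"  # assumption

def sign_for_device(image: bytes, device_id: bytes) -> bytes:
    """Sign the image together with the target device's unique ID."""
    return hmac.new(SIGNING_KEY, image + device_id, hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes, my_id: bytes) -> bool:
    """The boot ROM recomputes the signature with its own ID."""
    expected = hmac.new(SIGNING_KEY, image + my_id, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)
```

Under this scheme, leaking the signed image for one phone compromises nothing else; the contested question is whether the signing *process*, once established, would be invoked again and again.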

> Of course, Apple would do our best to protect that key, but in a world where
> all of our data is under constant threat, it would be relentlessly attacked
> by hackers and cybercriminals. A

This is implying hackers could do anything with a version of iOS that is made
to only work on one phone. You could absolutely release the update file that
the FBI is asking for and have no risk of compromising anything because
(again) this is for a specific phone.

>Has Apple unlocked iPhones for law enforcement in the past? >No.

(The answer then proceeds to say "Actually yes we have, just not past iOS 8.")

>For devices running the iPhone operating systems prior to iOS 8 and under a
lawful court order, we have extracted data from an iPhone.

>We feel strongly that if we were to do what the government has asked of us —
to create a backdoor to our products

Using a backdoor already existing in your product...

>One of the strongest suggestions we offered was that they pair the phone to a
previously joined network, which would allow them to back up the phone and get
the data they are now asking for. Unfortunately, we learned that while the
attacker’s iPhone was in FBI custody the Apple ID password associated with the
phone was changed. Changing this password meant the phone could no longer
access iCloud services.

Seriously FBI?

I know this letter isn't for me. I want to be on Apple's side based off of how
they present the case. But if you look at the court order, at the fact that
the FBI got a warrant for a specific device, at the fact that they're asking
for an unlock of a specific phone, and at the fact that it's technically
feasible to do this without compromising all iPhones thanks to digital
signatures...

My impression is that Apple's position is that it's technically infeasible to
make this exploit, which isn't really true.

There's the other "but with this, we'll have to do a bunch of phones"
argument... is there a term for being overburdened with writs from the court?
What's the constitutional protection against that? That feels like the only
valid defence at this point for them (from a legal standpoint)

~~~
delinka
Ultimately the point is that if we allow the government to compel companies to
perform work, where's the line? If the AWA is upheld by all the courts, how do
we prevent the government from just dictating to companies continuously the
work they must perform on the government's behalf?

~~~
quinnchr
I'm struggling to think of a subpoena that wouldn't involve being compelled
to work. Subpoenaing unencrypted customer data still requires work on the
company's part. When does a subpoena become a burdensome amount of work?

Furthermore, the government already forces companies to perform work. For
example, paying taxes, providing insurance for employees, complying with
industry regulations.

~~~
delinka
"When does a subpoena become a burdensome amount of work?" This is certainly
an important part of the question. I don't have a well-formed opinion on how
to define limits.

"Furthermore, the government already forces companies to perform work. For
example, paying taxes, providing insurance for employees, complying with
industry regulations." These are well-defined items with known elements about
how to implement them. At this point, everyone knows that when they start a
company, taxes, insurance, and regulations are part of the game.

But when law enforcement shows up making random demands, or convinces a court
to issue an order for random demands, there must be limits. If there are no
limits, there will be no end to random government requests.

------
hahainternet
> I, for one, would like to live in a country where the government is a
> servant, not a master.

Would you? Why not Somalia?

~~~
kogus
I don't see the point you are trying to make. I'm simply suggesting that
"treating the government as no better than a customer" is probably a healthy
attitude. In general, I think society is a lot better off when the government
has to beg for data, fight for it in court, and loses more often than not.
Government should only be "powerful" in the sense that it has great power to
protect the liberty of the citizens who own it.

I recognize this as an idealistic attitude. I've always leaned libertarian.

~~~
hahainternet
> I don't see the point you are trying to make.

Somalia is a country where the government are beholden to the citizenry.

> I'm simply suggesting that "treating the government as no better than a
> customer" is probably a healthy attitude

I see no evidence for that. Being able to treat a search warrant as if it's a
polite request would help pretty much nobody.

> I recognize this as an idealistic attitude. I've always leaned libertarian.

By typical libertarian logic, Apple would be justified in torturing and
executing the FBI agents on their property, as property is private and the
Government is subordinate. The FBI agents violated the Non Aggression
Principle by attempting to coerce Apple's employees to act.

Where have I mistaken your logic there?

~~~
kogus
1 - To whom do you suggest the government should be accountable?

2 - I believe a search warrant is a completely valid and enforceable way to
compel information, including in this case. But if Apple cannot comply without
endangering every iPhone owner, then the other owners of iPhones have a stake
in the outcome. The government's right to investigate this crime doesn't
expand to the right to endanger millions of people's privacy and security.

3 - Torture would be a crime against the individual agents, and punishable as
a crime unto itself. Libertarian philosophy doesn't suggest that you have a
right to hurt other people, unless in self-defense.

~~~
hahainternet
> 1 - To whom do you suggest the government should be accountable?

Other governments and ultimately the citizenry as a whole.

> if Apple cannot comply without endangering every iPhone owner, then the
> other owners of iPhones have a stake in the outcome

They do not. No more than owners of doors when the FBI executes a warrant.

> The government's right to investigate this crime doesn't expand to the right
> to endanger millions of people's privacy and security.

No plausible mechanism has ever been put forward where this would be the case.

> Libertarian philosophy doesn't suggest that you have a right to hurt other
> people, unless in self-defense.

Right, they attempted to coerce your staff on your premises into acting
against their own interests. This is a violation of the NAP, and therefore any
level of violence in self defence is justified.

This is one of the most core Libertarian beliefs.

~~~
kogus
I am reminding myself of this XKCD comic:
[https://xkcd.com/386/](https://xkcd.com/386/), but at the risk of wasting our
time...

You said that in Somalia, the government is accountable to the citizenry. Yet
you suggest that that's legitimate ("...ultimately to the citizenry as a
whole..."). If I personally object to a law, do I have any recourse? What if
my town does? My state? My region? At what point does it become immoral for
the government to ignore change requests?

The door analogy is clever, but inaccurate. It is not possible to transmit
door keys (or battering rams) via email, and use them to simultaneously bash
open the doors of millions of people. Creating a method by which a phone can
be cracked weakens the security of the phone for all users, including innocent
users. So the FBI is asking to weaken the security of all phone users. The
"plausible mechanism" would be "hacker pays off an apple dev for a copy of the
hackable OS".

The last point doesn't make sense to me. Firstly, in a hypothetical
libertarian legal system, corporations would not have a right to self-defense,
or any other rights, as they are not humans. Even in individual cases, any
level of violence in self defense is definitely _not_ justified. When you
said "...Apple would be justified in torturing and executing the FBI agents on
their property...", that's saying that trespassing warrants murder, which I do
not claim.

------
hahainternet
> Should we put in place ubiquitous video and audio surveillance in every
> square foot of the country just in case the FBI ever wants to review
> something that happened?

In public areas that are often visited? Absolutely yes, an unbiased source of
evidence available to the public? Excellent.

> Law enforcement has apparently lost the ability to do on the ground
> investigation work in favor of whiz bang-ery

Apple has provided a child porn trading network protected by the very
principles of mathematics and their refusal to cooperate with the FBI.

What they're doing is precisely what any law enforcement agency would.

~~~
sitharus
Can we not drag this down to the media scare tactic that is child pornography?

For every sicko who's trading child porn there are dozens, if not hundreds,
more who're trying to escape an abusive partner, leaving gangs, getting away
from sexual slavery, or whistleblowing on governments or companies breaching
the law.

Additionally, should the millions of us engaged in perfectly legitimate
activities have a dossier created on our lives just in case? Imagine what
could happen to that information; you could be facing secret blacklists like
those in the UK.

If Apple create this ability its use will extend past the immediate issue and
affect people who genuinely need this security. In the digital world, as Apple
said, you cannot destroy something or lock it away once it's used.

~~~
hahainternet
> Can we not drag this down to the media scare tactic that is child
> pornography?

You mean can we not use the crime Apple has essentially made impossible to
secure a conviction on? No, I'm sorry. Kids being raped matters.

> For every sicko who's trading child porn there's dozens, if not hundreds,
> more who're trying to escape an abusive partner, leaving gangs, getting away
> from sexual slavery, or whistleblowing on government or companies breaching
> the law.

None of which is remotely related to phones being unbreakable _by law
enforcement_.

> If Apple create this ability its use will extend past the immediate issue
> and affect people who genuinely need this security

A fallacious slippery slope. If you let the Police search my house, they will
search everyone's house without a warrant and therefore it's bad to have any
searching ever.

~~~
braythwayt

      > No, I'm sorry. Kids being raped matters.
    

And I'm sorry, no, _it doesn't matter to this argument_. Civil liberties are
civil liberties, whether we are talking about investigating the theft of a
stick of chewing gum, the theft of millions of dollars, a slap in the face,
the breaking of an arm, the rape of a child, or the murder of thousands of
civilians by terrorists.

Bringing up the most horrible of crimes to justify a particular argument for
increasing the powers of law enforcement is an obvious attempt to appeal to
emotion, rather than to experience.

If your argument is any good, it will be just as good when talking about why
the police should be able to search your phone for evidence of tax evasion as
it is for talking about why the police should be able to search your phone for
evidence of child rape and terrorism.

Otherwise, we go to a place where we say, "Well, they shouldn't summarily
execute people who steal cigars from convenience stores, but when it comes to
terrorists, we shouldn't let laws get in the way of their need to do what's
expedient."

~~~
hahainternet
> Bringing up the most horrible of crimes to justify a particular argument for
> increasing the powers of law enforcement is an obvious attempt to appeal to
> emotion, rather than to experience.

No, it's actually an appeal to both. The increase in powers here, if any
exists whatsoever, is minimal. I'm advocating that Apple comply with them.

> If your argument is any good, it will be just as good when talking about why
> the police should be able to search your phone for evidence of tax evasion
> as it is for talking about why the police should be able to search your
> phone for evidence of child rape and terrorism.

If they can search your home for it, they should be permitted to search your
phone for it. Both should have the exact same expectation of privacy and the
exact same judicial oversight.

> Otherwise, we go to a place where we say, "Well, they shouldn't summarily
> execute people who steal cigars from convenience stores, but when it comes
> to terrorists, we shouldn't let laws get in the way of their need to do
> what's expedient."

There is no evidence that there is any legal protection for Apple here and
strong evidence that indeed the FBI can compel them. Nobody is advocating
breaking the law or even going around it.

~~~
braythwayt

      > If they can search your home for it, they should be permitted to search
      > your phone for it. Both should have the exact same expectation of
      > privacy and the exact same judicial oversight.
    

They _are_ permitted to search your phone for it. The problem here is that
they are saying:

 _We wish to search this home, as is our legal right. The home contains a safe
that we claim we cannot open, and we wish to compel the manufacturer of the
safe to assist us to search the safe. The manufacturer does not wish to do so,
but we insist that they be forced to do so by threat of imprisonment._

 _Furthermore, we wish to do so by compelling the manufacturer of the safe to
create technologies that could open all safes, without the knowledge of the
safe owners. We claim we only want to open this one safe, but we have this
long track record of opening as many safes as we can, using secret courts and
hearings to obtain the right to search those safes without the owners of those
safes having the opportunity to argue against us, which is a different level
of judicial oversight than being applied to searching this one house._

------
hahainternet
> The issue is the precedent that is made

That searching the effects of a dead terrorist is acceptable? I really don't
find this objectionable in any way whatsoever.

> Apple could no longer argue they had to create something to fulfill the
> warrant, since it was already created.

Good. Apple's desire to not be subject to the laws of the nations where they
do business has historically hurt their customers. Even now people argue that
Apple is doing this to protect their customers, for which there's no evidence
whatsoever.

~~~
zaroth
Do you really not understand how precedent works or are you being deliberately
obtuse?

The criminal act being investigated is irrelevant. They want data on a device
in their possession that they have permission to search. That is all that
matters. It could just as easily be a drug investigation or, as Comey said
last week in front of Congress, investigating a car accident.

~~~
hahainternet
> Do you really not understand how precedent works or are you being
> deliberately obtuse?

Please do not insult people on HN. This false dichotomy is offensive to me.

> The criminal act being investigated is irrelevant. They want data on a
> device in their possession that they have permission to search. That is all
> that matters.

Then why are you concerned about precedent? They have met all of the
requirements to search this device. Apple is impeding them for no good reason
purely to advance their corporate interests.

~~~
oldmanjay
You would prefer not to be queried as to the limits of your understanding, but
we would prefer you stop commenting assertively on the basis of your
ignorance.

------
tacos
There comes a time where the only competition for a huge corporation is the
government. And you can't win that one (AT&T) -- even when you do (Microsoft).

