
A Message to Our Customers - epaga
http://www.apple.com/customer-letter/
======
epaga
Huge props to Apple - here's hoping against hope that Google, Facebook, and
Amazon get behind this.

One thing I was wondering is how Apple is even able to create a backdoor. It
is explained toward the end:

"The government would have us remove security features and add new
capabilities to the operating system, allowing a passcode to be input
electronically. This would make it easier to unlock an iPhone by “brute
force,” trying thousands or millions of combinations with the speed of a
modern computer."

This is actually quite reassuring - what this means is that even Apple can't
break into an iPhone with a secure passphrase (10+ characters) and with Touch
ID disabled (Touch ID is hackable with a bit of effort to obtain your
fingerprint).

~~~
bambax
I don't see how this is "reassuring"; to me it's rather confusing (as
mentioned in many other comments).

If Apple could in fact write a software backdoor, doesn't it mean that the
backdoor exists, at least potentially?

And how can one be sure that Apple is the only company able to build that
door? At the very least, couldn't the right Apple engineer be either bribed or
forced (by terrorists or the government) to build it?

"Impossible" should mean "impossible", not "not yet done, but possible".

~~~
0942v8653
It's not a backdoor, it's a frontdoor. In cryptography, there's no way to make
repeated attempts more computationally expensive. The lockout is just an extra
feature Apple put on, one that Apple could easily remove. If we're going to
have 4- and 6-digit PINs, there is no way to stop a dedicated attacker from
brute-forcing them. None.
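To put rough numbers on that claim, here is a back-of-the-envelope sketch of the keyspace math. The per-guess rate is an illustrative assumption, not a measured iPhone figure:

```python
# Back-of-the-envelope brute-force math for short numeric PINs.
# per_guess_seconds is an assumed figure, not a measured iPhone rate.
per_guess_seconds = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits          # every possible PIN of this length
    worst_case_min = keyspace * per_guess_seconds / 60
    print(f"{digits}-digit PIN: {keyspace:,} combinations, "
          f"~{worst_case_min:,.0f} minutes worst case")
```

Even at a slow assumed guess rate, a 4-digit PIN falls in minutes and a 6-digit PIN in under a day; only software-imposed lockouts change that picture.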

~~~
harryh
"there's no way to make repeated attempts more computationally expensive"

That's not true actually. For example, the industry standard for storing
passwords on a server (bcrypt) is specifically designed to slow down password
match attempts.
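The idea can be sketched with PBKDF2 from the Python standard library (bcrypt itself isn't in the stdlib; it achieves the same effect with a tunable cost factor, and the iteration counts below are illustrative):

```python
import hashlib
import os
import time

# Key stretching makes every guess expensive from the start. bcrypt does
# this with a cost factor; PBKDF2 shows the same idea via an iteration
# count. Higher iterations -> each password check (and each attacker
# guess) costs proportionally more CPU time.
salt = os.urandom(16)

def hash_password(pw: bytes, iterations: int) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pw, salt, iterations)

for iters in (1_000, 500_000):
    start = time.perf_counter()
    hash_password(b"hunter2", iters)
    ms = (time.perf_counter() - start) * 1000
    print(f"{iters:>7} iterations: {ms:.1f} ms per guess")
```

The 500,000-iteration hash takes roughly 500 times longer per guess than the 1,000-iteration one, which is exactly the lever bcrypt exposes.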

~~~
rhaps0dy
It is true. You're confusing making _repeated_ attempts progressively more
expensive with making all attempts more expensive to start with
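The distinction can be made concrete: an escalating lockout is policy enforced by software (and removable by whoever signs that software), while key stretching is cost enforced by math. A sketch of the former, with a delay schedule that is illustrative, loosely modeled on iOS's behavior:

```python
# Sketch of an escalating lockout: the delay grows with the number of
# failed attempts. This is policy code the OS vendor could ship a build
# without - unlike the uniform per-guess cost of key stretching.
def lockout_delay(failed_attempts: int) -> int:
    """Seconds to wait before the next passcode attempt is allowed."""
    if failed_attempts < 5:
        return 0
    schedule = {5: 60, 6: 300, 7: 900, 8: 3600}
    return schedule.get(failed_attempts, 3600)

# Cumulative forced waiting across the first 10 failed attempts.
total_wait = sum(lockout_delay(n) for n in range(10))
print(total_wait)
```

Removing this function from a firmware image is trivial for the vendor, which is precisely what the court order asks for; no amount of deleting code can remove a PBKDF2-style iteration count baked into the key derivation.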

~~~
harryh
Ah yes. You are right, I was confusing those two things. Thanks for the
clarification!

------
ghshephard
I'm surprised that nobody on this thread has commented on the real substance
of this response. It has nothing to do with Apple brute-forcing iPhones for
the police (which it has done for years, with a simple court order); instead,
it is Apple making it abundantly clear that if they comply (or are forced to
comply) with the All Writs Act of 1789 to create _this_ particular back door,
it opens the floodgates moving forward for all sorts of requests to add
backdoors/decrease security.

It's entirely possible that the FBI can then use this precedent to simply
have Apple remove all security from an iPhone in pursuit of an active
investigation, which can be done with a straightforward firmware update -
which iOS users tend to install without much thought.

~~~
tomp
> Apple making it abundantly clear, that if they comply (or are forced to
> comply) with the All Writs Act of 1789 to create this particular back door,
> then that opens the floodgate moving forward for all sorts of requests to
> add backdoors/decrease security.

I read it differently. Apple is saying that if they make _this particular_
backdoor, then _this_ very backdoor can also be used in other scenarios, to
crack other phones (i.e. the backdoor would apply to all iPhone 5Cs, not just
to this one).

~~~
jrockway
I interpret it as the OP does. The court document asks Apple to lock the
particular image to a particular serial number, so if all goes according to
plan, the same image could not unlock other iPhones. Obviously, writing
security code on a short deadline does not make for the best security, so
that's one worry.

But Apple's letter uses the expression "technique", which I think means
they're worried the government will get another court to make them change the
serial number and sign a new image "next time". Before you know it, Apple will
have to have an entire department to make these one-off images. Someone will
say, "you know, you could save yourself a lot of time if you just made it work
on any phone." Then that image will be leaked, and their security guarantees
will be dead. (One might also worry about the DRM implications.)

~~~
shkkmo
You and OP are both wrong:

"Specifically, the FBI wants us to make a new version of the iPhone operating
system, circumventing several important security features, and install it on
an iPhone recovered during the investigation. In the wrong hands, this
software — which does not exist today — would have the potential to unlock any
iPhone in someone’s physical possession."

Apple's argument isn't about a deluge of one-off court orders creating a
slippery slope to reducing security. Apple is claiming that complying with
just this one request would make Apple's other iPhone users significantly less
secure. There would be a piece of software, signed by Apple, that could
potentially be used to unlock any iPhone you have in your physical possession.

~~~
jrockway
Here's the exact text of the court order:

"Apple's reasonable technical assistance may include, but is not limited to:
providing the FBI with a signed iPhone Software file, recovery bundle, or
other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE.
The SIF will load and run from Random Access Memory and will not modify the
iOS on the actual phone, the user data partition or system partition on the
device's flash memory. The SIF will be coded by Apple with a unique identifier
of the phone so that the SIF would only load and execute on the SUBJECT
DEVICE."

How am I wrong?

~~~
shkkmo
You said:

 _" But Apple's letter uses the expression "technique", which I think means
they're worried the government will get another court to make them change the
serial number and sign a new image "next time""_

Apple's letter directly claims that the particular piece of software created
to comply with this request will reduce the security of its users. Obviously
this means that Apple does not think that the SIF being hardcoded with the
unique identifier of the phone (sufficiently) mitigates the risk.

 _" make no mistake: Building a version of iOS that bypasses security in this
way would undeniably create a backdoor. And while the government may argue
that its use would be limited to this case, there is no way to guarantee such
control."_

Having re-read the OP more carefully, I think ghshephard is making a different
claim than you. He is pointing out Apple's argument about the 'unprecedented
use of the All Writs Act of 1789'. If Apple can be forced to compromise their
security via court order like this, the FBI gains the power to force Apple and
any other US company to insert backdoors / decrease security.

 _" If the government can use the All Writs Act to make it easier to unlock
your iPhone, it would have the power to reach into anyone’s device to capture
their data. The government could extend this breach of privacy and demand that
Apple build surveillance software to intercept your messages, access your
health records or financial data, track your location, or even access your
phone’s microphone or camera without your knowledge."_

------
StillBored
Ok, so I completely fail to see how a random crazy guy with a gun who shoots
up a bunch of unarmed people has "national security implications". This seems
to be a "fact" that everyone wants to agree on, but it is frankly a load of BS
if one considers that the government probably already has his entire
call/texting history for the last couple of years.

I see this as just another "it's for the children" ploy, which I'm completely
sick of.

Given that, I fully support Apple et al. for finally growing a backbone. If
more people stood up, then I wouldn't have to be naked-body-scanned at the
airport, or subjected to the dozens of other privacy invasions the government
performs on a daily basis simply to give itself something to do. So, rather than admit they
won't ever be able to predict or protect the population in any meaningful way
from random people willing to give their lives to make a statement, they waste
our time and money coming up with ever more invasive ways to peek into
everyone's most private possessions.

~~~
omarforgotpwd
If this had been one of the many shootings perpetrated by non-Muslims every
day in the US, they wouldn't have cared. But because these Americans were
Muslims, they need to scan their phones and make sure they weren't getting
orders from some terrorist network.

~~~
pyre
Does it really matter? Most terrorist networks would be claiming
responsibility for the attack if they were involved.

------
firloop
Even though the matters are slightly different, I couldn't help but think that
Cook is giving off a Boards of Canada vibe in this post (in a good way).

 _" Now that the show is over, and we have jointly exercised our
constitutional rights, we would like to leave you with one very important
thought: Some time in the future, you may have the opportunity to serve as a
juror in a censorship case or a so-called obscenity case. It would be wise to
remember that the same people who would stop you from listening to Boards of
Canada may be back next year to complain about a book, or even a TV program.
If you can be told what you can see or read, then it follows that you can be
told what to say or think. Defend your constitutionally protected rights - no
one else will do it for you. Thank you."_

[https://youtu.be/1-FI6D8ZXpc](https://youtu.be/1-FI6D8ZXpc)

------
Robin_Message
If the UK record on anti-terror scope creep is anything to go by, not creating
this backdoor is a very good idea.

In the UK, laws originally intended for surveilling terrorists were/are
routinely used by local councils (similar to districts I think) to monitor
whether citizens are putting the correct rubbish/recycling into the correct
bin. [1]

This is a Pandora's box, and the correct answer is not to debate whether we
should open it just this once; it's to encase it in lead and throw it into the
nearest volcano. Good on Apple for "wasting" shareholders' money and standing
up for this.

[1] [http://www.telegraph.co.uk/news/uknews/3333366/Half-of-
counc...](http://www.telegraph.co.uk/news/uknews/3333366/Half-of-councils-use-
anti-terror-laws-to-spy-on-bin-crimes.html) \- and lest the source be
questioned, this is one of the more reactionary newspapers in the UK.

~~~
madaxe_again
And that was eight years ago - the state of affairs has since worsened - and
there's a sheer irony in those same Tory critics now spearheading the push for
even broader surveillance powers. It's like crack for the political class
(except Rob Ford) - once they start, they want more and more.

------
dh997
Tim Cook: a really nice guy with blue-whale-sized cojones.

There can be no compromise, because China, Syria and Turkey would also lean on
Apple to break into the phones of dissidents, and pretty soon future
whistleblowers here in the US too, in order to prevent leaks (iPhone 7 and
iCar notwithstanding).

That's the tradeoff in not giving in to faint, vague "maybes" that there was
"external coordination", when in all likelihood it was the ultraconservative
Saudi half leading this duo into the kookooland of violent extremism.

The security services will just have to buy exploits, develop malware,
cultivate human intelligence sources and monitor everything the old-fashioned
way... It's not like the kid in a YouTube video who finds a jailbreak exploit
for an iPhone and doesn't release a tool is going to sit on it; he's going to
auction it off to the shop or country with the most $$$.

~~~
vacri
> _Tim Cook: a really nice guy with blue-whale-sized cojones._

... backed by a company that at one point literally had more cash than the US
government. A company with a strong, expansive, and experienced legal team.
He's not a small fish; he's a major captain of industry and has a _lot_ of
political clout. I mean, good on him for his standing on this issue, but he
wields a lot of power here.

------
marak830
I'm generally not an Apple supporter (I don't like the closed ecosystem), so I
am very pleasantly surprised they posted this.

I am quite disappointed that the US courts are trying to force Apple to do
this; in my opinion, it's just to use this case to set a precedent.

I hope Apple can't get it to work, but I'd hate to see what the courts would
do if that happened.

~~~
lazaroclapp
There are basically two groups of large software companies around right now:
those which make their business by collecting data, and those which make their
business by licensing software[1]. The first group has an overwhelming
incentive to not support privacy too strongly. The second group has an
overwhelming incentive to not allow too much openness. Until a better business
model (or zero-knowledge machine learning) is found, no large for-profit
company can support both goals to their final conclusion[2]. So we are left
choosing one evil or the other[3].

[1] Sure, Apple only really sells hardware directly, but the software is a
significant part of the reason a lot of people buy Apple hardware (e.g. 'Mac's
don't get viruses', 'iPhones have a better user experience').

[2] Sure, Google has some significant internal efforts for supporting better
user privacy (e.g. [https://googleonlinesecurity.blogspot.com/2014/12/an-
update-...](https://googleonlinesecurity.blogspot.com/2014/12/an-update-to-
end-to-end.html) ) and Apple maintains some superb open-source software (e.g.
[http://llvm.org/](http://llvm.org/) ). But in the end, Google can't be a
"privacy company" without hurting their business model and Apple can't be an
"open source company" for the same reason.

[3] Or the non-trivial inconvenience of being a self-hosting free software
purist

~~~
S4M
I can't upvote enough that excellent summary of the situation of software
companies.

One way to solve that would be to have governments support and subsidize open-
source software development, but I don't see that happening in the next 5
years at the very least.

~~~
lazaroclapp
So, it is far from a simple problem. For common infrastructure, one can argue
governments could fund open source in the same way they fund highways and
bridges and physical fiber optic links. But I am not so sure that model would
be the best to innovate in end user applications, and I say so as someone
working on publicly funded research software.

There is something to be said for the market's ability to make decentralized
decisions and focus on satisfying people's wants[1], so a centralized software
economy is not a good solution either. The problem with markets here is that
strong privacy and open source are, for the most part, positive externalities.
As a user, the benefit you get from having strong privacy yourself is not
usually noticeably high, nor is that of having access to the source,
especially for a non-technical user, yet society arguably benefits from both.
Usually the answer to a problem of unaccounted externalities is government
regulation, but in this case, large-enough-to-matter governments have been
unanimously on the side of less privacy, rather than more (as in the case of
the original article).

[1] Ideally, software should be designed so that it preserves privacy as much
as possible while achieving its function, and is open source, _and_ it
provides all the million features and reasons people use something like
Facebook, Snapchat, Youtube, etc. Just having privacy preserving software
written for and by technophiles is not and can never be a complete solution.

------
nindalf
I'm really impressed that Apple is standing up to the government and
protecting its users' rights. I've never really considered the iPhone worth
the premium price tag, but policies like this have changed my mind.

Could someone answer a question I have though? The government wants Apple to
create this backdoor and tailor it to the specific device, so presumably it
will have a line that goes

    
    
        if (!deviceID.equals("san_b_device_id")) 
            return;
    

To make the backdoor general purpose, this line would need to be removed. But
doing so would invalidate the signature, and the image can't be re-signed
afterwards because the attacker won't have Apple's signing key. So is the open letter a
matter of principle that they won't build _any_ backdoor, now or in the
future, rather than a specific concern about this backdoor?
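A toy model of that signing argument (an HMAC with a secret key stands in for Apple's asymmetric code-signing; the key, firmware string, and device ID here are all hypothetical):

```python
import hashlib
import hmac

# Toy model of signed firmware: any modification to the image invalidates
# the signature, and re-signing requires the secret key. (Real code
# signing uses asymmetric keys; an HMAC stands in for the same property.)
APPLE_KEY = b"secret-signing-key"  # hypothetical; held only by the vendor

def sign(image: bytes) -> bytes:
    return hmac.new(APPLE_KEY, image, hashlib.sha256).digest()

def verify(image: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(image), signature)

firmware = b'if device_id != "san_b_device_id": abort()'
sig = sign(firmware)
tampered = firmware.replace(b'!=', b'==')  # strip/flip the device check

print(verify(firmware, sig))   # True: untouched image passes
print(verify(tampered, sig))   # False: modified image fails verification
```

So even a one-byte change to the device check produces an image the phone will reject, unless the attacker also holds the signing key.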

~~~
camillomiller
Technically you're right; legally, think of the huge precedent. The FBI is
using the San Bernardino case as a legal crowbar, and it's awful.

~~~
BWStearns
If they beat the order they could always do what GP said: crack the password
and hand in a decrypted copy of the device. Everyone goes home happy: no
precedent set, the FBI gets its data, and going forward, as old iOS devices
die out, Apple won't even be able to pull that stunt again if it wants to.

~~~
jonathankoren
But if they beat the order, given that they've made a big deal about going
against it, why would they go ahead and compromise the device's security? What
would be the point? Just to tell the government, "Hey, don't worry about all
that stuff we said, we didn't mean it"?

If they do it once, they'll do it again.

~~~
BWStearns
1) They're not saying they don't want to help investigate the SB shooters,
only that the order illegally expands the use of the All Writs Act and sets a
bad precedent for democracy.

2) If they beat the order then the FBI needs to find a new way to compel Apple
to help them do shit. That likely means the FBI needs federal legislation
passed, which in the current climate will buy Apple considerable time. This is
why they want to beat the order (though the feds could further appeal all the
way to the.... wait for it.... 8-judge Supreme Court! Though I think Scalia
would have been on our side on this one). It's not about this particular case
since there will be others, it's not about this particular order since there
will eventually be legislation.

3) So why go ahead and do it anyways? Naive (but still valid) reason: they
happen to be able to help and without it being ordered they can do it without
handing over a tool for ad hoc decryption. They can even go to Hawaii after
and throw the dev machines into a volcano to satisfy their inner hobbit (I
highly recommend this part of the plan) to ensure that no one can abuse the
power of the one ring.

The non-naive answer is that for a quick project they get to show to the
public that no, we the nerds are not so obsessed with abstract systems level
thinking that we won't help when we can. It gets a lot harder for that moron
Comey to hit the morning shows and throw shitty innuendo at the tech industry
implying that we're aiding the terrorists.

Both encryption and terrorism are complicated subjects that are scary to the
average American and although they distrust the government, they also distrust
Silicon Valley. The cryptowars aren't about being right or they'd have stayed
dead in the 90s where they belong. Basically Apple makes tech look like the
good guy fighting terrorism, and for anyone who cares (smaller audience than
the fighting terrorism bit) they also defended your civil liberties.

4) This trick only works on older devices, which will die out soon anyway;
newer devices are safe regardless. If they beat the order and handle this one
case voluntarily, then no precedent is set, they can't be bullied into doing
it again, and old devices are safe.

One device compromised, all other devices safe, order beat, PR win, Comey
looks like a prick even to the uninformed next time he insinuates that we're
the enemy.

~~~
jonathankoren
But this all hinges on the naive assumptions that this is a one off and will
never happen again, and that later generation devices are immune from any
circumvention attempt. History says this isn't how this plays out.

If you crack the encryption once you'll get orders to crack it again and
again, and in much lower profile and lower stake cases. Look at the prevalence
of espionage tactics such as Stingrays and "parallel construction" by law
enforcement. There may not always be someone you can pump up into a crack
international terrorist, but there's always some low-level drug courier, or a
"quality of life" criminal, to use your new toys on.

Also, you're saying that this doesn't set a precedent, but it does. Sure,
there's no court case to point to, but it's a precedent nonetheless. It's that
the company has not only the means, but the will to do it. What's stopping the
government from coming back a second time, or a third time about this? What
argument do you have on either a legal court or the court of public opinion to
make when you stand up and say, "That first time was an exigent situation, and
so was the second, and the third... But this time, the fourteenth time, THIS
TIME we really mean no more!"

Finally, I don't think this trick only works on older devices. The FBI wants
them to be able to brute force the passcode through a USB connection instead
of making some sort of robot to tap the screen a bunch of times. Also
presumably the FBI wants the too-many-incorrect-attempts lockout feature
disabled as well; otherwise they're just going to be waiting for hours on end.
Why wouldn't this rather low-sophistication approach work? From a technical
standpoint this is no more complicated than a mouse jiggler[0]. If you're
arguing that the iPhone 6 and later have some sort of "Mission: Impossible"
self-destruct mechanism, I'm sure it could be disabled given enough resources
and motivation.

Finally (for real this time!), making a big stink and then capitulating is
never a PR win. You just look like a tool to everyone involved. To the anti-
encryption side you're weak and can be rolled, and to the pro-encryption side
you're a sellout.

[0] [https://www.elie.net/blog/security/what-tools-do-the-fbi-
use...](https://www.elie.net/blog/security/what-tools-do-the-fbi-use-when-
seizing-computers-or-the-curious-case-of-the-mouse-jiggler-device)

~~~
BWStearns
Both Paris and SB were great examples of the terrorists basically not
bothering with encryption; if they had actually used and benefited from
encryption, we'd be discussing how to get cryptography re-legalized.

I doubt there's even anything on the phone the FBI doesn't have from other
sources. The reason they're using the All Writs Act with this case is because
of the publicity of the case so they can point to Apple prioritizing some
vague principle most people don't care about over real dead people. Apple's
doing a good job making their argument to those who care about said vague
principle, but not to the general public.

I think you're underestimating how badly we're going to be taken to the
woodshed on this the first time it's opportune. Maybe it'll be some cute
little girl dying in a Nancy Grace-friendly way, and the FBI manages to
convince the public that "if only we could have broken these messages" etc. It
might even be true in that one freak instance, but then we'll have
"[cute_little_girl.name]'s Law" which will make sure that such a tragedy never
gets exploi.... reported again, by making sure that the government can read
messages when they need to. The US market is too large not to capitulate at
the greedy and aloof tech sector. It doesn't pass if they see the tech sector
and its goals as reasonable, and while we should continue trying to educate
people on why encryption is good and important for them, you don't change the
number of minds we need to change with rational arguments (or again, we'd have
won already).

Edit: very interesting link on the mouse jiggler though, thanks for that!

~~~
jonathankoren
We're already in the woodshed. We've been there for 15 years.

~~~
BWStearns
Yeah, but it'll get worse: [https://www.washingtonpost.com/news/the-
switch/wp/2016/02/17...](https://www.washingtonpost.com/news/the-
switch/wp/2016/02/17/apples-risky-bet-on-protecting-a-terrorists-
iphone/?hpid=hp_hp-top-table-main_apple-biz-650pm%3Ahomepage%2Fstory)

Choice quote: "Apple will become the phone of choice for the pedophile"

------
Artemis2
Publicizing the case themselves is a very good move.

However, the iPhone of the attacker is an iPhone 5C, which does not have Touch
ID or a Secure Enclave. This means that the time between passcode unlock
attempts is not enforced by the cryptographic coprocessor. More generally,
there's no software integrity protection, and the encryption key is relatively
weak (since it is only based on the user's passcode).
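A sketch of why a passcode-only key is weak (purely illustrative: real iOS entangles the passcode with a per-device hardware UID key, and the salt and iteration count below are made up):

```python
import hashlib

# When the key is derived from the passcode alone, the effective keyspace
# is the passcode space, not the 2**256 space of the output key. An
# attacker who can run the derivation off-device only has to enumerate
# passcodes. (Hypothetical salt and iteration count, for illustration.)
def derive_key(passcode: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), b"demo-salt", 100)

# A 4-digit passcode yields only 10,000 possible keys, regardless of the
# 32-byte key length.
keys = {derive_key(f"{pin:04d}") for pin in range(10_000)}
print(len(keys))
```

Entangling the derivation with a key fused into the hardware forces every guess through the device itself, which is what the Secure Enclave adds on later models.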

The amount of work needed to turn security into good user experience is
phenomenal:
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

~~~
joeblau
You can still enable full disk wipe after 10 failed password attempts[1]. That
was available in iOS 7 I believe (but someone on here will correct me if I'm
wrong).

[1] - [http://i2.wp.com/ioshacker.com/wp-
content/uploads/2014/09/Pa...](http://i2.wp.com/ioshacker.com/wp-
content/uploads/2014/09/Passcode-lock-erase-data.jpg)

~~~
lern_too_spel
The FBI's point is that Apple can update the software on the device to not
honor that setting, which means it is not effective against a government
attacker.

------
lemming
This may be one of the most important things Apple has done. Whether or not
you agree with their position, it's _incredibly_ important that tech companies
start publicly explaining things like the fundamental problems with backdoors
so that a lay person can understand it. Apple have the credibility to make
non-technical people take their argument seriously, and the reach to get the
message out to a vast number of people. I'm really pleased they're taking this
position.

------
GreaterFool
This is interesting:

"Specifically, the FBI wants us to make a new version of the iPhone operating
system, circumventing several important security features, and install it on
an iPhone recovered during the investigation. In the wrong hands, this
software — which does not exist today — would have the potential to unlock any
iPhone in someone’s physical possession."

Am I reading this right? Apple, if they chose to, could make a version of iOS
that disables security features and encryption and load it onto an existing
phone, even though the phone is locked and encrypted?

~~~
teps
What I don't understand is why Apple could create such software but a hacker
could not exploit it. I feel like that means there is already a backdoor.

~~~
jhasse
Because you can only deploy it if you have Apple's private key.

------
dcw303
Very impressive letter. They've expressed their position in language that a
layman can understand, there's abundant evidence that they respect the intent
of the law authorities, and even clearer evidence that they are drawing a line
in the sand based on their principles. They will protect their customers.

I wish more companies could speak so clearly and courageously.

~~~
wildmXranat
>They will protect their customers

Where is this stated, so that I can claim damages if they break said promise?

I'm sorry, but how can it not be seen that it really is a bad sign that Apple
has made this public? Either they may already have built the backdoor and this
is a public stunt, or, no matter what you do, owning a smartphone is not that
smart.

~~~
briandear
If you'd like to claim damages, you would have to prove damages.

------
ThrustVectoring
This is the sort of thing that a professional organization - like what medical
doctors have - could help with. Let me explain.

The court order gives Apple an out: "To the extent that Apple believes that
compliance with this Order would be unreasonably burdensome, it may make an
application to this Court for relief".

Now, imagine if this was court ordering a company to engage in unethical
medical procedures, rather than unethical software development. The
professional medical community would sanction doctors that cooperated and
support those that stood by their ethical principles and refused to cooperate.
If there was a similar professional organization for software development,
Apple could reasonably rebut that telling their engineers to work on this
would be unreasonably expensive (since they'd expect to fire people or have
them resign over it).

This is another avenue for fighting the order - have a good chunk of Apple's
engineering department sign an open letter saying that they'd resign before
working on that project. The incentives seem like they'd work for making it a
thing.

~~~
tarblog
Professional ethics in software engineering is definitely something we're
going to have to grapple with more and more. Another aspect is being asked to
use dark patterns in a UI, or to build a Skinner box into a game or app.
There's evidence that these things do harm to people, and having a
professional organization that could help stand up to such things could be
part of a solution.

------
thothamon
So the FBI is asking Apple to build a tool that will unlock the security
measures of an existing iPhone, like the one in the San Bernardino shooting,
and allow it to be read.

The problem with this is that no such tool should be possible to build. It
should not be a matter of yes or no; it should be simply impossible for Apple
to build such a tool without the private key of the user, which Apple does not
have.

If it is possible to write a piece of software which can circumvent the
protections of the iPhone without the user's private key, then Apple wrote its
security software incorrectly. Either they wrote it with an appalling lack of
security understanding; or they left in important backdoors, either knowingly
or through ignorance. But if they wrote the software correctly and did not
create backdoors of which they're aware, then the government's request is
actually impossible -- cannot be done.

So which is it, Apple? Is the point moot because you did this right? Or have
you already placed backdoors in the product which the FBI is now asking you to
exploit for their benefit?

~~~
0942v8653
I think the missing information here is how the phone is encrypted. If it's
done with the 4-digit numeric PIN, then the software could be built; it would
take 10000 tries, but at less than .1 seconds per try, it would be able to
crack the code in about 15 minutes. The current iPhone has a protection for
this; after some number of tries, it will lock you out for increasing time
intervals.

This is the only way that their claims might possibly be valid.

And a reminder, then: change your iPhone's password to a more complex one. _If
Apple doesn't make this fake OS, someone will_.

Edit: to expand on this, Apple's PR goal was to take advantage of the NSA
mass-surveillance scare. On-device encryption is not very relevant to that.
iCloud security is much more important, and they've been quietly granting data
from it to the Feds, including iPhone backups, which contain most of the data
they're looking for.

~~~
eastbayjake
I mean this sincerely: has the government used one of its 10 tries on the
attacker's birth year? I hope the government has burned a couple tries on low-
hanging guesses before going through this legal hassle.

~~~
frandroid
You don't want to burn any tries in case the Apple developers would need a
few?

------
asymmetric
I see a lot of people saying how impressed they are with Apple, how much they
admire it, etc., for doing this.

It's not about giving props: Apple is not doing this out of goodwill, or
because they _believe_ in protecting privacy. Apple has a competitive
advantage over Google/Facebook in that its business model does not depend on
violating its customers' privacy.

They are just exploiting that competitive advantage.

Cfr. [https://ar.al/notes/apple-vs-google-on-privacy-a-tale-of-
abs...](https://ar.al/notes/apple-vs-google-on-privacy-a-tale-of-absolute-
competitive-advantage/)

~~~
rms_returns
You are right. What you say about Facebook is true, but Google's Android is
open source, so there is no way they can plant privacy-invading code and get
away with it.

~~~
giovannibajo1
No phone on earth runs the open-source version of Android that you can
download from git. They all run custom versions that include not only closed-
source customizations to the system, but also lots of closed code running as
root (Play Services first and foremost).

The reason this doesn't happen with Android is much more mundane: most
Android phones are not encrypted, so the FBI doesn't need help to read all the
customer data. They just need to open the phone and dump the flash.

------
callumlocke
> "Specifically, the FBI wants us to make a new version of the iPhone
> operating system, circumventing several important security features, and
> install it on an iPhone recovered during the investigation. In the wrong
> hands, this software — which does not exist today — would have the potential
> to unlock any iPhone in someone’s physical possession."

So there is already a backdoor. Apple are refusing to let the FBI _use_ the
backdoor.

The backdoor is the fact that Apple can push a firmware update that disables
security features, without the device's password being entered at any point.

------
Fiahil
I like the position Apple is taking. However, after reading the letter, I
noticed it misses a point I consider even more important than just "a
dangerous precedent".

Apple sells devices across the whole planet, not just in the USA. So what the
FBI (an _American_ agency) is requesting is dangerous not only for American
citizens, but also for iPhone owners in Europe, Asia, Africa, and Oceania.
Hell, these people are not even part of the debate, because they don't belong
to "American democracy".

If I'm going to be affected by someone else's policies, I would like to be _at
least_ allowed in the discussion.

~~~
err4nt
I wouldn't worry too much about non-American Apple users. Those of us outside
the US know how tangled law and tech can be - and so people guarding
information tend to use non-American cloud services, for example.

If this goes through I just expect more people internationally to choose
something else. Not a big loss to them, but the loss of business to Apple (a
US company) might be felt.

------
wskinner
As others have noted, this is probably mostly about branding. But that's why
it is genius. Tim Cook is committing Apple to this pro-privacy position in a
very public way. This means that a reversal of this position, or a revelation
that Apple has been acting contrary to it, would be extremely damaging to
Apple's reputation with its customers, effectively costing the company a huge
amount of money.

By publicly committing Apple to this cause, Cook makes it more likely that
internal teams at Apple as well as future versions of the company will adhere
to this position. By defining a set of actions which, if made public, would
ruin the company's brand, Cook makes it less likely Apple will take those
actions.

~~~
apozem
You're exactly right about the effects on the company as a whole. A CEO is
like a cheerleader and ship's captain for a company.

Nilay Patel over at The Verge said on one of their podcasts he once asked
Satya Nadella what it was like to be the CEO of a company as large as
Microsoft. Nadella told him being CEO meant telling a big-picture vision to
the press and the company over and over again until everyone started going in
that direction.

------
mikeash
I'm clearly in the minority here, but I don't really understand Apple's
position, nor do I understand why everyone is rallying behind them.

Apple built hardware which was not particularly secure. The software defaults
to a four-digit PIN. They attempt to mitigate this by adding an escalating
interval between entries, and by optionally wiping the phone after too many
failed tries, but none of this is set in stone: those limits can be removed
with a software update.
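
The escalating-interval policy described above is just ordinary code; the delay values below are invented for illustration and are not Apple's actual schedule:

```python
# Sketch of a software-enforced escalating lockout of the kind described
# in the comment. Delay values are made up; the point is that the policy
# is plain code that a signed firmware update could remove.

WIPE_THRESHOLD = 10  # optional setting: erase the device after 10 failures

def lockout_seconds(failed_attempts: int) -> int:
    """Delay imposed before another passcode attempt is allowed."""
    if failed_attempts < 5:
        return 0  # early tries are unthrottled
    schedule = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}
    return schedule.get(failed_attempts, 3600)

# A modified build would simply make this function return 0 everywhere -
# which is essentially the change the FBI is asking Apple to sign.
```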

The government is coming to Apple and saying, "You can remove these limits. Do
that for us on this phone." Coming as a legitimate court order, I see no
problem with this request. The government isn't even asking them to crack the
phone, they just want Apple to remove the limits so the government can try to
brute force it. They're even paying Apple for their trouble.

If Apple didn't want to be put in a position where the government can ask them
to compromise their users' privacy, they should have built hardware which even
they couldn't crack. And of course they _did_: starting with the A7 CPUs, the
"secure enclave" system prevents even Apple from bypassing these limits. The
phone in question just happens to predate that change.

If the government was demanding that Apple do the impossible, I'd be on their
side. If the government was demanding that Apple stop manufacturing secure
phones, I'd be on their side. But here, all they're asking is for a bit of
help to crack an insecure system. They're doing this in the open, with a court
order. What's the problem?

~~~
terda12
> The government isn't even asking them to crack the phone, they just want
> Apple to remove the limits so the government can try to brute force it.
> They're even paying Apple for their trouble.

Well, this exact thing isn't THAT big of a deal, but it's a slippery slope. If
Apple agreed to this, then what else can the government ask them to do under
the banner of "public safety"? And if Apple were to give the government an
electronic way to brute force passcodes, it would break the trust of
every iPhone owner.

~~~
mikeash
I don't see the slippery slope here. The government is asking Apple to do
something that is both possible and reasonable. I see no slope leading from
this to other kinds of court orders.

Giving the government a way to brute force PINs wouldn't break the trust of
every iPhone owner, merely the owners of iPhones with pre-A7 CPUs. And frankly,
if they trusted Apple on this, their trust was misplaced. You can't trust
companies not to unlock stuff when the government requests it with a
legitimate court order. If you want Apple not to decrypt your data, the only
way to ensure that is to make it so they _can't_.

Again, Apple _has_ (so far as we know) made it so they can't, on newer
hardware. But the phone the FBI is trying to get into is older hardware,
built such that Apple can get into it. If you're looking to point fingers,
blame Apple for building hardware that wasn't terribly secure. But don't point
fingers too hard, because they're doing it a lot better now.

~~~
zepto
The slope is that if they can order Apple to engineer one thing, they can
order them to engineer another.

It is possible for Apple to weaken the Secure Enclave on all future
iPhones. It would be reasonable to do so from the point of view of giving law
enforcement a useful tool. So since Apple can be ordered to do engineering to
make law enforcement easier, why should they not be ordered to do this?

That is the slippery slope.

~~~
mikeash
> The slope is that if they can order Apple to engineer one thing, they can
> order them to engineer another.

How does that at all follow? Right now, a cop can lawfully order me to
identify myself. Does that mean they can also lawfully order me to go to the
nearest coffee shop dressed as Bozo the Clown and shout, "I am in love with
the ghost of Princess Diana"?

I don't understand how complying with an order to use an _existing_ security
hole to break into someone's device somehow sets a precedent that the FBI
can in the future go to Apple and set the parameters for how their products
are designed.

~~~
zepto
Because the hole doesn't actually exist unless Apple engineers custom code for
the FBI. If the FBI can force Apple to engineer code to create security holes
for them, that establishes a precedent.

Explained better by someone else here:
[https://news.ycombinator.com/item?id=11120036](https://news.ycombinator.com/item?id=11120036)

~~~
mikeash
I would argue that the hole is the fact that Apple can even load new software
that allows this attack. It already exists.

But I'm not sure that distinction is important. The other comment you linked
to lays it out pretty nicely, and it doesn't rely on whether the hole already
exists or is being created. It's ultimately just about compelled creation.

I wonder: what if the FBI just requested the relevant signing keys and source
code? That seems like a much worse outcome, but at the same time less of a
reach.

~~~
zepto
Why is that less of a reach?

~~~
mikeash
Because it's just asking them to turn over information the authorities need
for their investigation, which is a pretty normal sort of request. None of
this troublesome business of asking them to build new software.

~~~
zepto
Fair point.

------
cromwellian
There's a simple way to defeat Apple's argument. The judge could simply ask
Apple to flash the new firmware onto that phone, let the FBI run the brute
force under Apple's supervision and obtain the contents they need, and then
flash back a non-compromised version of the OS.

The government would never have access to a phone with a compromised version
of the OS that they could use to repeat the trick. Rather, the government
would have to obtain court orders and have forensics done under supervision.

This isn't a backdoor and doesn't affect consumers, and it sets a really high
bar for the government to scale this up, because it requires Apple, as the
gatekeeper, to agree to the one-off hack every single time.

The cynic in me thinks that this letter is more about brand image. Apple wants
to claim they can't hack their own phones, even if the government asks, but
clearly in the case of the iPhone 5C it _IS_ possible for them to do it, and
this creates a contradiction with their public marketing and privacy position.
If they didn't release this open letter, then simply complying with the
judge's order would make them look bad.

~~~
nroach
The problem is that once the tool is created, it would be easier for future
warrants to ask Apple to simply re-perform the same trick it has done in the
past. Apple's core argument is that allowing this once opens the door to doing
it repeatedly, because right now Apple doesn't have the toolchain to do this.
Once the toolchain exists, its deployment is trivial.

~~~
cromwellian
So? At least in American jurisdiction, the 4th Amendment doesn't guarantee the
right to unbreakable crypto. It says:

"The right of the people to be secure in their persons, houses, papers, and
effects, against unreasonable searches and seizures, shall not be violated,
and no warrants shall issue, but upon probable cause, supported by oath or
affirmation, and particularly describing the place to be searched, and the
persons or things to be seized."

Upon probable cause, the government may issue warrants. The US government,
backed by the people of the United States, has the right to compel a private
corporation to comply with reasonable searches and seizures on probable cause
with a warrant.

The government is not asking Apple to deploy nuclear weapons, nor to ship all
iPhones with a hack. They are specifically asking for help with one vulnerable
phone, an iPhone 5C. Apple may be asked to do this multiple times, but they
can keep whatever engineers and tools they use internally private.

I mean, let's get real for a second. The toolchain already exists. Apple has
the source code, hardware simulators, debugging harnesses, and the original
engineers. There's no magic. As long as those things exist, the danger of a
hack getting public is real, especially if the source for iOS is ever stolen,
or one of the core engineers goes rogue. If Apple's own internal security
can't keep a more polished tool under wraps, they won't be able to keep the
subcomponents of it under wraps.

There's a reasonable middle ground between "the government has a backdoor and
can scan and read everything" and "it's impossible for the government to even
obtain legal warrants on probable cause for a very targeted piece of
information." What's being discussed in this case is not Snowden-level dragnet
snooping. We're not even talking about a wiretap. We're literally talking
about the government finding a safe/vault inside the house of a murderer, and
talking to the manufacturer of the safe/vault to get them to pick the lock
without destroying the evidence inside. The vault maker in this scenario
doesn't even have to hand over the blueprints of the proprietary lock
mechanism; they just need to open this one vault.

~~~
nemothekid
> _I mean, let 's get real for a second. The toolchain already exists. Apple
> has the source code, hardware simulators, debugging harnesses, and the
> original engineers. There's no magic. As long as those things exist, the
> danger of a hack getting public is real, especially if the source for iOS is
> ever stolen, or one of the core engineers goes rogue. If Apple's own
> internal security can't keep a more polished tool under wraps, they won't be
> able to keep the subcomponents of it under wraps._

> _As long as those things exist_

This is false. Apple could hand you all the things you mentioned and you still
wouldn't be able to break an iPhone 5C. You would still need Apple's master
private encryption key.

The way you are framing the question, the government should force Apple to
hand over its private encryption keys. If that's so, should citizens of other
countries be wary of the fact that their data stored on Apple's servers is
available to the US government? Or should Americans be worried that China
could coerce Apple into handing over encryption keys?

In terms of global politics, the vault maker in this scenario has to worry
about other strongmen asking for such a master key once they realize this is
possible - say, when they need to hunt down gay men or something similarly
"trivial" to them.

~~~
cromwellian
No, I'm saying Apple shouldn't hand over their encryption keys. I'm saying the
FBI should hand over the iPhone, and Apple should hand back the files, without
giving them any hacked phone.

In a paperless world with unbreakable encryption, what is the point of
warrants or regulations at all?

If a company that has, say, committed crimes, financial or otherwise, has a
warrant served on it, what if the response is, "Hey, we'd love to give you our
emails, but all employees use end-to-end encryption, every desktop has
unbreakable filesystem crypto, and our IT department can't unlock anything, so
you must compel the users to hand over their keys"?

Can that be a defense against all warrants and crimes? If politicians are
suspected of accepting bribes with strong probable cause, do we simply accept
that their phones and email communications with lobbyists and corrupt bribers
can't be accessed?

What does society resort to then, rubber hose cryptanalysis? Imprisonment on
lack of evidence until they turn over the keys?

Transparent democracy is on a crash course with cryptoanarchy. Some of the
same people chanting for absolutely unbreakable cryptography support Bernie
Sanders and would rail against offshore Cayman or Swiss financial obfuscation
by the mega-rich.

If we want non-corrupt government and industry, we need a way to investigate
serious crimes. In the past, this meant seizing papers, letters, and records
under warrant. Nowadays, it may be possible for the entire digital crime trail
to be unbreakable with no recourse except catching people in the act. However,
when the FBI entraps people with sting operations "in the act", civil
libertarians decry that too.

So how do we police the bad? If you look at many third-world countries that
have trouble advancing, a lot of it is due to corruption. Is the danger of the
government subpoenaing your email worse than the danger of tens of thousands
of corrupt businesses spreading financial risk all over the economy and
political system?

~~~
nemothekid
> _If a company that say, committed crimes, financial or criminal, has a
> warrant served on them, what if the response is, "Hey, we'd love to give you
> our emails, but all employees use end to end encryption, and every desktop
> has unbreakable filesystem crypto, and our IT department can't unlock
> anything, so you must compel the users to hand over keys?"_

Are you American or foreign? In the American Constitution, the Fifth
Amendment legally protects a party from being forced to incriminate
themselves. So if you are still alive and slapped with a warrant, your rights
protect you from having to give up your private key.

~~~
cromwellian
I'm aware of the 5th amendment, but prosecutors have worked around it and
journalists have gone to jail to protect sources. [http://www.rcfp.org/jailed-
journalists](http://www.rcfp.org/jailed-journalists)

Consider the various ways to police crime:

1\. Before the fact, preemptively: active surveillance and/or entrapment.
Widely criticized.

2\. After the fact: forensic analysis. Previously, physical evidence was
collected and warrants issued for documents; in the digital realm, this is
foiled by cryptography. Attempts by government to restore the status quo to
pre-digital capabilities are widely criticized.

3\. Compulsion: prosecutors lean hard on individuals with digital evidence to
turn over materials. Runs afoul of the 5th Amendment and civil libertarians.

At least for many types of crime, especially white collar crime, this leaves
the authorities almost no recourse. Your politicians can communicate securely
with their paymasters, and receive untraceable payments over bitcoin. Although
you may find HUMINT witnesses who can give you probable cause, there may be no
way to obtain real evidence.

The Silk Road founder was only caught because of active surveillance; they
literally caught him with his computer unlocked. This would be like waiting
for the San Bernardino killers to unlock their iPhone, and then seizing it
before the auto-timer relocked the screen. Not exactly possible for all types
of crimes.

Is active physical surveillance of suspects by the state any less creepy than
digital warrants?

------
rdl
As far as I'm aware, the most practical attacks here are, in order of cost:

0) Find some errata. Apple presumably knows as much as anyone except the NSA.
Have plausible deniability/parallel construction.

1) OS level issues, glitching, etc. if the device is powered on (likely not
the case). Power stuff seems like a particularly profitable attack on these
devices.

2) Get Apple, using their special Apple key, to run a special ramdisk that
runs "decrypt" without the "10 tries" limit. Still limited by the ~80 ms of
compute time in hardware for each try.

(vs. an iPhone 5S/6/6S with the Secure Enclave:)

3) Using fairly standard hardware QA/test techniques (at least at chip-level
shops; there are tens to hundreds in the world that can do this), extract the
hardware key. Then run a massively parallel cluster to brute force a bunch of
passphrases against this hw key. I'd bet the jihadi is using a shortish weak
passphrase, but 8-10 character passphrases can be done, too. They may have
info about his other passphrases from other sources, which could be useful.
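
A rough cost model shows why attack #3 changes the economics: off-device guessing is no longer bound by the phone's ~80 ms per try. The guess rate and cluster size below are invented assumptions, purely for illustration:

```python
# Off-device brute force once the hardware key is extracted (attack #3).
# Rates and node counts are assumptions, not measurements.

def worst_case_days(alphabet_size: int, length: int,
                    guesses_per_sec: float) -> float:
    """Days to exhaust all passphrases of a given length at a given rate."""
    keyspace = alphabet_size ** length
    return keyspace / guesses_per_sec / 86400

# Assume a slow KDF allowing ~1000 guesses/sec/node, on a 1000-node cluster:
cluster_rate = 1000 * 1000

print(worst_case_days(26, 8, cluster_rate))   # 8 lowercase letters: ~2.4 days
print(worst_case_days(62, 10, cluster_rate))  # 10 alphanumerics: millions of days
```

The jump from "a couple of days" to "millions of days" between those two lines is the whole argument for long, mixed-character passphrases.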

While I'm morally against the existence of #3, I'm enough of a horrible
person, and interested enough in the technical challenge of #3, that I'd be
willing to do it for $25mm, as long as I got to do it openly and retained
ownership. As a secret one-off, $100mm. I'd then spend most of the profits on
building a system which I couldn't break in this way.

------
owenwil
I _really_ hope they actually physically can't access the data on this phone.
It's entirely possible this could be the case -- I've been trying to consider
the vectors they could use:

\- lightning cable delivered iOS patch (probably won't work because iOS won't
negotiate over USB until you tap a dialog box)

\- OTA update (not connected to internet)

\- Cracking open the device and accessing the storage directly (encrypted
until boot time)

The most likely vector I can think of:

\- Lightning cable delivered iOS patch _from a trusted computer_ (i.e. one
that the terrorists actually owned)

It's quite impressive that Apple is taking a stand like this, though perhaps
unfortunate timing WRT the larger encryption debate.

------
imroot
If you look at past cases where the All Writs Act has been invoked, the Courts
have rejected this type of government conscription.

Effectively, the government is forcing Apple to take receipt of a device that
it does not own or possess, perform potentially destructive services on that
device, and then perform services that could potentially require Apple to
testify at trial under the Confrontation Clause of the Sixth Amendment.

I really think that Apple's in the clear here, and the AUSAs in the case are
pulling out all the stops to get Apple ordered to break the encryption.

------
jasonlingx
The fact that they can create this backdoor - doesn't that mean it already
exists?

What Apple needs to do, then, instead of writing this letter, is release an
update that closes this backdoor.

~~~
SideburnsOfDoom
> The fact that they can create this backdoor, doesn't that mean it already
> exists?

Quite likely.

As I posted on the other discussion here:
[https://news.ycombinator.com/item?id=11116343](https://news.ycombinator.com/item?id=11116343)

> If it's possible to make such a "backdoored" build of iOS, then there are
> state actors who will be throwing $Millions at doing it already, with or
> without any willing help from Apple.

~~~
danieldk
The security here is Apple's signing key, which is used to sign updates. If
you believe RSA is safe (which I assume is what's being used) and the key has
a reasonable length, throwing a couple of million at it won't buy you much.

I guess what the FBI wants is a backdoored iOS version and to have Apple sign
it with their signing key (which means that the FBI can use it over and over
again).

~~~
SideburnsOfDoom
That is hard, we know. But it is not impossible for those with time, resources
and willingness to think outside the box.

For instance, signing keys can and have been stolen, on the principle of "if
you can't brute-force it, hack in and take it".

[http://arstechnica.com/security/2013/02/cooks-steal-
security...](http://arstechnica.com/security/2013/02/cooks-steal-security-
firms-crypto-key-use-it-to-sign-malware/)

[http://blogs.adobe.com/security/2012/09/inappropriate-use-
of...](http://blogs.adobe.com/security/2012/09/inappropriate-use-of-adobe-
code-signing-certificate.html)

[http://www.androidauthority.com/ssl-added-removed-google-
moc...](http://www.androidauthority.com/ssl-added-removed-google-mocks-nsa-
crypto-code-easter-egg-390162/)

[http://arstechnica.co.uk/tech-policy/2016/02/its-legal-
for-g...](http://arstechnica.co.uk/tech-policy/2016/02/its-legal-for-gchq-to-
break-into-computers-and-install-spyware-tribunal-rules/)

~~~
danieldk
_That is hard, we know. But it is not impossible for those with time,
resources and willingness to think outside the box._

I assume that Apple has a hardware security module for key generation and
storage, perhaps even custom-designed and built, to prevent key
extraction/copying.

Of course, in the end you have to trust Apple that only a limited number of
employees have access to such hardware, that they have proper auditing
procedures, etc.

But the probability of a compromise can never be 0, unless you design and
produce your hardware yourself, write all code that this hardware runs
yourself, and never leave your computing device unattended. Since that is not
practical for most people, you have to put trust in some third party.

~~~
SideburnsOfDoom
If there's one thing we have learned over the last few years from Snowden et
al., it's that it is safe to assume these state actors will be trying all the
avenues that you or I can think of, and spending years discovering new ones
that we have not thought of.

~~~
danieldk
I have trouble understanding your point. What's the alternative to trying to
minimize the probability of crypto system compromises?

~~~
SideburnsOfDoom
You make a good case that Apple is far ahead as an industry leader here; that
seems solid. It's less solid that this means they are impervious, or that the
reasoning "I can't see any holes in this process, therefore there are no holes
in it" is sound.

------
whack
I've never been an Apple fan but this was a fantastic and bold move by them.
Software security and hacking is already an enormous problem that every single
person has to deal with. Even major companies like the NYTimes have been
hacked by malicious users in the recent past. We need to take every reasonable
action to combat this threat. Building deliberate vulnerabilities (yes, every
backdoor is a vulnerability) into our software and devices is going to make
all of us less safe, and all of us more vulnerable to unforeseeable attacks in
the future.

------
drawkbox
Apple is doing the right thing, the American way that has been forgotten:
putting freedom over security. Thanks, Apple, don't give in. Their 1984
anti-Big Brother Super Bowl ad has finally come to fruition [1].

[1]
[https://www.youtube.com/watch?v=VtvjbmoDx-I](https://www.youtube.com/watch?v=VtvjbmoDx-I)

------
jusben1369
I think, as a society, it boils down to this: "And while the government may
argue that its use would be limited to this case, there is no way to guarantee
such control."

Can a private, for-profit company deny the will of an elected government
working to solve a heinous crime, based not on what the government says it
will do, but on the fact that it cannot give a 100% guarantee that this is the
only time/way the tool will be used? Apple acknowledges that the government
says it's limited to this case, but because there's no guarantee (100%
certainty), they feel they can deny it?

If yes, what does that mean as a broader precedent? Are we comfortable with
private companies denying an elected government based not on what it agrees
to, but on the chance that it'll be used in other ways?

However terribly flawed one might feel government is, very few would think it
has less accountability than a private company.

~~~
tdaltonc
I see your point. Defying an order from a democratic government is a big deal.

But Apple isn't saying they won't do it. They just want to make it really
clear that they're not happy about it, and they want the government to change
its mind.

The government still has a monopoly on the legitimate use of force.

~~~
jusben1369
> Opposing this order is not something we take lightly. We feel we must speak
> up in the face of what we see as an overreach by the U.S. government. We are
> challenging the FBI’s demands with the deepest respect for American
> democracy and a love of our country.

I read that as saying they are, in fact, fighting this. And I agree; I think
it's great that Apple is forcing this discussion into the public domain. These
are key issues for us all.

------
l3m0ndr0p
Apple's encryption appears to be done in such a way that government entities
can safely use these phones as well as "consumers." But what may happen is
that Apple will be forced to produce two kinds of iPhones: one for consumers,
with strong encryption but a "backdoor" for warrant ("cough") based access,
and a second type for government use (strong encryption, no backdoors).

They may already have this in place now, and what we are seeing is a show.
They are testing how people/consumers are going to react to this situation.
Our government probably figures that nobody will care in the end.

In the USA, we have lost our liberty. It's time to wake up and see what is
happening. It's getting worse, and the people within our government are
working hard to enslave us even more.

------
thorntonbf
This is an interesting chapter in the "Tim will never be Steve" saga that so
many people are infatuated with.

This particular hill that Tim Cook has decided to defend is as important as
anything Steve Jobs ever did at Apple.

------
agebronze
This is just huge hypocrisy, full of lies. First of all, Apple CAN attempt
to brute force the password. Compiling whatever new firmware is needed and
signing it with their keys will not introduce any new backdoor, like they
claimed and lied to the public - the backdoor is already there, and it is
their private keys. Just as that "backdoor" could somehow end up in some bad
guy's hands, so could their private keys.

I would agree with Apple if they wanted the FBI to pre-submit all their
guessed passcodes for Apple to try, with Apple taking sole responsibility for
that, so that getting said "backdoor" (which really is nothing more than a
door handle) would be as hard as getting their private keys, and governments
would not keep the said backdoor in their hands. I would also agree if Apple
claimed they don't want to be able to crack devices at a judge's order
(although that would be against the law, so they can't claim that).

But this is NOT what Apple said. This whole letter is just one big pile of PR
bullshit. They CAN brute force a passcode. They failed to enforce a
significant delay after a failed passcode attempt - even though this issue had
been known for YEARS (will give citation if needed) when Apple designed the
iPhone 5C in question - and they also failed to require a passcode to update
the device. They already have their convenient backdoor in place, in the form
of their private keys.

~~~
voidpointer
> This is just huge hypocrisy, full of lies. First of all, Apple CAN attempt
> to brute force the password. Compiling whatever new firmware is needed and
> signing it with their keys will not introduce any new backdoor, like they
> claimed and lied to the public - the backdoor is already there, and it is
> their private keys. Just as that "backdoor" could somehow end up in some
> bad guy's hands, so could their private keys.

Thank you! This is the most important technical point about this whole thing.
All the talk about the SE (fascinating as it may be) is irrelevant. Any
strong crypto that rests on a private key kept in a secure vault at some
corporation does have a backdoor: the keeper of the key can be compelled to
use it to sign something.

This is exactly why this would set such a dangerous precedent: the government
dictating software specifications that are then signed with the vendor's
private key. In this case it's a one-off, but it's a step in the direction of
"upload a screenshot of the phone's display every minute to ftp.nsa.gov with
your next iOS update". And no Secure Enclave will protect against that. It
will just be an OS update signed by Apple.

------
plorg
If I understand correctly, any piece of software used here would need to be
signed by Apple. Furthermore, the FBI's warrant(?) says specifically that it
would only need to work for one device ID. Thus it would be relatively
straightforward to create an update for the FBI that could pretty clearly only
be used on the phone in question. Unless the FBI had Apple's signing key, they
could not reuse the software (assuming they couldn't break the bootloader
chain of trust, which they apparently cannot, if this is the route they are
taking). Capability is not the grounds on which they are arguing.
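
A sketch of how "only works for one device ID" could be enforced: on real iPhones the binding would be Apple's RSA signature over an image personalized with the device's ECID, but here an HMAC stands in for the signature, and every key, ID, and image value is invented for illustration.

```python
# Model of firmware cryptographically bound to a single device. HMAC
# stands in for Apple's RSA signing operation; all values are made up.
import hashlib
import hmac

SIGNING_KEY = b"stand-in for Apple's private signing key"

def sign_firmware(image: bytes, device_id: str) -> bytes:
    # Signing over (image + device ID) means the resulting blob is
    # rejected by any phone with a different ID.
    return hmac.new(SIGNING_KEY, image + device_id.encode(),
                    hashlib.sha256).digest()

def device_accepts(image: bytes, own_id: str, signature: bytes) -> bool:
    expected = sign_firmware(image, own_id)
    return hmac.compare_digest(expected, signature)

firmware = b"passcode-limit-disabled build"
sig = sign_firmware(firmware, "ECID-0001")

print(device_accepts(firmware, "ECID-0001", sig))  # True: the target phone
print(device_accepts(firmware, "ECID-0002", sig))  # False: any other phone
```

Under this scheme, reverse engineering the blob gains the FBI nothing: reusing it on another phone would require producing a fresh signature, which only the holder of the signing key can do.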

This is very clearly a political refusal, and Apple is saying so about as
explicitly as they can in this message. Whether or not they can do it, Apple
doesn't want to be caught in the game of being a government surrogate, or of
having to determine for themselves whether government requests are legitimate
(imagine, say, if the Chinese government asked for data from a dissident's
phone - would Apple want to risk that market by denying a request they have
complied with in the US?). It's unfortunate for them that the FBI is making
this request while people still own phones like the 5C, for which Apple could
theoretically disable security features, as opposed to the newer phones, which
it is possible they are completely unable to defeat.

~~~
return0
> would be relatively straightforward to create an update for the FBI that
> could pretty clearly only be used on the phone in question

How? Wouldn't the FBI reverse engineer it?

It seems like Apple would have to update all phones to a new version (one that
would prevent the exploit provided to the FBI) before handing it to them.

~~~
plorg
It's not that reverse engineering is difficult - the FBI could theoretically
do that now. It's that it needs to be signed to be accepted by the phone.

Edit: see, for example, here: [https://stratechery.com/2016/apple-versus-the-
fbi-understand...](https://stratechery.com/2016/apple-versus-the-fbi-
understanding-iphone-encryption-the-risks-for-apple-and-encryption/)

------
dfar1
I know that because this is on Hacker News, everyone is talking about whether
it's possible to access the data, and if possible, how easy or hard it is. But
the focus should be on whether there should be a clear line/understanding
between security and privacy, or whether we should keep everything black and
white as it is now, just looking at extreme cases.

If they cannot co-exist, I'd rather have more security and less privacy. But
ideally, I shouldn't have to choose between them.

~~~
mcintyre1994
I don't really see how you propose to separate these though.

If you have information about something you're planning that harms my personal
security, there's always going to be a necessary tradeoff between your privacy
and my security.

------
mrb
Link to the FBI order: [https://assets.documentcloud.org/documents/2714001/SB-
Shoote...](https://assets.documentcloud.org/documents/2714001/SB-Shooter-
Order-Compelling-Apple-Asst-iPhone.pdf)

(Edit: deleted part where I was wrong. Thanks robbiet480 for correcting me.
It's 2am here and I was tired.)

Also, prediction: if Apple refuses to build a brute forcer, someone else will
do it and sell it to the FBI. Just wait and watch.

~~~
robbiet480
iPhone brute force hardware already exists [1]. The issue is that when Touch
ID and/or a passcode is enabled, the device locks itself for anywhere from a
few seconds to a few hours every time an incorrect PIN is entered, so brute
forcing would take an extremely long time.

In addition, there is a setting on all iPhones to erase data after 10 failed
pin code entry attempts.

The FBI wants Apple to provide a custom iOS build that can be installed on the
device that allows for remote (over the network) brute forcing with the
increasing timeout/erase data protections totally disabled.

1\. [http://techcrunch.com/2015/03/19/iphone-bruteforce-
pin/](http://techcrunch.com/2015/03/19/iphone-bruteforce-pin/)
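To get a feel for why those protections matter, here is a rough back-of-the-envelope sketch. The delay schedule and per-guess time below are illustrative assumptions, not Apple's actual parameters:

```python
# Rough illustration of why escalating lockouts defeat brute forcing.
# The delay schedule and per-guess time are illustrative, not Apple's
# exact values.

def lockout_delay(attempt):
    """Delay (seconds) imposed after the given failed attempt number."""
    if attempt < 5:
        return 0               # first few attempts are free
    if attempt < 6:
        return 60              # 1 minute
    if attempt < 7:
        return 5 * 60
    if attempt < 9:
        return 15 * 60
    return 60 * 60             # 1 hour per attempt from here on

def time_to_exhaust(keyspace, per_guess=0.08):
    """Worst-case seconds to try every PIN, with and without lockouts."""
    with_lockout = sum(per_guess + lockout_delay(i) for i in range(keyspace))
    without_lockout = keyspace * per_guess
    return with_lockout, without_lockout

# A 4-digit PIN has only 10,000 combinations.
slow, fast = time_to_exhaust(10_000)
print(f"with lockouts:    {slow / 86400:.0f} days")
print(f"without lockouts: {fast / 60:.1f} minutes")
```

With the lockouts in place, exhausting the keyspace takes on the order of a year; with them removed (which is what the FBI is asking for), it takes minutes.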

~~~
interpol_p
But how can the custom iOS build be installed on the device while it is
locked?

~~~
nicky0
"The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery
mode, or other applicable mode available to the FBI."

~~~
Karunamon
DFU/Recovery wipes the device.

------
illumin8
If you oppose this, please let President Obama know. The FBI is part of the
executive branch of the government, and as such, directly reports to the
president. In other words, if he tells them to stop, they must comply. Please
register your complaint here:

[https://www.whitehouse.gov/contact](https://www.whitehouse.gov/contact)

Here is my letter to them:

Dear President Obama,

I've voted for you in both elections and have been a firm supporter of all
your causes (the Affordable Care Act, and more). However, your FBI has clearly
overstepped its authority by demanding that Apple spend engineering resources
building a software product that can break the encryption of a terrorist's
iPhone.

Seriously, you need to stop this. You are the head of the executive branch of
the government, of which the FBI is directly underneath your jurisdiction.
Director James Comey is directly within your chain of command.

What the FBI is asking for is a master key to be created that can decrypt any
iPhone. This makes all Americans with Apple devices insecure in the face of
threats to our personal security and privacy. I hope you can understand that
this is clearly unacceptable, and needs to be stopped.

I want to register my complete opposition to the FBI in this circumstance.
Please stop this.

Thanks, xxxx

~~~
user9982
>Seriously, you need to stop this. You are the head of the executive branch of
the government, of which the FBI is directly underneath your jurisdiction.
Director James Comey is directly within your chain of command.

You don't need to tell the man what he can do in his capacity as president, he
knows what he can do. Identify the issue clearly and concisely, but don't try
and tell him what he can and cannot do. Whether you intended that or not,
that's how it comes across.

> What the FBI is asking for is a master key to be created that can decrypt
> any iPhone.

Not true; they're asking for a way to submit guesses electronically and for
the removal of the auto-wipe and 10-guess limit. If you are going to request
help on an issue, ensure you know what the issue is.

I wouldn't send this message in, and I'd suggest you restructure what you're
writing because this is simply not effective, and if other people begin
sending the same message in it's going to make everyone who opposes this issue
look misinformed with a superiority complex (that is how the entire second
paragraph comes across).

~~~
illumin8
The important thing is to voice your disapproval. I'm confident that my
wording could be improved, but it's important that Obama see the sheer number
of US citizens who disagree with this despicable act.

~~~
user9982
The important thing is to voice your disapproval in a manner that a)
identifies the issue and b) identifies your stance on the issue. Your comment
misses the mark on a) by being downright wrong, and your stance on b) comes
across as you believing you know more about these matters than the people
executing them. The fact that you come across that way without having a clear
understanding of the issue is going to lead them to dismiss your disapproval.

------
jwr
The important lesson here is that it is time to design the next phones in a
way that makes it impossible to either install a software update without
unlocking the device or implement auto-erase functionality in hardware.

That way for future phones at least, the issue would become moot: there would
be no way for Apple to build and/or install a custom software image that
allows brute-force password cracking.

~~~
LeoNatan25
Already there with 5S and above.

------
davidhariri
I'm no security expert, but how would Apple access previously encrypted data
with a different version of iOS? Doesn't having that ability imply they
already have a "back-door"? Could someone explain what I'm missing here, or is
it that this would be a one-off solution and the FBI is asking for a global,
remote, no-Apple-needed solution?

~~~
kentonv
If you read the letter carefully, you'll notice it talks about brute-forcing
the password. I guess the assumption is that the password is brute-forceable,
but the FBI is unable -- or unwilling -- to write their own software to do it.
This seems surprising to me; you'd think the FBI could employ a few hardware
hackers for this. Maybe they are only pushing the issue for political reasons
-- actual jihadist terrorist attacks on US soil don't happen very often,
better milk it for all it's worth?

------
uberneo
I remember the old dead community of OpenSource Phone -
[http://wiki.openmoko.org/wiki/Main_Page](http://wiki.openmoko.org/wiki/Main_Page)

------
chillaxtian
if you are interested in the technical details of iOS security:
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

------
iamshs
Cannot be more glad to see Apple's stand on this. Let's not forget what
happened with BlackBerry some 5 years ago: India, Saudi Arabia and the UAE got
monitoring ability on its platform:

1)[http://timesofindia.indiatimes.com/tech/tech-
news/telecom/Go...](http://timesofindia.indiatimes.com/tech/tech-
news/telecom/Government-BlackBerry-dispute-ends/articleshow/20998679.cms)

2) [http://www.reuters.com/article/us-blackberry-saudi-
idUSTRE67...](http://www.reuters.com/article/us-blackberry-saudi-
idUSTRE6751Q220100810)

3)
[http://www.thestar.com/business/2010/08/16/threats_of_blackb...](http://www.thestar.com/business/2010/08/16/threats_of_blackberry_ban_in_india_saudi_arabia_echo_us_debate_on_encryption.html)

------
tommynicholas
No sympathy for terrorists, no sympathy for weakening encryption.

I can understand someone outside of tech not understanding how those are
comparable statements, but if anything the latter is more important.

------
aidos
OT but this post is well on its way to becoming the most popular post since
Steve Jobs died.

[https://hn.algolia.com/?q=&query=&sort=byPopularity&prefix&p...](https://hn.algolia.com/?q=&query=&sort=byPopularity&prefix&page=0&dateRange=all&type=story)

------
NicoJuicy
Would this really be true, or is it just a decoy to make you believe there is
no backdoor?

I do believe there is no backdoor for when a city court requests it, but I
don't really believe that the FBI or CIA don't have access to one.

Considering that the iPhone has existed for a long time, they must have some
means to backdoor iCloud...

------
mrmondo
Massive props to Apple, again I am impressed by their commitment to customer
privacy.

------
cognivore
If I were Cook, I'd draw a line in the sand: if we are forced to comply, we
exit the phone business, because we won't make phones that compromise our
customers' security.

But that would take more balls than anyone left here in this "Land of the free
and home of the brave" seems to have anymore.

~~~
JoeAltmaier
[http://www.commondreams.org/news/2014/04/17/lavabit-
company-...](http://www.commondreams.org/news/2014/04/17/lavabit-company-
defied-nsa-surveillance-loses-appeal)

~~~
makecheck
And, his letter is still posted on his web site, describing what he went
through:

[http://lavabit.com/](http://lavabit.com/)

------
SCdF
> Specifically, the FBI wants us to make a new version of the iPhone operating
> system, circumventing several important security features, and install it on
> an iPhone recovered during the investigation. In the wrong hands, this
> software — which does not exist today — would have the potential to unlock
> any iPhone in someone’s physical possession.

They really need to put that paragraph closer to this one:

> The government would have us remove security features and add new
> capabilities to the operating system, allowing a passcode to be input
> electronically. This would make it easier to unlock an iPhone by “brute
> force,” trying thousands or millions of combinations with the speed of a
> modern computer.

The first paragraph without the second implies that iOS isn't actually secure
at all.

------
rcthompson
I see a lot of discussion about "Secure Enclave" and other hardware security
features and such, and I'm not sure I see the relevance. Assuming that the
data has already been properly encrypted, stored on disk, and purged from
memory (by shutting down the phone) by a version of iOS that did not already
contain a backdoor when the data was encrypted, there's no magic combination
of hardware and software that can decrypt that data without the password,
right? This seems to be supported by Apple's claim that the best they could
possibly do is provide a channel for the FBI to brute force the password.

So am I missing something that makes the iPhone's internal security
architecture relevant here?

~~~
matthew-wegner
Yes. The target phone, an iPhone 5C, lacks a Secure Enclave.

The password retry delay, and subsequent deletion of keys, is enforced by iOS
here. Apple could provide some kind of software to allow for unlimited
attempts (and an interface to do so in an automated way, which the FBI is
specifically asking for).

On newer phones, the Secure Enclave contains the keys, and enforces both the
retry delay and the deletion of its contents. There isn't a way around this
without also upgrading/flashing the SE system (and it isn't even clear if
Apple can do this in a way that preserves the keys).

~~~
rcthompson
Ah, I see, so with newer hardware it wouldn't even be possible for Apple to
enable brute forcing with a software update. But even with older hardware,
presumably allowing brute forcing is still the worst they can do, right?
(Assuming no prior backdoor exists.)

------
greggman
I think it's outstanding that Apple is standing up for this.

Will they, or can they, do anything about data in iCloud as well? While you
can turn off iCloud, I'd guess the majority of people are using it. Given that
you can access much of it at iCloud.com, it would seem that whether or not you
can unlock an iPhone, most customers' data is available directly from Apple:
mail, notes, messages, photos, etc. No idea about other apps' data that gets
backed up.

Again, I'm applauding Apple for standing up for encryption. If they could
somehow offer the same on iCloud, I'd switch from Google. (Google is not
encrypted either; my point is I'd switch to a service that offers it.)

------
jonathankoren
Apple does deserve the respect they're getting for standing up to the
government about this. They're absolutely right that this is an attempt to
fatally undermine security for a whole host of devices, and it sets a
disturbing precedent.

What I find interesting is that Apple isn't the first manufacturer the
government has ordered to crack a device. An "unnamed smartphone manufacturer"
was ordered to crack the lock screen on October 31, 2014.[1] No one made a
fuss then, so presumably someone caved.

[1]
[https://en.wikipedia.org/wiki/All_Writs_Act](https://en.wikipedia.org/wiki/All_Writs_Act)

------
eva1984
It makes sense for them.

If they put a backdoor in iPhone for US government, they are effectively
thrown out of Chinese market.

Interestingly enough, what will Apple do if the Chinese government demands
they decrypt or install a backdoor in exchange for staying in the market?

~~~
sberder
I was about to mention the Chinese case, the Chinese government asking foreign
companies to install means of control and access in their products.

Does that mean that apple will not provide such means or disable security for
phones sold on the Chinese market? That would be surprising given the
potential size of this market.

~~~
eva1984
They will probably produce some China-only version to please the government.
An open letter like this would instantly end their relationship with the
government; they won't even attempt it.

~~~
squidlogic
I thought they already did this, but I have no proof. It seems reasonable
because Google got evicted from China for not playing ball, so I find it hard
to believe that Apple would get a free pass.

~~~
umanwizard
I don't know if they'll get a free pass or not, but Apple is a _lot_ more
popular among the Chinese youth/middle class/intelligentsia than Google. They
may have a bit more sway.

------
FIX_IT
Just so you know: when I forgot the passcode to my iPhone, I remembered that I
chose 1 from the top section, 2 repeated numbers from the row below, 1 below
that, and then a zero. So I tried combinations until it said "iPhone disabled,
connect to iTunes", which is when I found out you can reset the disabled timer
by clicking the backup button on a computer. Therefore you could successfully
create a program that tries passcodes 5 or 10 times, then tries and fails to
back the iPhone up. (There is no need for a backdoor or anything fancy.)

------
wicket
A phone without a backdoor would be illegal in the UK once the Snooper's
Charter comes in to full effect. I'm very interested to see how the UK
government will react to Apple's stance.

------
Your_Creator
The All Writs Act is a United States federal statute, codified at 28 U.S.C. §
1651, which authorizes the United States federal courts to "issue all writs
necessary or appropriate in aid of their respective jurisdictions and
agreeable to the usages and principles of law."

Well, as far as I can see, it is not agreeable to the usages and principles of
law to force a company (or a person, since corporations are people) to spend
money and waste resources compromising its own security systems, which happens
to be something it morally objects to.

------
xlayn

      While we believe the FBI’s intentions are good, it would
      be wrong for the government to force us to build a 
      backdoor into our products. And ultimately, we fear that 
      this demand would undermine the very freedoms and liberty 
      our government is meant to protect.
    
      Tim Cook
    

Kudos to this guy for standing up to an idea.

Now on practical notes, this is about security, providing a digitally secure
platform to both users and providers, prevent tampering, keeping data secure.

Microsoft could take a cue.

~~~
empressplay
The court order says Apple should make the software only work on the specific
phone in question. Nobody could modify the software to work on other phones
any more than they could make the changes to iOS themselves.

Apple is misrepresenting the situation and perhaps it's because they're afraid
that in the future the government will come knocking again, but I think it
hurts them to not be completely above-board about this.

~~~
Zigurd
That's not really possible. There isn't a mechanism to create this condition:
"Nobody could modify the software to work on other phones"

Tim Cook is right. Once this is unleashed, there are no limits and all iPhones
are insecure.

------
datashovel
I'd love to know the names of the people within the FBI who are pushing this
agenda. The only way this foolishness is going to stop is if those people are
out of a job.

~~~
adam12
I'm sure this backdoor request is fully supported by Obama as well as Hillary.

------
mcintyre1994
I don't quite understand: what is the actual purpose of being able to push a
new version of iOS while the device is locked? Apple doesn't seem to use this;
people stick to whichever version they're comfortable with on old devices and
accept whatever limitations. So why does the functionality even exist?

Even with the restriction of being plugged in, outside of Apple who needs to
push iOS versions at tethered devices and will be hindered too badly by having
to unlock them first?

------
ohazi
This is the _only_ acceptable response.

------
ocean3
Even if Apple created a backdoor, how are they going to install it on a locked
phone? Are locked phones able to update without access to the internet or the
user's passcode?

------
aiabgold
I'm curious: is it likely that Apple was under a gag order regarding the
backdoor proposals/discussions?

I've always wondered why large tech companies/corporations abide by such
orders instead of speaking out. Even if Apple was under a gag order, they've
created a PR nightmare for the alphabet agencies; Apple could be pursued in
court, but that pursuit would now likely be done in the face of negative
public opinion.

------
teacurran
Why is there very little talk about the First Amendment in this whole
discussion? They are asking Apple to write custom software.

The Supreme Court has ruled in separate cases: 1. that software is speech, and
2\. that a person (corporations are people, according to them) cannot be
compelled to speak.

It would seem to me that the FBI could perhaps subpoena technical
documentation from Apple, but it should be required to hire its own developers
to write this software.

------
at-fates-hands
The easy solution to this is to have the government send Apple the phone. They
break into it themselves and then hand back the phone with the passcode turned
off and whatever software they needed to install removed, leaving no trace of
how they actually did it.

Win/Win

No software backdoor is created, the FBI gets its data and we all go on with
our lives. Why are we spending so much time gnashing teeth over something that
has a very simple solution to it?

~~~
hunterjrj
Chain of custody. The government would need to be able to prove, in a court of
law, who touched it and when, for what purpose, etc. Handing it off to Apple
in good faith would make it inadmissible as evidence in a trial.

You could argue that Apple could allow an official to be present while work is
being done on the phone, but then Apple risks having its (to-be-developed)
method viewed and/or captured by said official. Very risky.

------
mckoss
A possible compromise would be to add a backdoor to the security module that
would unlock the phone in exchange for a proof of work.

It would be relatively easy for the chip to offer a challenge and accept, say,
a $100,000 proof of work to unlock the phone. This way, we prevent bulk
surveillance but still allow the government to access high value targets'
devices.
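A toy sketch of that proof-of-work idea, using a hashcash-style challenge. The difficulty, hash choice, and function names are all illustrative assumptions, not a real design:

```python
import hashlib
import itertools
import os

# Toy hashcash-style proof of work: the device issues a random challenge
# and only unlocks when shown a nonce whose hash meets a difficulty
# target. In a real scheme the difficulty (leading zero bits) would be
# tuned so the expected compute cost is very large (e.g. ~$100,000).

DIFFICULTY_BITS = 16   # tiny, so this demo finishes quickly

def leading_zero_bits(digest):
    """Count the number of leading zero bits in a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits

def issue_challenge():
    """The device picks a fresh random challenge for each unlock request."""
    return os.urandom(16)

def solve(challenge):
    """The expensive step the requester pays for: search for a valid nonce."""
    for nonce in itertools.count():
        d = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(d) >= DIFFICULTY_BITS:
            return nonce

def verify(challenge, nonce):
    """Cheap check the device performs before unlocking."""
    d = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(d) >= DIFFICULTY_BITS

challenge = issue_challenge()
nonce = solve(challenge)
assert verify(challenge, nonce)   # device would unlock here
```

The asymmetry is the point: verification is one hash, but solving costs about 2^DIFFICULTY_BITS hashes on average, which is what would make bulk surveillance uneconomical while leaving targeted access possible.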

------
leecarraher
Backdoor is somewhat of a misconception. What they want are two front doors,
i.e. we encrypt your message with the recipient's public key, and we make a
copy encrypted with our (in this case Apple's) public key. We send both
messages over the internet, and Apple or your ISP/cell service provider (we
can also assume NSA PRISM has it too) stores the Apple-keyed message or both.
When the government wants access, they can issue a subpoena to the ISP/cell
provider for the encrypted data (or just download it from Saratoga Springs),
then issue a warrant to Apple to decrypt it with their private key. This is
likely the only reasonable and responsible outcome that I can see resulting
from this debate. Or, pessimistically, it becomes political fodder and we
leave it up to politicians who have little to no understanding of the
technology to devise some technologically inept solution.
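A toy model of that "two front doors" envelope: one random message key, wrapped once for the recipient and once for the escrow holder. Everything here is an illustrative assumption (a SHA-256 XOR keystream stands in for real public-key encryption; this is not how iMessage works):

```python
import hashlib
import os

# Toy key-escrow envelope: the message is encrypted once under a random
# session key, and that session key is wrapped separately for the
# recipient and for the escrow holder (Apple, in the comment above).
# Real systems would wrap the session key with each party's *public*
# key; random symmetric keys stand in for them in this sketch.

def keystream_xor(key, data):
    """Toy stream cipher: XOR data with a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def seal(message, recipient_key, escrow_key):
    session_key = os.urandom(32)
    return {
        "ciphertext": keystream_xor(session_key, message),
        "wrapped_for_recipient": keystream_xor(recipient_key, session_key),
        "wrapped_for_escrow": keystream_xor(escrow_key, session_key),
    }

def open_with(envelope, key, slot):
    """Unwrap the session key from the named slot, then decrypt."""
    session_key = keystream_xor(key, envelope[slot])
    return keystream_xor(session_key, envelope["ciphertext"])

recipient, escrow = os.urandom(32), os.urandom(32)
env = seal(b"meet at noon", recipient, escrow)
assert open_with(env, recipient, "wrapped_for_recipient") == b"meet at noon"
assert open_with(env, escrow, "wrapped_for_escrow") == b"meet at noon"
```

The security-policy question in the thread is exactly about that second wrapped copy: whoever holds the escrow key can read every message, which is why critics call this a backdoor regardless of which door it enters through.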

------
Merad
What's the potential fallout on this case? I assume Apple is appealing the
ruling - what happens if the ruling is upheld and Apple refuses to comply
(unlikely IMO, but what if)? Could the DOJ target individual Apple engineers
and order them to do it or face contempt of court charges?

~~~
programmarchy
Yeah, even if the DOJ ordered Apple the corporation, ultimately they would
still have to force one or more Apple engineers to perform the task. That
seems messed up. Would the Apple engineer at least have the option to quit?

~~~
dragonwriter
> Yeah, even if the DOJ ordered Apple the corporation, ultimately they would
> still have to force one or more Apple engineers to perform the task.

More likely, they would impose consequences against Apple-the-corporation were
it not to cause the task to be performed, rather than forcing an Apple
engineer to perform the task.

------
philip1209
Their iMessage encryption is fascinating. It basically makes it impossible to
retroactively decrypt iMessages. With a court order, they can start MITMing
conversations, but unless they intentionally generate a MITM keypair they are
cryptographically locked out of the conversation.

[http://techcrunch.com/2014/02/27/apple-explains-exactly-
how-...](http://techcrunch.com/2014/02/27/apple-explains-exactly-how-secure-
imessage-really-is/) (Link to Apple's paper is in the article)

(Yes, Apple could add this key for everybody at the beginning, but if their
intention is security then it is a brilliant system.)

------
rubicon33
I hope apple employees + executives read this:

I am now officially, an Apple fanboy. That's right, I'm gloating to family and
friends, about how Apple is standing up to the man, doing the right thing, and
refusing to compromise their security.

Keep up the good fight.

------
joering2
> And while the government may argue that its use would be limited to this
> case, there is no way to guarantee such control.

Good one Tim! I mean, how long did law enforcement think they could abuse the
Constitution, put spy devices on people's cars without warrants, use
stingrays, and do all sorts of other crazy stuff including planning and
executing false-flag attacks without any consequences whatsoever? At some
point, we the people, for good reason, will lose any and all trust we have in
them! And that's what Tim is saying in this one sentence, which, with
overwhelming evidence, the US government would have a hard time arguing
against!

------
Sealy
Huge respect to Tim Cook for standing up for the personal information security
of Apple's users around the world. When a non-techie demands something as
stupid as a backdoor, they do not acknowledge how weak it makes data security.

------
huntleydavis
Privacy is obviously the foremost issue at hand with the Government's request
here, but there is also a huge potential impact on the future of the iPhone
software. There is a huge difference between granting access to a user's data
at the Government's request vs demanding a customized build of the iPhone's
OS. Imagine the long-term implications of having a third party tether its
misaligned feature requests to every OS update the iPhone makes. What would be
the continued relationship between Apple and the agency behind this? Would
this evolve into something analogous to HIPAA compliance?

------
mempko
> Specifically, the FBI wants us to make a new version of the iPhone operating
> system, circumventing several important security features, and install it on
> an iPhone recovered during the investigation. In the wrong hands, this
> software — which does not exist today — would have the potential to unlock
> any iPhone in someone’s physical possession.

The scary part here is that the iPhone's data is really not that secure. If
Apple can overwrite the OS and get access to the data, this means the keys are
stored on the phone somewhere, and are not password- or "fingerprint"-
protected.

------
krylon
Given the way a lot of people (and the media) tend to go completely bonkers
when somebody says "terrorist", this is commendable.

It remains to be seen, though, what Apple will actually do, in legal terms.
Will they flat-out refuse to cooperate, even if this means that they will be
fined or Mr. Cook will be imprisoned for contempt or something like that? Will
they actually send their lawyers to challenge the court decision? _That_ would
be very interesting to watch, and if they succeeded, it would create a
precedent for a lot of other companies. But so would their failure.

------
larrymcp
To play devil's advocate:

Mr. Cook expressed concern that "the government could intercept your messages,
access your health records or financial data, track your location, or even
access your phone's microphone or camera without your knowledge".

As I read this I wondered, "what harm would actually happen if that occurred"?
If the government did read my messages and get my health records & financial
data and track my whereabouts, I can't think of anything bad that would
actually happen as a result of that.

Is there anything specific that I should be worried about in that scenario?

~~~
agentdrtran
"Arguing that you don't care about the right to privacy because you have
nothing to hide is no different than saying you don't care about free speech
because you have nothing to say"

------
rloc
I feel like Apple is intentionally oversimplifying this for the purpose of the
letter, or maybe to push back on the FBI's ask more easily.

Apple could propose to secure access for the FBI using the same level of
security that it uses to protect access to the phone's content for the owner
of the phone himself. Tim Cook only talks about one solution, a "tool" that it
could install.

If the same level (and method) of security is used, then saying that there is
a risk of the backdoor being hacked would be equivalent to saying that there
is a similar risk of the user's access being hacked.

~~~
thetruthseeker1
I don't know if there can be a practical solution based on what you said. If
an FBI agent could do it, he could lose that methodology by getting hacked, or
could leak it on purpose with malicious intent (theoretically speaking).

------
bigolebutt666
Wouldn't one assume that once the phone is powered up, there is some kind of
code, at startup or scheduled, that queries an Apple update server about
updates, fixes, etc.? At that point, isn't it reasonable that a company such
as Apple could force certain updates onto the phone whether the customer
wanted them or not? All Apple would have to do is direct the phone to a phony
update site (for this IMEI only) containing code that would dump RAM to an
outside server. No other phones would be affected and the data would be
retrieved. World saved!

------
rdl
I wonder how much of that was personally written by Tim Cook, vs. various
other people within Apple (I'm sure legal, PR, product, etc. all had input,
but this feels like something he wrote himself.)

~~~
WA
This is probably the reason why someone else wrote it. Writing in "your voice"
but without filler words is incredibly hard. But then, it's not the CEO's job
to be a good writer.

~~~
rdl
It absolutely is the job of the CEO of the world's (second) most valuable
company to be a good writer.

~~~
WA
Being a clear thinker (the role of the CEO) and being a concise writer are two
different things. The latter requires a lot of training. The CEO of the
world's (second) most valuable company must be able to clearly articulate his
goals and his vision, but he doesn't have to be a wordsmith to put that vision
on paper.

Writing != thinking != talking

Writing might make you a better communicator in general, maybe a better
thinker, too. But clear thinking and talking don't make you automatically a
better writer.

~~~
rdl
I think CEOs (and even senior managers) end up communicating in writing to
most of their teams most of the time, so clarity in written communication is
essential to top tier people.

------
AYBABTME
While I think protecting user data is important, I don't understand what the
fuss is about. Anyone could (given technical knowledge and tools) take apart a
phone, pull the encrypted data out of storage, and then brute force the
encryption on a large machine.

The FBI doesn't need the modified iOS code, and whether Apple writes it or not
doesn't change anything in the end, since someone else could just as well
write the software with some reverse engineering.

[edit: if you downvote because I'm wrong, please explain because I'd love to
know why]
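One common answer, per the iOS Security Guide linked elsewhere in the thread, is that the data key is derived from the passcode tangled with a device-unique hardware key (the UID) that never leaves the chip, so guessing has to happen on the device itself. A rough sketch, with illustrative parameters rather than Apple's actual KDF:

```python
import hashlib
import os

# Sketch of why pulling the flash chips doesn't enable offline brute
# force: the data key depends on the passcode *and* a device-unique
# hardware key that cannot be extracted. Iteration count and key sizes
# below are illustrative, not Apple's actual parameters.

device_uid = os.urandom(32)   # stands in for a key fused into the silicon

def derive_key(passcode, uid):
    """On-device derivation: passcode tangled with the UID via a slow KDF."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

key = derive_key("1234", device_uid)

# Off the device, an attacker without the UID is no longer guessing a
# 4-digit PIN (10,000 possibilities) but a 256-bit secret.
pin_keyspace = 10 ** 4
offline_keyspace = 2 ** 256
assert derive_key("1234", device_uid) == key
assert derive_key("1234", os.urandom(32)) != key   # wrong UID, wrong key
```

That entanglement is why the FBI needs Apple's cooperation to attack the passcode on the device, rather than copying the storage and renting a data center.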

------
HighPlainsDrftr
This is ugly. If Apple can indeed break into the phone, they need to say, "We
have to stop production now. All of our engineers will need to be behind this.
It will cost us at least a billion dollars, if we can do it. We will miss
deadlines for new products and software. Write us a check for $1 billion and
we will start on it. We may need a few billion more. Write the check - we'll
do what we can do. And let's hope we don't accidentally destroy the evidence
doing it."

------
ezoe
It's almost certain that Apple has helped the American government violate
customers' privacy. It looks to me like this is just a marketing stunt for the
post-Snowden era.

If Apple cared about customers' privacy and security so much, how could they
sell non-free software that is hard to audit, computers with baseband
processors, and rely on a central server that is a single point of failure?

My understanding is that Apple customers don't much care about their own
privacy and security, but are susceptible to marketing.

------
roadnottaken
Can someone explain this to me? The FBI requests a new version of iOS to be
installed on a single phone that was involved in the attack. What, exactly,
does this mean? If the phone is locked, how will they install new software on
it without unlocking? People are suggesting an update to iOS that will get
pushed-out to all users, but contain a backdoor that is specific to that one
particular device -- but how will the new iOS version be installed without
unlocking first?

------
Synaesthesia
I always wondered why Apple took so much trouble with the secure enclave
design; I thought it was really overkill. Now I see it was really necessary
for instances like this.

------
Zigurd
This is why technology companies have to go further than implementing
proprietary security systems: they have to put the capability to circumvent
security out of reach of themselves.

Real data security has to be a mix of services friendly to reliable key
exchange and strong, unbreakable encryption, plus verifiably secure endpoint
software implementing that encryption - which in practice means open-source
software whose installation the user controls.

------
joeclark77
Can someone explain this to me: if the data is encrypted, how does switching
the operating system out enable one to read the data? I'm a layman in this
area but I can only surmise that the data is stored _unencrypted_ and it's the
operating system itself that's somehow locked. If a change of operating system
can open up encrypted data, then what's the point of encrypting hard drives or
data sent over a network?

~~~
cmurf
It doesn't. Changing the OS allows removal of the anti-brute-forcing
feature(s): the delays between attempts, which increase exponentially, and
the limit of 10 failed attempts, after which the encryption key(s) are
deleted and, in effect, all user data with them. This may not be possible on
iPhones with a secure enclave, though, since the anti-brute-forcing is in
part built into the secure enclave; but the iPhone 5c in question in this
case doesn't have a secure enclave, so it might be possible. Further, they
want an automated way to iterate over passcodes. Basically, they want a
backdoor to make it easier to brute-force the phone by guessing the
passphrase.
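To get a feel for why removing those delays matters, here's a rough
back-of-the-envelope sketch; the per-guess time and the delay schedule are
illustrative assumptions, not Apple's actual policy:

```python
# Rough estimate of time to brute-force a 4-digit passcode, with and
# without escalating inter-attempt delays. The delay schedule below is
# illustrative only, not Apple's actual lockout policy.

ATTEMPTS = 10_000          # all 4-digit passcodes
TRY_TIME = 0.08            # seconds per automated guess (assumed)

def delay_after(n):
    """Illustrative escalating lockout: none early, then minutes, then hours."""
    if n < 5:
        return 0
    if n < 10:
        return 60          # 1 minute
    return 3600            # 1 hour per attempt thereafter

without_delays = ATTEMPTS * TRY_TIME
with_delays = sum(TRY_TIME + delay_after(n) for n in range(ATTEMPTS))

print(f"no delays:   {without_delays / 3600:.2f} hours")
print(f"with delays: {with_delays / (3600 * 24):.0f} days")
```

Under these assumptions the full keyspace falls in well under an hour without
the delays, versus more than a year with them - which is exactly the gap the
requested OS change would close.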

So there is nothing for Apple to hand over. There are no actual keys (they're
on the phone itself, in a practically inaccessible way). The court order in
effect orders them to write a derivative OS without brute-force inhibition
features. They probably can do that, and it probably isn't burdensome, but is
it legal to compel a company to write code? Can a court order you to write a
book? Or a letter? They can make you turn over facts or evidence, but it's
specious that they can make you create something, even if you have the
capacity to create it.

------
geocar
• Can Apple upgrade iOS on a single device that is locked, from a new
untrusted laptop without wiping it?

• Can Apple OTA upgrade iOS when the device is locked?

------
tdsamardzhiev
If they provide to the government what the government wants now, next year
the government will come back with an even more ridiculous request. Mr. Cook
is right - it'd be great if they can avoid creating a precedent.

Oh wait they already did by providing their clients' data. Trying to stop the
government now is like trying to stop a high-speed train. Still, good luck to
them! Good to know they are not just pushed around without any resistance.

------
autoreleasepool
Wow, this made my day. I think my faith in Apple's privacy concerns got a
much needed revitalization. Privacy and encryption are the number one reason
I stick with iPhone and Mac with FileVault. It was always hard to completely
trust them after PRISM. However, that was arguably a different Apple.

This poetic stance against the government reaffirms my faith in the
genuineness of Apple's encryption efforts and in Tim Cook specifically.

~~~
herbst
If this is your main reason for choosing an OS, you clearly should use Linux
then ;)

~~~
autoreleasepool
I prefer FreeBSD :)

No code from [redacted] makes it an excellent choice for the privacy
conscious.

I use Slackware in the cases where I need a Linux kernel. I think that might
give you an idea of what I'm trying to avoid. [redacted]

Anyways, I use *BSD daily in VMWare Fusion for any development that isn't
related to iOS. I also do my email and web surfing in OS X because it's simply
more pleasant.

------
lucio
Can't they dump all the data from that particular device and then send it to
the FBI? Maybe the judge will order that? Obviously they're confessing they
can break the encryption but would not do it, on principle. I don't see how
they can win this fight. If it's the shooter's iPhone, and they can decrypt
it, they should do it. That is not the same as giving the FBI a tool to
unlock any iPhone.

------
tibbon
I heard this morning (on semi-conservative FM radio) that this is a national
security issue, and that Apple is helping terrorists by not bypassing this.

I don't get it - the shooters are dead. How is what's on their phone a matter
of national security? We probably have 99% of the information we'll ever have
on them. There is no larger plot. I can't imagine that not having what's on
this device puts anyone at risk.

~~~
rahoulb
> Rather than asking for legislative action through Congress, the FBI is
> proposing an unprecedented use of the All Writs Act of 1789 to justify an
> expansion of its authority.

It's all about precedent.

------
andy_ppp
Surprising the FBI doesn't have a division of highly paid individuals who can
crack iPhones... There are plenty of people online with a vested interest in
this topic whom I'm sure you could hire to help.

My guess is that this is more about pushing back the law and people's rights
than it is about getting access to this device.

But then I'm highly cynical about what the government claims it can do with
technology, for obvious reasons.

------
AshleysBrain
While basically being on Apple's side here, as I understand it, jailbroken
devices are unofficial builds of iOS that have some security features removed
(e.g. limits on which apps can be installed).

Is it not possible for law enforcement to get what they want from that, if all
they want is a custom build of iOS that can be hacked around? And why is it
even possible for that to work if the data is supposed to be kept secure?

~~~
wilg
I think jailbreaking requires you to erase the phone first.

~~~
dh997
Most previous jailbreaks required an unlocked device with the passcode
disabled and Find My iPhone turned off (because the passcode encrypts things).

I'm still waiting for a jailbreak for 9.2.1, or 9.3 when released, but there
are already semi-jailbreaks (browser-based, installing a temporary app) and
some unreleased PoCs; Cydia, MobileSubstrate, and other tools need to be
ported and verified too.

Perhaps if Apple allowed the devices to be officially customer-hackable (like
f.lux, springboard replacements, Transmission, 3G Unrestrictor, and changing
fonts), there would be less need to develop exploits... Unfortunately, there
is great demand from governments to buy exploits and keep them secret (not a
conspiracy, but tools in a market).

------
facetube
This was his employer's phone, right? As in, it was government-owned property
being used in the course of terrorism. Were they using Apple's Mobile Device
Management (MDM) framework or some other form of key escrow? If not, why
should Apple bail out a government entity, at the expense of its own customers
and security, that couldn't even be bothered to follow best practices?

------
Synaesthesia
There is one way to brute-force an iPhone, called IP Box. It's a hardware
device which can brute-force a 4-digit PIN in ~111 hours.
[http://blog.mdsec.co.uk/2015/03/bruteforcing-ios-
screenlock....](http://blog.mdsec.co.uk/2015/03/bruteforcing-ios-
screenlock.html)

But it only works on iOS 8.1 or earlier; the technique was patched in iOS
8.1.1.
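A quick sanity check on that figure - 10,000 possible 4-digit PINs over ~111
hours works out to roughly 40 seconds per attempt (the device reportedly
cycled power after each wrong guess to dodge the attempt counter, which is
what made each try so slow):

```python
# Sanity check on the IP Box figure: 10,000 possible 4-digit PINs
# brute-forced in ~111 hours implies roughly 40 seconds per attempt.

codes = 10_000
total_hours = 111
seconds_per_attempt = total_hours * 3600 / codes
print(f"{seconds_per_attempt:.0f} s per attempt")  # ~40 s
```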

------
grecy
Does anyone know what the consequences for Apple will be if they keep
refusing, but the courts say they must?

Massive fines? (we know they have the cash to cover it)

Jail time for execs (whoa!)

?

------
dang
This is now the most-upvoted story HN has ever had.

~~~
jey
And apparently #2 is also by Tim Cook?
[http://www.hntoplinks.com/all](http://www.hntoplinks.com/all)

~~~
dang
That's #4, behind 3078128 and 3742902 (though that one cheated).

------
vbezhenar
This is a very disappointing letter for me. It means that Apple can indeed
build a backdoor into existing phones; they just don't want to do it (or so
they say). I was under the impression that Apple employs security hardware
which protects keys and makes it impossible to penetrate that defense. If
that's not the case, iOS security is not as good as it could be.

~~~
edraferi
Other commentary suggests this capability is targeted at the iPhone 5C, and
that at least some newer models do have protections against this technique.

------
uberdingo
This is all brave talk until they publicly say the same thing to China; until
then it's political bluster. [http://qz.com/332059/apple-is-reportedly-giving-
the-chinese-...](http://qz.com/332059/apple-is-reportedly-giving-the-chinese-
government-access-to-its-devices-for-a-security-assessment/)

------
BWStearns
Good on them. I was hoping that they'd manage a way to unlock this one phone
without breaking the whole model (by exploiting some bug in the presumably
outdated iOS version installed, or something else that wouldn't degrade the
security model), but given that that's not the case, I think they're making
the right choice.

------
kabdib
I'm betting there are similar vulnerabilities in the current "Apple doesn't
have the keys" versions of iOS and the hardware. For instance, do a similar
mandated firmware update to the secure enclave, and now you get unlimited
guesses at a PIN.

edit:

Ah, I've found a couple of sources claiming that the secure enclave wipes its
keys if its firmware is updated. Makes sense.

------
mirimir
It may well be that Cook's stand will soon become unworkable in the US. The US
is always at war, after all, at least effectively. I wonder if Apple would
just leave. It's already earning ~60% of revenue outside the US, after all.
And hey, it's sitting on tons of offshore cash. Maybe it could build its own
country on an unclaimed reef somewhere.

------
lisper
It is worth pointing out one salient fact: the phone in question did not
belong to the shooter, it belonged to the shooter's employer, which in this
case is the county government. That makes Apple's position much less tenable
because the owner of the phone is (presumably) consenting to -- maybe even
actively encouraging -- the recovery of the data.

------
Cthulhu_
What I think Apple should (also?) do is appeal to both the law enforcement
themselves, and the government - basically go "All secret communications from
law enforcement and government figures - up to the President - would be at
risk", or something to that effect.

I doubt the ones giving these orders would be comfortable with their own
privacy being at risk.

------
notthegov
It's hard for me to have respect for an organization that was built by J.
Edgar Hoover, a person who did not respect the law or Americans' rights.

The philosophy of corruption and oppression still echoes throughout the FBI.
Even today, there are FBI agents who work for private interests. You can't
reform a mafia; you must abolish it and start over.

------
joshcrawford
"For years, cryptologists and national security experts have been warning
against weakening encryption. Doing so would hurt only the well-meaning and
law-abiding citizens who rely on companies like Apple to protect their data.
Criminals and bad actors will still encrypt, using tools that are readily
available to them."

Sounds just like gun control :)

------
ThinkBeat
If I were a betting man I would put good money on the bet that a bypass exists
and is well known to the government.

Which parts of the government is a different matter.

This is a perfect setup. Get all the bad guys to run out and buy iPhones (good
for Apple) believing that they are safe from the US surveillance machine.

Then the appropriate agency can slurp up whatever it wants.

------
carsonreinke
Maybe I am missing something here, but the Washington Post says "Federal
prosecutors stated in a memo accompanying the order that the software would
affect only the seized phone". What is so wrong with that, if they use it
only on this phone? Or is it that the weapon, once created, could be reused?

~~~
ncallaway
Apple is saying that the federal prosecutors are wrong in this assertion.

Apple is saying that any solution that is applied to specifically this phone
can trivially be generalized to all other iPhones (or, at least other iPhone
5cs). Further, unlocking this phone in response to this order establishes a
precedent that this is okay. You are much better off legally if you fight the
first request than if you fight the thousandth request.

------
vu3rdd
I don't see how this message is reassuring. Are they expecting customers to
just take their word for it? Without Apple showing the world every bit of
software they run on their phones, these statements are, at best, meant to
mislead users into thinking Apple is doing something on their behalf.

------
WA
I wonder why no one pointed out that privacy boils down to trust:

That letter might be the truth or could be some kind of decoy. Maybe the
backdoor will come and Apple knows that already and they try to limit the
damage to their brand.

Like "we tried to resist having a backdoor installed, but we couldn't do it
ultimately".

------
teekert
Am I wrong to think that this brute-forcing could still be applied if the raw
memory chip were taken off the iPhone? The wipe-all-data feature requires
write access to the chip plus some intelligence and monitoring. These
capabilities should be physically removable from the actual memory chip,
right?

~~~
chillaxtian
read section 'Hardware Security Features' here:
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

~~~
teekert
Ok, so: _" The UID allows data to be cryptographically tied to a particular
device. For example, the key hierarchy protecting the file system includes the
UID, so if the memory chips are physically moved from one device to another,
the files are inaccessible. The UID is not related to any other identifier on
the device."_

The secure enclave must still give up its UID under some circumstances? This
still does not appear to be immune to hardware hacking.

Moreover, this UID can also be brute-forced, IMO, once the memory chip and
secure enclave are physically separated. Whatever is needed to decrypt the
data must be brute-forceable, especially once the memory is separated from
the wipe-all-data initiator, which does not seem impossible if you know the
chip design well enough?

~~~
evanosaurus
Unless there is a bug in their hardware implementation of AES-CCM or (
_shudder_ ) some sort of crazy disclosure vulnerability in the APIs they
provide, there is (presumably) no way to get at the UID. Even if you were to
decap the chip and get at the UID physically, you still aren't any better off
as it derives the actual encryption key on boot from the UID.

The Secure Enclave is essentially a hardware security module, in more general
terms. The only things that leave its boundaries are the results of crypto
operations, not the parameters that went into calculating them.
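The "tangling" idea can be sketched in a few lines. This is a toy model:
Apple's actual derivation uses a hardware AES engine keyed by the fused UID,
and `ToyEnclave` and the PBKDF2 stand-in here are made up for illustration.
The point is that only derived keys ever cross the boundary, so guessing has
to happen on the device itself:

```python
import hashlib

# Toy sketch of UID "tangling": the per-device UID never leaves the
# hardware; only keys derived from (UID, passcode) do. PBKDF2 here is a
# stand-in for Apple's actual hardware AES-based derivation.

class ToyEnclave:
    def __init__(self, uid: bytes):
        self.__uid = uid   # fused at manufacture; no API exposes it

    def derive_key(self, passcode: str) -> bytes:
        # Many slow iterations make each guess expensive, and mixing in
        # the UID forces the guessing to happen on this device.
        return hashlib.pbkdf2_hmac(
            "sha256", passcode.encode(), self.__uid, 100_000)

enclave = ToyEnclave(uid=b"\x13" * 32)
k1 = enclave.derive_key("1234")
k2 = enclave.derive_key("1235")
assert k1 != k2   # a wrong passcode yields a different (useless) key
```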

~~~
teekert
Thank you. So you mean to say that making an electron micrograph of this chip
would not even reveal its secrets? If the data persists after removal of
power, some physical structure contains the data. Somewhere a hash of the
fingerprint/password needs to be stored, and somewhere the function to
compute an AES cipher key from it needs to be stored.

I'm going on and on about this because I see no way in which this problem
does not boil down to brute-forcing the password the user puts in.

Actually, Apple admits as much! They can build a workaround! What stops a
three-letter agency from building it?

There must at some point be a complex user-entered passphrase if you want to
be safe. This can be a fingerprint of course, but there is always the 4-digit
passcode/passphrase that is the weak point.

I could be completely wrong, so far I'm not convinced I am.

------
danbmil99
Techie question: if Apple can compile a neutered version of iOS to bypass
encryption, why can't a hacker (or US govt nerd) at least in theory reverse
engineer iOS and patch it accordingly?

(guess answer: iOS needs to be signed. So what they are really asking of Apple
is to sign a lobotomized iOS image...)
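That guessed answer is the crux: the boot chain only accepts images whose
signature verifies, so a patched image fails the check. A toy model of that
gate follows - HMAC with a shared key stands in for Apple's actual asymmetric
(RSA) code signing, and all names and strings here are made up:

```python
import hmac, hashlib

# Toy model of why a reverse-engineered, patched iOS image won't boot:
# the boot ROM only accepts images whose signature verifies. HMAC here
# stands in for Apple's actual asymmetric code-signing scheme.

SIGNING_KEY = b"hypothetical-apple-signing-key"

def sign(image: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def boot_rom_accepts(image: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(image), sig)

official = b"iOS 9.2 firmware image"
sig = sign(official)
patched = official.replace(b"9.2", b"9.2-lobotomized")

assert boot_rom_accepts(official, sig)
assert not boot_rom_accepts(patched, sig)  # the patched image is rejected
```

Without the signing key, an attacker can modify the image but cannot produce
a signature the boot ROM will accept - hence the order asks Apple, the key
holder, to sign.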

------
delinka
What's all this talk about pushing updates to locked phones? I have to get
involved every time there's an OS update for any of my iDevices. That damn red
dot on Settings.app just stares at me while I try to find a time I'd like to
be without my device for half an hour.

------
hoodoof
A company with courage. Hard to believe when virtually no institution,
government or corporate has it.

------
SkidanovAlex
> People use them to store an incredible amount of personal information, from
> _our_ private conversations to _our_ photos, _our_ music, _our_ notes

I wonder if this is a grammar mistake, or if Apple actually considers the
private conversations, notes, and photos to be theirs?

~~~
mcintyre1994
I think this is grammatically correct; Tim Cook is just identifying himself
as part of the group that stores personal information on their iPhones.

------
jacquesm
The real security risk is the ability to update the phone's OS without
authorized user consent; that consent mechanism needs to be at least as
strong as the original protection the FBI is trying to break.

Right now it all hinges on Apple's private key, and that's a very thin wire
to hang all this privacy on.

------
maxnaut
Many may hate Apple; however, it's undeniable that they're committed to user
security.

------
mladenkovacevic
Way to go Apple.

And Edward Snowden just tweeted this a few minutes ago in response to another
tweet proposing Google back up Tim Cook: "This is the most important tech case
in a decade. Silence means @google picked a side, but it's not the public's."

------
okasaki
Since Apple is part of PRISM[0], the FBI can just ask the NSA.

[0]
[https://en.wikipedia.org/wiki/File:PRISM_Collection_Details....](https://en.wikipedia.org/wiki/File:PRISM_Collection_Details.jpg)

~~~
elemenopy
No: data held on Apple servers (iCloud) can be provided with a warrant, but
this is about data held on an iPhone but not saved to Apple servers. As a
result, neither Apple nor the NSA can immediately provide it, and some means
of hacking the iPhone is needed.

------
cant_kant
Link to the full order:

[https://assets.documentcloud.org/documents/2714005/SB-
Shoote...](https://assets.documentcloud.org/documents/2714005/SB-Shooter-
Order-Compelling-Apple-Asst-iPhone.pdf)

It is a PDF.

------
stefek99
"In the wrong hands, this software — which does not exist today — would have
the potential to unlock any iPhone in someone’s physical possession."

Someone who believes in conspiracy theories would make a statement that "now
it is official" :)

------
lucio
Being realistic, how many fewer iPhones will Apple sell if they remove the
SE? How many people will not buy an iPhone if they are told that their info
can be accessed with a judge's warrant? I'm guessing a 0.1% drop in sales?

------
bumbledraven
Cook wrote that "this software ... would have the potential to unlock _any_
iPhone in someone’s physical possession." (emphasis mine)

Is that true? What if it's locked with a secure 128-bit (e.g. 10-word
diceware) passphrase?

~~~
sloanesturz
I believe you are correct. Cook probably meant that it would unlock any iPhone
with the standard 4 or 6 number passcode.
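The keyspace arithmetic backs that up. Assuming ~80 ms of key-derivation work
per guess (the guess rate is an assumption for illustration), a 4-digit PIN
falls in under an hour while a 10-word diceware passphrase is far beyond any
brute force:

```python
import math

# Rough keyspace comparison: a 4-digit PIN vs. a 10-word diceware
# passphrase (7776-word list), at an assumed 12.5 guesses/second
# (~80 ms of key derivation per attempt).

rate = 12.5                    # guesses per second (assumption)
pin_space = 10 ** 4
diceware_space = 7776 ** 10

pin_hours = pin_space / rate / 3600
diceware_years = diceware_space / rate / (3600 * 24 * 365)

print(f"4-digit PIN:      {pin_hours:.2f} hours worst case")
print(f"10-word diceware: ~10^{math.log10(diceware_years):.0f} years")
```

So Cook's "any iPhone" is really "any iPhone protected by a short numeric
passcode"; a long random passphrase keeps the underlying crypto out of reach
even with the lockout features removed.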

------
p01926
Wow. This is the first HN submission to exceed 5,000 points!

To honour Tim, and his advocacy for our industry, I'm going to spend the rest
of my week developing privacy/security projects. I encourage everyone else to
do likewise.

------
thorn
I wonder what the response will be from the other manufacturers making
phones with Android.

------
rodionos
Can they publish a copy of the FBI letter? Otherwise, Apple's description
feels a bit circumstantial and opinionated. I feel like I could make a better
judgement on this whole issue if the request were made public.

~~~
timv
[https://news.ycombinator.com/item?id=11117222](https://news.ycombinator.com/item?id=11117222)

------
ageofwant
I am happy AAPL is taking this stance. But I can't help but believe that it
has very little to do with liberty, and very much to do with the bottom line.
Either way, I guess we should be grateful for small mercies.

------
tdaltonc
What are the odds that Apple has been ordered to do this before, but every
other time they were asked it was in a FISA court? That would mean that this
is the first time they've been allowed to talk about it.

------
hughw
Is it possible for a human to just try all 10,000 four-digit passcode
combinations? Assuming the 10-failure erasure is switched off -- a bad
assumption, I know. Is there an additional slowdown after a lot of failed
attempts?

------
cmsimike
Heads up on something I just recently discovered: if your iPhone has Touch ID
enabled, you can go into the Touch ID settings and selectively disable Touch
ID for phone unlocking while keeping it for the App Store.

------
Kenp77
I'm sorry but "Smartphones, led by iPhone"? Bit presumptuous.

------
intrasight
Apple's OSes are closed and therefore inherently insecure. When Apple caves,
as it undoubtedly will, it will have the beneficial consequence of being a
boon to open-source communications software.

------
sandworm101
Google CEO Sundar Pichai has thrown in with Apple in a series of tweets
explaining his position.

[https://twitter.com/sundarpichai](https://twitter.com/sundarpichai)

------
znpy
Kudos to Apple for standing up to the US government and standing by its users.

------
muddi900
The court order was posted on HN hours before this letter, and either Tim
Cook has not read the order or he's lying about the back door. What the court
ordered was the removal of the auto-wipe.

------
amelius
Question: is it possible to design a cryptographic system where, whenever it
is accessed by a third party (government), the access is made publicly
visible in a log? Can blockchain technology help here?

~~~
jonathankoren
No, because I put the device in a faraday cage, with whatever proxies I need,
crack it, then put it in a woodchipper. No one ever finds out.

~~~
aeonflux
The existence of such a system wouldn't make much difference, because no
authority would want to use it to access the data. They would claim that the
target's inability to detect a wire is crucial to the ongoing investigation.

------
clarus
I think there are two orthogonal questions:

* Does Apple claim the FBI cannot access its devices?

* Can the FBI access its devices?

The only thing we learn here is the answer to the first question. We know
nothing more about the second.

------
neves
What happened that companies can now talk about these government requests?
The most nefarious thing about these government orders concerning terrorism
is that the companies were forbidden to discuss them publicly.

------
empressplay
"Apple's reasonable technical assistance may include, but is not limited to:
providing the FBI with a signed iPhone Software file, recovery bundle, or
other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE.
The SIF will load and run from Random Access Memory and will not modify the
iOS on the actual phone, the user data partition or system partition on the
device's flash memory. The SIF will be coded by Apple with a unique identifier
of the phone so that the SIF would only load and execute on the SUBJECT
DEVICE."

People hyperventilating that the tool could be used to crack other phones can
relax, given the last clause in the quoted text (from the actual order).

------
cft
The numerical passcode is likely his ATM PIN, or a code from his bank/PayPal
or some such. I hope the government can simply subpoena his bank/PayPal etc.
and this will end there.

------
nateberkopec
Is there _any_ doubt that when the FBI brings up a law from the 1700's to
justify breaking digital encryption in 2016 that they are completely making it
up as they go along?

~~~
rhino369
Sure. Laws don't have to be so specific, and they are typically better if
they aren't. The law basically says courts can order anything to uphold the
law. Encryption is part of anything.

------
dovdov
"We have no sympathy for terrorists."

They felt the need to state that, huh?

~~~
nindalf
I was surprised that they felt it necessary, but it's a pre-emptive counter
to a pretty common argument - that smartphone makers are aiding and abetting
terrorists by providing them with encryption tools.

------
Overtonwindow
The way I read this, Tim Cook hasn't said it can't be done, only that it
shouldn't be done. This leads me to suspect that Apple can decrypt your
phone, and they know precisely how to do it, but doing so would disrupt their
entire marketing campaign around safe and secure encryption.

I'm just a government relations guy, not a security person, so please forgive
me, but I'm not sure where I fall on this. I want the FBI to be able to
decrypt the San Bernardino attacker's phone. At the same time, I don't want
the government to be able to decrypt my phone. This is one hell of a
damned-if-you-do, damned-if-you-don't situation, and I'm really stuck.

------
blisterpeanuts
Everyone in the U.S., please write to your Congressional representatives and
also to the Presidential candidates you support. They need to know they can't
get away with this.

------
LeicaLatte
Possible or not, the FBI seems to have formalized the issue using this
opportunity. They are asking the questions they have been wanting to ask since
the release of smartphones.

~~~
joezydeco
Didn't they try this with another (smaller, less publicized) case about 4
months ago?

[https://www.eff.org/deeplinks/2015/10/apples-eula-gives-
it-l...](https://www.eff.org/deeplinks/2015/10/apples-eula-gives-it-license-
invade-your-privacy-government-claims)

------
thrillgore
Am I the only one buying a new iPad because of this announcement?

------
hackuser
My guess is that the FBI can likely access the data without Apple's help.
Based on what we know, how do we distinguish between these two situations,
and which seems more likely?

A) Apple has created unbreakable security. The FBI cannot access the data and
needs Apple's help.

B) iPhone security, like all other security, is breakable. iPhones are a very
high-value target (all data on all iPhones); therefore some national security
organizations, probably many of them in many countries, have developed
exploits. The FBI, following normal practice, does not want to reveal the
exploits or capability and therefore must go through this charade.

------
darthmaul
Apple's stand is a bunch of BS. The main issue should be the safety and
protection of humanity. Terrorists are not humans; they are bent on
destroying humanity.

------
rajacombinator
I can't recall any previous instance of a mega-corporation opposing the
tyrannical US Govt. I fully expect Apple to lose here, but it is a valiant
and rare effort.

------
atmosx
I thought Apple already had backdoors. I feel relieved that my iPhone is not
backdoored, and I'm also very happy for a company whose products I use daily.

------
dawhizkid
So this was a work phone owned by his employer. Does that change things?
Surprised they didn't have IT software installed already to monitor the
device.

------
thetruthseeker1
Is iPhone storage (and the files within it) and cloud content encrypted based
on a single private key that is stored in the secure enclave on the iPhone?

------
RalphJr45
Wouldn't many countries like Russia and China stop allowing the sale of
iPhones or at least their use by government officials if the FBI succeeds?

------
agebronze
Actually, someone other than Apple is already able to do the things requested
in the court warrant (brute-force the passcode of a locked iPhone): ih8sn0w
has an iBoot exploit for the A5 chipset (same as the iPhone 5c), so he can
probably boot an unsigned kernel and use some already-published public tools
to crack the passcode. If some lone hacker can do it, don't be fooled for a
minute that the NSA can't, or that the feds couldn't buy something similar
from another hacker. This is Apple covering their ass with the press.

------
Shivetya
Good for them. Freedom comes with a price; sometimes that price is protecting
the privacy of the worst of us to protect all of us.

------
pmarreck
HN is the first place I came to for discussion on this, and I just wanted to
thank you all for keeping it civil, intelligent, and objective.

------
ratfacemcgee
I have never been more proud to have worked for Apple. Tim isn't afraid to
give the government the old double forks when it counts!

------
lifeisstillgood
Does anyone have a decent architectural overview of iPhone (6) security?
These enclaves etc. sound good, but the devil is in the details.

~~~
interpol_p
This is a good overview
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

------
castratikron
This is a great way to build public awareness for this issue. Hopefully this
will allow more people to get involved in the fight.

------
droopybuns
Applying an update to break encryption would violate chain of custody and
render the information obtained inadmissible in court.

------
phkamp
The fact that Apple indicates that they would be able to produce such a
software version is in itself a backdoor in the iPhone.

~~~
lern_too_spel
Correct. What this letter says is that if Apple wants to get around the wipe
restriction because they want the data themselves, they can do it. This means
that they will build the backdoor as soon as they are legally compelled to,
but in the meantime, they can use it for a PR stunt.

------
partiallypro
In the future, once terrorists have TouchID iPhones, couldn't they just use
the corpse's finger to unlock the phone?

~~~
lololomg
Touch ID can't be used if

- The phone was rebooted

- It's been more than 48 hours since the last time the passcode was entered

- The user didn't set up a Touch ID fingerprint

------
tomelders
Remember, iPhones are available worldwide. If the US wants to play world
police, then I want a vote in the US election.

------
darthmaul
Threat to humanity should trump all the garbage Apple and its lackeys are
spewing out. Terrorists are not human.

------
bmoresbest55
As much as I would love to believe in Apple (and any other large tech
company), a part of me still thinks that maybe they are working with the
government in this letter. The FBI knows that the average US citizen does not
want to be hacked. What is to stop the FBI from allowing Apple to say these
things and put on a show publicly while simultaneously giving over the 'master
key' anyway?

------
intrasight
And what happens to the engineer tasked with writing this hack if he fails and
ends up bricking the phone?

------
puppetmaster3
I'm a libertarian. But an Islamic terrorist's phone is just evidence - Apple
must unlock it for the FBI.

------
kyle4211
It took me a bit, and I believe no one has summarized this very well yet.

FBI: "You've built a device that makes it nation-state-difficult to install
custom software without DRM keys. We'd like you to assist us in deploying
software signed with your keys."

Apple: "That feels way too much like asking a CA to sign a cert for you, so
fuck off."

I'm honestly not sure which side I'm on here.

------
joelbondurant
Tim Cook admits iOS is already back-doored in the most weaselly worded message
I've ever seen.

------
jwiley
So only Apple has the ability to do this...not the US government. So we trust
Apple but not USG?

------
yummybear
I can't read it from the letter - are they going to refuse to cooperate? Can
they do that?

~~~
SEMW
They're not 'refusing to cooperate' in the sense of just ignoring the court or
something. They'll file a motion to have the order lifted, with their reasons.
If that's refused, they'll presumably appeal the refusal to a higher court,
and so on, to the max extent they can. That's not being obstructionist, it's
their legal right.

------
mesozoic
And the government wonders why people from tech don't want to work for it.

------
guylepage3
All of a sudden I'm starting to think my PiPhone is looking pretty good.

------
NinoScript
Can't they dump the drive's data to protect it from being erased?

------
boredatnight12
Hmmm. If this pans out in Apple's favor, I may finally buy an iPhone.

------
Dolores12
You don't own Apple hardware, so you can't protect your device.

------
maindrive
I think Apple is trying to prove that they don't give any user data to
agencies. A PR stunt. But they got it fundamentally wrong, since this was
actually a case of national security after a real attack. So: a huge PR stunt,
but an own goal.

------
supergirl
Sounds like the backdoor already exists, but only Apple knows how to use it.
It's the same as if Apple knew a master password for this phone but refused to
hand it over. They are saying they don't want to give it up because once the
FBI has it, they are free to use it anywhere. Pretty strange post from Apple.

Probably they will try to fight this request by arguing that the government is
effectively asking them to remove security from all phones (of this model, at
least). They would be happy to help break this one phone as long as it doesn't
affect any other phone.

In that case, Apple should just break the phone and hand it back to the FBI
after removing the backdoor.

------
unixhero
Plot twist.

This is actually the result of a barter: the government gets some low-level
TOP SECRET access in trade for this easy-access code, and Apple gets to go
public to keep the populace calm and pretend they are fighting this thing.

------
HoochTHX
This is the FBI going down a parallel construction path. They already have
all the information from the NSA's bag o' tricks, but none of it can be used
in court. An unlocked phone, though, unlocks the legal obstacles.

------
tosseraccount
Doesn't the phone belong to San Bernardino County?

------
mrmondo
Mods: can you please update title to add some context?

------
a-b
Well, no one is protected from thermorectal cryptanalysis. The only difference
is that the government guys want to keep it hidden from the target.

------
darthmaul
Seems that this forum is 'moderated'. Views that don't kiss up to the self-
proclaimed savior of freedom are deleted.

------
wildmXranat
To all the in-love-with-Apple downvoters: please read Schneier's sound
analysis of the same type of situation that RIM (BlackBerry) faced:
[https://www.schneier.com/blog/archives/2010/08/uae_to_ban_bl...](https://www.schneier.com/blog/archives/2010/08/uae_to_ban_blac.html)

/quote: "RIM's carefully worded statements about BlackBerry security are
designed to make their customers feel better, while giving the company ample
room to screw them." /endquote

I have lost enough points on this thread to simply double down on this issue.

This is not a good sign at all. While Google can't compete with Apple on the
principle of "not spying on their users," all Apple has to do is publicize
that principle and then ask its users for forgiveness later.

~~~
rconti
"This is not a good sign" != "This is a bad sign".

I've re-read your comment several times, and I don't get how it's novel or how
it applies here. Of _course_ it _could_ be true in this case, as it could be
true in any decision any company makes. But I don't see how it's insightful or
proves or suggests that Apple is doing the same.

You're not some kind of martyr for the anti-Apple cause here. I think we all
know that Apple could be saying one thing and doing another. That doesn't mean
that they are, and it doesn't mean that this open letter is proof that they
are.

------
7GZCSdtn
As a software developer, I'm always looking for the real bug. Weapons kill.
Not iPhones.

------
fiatjaf
Cry, US Government!

------
satyajeet23
Dear Tim Cook,

Thank you!

------
berkeleynerd
A friend of mine at Apple reported multiple Black Vehicles (Lincoln Town Cars
and Escalades) with at least one having MD License Plates at the Apple
Executive Briefing Center this morning between 11AM and Noon. Occupants had
ear pieces and sun glasses and were accompanied by a CHP (California Highway
Patrol) cruiser and three motorcycle escorts. I suppose it's possible this was
a quick (less than one hour) VIP stop, but given Tim's message last night, as
well as the reaction of folks on campus, who were bandying about comments like
"I don't want to work on this, because I don't want to be deposed," the
impression certainly was that it was not a friendly visit. Given Tim's very
public push-back, I'd think delivery of an NSL with accompanying intimidation
is at least possible. I submitted this to HN and updated in real time. There's
a bit more discussion here:

[https://news.ycombinator.com/item?id=11120365](https://news.ycombinator.com/item?id=11120365)

~~~
archmikhail
Save people like me a trip to the Google: NSL = A national security letter
(NSL) is an administrative subpoena issued by the United States federal
government to gather information for national security purposes. NSLs do not
require prior approval from a judge.

~~~
iamdave
IANAL, but how is it a subpoena if it doesn't originate from the judiciary?

~~~
berkeleynerd
Nicholas Merrill famously fought an 11-year legal battle, and finally won the
right, to reveal all aspects of a National Security Letter (NSL) served to
him. Almost all such letters are accompanied by a complete gag order.

[https://www.calyxinstitute.org/news/federal-court-
invalidate...](https://www.calyxinstitute.org/news/federal-court-
invalidates-11-year-old-fbi-gag-order-national-security-letter-recipient-
nicholas)

EDITED / CORRECTIONS - Thanks, commenters. The battle was won by Nicholas
Merrill, not Ladar Levison of Lavabit fame as I originally posted.

~~~
uxp
It was Nicholas Merrill from a little ISP called Calyx Internet Access that
famously challenged the NSL process.

LavaBit's Lamar Levinson is assumed to be under a gag order from some request
he was given by the US government, which he declined by folding his company,
claiming that he could not comply going forward if he was no longer the
middleman for some form of communications.

~~~
schoen
As a further correction, the Lavabit founder's name is spelled Ladar Levison.

Rather than making assumptions, you can read about the specific kinds of legal
process involved in the Lavabit case at

[https://en.wikipedia.org/wiki/Lavabit](https://en.wikipedia.org/wiki/Lavabit)

You can also read the Fourth Circuit decision on his appeal, among other
things.

------
caogecym
under what kind of pressure would Tim write this public letter?

------
mnglkhn2
There needs to be a distinction between state security and "retail" security.
State security agencies have the legal framework to compel Apple to do
anything and not even talk about it. What I call "retail" security is any
request by any law enforcement agency in the country. Those requests are bound
to come in large numbers and for all kinds of things. On top of that, these
requests apparently are not yet covered by a legal framework. Hence the need
to lean on an old law to try to make Apple comply.

What's at stake for Apple is not only their principles but also one of their
marketing pillars: "you, the user, can trust us with your data/privacy." By
asking Apple to give that up, and quietly, you are actually asking them to
undermine their business model. Shareholders would not appreciate that if they
didn't have a chance to hear about it first. The Apple brand would lose value,
and that would be reflected in the AAPL share price.

My point is that the whole thing needs legal backing. And Apple is asking for
exactly that: give me a law to use, and not something from the 1700s.

------
jsprogrammer
If it is possible to build the requested OS, then it can be said that the
iPhone already has a backdoor.

If the device were truly locked down, there would be no aftermarket way to
unlock it.

My understanding is that Apple was asked to supply software that would prevent
their software from destroying evidence on a particular device. They should
comply with this order, especially given the device in question.

~~~
Synaesthesia
That's true for the iPhone 5 and earlier, but for the iPhone 5s and later
Apple actually made it impossible (see the Secure Enclave). But it's not about
that; it's about the legal implications. This would set a precedent allowing
the government to compel essentially any company to provide keys to decrypt
information, which would be a huge blow to privacy.
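
For scale: Apple's iOS Security Guide states that each passcode attempt costs roughly 80 ms of key derivation bound to the device's hardware, even with the software retry limits stripped out. A back-of-the-envelope sketch (the ~80 ms figure is from that guide; everything else is arithmetic):

```python
# Rough worst-case brute-force time for numeric passcodes, assuming only
# the ~80 ms per-attempt key-derivation floor remains once the retry
# limits and escalating delays are removed.
ATTEMPT_SECONDS = 0.08  # hardware-bound key derivation, ~80 ms per try

def worst_case_hours(digits: int) -> float:
    """Hours to exhaust every passcode of the given length."""
    attempts = 10 ** digits
    return attempts * ATTEMPT_SECONDS / 3600

for d in (4, 6, 10):
    print(f"{d}-digit passcode: up to {worst_case_hours(d):,.1f} hours")
```

A 4-digit PIN falls in minutes and a 6-digit one in about a day, which is why the escalating delays, and who enforces them, matter so much.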

------
DannoHung
Cook says any iOS device could be breached if this software were created. But
other articles have led me to believe that any iOS model with Touch ID is
immune, due to the Secure Enclave being in play even for non-Touch ID passcode
access. Is this wrong?

------
rogersmith
Gotta give it to Apple, they sure know how to pull off a PR stunt.

------
zobzu
What I'm reading is that Apple can remotely install an update that disables
encryption. They just don't want to do it.

But that they have the capability is a bit scary.

~~~
briandear
Please read carefully. They don't have that capability. The FBI wants them to
create it.

~~~
clort
If they can create it, they have the capability.

~~~
ThatPlayer
They never say they can create it. They say the FBI wants them to create it,
which may or may not be possible and is not addressed in this letter.

~~~
zobzu
You've noticed auto-updates, yes?

------
ogezi
It's a slippery slope.

------
blazespin
Apple should be clearer that this is the 5c, not the latest model.

------
jaboutboul
Just unlock the freaking phone for them...

------
z3t4
Instead of the FBI paying Apple engineers to hack a phone, why don't they ask
their kids? It would probably save millions of dollars.

~~~
VLM
What's on the phone that isn't already on at least one cloud server, on NSA
spy servers, and in telco records? That's the real question that everyone is
carefully avoiding.

I mean, you could break into my Android phone at enormous effort to access my
Gmail app, but isn't it easier to just ask Google? And I'm sure the telco and
the NSA are already logging everything anyway.

You could break into my phone to use my phone to use my facebook app to look
at my uploaded pictures, but isn't it a million times easier to just contact
facebook to access my facebook account?

In this new era of dumb terminals, it's like an FBI agent demanding access to
the terminal settings screen of a vintage VT102 in order to track terrorists
or whatever. Or a demand to know my modem init string. It demonstrates a
fundamental lack of understanding of the entire ecosystem from top to bottom.

The purpose of all this drama is to avoid discussion of the insecurity of
cloud services.

It might be that Apple people use the cloud a lot less than us Android people.
I'd be interested, and surprised, to learn that.

------
droithomme
Well, they seem to be saying that the approach they describe, making a
modified OS, would actually work to circumvent encryption on a preexisting
device. That means they already know the device is not actually secure.

They aren't talking about putting a backdoor into future systems; they are
saying it is indeed feasible to place a backdoor on a device already out there
and then use it to access that device. That means the device is not actually
secure.

------
planetjones
With due legal process the police can search property, safe-deposit boxes,
bank accounts, vehicles, etc. Why should a smartphone be any different, just
because Apple says it is?

As much as I value privacy, I really don't agree with Apple's stance here: if
due legal process has been followed, why shouldn't they be able to read the
contents of an iPhone?

And yes, I get that third-party encryption, which Apple doesn't own, can be
used, and that there's little the authorities could do about it - but that's
not the case at hand here.

~~~
GuiA
This is not the government asking to search a single vehicle or safe-deposit
box, to take your examples. This is the government asking that every safe-
deposit box and vehicle in the world be made instantly unlockable by design.

~~~
wildmXranat
Exactly. I think the OP is playing a bit fast and loose with the scope of what
is being asked.

It's not that it's bad in this one case. It's that it's bad if it becomes a
built-in backdoor for whenever they feel like using it.

------
Twisell
[In walk the drones]

"Today we celebrate the first glorious anniversary of the Information
Purification Directives.

[Apple's hammer-thrower enters, pursued by storm troopers.]

We have created for the first time in all history a garden of pure ideology,
where each worker may bloom, secure from the pests of any contradictory true
thoughts.

Our Unification of Thoughts is more powerful a weapon than any fleet or army
on earth.

We are one people, with one will, one resolve, one cause.

Our enemies shall talk themselves to death and we will bury them with their
own confusion.

[Hammer is thrown at the screen]

We shall prevail!

[Boom!]

On January 24th Apple Computer will introduce Macintosh. And you'll see why
1984 won't be like '1984.'"

\------------------------------------------------------------

Apple Superbowl AD "1984"

Transcription courtesy of George Gollin, 1997

Edit: Removed the link to the video. My goal wasn't to draw traffic anywhere;
it was just to point out that some of the Big Brother sentences in an ad aired
30 years ago still have a strong resonance today.

"Our enemies shall talk themselves to death." Hmm... I just read yesterday
that the NSA is believed to use machine learning over cell-phone metadata to
pick drone targets...

------
wildmXranat
Not getting an iPhone, even a secured one - check!

I bet hardware vendors are just salivating at the prospect of producing
thousands of iPhone-cracking docking stations.

------
lunasight
> The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino
last December. We mourn the loss of life and want justice for all those whose
lives were affected. The FBI asked us for help in the days following the
attack, and we have worked hard to support the government’s efforts to solve
this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it.
Apple complies with valid subpoenas and search warrants, as we have in the San
Bernardino case. We have also made Apple engineers available to advise the
FBI, and we’ve offered our best ideas on a number of investigative options at
their disposal.

We have great respect for the professionals at the FBI, and we believe their
intentions are good. Up to this point, we have done everything that is both
within our power and within the law to help them. But now the U.S. government
has asked us for something we simply do not have, and something we consider
too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating
system, circumventing several important security features, and install it on
an iPhone recovered during the investigation. In the wrong hands, this
software — which does not exist today — would have the potential to unlock any
iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake:
Building a version of iOS that bypasses security in this way would undeniably
create a backdoor. And while the government may argue that its use would be
limited to this case, there is no way to guarantee such control.

This is just plain awful: they admit to helping the FBI. How can we trust them?

------
rms_returns
This is quite unlike Apple. Is this the same company that insists on keeping
its source proprietary and is always against FOSS? The idea that you care
about your users' privacy yet still keep control over them by denying them the
freedom to modify the source code is not something I buy.

~~~
nindalf
You're talking nonsense. Protecting users' privacy and keeping source code
closed aren't mutually exclusive.

~~~
rms_returns
Yeah. Loving your spouse and hiding things from them are also not mutually
exclusive. Yet you hardly find people who do both. Not to mention, this
situation would never have arisen if the iPhone were open source, since all
activity would have been monitored by the community.

~~~
aeonflux
You have way too much faith into the the idea, that open source projects are
free of backdoors / exploits and that the community can prevent creation of
those. Aside from that, there is currently no platform, where 100% code is
open source.

~~~
ageofwant
Of course we can't. But we are in a vastly better position to positively
affect these things if we do have the source.

It is immaterial whether or not any platform is currently 100% open source.
rms_returns is right in noting the discrepancy between the noble Apple
sticking it to the man and the walled-garden Apple sticking it to the user.

