
Apple Has Not Unlocked 70 iPhones for Law Enforcement - augb
http://techcrunch.com/2016/02/18/no-apple-has-not-unlocked-70-iphones-for-law-enforcement/
======
WatchDog
Not sure if this has been discussed, but it seems the only reason this is an
issue is that Apple has the ability to install new software on a locked
device.

Couldn’t Apple simply remove the ability for a software update to take place
without the device being unlocked or wiped?

Obviously this wouldn’t apply retroactively to old iOS versions, but it would
be consistent with their change in policy from iOS 7 to 8.

~~~
vilhelm_s
I think they have already done this, but the phone at the center of the
current controversy had not been updated to the new version of iOS. (In the
new version, in order to update the OS on a locked phone you need to enter
your passcode).

Edit: No, apparently this is wrong. I got it from [this Washington Post
article](https://www.washingtonpost.com/news/volokh-conspiracy/wp/2016/02/18/preliminary-thoughts-on-the-apple-iphone-order-in-the-san-bernardino-case-part-1/),
but that article has since been updated.

~~~
ubernostrum
From TFA:

 _The California case, in contrast, involves a device running iOS 9. The data
that was previously accessible while a phone was locked ceased to be so as of
the release of iOS 8, when Apple started securing it with encryption tied to
the passcode, rather than the hardware ID of the device. FaceTime, for
instance, has been encrypted since 2010, and iMessages since 2011._

 _So Apple is unable to extract any data including iMessages from the device
because all of that data is encrypted. This is the only reason that the FBI
now wants Apple to weaken its security so that it can brute-force the
passcode. Because the data cannot be read unless the passcode is entered
properly._
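
What "encryption tied to the passcode" means in practice can be sketched
roughly like this (a simplified illustration using PBKDF2; the function name,
salt, and iteration count here are made up for the example, not Apple's
actual scheme):

```python
import hashlib

# Sketch: entangle the encryption key with both the passcode and a
# device-unique hardware ID, so the key can only be derived on the
# device itself and every passcode guess has to pay the KDF's cost.
def derive_key(passcode: str, hardware_uid: bytes) -> bytes:
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        hardware_uid,        # device-unique salt ties the key to this phone
        iterations=100_000,  # work factor slows each brute-force guess
    )

key = derive_key("1234", b"uid-fused-into-this-device")
```

Under a scheme like this, pulling the raw flash storage off the phone is
useless without the passcode, which is the sense in which Apple "is unable to
extract any data" from the device.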

~~~
pyre
They want Apple to disable the limitations that prevent them from brute-
forcing the passcode. This isn't even possible (supposedly) on newer A7-based
devices that have security hardware (i.e. the "secure enclave").
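
For a sense of scale, a back-of-envelope sketch of why those limitations
matter (the ~80 ms per-guess cost is an assumption drawn from Apple's
published iOS security documentation; the rest is arithmetic):

```python
# If the escalating delays and the 10-try wipe are disabled, the only
# remaining throttle is the hardware key-derivation time per guess.
PER_GUESS_S = 0.08  # assumed ~80 ms per passcode attempt

for digits in (4, 6):
    guesses = 10 ** digits
    worst_case_s = guesses * PER_GUESS_S
    # 4 digits: ~13 minutes worst case; 6 digits: ~22 hours
    print(f"{digits}-digit passcode: {guesses} guesses, "
          f"worst case ~{worst_case_s / 3600:.1f} h")
```

Which is exactly why the software-enforced limits (and, on later hardware,
the secure enclave's own enforcement of them) are the real barrier, not the
key size.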

~~~
SAI_Peregrinus
I'd say it's certainly possible, just difficult. Considering what Chris
Tarnovsky has done as a hobbyist (with a focused ion beam workstation) to hack
several TPM and smart card chips and get their secrets out, the FBI would have
to be incompetent not to be able to do the same thing to Apple's hardware. Or
this is just a fishing expedition, because doing that is difficult and doesn't
give them any more power.

------
sandworm101
>>> Apple also argues that since its reputation is based on security and
privacy, complying with the court’s demands based on an expanded application
of a 200-year-old law could put it at risk of tarnishing that reputation.

On one hand, a corporate reputation. On the other, people from the FBI saying
they need access to prevent terrorist attacks. There are many arguments to be
made. This is not the winner.

~~~
rosser
One of those sides is being fatuously disingenuous.

The San Bernardino shooters demonstrated rather competent op-sec, and the 5c
the FBI wants Apple to backdoor was a _work phone_, not even a personal one.
I'm spectacularly skeptical they used that phone in any way whatsoever
relevant to the attack. More to the point, _the attackers are already dead_.
There is _nothing to prosecute_ here.

They picked this case as the one to push because they thought it had the best
PR value for their purposes, not because of any exigence or concern over
further impending attacks.

That is to say, the FBI is cynically leveraging the outrage at a mass-
shooting, inflicted by "brown people", to further their anti-crypto, "We must
have access to ALL OF THE THINGS!" agenda. It's pretty disgusting, frankly.

EDIT: snark.

~~~
lern_too_spel
Most people who use a work phone do not use a personal phone. This phone could
have evidence to implicate their neighbor ([http://abcnews.go.com/Blotter/san-
bernardino-shooters-neighb...](http://abcnews.go.com/Blotter/san-bernardino-
shooters-neighbor-enrique-marquez-pleads-guilty/story?id=36117532)) in a
crime.

I see nothing "disgusting" about the FBI's behavior here.

~~~
rosser
You're perfectly within your rights to trust law enforcement as much as you'd
like, believe that any investigative power they're given won't be abused, and
that this isn't a cynically calculated grab for exactly that purpose.

I don't.

~~~
lern_too_spel
This isn't a grab for power — this is a power that they _already have_. They
went through the courts to get the proper order, following the letter of the
law. We give the government the power to open bank safes, seize property, and
wiretap; but we make laws that allow them to do so only when it's in society's
best interests. If you don't think the law serves society's best interests,
it's up to you, as a citizen whom your government represents, to help change
it.
In this case, I think you'll find that a majority of people will say it is in
society's best interests to open the San Bernardino shooters' bank safes,
obtain their phone records, and decrypt their phone in order to look for
accomplices.

~~~
rosser
FISA warrants and NSLs are "following the letter of the law", too. You don't
think they're subject to egregious abuse?

As for what the "majority of people" find reasonable, all I have to do is look
at the polling numbers in the current presidential race to find myself in
abject disgust at what the "majority of people" appear to think. They seem not
to give half a shit what happens to anyone else, as long as they can watch the
next episode of American Idol without fear of "getting blowed up by terr'ists"
— despite the fact that more people have _won the lottery_ than have died from
terrorism in the US since September 11th.

EDIT: Anyway, there's nothing new or informative to be found in this
discussion. We disagree. Let's leave it at that.

~~~
iofj
This is a court case. What is under discussion is not what is reasonable, but
what the law is and how it applies to this case. That seems to come down to
whether or not it is considered an "undue burden" on Apple to break security
on this device, and that's it.

If you want the law to change, I would hope that the court sides with the FBI,
as that will provide far more ammunition to anyone wanting to change the law
than ruling in favor of Apple.

~~~
narrowrail
The only law I've seen proposed would force Apple to comply with the FBI
request. Given that, perhaps the law doesn't currently mandate what you think
it does?

------
Overtonwindow
I wonder what Apple's PR machine is doing on this front to support their
position. The more articles I see about what Apple can, cannot, will not, or
could not do, the more I suspect Apple's PR team is behind some of them.

------
dclowd9901
I get so sick of this. Whenever some sort of dictatorial measure is taken or
proposed, the go-to cleanup is "this is the norm, this is status quo, nothing
to see here."

We saw it all over the fucking Bush administration, when the extraordinary
rendition reports and the waterboarding shit were coming out. We even saw Bush
(probably Cheney) writing his stupid little memos, which he tried to mold into
executive orders absolving the CIA of torture.

Does anyone actually buy this fucking defense?

------
hauget
Excuse my ignorance, but is Apple's encryption in iOS 9 devices significantly
stronger than any of the newest Android devices? Also, are there any
precedents of any government demanding the unlocking of any Android phones?

~~~
CaptSpify
>Excuse my ignorance, but is Apple's encryption in iOS 9 devices significantly
stronger than any of the newest Android devices?

AFAIK we don't know, and can't really compare. It's my understanding that
Apple's setup is closed-source, so we can't inspect it, whereas Android's is
open.

Someone please correct me if I am wrong.

>Also, are there any precedents of any government demanding the unlocking of
any Android phones?

This isn't an issue, as, by default, Android phones send everything to the
cloud (with exceptions for custom ROMs, etc.), so the government can just ask
Google for a copy from their servers instead.

------
weinzierl
This is interesting and new to me:

    
    
       > For iOS devices running iOS versions earlier than iOS 8.0 [..]
       > Please note the only categories of user generated active files
       > that can be provided to law enforcement [..] are: SMS, iMessage,
       > MMS, photos, videos, contact, audio recording, and call history.
       > Apple cannot provide: email, calendar entries, or any third-party
       > app data.
    

What is the difference between the two categories? Are email and calendar
entries encrypted on iOS 7 and below?

------
satyajeet23
If it's about Apple, some key detail will always get jacked up by a
journalist, like Shane Harris of The Daily Beast, trying to translate it for
the public.

The report is extremely misleading and an example of bad journalism.

------
marcoperaza
TLDR for what follows: Mandated backdoors must be a red line, but this is not
a request for a backdoor and actually seems pretty reasonable. Trying to argue
that the tech industry shouldn't help, even in this case, is not only the
wrong position in my book, but a sure way to lose the bigger debate.

My views on the general encryption controversy are:

1. Everyone must be free to make their technology as secure as they possibly
can. There can be no mandated weakening of security, back-doors, or other
requirements to make the information more easily accessible by law
enforcement. On newer iPhones, Apple has patched up the flaw that the FBI
wants their help with exploiting. They must continue to be allowed to do that.

2. The government must be able to demand, with a court order predicated on
probable cause, that companies provide any and all information that they have
that could be useful in circumventing their security features. This can be
everything from technical specifications and threat-model analyses, to lists
of unpatched vulnerabilities and code-signing keys.

3. It seems to me that American companies have a moral obligation that goes
beyond the legal obligations in point #2. They should be actively assisting
the government in recovering information, especially when concerning issues of
national security. In extreme circumstances, like total war, this should
definitely be legally mandated. I'm undecided as to what the policy should be
generally. On a practical level, it's probably not feasible for the government
to, e.g. start hacking around the iOS codebase themselves, so just information
might not be enough.

I'm not too troubled by this court order, especially given the particular
circumstances. The right to make products as secure as you can, even from
yourself and the government, is what's really important to defend.

Trying to argue that the tech industry shouldn't help, even in this case, is
not only the wrong position in my book, but a sure way to lose the bigger
debate.

Disclaimer: These are obviously my own personal views and nothing else. They
do not necessarily reflect the opinions, policies, or practices of anyone but
myself.

~~~
cmurf
I call b.s. There is no possible justification for a company to hand over any
private keys, code signing or otherwise.

There are valid questions about whether a data set should or needs to be
encrypted. But giving up private keys means some entity can masquerade as that
key's owner, and alter what that owner has written. Not OK. The
authentication, veracity, and provenance of a data set are vital to the trust
model. Why would you advocate breaking this?

Second, assisting which government? Any? All? Some? What's the metric?

~~~
marcoperaza
Our threat-models and business-models don't have the benefit of being law.
They must rise and fall according to the realities of the real world that they
contend with.

A digital signature does not inherently mean much. It just (pseudo-)guarantees
that a holder of a copy of the private key signed the data. Taking that to
mean anything else, such as that a signed update was freely designed by Apple
with only pure intentions, or even that Apple is the only holder of the
private key, is nothing but our own wishful thinking. This court order has
hopefully shattered our naivete. The real world now says that you can't trust
that a third-party's private keys are not controlled by governments. It was
extremely unrealistic for Apple to include the government as an adversary in
their threat-model, claim to protect against it, but then leave this opening.
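
The point about signatures can be made concrete with a toy example (textbook
RSA with tiny primes, purely illustrative and nothing like Apple's real
signing setup): verification succeeds for any holder of a copy of the private
key, and says nothing about who that holder was or what they intended.

```python
# Toy RSA signing with textbook-sized primes (illustration only).
p, q = 61, 53
n = p * q                # public modulus
e = 17                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent (Python 3.8+)

msg = 123
sig = pow(msg, d, n)     # anyone holding a copy of d can produce this
assert pow(sig, e, n) == msg  # verifies no matter who did the signing
```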

> _Second, assisting which government? Any? All? Some? What's the metric?_

When it comes to questions of legality, the simple answer is that the company
has to abide by the laws of all of the countries that it operates in. If they
don't want to, or if they can't because two countries have conflicting legal
demands, they have no choice but to leave that country. This cannot be any
other way. See my other responses in this thread for a more detailed
discussion of this "but what if China did this" issue, if that's the argument
that you're alluding to here.

When it comes to questions of only morality, it's up to the individual people
in a company to advocate for the right course of action. Only their own
conscience, and the free judgments of their peers, can guide them.

~~~
cmurf
The court order does not require Apple to turn over their private key. What it
does is compel Apple to sign something with their private key, under duress.

You're basically making a might makes right argument, far extended from a
Hamiltonian position of implied powers.

If public key cryptography can't be trusted, and governments are as a matter
of fact not fully trusted (in the U.S., at least, we're taught explicitly that
they can't blindly be trusted, which is why we have a _written_ constitution),
then the entire trust model fails and we can just stop with all of this
nonsense and go back to pen and paper.

