
Apple’s dangerous game - abdullahkhalids
http://www.washingtonpost.com/news/volokh-conspiracy/wp/2014/09/19/apples-dangerous-game/
======
RickHull
Orin Kerr is clearly a lawyer. He argues not for what is right but for what is
permissible given the existing body of law. Where the existing body of law is
wrong, his writings have no sympathy for the damage inflicted on its victims
nor any sense that resisting such wrongs is noble.

If the government never abused its authority, I would be much more sympathetic
to Kerr's position. Given the facts of prosecutorial abuse, vague and
conflicting laws, the ongoing gutting / suspension of habeas corpus, "three
felonies a day"
([http://www.threefeloniesaday.com/Youtoo/tabid/86/Default.aspx](http://www.threefeloniesaday.com/Youtoo/tabid/86/Default.aspx)),
and many more reasons not to rest moral authority on our body of law, I
find his insistence that we revere lawfulness to be foolish at best. It's
important to stand up to unjust laws.

Leaving aside warrants which are merely held to be lawful (where the true
facts, might we know them, show the opposite), it's not just lawful warrants
that this technology protects against. Law-abiding citizens may be compelled
to reveal private information via totally unjust mechanisms such as National
Security Letters. Consider Nick Merrill's experience:
[http://en.wikipedia.org/wiki/Nicholas_Merrill#Challenging_th...](http://en.wikipedia.org/wiki/Nicholas_Merrill#Challenging_the_National_Security_Letter:_Doe_v._Ashcroft)

Also [http://techpresident.com/news/24272/nick-merrill-fought-fbi-...](http://techpresident.com/news/24272/nick-merrill-fought-fbi-can-he-beat-nsa-too)

Let's not even get into protecting yourself from the NSA. Any tools which help
us resist such tyranny, even at the risk and expense of civil disobedience,
are to be applauded. Reserve your condemnation for the John Yoo of the tech
world.

~~~
snowwrestler
Orin Kerr is definitely a lawyer, and has been working and writing on digital
crime and civil rights for years. For example, he was part of the defense team
during weev's appeal of his conviction.

I do think he thinks beyond the technicalities of the law. And I think he's
correct that there are legitimate reasons to pierce device encryption. If a
victim is murdered and their phone is locked, it sure would be nice to see
whether there is some evidence on that phone that could help catch the killer.

That said, he knows better than most what the law does say, and I think that
makes this an op-ed worth reading carefully. I found two of his points
interesting.

1) A passcode is probably not covered by Fifth Amendment protections. Thus a
phone owner who refuses to unlock their phone could be punished by the court
anyway.

2) An encryption system strong enough to thwart legitimate and legal police
actions could lead to a legislative "backlash" (my word, not Orin's) of
increased consequences for failure to unlock. This is the "game" in the title
of the op-ed. Technology and law are often in tension and an escalation from
one side could trigger an escalation from the other.

But I disagreed with his statement that "The policy switch doesn’t stop hackers,
trespassers, or rogue agents. It only stops lawful investigations with lawful
warrants." That seems wrong to me because it's not just a policy change, it's
an improvement in how the OS encryption actually works. Seems to me that could
help keep out some bad guys too.

~~~
RickHull
I completely agree with your assessment of Kerr, and there is nothing wrong
with purely legal analysis.

What got to me initially was the deliberate limiting of legal scope to
warranted privacy violations, as though warrantless privacy violations are
insignificant. He's welcome to do so, but he should be clearer about this and
less rhetorically dedicated to his agenda.

 _EDIT: I see now that he reasons this technology only affects warranted
privacy violations, since Apple has a policy of refusing unwarranted
inquiries. He is overlooking the possibility that Apple may nonetheless be
pressured, deceived, or compelled to violate users' privacy. He doesn't
realize that Apple's claim of being invulnerable to warrants also protects the
user from Apple's own vulnerabilities. It's not just USA law enforcement that
targets users by exploiting Apple's access to protected data._

My understanding is that warrantless privacy violations have grown
tremendously since the Patriot Act, both in scope and quantity -- that we have
many new laws or codes since then which reclassify previously warrantful
privacy violations as now warrantless.

He asks "How is the public interest served by a policy that only thwarts
lawful search warrants?"

This is a nasty rhetorical trick. Does Apple's policy change serve any other
purpose than to thwart lawful search warrants? Kerr puts "no" in your head,
but of course the answer is yes.

He then appeals to the "civil libertarian tradition" of protection from
warrants, saying that Apple is thumbing its nose at "that great tradition".
What a joke. Apple is selling a product with powerful capabilities, some of
which can be used to break the law, all of which have legitimate lawful
purposes. Are Ford, Boeing, Delta, Budweiser, Louisville Slugger, Smith &
Wesson all playing dangerous games as well?

Lastly, this is about removing Apple's ability to unlock. Just like Colin
Percival's tarsnap. Doing proper security via proper cryptography. Minimizing
untrusted third parties. Apple is getting criticized by Kerr for doing the
right thing for its users. There are no victims in this action. It should
matter not a whit that this makes the government work harder to violate
privacy. Apple has no general duty to offer permanent taps for the
government's pleasure.

 _This is why focusing only on warrantful privacy violations is a travesty of
justice._

Please watch this before re-reading Kerr:
[https://www.youtube.com/watch?v=eT2fQu50sMs](https://www.youtube.com/watch?v=eT2fQu50sMs)

------
AnthonyMouse
> Because the victim isn’t alive to share his password, and the phone will
> have locked before the body was found, the government won’t be able to
> search the phone to find the messages. Apple’s policy will keep the police
> from finding the killer. That seems bad.

The problem with this argument is that it applies to _all_ secure encryption.
The purpose of encryption is to keep people not authorized to access private
data from being able to access it. The fact that it may work as designed is
hardly a sufficient excuse to backdoor everything.

Kerr's position seems to be that the government should be authorized to unlock
everything in one way or another. The list of problems with that is long and
well known. We can't actually trust anyone, including the government, with the
keys to everything. Once they have the keys there is nothing to stop them from
using them without a warrant.

And backdoors are security vulnerabilities. You intend for them only to be
used by the government using constitutional process but they end up being used
by criminals and foreign intelligence.

Meanwhile a required lack of security causes chilling effects. Politically
unpopular groups will be afraid to communicate if their devices are compelled
to spy on them and corrupt government officials can use that to harass and
oppress them. Not to mention the small matter that it enables corrupt
government officials to harass and oppress them.

~~~
snowwrestler
I think Orin's point is not that everything should have a backdoor (which
would not comport with his many previous writings on tech and law), but that
Apple's shift, and in particular the public way in which it has been done,
could trigger unintended consequences from the courts or Congress. Hence the
"dangerous game" title.

~~~
toomuchtodo
Is Congress now going to legislate away the fourth amendment?

~~~
snowwrestler
The Fourth Amendment prohibits unreasonable searches and seizures, not all
searches and seizures. Given a warrant is properly issued, a court can hold a
person in contempt if they refuse to provide the documents listed in the
warrant. So the question is, what is the punishment for contempt? Congress can
legislate that for specific situations.

------
eksith
The very first sentence starts with a faulty premise: "Apple has announced
that it has designed its new operating system, iOS8, to thwart lawful search
warrants"

Why couldn't Apple have designed it to remove itself from the burden of having
to play fisherman? While it's true that there may have been moral drivers in
this design decision, it makes sound business sense as well.

You can't betray what you don't know. Which is the ultimate position to take
in order to be competitive in a privacy conscious world. Besides, warrants are
rubber stamps now (as pointed out here and many places elsewhere) and are by
no means the carefully measured moderator of state influence they may have
once been.

------
microcolonel
I think it's hard for a lawyer, who has to say things like "If I understand
how it works" before speaking about a topic, to understand that it is
infeasible to create a system which is both secure in general and insecure
when the government wants something. These changes are likely a response not
only to public distrust of institutions like the NSA, but also to existing
general threats to the security model of previous iOS releases, which, like
any system with a backdoor, could see that backdoor used nefariously.

------
abdullahkhalids
The arguments in this article hinge on one crucial premise: Apple still
owns your device even after selling it to you. This is different from Gmail,
where your data is on servers owned by Google. The analogy to this premise is
that the producer of a safe must be able to provide the government a key to
any safe it sells you. This clearly does not make sense, and neither does
requiring Apple to always have a backdoor to your device.

~~~
nl
_The arguments in this article are hinged on one crucial premise: Apple stills
owns your device even after selling it to you._

Huh? That idea isn't even mentioned in the article, and it seems entirely
irrelevant.

 _The analogy to this premise is that the producers of a safe that they sell
to you must be able to provide the government a key to the safe. This clearly
does not make sense.._

It may not make sense to you, but key escrow is a thing both for encryption
keys[1] and for physical keys[2].

I agree there are pretty serious problems in both cases.

[1] This is basically what happens in iOS 6 and below: Apple has a key they
use to unlock the device. Another example is the aborted Clipper chip:
[http://en.wikipedia.org/wiki/Clipper_chip](http://en.wikipedia.org/wiki/Clipper_chip)

[2]
[https://www.schneier.com/blog/archives/2011/07/physical_key_...](https://www.schneier.com/blog/archives/2011/07/physical_key_es.html)

~~~
golemotron
The point is that when someone sells you a thing, they should no longer be
responsible to someone else for it. You should be responsible. It is now your
thing.

~~~
nl
Oh, I see what you are saying. That's an argument that isn't supported in the
physical world, and one that isn't made in the article.

In this case it is more similar to a car manufacturer being asked under warrant
to produce a new electronic key for a car.

They have specialized skills and tools that allow them to do that, as did
Apple prior to iOS 8.

No one is making the argument that Apple retain ownership of the device.

(To be clear - I think Apple is doing the right thing here. But there is no
point in trying to confuse the issue with inaccurate arguments)

------
chroma
For most of us, Orin Kerr is on the opposite side of the privacy debate. He
thinks that current online privacy laws go too far in the direction of
protecting the accused. There is a summary of one of his talks at
[http://hlrecord.org/?p=10987](http://hlrecord.org/?p=10987).
(It's a shame there is no recording or transcript of that talk.)

I think the main difference between my opinions and his is that he places much
more trust in government.

------
bengrunfeld
Apple's architecture decision proves a quiet point: that Apple feels it has
been illegally pressured by the government (notably the NSA) to crack too many
cell phones, and that their own business and the rights of their users are at
risk from an overly aggressive government. I interpret Apple's move as a
self-defense mechanism, an attempt to stop the immoral actions of a government
overstepping its bounds and to protect the people Apple cares about most:
their users.

------
clamprecht
The article asks:

> How is the public interest served by a policy that only thwarts lawful
> search warrants?

Perhaps the answer is that judges act as rubber stamps now, authorizing way
too many search warrants. The author assumes that the judges are fairly
applying the 4th amendment.

~~~
PhantomGremlin
> judges act as rubber stamps now

Exactly. Our government is as overbearing and oppressive as it's been in a
very long time. I read somewhere that judges approve something like 99% of the
search warrants presented to them. They no longer serve as a check on law
enforcement.

I'd love to hear otherwise, to hear how judges are doing a good job balancing
the interests of people against the interests of law enforcement.

~~~
vinceguidry
I did not downvote you, but there's a perfectly good explanation for why 99%
of warrants would be approved.

Consider that the police know what warrants a judge is likely to approve and
which he's likely to turn down. Over time, police departments will learn to
submit the warrants they can get approved and not bother wasting a judge's
time with warrants that won't be approved.

So very high approval rates do not necessarily signal the loss of the judge's
ability or willingness to check police power. Just that everybody involved is
doing their jobs as professionals in their domain.

~~~
AnthonyMouse
> there's a perfectly good explanation for why 99% of warrants would be
> approved.

Your argument is that the approval rate could be 99% legitimately. Even if
true it doesn't actually provide any evidence that judges are not acting as a
rubber stamp, it only attempts to discount some evidence in favor of it.
Moreover, the fact that the approval rate is 99% is still evidence that judges
are not being very critical in approving warrants. The high rate makes it
statistically more likely that judges are approving warrants uncritically than
would be the case if the rate was lower.

~~~
vinceguidry
> Moreover, the fact that the approval rate is 99% is still evidence that
> judges are not being very critical in approving warrants.

No it is not.

> The high rate makes it statistically more likely that judges are approving
> warrants uncritically than would be the case if the rate was lower.

As I said in another sub-thread, you are going to have to explain the
statistics involved if you want to make an argument from them. Otherwise you
are just being circular. You cannot expect a statement to be its own proof.

~~~
AnthonyMouse
> No it is not.

Yes it is. There is no question that it is evidence, so you can only be
questioning whether the evidence supports the proposition. You want a
mathematical proof? OK. Without knowing the percentage of approved warrants we
can't exclude the possibility that exactly zero warrants were approved, which
would be a hard disproof of the proposition that judges are uncritically
approving warrants. Discovering that the percentage is nonzero eliminates the
possibility that it is 0%, which increases the probability that the
proposition is correct. Continuing by induction, with each warrant not
approved the possibility that it was approved uncritically is eliminated,
whereas with each warrant approved the possibility that it was approved
uncritically is retained. So the higher the percentage of approved warrants
the more it supports the proposition that they are being approved
uncritically. QED.

What you're arguing is that it's weak evidence. It could be that there were
many legitimate warrants requested and few illegitimate ones. But that's a
weak argument unless you can present some contrary evidence for the opposite
position. All you're doing is claiming that the proposition hasn't been
strongly proved, not proving it incorrect.
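The back-and-forth above is really a disagreement about how much a 99% approval rate should shift one's belief. A toy Bayes' rule calculation makes the shape of the argument concrete; note that every number below is invented purely for illustration and is not an empirical estimate about real judges:

```python
# Toy Bayesian sketch of the "evidence" dispute above.
# All probabilities are invented for illustration only.

def posterior_rubber_stamp(prior, p_high_given_rubber, p_high_given_critical):
    """P(judge is a rubber stamp | a ~99% approval rate was observed)."""
    num = prior * p_high_given_rubber
    den = num + (1 - prior) * p_high_given_critical
    return num / den

# A rubber-stamp judge almost always shows a very high approval rate, but a
# critical judge can too, if police self-select which warrants to submit.
post = posterior_rubber_stamp(prior=0.5,
                              p_high_given_rubber=0.95,
                              p_high_given_critical=0.60)
print(round(post, 3))  # 0.613: belief shifts toward "rubber stamp", but only weakly
```

Under these made-up numbers both commenters are partly right: the high rate is evidence for uncritical approval (the posterior rises above the 0.5 prior), but because self-selection also predicts a high rate, it is weak evidence.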

~~~
vinceguidry
Your position is circular. You are defining 'critical-ness' with the sole
criterion being the approval rate. If we accept that as the only criterion,
then sure, the proposition stands, but then it loses meaning and becomes
nothing more than a rhetorical device.

Generally speaking, we mean 'critical' to involve the weighing of a decision
against an ethical standard. Assuming the standard isn't changing, we would
expect an equilibrium to emerge where the parties involved would come to
understand the standard, especially given that the warrant process is not
adversarial. So there's no reason not to expect the approval rate to settle at
close to 100%, with the few denials being law enforcement attempts to skirt
the boundaries.

Anything less than a very high rate would indicate that the standards are
inconsistent or are being inconsistently applied. The only reason I can see to
be inconsistent is to game statistics like this one that seem to mean more to
certain people than they actually do.

~~~
AnthonyMouse
> Your position is circular. You are defining 'critical-ness' with the sole
> criteria being the approval rate. If we accept that as the only criteria,
> then sure the proposition stands, but then it loses meaning and becomes
> nothing more than a rhetorical device.

There is nothing circular about it. There is no requirement to exclude other
evidence. Regardless of what other evidence exists, discovering that there is
a very high approval rate makes it more likely that warrants are approved
uncritically than it was before the approval rate was known.

> Assuming the standard isn't changing, we would expect an equilibrium to
> emerge where the parties involved would come to understand the standard,
> especially given that the warrant process is not adversarial. So there's no
> reason not to expect the approval rate to settle at close to 100%, with the
> few denials being law enforcement attempts to skirt the boundaries.

The fact that the process is not adversarial only makes it more likely that
judges are approving warrants they shouldn't be. Moreover, prosecutors have
every incentive to try to "skirt the boundaries" as often as possible, unless
the boundaries are so broad they don't need to be skirted.

> Anything less than a very high rate would indicate that the standards are
> inconsistent or are being inconsistently applied.

Or that prosecutors are continually testing the fences as they have every
incentive to do.

------
cwyers
"The first question is whether the government can lawfully compel the
telephone’s owner to divulge the passcode. I believe the answer is that yes, a
person can in fact face punishment for refusal to enter in the password to
decrypt his own phone. If the government obtains a subpoena ordering the
person to enter in the passcode, and the person refuses or falsely claims not
to know the passcode, a person can be held in contempt for failure to comply."

Soooooooooo... there already is a legal mechanism to force the phone's owner
to divulge the password. Why isn't that enough?

~~~
andrewchambers
I don't know how they can reasonably prove you remember what the code is.

~~~
hangonhn
Because most people use their phones on a daily basis, it would be rather
difficult to convince a jury that you don't remember, especially when most, if
not all, members of the jury will have a smart phone themselves.

------
aianus
I would rather let a few murderers go free than expand the government's power
to store and retrieve private communications. You never know what benign
things you might be doing today that will one day be dangerously illegal (or
maybe just embarrassing enough to hurt you now that you have some kind of
power).

------
foobarqux
Kerr's argument sounds silly when you use analogies with conventional devices.
Nobody would argue that a highly secure safe should have a backdoor that can
be used by the manufacturer to open it.

~~~
rmc
Safes do have a back door. Drills.

If you get possession of a safe, you can physically cut it open (given enough
money, expertise and time).

------
ianferrel
The public is interested in (and is served by) a technical solution to lawful
search warrants because the public no longer believes that lawful warrants are
just.

~~~
chasing
You're speaking for an awful lot of people and an awful lot of different kinds
of search warrants....

------
dozenal
"Apple’s design change [is] one it is legally authorized to make, to be
clear."

That's really all that matters to me.

------
jqm
These are the fruits of the abuse of the fourth amendment.

This action is a prime example of why (ethics and liberties aside) getting
carried away with surveillance and warrantless snooping was ultimately a bad
idea even for those who were doing it.

There are reasons for fair treatment of citizens that stem directly from
practicality. But lack of foresight and hubris seem all too common with
government officials as of late.

------
hatty
I'll leave it to other hackers to put it more eloquently. Any means to bypass
the encryption on iOS 7 and before are vulnerabilities that adversaries can
use to bypass the encryption on iOS 7 and before. Apple is basically saying
that they didn't build in back doors, which this author is making the case
for. iOS 8 data is still available to the government by other means than
warrants and at much, much, much higher expense.

~~~
zalzane
are you sure they built in a real backdoor?

i thought they salted user PINs with a hard-wired nonce that's specific to
every device - then when the fuzz needs to get the device unlocked apple looks
up what the hard-wired nonce is for that specific device, and then crack the 4
digit pin.

anyone have details on how apple actually unlocked devices?

~~~
wyager
> apple looks up what the hard-wired nonce is for that specific device, and
> then crack the 4 digit pin.

Not quite. There are two nonces involved. One is (probably) easy for Apple to
extract (the randomly generated, re-writable value in effaceable storage) and
the other is (supposedly) very difficult to extract, because it's burned into
the CPU hardware. If all works as intended, the only way to extract the second
one is to decap the CPU and read it with a microscope.

Also, you can have arbitrarily long PINs, including alphanumeric.

If Apple's security PDF is correct, the only obvious way Apple can break the
PIN is via brute force, which I believe they claim to provide when LEAs
request it.
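The derivation described above can be sketched roughly as follows. This is a simplified stand-in, not Apple's actual KDF: PBKDF2 substitutes for the hardware-entangled derivation, `DEVICE_UID` is a hypothetical hardware-fused secret, and the iteration count is kept artificially low so the example runs quickly. The point it illustrates is why a 4-digit PIN space falls instantly to brute force once the device secret is available, while longer alphanumeric passcodes grow the search space exponentially:

```python
# Illustrative sketch only -- not Apple's real scheme.
import hashlib
import itertools

DEVICE_UID = bytes.fromhex("aa" * 16)  # hypothetical hardware-fused secret

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # PBKDF2 stands in for the hardware-entangled derivation: the passcode
    # alone is useless without the per-device secret acting as the salt.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 1_000)

# Key produced by some unknown 4-digit PIN on this device:
target = derive_key("4821", DEVICE_UID)

# With the device secret in hand, all 10,000 PINs can be tried directly.
found = next(pin
             for pin in ("".join(t) for t in
                         itertools.product("0123456789", repeat=4))
             if derive_key(pin, DEVICE_UID) == target)
print(found)  # 4821
```

A real attacker without the fused secret would instead have to decap the CPU, as the parent comment notes, which is what makes the second nonce the interesting one.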

------
ecspike
I don't think the genie can be put back into the bottle. If Congress passes
laws to require it on US-bound devices, I could see interested parties
importing from Asia and Europe and/or embargoes on our devices abroad.

It could hurt US companies abroad because they will rightfully think the US
Gov't has a backdoor. I think Apple and Google would use their lobbying funds
to make sure the author's hopes never happen.

------
joeblau
> Apple’s new policy seems to thumb its nose at that great tradition. It stops
> the government from being able to access the phone precisely when it has a
> lawful warrant signed by a judge.

When the government has a warrant, the person being called into question is
being served the warrant. The way it works now, the Government just bypasses
the person and forces the carrier/handset manufacturer to give up information.
This step just brings the law back in line with how it was traditionally
instituted and enforced. That's like the Government going to Audi for a backdoor
key to unlock my car while I'm not there so they can search it.

------
polarix
The only really consistent position will become to treat a person's digital
trace as we treat the contents of their head. The two are merging sooner
rather than later.

------
skriticos2
It's clear by reading the article that this is a purely one-sided legal
assessment of the issue.

Putting myself in Apple's shoes, I think this decision was guided more by
financial, risk, and PR motivations.

Apple is a mass technology company focused on synergies. Dealing with red tape
and custom data decryption is definitely outside that area. It adds additional
cost, not just for the work performed but also organisational overhead (they
probably have to hire people for this, open new organisational units, or divert
existing resources from more useful work). I'm sure that can't simply be
expressed in numbers, much less covered by the government, even if they pay
for the services.

Then there is the technical risk: if a black hat, a leak (e.g. a disgruntled
employee), or someone outside the organisation ever acquires the means to
unlock customer phones outside the confines of Apple, all hell would break
loose. And while I'm sure they take proper precautions, not having to deal
with this risk at all is likely more desirable.

The PR side is the most obvious one: there are lots of people who are not
comfortable that Apple can simply unlock their data and will buy something
else just because Apple is not so forthcoming with court warrants. More
importantly, businesses will have a very hard time justifying the risk of
information leakage from using a product that has outside access vectors.

So this decision seems to make perfect sense to me from a business
perspective: less cost and operational overhead, less risk of something going
horribly wrong, and better PR. What's not to love about it?

Note: IANAL, I don't own any Apple devices and I'm not a U.S. citizen nor do I
live in the U.S.

------
simonh
Looking at this from outside the US, the situation looks very different. Any
argument that US law enforcement should have a right to require access to the
contents of phones, and that it is immoral to foil lawful requests to access
user data, has to pass a simple test: would you make exactly the same argument
in favour of government access in China, Russia, Iran, etc.?

Apple has customers and users across the globe. Any protection or lack thereof
they put into their OS affects all of them, not just the ones in the US. Their
policy has to serve all of those people equally. User protections in iOS 8
protect users in these countries just as much as they protect user data in the
US. Weaknesses and gaps in that protection expose user data in those countries
just as much as they expose it in the US.

------
chvid
I think what Apple is doing is perfectly reasonable. They are basically
saying: Dear government, if you want us to make a backdoor in our software and
set up a service to which you can send phones to have them unlocked, then
please make a law that explicitly forces us to do so.

------
golemotron
All of this debate happened around the Clipper Chip in the 1990s
[http://en.wikipedia.org/wiki/Clipper_chip](http://en.wikipedia.org/wiki/Clipper_chip)

Sane people realized then that mediated security is no security at all.

------
briandh
I know these "blow-by-blow" comments can seem petty, but there are just so
many things wrong with this post. It's sad, because I have a lot of respect
for Kerr (besides being a very intelligent guy, he participated in weev's
legal defense pro bono).

> That’s hugely important. And under Apple’s old operating system,
> cryptography protects iPhones from rogue police officers, too.

Kerr is a lawyer, not a techie, and it really shows here. This statement first
of all implies that the previous system was "adequate, but with a backdoor",
which of course we know is not possible.

But that is a matter of theory; the practice, according to [1], is that
previously iOS filesystem encryption wasn't tied to the user passcode anyway,
so it wasn't even a matter of "decrypting" the devices, nor was it something
only feasible for Apple to do. Indeed, vendors such as Elcomsoft (believably,
given the info in those slides) claim that their software can grab most user
data without a password [2].

The aforementioned slides mention a second level of encryption, "data
protection", which is derived from the passcode, and which Elcomsoft claims
not to be able to access. I wonder whether Apple could (since, if implemented
properly, it shouldn't be able to).
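The two layers described in those slides can be sketched roughly like this. It is a deliberately simplified illustration, not Apple's actual scheme: the key names are invented, PBKDF2 stands in for the real passcode derivation, and XOR stands in for real key wrapping (AES key wrap in practice):

```python
# Simplified sketch of the two encryption layers -- not Apple's real design.
import hashlib
import os

# Layer 1: the filesystem key is random and stored on the device. It is NOT
# derived from the passcode, so it can be recovered without knowing one --
# which is why forensic tools could grab most data without a password.
filesystem_key = os.urandom(32)

# Layer 2 ("data protection"): class keys are wrapped under a key derived
# from the user's passcode, so they are unrecoverable without it.
def passcode_key(passcode: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000)

salt = os.urandom(16)
class_key = os.urandom(32)

# XOR stands in for real key wrapping (AES key wrap in practice).
wrapped = bytes(a ^ b for a, b in zip(class_key, passcode_key("hunter2", salt)))

# Only the correct passcode unwraps the class key:
recovered = bytes(a ^ b for a, b in zip(wrapped, passcode_key("hunter2", salt)))
assert recovered == class_key
```

If data protection is implemented properly, the layer-2 keys exist only while the passcode-derived key is available, which is why a vendor shouldn't be able to reach that data without the passcode.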

> Because Apple demands a warrant to decrypt a phone when it is capable of
> doing so, the only time Apple’s inability to do that makes a difference is
> when the government has a valid warrant.

Kerr never says it, because he knows it's not true, but he gives the
impression throughout the piece that because Apple could access on-device data
when presented with a warrant, it had to. It didn't. Perhaps Kerr thinks they
had/have a moral obligation?

> The policy switch doesn’t stop hackers, trespassers, or rogue agents. It
> only stops lawful investigations with lawful warrants.

I don't know what Kerr is getting at here. If it is within the means of
"hackers" or "trespassers", it is surely within the means of law enforcement.

> How is the public interest served by a policy that only thwarts lawful
> search warrants?

But it doesn't _only_ thwart lawful search warrants, it protects against other
would-be intruders. Apple is simply choosing, in light of recent controversies
regarding government incursions on privacy, to highlight that aspect.

> Apple’s new policy seems to thumb its nose at that great tradition. It stops
> the government from being able to access the phone precisely when it has a
> lawful warrant signed by a judge. What’s the public interest in that?

Apple is responding to a political and market reality: a lot of us don't
believe that "great tradition", as practiced, is actually serving the public
interest. I guess it is out of the scope of the piece for Kerr to explain why
we're mistaken, but many of those applauding Apple's decision aren't going to
buy it, and yet it's his argument's key premise.

> Because the victim isn’t alive to share his password, and the phone will
> have locked before the body was found, the government won’t be able to
> search the phone to find the messages. Apple’s policy will keep the police
> from finding the killer. That seems bad.

Maybe. But recall, for perspective, that 15 years (generously) ago people
didn't carry these repositories of personal information with them.

> If we get a lot of cases like that, I suspect Congress may look to
> legislation to try to restore the privacy/security balance more in the
> direction of the traditional Fourth Amendment warrant requirement.

I would expand on the same point as above: what year's balance are we striving
to "restore" to? Law enforcement today have unprecedented access to personal
information, and that will be only marginally less true when smartphone
encryption is more widespread and robust.

> The most obvious option would be follow the example of CALEA and E911
> regulations by requiring cellular phone manufacturers to have a technical
> means to bypass passcodes on cellular phones.

I know Kerr is talking about what Congress might do and not what would be
ideal, but I'm sure Kerr knows the lessons of the Clipper chip [3].

I know I'm preaching to the choir here, but I've been stewing over this since
I read it on Friday. Apologies for the rantiness.

[1] [http://www.slideshare.net/eltufl/ios-encryption-systems](http://www.slideshare.net/eltufl/ios-encryption-systems)

[2] [http://www.elcomsoft.com/eift.html?r1=pr&r2=ios6#passcode](http://www.elcomsoft.com/eift.html?r1=pr&r2=ios6#passcode)

[3] [https://en.wikipedia.org/wiki/Clipper_chip](https://en.wikipedia.org/wiki/Clipper_chip)

~~~
hawleyal
Well said.

------
mikhailt
While I do understand the situation law enforcement is dealing with, it is
not an excuse to deny us complete privacy and full encryption support on our
digital devices. Nobody, including governments, has any right to access my
personal data. If I don't share it with anybody, it's mine, just like the
thoughts in my head.

I agree with what Apple is doing and I want them to do more. There are still
some remaining holes that need to be closed, and all users should be
encouraged to enable 2FA, with it becoming mandatory in a few years.

------
joesmo
So basically, the author wants every "secure" software system to have a
backdoor. iOS versions prior to 8 all had a backdoor and he's lamenting that
Apple has closed this backdoor. Not only that, if this applies to the iPhone,
it must apply to any and all encryption software. While it's widely known that
the government already has backdoors into a lot of popular software, codifying
this in law or expecting that all software be built with backdoors is
preposterous.

Furthermore, his feeble explanation of why this isn't a 5th Amendment issue
doesn't stand up. The courts have so far come down on both sides of the issue,
and while he cites one court case, there are others and other decisions that
disagree ([http://www.outsidethebeltway.com/federal-appeals-court-
fifth...](http://www.outsidethebeltway.com/federal-appeals-court-fifth-
amendment-protects-suspect-from-having-to-decrypt-hard-drive/)). It's obvious
that the author believes so strongly in the right of the government to have a
backdoor that he ignores significant case law that does not back his viewpoint
(I'm sure he's aware of it or he's a piss-poor law professor).

Because the author doesn't see this for what it is, a backdoor, he can't even
begin to understand other, unforeseen consequences (such as criminals getting
ahold of the decryption process) and even brushes off this issue early in the
article.

~~~
stephen_g
> iOS versions prior to 8 all had a backdoor and he's lamenting that Apple has
> closed this backdoor.

iOS versions prior to version 8 didn't have a backdoor - I believe the
difference is just that less content was stored encrypted before.

~~~
joesmo
In that case, prior versions are simply insecurable. If less content was
encrypted, however, why would investigators need Apple's help in getting to
the data, when it is stored unencrypted right on the device?

------
a3_nm
(IANAL.) Of the last three options proposed in the article:

\- Option 1 doesn't make sense to me because, if you assume that private
crypto is lawful, the phone's data could be encrypted using the passcode, so
that it would be technically impossible to bypass the passcode.

\- Option 2 seems like a bad idea to me. The central premise is that there is
one passcode which you must know, so you can be forced to provide it. However,
if you are using deniable encryption you could have multiple passcodes, an
"everyday" passcode, a "secondary" passcode for sensitive data, a "duress"
passcode, etc., and there would be no way for you to prove that you have
provided everything. Under this interpretation, forcing users to hand in "all
available passcodes" sounds too much like asking the accused to help the
investigation make sense of their encrypted data.

\- Option 3 seems more reasonable (although I think it is undesirable). In
fact, I am surprised that retention of text messages is not yet an obligation
under the law (though maybe they are already retained, just not lawfully). Of
course this would make a lot less sense if people (or their phones) encrypted
texts...

Yes, telephony is stuck with ad-hoc historical messaging protocols (SMS),
proprietary devices, OSes, and applications, with strong dependencies on
their manufacturers and mobile providers, and people do not often use serious
crypto on them. But mobile phones with a mobile connection are essentially
computers on a network, and one should be wary of special bypass mechanisms
for passcodes or texts that would make no sense for computers.
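To make the first point concrete, here is a minimal sketch of why
passcode-derived encryption leaves nothing to "bypass". This uses PBKDF2 with
illustrative parameters; real iOS additionally entangles the passcode with a
per-device hardware key, which this toy omits:

```python
import hashlib
import hmac
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Stretch the passcode into a 256-bit key. A high iteration count
    # makes each brute-force guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = os.urandom(16)             # stored on the device; need not be secret
key_real = derive_key("1234", salt)
key_wrong = derive_key("0000", salt)

# The key exists only while the correct passcode is entered. There is no
# copy on file anywhere for the vendor to hand over.
assert not hmac.compare_digest(key_real, key_wrong)
```

If the data key is computed this way, a legal order to "bypass the passcode"
demands something the manufacturer does not possess.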

------
drivingmenuts
> Apple’s new policy seems to thumb its nose at that great tradition.

Tradition is not law.

I like Apple's approach. It's not their job to do the government's job for
them nor is it their job to make the government's job easier. It's their job
to develop the best product they can and not be hindered by tradition or
whatever other albatrosses they don't wish to accept.

The government is now free to develop the best methods they can to retrieve
information, within the law.

------
bhashkarsharma
With all due respect, being somewhat of a privacy and security freak
(although I use social networks etc.), all I can read while going through the
article is a crybaby whining about how it is harder than before to get user
data.

The article assumes that warrants and snooping are only used in legal and
genuine cases, which has continuously been proved wrong lately. Not to
mention all the secret courts, FISA orders, etc.

------
jrochkind1
I agree it's odd that they're actually marketing it as anti-law-enforcement.

There are certainly other reasons to appreciate secure encryption, though.
But "not even a disgruntled Apple employee, or one paid by your business
competitor" raises questions that are not good marketing to put in people's
heads. Even "not even a hacker, because we used actual secure crypto" is not
what they want people to think about.

Now, personally, I include "not even law enforcement" in my list of attackers
I'd like to defend against, and there's nothing wrong with that (and there's
not supposed to be in America, the 4th Amendment and all). But the fact that
it's actually good marketing generally (or at least they're betting on it),
well, we have Snowden to thank for that. And I doubt it will last.

Also, of course, there are a variety of reasons the iPhone crypto as a system
isn't all that secure, including but not limited to the fact that we still
have to trust Apple (we can't see the code, or the code in the updates pushed
out regularly).

Not that I disagree that it's a benefit to make it harder for law
enforcement.

------
abalone
Is this actually new in iOS 8? I thought this has been the design of iMessage
from the beginning -- namely that Apple does not have the decryption keys.

What Apple does have is the directory of recipient public keys that your
device should use to encrypt its messages. (Background: iMessage encrypts each
message separately for every recipient device, which shows just how far Apple
went to protect key security. Not only does Apple not have the keys, private
keys never have to be exchanged among devices ever.)

But technically speaking -- I have no idea if they actually do this -- that
gives them a way to insert a "wiretap" of sorts in the form of an additional,
silent recipient that you don't know about. Think of it as adding another
device to your iCloud account, only it's not yours. Still, this could at least
be discovered by monitoring the size of the outgoing data to see if it matches
the expected number of recipients.
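The size check described above can be sketched as follows. This is a toy
model, not Apple's actual protocol: `wrap_key` stands in for the per-device
public-key encryption of a fresh message key, and the device names are made
up. The point is only that each recipient device adds a fixed-size wrapped
key to the payload:

```python
import hashlib
import os

MSG_KEY_LEN = 32  # one fresh symmetric key per message

def wrap_key(msg_key: bytes, device_pubkey: bytes) -> bytes:
    # Toy stand-in for encrypting the message key to one device's public
    # key (the real protocol uses per-device asymmetric crypto).
    pad = hashlib.sha256(device_pubkey).digest()
    return bytes(a ^ b for a, b in zip(msg_key, pad))

def wrapped_keys(recipient_devices):
    msg_key = os.urandom(MSG_KEY_LEN)
    # One wrapped copy of the message key per recipient device.
    return b"".join(wrap_key(msg_key, d) for d in recipient_devices)

normal = wrapped_keys([b"dev-A", b"dev-B"])
tapped = wrapped_keys([b"dev-A", b"dev-B", b"dev-SILENT"])

# An extra silent recipient adds exactly one more wrapped key, which in
# principle shows up as a larger payload on the wire.
assert len(tapped) - len(normal) == MSG_KEY_LEN
```

Of course, a real client would need to know the legitimate device count to
notice the discrepancy, which is exactly the directory information Apple
controls.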

~~~
function_seven
The article is referring to the contents of the phone's storage, not iMessage.
Apparently before iOS 8, Apple could decrypt the contents of a phone's
storage, and would do so when presented a warrant.

------
cyphunk
His argument for why Apple should assist the government in obtaining private
information shows how easy it is to write the meaning of the constitution
however you please, at least when a society is in the middle of intense
technological evolution:

    
    
        The civil libertarian tradition of American privacy law, 
        enshrined in the Fourth Amendment, has been to see the 
        warrant protection as the Gold Standard of privacy 
        protections. ... the government must go to a neutral
        magistrate and make that case before it conducts the 
        search. ... But when the government does make that 
        showing, on the other hand, the public interest in
        solving crime outweighs the privacy interest. That’s
        the basic balance of the Fourth Amendment

------
CRASCH
A back door is a vulnerability regardless of its intended use. There is no
way to ensure it can't be used elsewhere. I would be shocked if the existing
decryption method used internally at Apple weren't fully available to other
agencies. How hard would it be for a three-letter agency to get someone into
the right position to obtain the keys and tools?

The closing of this vulnerability makes business sense for Apple. Considering
how connected these devices are, only marooned data would not already be
available elsewhere: perhaps a photo that wasn't uploaded, or a note taken.
Everything else is accessible. Location data is available via cell tower
logs. Voice calls, call history, SMS, email, web history, etc. are all
already available...

------
DigitalTurk
For a moment I thought I was reading a Xinhua article.

~~~
yvsong
This could be the reason why iPhone 6 has not been approved in China. With
heavy smuggling they will get plenty. A lot of arguments assume the (American)
government is decent, but iPhone is a global device. Let's hope Apple products
have the same security all over the world.

~~~
hawleyal
Agreed. Everyone forgets that my phone has to be protected from everyone in
the world.

------
pothibo
So, when BlackBerry was doing it, it was all good. Now that it's Apple, it's
bad?

------
frozenport
Apple is an international company; while I trust my government, I do not
trust the Chinese government.

~~~
gonzo
"Love your country, but don't trust your government."

The question is: why do we take Apple at their word here?

[http://www.jwz.org/blog/2014/09/why-we-believe-
apple/](http://www.jwz.org/blog/2014/09/why-we-believe-apple/)

------
mariodiana
Let me see if I get this straight. If I design a front door that can't be
broken down, which also comes with a lock that can't be picked, it's my
responsibility to keep a duplicate of every key, just in case law enforcement
should come by and want to get into someone's home?

------
iaw
This seems likely to be more than a little related to the recent warrant
canary discovery. My expectation is that Apple was recently forced to do
something they considered unlawful and the iOS8 changes will prevent that
issue in the future.

~~~
gcb0
the world is much simpler than that, my friend.

Apple left the canary there so as not to alarm their customers. As far as we
know, they could have received millions of warrants and never removed it for
market reasons, or they could have received warrants that told them not to
change that page so as not to harm ongoing investigations.

Anyway, it is very conveniently timed with the release of iOS 8 and their
month-old marketing plan of announcing that new 'super secure feature' and
the "we can't comply with warrants even if we wanted to" BS. I'd have to be a
huge fool not to think that this is a pure marketing ploy to push me to move
to iOS 8, and nothing more.

Also, you can probably still access iCloud pictures that the phone uploads
automatically by just guessing random URLs or some such.

------
senthilnayagam
I support Apple's stand. All new iPhones have fingerprint-based
authentication, which stores the info in encrypted form inside the phone
itself. Let the government force the accused to unlock via fingerprint.

------
ZeroCoin
>The first question is whether the government can lawfully compel the
telephone’s owner to divulge the passcode. I believe the answer is that yes, a
person can in fact face punishment for refusal to enter in the password to
decrypt his own phone. If the government obtains a subpoena ordering the
person to enter in the passcode, and the person refuses or falsely claims not
to know the passcode, a person can be held in contempt for failure to comply.

I thought the exact opposite was true?

That no judge or court can force you to disclose your password?

I.e., the protection against self-incrimination.

Apparently it varies from country to country:
[http://en.wikipedia.org/wiki/Key_disclosure_law](http://en.wikipedia.org/wiki/Key_disclosure_law)

------
drderidder
It's a catch-22. They'll be criticized, but they need to deploy much stronger
security measures to show customers that they're not playing along and 'data-
pimping'. Unfortunately for them, it's probably too late. When open-source
alternatives like Ubuntu Mobile and Firefox OS are ready for prime time, they
stand to take a good market share, imho.

------
gress
Jeff Bezos's newspaper publishes an article impugning Apple.

------
junto
A right to silence and a right not to incriminate oneself.

------
gradestack
You have to explain the statistics involved.

------
jimmyshegzy
I wanna learn real hacking. I'm a Nigerian, 18 years old.

------
dba7dba
I feel nauseated by the holier-than-thou attitude of Apple.

While Apple is pointing fingers at Google about collecting user data, let's
look at the kinds of business Apple and Google are in.

Apple's main strength is designing products, farming them out to contractors
for assembly, marketing them, and selling them.

When it comes to the child labor and near-slavery working conditions that
Chinese assembly-line workers labor in, Apple's defense is that they are not
Apple employees. Well, that doesn't fly with me. Apple is still paying (or
not paying enough) for those tough conditions.

And what business is Google in? Their business is collecting user data and
selling it for ads. They don't farm out assembly work to contractors that
force employees to work in slavery-like conditions.

So which do you like less? 1) Supporting a company that condones slavery-like
working conditions? 2) Supporting a company that collects your data but does
not rely on contractors who put workers into slavery-like working conditions?

So I'd say it's even. Apple is not any better than Google. And frankly,
looking at me.com and the other cloud platforms that Apple has tried to prop
up and failed at so far, I'd say Apple is better off staying in their line of
business.

~~~
rimantas
Tell me, was me.com also built using child labour? Or does Android run on
nothing but thin air? If not, can all makers show something like
[http://www.apple.com/supplier-responsibility/](http://www.apple.com/supplier-
responsibility/) ?

