
Apple – Privacy – Government Information Requests - declan
http://www.apple.com/privacy/government-information-requests/
======
downandout
_" On devices running iOS 8, your personal data such as photos, messages
(including attachments), email, contacts, call history, iTunes content, notes,
and reminders is placed under the protection of your passcode. Unlike our
competitors, Apple cannot bypass your passcode and therefore cannot access
this data"_

This is key. The way we engineer software and services can have a major impact
on the fight against overly invasive government requests. We know that these
requests will come; it's our responsibility to design things in a way that, to
the greatest extent possible, protects customers from what we can legally be
compelled to hand over.

While this certainly serves their own interests, kudos to Apple for baking
this type of consideration into the basic iOS design. They should and will be
financially rewarded for it.

~~~
kristofferR
The quote is extremely misleading. Sure, it's encrypted by your passcode, and
that's great.

It's important to note, however, that the passcode is just 4 digits long by
default, and could be brute-forced by Apple in short order if they wanted to.
So saying that "Apple cannot bypass your passcode" is misleading, as guessing
it is absurdly easy.

[http://www.slideshare.net/alexeytroshichev/icloud-keychain-3...](http://www.slideshare.net/alexeytroshichev/icloud-keychain-38565363#47)
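To illustrate how small a 4-digit keyspace is: without iOS's hardware-entangled, deliberately slow key derivation, an offline search of all 10,000 codes completes almost instantly. A minimal sketch (the salt and PBKDF2 parameters here are illustrative, not Apple's actual scheme):

```python
import hashlib

# Illustrative only: derive a verifier from a "secret" 4-digit passcode,
# then recover it by exhaustive search over all 10,000 candidates.
SALT = b"illustrative-salt"

def derive(pin: str) -> bytes:
    # A single fast PBKDF2 round stands in for an offline check with no
    # hardware rate limiting; real iOS ties the derivation to a device key.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, 1)

target = derive("4821")  # the unknown passcode

recovered = next(
    pin for pin in (f"{i:04d}" for i in range(10_000))
    if derive(pin) == target
)
print(recovered)  # finds "4821" almost instantly
```

The point of iOS's design is precisely to prevent this kind of off-device search, which is what the later comments about on-device slowdowns get at.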

~~~
downandout
Not the point. The point is that they've done enough to legally respond to
government requests by saying "we don't have a way to access the data".
Hacking customers' phones, regardless of how easy it might be, is far beyond
the scope of anything that a US court can order a private party to do.

~~~
rlpb
> is far beyond the scope of anything that a US court can order a private
> party to do.

They could order Apple to disclose signing keys so that the government can
install spyware themselves. See
[http://en.wikipedia.org/wiki/Lavabit#Suspension_and_gag_orde...](http://en.wikipedia.org/wiki/Lavabit#Suspension_and_gag_order)
for a case where they have done something similar before.

~~~
abritishguy
>the government can install spyware themselves

It would have to be an OS update, since applications don't have access to that
stuff, even if signed with an Apple key.

~~~
rsynnott
That's probably not quite true. Take an app that the user is already likely to
have given access to their photos, like Facebook. Create a malicious app with
the same app identifiers, sign, and push to the user. At that point, the phone
thinks it's Facebook, and should allow access.

That said, it'd be kind of unsubtle, and they'd probably get caught.

~~~
abritishguy
Ahh yes, I assumed we were talking about messages.

------
mkal_tsr
Yet no comment from them about what being a "provider" under PRISM entails.

* "In addition, Apple has never worked with any government agency from any country to create a “back door” in any of our products or services."

If Apple provides an interface to request user-data to law enforcement / NSA,
that's not a back door in the _product_ or the _service_.

* "We have also never allowed any government access to our servers. And we never will."

If they provide user-data after being served with a warrant (possibly through
email or to their legal department), their servers were never accessed, yet
the data was provided.

It's always interesting to read what _is_ and _isn't_ said. Word games, I
swear.

~~~
declan
>If they provide user-data after being served with a warrant their servers
were never accessed, yet the data was provided.

Yes, that's true. But if you run an email service that's based in Cupertino,
what do you do when served with a lawful search warrant or wiretap order? Say
"no," and be found in contempt of court and have _all your servers_ carted
away by some men in black so they can find the file they're looking for?

There are a few ways around this. One is not to permanently store warrant-
worthy user data (Snapchat, Wickr, etc.). Another is end-to-end encryption
(original version of Hushmail), though key distribution and UX become
problems. A third is to move your servers to Switzerland, though then you get
served with process from the Swiss authorities instead of the FBI, DHS, DEA, etc.

Apple has taken none of these steps. Assume our hypothetical Cupertino-based
email service has not either. What do you do when the Feds show up with a
lawful order?

~~~
mkal_tsr
I go under the assumption that if I'm not the one generating and providing the
encryption keys (and really, pre-encrypting the data), absolutely nothing is
secure/encrypted. And in all honesty, if it touched the internet, it's already
insecure to some extent.

It's been "fun" to read Jewel vs. NSA proceedings, press statements, and
officials' statements about these issues because of the extent to which they
play word games to _technically_ not lie according to specific (re)definition
of words.

~~~
antimagic
Why stop there? If you're not root, you're not secure either. For example, if
someone else is root, they can do whatever they want with your locally
generated keys, up to and including sending them to the NSA with a little bow
on top.

Even if you _are_ root it's no guarantee, thank you rootkits :/

As with all things security, sometimes it comes down to trust and legal
protections.

------
crishoj
Here's an observation, and an idea for testing Apple's claims on iMessage
privacy:

China seems quite determined to block IM systems which do not cooperate with
the authorities and permit monitoring of communications. Most recently, both
Line and the Korean KakaoTalk were blocked [1].

Skype remains useable in China, presumably because Skype permits efficient
monitoring [2].

It seems unlikely that China would tolerate such a prominent opaque
communications channel as iMessage in the hands of a significant proportion of
their citizens.

Thus, if China refrains from blocking iMessage for a prolonged period of time,
wouldn't it be reasonable to assume that China is in fact able to snoop on
iMessage?

[1] [http://www.ibtimes.com/china-restricts-messaging-apps-confir...](http://www.ibtimes.com/china-restricts-messaging-apps-confirms-blocking-line-kakaotalk-last-month-1651620)

[2] [http://www.reuters.com/article/2012/01/31/us-china-dissident...](http://www.reuters.com/article/2012/01/31/us-china-dissident-idUSTRE80U0BJ20120131)

~~~
AJ007
It is interesting you are the only one who has mentioned China. The other side
of the coin is Apple being worried about being locked out of China for not
being secure enough. [http://www.reuters.com/article/2014/07/11/us-apple-china-idU...](http://www.reuters.com/article/2014/07/11/us-apple-china-idUSKBN0FG0S520140711)

~~~
crishoj
...where by "not being secure enough" means that location data ends up on US
servers. I'd be surprised to see the same sort of statement if location data
was collected and kept nationally.

Nonetheless, Apple's relationship with China is an interesting case:

• On one hand, China is poised to be the largest market for Apple in just a
handful of years.

• On the other hand, it's hard to imagine China approving of e.g. un-snoopable
instant messaging in the hands of the populace.

------
clamprecht
So the US is now a country where mainstream companies market it as a
competitive advantage that they will try to minimize what they release to the
government. I'm glad companies are doing this, but I'm sad that they even
have to.

~~~
zaroth
The desire for privacy, and a healthy dose of suspicion of one's government,
have both existed for as long as the notion of government itself.

Personally, I'm waiting for the other shoe to drop. Now that the police can't
go to Apple for your data, we'll start seeing more judges ordering users to
unlock their phones to allow them to be searched. I don't think it will be
long before such a case reaches SCOTUS, and then we'll see how that works
out...

~~~
x0x0
I am no lawyer.

While lower court decisions have gone both ways, according to wikipedia,

    
    
    in United States v. Doe, the United States Court of Appeals for the
    Eleventh Circuit ruled on 24 February 2012 that forcing the decryption
    of one's laptop violates the Fifth Amendment [1]

So there is hope.

Perhaps a more practical problem is that even with a 5-digit PIN, it's
entirely possible to simply try all combinations. You probably need more than
8 digits before that becomes impractical.

[1]
[http://en.wikipedia.org/wiki/Key_disclosure_law#United_State...](http://en.wikipedia.org/wiki/Key_disclosure_law#United_States)
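To put rough numbers on that: assuming about 80 ms per guess for the on-device key derivation (a figure Apple's iOS security documentation has cited; treat the exact value as an assumption), worst-case search time grows tenfold per added digit:

```python
# Worst-case exhaustive-search time for an all-digit passcode,
# assuming ~80 ms per attempt on the device's crypto hardware.
PER_GUESS_SECONDS = 0.080  # assumed per-derivation cost

for digits in (4, 5, 6, 8, 10):
    seconds = (10 ** digits) * PER_GUESS_SECONDS
    print(f"{digits} digits: {seconds:>13,.0f} s (~{seconds / 86_400:,.1f} days)")
```

Under these assumptions, a 4-digit PIN falls in minutes, an 8-digit one in around three months, and a 10-digit one takes on the order of decades, which is consistent with the "more than 8 digits" intuition above.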

~~~
matt-attack
> even with a 5 digit pin, it's entirely possible to simply try all
> combinations

Given a _long_ time. Don't forget iOS slows your brute-force attempts down
substantially. I'd be curious how long a brute-force attack would take given
the current behavior of the OS.

~~~
kissickas
Is there no way to duplicate the whole filesystem onto a computer and try
decrypting that? I didn't think they would actually be thumbing in every
password combination...

~~~
shalmanese
There is supposedly no way of duplicating the secure element, which is what
controls the encryption.

------
DigitalSea
The honest truth about all of this is that even if Apple were handing over
information through back doors or custom database interfaces for the NSA, they
wouldn't tell us, and would probably be gagged from doing so anyway. Have we
all forgotten about Lavabit? I hope not.

I think we are all intelligent enough to know that even if Apple were handing
over information, it wouldn't exactly be good for business to admit you've
been complicit in handing over personal details to the government, would it?
"Yes, we have been giving away your information, but we promise not to do it
any more. Hey, we just released a couple of new iPhones, want to buy one?"

Anyone else notice the page is cleverly worded, and any mention of security
seems to be limited to an iOS 8 context? "In iOS 8 your data is secure", "In
iOS 8 we can't give law enforcement access to your phone" - maybe I am just
overanalysing things here, but I have learned not to be so trusting of
companies as big as Apple, considering the amount of information that they
hold.

You know we're living in a new kind of world when privacy is being used for
marketing purposes...

~~~
unknownBits
It is pretty naive nowadays to store any of your data online while thinking it
is secure and that no one will, or can, touch it. Unless you do some hardcore
encryption of your data yourself, which is really hard or even impossible with
some data like call logs, geographic location data, etc.

------
jpmattia
> less than 0.00385% of customers had data disclosed due to government
> information requests.

According to [1], there are about 600 million apple users, so this translates
to 23,000 customers exposed due to government information requests.

Seems like a large number. Is 600M correct?

[1] [http://www.cnet.com/news/apple-to-reach-600-million-users-by...](http://www.cnet.com/news/apple-to-reach-600-million-users-by-end-of-2013-says-analyst/)

~~~
nknighthb
There are millions of people arrested in the US alone every year. It doesn't
seem unbelievable that 23,000 of them had iPhones/iPads that law enforcement
wanted data off of.

------
fpgeek
My fundamental issue with Apple's privacy claims is they are pretending that
they have a technological solution to what is, ultimately, a political
problem. As the laws in the US (and, I imagine, some other countries) stand,
Apple can be compelled to provide your data to the appropriate governmental
authorities, install back doors, not tell you, and even lie to you and the
world about it. As long as that's true, no assurance from _any_ third-party
service provider is worth a damn.

I can understand the marketing benefits Apple sees in making these
disingenuous privacy claims. I'd be willing to call that "just business"
except for one thing: Trying to persuade people they have a technological
solution will necessarily get in the way of the absolutely vital political
project of destroying the political and legal foundations of the surveillance
state.

------
ckuehl
I'm very skeptical that traditional screen-lock passcodes offer useful
protection for the average person. Most people still choose to use 4-digit
passcodes for convenience, leaving exhaustive key search [1] well within the
reach of even very small attackers.

Are these four-digit passcodes being used to derive encryption keys? If so,
I'd like to hear where the additional entropy comes from. There's no use
encrypting things with a 128-bit key when the effective entropy of the key is
really only ~13.3 bits, and less in practice given how people choose PINs.

I'm sure the engineers at Apple would not have overlooked this; it would be
great to hear more about the specifics.

[1] especially if the attacker can download encrypted data and try an infinite
number of times (instead of e.g. typing the passcode on the phone or hitting
the iCloud servers)
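For reference, the entropy figure is just log2 of the keyspace; an n-digit decimal passcode carries at most n · log2(10) bits, so roughly 13.3 bits for four digits (and less in practice, since people don't pick PINs uniformly at random):

```python
import math

# Maximum entropy of an n-digit decimal passcode: n * log2(10) bits.
for digits in (4, 6, 8):
    bits = digits * math.log2(10)
    print(f"{digits} digits <= {bits:.1f} bits")
```

Even an 8-digit PIN tops out near 27 bits, nowhere near the 128 bits the underlying cipher key nominally provides.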

~~~
wyager
IIRC, there are three crypto keys involved:

1. PIN

2. Random key in effaceable NAND storage (generated on device reset)

3. Burned-in, permanent, CPU-unique key.

I _think_ OP's linked statement from Apple means that Apple is now _also_
encrypting data stored on the iCloud servers.

[http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...](http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.pdf)

~~~
declan
>I think OP's linked statement from Apple means that Apple is now also
encrypting data stored on the iCloud servers.

From today's announcement (scroll to "iCloud"): Mail and Notes are not stored
in encrypted form on iCloud servers. [http://www.apple.com/privacy/privacy-built-in/](http://www.apple.com/privacy/privacy-built-in/)

From December 2013: Mail and Notes are not stored in encrypted form on iCloud
servers.
[http://support.apple.com/kb/HT4865](http://support.apple.com/kb/HT4865)

~~~
wyager
Then what has changed?

~~~
declan
This is a very good question. I'm now no longer sure.

------
declan
If you're an iOS user who becomes the target of an investigation by a law
enforcement or intelligence agency, remember your data is likely _unencrypted_
in the cloud. So if your device is inaccessible, your email, your location
history, your text messages, your phone call history will probably remain
accessible. Apple acknowledges, for example, that "iCloud does not encrypt
data stored on IMAP mail servers":
[http://support.apple.com/kb/HT4865](http://support.apple.com/kb/HT4865)

[Edited because it now seems unclear which Apple policies have changed.]

~~~
threeseed
> though the celeb hacking shows the limits of that approach

Apple has clearly stated that its system was not compromised.

The user reset questions were socially engineered, meaning it is irrelevant
whether or not the data is encrypted. From Apple's perspective, the owner of
the data is downloading it.

~~~
declan
> The user reset questions were socially engineered

Yep, you're right. My point, perhaps poorly stated, is that if Random Hacker X
can figure out the answers to the iCloud reset questions, so can a law
enforcement agency. Then they can log into that account. Impersonating someone
this way is legal -- or at least has not been ruled to be illegal -- as long
as it's done under court supervision under the Wiretap Act or similar legal
authority authorizing prospective surveillance.

Possibly related: I disclosed last year that the Feds have demanded that major
Internet companies divulge targeted users' stored passwords, and in some cases
the algorithm used and the salt: [http://www.cnet.com/news/feds-tell-web-firms-to-turn-over-us...](http://www.cnet.com/news/feds-tell-web-firms-to-turn-over-user-account-passwords/)

~~~
stephenr
> if Random Hacker X can figure out the answers to the iCloud reset questions

Answers about very famous people. Wikipedia will not tell me _your_ mother's
maiden name.

Also, as much as I sympathise with the women whose accounts were breached,
actors aren't always the sharpest tools in the shed, and phishing schemes are
a common tool for gaining access to other people's accounts. One of them (I
don't remember which) publicly claimed iCloud backup for her iPhone was "too
complicated" a while ago. Given that it's as complicated as "turn it on, and
make sure it gets plugged into power with Wi-Fi every so often", I don't doubt
some of them would fall victim to even a very simple phishing scam.

------
zobzu
"Unlike our competitors, Apple cannot bypass your passcode and therefore
cannot access this data" Oh really? Privacy is marketing now.

~~~
lmedinas
It's one of the strongest marketing moves nowadays. Imagine the number of
"secure" messengers out there where all you have is the company's word that
it's secure, while they deliver a fully closed-source service whose claims you
have no way to verify. So marketing works.

------
sidcool
Apple has taken a shot at Google and Facebook. It has mentioned that, unlike
its competitors, its business model does not depend on selling user data.
Which is kind of true, but Google and Facebook's business model itself is
using user data for marketing.

Sometimes I feel it's not unethical to use users' data for marketing in the
way Facebook and Google describe: they don't directly share details with
marketers, but they let marketers target the audience.

------
danford
Except it's not open source. If it's not open source then you have no idea
what's going on beyond what Apple tells you.

Ask yourself:

Would Snowden use this phone? Your answer to this question is the same as the
answer to the question "Is this phone secure?"

I guess I'll get downvoted for this since it goes against the Apple
circlejerk, but this issue is more important to me than magic internet points.

~~~
mox1
Nobody sells a 100% open source phone. Everyone uses Broadcom, Motorola, TI,
etc. chips, which have proprietary firmware. (I'm aware there are a few
obscure phones that claim to be 100% open, but not many people use them,
they're not widely available, etc.)

------
xkiwi
Finally, those numbers for iPhone activations and Macs sold are useful.

#1 Mac unit sales

[http://www.macworld.com/article/2062821/apple-by-the-numbers...](http://www.macworld.com/article/2062821/apple-by-the-numbers-mac-not-dead-yet.html)

2010 @ 13662k

2011 @ 16735k

2012 @ 18158k

2013 @ 16341k

Total = 64,896,000

#2 iPhone unit sales

[http://www.statista.com/statistics/232790/forecast-of-apple-...](http://www.statista.com/statistics/232790/forecast-of-apple-users-in-the-us/)

I only take the numbers from 2013 & 2014 because Apple users tend to upgrade fast.

2013 @ 53.6 Million,

2014 @ 63.2 Million,

Total = 116,800,000

Now, quote from "Government Information Requests"

"less than 0.00385% of customers had data disclosed due to government
information requests."

That gives 699,529.6, which rounds to 699,529 customers with data disclosed.

~~~
dasp
0.00385% of 181,696,000 is 6995, not 699529.

Even if we take the total number of iTunes customers (~800 million), it comes
down to 30,800.
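For the record, the arithmetic (the easy mistake is forgetting that 0.00385% as a fraction is 0.00385 divided by a further 100):

```python
# 0.00385% expressed as a fraction is 0.00385 / 100.
rate = 0.00385 / 100

print(round(181_696_000 * rate))  # 6995  (Macs plus recent iPhones)
print(round(800_000_000 * rate))  # 30800 (all iTunes accounts)
```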

~~~
hooplarbazaar
It's between 0 and 250. Tim Cook confirmed it in the Charlie Rose interview on
Monday.

~~~
calvin_c
That was for the first half of this year alone. They specify this on the
Government Information Requests page.

------
adventured
I understand this does nothing to stop the NSA from snooping on me. However,
with the rise of the police state, the local and state police are a much more
imminent threat to the average person than the NSA and FBI are. The local
police are becoming ever more aggressive when it comes to your privacy and
devices like your phone.

If this turns out to be as good a move as it seems, Apple has acquired my
attention in a way they weren't able to previously (I've been an Android user
from day one). Plus I like the new, larger iPhone 6.

~~~
krisgenre
Android phones have also had an 'Encrypt phone' feature (since Gingerbread),
but it's not on by default.

------
wyager
Edit:

Can someone confirm or deny the following? I _think_ this is the current state
of affairs.

A) Apple will unlock PIN-locked devices by government request, but the best
they can do is brute-force. This is very slow, as it can only be done using
the phone's on-board crypto hardware (which has a unique burned-in crypto
key), and the PIN is stretched with PBKDF2. It has been this way for a while.
Apple has no "backdoor" on the PIN or any form of cryptographic advantage here
that we know of.

B) The _new_ thing mentioned in the OP's link is that things stored _on
Apple's servers_ are now encrypted as well, with your iCloud password.

Is this correct?
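Regarding (A), the general construction described in Apple's iOS Security whitepaper can be sketched as follows: the passcode is stretched (PBKDF2-style) and entangled with a key fused into the device's crypto engine, so every guess has to run on that specific device. The mixing step, salt, and iteration count below are illustrative assumptions, not Apple's exact algorithm:

```python
import hashlib, hmac

# Illustrative stand-in for the key fused into the crypto engine at
# manufacture; on real hardware it is never readable by software.
DEVICE_UID = b"burned-in, per-device key (never leaves the chip)"

def passcode_key(passcode: str, salt: bytes) -> bytes:
    # Stretch the passcode so each guess is deliberately slow...
    stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
    # ...then entangle it with the device-unique key, so the derivation
    # cannot be replayed off-device (e.g. on a GPU farm).
    return hmac.new(DEVICE_UID, stretched, hashlib.sha256).digest()

k = passcode_key("1234", b"per-device salt")
print(len(k))  # 32-byte key
```

Because `DEVICE_UID` never leaves the hardware, an attacker can't extract the encrypted data and brute-force the PIN on faster machines; that is why the search in (A) stays slow even though the PIN itself carries little entropy.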

~~~
nathancahill
It's actually really easy to recover the passcode from iTunes backups (so
probably from iCloud backups too). I've had to do it before to rescue photos
off a friend's iPhone. Don't know about >iOS6 though.

~~~
threeseed
iTunes backups are encrypted by default. And I can't imagine a typical person
deliberately disabling it.

Your example is a little unique though because you have physical access to
both their computer and their phone. In theory you could just brute force
their iTunes backup password.

~~~
selectodude
No they aren't. You have to check a box in iTunes to have them encrypted.

~~~
dmishe
They are if you have a passcode lock

------
dubcanada
These threads should come with a tin foil hat requirement. There are so many
different views on this. But if you wear a thick enough tin foil hat, it
really doesn't matter what anyone says: you will think the gov is spying on
you regardless...

------
pikachu_is_cool
I don't need to read this. Everything on the iPhone is proprietary software.
As has been proven countless times, there is a 100% probability that there are
backdoors everywhere on this device. This entire blog post is a lie.

~~~
wes-exp
And software built by volunteers, like OpenSSL, has proven to be so much more
secure. It's not like Heartbleed left practically the entire internet
vulnerable to abuse. Oh wait, yes it did.

~~~
JetSpiegel
A bug is not a backdoor. Unless you are implying the OpenSSL devs left the bug
there on purpose.

------
BillFranklin
19,250 people have their Apple accounts accessed by the #NSA every year.

------
krisgenre
Don't Android phones also have an 'Encrypt phone' feature?

------
baby
It almost seems like it's a feature of iOS 8.

