
iOS 11 Security [pdf] - Artemis2
https://www.apple.com/business/docs/iOS_Security_Guide.pdf?
======
gervase
Apple seems to be investing heavily in security and privacy, but I'm curious
to see if they can actually convince the average consumer to care (and/or buy
into their security narrative, depending on your level of cynicism). So far,
the convenience and features offered by their competition (at the expense of
user privacy) seem to be a stronger draw.

I figure either (A) they're trying to carve out a niche of hardcore consumers
who do care, or (B) they're trying to play a long game, hoping that broad
sentiment shifts towards valuing electronic privacy. If it's the former case,
I think they're doing fine; these kinds of whitepapers will reach most of
those who care, and periodic news articles ("Terrorist iPhone unable to be
unlocked!") will reach the rest.

If it's the latter, I think it's a pretty big risk given the scale of their
re-education task (the pool of users willing to sacrifice personal privacy for
other benefits, i.e. Google and Facebook's bread and butter) and the potential
pushback they'll receive/have been receiving from governmental sources.

What does HN think? Is this a viable business differentiator for them, long
term? Or will they have to shift to the 'dark side' of personalized data and
services to remain competitive in the future?

~~~
lisper
The irony is that Apple's offering is not at all compelling to anyone who
actually understands anything about security and privacy. Yes, Apple's
security is strong with respect to outside threats, but at the cost of putting
absolute blind trust in Apple. So your actual privacy is only as good as
Apple's internal policies allow it to be, and those are not only completely
opaque, but Apple is under no obligation whatsoever to maintain those policies
in the future. Apple could be selling your data to the Chinese on the side,
and there would be no way for you to know. And even if they're not doing it
today, they could decide to do it tomorrow. At that point, even if you somehow
found out, you'd be very hard-pressed to do anything about it.

[UPDATE] This comment is getting a lot more attention than I expected it to.
I've watched the point count on it go as high as 20 and as low as 0, with
several cycles between 0 and 10 and back, so a _lot_ of people are voting on
it. So let me say a few additional things.

First, I concede that the way I phrased my position was inartful. I apologize
for that.

Second, Apple is probably the best solution on the market in terms of security
and privacy. My complaint is not about them per se, it's really about the
state of the market. My choices are either to hand my data over to Google or
Microsoft, or to hand over my control of what I can and cannot run on my
system to Apple. Neither of those is a satisfactory option IMHO.

~~~
tptacek
I don't understand where that is coming from. What parts of the iOS security
model involve blind trust in Apple, apart from the firmware update process?

The data you're supposing they might sell to the Chinese in the future is
mostly stuff they go out of their way _not to collect in the first place_.

~~~
fauigerzigerk
This is clearly coming from an "if it's not open source it can't be secure"
angle.

~~~
lisper
I'm not insisting on open source (though I do prefer it, all else being
equal). But I would like my computing infrastructure to be independently
auditable.

------
tzahola
I hope Apple begins to spin privacy and security as part of their "premium
lifestyle". Because if privacy and security are associated with premiumness,
other companies will have an incentive to implement similar measures in their
products. People will actually care about their digital privacy for the first
time! (Though not because of the benefits of privacy, but to show off to
others that they can afford a _premium_ product with privacy.)

Sort of like how companies suddenly started caring about their mobile phones'
package design after the iPhone was released with its sleek packaging.

~~~
saagarjha
Hopefully this doesn't backfire, though: by associating privacy with a
"premium lifestyle", it by definition stops being something accessible to
everyone and instead becomes something you must pay for.

------
amckinlay
Apple security is confusing. For example, Find My Mac does not require 2FA
even when 2FA is enabled. An attacker can remotely wipe your MacBook with just
your iCloud password.

Another example: apparently there is a distinction between "two-factor
authentication" and "two-step authentication", the latter being a deprecated
but still active system. Reading the docs for the older system, you'll soon
discover differences in things such as account access and recovery that lead
to an entirely different set of consequences and caveats for security. You'll
find out that in certain scenarios you could permanently lose access to your
iCloud account and iTunes purchases under "two-step authentication", but not
under the newer "two-factor authentication". If a user confused the two while
reading the Apple online support pages, it could have grave consequences.

Security is something that needs to be documented and marketed in clear terms.
Adopting names this similar for two distinct implementations of a security
mechanism, names that could arbitrarily describe either one, is incoherent
with Apple's supposed model of user friendliness. It's what Microsoft does
with its products, not Apple. Additionally, all facets of a security feature
should be documented, and documented well. It is unacceptable that Apple does
not warn users that 2FA can be bypassed in certain scenarios. I hope Apple
continues to focus on security, and on documenting it well.

~~~
eridius
> _For example, Find My Mac does not require 2FA even when 2FA is enabled._

This is intentional. Otherwise people who only have one device would be unable
to wipe their device if it gets lost.

~~~
kartickv
But that reduces security for someone with multiple devices. Can I enable some
option to require 2FA for remote wipe?

------
5_minutes
I certainly appreciate this effort. Whatever their long-term intention or
commercial strategy (or lack thereof), it's in line with what I expect when it
comes to my privacy and security.

Some of the Google/Android "features", and what they do with your data, make
old-school keyloggers look like a joke.

------
polygot
"The processor forwards the data to the Secure Enclave but can’t read it. It’s
encrypted and authenticated with a session key that is negotiated using a
shared key provisioned for each Touch ID sensor and its corresponding Secure
Enclave at the factory."

> "at the factory"

I suppose the secret key is erased at the factory, but what if it isn't? Or is
the secret key generated on-chip via a random number generator? If it were
stored somewhere at the factory, it would be possible to link it to each
iPhone. I'm not familiar with cryptography, so this may just be a
misunderstanding on my part, and I'm not sure whether this would be a weakness
in the Touch ID sensor.

~~~
axoltl
The long-lived secret for TouchID is generated in the SEP, using the UID of
the SoC (a secret known only to the SoC) and the serial number of the
respective sensor. That key is then keywrapped with a (symmetric) key known to
all TouchID sensors and sent to the fingerprint sensor. It unwraps and
verifies the key and burns it into fuses.

That key is then used for establishing a session key in the field. The session
key uses entropy from both the SEP TRNG and the TouchID TRNG.
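To make that flow concrete, here's a toy model of the provisioning and
session-key derivation in Python. All names are hypothetical and HMAC-SHA256
stands in for the SEP's real hardware-backed primitives; this is a sketch of
the structure described above, not Apple's implementation:

```python
import hashlib
import hmac
import os

def derive_long_lived_key(soc_uid: bytes, sensor_serial: bytes) -> bytes:
    """Pairing secret derived inside the SEP from the SoC's UID
    (which never leaves the chip) and the sensor's serial number.
    Hypothetical KDF: HMAC-SHA256 with a fixed label."""
    return hmac.new(soc_uid, b"touchid-pairing|" + sensor_serial,
                    hashlib.sha256).digest()

def derive_session_key(long_lived: bytes, sep_nonce: bytes,
                       sensor_nonce: bytes) -> bytes:
    """Per-session key mixing entropy from both the SEP TRNG and the
    sensor TRNG, bound to the shared long-lived secret."""
    return hmac.new(long_lived, sep_nonce + sensor_nonce,
                    hashlib.sha256).digest()

# Provisioning: both sides end up holding the same long-lived secret
# (in reality it is keywrapped in transit and burned into sensor fuses).
uid, serial = os.urandom(32), b"SENSOR-0001"
k = derive_long_lived_key(uid, serial)

# In the field: fresh nonces from each TRNG yield a fresh session key.
s1 = derive_session_key(k, os.urandom(16), os.urandom(16))
s2 = derive_session_key(k, os.urandom(16), os.urandom(16))
assert s1 != s2  # sessions differ without changing the pairing secret
```

The point of the structure: an attacker who never saw the provisioning step
has no way to recover a session key, because it depends on the long-lived
secret as well as both nonces.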

Your threat would mean someone needs to 1. know the TouchID global key and 2.
slurp off all the keywrapped blobs for later, at all iPhone manufacturing
sites. That would give them the ability either to decrypt fingerprints or to
feed 'fake' fingerprints to the SEP. Given that there are far easier ways of
stealing fingerprints, that leaves feeding fake fingerprints to the SEP. And
if you're in a position to do that, you might as well just feed the phone a
fake finger.

~~~
Sephr
Those prerequisites aren't necessary if you can figure out how to compromise
the secure enclave remotely.

------
samat
I am wondering if someone could explain the implications of the Chinese iCloud
account transfer.

I see that iCloud Keychain is still secure, but pretty much everything is
fucked up, right?

~~~
willstrafach
> but pretty much everything is fucked up, right?

Depends on your definition of "fucked up" in this case.

The change is that data for China-based users will need to reside on domestic
(Chinese) servers. Any associated implication derived from that would be
opinion-based.

~~~
samat
Apple made it clear that that company would have access to iCloud data, too.

------
miles
Could you please add (PDF) to the title? Didn't HN use to do this
automatically? Or did the trailing "?" in the URL break that functionality?

~~~
edanm
I think it was done manually.

~~~
miles
Looks like you're right; turns out it's even in the guidelines:

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

> _If you submit a link to a video or pdf, please warn us by appending [video]
> or [pdf] to the title._

------
josho
iCloud Keychain may be surprising for some folks. For example, it can be
restored from an iCloud backup only to the same machine. Also, you have no
ability to recover your iCloud Keychain from your own Time Machine backups.

The reasons, as the document outlines, are added security. But, having
recently wiped my iCloud Keychain by resetting Safari's privacy settings and
inadvertently losing all my passwords, I was surprised to discover that I
couldn't restore my passwords from my own backups. The upside is that a
compromised iCloud password doesn't also leak all the keychain passwords.

------
neom
Anyone know if other cell phone vendors publish a document like this?

~~~
josephh
Samsung: [http://www.samsung.com/my/business-images/resource/white-pap...](http://www.samsung.com/my/business-images/resource/white-paper/2013/11/Samsung_KNOX_whitepaper_An_Overview_of_Samsung_KNOX-0.pdf)

------
cocktailpeanuts
> "The processor forwards the data to the Secure Enclave but can’t read it.
> It’s encrypted and authenticated with a session key that is negotiated using
> a shared key provisioned for each Touch ID sensor and its corresponding
> Secure Enclave at the factory."

If I understand this correctly, IF they're using Diffie Hellman key exchange
to generate the shared session key for every chip, doesn't this mean Apple
also owns the session key for every single iDevice out there and can crack
into them if they wanted to?

Does this mean the "security" only protects users from men-in-the-middle, but
not from Apple (or NSA if they come after them)?

~~~
axoltl
The long-lived secret for TouchID is generated in the SEP, using the UID of
the SoC (a secret known only to the SoC) and the serial number of the
respective sensor. That key is then keywrapped with a (symmetric) key known to
all TouchID sensors and sent to the fingerprint sensor. It unwraps and
verifies the key and burns it into fuses.

That key is then used for establishing a session key in the field. The session
key uses entropy from both the SEP TRNG and the TouchID TRNG.

So an 'evil' Apple couldn't crack the session key unless it had been evil all
along and storing the generated keywrapped blobs. In which case you're sorta
screwed no matter which way you slice it.

------
ploggingdev
Regarding iCloud accounts, Apple seems to be forcing the use of phone numbers
for 2FA and account recovery, without an option to disable it. I switched from
an Android device to an iPhone recently and was asked to set up an iCloud
account. I went through the setup process and realized that my phone number
had been set up as a second factor with no option to disable it [0].

For all the talk about Apple devices being the most secure, not many people
seem to be complaining about how Apple forces a phone number as a second
factor and account recovery method. Most people back up very personal data to
their iCloud accounts, and forcing users to use a phone number for 2FA and
account recovery is ridiculous. IMO Google gets 2FA right: I can set up a
YubiKey + Authenticator + backup codes and remove my phone number as a 2FA
method.

I also realized that there's no way to delete an iCloud account. I assumed all
the big companies would have an option to delete accounts. I hope there will
be a law mandating that all online accounts have a clearly defined lifecycle,
with an option to delete accounts and personal data if users want to.

(First time using an Apple device, so I might be misunderstanding the 2FA
situation; correct me if I'm wrong.)

[0] [https://support.apple.com/en-us/HT204915](https://support.apple.com/en-us/HT204915)

~~~
walterbell
You don’t need to use iCloud. Backups can be done locally.

~~~
thisacctforreal
libimobiledevice supports the native iOS backup protocol, including
encryption, with the idevicebackup2 command.

[http://www.libimobiledevice.org/](http://www.libimobiledevice.org/)

~~~
walterbell
Looks promising, thanks. No mention of iOS 11 support on the front page; is
that under development?

------
mrblues
Is it possible to extract data from a locked, powered-off iPhone 7 or newer
device?

------
zython
Please tag this as pdf

------
yorby
Was Steve Jobs in charge of overseeing security?

------
ConcernedCoder
Is this chain-of-trust implementation the reason the backlit keyboard on my
MacBook Pro won't light up while asking me for my password on cold boot? It's
a giant pain in the rear to get up and flip on a light when you're in bed
programming at night... (sigh)

------
MikeGale
This looks like a great example of insecurity through security.

Given that Apple is not trustworthy and you need to be able to change and/or
inspect a device to have a chance at security, this is a solid strike for a
human-thought-free insecure world.

------
drewmcmillan
>The probability that a random person in the population could look at your
iPhone X and unlock it using Face ID is approximately 1 in 1,000,000 (versus 1
in 50,000 for Touch ID)

I would love to know the likelihood of this in reality. For example, what
about people who look like you? You don't tend to hang around with completely
random people; it's often parents and siblings who, unlike with fingerprints,
may bear enough facial resemblance to trick it.
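
As a back-of-the-envelope check, the chance that at least one of n
independent random people matches is 1 - (1 - p)^n. A quick sketch using the
quoted rates (standard probability arithmetic, not a figure from the
whitepaper):

```python
def p_any_match(p: float, n: int) -> float:
    """Probability that at least one of n independent random people
    matches, given a per-person false-match probability p."""
    return 1 - (1 - p) ** n

p_face_id = 1 / 1_000_000   # rate quoted for Face ID
p_touch_id = 1 / 50_000     # rate quoted for Touch ID

# Even across 100 random strangers the odds stay small,
# but Touch ID's per-attempt rate is 20x higher.
print(p_any_match(p_face_id, 100))   # ~1 in 10,000
print(p_any_match(p_touch_id, 100))  # ~1 in 500
```

Of course, the quoted rates assume random strangers; for relatives who
resemble you, the per-person probability is presumably much higher, which is
exactly the concern here.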

