
Apple Is Trying to Make iMessages More Private - bipr0
https://motherboard.vice.com/en_us/article/apple-is-trying-to-make-your-imessages-even-more-private
======
yalogin
I see a lot of complaints about closed code. That is the first thing that
people bring up with Apple and security. But how would open source change
things here? No company open sources their server-side components. Even if
Google released their server code, we have no confirmation that they are
deploying the same code on their servers. They are not vouching for that. We
have a company here that really seems to want to do good on security and
privacy. Immediately going to the closed source argument is just lazy and not
helping.

Of course, the good part about a crowd is that all views come out, so the
closed source argument has its place, but we should at least give them their
due and some kudos. We know people will try to evaluate the implementation and
see what happens. In this case it's just a PR article. Let's wait for them to
release details and see if it stands up. Maybe the protocol is enough to give
us confidence that their claim is true. We don't know yet.

~~~
zanny
Signal has open source clients and an open source server. Matrix has crypto
support now (it needs more auditing, but you can turn it on and nobody has
cracked it yet to my knowledge) and that's fully open source.

No, open source does not guarantee they are running the secure algorithms
advertised on their servers. But what open source does is let you run your
_own_ server, which you can put much more trust in. People spin up their own
Signal and Matrix servers all the time now for just that reason.

~~~
asadlionpk
But Apple isn't in that business; they are trying to make the common user's
communications private. Open sourcing only helps the power user (audit, run
their own), as you described.

Signal and iMessage both don't guarantee true privacy as we can't see the
servers.

~~~
zanny
No service that uses a third party server can guarantee privacy _unless_ they
let you see the servers.

For common users the question is if you trust the server operator, and if not,
to consider your communications insecure. What they do about that is up to
them. And it is up to these providers to _earn_ trust.

~~~
muse900
Personally I can't trust Apple for a specific reason... On iOS they've made it
quite difficult to turn Location Services off unless you jailbreak the device
itself. You have to go into Settings -> Privacy -> Location Services -> on/off
-> are you sure you want to do this? -> yes...

I mean, when a company does that, it makes it pretty clear that they do want
to monitor your location, by making it annoying for you to turn it on and off.

~~~
internet2000
How is that difficult to turn off?

~~~
muse900
The steps. I'm a privacy junkie myself and I want to have Location Services
off most of the time, apart from when I'm going to use Uber or Google Maps.
Even for me, someone who'd go the extra mile to turn it off, it just puts me
off that you have to go through so many steps to do it.

------
voidmain
We don't have to speculate how Apple could possibly handle account recovery
without entirely sacrificing security, because it's spelled out in their iOS
security whitepaper:
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

TL;DR: Keychain recovery relies on a cluster of hardware security modules to
enforce the recovery policy. After 10 tries to guess your PIN, the HSM will
destroy the keys. Apple support gates all but the first few of these tries.
The paper also implies that you can use a high entropy recovery secret as an
alternative, though I can't figure out how you would enable that.
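
The policy reads roughly like this as code. To be clear, this is a toy sketch
with invented names (`EscrowHSM`, `attempt_recovery`); Apple's actual HSM
firmware is not public, and only the 10-try destruction behavior comes from
the whitepaper:

```python
class EscrowHSM:
    """Toy model of an escrow HSM enforcing a bounded-guess recovery policy."""

    MAX_TRIES = 10  # per the whitepaper: keys destroyed after 10 failed guesses

    def __init__(self, pin: str, escrowed_key: bytes):
        self._pin = pin
        self._key = escrowed_key
        self._tries = 0

    def attempt_recovery(self, guess: str):
        if self._key is None:
            raise RuntimeError("escrow record destroyed")
        self._tries += 1
        if guess == self._pin:
            return self._key  # correct PIN releases the escrowed key
        if self._tries >= self.MAX_TRIES:
            self._key = None  # irreversibly destroy the escrowed key material
        return None
```

The point of putting this logic in tamper-resistant hardware rather than
ordinary server software is that (ideally) not even Apple can reset the try
counter or read the key out directly.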

This seems like a pretty reasonable point in the design space to me. Of
course, you are relying on Apple's trustworthiness and competence to implement
this design. But that is true even without recovery, since the client software
is also implemented by Apple.

~~~
dsacco
That's a good point, but the question is not just how to maintain security and
usability without account recovery, but how to do so without device
redundancy. There's no speculation about how to maintain true E2EE with a
network of trusted key pairs, but without multiple devices the user is very
vulnerable to permanently losing access. I think the recovery key is a clue.

As I speculated elsewhere in this thread, I think they're going to do it with
multiple recovery keys ostensibly written down by the user and never
transferred directly to Apple, which each then redundantly encrypt all user
data before transmitting respective copies to iCloud.

That would pull it off, and it basically just shifts the trusted device
redundancy problem to a trusted key redundancy problem. The only remaining
usability obstacle is to make sure the user has safely recorded all recovery
keys.

~~~
voidmain
The "HSM cluster" serves as a redundant "device" which is in Apple's
possession rather than yours, but which you must trust to withstand tampering,
even by Apple.

The option to record the keys yourself is also described in the whitepaper:

"If the user decided to accept a cryptographically random security code,
instead of specifying their own or using a four-digit value, no escrow record
is necessary. Instead, the iCloud Security Code is used to wrap the random key
directly."

As I said, I can't actually find this option in my iOS settings. Maybe you
have to disable Keychain first?

~~~
learntofly
I think this relates to a time before two-factor authentication, when Apple
used two-step verification. At that time (if you were using iOS 6 to iOS 8)
you could choose various recovery options.

I'm no expert, but this is my recollection.

In the scenario where you have two devices, one on iOS 9/10 that has migrated
from 2SV to 2FA and the other on iOS 6/7/8, you can still access the recovery
menus on the iOS 8 device, but it does weird things to the keychain if you
mess about with it.

------
tptacek
iMessage is fine. Don't use it deliberately.

For secure, private messages, your sane current options are Signal, WhatsApp,
and Wire. Signal is the best option, but you're going to make some UX
sacrifices for security. WhatsApp and Wire are extremely comparable. If you
worry about implementation or operational security flaws, WhatsApp has the
Facebook security team behind it, and a long-term relationship with OWS; no
cryptographically secure messenger is better staffed. If you're worried about
Facebook seeing your metadata, which is a sane worry, Wire is approximately as
slick and usable as WhatsApp with mostly the same underpinnings.

Regardless of the underlying cryptography, in the absence of a well-reviewed
published crypto messaging protocol, iMessage is basically just an
optimization over SMS/MMS. It's great for that, but it shouldn't be anyone's
primary messenger.

~~~
michalu
What about Telegram? Is there a reason you haven't listed it? I find it better
than WhatsApp.

~~~
transitorykris
Telegram can be confusing for users. Encryption needs to be turned on, and
isn't available in group chats.

[https://www.ieee-security.org/TC/SP2017/papers/84.pdf](https://www.ieee-security.org/TC/SP2017/papers/84.pdf)
is super interesting when thinking about secure messaging from the perspective
of ordinary folk.

------
ec109685
The question is whether Apple will allow recovery if you lost all your
devices.

If they don't, I don't think it is that hard for Apple to extend their current
security model to iCloud. They currently rely on senders encrypting messages
with each destination device's public key, so they can store the individually
encrypted messages separately in iCloud.

When a new device arrives, they could have an existing device perform re-
encryption of the messages for it (after the user authorizes that the device
should be added).
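
That fan-out-plus-re-encryption idea can be sketched as follows. This is a toy
model only: the stand-in cipher is a hash-based XOR stream and is not secure,
and `send_imessage` / `reencrypt_for_new_device` are invented names; the real
design uses per-device public-key encryption, which this deliberately avoids
to stay stdlib-only:

```python
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Placeholder cipher: XOR with a SHA-256-derived keystream. This stands in
    # for the real per-device public-key encryption. NOT secure; illustration only.
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # an XOR stream cipher is its own inverse

def send_imessage(message: bytes, recipient_device_keys: dict) -> dict:
    # One separately encrypted copy per registered device, as the current
    # iMessage design is described: N devices -> N ciphertexts.
    return {device: toy_encrypt(key, message)
            for device, key in recipient_device_keys.items()}

def reencrypt_for_new_device(ciphertext: bytes, old_key: bytes,
                             new_key: bytes) -> bytes:
    # An existing, already-authorized device decrypts history with its own key
    # and re-encrypts it for the newly added device.
    return toy_encrypt(new_key, toy_decrypt(old_key, ciphertext))
```

The key property being modeled: iCloud only ever stores the per-device
ciphertexts, and adding a device requires an existing device to do the
re-encryption client-side.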

Even without the new iCloud functionality, Apple has always been in control
over the key exchange, which would allow a malicious employee / government to
write code that could add a new authorized device/key silently and thus allow
Apple to eavesdrop from that point on in future conversations.

~~~
scarlac
> allow a malicious employee / government to write code (...) from that point
> on in future conversations

This is exactly what Apple fought against "recently". Once a government entity
forces backdoors on citizens, privacy goes out the window. Apple knows this,
and I think you can expect them to make a gigantic media mess if they were
forced to. It would ruin their business worldwide.

------
abalone
The headline is misleading. There are two features here, iMessage syncing and
iCloud device backups. All Apple has announced is better iMessage syncing with
no change in (already maximal) privacy. There's no indication that Apple is
going to stop backing things up the way they do now, which is not maximally
private but is capable of surviving a forgotten password, which is probably a
good default setting for consumer backups.

If Apple has changed _backups_ to function in a more private manner, then they
would announce that, not something exclusive to iMessage.

More detail: iMessage syncing has always been maximally private from day one.
However a drawback to the current implementation is that new devices cannot
sync message history. The reason is that each message is encrypted separately
by senders for each currently registered device for the receiver. And yes that
means if you have 3 devices on your iCloud account, whenever someone sends you
an iMessage, 3 separately encrypted copies get sent. Apple has gone to great
lengths to ensure that private keys are never shared by devices.

So what's new is apparently Apple's figured out a way to sync history via
iCloud. I'm interested to hear the implementation details, but there can be no
doubt that it still respects the design goal of never sharing private keys.

Now, the privacy goals for _backups_ are different. You obviously want them to
be as private as possible, but most people generally want to be able to
recover their life in the event of a simple forgotten password. There are
certainly scenarios where you want to encrypt your backups, but it always
should be an informed, opt-in choice. You should clearly be aware that if you
forget your password, you lose your backups. So generally it's desirable to
default to having a fallback recovery method.

Like I said earlier, if Apple has figured out a fallback recovery method that
somehow does not involve storing your data in a manner they can decrypt, that
would be something they announce as part of iCloud Backup... not just for
iMessage. But it seems almost a fundamental design constraint. You can either
have something impossible for anyone else to decrypt or conveniently
recoverable backups, not both.

~~~
dsacco
_> iMessage has always been maximally private from day one. However a drawback
to the current implementation is that new devices cannot sync message
history._

No it hasn't and yes they can. I've done it several times. The ability to
restore messages to a device is specifically what breaks the otherwise end-to-
end encrypted iMessage architecture, which is why Federighi talking about the
new iOS 11 capabilities is intriguing.

To your last point, my personal hypothesis is that Apple has designed a
cryptosystem that uses a PKI with redundant key pairs to extend the redundant
encryption. That shifts the recovery usability solution from redundant trusted
devices to redundant keys that are written down.

~~~
abalone
_> I've done it several times. The ability to restore messages to a device..._

Interesting. To be clear, you're not just talking about restoring messages to
the same device, like after resetting it?

Looking more closely at Apple's security whitepaper, perhaps restoring history
on a new device is possible if you enable iCloud Keychain. Looks like that
would in fact share the private decryption keys among devices.[1]

Ah, and that more clearly points at what this iMessage change may be:
Mandatory iCloud Keychain, at least as far as iMessage keys are concerned.
Which would suggest another, hidden improvement: no more need to redundantly
encrypt a copy of every message for every recipient device!

I want to add however that this still does not suggest anything about changing
the security of backups, which was the implication of the article. Nor would I
necessarily characterize iCloud keychain as "breaking" encrypted architecture.

[1]
[https://www.apple.com/business/docs/iOS_Security_Guide.pdf](https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

~~~
dsacco
Yes, I mean restoring to a new device. See here:
[https://blog.cryptographyengineering.com/2013/06/26/can-
appl...](https://blog.cryptographyengineering.com/2013/06/26/can-apple-read-
your-imessages/)

~~~
abalone
It sure sounds like you're talking about restoring from backups. There's
nothing in that article that suggests you'd normally be able to sync iMessages
to a new device apart from restoring from a backup. And that's expected
behavior.

I just noticed that you misquoted me. I said iMessage _syncing_ has always
been maximally private, and drew a distinction between that and backups. The
article you cite mentions the tradeoff between strong cryptography (maximal
privacy) and user pain (losing your data forever). Apple has made the
intentional design choice of enabling the former for syncing but allowing
backups to survive an account password reset. I think it's pretty clear that's
a good choice for a consumer device. You can always turn off iCloud backups
and back up locally via iTunes if you want maximal _backup_ privacy.

------
tsunamifury
Except of course via Chinese access to the unencrypted cloud, as required by
their laws. And as we've seen this week, those with access are willing to sell
to anyone.

~~~
tbolt
Could you elaborate on this and/or provide some links? First I've heard of
this.

~~~
ahiknsr
[https://www.hongkongfp.com/2017/06/08/china-uncovers-
massive...](https://www.hongkongfp.com/2017/06/08/china-uncovers-massive-
underground-network-apple-employees-selling-customers-personal-data/)

~~~
coldtea
This looks totally unrelated to some "unencrypted cloud" access the parent
mentioned.

It says "The suspects allegedly used an internal company computer system to
gather users’ names, phone numbers, Apple IDs, and other data, which they sold
as part of a scam worth more than 50 million yuan (US$7.36 million)."

So this is about CRM style data being stolen.

------
amelius
Sounds great. But how do we check if what they say is true?

~~~
spiderfarmer
Make sure criminals use them, then check how the police react.

~~~
scarlac
As funny as your comment seems, it's actually true.

Look at the "recent" FBI case. The insecurities became very apparent because
Apple had to cooperate.

------
WA
Good. Only problem: iMessage is useless in Germany, where Android market share
is at least 70% or so and 95% of my friends use WhatsApp.

~~~
selectodude
Maybe the Germans should consider buying a device that cares about their
privacy a little bit. Why is that Apple's problem?

------
EGreg
The trade-off is that if you lose your keys, you're shut out.

I would recommend having an option to generate keys based on something you
have and something you know that you won't easily forget, such as a
passphrase. That way you can always recover them later!
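
That kind of derivation is straightforward with a standard KDF. A hedged
sketch, where the function name, salt source, and iteration count are all
illustrative (this is not any real Apple scheme):

```python
import hashlib

def derive_recovery_key(passphrase: str, device_secret: bytes) -> bytes:
    # Combine "something you know" (the passphrase) with "something you have"
    # (a device-bound secret used here as the salt). PBKDF2 makes each guess
    # expensive, slowing offline brute force of weak passphrases.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passphrase.encode("utf-8"),
        device_secret,   # "something you have"
        200_000,         # iteration count chosen for illustration
    )
```

As long as you can reproduce both inputs, you can re-derive the same key
later; the trade-off is that anyone who obtains both inputs can too.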

------
notadoc
What if you could enter a special private iMessage chat with someone where, in
order to decrypt and read/reply, the participants had to verify each message
with Touch ID?

Good or bad idea?

~~~
goodplay
Touch ID is a username, not a password.

------
likelynew
Has there been any known exploit (by a government or any other actor) that
worked by breaking advanced cryptography? I feel using a zero day is an easier
way to exploit anything. Also, there are limited ways in which one can exploit
cryptography, unlike zero days, for which there is a free market and a
continuous supply.

~~~
dsacco
To your first question: I'm interpreting you to mean a zero day of the form,
"The NSA is aware of a cryptanalytic weakness in this encryption algorithm";
as opposed to a backdoor, e.g. "Microsoft provided a way for the NSA to bypass
Skype's encryption without breaking it."

I don't recall any specific examples off the top of my head, but I believe
it's probably happened and does happen. But backdoors are much more common; so
much so in fact, that I'm led to believe the NSA doesn't have significantly
greater cryptanalytic capabilities than academia and industry these days,
given that their modus operandi is usually to demand a backdoor rather than
breaking it. Their advantages probably stem from access to superior computing
power or simply much more of it. I imagine a lot of the agency's research is
in fundamental paradigm shifts that can broadly attack many algorithms (like
quantum computing) - my edit at the bottom gives an example of this.

To your (implied) second question: it's probably not true that zero days are
easier. When a company like Apple develops a novel cryptosystem, the NSA is
not likely to break it for years (barring conspiracy-theoretic capabilities
that we have no way of verifying). Zero days incur massive amounts of research
and development time to go from identifying a useful cryptanalytic weakness
(i.e., taking an attack from exponential to sub-exponential to quadratic time)
to deploying an exploit. All the while, earnest cryptographers in industry and
academia are attempting the same thing, except they'll publish their results.
And if the NSA has a functional exploit, they will use it like you would a
classified weapon: sparingly.

 _EDIT:_ Actually, your question reminded me of differential cryptanalysis.
That's more of a paradigm of attacks against a variety of algorithms instead
of a zero day against any one particular encryption algorithm; still, the NSA
apparently developed differential cryptanalysis and maintained it as a
classified capability before the public community independently came up with
it. That probably qualifies for your question.

~~~
likelynew
I was referring to zero days as a way to gain the ability to run malicious
code on a user's device, preferably as root, which cannot be stopped by
updating the software; something like what was done in jailbreaking via the
browser (a long time ago), or at Pwn2Own. I was not thinking of
encryption-related zero days specifically. If someone gets root access, they
get access to all the contents, no matter what transport security is used.

------
jakob223
What's to stop Apple from registering another device on your account, which
will then get the shared keys?

Color me skeptical.

~~~
caiob
Apple can't simply register devices into your account; they would have to have
access to your iCloud account, which they can only get using one of those
keys.

~~~
jakob223
I guess if the devices are only willing to share keys with other devices which
can individually prove they have the iCloud password. I wonder if it's an
online procedure or if they just upload the keys to iCloud, encrypted with the
user's iCloud password.

~~~
nicky0
When registering a new device you have to authorise it using one of your
existing devices.

------
dsacco
A few thoughts I have after reading the article:

 _> "Our security and encryption team has been doing work over a number of
years now to be able to synchronize information across your, what we call your
circle of devices—all those devices that are associated with the common
account—in a way that they each generate and share keys with each other that
Apple does not have."_

 _> It's unclear exactly how Apple is able to pull this off, as there's no
explanation of how this works other than from those words by Federighi. The
company didn't respond to a request for comment asking for clarifications.
It's possible that we won't know the exact technical details until iOS 11
officially comes out later this year._

 _> Meanwhile, cryptographers are already scratching their heads and holding
their breath._

This might be uncharitable, but in my mind I think this writing and
presentation of facts (probably unintentionally) implies that this capability
is novel, when it's not. Sharing keys between multiple devices is a
straightforward issue if you're willing to make user experience trade offs.
Cryptographers are not scratching their heads wondering how Apple could
achieve E2EE with a network of devices, they're wondering how they did it
without sacrificing account recovery. It's not clear to me that readers would
automatically understand this, because the real head scratcher isn't addressed
until near the end of the article, which brings me to my next point:

 _> "The $6 million question: how do users recover from a forgotten iCloud
password? If the answer is they can't, that's a major [user experience]
tradeoff for security. If you can, maybe via email, then it's [end-to-end]
with Apple managed (derived) keys," Kenn White, a security and cryptography
researcher, told Motherboard in an online chat. "If recovery from a forgotten
iCloud password is possible _without access_ to keys on a device's Secure
Enclave, it's not truly e2e. It's encrypted, but decryptable by parties other
than the two people communicating. In that sense, it's closer to the default
security model of Telegram than that of Signal."_

I'm hesitant on how much faith to put in Apple's scheme here. On the one hand
I generally trust Apple very highly when it comes to security and cryptography
in particular. On the other hand I don't see them making account recovery
impossible.

However, over the past few years they have been increasingly pushing two-
factor verification, and then full two-factor authentication based on a
network of trusted devices. The iCloud password used to be enough to manage
the account's security and trust, but now it frequently defaults to requiring
authenticated approval from a trusted device (instead of e.g. security
question responses).

I could see Apple abandoning conventional account recovery if they keep
proceeding down this path by providing a huge amount of access redundancy. For
example, they could keep redundant copies of all user data synced in iCloud
which are respectively end-to-end encrypted on the client with a user's backup
keys. Each authenticated user device might have 10 backup keys, with a typical
warning that they should be written down and will not be displayed again, etc.
The keys could be downloaded from the device and stored by the user but never
given to Apple, and would primarily be useful in circumstances where a user
only has one trusted device authenticated to iCloud. Then if a user loses
primary access to any given Apple device, the user has two ways to recover
data:

1) Authenticated approval from another of the user's trusted devices, or

2) Use the backup keys, which do not provide a method of changing the account
password, but which instead decrypt the redundant user data corresponding to
the key.

The basic idea is that removing conventional password-based account recovery
requires inordinate redundancy to counter the usability loss; you can do this
with redundant authenticated devices (each with its own keys), or you can
simulate it on one device with redundant keys that are ideally harder to lose.
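
A minimal sketch of that hypothetical scheme. Everything here is invented for
illustration (`make_backup_set`, the key count, and the toy XOR "wrap", which
is not real cryptography); it only shows the shape of the redundancy:

```python
import hashlib
import os

def toy_wrap(kek: bytes, key: bytes) -> bytes:
    # Placeholder key wrap: XOR the key with a pad derived from the wrapping
    # key. Stands in for a real key-wrapping scheme; NOT secure.
    pad = hashlib.sha256(kek).digest()[: len(key)]
    return bytes(a ^ b for a, b in zip(key, pad))

toy_unwrap = toy_wrap  # the XOR construction is its own inverse

def make_backup_set(data_key: bytes, n: int = 10):
    # Generate n independent recovery keys (shown once to the user, never
    # sent to Apple) and wrap the data key under each one. Any single
    # recovery key is later sufficient to unwrap the data key, so losing
    # some of them is survivable.
    recovery_keys = [os.urandom(32) for _ in range(n)]
    escrowed = [toy_wrap(rk, data_key) for rk in recovery_keys]
    return recovery_keys, escrowed
```

Only the wrapped blobs would ever be stored server-side; the recovery keys
themselves stay with the user, which is exactly the shift from trusted-device
redundancy to trusted-key redundancy described above.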

------
repler
Why is Vice so excited to write this article, with this headline, and then
provide absolutely no details:

> It's unclear exactly how Apple is able to pull this off, as there's no
> explanation of how this works other than from those words by Federighi.

All Apple says is "end to end encryption". From your phone to the cloud is 2
ends, and then from the cloud to the FBI is 2 more. Yay!

------
mlosapio
If you encrypt your iCloud backups isn't the whole concern moot anyway?

------
mtgx
First priority to make iMessages more private: disable iMessages by default
when iCloud sync is enabled, or at least give users the option to have
iMessage backup disabled when iCloud sync is enabled.

------
520794
With true end-to-end encryption there is no need for a middleman.

Each user

1\. encrypts her data at the source, i.e. _on her own computer_, and

2\. sends the encrypted blob over the untrusted network, or so-called "dumb
pipes".

The hardware company that makes the user's computer tries to dictate whether
and how #1 can be done.

Not necessary.

The software for doing #1 does need to be open source.

On mobile, does such software even exist?

And even if it does, is a mobile phone really the user's computer? It is an
effectively locked enclosure containing several computers controlled by third
parties.

The way to do secure mobile messaging would be to encrypt the message on a
computer the user controls, then move the message to the "mobile phone" and
then send to the untrusted network.

Alternatively, _do not use a mobile phone_ for messaging if you're worried
about others having access to the messages. Wait for a pocket-sized portable
computer that can be tinkered with. No baseband, etc.

~~~
theWatcher37
Everything you said is also true of modern desktop computers; see Intel's ME
and AMD's equivalent.

~~~
520794
Not all computers are Intel or AMD and not all Intel or AMD computers have the
"features" to which you refer.

Even more, nothing requires these computers to be connected to the untrusted
network. ME requires an internet connection.

The message encrypted on the user-controlled computer can be moved to the
"mobile phone" via a wired local network, serial link or removable media. Is
such transfer to and from the device made difficult by the way these mobile
phones are constructed and configured? Yes, and probably this is intentional.
Companies want user data and the way they get it is by encouraging users to
store data in the "cloud".

Most importantly, these Intel and AMD based computers are not the only
computers capable of encrypting messages.

If ME scares you then do not use computers that have it.

If you have computers with ME then disconnect them from the internet and get a
computer without ME for your internet needs.

This message was typed on a computer that does not have ME, would not be
considered a "desktop", and would probably not qualify as "modern". It makes
no difference. No problems encrypting messages on it.

