
Insider Attack Resistance - el_duderino
https://android-developers.googleblog.com/2018/05/insider-attack-resistance.html
======
ISL
This is the sort of development that makes me want a Pixel 2.

I've been leaning toward an iPhone, for the first time, on security grounds,
so this is a welcome piece of news.

Thanks, Google. Please keep it up.

~~~
saagarjha
I'm pretty sure everything mentioned here has been a feature on iPhones for
quite some time.

~~~
ucaetano
No. If Apple develops a new, signed version of the security module firmware,
it can bypass the password by updating the firmware in the security module
without the user's password.

At least that was the case during the San Bernardino dispute. Apple never
claimed it couldn't do it, only that it refused to do it for the FBI.

In this case (Pixel 2), Google claims that it is impossible to do that.

~~~
jd007
The San Bernardino case involved an older device though, the 5C (which came
out in 2013), which did not have the dedicated security module (Secure
Enclave) at all.

As far as I know, there is no confirmation on whether Apple (or anyone) could
flash new firmware to the Secure Enclave, without the user passcode, without
wiping data on the phone. This info is strangely missing from the official iOS
Security Guide document. If anyone has more info on this please share.

There are some (unsubstantiated IMO) claims by people online (e.g.
[https://blog.trailofbits.com/2016/02/17/apple-can-comply-
wit...](https://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-
court-order/)), and a series of Tweets with an ex-Apple security engineer
([https://twitter.com/JohnHedge/status/699882614212075520](https://twitter.com/JohnHedge/status/699882614212075520)),
but nothing official. SEP firmware definitely can be upgraded without a key
wipe (as confirmed by the tweets as well as regular usage of iOS), but it's
unclear whether it can be done without the user passcode. iOS does prompt the
user for the passcode when performing OS updates (which are also the delivery
mechanism for Secure Enclave firmware upgrades). I don't know whether this is
merely a UX-level security check or an actual hardware-level requirement.

~~~
derefr
This isn't proof, but food for speculation: given that you have to disable the
iCloud "Find My Device" feature on an iPhone as part of the steps Apple
requires before it will take a device for recycling, I would assume that
leaving that setting on prevents them from doing any automatic wiping/updating
of your phone without your passcode, even in DFU mode. (Surely, if they had
the capability, they'd simply use it at the recycling centre and thereby
streamline the recycling workflow.)

~~~
jdironman
I highly doubt they would put those methods in the hands of people working at
recycling centers. I feel like if, and that's a big if, they have that kind of
capability, it would be reserved for special cases. I mean really special
cases.

~~~
derefr
Keep in mind that "recycling centre" here refers to an intake channel at their
own factories; and that the _firmware_ side of the recycling process isn't
done by a technician themselves, but by a specialized "sanitizer" unit that
the tech plugs the phone into. (Picture a disk degausser, but with a slot for
a phone rather than a hard disk. Something heavy enough that you can't simply
walk away with one!)

Is it hard to believe that, if iOS devices had a mode "deeper than DFU" that
enabled control over the SEP firmware, that such machines would be implemented
in terms of that mode?

And I mean, it's not like I'm making this idea up. This sort of "secret
hardware-level handshake between recycling/repair machines and production
devices, to put said devices back into a lowest-level firmware flashing mode
that bypasses all user protections" was discovered to exist on the Nintendo
3DS, and was turned into a permanent jailbreak method for those. It might be
an industry-wide practice. (It's hard to tell, because even on a rooted
device, you can't just "dump" the ASICs and scan them for a backdoor
handshake.)

~~~
mseebach
A device that can launder stolen phones regardless of security settings is
still something to keep in limited circulation, even if you can't pick it up
and walk away with it.

------
gustavmarwin
The beauty of this can be seen in how these security features are leveraged
and enhanced on CopperheadOS.

I criticise Google a lot for how much information they store on us, but this
work they do both on Android (open-source) and the Pixel phones hardware
should receive more praise.

------
vuluvu
"To prevent attackers from replacing our firmware with a malicious version, we
apply digital signatures."

How about putting a read/write switch on the device that prevents writing to
the firmware if the switch is in the off position.

~~~
laggyluke
What problem would it solve?

------
jakobegger
> We recommend that all mobile device makers do the same.

Kind of insincere when the biggest competitor has been doing this since 2013
(the feature is marketed as “secure enclave” by Apple)

~~~
mandevil
The Apple Secure Enclave cannot defend against someone who has the signing
keys to the password software (or at least couldn't as of 2016)- that's why
the FBI wanted Apple's "help" over the San Bernardino shooter. Apple said no,
but could have done it- it was a policy choice of theirs to fight the FBI.
Google has created a situation with the Pixel 2 where they can't do that sort
of thing even if they wanted to. And justified it without ever referencing
"Search Warrants" or "Nation-state threat actors" even though that is the
obvious driving force here.

~~~
saurik
> ...or at least couldn't as of 2016...

The iPhone 5c came out in 2013 (and was a budget device based on the hardware
design from 2012; the iPhone 5s--which I believe, though would happily be
proved wrong, had _already_ fixed this issue--actually came out at the same
time as the iPhone 5c). I have no love for Apple (clearly), and I think they
(and, sadly, the EFF) were being disingenuous at the time about the definition
of "back door", but you are just spreading misinformation here: "as of 2016"
would imply the iPhone 7 was vulnerable, which is a _much_ newer device than
the one from the infamous San Bernardino scenario :/.

~~~
kobayashi
>I have no love for Apple (clearly)

For those who are unaware, Saurik created Cydia, the package manager for
jailbroken iOS devices, as well as a whole bunch of jailbroken iOS-related
software.

------
delvinj
What if the attack is in the form of a court order?

~~~
ZeroCool2u
This constraint forces an attacker to focus on the user who actually has the
password. From a security perspective, it shifts the attacker's focus to
coercing the user into revealing the secret, as opposed to targeting the
company responsible for creating the firmware, in this case Google.

My guess is that in the U.S. the 4th and 5th Amendments would prevent the
government from forcing you to reveal the secret, so long as you do not rely
on biometric security, which in some cases has been ruled exempt from the
protections that apply to, say, a password. IANAL though, so I really can't
elaborate on why. I think if anything you're likely to be held on obstruction
charges or have your assets frozen in an attempt to pressure someone unwilling
to cooperate. In other, perhaps less forgiving locales like North Korea,
China, or Russia, I imagine one may end up being the subject of persuasion of
a more physical nature.

~~~
xur17
I've noticed that my Pixel asks me for my PIN for "additional security" every
few days. Apparently it asks you for your PIN if you try to unlock your device
without having entered your PIN for 3 days [0]. I never realized this was the
reason, but it seems like a fairly decent deterrent to law enforcement - I
wonder if there's a way to reduce this frequency to a day or so.

[0]
[https://www.reddit.com/r/nexus5x/comments/3us0f6/pin_require...](https://www.reddit.com/r/nexus5x/comments/3us0f6/pin_required_for_additional_security_why/)

~~~
closeparen
A competent law enforcement agency attaches a digital forensics device to
extract the phone’s content as soon as they get their hands on it. They’re
probably not going to wait three days.

~~~
xur17
But that requires them to unlock the device. Using your fingerprint to unlock
it likely requires a court order, which takes some time.

------
smattiso
Have there been notable cases of a malicious actor installing a compromised OS
on a target's phone for spying purposes?

------
nmstoker
Impressive, but given the prevalence of apps that demand full access to all
USB contents, arming the user with only an all-or-nothing choice to accept or
decline, this seems like an electronic Maginot Line.

But to be fair, they've got to start somewhere and there is always hope
they'll extend the permissions options to be more powerful.

------
ec109685
One attack this wouldn’t guard against is a malicious actor pushing a buggy
version of the Secure Enclave code that couldn’t be updated without destroying
all data on the phone.

------
dredmorbius
Does the firmware signature preclude flashing the device with an alternate
OS? (Considered independently of the data stored.)

~~~
bitmapbrother
The locked bootloader would prevent flashing anything that was not signed by
Google.

~~~
dredmorbius
:-(

------
gouggoug
> There is a way to "force" an upgrade, for example when a returned device is
> refurbished for resale, but forcing it wipes the secrets used to decrypt the
> user's data, effectively destroying it.

It's interesting.

Why "[wipe] the secrets used to decrypt the user's data, effectively
destroying it" instead of wiping the data itself too?

Is this to potentially allow a third-party with enough power (i.e. a
government entity) to eventually decrypt the data?

~~~
mdellavo
this is the standard "remote wipe" technique and a standard cryptographic
application

~~~
gouggoug
I know nothing about the "standard remote wipe technique", but neophyte me
believes that wiping security keys isn't exactly protecting you against your
data being eventually retrieved and decrypted.

~~~
mcpherrinm
The data could be stored on an SD card, or emmc flash, or the cloud. Erasing
is not reliable there. The more data you have to protect with secure hardware,
the easier a time the attacker has.

Destroying encryption keys stored in a small, secure processor is the most
reliable and secure method.

The encrypted data can and should be wiped too, but it's a lot more reliable
to have your security model not rely on that.

We have no reason to believe AES-256 will ever be broken within the lifetime
of the universe. Maybe in 100 years I will be proven wrong, but I am willing
to bet the security of my phone on it.

I suspect we will have human beings living in another solar system before
AES-128 is broken.
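
A minimal sketch of this "crypto-erase" idea, using a XOR one-time pad from
the standard library as a stand-in for AES (the principle is the same: key
gone, ciphertext unrecoverable):

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR one-time pad stands in for AES here; either way, without the key
    # the ciphertext is statistically indistinguishable from random noise.
    return bytes(p ^ k for p, k in zip(plaintext, key))

data = b"user photos, messages, ..."
key = secrets.token_bytes(len(data))     # lives only in the secure element

ciphertext = encrypt(data, key)          # what actually sits on flash/SD/cloud
assert encrypt(ciphertext, key) == data  # with the key, decryption round-trips

key = None  # the "wipe": destroy the key material inside the secure element
# The ciphertext may linger on flash, SD cards, or backups, but without the
# key it carries no recoverable information about the plaintext.
```

This is why erasing a few bytes of key in tamper-resistant hardware is a
stronger guarantee than trying to overwrite gigabytes of flash that may have
been copied elsewhere.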

~~~
gouggoug
My point is about doing both, if possible, instead of just getting rid of the
keys.

~~~
ethbro
Because, as parent noted, making it possible makes your secure enclave more
complex (which is the same as less secure).

Furthermore, as parent also noted, if your encryption is solid, wiping the
keys is identical to wiping the data.

~~~
gouggoug
Indeed, my bad, I totally read his answer too fast and missed "The encrypted
data can and should be wiped too, but it's a lot more reliable to have your
security model not rely on that."

