
What Is the Secure Enclave? - ingve
https://www.mikeash.com/pyblog/friday-qa-2016-02-19-what-is-the-secure-enclave.html
======
stcredzero
In a more ideal world, all devices would have something like the Secure
Enclave, but with the hardware and software open sourced. There would be a
public process for vetting and verifying the design, as well as for verifying
embodied instances. Ideally, it would be implemented in such a way that the
security could be mathematically provable. This would let the public have the
benefits of trusted execution, which they could then use to protect their
information in the hands of corporations and governments.

This is the exact same asymmetry embodied in openness/privacy/surveillance.
When governments and corporations have unfettered access to people's private
information, this is very bad for human rights and an open democratic society.
On the other hand, when individuals have open access to information from
government organizations and corporations, this is generally good for human
rights and an open democratic society.

The use of trusted execution technologies by organizations against individuals
has been a disaster for individual rights. However, empowering individuals to
use such technologies to protect themselves against corporations would have
tremendous benefits for individuals.

~~~
vectorjohn
It isn't possible to open source the hardware. If the secure enclave is really
secure, it has parts that by definition cannot be examined or verified. So
hardware specs can be open, but you'll never have any guarantee that's what
you're getting because hardware is always a compiled blob.

~~~
nitrogen
Can you list some of these parts that, by your definition, cannot be examined
or verified? If you're talking about physical security features to make
decapping difficult, surely a high-res x-ray would be good enough to verify
that the hardware is as specified without being able to read the contents of
nonvolatile memory, and without compromising physical security.

~~~
deegles
Question: could an x-ray device be made sensitive/high resolution enough to
"read" whether a bit is flipped in silicon? (and extract the private key)

~~~
tonyarkles
It's a tricky thing, and I fully admit that I have a limited understanding of
the specific effects of x-rays on semiconductor devices. My understanding
though is that you get higher resolution with higher-energy x-rays, and that
as you increase the energy, you're more and more likely to damage the
semiconductor (e.g. by inducing bit flips).

Here's some material from a flash memory manufacturer with more details:
[https://www.spansion.com/Support/Application%20Notes/X-ray_i...](https://www.spansion.com/Support/Application%20Notes/X-ray_inspection_on_flash_AN.pdf)

------
equalarrow
I think one thing that needs to be said is that Mike Ash is just one of those
superhumans where Apple & code are concerned.

I've been reading Friday Q&A for years and the posts never cease to amaze. He
doesn't politicize or whine about anything. It's always thoughts from a
teacher, a master.

I wish there were more people in the tech world like him. Thanks Mike!

~~~
mikeash
That's really nice of you to say. I appreciate it.

I do actually whine a lot, I just try real hard to keep it off my blog.
Complaining is fun, but it's not helpful.

------
gh02t
So one thing this article kind of takes for granted/doesn't describe is the
physical security of the Secure Enclave. What exactly is done (physically) to
make it tamper-proof? How is the UID stored in such a way that the Secure
Enclave can still read it, but somebody with access to unlimited resources
can't dissect the processor die to read it out? I understand that there have
to be some sort of countermeasures, but I haven't ever really seen anybody
describe what they are.

~~~
a-priori
I'm more familiar with HSMs (Hardware Security Modules), which are larger
devices used to securely store and manipulate cryptographic data by
certificate authorities and so on. But the security requirements are similar,
and HSMs are designed to destroy their secured data (e.g. key material) if
they're physically tampered with. If you physically breach their security
envelope, then it protects the data by immediately wiping internal storage of
all key material.

I suspect the iPhone's Secure Enclave is designed to self-destruct in a
similar way.

~~~
lawpoop
> it protects the data by immediately wiping internal storage of all key
> material.

Yeah, so how, exactly, do they wipe their data? Is it a firmware process? What
if they're unpowered while being tampered with?

Or is the media attached in such a way that physically removing it would
damage it physically?

~~~
pjc50
A common HSM approach is to keep the key material in battery-backed SRAM so it
evaporates when unpowered or tampered with. The single-chip solution used in
smartphones probably has no budget for extra parts just for key security, so
the key will be fixed and stored in processor antifuses. You theoretically
could get at them with a scanning electron microscope, but only with extreme
difficulty and no guarantee of success on a single device. And it's a
destructive process.

[http://www.microsemi.com/document-portal/doc_view/132857-ove...](http://www.microsemi.com/document-portal/doc_view/132857-overview-of-microsemi-antifuse-device-security) : see
page 5. That's Microsemi, but the general approach of Apple/TSMC/Samsung is
likely to be the same.

~~~
AdamJacobMuller
Do you have any idea what success rate you'd be looking at there? 99%? 50%?
10%?

~~~
pjc50
I don't know, but evidently the manufacturers think it's "low enough". This is
definitely the kind of security which is about increasing the resource spend
per attack rather than guaranteeing impossibility.

------
CGamesPlay
> Given the goal of protecting the user's data, it would make a lot of sense
> for the Secure Enclave to refuse to apply any software update unless the
> device has already been unlocked with the user's passcode.

This is also speculation, but perhaps this is why you have to enter the
passcode on device reboot. This may be simply a software protection (see talk
about being compelled to provide a fingerprint), but it may actually be a
necessary step for the secure enclave to boot as well.

I also suspect that Tim Cook's announcement doesn't mean to imply that such a
theoretical attack currently exists, but rather that one may exist in the
future that Apple could be compelled to comply with.

------
thrownaway2424
I take it these suspects didn't have any backups of their phone? It clearly
cannot be the case that the backup can only be decrypted by the original
device, since the entire point of the backup is to be able to restore it to a
different device.

~~~
mcherm
The suspects DID have backups of the phone on Apple's iCloud platform, and
Apple already provided that to the FBI.

But that doesn't meet the FBI's actual needs. The FBI's ACTUAL needs are to
have a case with a lot of public sympathy in which they can force a major tech
company to very publicly comply with their order to add a backdoor to a phone
(without calling it that) in order to influence the legal and legislative
systems (and perhaps public opinion, if the FBI even cares about that).

~~~
Crito
Has the FBI (or Apple) been able to unencrypt the backups from iCloud?

~~~
mcherm
Apple has provided the FBI with decrypted copies of the iCloud backups. But
the phone only backed up "intermittently" so recent activity would not have
been included in those backups. (Well, it COULD have been, except that the FBI
told San Bernardino County to change the password, which messed that up.)

------
pcora
Couldn't we find more about this by searching for patents around the Secure
Enclave submitted by Apple Inc?

------
awqrre
If "secure enclaves" are actually secure, companies manufacturing those
"secure enclaves" could just log the encryption key(s) at fabrication time to
render them useless?

~~~
Pharaoh2
The chips do not have a unique identifier on them, so it would be impossible
for the manufacturer to look at a chip and then query their database for the
encryption key. At best they would be able to provide all the encryption keys
they have produced, possibly reducing the search space from 2^128 to ~10B.
Also, the secure enclave manufacturing process is done in such a way that even
the manufacturer does not know what the key is. They don't generate a key for
each chip; a natural phenomenon which is truly random is used to burn in the
encryption key.
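To put that reduction in perspective, here's a quick back-of-the-envelope check (taking the ~10B figure above at face value):

```python
import math

full_keyspace_bits = 128
chips_produced = 10_000_000_000  # the ~10B figure above

# Knowing every key ever burned shrinks a brute-force search from
# 2^128 candidates to (at most) one candidate per chip produced:
reduced_bits = math.log2(chips_produced)
print(f"~{reduced_bits:.1f} bits vs {full_keyspace_bits}")  # → ~33.2 bits vs 128
```

So a leaked list of all keys would turn an impossible search into roughly a 33-bit one per phone, which is why keeping the manufacturer blind to the keys matters.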

~~~
awqrre
The fact that the enclave chips don't have a unique identifier on them after
being installed in an Apple device doesn't mean that they didn't have a
temporary identifier...

Also, are you saying that even the machines manufacturing them do not know
what they are doing ("...secure enclave manufacturing process is done in such
a way that even the manufacturer does not know what the key is")? Sounds a
lot like a PR campaign... I would be curious to know how this manufacturing
process really works.

~~~
lucaspiller
It sounds like the encryption key is generated on the device at runtime as a
combination of the UID and passcode/password, so it's not quite as simple as
the manufacturer being able to decrypt everything straight away.

Even so, if they do have the UID, that greatly reduces the security of the
encryption, especially if you are using a short passcode.
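A minimal sketch of that kind of derivation, assuming (hypothetically) a PBKDF2-style construction; the hash, iteration count, and exact scheme here are illustrative, not Apple's actual design:

```python
import hashlib
import os

ITERATIONS = 1_000  # illustrative; a real design would use far more

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # Tangle the passcode with a per-device secret, so guesses normally
    # have to run on the device that holds the UID.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid,
                               ITERATIONS)

device_uid = os.urandom(32)  # stands in for the fused hardware UID
target = derive_key("0042", device_uid)

# But if the UID leaks, a 4-digit passcode falls to at most 10,000
# offline guesses:
found = next(p for p in (f"{i:04d}" for i in range(10_000))
             if derive_key(p, device_uid) == target)
print(found)  # → 0042
```

The point of mixing in the UID is that, without it, even a correct passcode guess can't be checked off-device; with it, the passcode's entropy is all that's left.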

------
cromwellian
Trusted computing modules have been broken before. Someone presented a working
attack a few years ago at Black Hat, using an electron microscope to figure
out how to crack the TPM in the Xbox.

~~~
Ao7bei3s
Yes, but the process is destructive and quite time-consuming, requires
expensive equipment and cannot really be automated.

That's not a problem if you want to, say, extract secret keys _once_ from
_one_ device that you own in order to break DRM.

But it makes it completely impractical if your aim is to extract crypto keys
from smartphones to decrypt people's data.

~~~
breischl
If you can backup the data ahead of time then it doesn't matter if the process
is destructive. So couldn't they (in theory) dump the flash memory, then
extract the key from the processor and use it to decrypt the memory?

Obviously not practical for mass surveillance, but it would work to read one
particular person's phone, which is the issue at hand.

~~~
msbarnett
> Obviously not practical for mass surveillance, but it would work to read one
> particular person's phone, which is the issue at hand.

Eh. I mean, yes, if the issue at hand were ACTUALLY to read this one
particular person's phone, that would probably be a valid avenue of attack.

But the actual issue at hand here is "establish precedent that you can use the
All Writs Act to get a judge to hand you an ex parte order that lets you walk
into a tech company and order them to build you (and, crucially,
cryptographically sign) the tools you want so that you can get whatever data
your heart desires".

Once that precedent is in place for this "just this one phone, we swear"
order, nothing's stopping them from walking into Apple or Google or whoever
with an order to build and sign a custom OS version that, say, copies all data
to an FBI server, and to push it as an OTA update to a target.

Once All Writs has been expanded to mean "you have to build us signed, custom
versions of your software to get us data we want", all bets are off.

------
numlocked
Is it possible that the enclave is in fact an ASIC with the crypto logic and
UID burned into the silicon? That would ensure that it couldn't be updated or
compromised, by Apple or anyone else.

This is pretty far outside my area of expertise, so that may be a very dumb
question.

~~~
awalton
It's likely implemented as part of the System-on-Chip that Apple is using in
their phones, yes, either reusing or very similar to ARM's TrustZone
extensions.

~~~
nicolas314
Since the Apple chip is derived from an ARM design it would make sense to have
the secure enclave implemented with TrustZone rather than being provided as a
separate piece of hardware. Most probably a TEE (Trusted Execution
Environment). Lots of TEEs are based on L4.

~~~
tptacek
Nope, it's a separate core, according to Apple.

~~~
struct
Educated guess: it may be that the application processor also has a trusted
execution environment containing stubs that communicate with the Secure
Enclave. This would prevent kernel-level exploits from writing to the shared
memory and mailboxes.

------
rwmj
On my Android phone, I have to unlock the lock screen and approve updates. Is
it the same on the iPhone? If so, it seems as if the FBI are wasting their
time asking Apple to update the phone, since they'd have to unlock it first.

~~~
jayd16
According to the article, that isn't known.

>The first possibility is that the Secure Enclave uses the same sort of
software update mechanism as the rest of the device. That is, updates must be
signed by Apple, but can be freely applied. This would make the Secure Enclave
useless against an attack by Apple itself, since Apple could just create new
Secure Enclave software that removes the limitations. The Secure Enclave would
still be a useful feature, helping to protect the user if the main OS is
exploited by a third party, but it would be irrelevant to the question whether
Apple can break into its own devices.

If we assume this is the case, it might explain what McAfee meant when he
mentioned social engineering.

~~~
planetjones
Don't forget that the phone the FBI want to decrypt doesn't have a Secure
Enclave. If Apple agreed, it would be trivial to write an OS without the
exponential back-off when entering incorrect passcodes.
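That back-off is easy to picture; here's a toy sketch of an escalating-delay policy (the attempt thresholds and delay values are illustrative, not Apple's actual schedule):

```python
# Toy escalating-delay policy for failed passcode attempts.
# Thresholds and delays are illustrative, not Apple's actual schedule.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds

def delay_after(failed_attempts: int) -> int:
    """Seconds to wait before the next passcode attempt is allowed."""
    if failed_attempts < 5:
        return 0  # the first few attempts are free
    return DELAYS.get(failed_attempts, 3600)  # cap at an hour thereafter
```

On a phone without a Secure Enclave, a policy like this lives purely in the OS, which is exactly why a re-signed OS that always returns 0 here would defeat it.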

------
wrong_variable
You have to hand it to Apple - they are awesome at naming.

~~~
awalton
...awesome at naming what, the industry standard term "Secure Enclave"?

