
Self-encrypting deception: weaknesses in the encryption of solid state drives [pdf] - exceptione
https://www.zdnet.com/article/flaws-in-self-encrypting-ssds-let-attackers-bypass-disk-encryption/
======
dang
The paper is here: [https://www.ru.nl/publish/pages/909275/draft-paper_1.pdf](https://www.ru.nl/publish/pages/909275/draft-paper_1.pdf).

(That was the submitted URL but download links don't work so well for HN
stories.)

------
tptacek
Litany of failures:

* Firmware protection in drives is almost uniformly broken, so that they can get code execution (through JTAG or through hacked firmware images) routinely. This is bad, but shouldn't be the end of the world, since in the drive encryption threat model you don't want to have to depend on the firmware anyways. But:

* Two Crucial SSDs encrypt the drive with a key _unrelated to the password_; the password check is enforced only with an "if" statement in the firmware code, which can be overridden.

* Another Crucial SSD uses PBKDF2 to derive keys, but then has a master override key, _which is blank by default_. It also has a multi-volume encryption interface (Opal) with slots for volume keys, all of which are populated whether they're in use or not, and if they're not in use, they're protected with an all-zeroes key that recovers the master key for the device.

* Two Samsung drives implement PBKDF2, but not in the default mode, which is "password is checked in an if statement, like the Crucial drive". Also, the wear-leveling logic in one of the drives doesn't zero out old copies of the master key, so that when you change your disk password (or set it for the first time), unprotected copies of the data encryption key are left in blocks on the device.

* The Samsung T3 portable drive uses the drive password in an "if" statement and is trivially unlocked through JTAG. Its successor, the T5, is no more cryptographically sound, but is simply harder to obtain code execution on.

People have strange ideas about what disk encryption is good for. In reality,
full-disk encryption really only protects you from the situation where your
powered-down, locked device is physically stolen from you and never recovered
(if you get the drive back, you have to assume, at least from a cryptographic
standpoint, that it's now malicious).

But the net result of this work is that Samsung and Crucial _couldn't even
get that right_. This paper is full of attacks where someone simply steals
your drive and then _unlocks it on their own_. It's bananas.

~~~
walterbell
Are there regulations in any industry which require the name of software
authors on binaries/products, or at least the name of someone who approved the
security claims made by the device?

~~~
pmorici
Buy drives with FIPS 140-2 certification. At least then you know a third party
lab has reviewed the implementation.

~~~
tptacek
It is almost definitely not the case that FIPS certification of any kind
reliably informs you that a capable crypto engineer verified the
implementation.

~~~
kop316
Respectfully, I'll need a citation on that.

~~~
tptacek
Sure. Find any device crypto vulnerability, and check to see if the device was
FIPS 140-2 certified. For instance, Infineon with their batshit RNG (the ROCA
vulnerability).

~~~
pmorici
Does your argument boil down to "any review process that doesn't catch 100% of
problems is worthless"? I don't think anyone claims FIPS certification is
always going to guarantee perfection, but it's better than nothing. It also
stipulates things like tamper-evident seals, which would mitigate attacks that
need JTAG access to muck with firmware.

~~~
pvg
If someone lifted your drive - the thing FDE is supposed to protect against -
tamper evident seals are not going to offer much in the way of mitigation.

~~~
pmorici
that isn’t what the seals are for.

~~~
pvg
They keep the data USDA Grade A fresh.

------
nly
Why would anyone be surprised about this?

If you can't read a whitepaper and inspect the algorithm to a degree where
you can create your own provably-compatible implementation, or easily inspect
the vendor's source code, then you should just assume it's implemented
incompetently and is completely and utterly compromised.

The same goes for all cellular/mobile phone encryption standards, proprietary
VPN solutions, proprietary DRM, and crypto used in banking (Chip-and-PIN, etc.).
All compromised. Period. Don't trust it. All crypto implementations should be
guilty until proven innocent under serious peer review.

The same attitude should be taken with anything not encrypted. Yes your ISP is
spying on your browsing habits, logging everything, and probably will one day
sell that data. Yes your bank is analyzing your spending habits. Just assume
it's happening. There's enough evidence out there now that this is the new
reality.

~~~
tptacek
Really, not so much.

The problem here isn't so much with transparency as with delegating
cryptography to non-cryptographers (or, for that matter, trusting that other
vendors have had cryptographers review their designs). Based on the track
record not only of open source cryptography but also of internal review of
closed-source cryptography (that is: closed-source software reviewed _by the
vendor that wrote it_), having access to designs and code is of marginal
importance. Crypto bugs of comparable severity have lived for many, many years
in these systems.

What you actually want is an assurance that no link in the security chain of a
system protected by cryptography depends on a cryptographic capability that
hasn't been formally assessed. Ideally, you want the specific name of the
person who assessed it. For instance: Intel's SGX setup has been audited by
third parties and was also designed and in some sense overseen by Shay Gueron.
You will never see any of that underlying code, but you would probably be
crazy to trust a comparably ambitious un-vouched cryptographic enclave more
than SGX. Similar story with Amazon and KMS (again: Shay Gueron, but also in-
house crypto experts at AWS).

If anything, this story illustrates how weak an obstacle closed source really
is. This is an academic project, and they tore up something like 8 different
drives and game-overed most of them.

Of course, I agree with the broader point that Microsoft shouldn't have
trusted hardware cryptographic capabilities they clearly knew nothing about
(Microsoft also has sharp crypto people, all of whom would have barfed all
over any of these designs).

~~~
nly
Sorry, no, I don't trust Shay Gueron either and, even if I did, I wouldn't
trust what Intel, AWS, or anybody else did after he left the building. Same
goes for WhatsApp. It might be based on Moxie's protocols and excellent work. I
might trust Moxie, but I still don't trust WhatsApp to be eavesdrop-free,
because I can't easily create compatible clients, inspect their protocol, or
look at their code.

That's not to say you shouldn't _use_ TPMs, HSMs, or CPU enclaves. It's just
that, if at all possible, you shouldn't elevate them to being the only thing
protecting you. In fact, most of these devices can be coupled with software
encryption by splitting secrets between them and the user's brain. My Android
phone, for instance, is encrypted but only protected by a 6-digit PIN. I'm
aware that I'm totally dependent on the SoC vendor here to rate-limit and
otherwise fend off brute-force attacks and protect the key ultimately
protecting my data. But you know what? The encryption itself is still done in
software and can be verified, and I have the _choice_ to use a ridiculously
long passphrase instead.
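
The secret-splitting scheme described above can be sketched like this (a hypothetical illustration; the function and parameter names are mine, and real implementations are considerably more involved). The point is that the disk key depends on both halves, so neither the rate-limited hardware secret nor the PIN alone is enough:

```python
import hashlib

# Hypothetical sketch of secret splitting: the disk key is derived from
# BOTH a secret held by the SoC/TPM (which can rate-limit release of its
# half) and the user's PIN or passphrase. An attacker needs both halves.
def derive_disk_key(hardware_secret: bytes, user_pin: str, salt: bytes) -> bytes:
    # Stretch the low-entropy PIN with PBKDF2...
    pin_key = hashlib.pbkdf2_hmac("sha256", user_pin.encode(), salt, 200_000)
    # ...then mix in the hardware-held secret.
    return hashlib.sha256(hardware_secret + pin_key).digest()
```

Swapping the 6-digit PIN for a long passphrase changes nothing in the derivation; it just removes the dependence on the hardware's rate limiting.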

What Microsoft did is choose a stupid default, and put their users at unknown
risk.

~~~
tptacek
You can have whatever criteria you'd like for trust. I'm familiar enough with
the "open source security" argument on Hacker News to know where this
discussion converges.

I'm just offering a counterpoint:

1. Closed source wasn't too much of an obstacle for a pair of interested
researchers to independently validate, without assistance from vendors, the
firmware and hardware crypto interfaces on a bunch of drives.

2. The determinant of whether something is likely to be cryptographically
sound is less likely to be "open source" (open source is full of
cryptographically unsound things, many of which won't be ascertained as such
for years to come) than whether you know the identities of the people who
verified the design.

~~~
nly
1. There's a difference between kicking the tyres and looking under the
bonnet of a car, and taking the schematics and building your own. It's easier
to show something is crap than to convince people it is good when you're
outright hiding things. If the things you aren't hiding (well) are crap, then
god only knows how terrible the things you are _really_ hiding are.

2. Making claims based on the calibre of the companies and individuals
involved doesn't work. People don't trust DJB's algorithms simply because of
his personal track record. He also publishes whitepapers about their design
and demonstrates their advantages over the state of the art. That's hard
science, not "it must be good because DJB is a rock star". The fact that
people can then take his designs and implement them themselves, in the open,
is also important.

~~~
tptacek
I don't know many people who actually work in cryptography who believe that
verification is cheaper and easier than building systems.

People absolutely do, for the most part, trust DJB because of his track
record. Find 10 people who have adopted Poly1305 and ask them what Poly1305
does better than GCM. That's not to say their decision is uninformed; rather,
they've delegated a very specialized part of the decision that very few people
are actually qualified to make out to an expert they trust, and trust for good
reason.

In practical cryptography, the opposite approach --- people who trust nothing
until they verify it from first principles --- is extremely problematic.
That's how you end up with oddball libraries that only look like they can do a
secure scalar multiplication or reliably add two numbers together. As Bruce
Schneier is fond of pointing out, everyone can design a system they themselves
can't break.

~~~
CiPHPerCoder
> Find 10 people who have adopted Poly1305 and ask them what Poly1305 does
> better than GCM.

Two things, one of which cannot be credited to Poly1305 directly.

1. Poly1305 is easier to implement without introducing side-channels (modulo
integer multiplication on some CPUs, in which case I defer to Thomas Pornin
on how to avoid them).

2. The Chapoly constructions that use Poly1305 do so in a way that leaking
the one-time Poly1305 key doesn't buy you much. (This is the one that cannot
be attributed to Poly1305 directly.)

Others may have better or different answers. I implemented it in PHP [1], so
anyone reading this thread should keep that in mind. :)

[1]
[https://github.com/paragonie/sodium_compat/blob/f3b2d775a50d...](https://github.com/paragonie/sodium_compat/blob/f3b2d775a50d6f316b311554f4954a1668439551/src/Core/Poly1305/State.php)

------
DyslexicAtheist
_"We have analyzed the hardware full-disk encryption of several SSDs by
reverse engineering their firmware. In theory, the security guarantees offered
by hardware encryption are similar to or better than software implementations.
In reality, we found that many hardware implementations have critical security
weaknesses, for many models allowing for complete recovery of the data without
knowledge of any secret. BitLocker, the encryption software built into
Microsoft Windows, will rely exclusively on hardware full-disk encryption if
the SSD advertises support for it. Thus, for these drives, data protected by
BitLocker is also compromised. This challenges the view that hardware
encryption is preferable over software encryption. We conclude that one
should not rely solely on hardware encryption offered by SSDs."_

We know how bad general code quality is in commercial vendor firmware; it's
not surprising to see this.

~~~
exceptione
I still think it's shocking, and I imagine a lot of companies and
professionals rely on hardware encryption on the presumption that it would be
safer than software-based encryption. I myself trusted the Opal certification
too.

There was really no excuse for the vendors to botch this feature.

For me, this paper is rather alarming and should be on the front page.

~~~
xoa
It's genuinely interesting to hear this is the case. I mean, I don't disagree
with you that it's important to have out in the open regardless. But I
honestly just assumed from the start that Opal would be a badly implemented
feature checkbox, not something dependable. For crypto code like this, it
didn't seem trustworthy if it wasn't open, standard, and trivially auditable.
Unlike Apple, for example, these products don't have the kind of marketshare,
clear public profile, and constant open as well as covert attacks that would
be necessary to grant a higher degree of confidence despite being closed; the
barrier needs to be lower given the lower profile. So I've never used the
hardware crypto in any SSD; it's always been through software on top, although
in fairness that has become easier to do as AES-NI has become more universal
in CPUs. That has given a lot of the assurance and speed of a dedicated
hardware crypto chip while still using higher-level software.

I guess in retrospect it's quite reasonable that most people would trust it,
or for that matter not think about it at all but rather trust Microsoft or
whomever to be getting that right for them, and that checking the FDE box in
the GUI would mean what it said. A good example of personal blinders for me:
I'm too used to just ignoring Opal entirely, as I have since the day it was
introduced.

------
raesene9
In case you're wondering if this affects you (I know I was wondering) the
relevant command to run is (in an elevated command prompt)

`manage-bde -status <drive_letter>`

Then look at the output for "Encryption Method". If it says something like
"XTS-AES 128" I think that means you're using software encryption. If it
mentions hardware encryption, then it's using it :) (more info:
[https://helgeklein.com/blog/2015/01/how-to-enable-bitlocker-...](https://helgeklein.com/blog/2015/01/how-to-enable-bitlocker-hardware-encryption-with-ssd/))

FWIW on my Win 10 install with a Samsung PM871 it was set to software
encryption.
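
If you want to script the check across several drive letters, a small parser along these lines might help (a hypothetical helper; the exact `manage-bde` output format is an assumption based on the article linked above, so verify against your own machine's output):

```python
import subprocess

# Hypothetical: scan manage-bde output for the "Encryption Method" line.
# Assumes the software case reports an algorithm name like "XTS-AES 128"
# and the hardware case literally mentions "Hardware Encryption".
def uses_hardware_encryption(status_output: str) -> bool:
    for line in status_output.splitlines():
        if "Encryption Method" in line:
            return "Hardware Encryption" in line
    return False  # no encryption-method line found

def check_drive(letter: str = "c:") -> bool:
    # Requires an elevated prompt on Windows, as noted above.
    out = subprocess.run(
        ["manage-bde", "-status", letter],
        capture_output=True, text=True,
    ).stdout
    return uses_hardware_encryption(out)
```

On a drive like the PM871 above you'd expect `uses_hardware_encryption` to return `False` (software encryption).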

~~~
dogma1138
I never used HW disk encryption (other than TPM) because I always saw it as
unnecessary: it doesn't really improve on SW+TPM in terms of performance or
compatibility, and could only cause potential data-recovery and security
issues.

There is no performance benefit; in fact, with modern CPUs that have crypto
extensions it's often slower. And I've never trusted commercial solutions
since you could dump the key from the SanDisk/McAfee "secure" flash drives,
and all previous HDD password-protection schemes like the ATA passcode were so
shit I never understood why people bothered with them in the first place.

~~~
pmorici
The HW drives absolutely perform better if you have a RAID.

~~~
dogma1138
Is there a source for that? XTS-AES was actually slower with some of the
drives. Block chaining for RAID is done on the RAID's logical blocks, so I
don't see how HW would be faster; in fact, until fairly recently HW encryption
didn't really work with RAID setups, at least those available to consumers.

~~~
pmorici
That was my point. Software AES-XTS gets slow when you use it on a RAID.

If you do use it on a RAID, put it under the RAID: make one decrypt device for
each drive, then RAID those.

------
amarshall
As far as I know, no other popular full-disk encryption (LUKS, geli,
FileVault) delegates to the SSD’s hardware encryption (by default, anyway) as
BitLocker apparently does [0]. Anyone know otherwise?

[0]: [https://docs.microsoft.com/en-us/previous-versions/windows/i...](https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/jj679890\(v=ws.11\)#configure-use-of-hardware-based-encryption-for-fixed-data-drives)

~~~
wmf
Lots of enterprise SAN/NAS equipment uses SEDs, although one would hope those
are actually secure considering how much they charge for it.

~~~
yellowapple
"although one would hope those are actually secure considering how much they
charge for it"

Considering BitLocker requires a Windows purchase (unless you pirate it) and
LVM costs exactly $0, I think it's safe to say that the amount of money spent
is not a good indicator of actual security.

------
trulyrandom
I did not know that BitLocker relies on hardware encryption if the SSD has
support for it. That seems like an extremely dangerous default to have,
especially as the implementation is closed source in most (all?) cases.

~~~
Skunkleton
But BitLocker itself is closed source. Why do you trust Microsoft more than
the disk vendor? Further, if I were going to choose an opaque blob to trust, I
would choose the one with the smaller attack surface.

~~~
trulyrandom
Whether or not Microsoft can be trusted is not relevant. It's just very
surprising to me that Microsoft itself blindly trusts disk vendors, instead of
using their own encryption layer on top.

~~~
Skunkleton
Ahh. I would guess that they aren't blindly trusting them. Given Microsoft's
historical relationships with hardware vendors I would bet they have at least
partially audited the firmware.

~~~
trulyrandom
Yeah, I can imagine they'd conduct audits for firmware on the hardware they
ship with their own products. I doubt they look at much beyond that though.

------
CaliforniaKarl
Here's what I love the most about this: if you have a full-disk encrypted
Windows laptop, which is fully powered down (or hibernated), and the laptop
contains PHI, _and_ you lose the laptop, then you probably do _not_ have to
report it as a data breach.

([https://www.hitechanswers.net/hipaa-doesnt-require-data-encr...](https://www.hitechanswers.net/hipaa-doesnt-require-data-encryption-but-you-should/))

But with this revelation, if you have an affected SSD, and you are running
Windows, then losing such a laptop may now be a reportable event.

------
walrus01
There's a very strong argument that encryption should only be implemented in
software, in something GPL, LGPL, BSD or Apache licensed so that its full
source code can be examined by cryptographers. I'm extremely wary of vendor
supplied hardware-based "crypto" that is totally opaque with what's going on
under the hood.

~~~
sandov
IMO, it should be common sense at this point

------
Fnoord
BitLocker, eh? Truecrypt.org, before its final demise, claimed it was
obsolete due to FDE being native in Windows nowadays (i.e. _BitLocker_).

~~~
nyolfen
i remember this being widely interpreted as sarcasm or a straussian double
meaning at the time

------
paavoova
I wonder what this implies for SSDs that claim to support instantaneous ATA
Secure Erase. Many SEDs, when issued this command, will simply discard their
internal key and generate a new one, in theory rendering the data
irretrievable. I've seen this method recommended over traditional wiping for
reasons of security (you can't be sure all blocks are wiped when overwriting,
due to internal reservation), speed, and flash longevity.
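
The crypto-erase idea can be demonstrated with a toy sketch (illustrative only: a SHA-256 counter keystream stands in for the AES that real SEDs use, and of course this assumes the drive actually encrypts with the key it discards, which is exactly what the paper calls into question):

```python
import hashlib
import os

# Toy stream cipher for illustration: XOR data against a SHA-256 counter
# keystream. Real SEDs use AES; the erase mechanism is the same idea.
def keystream_xor(key: bytes, data: bytes) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

internal_key = os.urandom(32)
stored = keystream_xor(internal_key, b"user data on flash")  # what the NAND holds

# ATA Secure Erase: discard the old key and generate a new one.
internal_key = os.urandom(32)

# The old ciphertext is still physically present, but no longer decryptable.
recovered = keystream_xor(internal_key, stored)
```

After the key swap, the old ciphertext remains on the flash but decrypts to garbage, which is why key discard can be instantaneous while overwriting every block cannot.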

~~~
mjevans
My own personal policy is to allow SSD reuse internally, but to just
physically destroy the drives at the end of useful life.

------
rhtcmu
I am the original inventor of self-encrypting drives. My 3-4 page comment is
too long, so you can read it here: www.privust.com/sedlies, which describes
the work at RU, its partial truths, and the lying tweets that followed. Robert
Thibadeau, Ph.D.

~~~
fencepost
You'll probably get more credibility if that's linked in some way with your
page at CMU or your LinkedIn profile.

------
pmorici
I spent a lot of time this year looking at SSDs with encryption support. I've
come to the conclusion that what is really needed is to separate the crypto
hardware from the storage. Make a 2.5” drive sled, for example, that takes any
M.2 and adds crypto to it. Then you can choose the best supplier for both
crypto and flash independently.

------
sandov
Who would have thought that proprietary firmware encryption implemented by
hardware vendors would be crappy?

~~~
Arkight
Who would have thought water is dry?

------
tyingq
Not related to encryption, but I enjoyed this article where they hack into the
microcontroller on a typical HDD:
[http://spritesmods.com/?art=hddhack](http://spritesmods.com/?art=hddhack)

------
monocasa
I'm not surprised. My understanding is that encryption was originally added
as a whitening step for the data; a product guy somewhere heard about it and
added it as a checkbox feature.

------
nootropicat
Based on the presented analysis, TCG Opal encryption for the Samsung 850+
series appears to be secure.

I don't see a reason to still use software encryption. I would sooner expect a
backdoor in CPUs than in SSDs. Almost nobody even has the capability to
analyze the microcode for modern CPUs. It could contain a backdoor that stores
N AES passwords in the CPU itself, sorted by the amount of data encrypted;
using AES-NI makes that relatively trivial. At worst, both SSD controllers and
AMD/Intel CPUs have backdoors like that, but if that's the case there's
nothing to be done.

~~~
m45t3r
> I don't see a reason to still use software encryption.

Why not? Software encryption can still be hardware accelerated (thanks to
encryption instructions in the CPU), and it is fast enough not to be a
bottleneck unless you have a very fast IO device (and "very fast" in this case
means either Optane or modern SSDs in RAID-0). Also, the impact on CPU
usage/power consumption is low.

------
chmike
It would have been interesting to see the results with Intel SSDs.

------
conbandit
Warning - this file will automatically download instead of opening up in the
browser.

~~~
jobigoud
It's a direct link to a PDF, what it does depends on how your browser is
configured. For me on FF it shows a dialog asking if I want to open or save.

~~~
sandov
Doesn't the website set a header or something like that for the preferred way
of consuming it: in-site viewing or downloading to disk?

