

Full-disk encryption for all computer drives, coming soon - nreece
http://www.computerworld.com/action/article.do?command=viewArticleBasic&taxonomyName=storage&articleId=9126869&taxonomyId=19&intsrc=kc_top

======
TomOfTTB
My problem with encryption technology like this is that it falls into that
gray area between theoretically useful and actually useful.

Equivalent software encryption has existed for a while (it's built into Vista
for example). But when you read about these Government or Financial laptops
getting stolen with people's private information on them they're never
encrypted. Why?

Because any info that's valuable is too valuable to risk losing just because
someone lost the password. I mean, he says it himself...

"it can't be brought back up and read without first giving a
cryptographically-strong password. If you don't have that, it's a brick. You
can't even sell it on eBay."

So when you deploy this kind of thing you have to ask yourself if there's any
circumstance under which that password could get lost. I'm not against the
technology I just don't know a lot of people who will be brave enough to use
it.

~~~
tptacek
I don't want to comment much on a story from a trade rag like Computerworld,
especially when it ignores the fact that we already have better options than
this press release product. But, I'll take a moment to correct you:

In many F-500 companies, disk encryption is already a corporate standard on
all laptops; the organizations that are worried about losing passwords keep
escrow keys. The problem people are trying to solve is, if you leave your car
unlocked and lose your laptop, you don't have to cut a press release about how
many SSNs you just lost.

~~~
TomOfTTB
I don't deny that there are ways in which encryption can be very useful and
that a very organized company can guard against ever losing their data because
of a password mishap. I'm just saying I've never met one of these companies
and I've done my fair share of consulting work for companies large and small.

People are used to security that can be circumvented if need be (through an
administrator's password or, at worst, a disk recovery service). When you're
talking about hardware based encryption that option is no longer on the table
and I think that rightfully makes a lot of people uncomfortable. In that way
it's a huge risk.

Which is why things like BitLocker (again, built into Vista) don't get used.

~~~
zain
Big organizations don't trust individuals enough to make them the single point
of failure for encryption. Most often, these organizations run key servers
that have the actual key, and provide the key based on some form of auth
(usually Active Directory). If the user forgets their AD password, it's an easy
reset; and if they leave the company or something, their data is easily
recoverable.

This sort of "encryption in the sky" approach is available to consumers too.
There are products by most of the big encryption vendors (like
<http://voltage.com/vsn/>) that make it easy for a user to encrypt their data
without putting themselves at risk for data loss.
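Roughly, the escrow setup looks like this (a toy sketch: the XOR "cipher" below is a stand-in for real key wrapping, and the passphrase/server names are made up for illustration). The point is that the disk key can be recovered through either the user's passphrase or the organization's escrow key, so a forgotten password doesn't brick the data:

```python
import hashlib
import os

def toy_stream(key: bytes, n: int) -> bytes:
    """Toy keystream from SHA-256 in counter mode -- a stand-in for a real cipher."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def wrap(kek: bytes, disk_key: bytes) -> bytes:
    """'Encrypt' the disk key under a key-encryption key (XOR with toy keystream)."""
    pad = toy_stream(kek, len(disk_key))
    return bytes(a ^ b for a, b in zip(disk_key, pad))

unwrap = wrap  # XOR is its own inverse

# The drive's actual data key never leaves the machine in the clear.
disk_key = os.urandom(32)

# Path 1: the user's passphrase, stretched into a key-encryption key.
user_kek = hashlib.pbkdf2_hmac("sha256", b"correct horse battery", b"salt", 100_000)

# Path 2: an escrow key held by the org's key server.
escrow_kek = os.urandom(32)

blob_user = wrap(user_kek, disk_key)      # stored on the laptop
blob_escrow = wrap(escrow_kek, disk_key)  # stored with IT

# Either party can recover the same disk key independently.
assert unwrap(user_kek, blob_user) == disk_key
assert unwrap(escrow_kek, blob_escrow) == disk_key
```

Same disk key, two independent ways to unwrap it; the user is never the single point of failure.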

------
jwr
That kind of encryption is useless, because I can't audit it. How do I know my
data really IS encrypted and the key isn't just stored on the drive itself?

~~~
tptacek
Proprietary hardware encryption schemes are assessed by third parties _all the
time_. If you just aren't comfortable with anything but open source, that's
fine. But you already rely on plenty of other security systems you can't
audit.

~~~
jwr
I don't. And I really do believe that I should be able to audit any security
solution I use.

I don't understand what you mean by "Proprietary hardware encryption schemes
are assessed by third parties all the time". Sounds like corporate speak to
me. First, any "proprietary encryption scheme" is rubbish, and second, I
should be the one to assess it.

My point with the drive encryption solution was that you are told that your
data is encrypted, but you have no way of checking it yourself. What if
encryption is disabled in your drive? Can you tell?

The final output of any encryption solution needs to be independently
accessible, so that you can check whether it resembles white noise (which it
should, ideally).
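That check is easy to sketch. On a real drive you'd read raw sectors (e.g. from /dev/sda as root, skipping headers); here I just compare a random-looking buffer against repetitive plaintext. Encrypted data should come out near 8 bits of entropy per byte:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte (8.0 is the maximum, for uniformly random data)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Stand-ins for raw sector reads:
random_like = os.urandom(1 << 16)                     # what an encrypted sector should look like
plaintext = (b"all work and no play " * 4000)[: 1 << 16]  # typical unencrypted content

print(f"random-looking: {shannon_entropy(random_like):.2f} bits/byte")
print(f"plaintext:      {shannon_entropy(plaintext):.2f} bits/byte")
```

If your "encrypted" sectors score well below 8 bits/byte, something is wrong.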

~~~
tptacek
So go ahead and audit them. The vendors will pay you to do it. Go thumb
through a couple years of Black Hat talks for examples of people finding
vulnerabilities in firmware, microcode, and closed-source cryptosystems in
their spare time.

By "proprietary encryption scheme", I'm just referring to cryptosystems for
which you don't have the source code. You will have trouble finding any
mainstream full-disk encryption vendor that isn't using something like AES,
LRW-AES, or XTS-AES.
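The distinguishing feature of disk modes like XTS-AES is the per-sector tweak. Here's a toy illustration of the idea (this is NOT XTS and not secure; the hash-based keystream just stands in for a real tweakable cipher): identical plaintext sectors encrypt to different ciphertext, so on-disk patterns don't leak.

```python
import hashlib

def toy_encrypt_sector(key: bytes, sector: int, data: bytes) -> bytes:
    """Toy tweakable 'cipher': keystream keyed by (key, sector number).
    Illustrates the tweak idea behind XTS-AES; not real XTS, not secure."""
    out = b""
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(
            key + sector.to_bytes(8, "big") + counter.to_bytes(8, "big")
        ).digest()
        counter += 1
    pad = out[: len(data)]
    return bytes(a ^ b for a, b in zip(data, pad))

key = b"\x00" * 32
sector_data = b"\x00" * 512  # two identical all-zero sectors

ct0 = toy_encrypt_sector(key, 0, sector_data)
ct1 = toy_encrypt_sector(key, 1, sector_data)

# Same plaintext, different sectors -> different ciphertext.
assert ct0 != ct1
# Decryption is the same XOR operation, parameterized by the sector number.
assert toy_encrypt_sector(key, 0, ct0) == sector_data
```

Without the tweak, every identical sector would encrypt identically, and an attacker could map out the drive's structure without decrypting a byte.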

Nobody you care about is ever going to ship a product based on Super-
Mega-40960-Bit-Matrix encryption. It's not 1994 anymore. There is absolutely
no business case to be made for using nonstandard encryption: if you do it,
you can't sell it to the government, you can't sell it to any Fortune 500
company, and you get made fun of in magazine reviews.

I don't understand your "what if encryption is disabled in your drive"
comment. What makes you think that's hard to check? Also, what makes you think
that "secretly not encrypting an encrypted drive" would be a sane business
decision for any vendor? They'd be open to spectacular liability. The first
credit card processor that lost a secretly unencrypted disk drive would end up
owning Seagate.

~~~
jwr
So, how do you check if your data is encrypted? Open the hard drive and look
at the platters?

I'm sorry, I am not convinced. I would much rather run software that I can a)
audit and b) check the output of.

~~~
tptacek
(1) I think all your questions are answered at the Opal spec site:
<http://tinyurl.com/aulrpj>

(2) If you think that hardware-level analysis is out of scope for assessing a
corporate full disk encryption system, you're out of step with the security
industry; hobbyists do hardware analysis in their spare time. There are likely
much simpler ways to verify encrypted storage than "looking at the platters".

(3) Like I said originally, if you're religious about open source, more power
to you. It's obvious that there's little I can say to make you happy with the
TCG. But, and I mean no offense, from what little I know of you it seems like
they know much more about this topic than you do.

~~~
jwr
As to (2), my point was that I have no way of doing it.

As to (3), I am not "religious about open source". My point was that at the
very least I would like to be able to verify what the output is, which I
can't.

Your arguments about companies' reputations being on the line are something I
don't buy. This is Hacker News; I buy technical arguments. And if you believe
"reputable companies" don't do strange things with your keys or your data, may
I kindly remind you of <http://en.wikipedia.org/wiki/NSAKEY>.

Regarding (1), thanks for the reference, I will certainly learn more about the
technology involved.

~~~
tptacek
I don't think it's really reasonable to rule out hardware security simply
because jwr from Hacker News isn't capable of assessing it, but I don't blame
_you_ for not using it.

~~~
jwr
It isn't just me; Bruce Schneier doesn't trust the vendors either, and for
good reasons:

[http://www.schneier.com/blog/archives/2009/02/the_doghouse_r...](http://www.schneier.com/blog/archives/2009/02/the_doghouse_ra.html)

(yes, I know this is about a hardware enclosure, not about a drive, that's
actually lucky because someone could _check_ if the bytes are actually
encrypted)

Now let's hear you mock Bruce :-)

------
rw
Will we still own our data?

~~~
ramchip
I was kind of worried when I saw "Trusted Computing", but having read the
article I don't think there's any DRM-related use for it. It's really just HD
encryption.

~~~
tptacek
"Trusted Computing" is a bit of a bugbear; the technology behind it is totally
inevitable and fundamentally innocuous. All it's saying is, the OS should be
able to extract the promise of a secure channel to the chipset and the
hardware. Without that promise, you lose your machine just once to a piece of
malware and you can never trust it again.

~~~
mindslight
I wouldn't call the idea of physical access not being the highest level (to
hardware state) particularly innocuous. From the article - "You can't even
sell it [a stolen drive] on eBay". Theft deterrence is nice, but the piles of
bricked drives left over from Xbox upgrades are disgusting.

The same eviction of malware can be achieved through a documented hardware
interface to the lowest levels of microcode (say, JTAG). The current problem
is not due to openness, but closed CPU microcode.

~~~
eru
> The same eviction of malware can be achieved through a documented hardware
> interface to the lowest levels of microcode (say, JTAG). The current problem
> is not due to openness, but closed CPU microcode.

Would you please elaborate?

~~~
mindslight
kragen's comment hits the nail right on the head. Any software update system
suffers from the ability to install a "rootkit" that can emulate the update
mechanism itself to ensure its own survival. I was mistaken in thinking that
CPU microcode was stored in non-volatile memory on the CPU itself, but the
same idea goes for any area in which such rootkits can be installed.

The simple solution is to allow easy updating of that base firmware through a
dedicated hardware interface. A socketed DIP isn't hip anymore (eww through-
hole), but a USB device header on the motherboard would certainly work.

I'd feel better about trusted computing technology if one of their design
goals wasn't preventing physical "tampering" but instead they provided
unfettered access through a debug port. However, it seems like they're aiming
for the exact opposite in order to facilitate naive software, theft
deterrence, and business model preservation.

~~~
tptacek
Can you explain more about how the TPM facilitates naive software and business
model preservation? That's not my experience with it, but maybe you've had a
different experience.

~~~
mindslight
Certain restrictions are impossible to encode in protocols where one's
computer acts as their agent. For example, it's impossible to restrict the
duplication or longevity of a document. Remote attestation restricts what code
can be used to run a protocol, and thus enables implementation of such naive
rules.

I don't currently use remote attestation for anything, but it's a matter of
time until some bozo decides that remote attestation is the way to solve
online banking security ("if only we could be sure they weren't running
malware!"), and brings the technology stack mainstream.

When that happens, the non-techies (seeing no distinction) will enforce their
business rules on end users' computers (they occasionally try to do this now,
but run up against reality). Client-side-only verification of form fields may
be laughable now, but won't be so funny with no incentive to fix it because
common people cannot exploit it.

------
ks
How will this affect data recovery?

~~~
mzexfswlkr
If you know the password, then the tools for going through the drive will not
change. If you have to remove the PCB and replace it with one from another
drive, or you have lost the password, then you are stuffed. Unless of course
the drive maker has built in a backdoor - from past experience of security on
PCs, the unchangeable security password will probably be 'seagate'

