Ask HN: TrueCrypt audit status?
185 points by mukmuk on Feb 18, 2015 | 157 comments
After raising over $70,000 from the community in October 2013, progress on the TrueCrypt audit has been fitful at best. The last update on http://istruecryptauditedyet.com was April 14, 2014.

Two key contributors, Matthew Green (https://twitter.com/matthew_d_green) and Kenn White (https://twitter.com/kennwhite) remain active on Twitter.

I am aware of VeraCrypt and CipherShed but these appear orthogonal to the original audit.

Does anyone know what the heck is going on?

It's in a weird logistical state. I'll take a healthy chunk of the blame.

The TC audit project commissioned iSEC to do a formal code audit. That audit was completed professionally and efficiently. No smoking gun problems were found (several nits were, but nothing that would make it any easier to decide whether to trust the package).

That iSEC audit was the headline achievement for the project, so the fact that it was finished should reassure people worrying about whether the project did anything.

After the code audit, the project was supposed to move on to review the cryptography in TC. Which is where I come in.

Because the project was considering commissioning services from professional appsec firms, I recused myself from the project (at the time, I worked for a very large appsec firm). My feeling is that a better use of the TC project resources would be to set up some kind of crowdsourced audit slash bug bounty. When the code audit was completed, and after I had left Matasano, I volunteered to coordinate a crowdsourced crypto audit.

Unfortunately, I was also in the midst of starting a new company and recruiting cofounders and then the holidays hit and long story short things went off the rails.

There are two big paths forward for the TC project that I am aware of:

1. They can rekindle the crowdsourced crypto audit (I'd be happy to remain involved, or to talk to any other subject matter expert that wanted to do that job --- n.b., I was going to do the work gratis). If any kind of formal review of TC's cryptography is to be done, this is the way to do it; the project can't afford what it costs to retain professional cryptography engineers to review the code (real crypto security consulting costs a multiple of what appsec consulting does).

2. They can devote all the remaining funds to a public bug bounty for Truecrypt.

There may be options 3 or 4 that I'm not aware of. I have a decent relationship with Kenn and Matthew, but I have not been trying to keep myself in the loop on the project.

There you go: more than you wanted to know about the TC audit project!

None of it has much of anything to do with that weird announcement from last year.

Thanks Thomas.

This is Matt Green, one of the two people running the audit. As tptacek explained, we went down a few blind alleys following the public collapse of the Truecrypt (development) project.

In the last few weeks we've signed a contract to begin a commercial evaluation of the cryptography in Truecrypt. We've also been doing some internal evaluation of portions of the source code. For reasons related to getting the best price, the start date of the audit was allowed to drift forward a bit. This was necessary to make sure that the donated money stretches as far as possible.

Rather than giving all the details in an HN post, I'm going to write an update on my blog (blog.cryptographyengineering.com) but it won't be up until we've notified everyone involved that we're making the details public, probably around 4:30-5pm ET today.

Please do go ahead and post a link to HN when you do.

Also: speaking in no "official" capacity whatsoever, I'd advise you to stay away from the forks of Truecrypt.

Unless something new has come to light since last I looked, the licensing situation on the TC code is weird:


... which means there is a pretty strong disincentive for people with serious crypto and systems expertise to invest their time and energy building on it. You don't want to trust crypto platforms with built-in adverse selection problems.

Wow those are serious concerns, thank you for linking that.

Maybe they have an email list of the original donors and can propose some multiple choice options:

1 - bug bounty

2 - attempt to hire someone at a steeply reduced rate for the audit

3 - use the money to seed a complete replacement or a clean room rewrite if possible (this is a can of worms but given the license issues seems like the only realistic way forward... might need the help of FSF or ASF or the like)

There is pretty much zero chance that the TC audit project is going to sponsor a rewrite of Truecrypt. That's a project that would cost much more than the TC project has to spend.

It just seems futile to spend the money on TC. I get the number of users of it is huge and the money was donated for this specific purpose. Users of TC should already be looking to move away from it whether there's a vulnerability or not. Even if a vulnerability is found, how is it legally patched? Is it almost better not to know?

With the money left, maybe a smart fund-raiser could get some conversations started with bigger donors who could match grants and go in on a large round. The money might buy you the time of a Bruce Schneier or Richard Stallman to advise and promote the project in its infancy. And non-profit software orgs are underpaid and work on shoestring budgets already, so this is real money to them.

The money was contributed for a specific purpose, and Matthew and Kenn are scrupulous about ensuring that it gets used for that purpose. They actually can't take the money and use it as a seed for a different project. It isn't a slush fund.

If the design/architecture were contracted to experts and the actual implementation were written by the community, how expensive would that be?

The experts shouldn't write a single line of code; use the community as code monkeys, only accepting pull requests and merging them into the project (basically what Linus does these days). Would that not be feasible?

I don't think this would work. The details of how block-level crypto works are intimately connected to a bunch of fiddly systems programming details like bootup, power saving, and memory management. It's not nearly enough just to propose a workable design for how to make a virtual hardware-encrypted disk with XTS; you need to evaluate a lot of raw code, too.


I don't see this as the roadblock. They (the experts) could bill by the hour. The most intensive period is the initial specification/design/architecture. After the burst period they just have to review the commits for security pitfalls and merge them if OK. The community could have some volunteer reviewers for triage.

I have no idea if this actually works, and I haven't heard of anything like it being done before, so take it with a grain of salt. That's why I asked more knowledgeable people how feasible it could be.

So you're proposing keeping elite crypto expert(s) on the payroll for--what, years?--as they patiently wait for the community to build something that meets their standards? I'm not going to say such a person or people don't exist, but Linus is a rare sort; considering the market value of crypto experts' talents I think chances of finding someone willing to serve in such a role are rather slim.

Can we still safely use TC 7.1a?

The EFF seems to recommend using DiskCryptor for Windows [1]. What are HN's opinions on this?

Also what does HN think about still using TrueCrypt? Any reason not to?


Been mulling this over myself. Doubt anyone with a reputation would like to stake theirs to another 'solution' that may well turn out to be flawed. Plus the FDE approach seems broken in that machines have to be powered down etc.

I take the view that for my own threat model (i.e. someone nabbing my machine) DC, even TC would be perfectly adequate for that first layer of protection. Advice given by tptacek on using PGP individually for sensitive information (if I have understood correctly) could then be coupled to that FDE where required.

For my own purposes this would protect what needs protecting in terms of at-rest data. A vast improvement over the no-encryption situation.

SSDs with hardware encryption seem to be the new frontline defense for mainstream users such as myself. Same issues as with any FDE, I suppose, but coupled with filesystem PGP encryption ought to again offer adequate protection from opportunistic thieves.

So we all know Truecrypt is dead. What should we use instead?

I don't know of any other software that provides cross-platform, encrypted, and mountable disk images. We have BitLocker, FileVault, and LUKS, but none of these work (easily, at least) on different platforms.

The idea that you'd want cross-platform full disk encryption suggests you may be investing too much faith in the powers of full disk encryption. FDE does basically one thing for you: it reassures you if your laptop is stolen from the back seat of your car or left in a cab.

Cross-platform encryption is very important, and you should look for good solutions (most of us who rely on encryption for real operational reasons just hold our noses and use PGP). But full disk encryption has very limited utility. The idea of a USB drive you can plug into any computer that is locked by default is attractive, but you can get better security out of a USB drive that holds nothing but files encrypted at the application layer.



> You can’t authenticate the data. Authenticating every sector is too expensive. Try to (painfully) authenticate the whole disk and any disk error trashes the disk. Maybe your users enjoy exciting games of chance? Authenticate arbitrary groups of sectors and you can play Russian Roulette with your files. (http://sockpuppet.org/blog/2014/04/30/you-dont-want-xts/)

Out of interest: Doesn't the use of error detecting filesystems like ZFS and Btrfs solve the authentication problem? I don't have anything resembling a formal argument, but intuitively having each block checked in a Merkle tree like fashion should inhibit attacks where attackers can only change blocks in a random manner or restore old backups of the blocks. Of course time traveling - i.e. replaying - the file system as whole is still possible, but selectively manipulating the data should not.
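That intuition can be sketched with a toy model using only the standard library. The hash-based XOR keystream below is a placeholder for any unauthenticated sector cipher, not a real construction: the point is that bit-flipping the ciphertext flips the plaintext undetected, while a filesystem-level checksum over the plaintext (Btrfs/ZFS style) catches the manipulation.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream from iterated hashing -- a stand-in for an
    # unauthenticated sector cipher, NOT a real cipher.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, block: bytes) -> bytes:
    # XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(block, keystream(key, len(block))))

key = b"sector-key"
plaintext = b"pay $100 to alice padding......."
ciphertext = encrypt(key, plaintext)

# Bit-flipping attack: an attacker who knows the data layout flips
# ciphertext bits to flip the corresponding plaintext bits.
tampered = bytearray(ciphertext)
tampered[5] ^= ord("1") ^ ord("9")  # turn the "1" of "$100" into "9"
recovered = encrypt(key, bytes(tampered))
assert recovered == plaintext.replace(b"$100", b"$900")

# A plaintext checksum stored by the filesystem detects the change.
stored_checksum = hashlib.sha256(plaintext).digest()
assert hashlib.sha256(recovered).digest() != stored_checksum
```

As the comment above notes, this catches in-place manipulation but not replay: an attacker who restores a consistent old snapshot of both block and checksum still wins.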

It would probably make the exploit harder (unless the exploit targets the filesystem code itself), but there are almost always subtle gaps between "breaking the exploit" and "fixing the vulnerability". This is a fun question to dig into, though.

This is also very similar to the tactic of using an encrypted message digest in place of a MAC. The whole point of a MAC is that it can't be calculated without the secret that provides its security, and so schemes that use digests instead are (a) harder to attack than systems that do nothing and (b) still broken.
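The digest-vs-MAC distinction fits in a few lines of stdlib Python. This is a simplification of the broken schemes described above (which encrypt the digest rather than leaving it bare), but the core failure is the same: a hash has no secret, so anyone who can modify the message can recompute a matching digest, whereas an HMAC cannot be forged without the key.

```python
import hashlib
import hmac

key = b"shared-secret"
message = b"transfer 10 coins to bob"
tampered = b"transfer 99 coins to eve"

def verify_digest(msg: bytes, digest: bytes) -> bool:
    # Verifier holds no secret: it only checks that the hash matches.
    return hashlib.sha256(msg).digest() == digest

def verify_mac(k: bytes, msg: bytes, tag: bytes) -> bool:
    expected = hmac.new(k, msg, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Attacker swaps the message AND recomputes the digest: accepted.
assert verify_digest(tampered, hashlib.sha256(tampered).digest())

# Attacker cannot recompute the MAC without the key: rejected.
best_forgery = hmac.new(b"wrong-key", tampered, hashlib.sha256).digest()
assert not verify_mac(key, tampered, best_forgery)
```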

The ZFS crypto implementation (which isn't really available outside Oracle's Solaris) uses AES-GCM, which solves the authentication problem.

If you have filesystem encryption, block-level encryption is probably unnecessary. Filesystem encryption is in most ways superior to block-level crypto.

Do you know of any software (maybe it's possible with Truecrypt?) to have full disk encryption with a hidden volume for Linux? I remember Truecrypt only worked on Windows with this feature. As Windows can auto-update your computer even when you disable it in the settings, I wouldn't want to use it.

The comparisons between filesystem-level vs. block-level encryption that I've encountered usually make a common distinction; namely that file metadata is still present when only applying fs-level encryption. What are some attributes of fs-level encryption that would make it a superior choice over block-level?

There are three huge advantages filesystems have over block devices when it comes to encryption:

1. They have storage flexibility, so they can allocate metadata to authenticators and nonces. The fact that sector-level crypto can't do this means that the "state of the art" in efficient sector crypto is essentially unauthenticated ECB mode.

2. They're message-aware, so they can apply authentication at meaningful boundaries; a block crypto device is essentially a simulated hard disk, and so it doesn't know where files begin and end.

3. Being message-aware, they can protect files at a better level of granularity than "all or nothing", which for instance is the security failure that made it so easy for the FBI to convict Ross Ulbricht for Silk Road.
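Advantage 1 can be sketched in stdlib Python. The hash-based keystream below is a toy stand-in for a real cipher like AES-GCM; the point is structural: a filesystem has room to store a fresh nonce and an authenticator tag next to each file, while a fixed-size disk sector must map 512 bytes to exactly 512 bytes and has no spare bytes for either.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy keystream (stand-in for a real cipher's CTR core).
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def seal(key: bytes, plaintext: bytes) -> dict:
    # Per-file metadata: a fresh random nonce plus an HMAC tag over
    # nonce + ciphertext, stored alongside the data.
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return {"nonce": nonce, "ct": ct, "tag": tag}

def unseal(key: bytes, rec: dict) -> bytes:
    expected = hmac.new(key, rec["nonce"] + rec["ct"], hashlib.sha256).digest()
    if not hmac.compare_digest(expected, rec["tag"]):
        raise ValueError("authentication failed")
    ks = keystream(key, rec["nonce"], len(rec["ct"]))
    return bytes(a ^ b for a, b in zip(rec["ct"], ks))

key = os.urandom(32)
rec = seal(key, b"per-file secret")
assert unseal(key, rec) == b"per-file secret"

# Flip one ciphertext bit: the tag check rejects it, instead of silently
# returning garbage the way unauthenticated sector crypto would.
bad = dict(rec)
bad["ct"] = bytes([bad["ct"][0] ^ 1]) + bad["ct"][1:]
try:
    unseal(key, bad)
    raise AssertionError("tampering went undetected")
except ValueError:
    pass
```

Here the 16-byte nonce and 32-byte tag are exactly the per-object metadata a sector cipher has nowhere to put.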

A lot of concerns about filesystem crypto stem from the fact that filesystem crypto precedes sector-level crypto, and most of it was designed (or has designs tracing to) the 1990s. What people who don't spend a lot of time studying crypto should remember is that nobody knew how to encrypt anything in the 1990s. It was a unique and weird time, where there was a lot of demand and interest in crypto, but not enough knowledge to supply crypto effectively.

So we should be careful about judging filesystem crypto by the standards of the 1990s.

I've read that page before. It does a good job of explaining why you don't (just) want full-disk encryption, and explaining why developers might want to use application-level crypto in their own application. However, what it doesn't explain is "what should users (developer or otherwise) do to encrypt all the random files on their system?".

Full disk encryption has the advantage of being transparent and not application-specific, so you don't have to teach every random application to do application-level crypto.

Sure, if you have a few specific files you want to encrypt, you could run gpg. You could even teach specific tools to understand gpg, such as text editors that can decrypt to memory, edit, and re-encrypt before writing to disk. But what about a source tree, stored in a git repository, regularly manipulated with git and various command-line utilities, and edited with a variety of editors? How would you store that, securely, other than on a block device encrypted with full-disk encryption?

Would you suggest a file-level encrypting filesystem instead, similar to eCryptFS? Would you suggest integrating encryption into ext4 (currently being worked on) and other filesystems?

Application-level crypto works great when the target surface is small and known, but fails badly otherwise.

Take an email you have received on a Unix mail server, and let's assume it was sent encrypted. Is the search-term database encrypted, the one that was created while the mail was decrypted? Is the reply you sent encrypted while resting in the sent directory? Are there logs, metadata, offline caches, and similar leakage of data?

One should start with full-disk encryption, then add application-level encryption for defense in depth.

I can imagine all the ways an OS can leak information if it isn't fully encrypted at the sector level.

Just having the filenames is a serious risk.

I spent all last night searching for an answer to this question. I couldn't find one.

A few times in the past I've taken a hard drive out of a failed computer and retrieved data from it on another computer. Sometimes this was cross-platform, reading Windows data on a Linux machine.

I haven't had to do this often, but the times I have had to do it, it's been a lifesaver.

Any advice, apart from keeping better backups so I don't have to do any cross-platform data recovery?

Pass the disk through to a VMware/KVM/Xen container (if you can find another machine of the same processor architecture) and then boot the OS in the VM.

Usually it will piss Windows right off to be booted on completely different hardware, but usually it works enough to grab any files you may want from within the FDE.

It's utterly useless if the FDE was using any TPM hardware though, in that case then I'm afraid you're stuck using the original hardware.

GP didn't ask for full disk encryption, only "mountable disk images" which are far more flexible to deal with in heterogeneous environments.

Yes, and I'm saying that "flexibility" isn't worth its cost in security. Use app-layer crypto. Among other things, app-layer crypto gives you fine-grained data integrity, foreclosing on a vector for remote code execution attacks.

Do you have any specific examples of implementations on what you mean? I'd love to store word documents, pdfs, etc. gpg encrypted on an USB stick, without having the decrypted bytes be written to the device (Which I've previously done with TrueCrypt volumes). The concern being that I've decrypted a file, am working on it and someone pulls out the USB and leaves with it.

If you're serious about protecting PDFs and stuff, encrypt them with PGP. It's much more annoying than using FDE systems like Truecrypt, but much more secure.

If you're not so serious that you're willing to manually encrypt and decrypt files (or ZIP archives of files), just use your OS's full disk encryption scheme. On a Mac, for instance, you can create virtual disks with AES-XTS and keys derived from passwords; that's built into the OS.

What people really want is some kind of transparent encrypted filesystem. That's a reasonable thing to want, and it would be more secure than Truecrypt. I don't know of a good one.

Exactly. While I could use PGP to encrypt individual files, that's really inconvenient for more than a handful. Putting them into ZIP-style archives introduces potential problems with data being written to temp files, etc. A container-based file system solution would be preferred.

It'd also be nice if it were cross-platform, since lots of people do use different operating systems during their day. I know I do. :)

Over the weekend I was looking at some of the commercial products that are positioning themselves as TC replacements. And most of them are a little too close to the military/govt for my comfort. I'd rather have something open source just for the ability to inspect the code, if nothing else.

> What people really want is some kind of transparent encrypted filesystem.

Isn't that the promise of FileVault on a Mac? Is that not under discussion here because it's not good, or because it's not cross-platform? (In other words, should I not trust FileVault?)

Asking if you should trust software X is like asking the internet if it'd be healthy for you to start running a marathon.

For starters, how would we know? The software in question is closed-source and has spotty docs at best. And more importantly: your trust in software is something that only you can establish for yourself, regardless of how many people on the internet claim the product trustworthy.

Appelbaum on FileVault(1) in 2006: http://events.ccc.de/congress/2006/Fahrplan/attachments/1244...

These are all very good points, but I guess I was thinking purely in architectural terms, like "does this software have a known-bad crypto design?" Tptacek answered that, but you make a good point that ultimately no one but Apple knows whether FileVault is doing exactly (and only) what Apple says it does.

To be fair they asked if they should not trust it. Which I think is a fair question. It could be answered with "I don't have a reason not to", or "Yes, because ...". Neither answer implies that it should be trusted.

Filevault is AES-XTS sector-level crypto. It's what I use, and I like it fine, but I encrypt important stuff with PGP.

Truecrypt also offers a container filesystem, which is what I used to use. It's cross platform and it always served me well. It would create a virtual disk drive with the contents being stored in the encrypted container file.

You want full disk encryption that adds plausible deniability in case the KGB sees you have an encrypted file named "Nuclear Launchcodes_2016_final2.docx" on your USB drive. Truecrypt has that - who else has it?

No you don't. The scenario you have concocted is ridiculous.

The specific scenario is unlikely, but hiding file and folder names is definitely a great feature. A lot can be inferred from file and folder names in industrial espionage scenarios.

If you're worried about someone inferring data from seeing files with revealing names like "potential microsoft buyout.ppt", perhaps you should give your files less conspicuous names? Pretty much every organization concerned with security has figured this out long ago.

Frankly I find the suggestion that I manually give every single file on my computer an inconspicuous name even more ridiculous than a document of nuclear launch codes.

Metadata is data too. It should be protected.

Filename metadata absolutely does deserve protection! I'm not sure why it's become a ground truth that filesystem-level crypto must leak filenames.

For me, the much-increased convenience of mountable encrypted disks vastly outweighs the slightly-increased security of application-level encryption.

Who said anything about full disk encryption? Certainly not the comment you replied to.

Truecrypt is full disk encryption.

It is not only full disk encryption. I have used a file container encrypted with financial information on linux and windows over the years. I can mount that file on either OS.

Like Charles Dimino, you're not following the point. The security limitations of FDE come from doing crypto at the block layer. If your cryptosystem gives you something that (a) transparently encrypts and (b) mounts as a filesystem, it's block crypto, and it shares the same problems as whatever cryptosystem unlocks your boot drive.

The problem is block-level crypto. It has nothing to do with whether it's layered on top of a hardware disk drive.

You're obsessing over the crypto, but we're talking about the user experience. It's block-level crypto. We get that. No one cares, in the context of this conversation.

What you're not getting is that TrueCrypt offered a particular interface experience and cross-platform compatibility that doesn't exist elsewhere.

You write as if the whole thread isn't there for people to see, and as if I had somehow responded to something you said rather than it being the other way around. You literally started this unproductive subthread by responding to the comment where I addressed the need for cross-platform things that work like Truecrypt does, and you've tried to build an argument by stipulating that security doesn't matter. Sorry, security is all that matters here.


"The term "full-disk/on-disk encryption" is often used to signify that everything on a disk is encrypted, including the programs that can encrypt bootable operating system partitions."

Are you going to tell Markus Gattol he's wrong? No? Good, let's move on.

What matters here is the security, and the adoption rate of TrueCrypt is/was through the roof, because of how it allowed folks to move encrypted volumes across various platforms without much hassle.

What you wrote seems to intimate there's no actual need or value in moving encrypted volumes across platforms, and that if folks actually want to do that they should just encrypt individually and at a FS level and do so using PGP, which has existed for years, and whose adoption rate and ease-of-use are both, compared to TrueCrypt, through the floor.

The fact is, people want to move encrypted volumes across platforms. It's not more secure than anything else, but it presents a workflow that might actually be more secure, due simply to its ease of implementation.

You're right, security is all that matters here, and folks aren't going to be secure if it remains impossibly difficult to be secure.

The reference you just provided, with your "are you going to tell Markus Gattol he's wrong" quip, starts out as follows:

Block-layer encryption, also known as "whole disk encryption", "on-disk encryption" or "full-disk encryption"

I think you're just trolling. Sorry, I didn't even finish reading your comment.

Amusingly (edit: given how you cited it), there is in fact cryptographic stuff wrong in that article.

> Amusingly, there is in fact cryptographic stuff wrong in that article.

Oh yeah? That it wasn't personally endorsed by you?

(IMO) In that what it says about what specific things do does not correspond with objective reality. (As a product of the enlightenment I believe that such a concept exists and that correspondence with such is what makes something "wrong" or "right." ;-) If you want to know why, just look at what the article says, then observe objective reality, and see how they differ. The "Salt, Stretching" section is one place to look, it's the only one I really bothered reading.
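For readers who haven't met the "Salt, Stretching" terminology: salting makes identical passwords derive different keys (defeating precomputed tables), and stretching multiplies the attacker's cost per guess by iterating the hash. A minimal stdlib illustration using PBKDF2 (the iteration count here is illustrative, not a recommendation):

```python
import hashlib
import os

password = b"hunter2"

# Salt: two users with the same password get different derived keys,
# so a precomputed (rainbow-table) attack buys nothing.
salt_a, salt_b = os.urandom(16), os.urandom(16)
key_a = hashlib.pbkdf2_hmac("sha256", password, salt_a, 200_000)
key_b = hashlib.pbkdf2_hmac("sha256", password, salt_b, 200_000)
assert key_a != key_b

# Stretching: the same password and salt, but each guess now costs the
# attacker ~200,000 hash invocations instead of one.
fast = hashlib.pbkdf2_hmac("sha256", password, salt_a, 1)
slow = hashlib.pbkdf2_hmac("sha256", password, salt_a, 200_000)
assert fast != slow
```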

Minimum viable non-snarky answer is: read what Rogaway says about the block-level crypto modes here:


You'll know the part I'm referring to because it reads practically as a response to a chunk of Gattol's page; it's a problem shared by the Wikipedia coverage on full-disk encryption.

Being cagey about it (i) motivates you to actually read the paper and (ii) avoids what would inevitably be an extremely unproductive debate.

(This is a fantastic survey, by the way; if you're interested in crypto, bookmark it forever.)

My amusement about Gattol's page has nothing to do with Gattol; it's just the way his page got used in this thread by someone else, as a sort of rhetorical "fatality" move. I'm confident Gattol is much smarter than I am. I say that because Dimino also tried to take this thread to Gattol on Twitter, too. :)

I figured if someone were besmirching my good work, I'd want to know about it, is all. I don't think one should be able to call someone wrong without giving them a chance to disagree.

Furthermore, this is devolving into a schoolyard "I know but I'm not telling" situation. To put it another way, I don't think there is any significant error on Gattol's page, nor is there any significant error on the Wikipedia page on full-disk encryption.

I'm very glad to be shown to be incorrect on this point, but I doubt I would be, at least by you.

I'm interested in this area, both professionally and as a genuine curiosity, but every time I run into you it's a negative experience. I'd like that to stop happening.

That's a mighty fancy way of saying "it's wrong" without pointing out examples.

But hey, I'm not ultimately the guy who wrote the article. I've poked both the author and Thomas on twitter -- maybe Thomas can help the author correct any inaccuracies. You, too, could help I'd imagine, if you would be more specific about the issues. I'm guessing the author wouldn't want his article to remain inaccurate if folks could point out the specific issues.

It's like I'm talking to a child when I talk to you. I wouldn't bother if you didn't constantly pop up to derail conversations about crypto with your "crypto should stay hard" attitude. I suspect you want crypto to remain impossible to get right just so you can continue to stay relevant.

The rest of my comment frankly wasn't for you, it's for the folks reading what you write and blindly accepting it. At least now they can see how petulant you can be when facing differing points of view.

Up until this comment, I thought you might have a point against tptacek, but now it's clear you are trolling. Or at least, not operating under the "give your opponent a charitable interpretation" rule of HN.

I admit I tilted. It's just frustrating every time I talk to him or see him talk to others about anything that's even remotely contradictory to what he's said, it always devolves into something as inane as what happened here.

I shouldn't have tilted. :(

Is it still considered 'full disk' when it's only used with a container file? I've never used TC's full-disk mode, but I've used it to quickly and easily create mountable disk images (even without encryption this would be handy). To my ear, 'full disk encryption' is something a hard drive's firmware should be involved in.

It's "block level" encryption.

Some folks call that "full disk encryption", but since there's a separate feature in TrueCrypt that calls itself "full disk encryption" and is actually encrypting the entire disk, to the point where TrueCrypt has to supply a boot loader to decrypt, it's probably reasonable to want to differentiate the two.

Thomas doesn't see the difference because it's all "block level" encryption, and apparently the only thing in the world that matters is crypto (rather than the presentation and adoption of crypto), but the difference is mainly in the boot loader aspect.

What are you talking about? It needs a boot loader if it encrypts the OS partition, which is orthogonal to whether it encrypts entire physical discs or not. That feature is not called "full disk" anywhere I can see. "System Encryption" or something.

> That feature is not called "full disk" anywhere I can see. "System Encryption" or something.

See above citation.

No it's not, I use TrueCrypt all the time but not to encrypt my disk. Are you talking about the volume-level encryption TrueCrypt offers?

Are you really trying to suggest the world shouldn't have a tool like TrueCrypt out there?

I have no idea why you think it's productive to litigate the difference between "block-level encryption" and "full-disk encryption", but if it makes you feel better we can just pretend we switched the terms, because my point applies equally to them --- they're synonyms.

I also have no idea where the "I'm telling the world there shouldn't be a tool like Truecrypt" came from. I think you've misread me.

I never said you were telling the world anything.

I'm asking you a question to clarify your stance.

And yes, there is value in noting the difference between block-level and full-disk encryption, mostly because they're different.

Interesting. How?

Size, software used. The crypto might be the same, but this isn't just about the crypto itself.

If you're talking about a security product -- which TrueCrypt is -- the first metric you have to concern yourself with is: does it keep you secure? The user experience and the adoption and the performance and all that other fun stuff is irrelevant if the product doesn't do the one thing that every user unequivocally requires of it.

So yes, it's not just about the crypto...when the crypto works. But when the crypto is insecure, which is what tptacek is saying, then yes, it is ONLY about the crypto.

NB: I'm plenty qualified on UX and general technical matters, but on whether crypto is secure, I defer to the experts.

No one knows about the cryptographic integrity of TrueCrypt, as the person/persons actually doing the work only got their act together today.


My only point has been that Thomas, et. al. have been telling us we don't want something like TrueCrypt, despite the fact that we very clearly do. His suggestion of "just use PGP and FS level encryption" is absurd, but NOT from a crypto standpoint (I, like you, defer to Thomas and the other experts on the integrity of the crypto itself). It is, however, absurd from a UX/workflow standpoint.

Horseshit. Round 1 and Round 2 of the audit share technical members. The guy leading the actual crypto review work has been looking at Truecrypt for more than a year. And Matthew Green, who coordinates the whole audit project, just wrote that he and his students have been reviewing Truecrypt's crypto for months.

They did not "only get their act together today". They've thought about Truecrypt far more rigorously than you have, and for far, far longer.

You've been almost completely unable to explain in technical terms what "UX" you want from sector-level crypto that you couldn't get from filesystem crypto. When pressed, you in effect say "yeah, well, name a tool that does that".

The fact that your only options today are [insecure, easy] and [secure, difficult] does not mean that there is no [secure, easy] option possible. But militating in favor of insecure crypto goes a long way towards hiding that possibility from everyone.

This isn't a pedantic point. Ross Ulbricht just got reamed in federal court because a simple physical arrest compromised virtually every secret he had. Why? Because he was relying on sector-level all-or-nothing crypto. By encouraging people to rely on tools like Truecrypt, you are, in a very small but real way, endangering them.

Today was the day Matthew Green released an update on his blog.

I was just reading it, and that's exclusively what I was referring to. I look forward to the results and am grateful for the time they're spending. I hope they find nothing.

Not sure why you made this about me personally.

> Not sure why you made this about me personally.

Your comments in this thread come off as ridiculously aggressive. I'm not sure if you're aware of that.

There is admittedly a level of aggression I feel when talking to Thomas, as I find his conversational tone off-putting and generally elitist.

I thought I did a better job of dealing with that for the most part, however. Maybe not.

This is what you actually wrote:


I stand by what I just wrote.

That's still up there in my comment you replied to, you realize that right?

This isn't a pedantic point. Ross Ulbricht just got reamed in federal court because a simple physical arrest compromised virtually every secret he had. Why? Because he was relying on sector-level all-or-nothing crypto.

That is not accurate.

... because...

There was a bit more to it than just that. He could have used block-level encryption relatively safely if he'd made a series or hierarchy of Truecrypt containers and mounted them only when needed, rather than putting everything on just the one block device.

More importantly, his physical security was lacking, as he hadn't properly considered the threat model. If he'd been working in a secured area (like a locked room) where open laptop snatching was infeasible, that would have given him enough warning to close the lid, and maybe pop the battery out. Albeit still vulnerable to a cold boot attack, if law enforcement have such capacity.

You're rambling.

TrueCrypt lets you create fixed sized encrypted volumes, and allows you to decrypt those volumes on any of the three major OS platforms.

There's nothing special about TrueCrypt in how it performs the encryption/decryption (or so we're told), but no tool besides TrueCrypt allows such a flexible approach.

And it's you who refuses to accept that [secure,easy] can exist, because it'd make you irrelevant. It's a completely silly stance to take, but it's yours.

But hey, at least I've wrung your opinion on TrueCrypt out of you:

> By encouraging people to rely on tools like Truecrypt, you are, in a very small but real way, endangering them.

For posterity, in case you edit it away.

Which leads me to the question: Why are you even involved in the TrueCrypt audit, if you think it's a bad idea to use such tools?

P.S. Ulbricht was caught because the FBI owned TOR, and that's about it. Maybe your indignation towards TrueCrypt should consider Snowden's use of TrueCrypt to evade the combined allied world's intelligence community.

Would you like to put money on whether my opinion about Truecrypt is identical to Matthew Green's and Kenn White's, or would you like to include them in your critique?

It's amusing that you feel you've "wrung out" of me one of the few things I've recently blogged at length about.

Then why are any of you three working on it if you all think it's dangerous to promote its use?

You've blogged, "Don't use TrueCrypt"?

I've already answered that question, directly, on this thread.

And no, I blogged "don't use sector-level crypto". In a post literally titled "You Don't Want XTS". Under the subhed "Disks Are The Last Thing You Want To Encrypt". As in, "the last thing in the world".

The first sentence of your own article says:

> This piece is written for software designers, not end-users. If you’re an end-user looking for crypto advice: use Truecrypt, use Filevault, use dm-crypt

This is apparently where you stopped reading.

It's a great write-up, I read the whole thing. You clearly understand the domain well.

I really just don't get why you'd, in one breath, decry XTS, and then in that same breath, recommend people use TrueCrypt, which is, as you call it, "the best-known implementation of XTS".

Maybe just lead me to the water on this one. It's really the only thing left unresolved in our conversation.

Block-level encryption is a terrible, terrible approach for many reasons (which 'tptacek has referenced a million times). However, Truecrypt is the best such implementation, and it's a required approach in certain cases. You should be doing crypto at the application/filesystem level; if you can't, use Truecrypt. This isn't contradictory advice.

This is like, 89% of what I think (I don't think TC is the best, but it's not the worst).

What's weird to me is why we have a gigantic thread dedicated to the precise nuances of what I think about Truecrypt. Isn't this incredibly boring?

Mostly, except for the part where the guy who conducted phase 1 of the TrueCrypt audit said that encouraging TrueCrypt's use is dangerous and harmful.

That's not just what he said, he also said, "By encouraging people to rely on tools like Truecrypt, you are, in a very small but real way, endangering them."

No you haven't.

You also completely changed the comment I originally replied to. I much prefer your new comment, though my fundamental issue with the fact that you're working on the audit of software you think is dangerous to promote remains.

No, I did no such thing.

I don't know what you think you're accomplishing by saying "nuh uh" like this. You've done it a few times, and I don't understand, in any of these cases, why you think anyone would think you'd say otherwise.

If you'd care to elaborate beyond, "nuh uh", I'm sure we'd all be glad to hear it.

The "No you haven't" was in regard to the fact that you haven't answered why you're involved in TrueCrypt at all, if you don't think it should be used.

Yes, that's what I thought you meant. Perhaps reread the thread.

Well, you have edited a lot of your comments, so perhaps you did include this information in a later edit?


Having re-read the thread, you haven't explained why you're involved in the TrueCrypt audit, or why you recommend folks use TrueCrypt if you think XTS is bad.

Please stop calling me names.

I changed my comment somewhat, because you're being very squirmy, as per usual.

Do you think something like TrueCrypt shouldn't exist?

I'm not being "squirmy". You're playing a semantic game with the word "disk". The technical issue with FDE is that it works at the level of blocks, and so lacks information about message boundaries or the storage flexibility needed to (a) randomize the encryption and (b) store authenticators. Encrypt a physical disk, encrypt a file that pretends to be a mountable volume, same issues.
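To make that storage constraint concrete, here's a toy arithmetic sketch in Python. The sizes are typical illustrative values (a 512-byte sector, a 12-byte nonce, a 16-byte tag), not taken from any particular product:

```python
# Illustrative arithmetic: why per-sector authenticated encryption
# doesn't fit the fixed-size sector model. A sector read must return
# exactly SECTOR_SIZE bytes, but randomized, authenticated encryption
# needs extra room for a nonce and a MAC tag.

SECTOR_SIZE = 512   # bytes the disk hands back for one sector
NONCE_SIZE = 12     # random nonce needed to randomize the encryption
TAG_SIZE = 16       # authenticator (MAC tag) over the ciphertext

# Ciphertext of a 512-byte sector under a stream-cipher-based AEAD is
# still 512 bytes, but the nonce and tag must live *somewhere*:
required = SECTOR_SIZE + NONCE_SIZE + TAG_SIZE
overhead = required - SECTOR_SIZE

print(overhead)  # 28 bytes the sector has no room to store
```

Filesystem- or application-level encryption sidesteps this because a file can simply grow by those 28 bytes; a raw sector cannot.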

I get that not everyone understands the technical issues in designing storage encryption, but don't take that out on me.

Yours is a hilariously catty response to a fairly benign question.

Says the person who wrote "Are you really trying to suggest the world shouldn't have a tool like TrueCrypt out there?"

Yes, that is literally the sentence I wrote, and a sentence you never responded to.

Full-disk encryption is block-level encryption. If you're using TrueCrypt to encrypt anything, you're using block-level encryption. There is no functional difference between them. If you are not encrypting your entire disk, then block-level encryption is a bad idea because 1) it doesn't provide authentication, and 2) block-level encryption (using strategies like XTS) is not as strong as regular authenticated encryption using CBC and a MAC or whatever.
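For illustration, here is a minimal encrypt-then-MAC sketch in Python. The keystream is a deliberately toy stand-in (it is NOT a real cipher; a real design would use AES-CBC or similar), but the structure it shows is the point: a fresh random nonce, an HMAC over nonce plus ciphertext, and MAC verification before decryption. This is what file-level authenticated encryption provides and sector-level schemes cannot:

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy keystream (NOT secure): stands in for a real cipher."""
    block = hashlib.sha256(key + nonce).digest()
    return (block * (n // len(block) + 1))[:n]

def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt-then-MAC: encrypt under a fresh random nonce, then
    authenticate nonce + ciphertext with HMAC-SHA256."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(enc_key, nonce, len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, ks))
    body = nonce + ct
    tag = hmac.new(mac_key, body, hashlib.sha256).digest()
    return body + tag

def open_sealed(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    """Verify the MAC before touching the ciphertext; reject tampering."""
    body, tag = blob[:-32], blob[-32:]
    expect = hmac.new(mac_key, body, hashlib.sha256).digest()
    if not hmac.compare_digest(expect, tag):
        raise ValueError("authentication failed: data was modified")
    nonce, ct = body[:16], body[16:]
    ks = _keystream(enc_key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

ek, mk = b"e" * 32, b"m" * 32
blob = seal(ek, mk, b"secret file contents")
assert open_sealed(ek, mk, blob) == b"secret file contents"

# Flip one bit: decryption refuses instead of returning silent garbage.
tampered = bytes([blob[0] ^ 1]) + blob[1:]
try:
    open_sealed(ek, mk, tampered)
except ValueError as e:
    print(e)  # authentication failed: data was modified
```

A sector-level scheme like XTS has nowhere to put the nonce or the tag, so it can offer neither the randomization nor the tamper detection shown here.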

If you're not using TrueCrypt for full-disk or full-volume encryption, you'd be better off using basically anything else. There are plenty of cross-platform tools for doing that kind of thing.

Pedantic, but hopefully in a fun way:

Authentication is the biggest problem with sector-level crypto, but the other technical problem with encrypting sectors is that you don't get a place to store the metadata you'd need to randomize the encryption, and so you lose semantic security as well. If you squint at it the right way, XTS is the ECB mode of sector-level (wide-block) crypto schemes.
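The ECB analogy can be demonstrated with a toy deterministic per-sector scheme in Python (a stand-in keystream, not XTS itself). Because the only "tweak" is the sector number, writing the same plaintext to the same sector twice yields identical ciphertext, so an observer watching the disk learns when a sector's contents repeat or revert:

```python
import hashlib

def fde_like_encrypt(key: bytes, sector_no: int, data: bytes) -> bytes:
    """Toy deterministic per-sector 'encryption' (NOT XTS, NOT secure):
    the keystream depends only on the key and sector number, with no
    random nonce, mimicking the determinism of sector-level schemes."""
    ks = hashlib.sha256(key + sector_no.to_bytes(8, "big")).digest()
    ks = (ks * (len(data) // 32 + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"k" * 32
a = fde_like_encrypt(key, 7, b"\x00" * 32)
b_ = fde_like_encrypt(key, 7, b"\x00" * 32)
c = fde_like_encrypt(key, 8, b"\x00" * 32)

# Same sector, same plaintext -> identical ciphertext (structure leaks);
# different sector number -> different ciphertext.
print(a == b_, a == c)  # True False
```

Real XTS leaks at the 16-byte block level rather than the whole sector, but the underlying deficiency is the same: no randomization, so equal inputs at equal positions produce equal outputs.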

Can you name some of those cross-platform tools?

On which OS? Unless your adversaries are state actors, BitLocker on Windows works just fine for both the internal and any removable drives.

Yes, it's not perfect, and if you use it and let your devices go to sleep rather than power off, you are vulnerable to various memory-access and cold-boot attacks.

However, with a TPM or USB storage for the keys and Secure Boot enabled, BitLocker offers one of the better data-encryption capabilities out there.

Yes, MSFT might have backdoored it, TPM is an oxymoron, and all that might be true. However, if the only thing you are worried about is what happens to your data if your device gets lost or stolen, you are probably more than fine with this setup.

If you are worried about the NSA having your files, well, there probably isn't much you can do about it; if they want them, they'll get them. Whether it's by backdooring your OS or internet connection, or by sending intelligence support activity assets to your home to tap into it while you sleep :)

There is TrueCrypt, the binary application, and TrueCrypt, the on-disk format.

TrueCrypt, the binary application, is the one that is dead, but its on-disk format will probably live on as a cross-platform on-disk format, as there are other projects out there that are supporting it and more projects will probably follow.

tcplay[1] is one of the projects that have full support for TrueCrypt, the on-disk format.

If you already have a TrueCrypt volume and you don't want to use TrueCrypt, the binary application, then you can search for these alternatives and start using them.

[1] https://github.com/bwalex/tc-play

VeraCrypt seems a successful fork, cross platform and supporting truecrypt containers.

Didn't TrueCrypt throw in the towel?


The way it was closed was suspicious enough for me to wonder if they were driven to close by the government because it was one of the methods they couldn't crack efficiently.

This is the general assumption I've been under. When someone goes from "This is a secure product being actively developed" to "USE THIS PRODUCT FROM MICROSOFT INSTEAD OF THIS. THIS PRODUCT IS BAD. BAAAD", then well, yeah. That's sort of the canary.

No, it's not.

So do you know more about the sudden "TrueCrypt is not secure" thing (http://www.theregister.co.uk/2014/05/28/truecrypt_hack/) and can say definitively why the creators did that?

You can see elsewhere on this thread where I'm coming from and what I think about the project.

I appreciate the time you spent on the project (and on this thread!). However, I don't see this issue specifically addressed: The following was posted on TrueCrypt's SourceForge page [1]; I don't see how it's not a 'canary' (well, technically it's not because it's a direct message) and how users can trust TrueCrypt. Until this is resolved, every other discussion of TrueCrypt's future seems moot.

WARNING: Using TrueCrypt is not secure as it may contain unfixed security issues.

The development of TrueCrypt was ended in 5/2014 after Microsoft terminated support of Windows XP. Windows 8/7/Vista and later offer integrated support for encrypted disks and virtual disk images. Such integrated support is also available on other platforms (click here for more information). You should migrate any data encrypted by TrueCrypt to encrypted disks or virtual disk images supported on your platform.

[1] Discussed here:

* https://news.ycombinator.com/item?id=7828107

* https://news.ycombinator.com/item?id=7812133

* https://news.ycombinator.com/item?id=7814725

Seconded. I was quite puzzled by this whole issue when it happened and surprised to see that there hasn't been much clarification since. The developers who posted this message are real people whom others have been in contact with since, correct? (such as https://www.grc.com/misc/truecrypt/truecrypt.htm ). There was a very unambiguous claim of _existing_ security vulnerabilities in the EOL announcement. Have the developers explicitly refused to elaborate on this? Is there no reference to these concerns in the dev mailing list or elsewhere? Have they refused to take ownership for the statement?

People involved in the TC audit project have talked to the developers. The developers do not have an ideological investment in Truecrypt. They're just developers. They built Truecrypt to scratch an itch, found that supporting it was a largely thankless task, and then watched their software get at least 80% obsoleted by modern operating systems.

They don't owe anyone on HN or anywhere else any kind of "ownership of statements" or explanation. They published some source code. They got sick of it. They've moved on. It's over.

I understand the urge people have to synthesize a soap opera narrative out of things on message boards --- that's fun, after all, and the alternative is boring. But that's all the conversation about the TC project abandonment really is: a synthesized soap opera.

As I re-read your response it is seeming stranger and stranger. It seems you, too, are refusing to answer the straightforward question put forward. If no-one has asked the devs to explain this claim of potential vulnerabilities, so be it. But if everyone related to the project is recusing themselves from this question I think it's quite reasonable for observers to be interpreting that as a red flag.

(Also to clarify, when I said 'take ownership of this statement' I was referring to the earlier conjecture that the message was written by people other than the original developers.)

I appreciate your response, but I think also you are misrepresenting the events here. If the developers simply moved on that would be one thing, but what actually happened is they also made this claim:

   Using TrueCrypt is not secure as it may contain unfixed security issues.
I don't think the developers owe anyone anything, in terms of supporting this project or even justifying their decision not to support it. But they did make this claim, which they didn't need to, which casts the entire project in doubt. You're saying they are refusing to elaborate on this claim? Or has no-one asked them to? Because I think this thread is evidence enough that plenty of people want to know.

I don't see anything wrong with that quote. They're simply saying it's no longer being developed, and therefore security issues could arise in the future and go unfixed.

However, I still use TrueCrypt because I'm familiar with it, don't believe it has been compromised, and I trust a random pickpocket will be unable to break it.

You mean to say, "Move on, everyone, nothing to see here"?

The circumstances were fishy, in a way that says "don't trust the software" to anyone who has anything to hide.

More or less yes, that is what I think. If you don't want to trust Truecrypt, I certainly wouldn't argue that point. I don't use it either.

I read your other comments on the project. And most of what you are conveying seems to be about the audit project specifically, which you're involved in.

My particular curiosity is about that announcement, and whether or not it was a government attempt to discredit a likely very effective product.

There was some talk about how BitLocker in newer versions of windows removed an "elephant diffuser" component or something, was that ever explained properly?

The "diffuser" was an attempt to provide last-ditch data integrity for a system that is fundamentally incapable of providing real data integrity. XTS doesn't provide integrity either.

Long story short: that change does not matter much.

Oh, the diffuser matters a lot actually. Your former colleagues (I think?) proved this by blindly popping calc on a bitlocker-protected Windows 8.1!


With the diffuser, we have ~9 years of conjecture and speculation, with no one overly certain that attacks are possible. Without it, we have calc.exe fairly quickly after someone got the idea to try. You can't say these are roughly the same in practical terms.

If you're not going to trust the developers when they tell you that's not the case, why would you trust them enough to use TrueCrypt?

Perhaps this is a naive question:

What is the connection between truecrypt and windows xp?

Just that it's the last mainstream operating system for which there is no credible alternative to Truecrypt. What Truecrypt does has become a basic feature of modern operating systems.

As far as FDE goes, yes. But TrueCrypt was rather unique in that it provided (then) trusted cross-platform encrypted containers, not necessarily encrypted disks.

I spent several hours last night trying to find a trustworthy, free tool to encrypt some files that would make them accessible across all my OSes (I use Windows, Linux, _and_ OS X). After looking at CipherShed and VeraCrypt, I didn't find confidence in them, so the closest thing I could come up with was one-off-style encryption -- e.g. GPG. And I just don't see a good workflow wherein every time I want to access a 3 MB file I have to decrypt, untarball, access, retarball, and re-encrypt 100 GB of files. Or write a script that's going to sort through a couple hundred files to ensure they're all encrypted without tarballing.

So yeah, now I jump through hoops using one OS's encryption mechanism, and then kinda-sorta sharing the porn to the other OSes in a way that I hope is semi-secure and I try to make sure I clean up after myself.

Edit: In the time I wrote this, I see a huge discussion has blown up around TrueCrypt's containers/block encryption. If anyone has a tool that can encrypt a collection of files and make them transparently accessible to other apps in a cross-platform way, I'm all ears.

Not as an open-source feature though. I am happy with dm-crypt on Linux, but surely the Windows and MacOS built-in encryption methods are not to be trusted by people who might be targeted by the state?

Considering all the proprietary firmware on a standard laptop/desktop/router/modem you're pretty much screwed if targeted by a state. States build SCIF rooms with armed guards to protect against other states. We have FDE and PGP to protect against thieves.

Was looking forward to a cryptanalyst's ripping apart of the "cascading ciphers" TC shilled to see if it was snake oil or not.

Some people think it's a warrant canary of sorts, trying to convey a message that the project had been compromised and was subsequently shut down. But that's all speculation without any evidence to back it up so we may never know.

Like tptacek said, versions of Windows after Windows XP shipped with their own disk encryption functionality. XP needed TC for that.

Next to that: the announcement to discontinue the product was very close to the MS end-of-life announcement for XP.

Remotely related is the difficulty researchers had compiling TC from source and matching the official binaries. The original devs used Visual C++ 1.52 (1993) leading some to believe that the build system for TC still ran on Windows XP and was not up-to-date enough anymore.

I think Windows Vista introduced encryption on NTFS volumes.

I think the key was full disk encryption. I _think_ XP supported individual file encryption.

Newer versions of Windows include BitLocker which provides full disk encryption. http://en.wikipedia.org/wiki/BitLocker

The first sentence says it all: it is _not_ available in all of the versions after Windows XP. Combining that with the fact that it is not open source and it is not cross-platform, it's basically useless to me.

From now on, I'm sticking with TrueCrypt version 7.1a, which I downloaded from https://truecrypt.ch/, on all of my machines.

So, an open-source product which the developers have publicly declared as insecure, and on which a major code-audit has stalled inconclusively, is superior to closed-source _how_, exactly?

Not saying you're wrong, per se, just that it seems like all the choices are wrong, here.

The target platform for TrueCrypt was probably Windows, because of the lack of alternatives? That's the impression his last email gives.

As far as I know the audit project was abandoned when the TrueCrypt developer did his dramatic exit. Cannot remember a source for that though.

No, the project was not "abandoned" like that. What source told you that?

The wording on the top of http://truecrypt.sourceforge.net/ leads me to believe that TrueCrypt is no longer being developed, and bugs (including vulnerabilities) are not being fixed.

If that's the case, what's the purpose of the audit?

Thank you, tptacek, for the huge amount of time you're spending answering questions in these threads.

Lots of people still depend on Truecrypt, and have depended on it in the past, and those people need to know whether it was ever safe to do that, and how urgent it is for them to migrate to something better.

That makes a lot of sense. Thanks!

Apparently none! Apparently that idea made it into my head somehow, I must've just jumped to conclusions and assumed I read it somewhere. Oops!
