Truecrypt report (cryptographyengineering.com)
479 points by java-man on Apr 2, 2015 | 174 comments



So there is now a complete truecrypt audit.

What is the checksum of the source tree that I can use to verify that I have exactly the copy they audited?

I looked at the full report PDF and saw no mention of downloads, binaries, source trees or checksums.


There's a paragraph in the Phase I Audit Report (published a year ago) which includes a checksum:

> The iSEC team reviewed the TrueCrypt 7.1a source code, which is publicly available as a zip archive (“truecrypt 7.1a source.zip”) at http://www.truecrypt.org/downloads2. The SHA1 hash of the reviewed zip archive is 4baa4660bf9369d6eeaeb63426768b74f77afdf2.

The Phase II report (today's release) claims to be auditing 7.1a, so I assume it's exactly the same version and ZIP file.

Last June, they published "a verified TrueCrypt v. 7.1 source and binary mirror", including file hashes, on GitHub: https://github.com/AuditProject/truecrypt-verified-mirror

I just cloned that repo and inspected the source ZIP; the SHA1 sum matches what they quote in the report.
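For anyone who wants to repeat the check locally, here's a minimal sketch using Python's standard hashlib (the local filename/path is an assumption; use whatever you saved the archive as):

    import hashlib

    # Expected value quoted in the Phase I iSEC report (see parent comment).
    EXPECTED_SHA1 = "4baa4660bf9369d6eeaeb63426768b74f77afdf2"

    def sha1_of(path, chunk_size=1 << 20):
        # Stream the file through SHA-1 so large archives don't need to fit in RAM.
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # Filename as quoted in the report; adjust the path to your local copy.
    digest = sha1_of("truecrypt 7.1a source.zip")
    print(digest, "matches" if digest == EXPECTED_SHA1 else "DOES NOT MATCH")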


The TL;DR is that based on this audit, Truecrypt appears to be a relatively well-designed piece of crypto software. The NCC audit found no evidence of deliberate backdoors, or any severe design flaws that will make the software insecure in most instances.

That doesn't mean Truecrypt is perfect. The auditors did find a few glitches and some incautious programming -- leading to a couple of issues that could, in the right circumstances, cause Truecrypt to give less assurance than we'd like it to.


Yeah, that's right on top of the post, but you posted it as if you came up with it yourself...


Plenty of people check comments before the article. The summary was helpful.


Yes, the summary was helpful. I think GP is merely saying "please make it clear that you are cutting-and-pasting from the post itself"


Yes and plenty of people make it clear when they're quoting the article.


You are right. I should have quoted the excerpt. Sorry.


Just edit the post, and use italic markup :).


Can you still edit the post?


No, I can't. I think the ability to edit disappears after a while.


It's the third paragraph, not right on top.


Are you javaman, as in, #cdc javaman ?


No.


For what it's worth, regarding the recommendation in page 14 of the report, there is a portable implementation (well, direct port of the SSSE3 code) of AES-CTR in NaCl: https://github.com/jedisct1/libsodium/tree/master/src/libsod.... Don't expect it to be fast or anything, but it exists.


The problem of not complaining when the Microsoft CryptoAPI can't be initialized doesn't really appear to be something to worry about. I'd really like to hear about any known Windows configuration on which the calls can fail, and even if the calls did magically fail, the RNG uses other entropy sources, including the user's own mouse movements, which are specifically requested from the user before the key is generated.


> I'd really like to hear about any known Windows configuration on which the calls can fail

Mandatory Profiles


Is that likely to intersect with TrueCrypt use in a troublesome way? If you're making the profile, don't do that; if somebody you don't like is making the profile, you're already boned.


> Is that likely to intersect with TrueCrypt use in a troublesome way

/shrug It's certain that if you're operating inside a mandatory profile you are at the mercy of whoever created it... but that doesn't mean that people don't try to circumvent restrictions placed on them by their Administrators.

It's also not clear to me just how common Mandatory Profiles actually are.


Very good to know. Does anyone know the status of the Truecrypt forks? Although it has proven reliable and doesn't need a lot more functionality, it will eventually break without further development.



https://truecrypt.ch/ isn't actually a fork; they've done no development, the people who made the website aren't developers, and they have no plans to work on it at all. I don't know why everyone keeps linking there. They're also using the truecrypt name which was the ONE thing the original developers said they didn't want forks to do.


>They're also using the truecrypt name which was the ONE thing the original developers said they didn't want forks to do.

Well right, they're not a fork so it's not the same kind of problem if they use the name. They're a fansite.


It'll come in handy for the NSA or some phishers some day


Has the licensing situation been clarified? The original license was not open source, but it seems unlikely that it would be enforced, as that would involve de-anonymising the copyright holders.


I don't think anyone has a problem with forks just changing the name of TrueCrypt (see VeraCrypt) and starting from there.

There is a lot of discussion on the license on the VeraCrypt forums.

It's completely reasonable to ask people to stay away from the TrueCrypt name, in my opinion, so that the software's reputation would not be tarnished if amateurs or problem-causing people took over the TrueCrypt name.

I wish the TrueCrypt end was handled better, but its unique situation has some educational value.


There's nothing to clarify. You've summarized it correctly.


I think perhaps by 'clarification' he was hoping for resolution


It is resolved. Just not in the way you want.



There are independent implementations of the TrueCrypt on-disk format, like tcplay[1], cryptsetup[2] and zuluCrypt[3].

TrueCrypt, the binary application, may break without further development, but its on-disk format will continue to thrive as more applications pick it up as a means to support a cross-platform encryption format.

[1] https://github.com/bwalex/tc-play

[2] https://gitlab.com/cryptsetup/cryptsetup/blob/master/README....

[3] http://mhogomchungu.github.io/zuluCrypt/


Explained for a layperson: should I be using TrueCrypt, or is LUKS okay? Anyone here use TrueCrypt for whatever reason?


Report coauthor here.

No significant issues were found in either phase of the TrueCrypt audit. If you're using it today (or have used it in the past), I don't think you have anything to worry about.

But it is an unmaintained piece of software, and for that reason I would migrate away from it. If I were setting up a new laptop today, I wouldn't consider installing it. If I had an existing laptop using it, I would think about transitioning when I had some spare time.

Not a hair-on-fire problem, though.


All else being equal it would make sense to move. But all else isn't equal and having an audit and knowing the programmer knew their stuff enough to pass the audit (especially since this is in the crypto field) seems to be a huge benefit that other projects can't match right now.


BitLocker and Filevault were almost certainly reviewed more carefully than Truecrypt was (I didn't participate in either audit, but can confirm that both of those companies allocate significant resources to third-party audits).


Certainly reviewed more, and certainly infected by the NSA more. Until the taint of the NSA has been removed, those are outright no-gos.


I feel like a lot of people probably think the way you do: that software for which source code isn't available is trivially backdoored by NSA. In fact, huge chunks of the billion-dollar software security industry are premised on the idea that this isn't true at all: that you can, for instance, recover the control flow graph from a binary compiled program, and then from that recover an intelligible IR, and then examine that for security flaws.

Where, exactly, do you suggest that NSA could have tainted Bitlocker in a manner that wouldn't also implicate Truecrypt?


Possibly contrived example: An airgapped Windows machine. We've eliminated the possibility of any data exfiltration (all the software vulnerabilities in the world can't transmit on a downed interface), but that does not mean that the encryption itself is not defeated in some way by the existence of a "master key" or some other flaw that allows an attacker to access the data offline.

Truecrypt has now been proven to have no such flaws.

It's unlikely that Bitlocker will ever be proven in the same way.

Anyone that has a legitimate fear of that kind of attack, the bad guys with guns coming in and seizing your stuff, would use Truecrypt over Bitlocker out of an abundance of caution.


Why exactly is it hard for reverse engineers to discover how Bitlocker is keyed? If it's because it colludes with compromised hardware, why isn't that just as problematic for Truecrypt?


It isn't necessarily - but there are a lot of places to hide in a large set of binary blobs. Can any effort spent reverse engineering any program (let alone an operating system!) compete with a line-by-line audit of the source of a single program?

Compromised hardware is another matter entirely. The old rule is, "if an attacker has physical access to your machine, it's not your machine anymore", and it's no less true here.


It's funny you should mention it, because the Windows kernel is among the more comprehensively reverse-engineered codebases in the industry.


It goes like this:

NSA: "Hi, can you please let us see your source code."

(The conversation below happens in your head): <thought-bubble>

"Do I want to spend the next 15 years and all my money and life trying to stay out of jail?"

"Do I want the IRS to know about trips to the Cayman Islands?"

"Do I want my wife to know about my mistress?"

</thought-bubble>

Me/You: "Sure no problem!"

I know it's nice to think that we would take the moral high ground.

Try doing it alone with just a nice government official.


This is in no way responsive to my comment, which wasn't about whether the USG would coerce Microsoft.


Yeah, better trust US companies who are subject to NSA, which has been proven to require no backdoors in software /s


If those companies wanted to subvert your TrueCrypt installation, they would have an easy time of it.


Could you sketch how, so we can think about countermeasures?


MITM the http connection you used to download truecrypt?


I assume MITM your https connection by manipulating CAs.

Or use one of their 0days to break into your machine and get your data while it is decrypted in memory, or just steal your password.


I would assume both Windows and Mac are compromised to the level of C&C by the NSA. In a threat model that includes them I would not use either. But I would still think Windows + TrueCrypt is better than Windows + BitLocker.


Do you mean "NSA has stockpiled vulnerabilities they've discovered in Win8 that would enable them to quickly enroll a networked Win8 box into a C&C"? If so: sure, I agree.

Do you mean "NSA has implanted backdoors into Win8 that would enable them to directly enroll a Win8 box into a C&C"? If so: virtually nobody who does professional vulnerability research, myself included, agrees.

The distinction matters here, because if all you're saying is the former thing, that impacts Truecrypt just as much as Bitlocker, because it's equally true of every operating system, including Linux, FreeBSD and OpenBSD.


Does it have an implanted backdoor? I don't know. Can it give one to Microsoft and say "push it out as an update for these individuals, and if you say anything you'll end up in prison"? That is the problem with secret courts.


This seems silly, but Microsoft appears, on the surface, to have purchased Skype with the intention of selling surveillance to the NSA. They immediately replaced the whole network architecture and encryption setup with something that had weaker security and was subject to compromise by Microsoft.

That's the sort of player we're dealing with, and that's why Bitlocker is not just unprovably secure; it is best to simply assume that there are deliberate backdoors, whether any particular employee knows of them or not.


Skype's peer-to-peer networking was entirely unsuitable for running on mobile devices, the new default market. I was never able to leave it logged in on my phone before, or else I'd only have half the usual battery life.


And that's a justification for changing the way non-mobile devices handled skype?


Easy, but not as easy. I'm not saying that something is 100% secure, I'm saying that option A is better than option B.


Of course. I mean once your threat model includes secret shadowy intelligencia overlords that can bend corporations to their will, there's just no way they could tamper with an audit.


They could even force the CPU makers to include backdoors directly in the silicon ;)


That one layer of security may be compromised with no possible recourse does not mean that one should abandon consideration of the compromise of other layers that do have recourse.


See: Trusted Platform Module (TPM)

Between China and the USA, there's almost certainly hardware backdoors in place for some systems. IIRC there has been evidence of router tampering for ISP hardware shipping across borders.


TrueCrypt had a software AES implementation you could enable... if seriously paranoid.

Performance-impacting, though.


I would just use the built-in disk encryption solutions your platform offers. This is also what the TrueCrypt developer recommends.


What's your opinion of XTS in general for disk encryption?

Are there any known exploitations of weakness in XTS that might work their way back to TrueCrypt?

Most folks use TrueCrypt's disk encryption as a "file container", when they effectively want to encrypt files. Do you know of any tools (cross-platform or otherwise) that would provide the same UX as TrueCrypt while not relying on FDE?


I think it's fine. For the limited threat model FDE is equipped to handle, XTS does the job.

I'm not sure what the best answer is to your second question. I wouldn't use TC for that. I would typically use PGP, but I don't think it has a good UX for general consumption. There may be room for someone to build a better product.


Coincidentally (?), You Don't Want XTS (2014) https://news.ycombinator.com/item?id=9312599 was re-posted today, opining on this question.


> This is also what the TrueCrypt developer recommends.

You can't seriously be deferring to that? The end-of-the-line announcement was so completely out of sync with all previous communications from the TC devs, so un-TrueCrypt-like, that everything it said should be taken with a huge grain of salt. _Especially_ any advice pertaining to better managing one's security needs.


I'm not sure I know of a professional cryptography engineer who believes that what happened with Truecrypt is anything more than what that developer said. Can you cite one?


But maybe what they did today makes that impossible. They set the whole thing on fire, and now maybe nobody is going to trust it because they’ll think there’s some big evil vulnerability in the code. But now this decision makes me feel like they’re kind of unreliable. Also, I’m a little worried that the fact that we were doing an audit of the crypto might have made them decide to call it quits. - Matthew Green

TrueCrypt's developers chose not to graciously turn their beloved creation over to a wider Internet development community, but rather, ... to attempt to kill it off by creating a dramatically neutered 7.2 version ... - Steve Gibson

I have no idea what's going on with TrueCrypt. Speculations include a massive hack of the TrueCrypt developers, some Lavabit-like forced shutdown, and an internal power struggle within TrueCrypt. I suppose we'll have to wait and see what develops. - Bruce Schneier


Matthew Green does not believe there is a Truecrypt conspiracy. Source: I've discussed this with him at length as an advisor to the TC audit project.

Steve Gibson isn't a cryptographer, and your quote doesn't respond to my question.

Bruce Schneier I'll have to accept as a totally valid rebuttal to my implied point. This is something that really irritates me about Schneier, for what it's worth. He's also behind a meme that elliptic curve cryptography is less safe than RSA, when the reality is almost the opposite.


I think those quotes were a bit snotty, especially considering your stature and knowledge on this. I don't think I was trying to rebut you, as much as I was simply trying my luck at advocating that user's view. Sorry for that.


What convinces me that there is more to the story is what they haven't said.


They believe that, huh? Interesting wording.


I wouldn't say I'm "deferring" to it. It does align with what I think is the best option.


> This is also what the TrueCrypt developer recommends.

Given the context of the thread, how is this not using the developer's recommendation as an endorsement? What the TC dev said carries absolutely zero weight regardless of whether it aligns with your opinion or not.


The context of the thread is a report with my name on it.

My professional opinion is my own. The developer's opinion is also relevant (to people who don't buy into nonsensical conspiracy theories), so I mentioned it.


As far as I can gather the Dutch authorities are not able to retrieve TrueCrypt keys. I can not say the same about BitLocker and FileVault. If the US knows how to get into TrueCrypt volumes then at least they are not sharing that knowledge with foreign allied agencies.


Any input on the cryptic message the TrueCrypt developer left on their website, essentially saying we should all assume TrueCrypt to be insecure?


The developers themselves have clarified. Software that is not actively maintained should be assumed to be insecure.


link/source?


The Security Now show from the week after the event had a pretty thorough recounting of what happened. I'd recommend starting there.


Security Now! episode #459 - 10 Jun 2014

Skip to 2 minutes in if you don't want to hear the drawn out intro

https://media.grc.com/sn/sn-459.mp3

https://www.grc.com/sn/sn-459.htm


I don't have any inside knowledge, but I don't think there's any cloak-and-dagger stuff here. I think he just didn't want to maintain it anymore (for whatever reason), and he wanted to warn people away from using unmaintained software.

Again, this is just my perception.


Out of curiosity then, what would you use? I've been trying to figure out an alternative.


I'm primarily a Mac user, and I use FileVault. If I were on Windows, I would use BitLocker. I'm not sure about Linux.


For Linux, dm-crypt/LUKS for full-disk encryption (this is already the default on many distros) and eCryptfs for less complete or individual file/directory encryption.


The benefit of TrueCrypt over LUKS is that it's portable to Windows and Mac.


And it seems that LUKS supports full-disk encryption only, while TrueCrypt allows you to use encrypted containers (i.e. single files).


LUKS supports encryption of block devices. Linux has robust support for using regular files as block devices via the loopback driver, so you can have a file as an encrypted container just fine.


Surely you can use LUKS on a loopback device backed by a single file?


I did so before moving to TrueCrypt. Mounting and dismounting was a bit convoluted compared to TrueCrypt (or encfs); I remember I wrote a Python script to make it a little more convenient.

But it did work fine about five years ago, and I do not think the situation has become worse.
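In case it helps anyone, here's a minimal sketch of the kind of wrapper script described above, assuming a reasonably recent cryptsetup (which can be pointed straight at a container file, no manual losetup needed) and root privileges; the paths and mapper name are just placeholders:

    import subprocess
    import sys

    CONTAINER = "/home/user/vault.img"   # placeholder path to the LUKS container file
    MAPPER = "vault"                     # placeholder device-mapper name
    MOUNTPOINT = "/mnt/vault"            # placeholder mount point

    def run(*cmd):
        subprocess.run(cmd, check=True)

    def open_container():
        # cryptsetup prompts for the passphrase and creates /dev/mapper/<MAPPER>
        run("cryptsetup", "open", CONTAINER, MAPPER)
        run("mount", "/dev/mapper/" + MAPPER, MOUNTPOINT)

    def close_container():
        run("umount", MOUNTPOINT)
        run("cryptsetup", "close", MAPPER)

    if __name__ == "__main__":
        open_container() if sys.argv[1:] == ["open"] else close_container()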


From http://manpages.ubuntu.com/manpages/trusty/man5/crypttab.5.h...

    The second field, source device, describes either the block special
    device or file that contains the encrypted data. Instead of giving the
    source device explicitly, the UUID is supported as well, using
    UUID=<luks_uuid>.
That covers the init scripts, and it looks like recent versions of cryptsetup will "do the right thing" if pointed at a file rather than a block device.


Tomb is better for this purpose than LUKS on Linux machines https://www.dyne.org/software/tomb/

More of a direct competitor to Truecrypt.


Interesting reading, for those trying to reproduce the official binaries:

"How I compiled TrueCrypt 7.1a for Win32 and matched the official binaries" ―

https://madiba.encs.concordia.ca/~x_decarn/truecrypt-binarie...


Any news, insight or updates on why it is no longer maintained?

That always seems a bit shady and suspicious


This all happened when the Snowden leaks were coming out. I always assumed that they were served with some type of national security court order, papers, threats, whatever the Govt sends out.

Similar to what happened with lavabit. I doubt the devs did anything, more likely they were asked to do something they felt was shady. They could've seen the storm that was brewing on the horizon.


> I doubt the devs did anything

The best course of action was to do nothing, given they couldn't talk about it if they were served. Walking away was the only option.

This is pure speculation. Logical, but still speculation.


Alternatively, since accusations of being NSA sympathizer were getting tossed around pretty freely, maybe they didn't want to find themselves on the wrong end of a witch hunt.


To the contrary. It was nearly a miracle that it was maintained as long as it was, considering how underappreciated the developers were.


Frankly, I have to dismiss this report. As TrueCrypt is cross-platform and this report evaluates TrueCrypt only under Windows, this is at best 33% of a report. A fault in the Linux/Mac implementations could result in failings in the Windows version. For example, if the Mac version has issues with RNGs, volumes created on a Mac could be weak even when mounted and used on Windows machines years later.

"Linux" only appears once in the report, and then only in a footnote.



This is good to know. Thank you for the great work. I've been looking for alternatives, but I think I'll stick with TC for now.

1. Download it for your OS : https://www.grc.com/misc/truecrypt/truecrypt.htm

2. Verify the hash : https://defuse.ca/truecrypt-7.1a-hashes.htm



It's interesting that XTS mode in general is seen as a weakness – this has implications for LUKS et al. as well, not just TrueCrypt.


Report coauthor here.

The weaknesses of XTS mode are mostly limitations imposed by full-disk encryption. You should think of XTS as fine for FDE, but you also shouldn't put too many expectations on FDE. FDE is great if you leave your powered-off laptop in the back of a cab. It's not great if federal agents distract you and steal your powered-on-and-logged-in laptop at the library.

You should definitely use FDE, but you should also separately encrypt the things that are really important to you.


This is a known issue with (almost) any full-disk encryption - see e.g. our own tptacek's http://sockpuppet.org/blog/2014/04/30/you-dont-want-xts/.


"Someday you’ll leave a laptop in the passenger seat of your parked car and lose it when someone cinderblocks the window. When that happens, you’ll be glad for the failsafe of locked-at-wakeup."

Or, in my case, leave the laptop in a rather nice leather messenger bag on the seat in the bus. Very reassuring that the average opportunistic thief (or as I like to imagine, the skint teenager) basically has to wipe the hard drive and reinstall an operating system.

Now, on Linux, has anyone here been able to get luksSuspend and luksResume working with suspend to RAM on a Debian/Ubuntu system? That would be golden.

http://waaaaargh.github.io/gnu&linux/2013/08/06/lukssuspend-...

http://askubuntu.com/questions/348196/how-do-i-enable-ubuntu...


Is there a consensus project that's carrying on development?


I think the interesting lesson from this is less about crypto, more about free-software projects and how to grow them.

The Truecrypt developers supposedly left because it wasn't interesting/fun for them anymore. I believe that. Funding this audit required ~$65k in donations, probably more than the Truecrypt project ever saw. If you were the developer of a project that you knew was solid, and that you knew had no backdoors, how would you feel about the people essentially maligning you being able to raise more cash than your side project has ever seen? That'd make me want to quit too.

At the end of the day, which is preferable -- a TrueCrypt that was never audited professionally, or a TrueCrypt with active developers?

How can we ensure security in open-source software without driving the developers away in the future?

I think one way would be to match every single audit donation with a donation to the upstream developers. If it's worth spending a dollar to audit software, it's worth spending a dollar to keep that project alive and show the developer you care. It would have taken twice as long to get the audit funded, but maybe then the developer wouldn't have been hounded away.

Personally, though, I think this audit was a colossal waste of time and resources. All it told us was something every truecrypt user was assuming already, and it cost us all Truecrypt. What guarantee is there any of the new developers are going to be as trustworthy as the original developers, or as skilled?


The Truecrypt Audit Project did not "cost you Truecrypt". The primary reason the project was abandoned was that it's a 3rd party package that implements something every operating system now implements for itself, better than Truecrypt could. There is a mythology among Linux Truecrypt users that the whole project was an effort to create a cross-platform encrypted disk scheme to free users from the tyrannical yoke of, I don't know, LUKS, but I've seen zero evidence that such a goal was particularly important to the actual TC developers.


Alas, TrueCrypt was/is the only free solution that allows you to encrypt a USB flash drive (or create an encrypted container file on it) and thus exchange data between a Mac, a Windows machine, and a Linux machine[1]. Maybe the number of people who need/want that is small compared to the number of people who want some form of whole-disk encryption, but if you are one of those people, you are pretty much screwed without TrueCrypt.

It would be really nice if there were some common container format you could mount natively on these operating systems without the need for 3rd-party software. I think I remember reading about LUKS supporting TrueCrypt's container format - but that, of course, only works on Linux. And frankly, I don't see Apple and Microsoft either supporting the TrueCrypt container format natively or getting together to define a new one.

Plus, BitLocker is not available on Vista Business and Windows 7 Pro. So if you don't want to use Windows 8, again, you are kind of screwed.

[1] At least it was/is the only software that I know of which allows you to do that. If there are alternatives, I would very much like to hear about them!


> I think I remember reading about LUKS supporting TrueCrypt's container format - but that, of course, only works on Linux.

"LUKS" stands for "linux unified key setup" and it is an "on-disk format" and its reference implementation is found in cryptsetup[1]

LUKS on-disk format is supported in windows through doxbox[2]

cryptsetup also supports TrueCrypt,the on-disk format.

people usually say "LUKS" when they mean cryptsetup as it makes no difference to them but i think its important to know the difference.

[1] https://gitlab.com/cryptsetup/cryptsetup/wikis/Specification

[2] https://github.com/t-d-k/doxbox


Thank you very much for clearing that up!


...exchange data between a Mac, a Windows machine, and a Linux machine.

Just use PGP, probably the gpg [0] software. You can set it up so that each machine knows the others' public keys, or you can take a shortcut and copy the same private key to all your machines. Encrypt and decrypt as you like. This is "hacker" advice, however; gpg is not a consumer-oriented tool. You'll have to read some documentation and keep your wits about you. But that was true of TC as well, wasn't it?

[0] https://www.gnupg.org/


I am aware of PGP, and yes, one could do that, but... a) While I am not afraid of reading documentation, TrueCrypt has always struck me as being really, really simple to use. b) The whole point of an encrypted filesystem is that cleartext data never hits the disk/flash drive/whatever. Of course you can prepare your data, encrypt it with gpg, transmit it in whatever way you choose, then decrypt it. But then you have on-disk plaintext data, which in some cases is just not desirable.

Just to be clear, I appreciate your suggestion, and in many cases it is entirely sufficient.

But for me, right now, TrueCrypt hits kind of a sweet spot in terms of portability, convenience and security.


Agreed. My flash drives have a (portable) copy of TC on them and a container. I don't consider it secure (how could I, considering that I use it for shuttling things to/from machines I don't own), but it's a whole lot better than nothing.

If there's a replacement for this use case I'd really like to know about it.

Ditto:

If you're using FDE for your system disk, it is desirable to know that, in the event of the OS becoming unbootable, you can open it using any old OS. Especially in the case of laptops, where you won't have any guarantees about what other machines are available.


What is "secure" depends on the scenario you have in mind. I keep some work-related on my flash drive encrypted, in case I loose the flash drive on the train or something like that. There was a case a couple of years back where somebody in the UK, I think, lost a flash drive with very sensitive data, like, medical records of prison inmates or something like that, on a train, unencrypted - the data I carry is far less sensitive, but this is precisely the scenario I want to be protected from.


Yep.

As I said - it's a whole lot better than nothing. The "insecure" thing is mainly to remind myself that it isn't the be-all and end-all.


Based on the other comments you've made in this thread, it seems like you wouldn't have recommended anyone to use Truecrypt even when talk of auditing it started.

Why fund an audit then? Why beat this particular dead horse?

Can you imagine being that open source developer, putting hours of your (incredibly valuable, if you're a talented crypto dev) time into a project that gets relentlessly shit on, and then watching it raise more money for a backdoor hunt than you've ever seen for it?

How would that make you feel? How would you feel when you thought about working on the code? When you thought about fixing bugs reported by people who were paid to look for them by people who openly distrust you?

That isn't a rhetorical question. I'd like you to imagine the situation the Truecrypt devs were in, try to empathize with them, and tell us about it.

I'm not trying to say Truecrypt should never have been audited, or that open-source projects in general should never be audited. But I do think this particular case was badly handled, and I don't think that's such a horrible thing to say. Things are mishandled all the time. We can always learn from our mistakes. Let's learn from this experience and make the next cross-platform crypto tool better for everyone.


Badly handled... how? Truecrypt got audited, like so many open source projects say they want to get audited, by top-caliber vuln researchers, and they didn't have to pay a dime. Instead, people who cared about the audit footed the bill.

I'm really having a hard time seeing the problem here.


Could you answer my question about how you would feel (what emotions you would experience) in a similar circumstance?


I would feel like my project got audited by top-caliber vulnerability researchers and I didn't have to pay a dime to make that happen. "Happy" is how I would feel, I guess.

Now can you answer my question?


* The massive fundraising drive for an audit did not contribute back to the project in any way. I think half, or at least 25%, of a fundraiser tangential to a project should go back to the project itself. If something is valuable enough to audit, it should be valuable enough to pay for. It seems like a massive slap in the face to developers to directly compete with them in the donations pool in such a way.

* As far as I can tell, the audit project began without the consent of the Truecrypt developers. You can argue that this is necessary for some security reason, but it is also a rude gesture to an open-source project.

* The crowdfunding drive contained no provisions for handling the results of the audit, such as developer resources, etc. The existing developers were assumed to be both capable and motivated to drop whatever they were spending their free time working on to fix things that auditors found.

* The atmosphere around the audit push from the beginning seemed very negative. There was a distinct "Truecrypt is super sketchy" vibe in the HN comments, etc., surrounding it initially. I think that this was a good thing for the audit, which meant helping the audit directly hurt the Truecrypt brand. This dovetails with...

* Singling out Truecrypt as the target of an extensive, expensive audit seems somewhat absurd. There are several projects that could have benefited much more from an audit that weren't audited, such as OpenSSL, Linux's RNG, OpenVPN, any other TLS implementation, Tor, maybe TextSecure (though I don't know if TextSecure had enough of a userbase at the time to justify it, it certainly does now). Why were these projects not audited? I think it's because the Truecrypt devs tried to be anonymous, and low-profile, whereas these other projects are all backed by people with faces who are much harder to throw something at than Truecrypt was.

istruecryptauditedyet.com is a real page now, but what was it before? A giant NO on the screen? Is there an 'istextsecureauditedyet.com' even though TS uses novel protocols and crypto? Why not?

So, I think:

* The fundraiser should have supported Truecrypt while raising money for an audit

* The fundraiser should have incorporated costs to fix anything found as a result of an audit, with money going back to Truecrypt if nothing substantial was found

* The audit push should have been conducted with the consent, or better yet, solicitation, from Truecrypt developers

* The audit should have been conducted in a more respectful way, emphasizing the positive gain from auditing specifically Truecrypt rather than the negative aspects of Truecrypt's existing development style

In general, open source is a massive donation of time and effort and should be treated like the gift that it is instead of being taken for granted. I think that people took Truecrypt for granted and that's probably a part of the reason why its developer quit, because that is a very common reason for open-source developers to quit and I find it impossible to believe that it was not at all a factor in their quitting.

If you would truly feel happy if your project was shit on, donations effectively stolen from you, and a ton of work dumped on you that you did not ask for, but now must immediately address lest you be attacked for "ignoring the results of the audit," you are a special type of human to say the least.


TrueCrypt has portable volume images. It's nice to make a container on Mac OS and give it to a Windows user and know it will mount as a filesystem or drive.


This is an abuse of FDE. If you're moving archives from one OS to another, use userland file-level encryption. There are a bunch of reasons for this: you'll get better crypto primitives, you get cryptographic integrity protection (XTS can't offer that), and the crypto code you'll be using will be simpler and thus easier to audit.
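To make the integrity point concrete, here's a minimal sketch of userland, authenticated file encryption using the third-party Python cryptography package (my choice of library for illustration, not something the parent comment endorsed); unlike XTS, any tampering with the ciphertext is detected at decryption time:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_file(path, key):
        nonce = os.urandom(12)              # 96-bit nonce; never reuse with the same key
        with open(path, "rb") as f:
            plaintext = f.read()
        with open(path + ".enc", "wb") as f:
            # The GCM authentication tag is appended to the ciphertext.
            f.write(nonce + AESGCM(key).encrypt(nonce, plaintext, None))

    def decrypt_file(path, key):
        with open(path, "rb") as f:
            blob = f.read()
        nonce, ciphertext = blob[:12], blob[12:]
        # Raises cryptography.exceptions.InvalidTag if the file was modified,
        # which is the integrity guarantee a sector-level XTS volume can't give.
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    key = AESGCM.generate_key(bit_length=256)  # in practice, derive from a passphrase via a KDF
    encrypt_file("archive.tar", key)           # "archive.tar" is a placeholder filename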


Can you give an example of a file encrypter / decryptor that has a portable version and is cross-platform?

Also, file encryption/decryption fails massively for some things. What I want is something that implements a filesystem. Otherwise things with many files get... annoying. Not to mention less secure.

If there's a better way to do this then let me know.


You can use 7zip to archive, compress, and AES-256 encrypt.

You can also use GnuPG or openssl to encrypt a file or a disk image.
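For instance, a rough sketch of driving gpg's symmetric mode from a Python script (the filenames are placeholders; gpg will prompt for a passphrase):

    import subprocess

    # Encrypt: writes backup.tar.gpg next to the input file.
    subprocess.run(
        ["gpg", "--symmetric", "--cipher-algo", "AES256", "backup.tar"],
        check=True,
    )

    # Decrypt to a new filename.
    subprocess.run(
        ["gpg", "--output", "backup.restored.tar", "--decrypt", "backup.tar.gpg"],
        check=True,
    )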

Lately I took a look at actual filesystems for cross-platform use. I didn't find a great one, but depending on which OS you want to burden with an extra driver, pretty good ones include exFAT, NTFS, and UDF specifically version 2.01. Like you, I am curious whether better answers are available.


Again, those don't work for filesystem purposes. One of the nice things about TC is not only that it's cross-platform, but that it functions as a mountable directory across all of them.


When the brouhaha about TrueCrypt started, I wrote a zsh wrapper for "mounting" GPG-encrypted tarballs as directories: https://gist.github.com/gcv/b7d72023a851f2edf951

It works well as a TrueCrypt replacement for me personally, but has a couple of warts. Secure rm is slow on cleanup, and the user needs to be careful about the hidden files it uses to keep track of volume metadata.


> and is cross-platform

That's not Windows-compatible, as far as I know.

Still interesting though.


I consider the zsh scripts a proof-of-concept more than production code. It should be a relatively simple matter to rewrite the gpgd scripts in a portable manner, provided gpg and tar are available. I'm sure Windows has some kind of secure rm equivalent, too.


PGP/GPG.


Does GPG actually support mounting directories without installation?


Most Windows 7 versions (Starter, Home Basic, Home Premium and Professional) don't support full disk encryption.


Upgrade, or switch. Doesn't every version of Win8 support it?


No.


You're referring to Bitlocker, right? Don't they do something different, like iOS Data Protection, for non-Pro?


Yes, Bitlocker. There is something called "Device Encryption" in every edition, but it has some rather unusual hardware requirements (TPM and connected standby).


This fact must have been known to Matt Green et al.! Why did they waste 65k on an evidently obsolete project?

Also, (as far as I know) all public evidence for your assertion is a purported email from one of the devs to Matt Green. He might just have been polite in that mail...


Both Matt Green and I donated to the project. The reason: thousands of people actually use Truecrypt, and it's important to know how safe their data actually was.

I'm amused by the idea that the actual words of the TC developers are less credible than whatever random thoughts are bubbling around a message board. But, HN, what are you gonna do.


That's not what the donation page says:

https://www.fundfill.com/fund/TrueCryptAudited

There's no hint that the audit is for historic purposes.

As for your second paragraph, I already said that the TC developer might just have been polite in that mail. Technical people are often not able to stand up for themselves, especially when it comes to money.


I'm pretty sure I can safely speak for the TC audit project on this point.


> I think this audit was a colossal waste of time and resources

Said every entity that has ever been audited in the history of time.

Of course you assume your partners' accounting books are legit, and you assume your vendors' IT controls are tight, and you assume your encryption software isn't backdoored. There is no guaranteed relation between your assumptions and reality until someone qualified checks it out.


Be honest here.

Would you have bet money that Truecrypt was backdoored?

I don't think many people would have. Because they didn't really believe it. They thought an audit would be a "good assurance," but I really doubt many of them considered donating the money to Truecrypt, and the relative value of each.


Or they're people like me, who don't want to encourage people to use Truecrypt, but do want to help ensure that the people who did use it know how safe they are.


But why? What does it get you?

An audit doesn't prove the absence of bugs. It doesn't change the fact that you should assume vulnerability and use defense in depth. It doesn't actually tell anyone that they're more or less safe.

And why Truecrypt? Why not any other crypto program? Why not one that was maintained and had a future? Wouldn't it be more worth it to donate that money to ensure that the people who use GPG know how safe they are? Or OpenSSL?

I think the real reason why Truecrypt got so much attention is that it A.) ran on Windows + had visible source and thus could actually be audited, and B.) had anonymous devs shrouded in interesting secrecy. If Truecrypt had been Linux-only and developed by Jorge Schmorge in his off-time from Startuply, nobody would have had a burning desire to audit it.

Can you explain more what convinced you to spend time and money on this? What makes this more valuable than 70 Watsi patients?

Also, could you respond to my comment here: https://news.ycombinator.com/item?id=9312559


By this logic, all my money should go to Watsi patients. I don't think the Truecrypt audit was the most important thing to allocate my resources towards, but it was worthy of the resources I allocated to it.


That's not necessarily true -- I think you could very convincingly argue that a secure Truecrypt is more valuable than some number of Watsi patients. What if Truecrypt allows a journalist to expose some corruption that saves or benefits some equivalent number of lives?

By allocating resources you are implicitly deciding that the target of your allocation is the most important thing to allocate your resources towards. I don't see how this could not be the case.


The developers could have easily gotten this money for themselves. All they had to do was say "pay us to hire someone for code audit and we will do it" or even "pay us and we'll open source it".

For whatever reasons the TrueCrypt developers chose not to do that, those are their own reasons. But when it comes to issues of security, you don't make assumptions and you don't trust without verifying. If, like you are implying, the developers took offense to that and quit out of spite, that's no one's fault but their own.

So no, you can't blame the people who wanted or performed the audit. The audit did not "cost us" TrueCrypt in any way.


Truecrypt was always open-source; at least the source was always visible, even if the license wasn't OSS-approved.

Truecrypt's primary installation base was Windows. Where's the audit for that?

I don't think the developer quit out of spite. I think they quit because they felt underappreciated, overwhelmed with un-fun requests related to the audit, and were very rudely awakened to the fact that it's easier for a project that implies Truecrypt is backdoored to raise money than for Truecrypt to raise money.


You can audit Windows. Microsoft will give you the source code.

http://www.microsoft.com/en-us/sharedsource/default.aspx


Still, one valid point: there seems to be no third option regarding the development of software: either it's a commercial model which favors closed source and tight IP controls, or it's an open source and generally unpaid development model.

Is a third option even possible? Micropayments? Reputation?


A few companies have used dual-licensing with at least a certain degree of success - you either use their software under the GPL, which means any software you create using it has to be GPL, too, or you can buy a license that does not force you to use any particular license.

Sleepycat did it with BerkeleyDB (they were bought by Oracle, though, and I don't know if Oracle changed this arrangement), the same goes for MySQL (which, of course, is now owned by Oracle, too). There are a few more, I guess, that I do not know about.

OPSI is backed by a company and uses a model where OPSI itself is free software (in both ways), but new features are made available to paying customers only initially, until development of these features has been paid for.

Also, there are cases of companies making software open source, but making money by doing one-off customization or (obviously) support contracts.


There is a third option that meets the two in the middle. Red Hat was one of the first to successfully employ it: develop an open source product but sell support, and for some of their software, require a license if you want to use all of the features.


I would argue this combination is not sustainable. Eventually a project will be gobbled up by a commercial entity (FoundationDB) or become stagnant once the original developers leave.

I am not trying to pooh-pooh the open source development model, or decry the evils of commercialism. I think we can all benefit from finding an alternative.


Red Hat has been going strong for 22 years and they had an IPO in 1999, then managed to survive the dot com crash. Their stock is doing pretty well. Canonical is doing okay with Ubuntu, despite the product being given away. Sourcefire is doing well, even though Snort is free and open source. Mozilla is still trucking along last I heard.

So you can argue that it's unsustainable, in the way that you can argue that the moon is made of cheese. It's just not true.

Besides, FoundationDB wasn't open source, was it? It was just free of charge, which isn't a business model at all.


Red Hat people did not write Linux. They did contribute to it and provided support, but they did not write most of the code.

The people who _did_ write the code were not directly compensated for it. Indirectly - perhaps.

This is my point. It is not [currently, in its present form] sustainable, due to the fact that people and corporations who use open source software do not [generally] pay for it.


> people and corporations who use open source software do not [generally] pay for it.

Except, of course, for all of the companies that do, which is every company that uses Red Hat and many that use Ubuntu, Snort, and numerous other, smaller projects. Red Hat and Canonical may not have created Linux, but what we have today would not have been possible without their (highly paid) contributions. If Linus and the other original developers were gone tomorrow, Red Hat could easily keep Linux going for the foreseeable future.

The point you're trying to make is that people don't pay for open source. And you're right. But companies do pay for support for open source, and they do pay for valuable cloud features that tie into open source software for a small fee.


You are right. _Some_ companies do pay; that's why I included the qualifier [generally].

We don't really know the percentage of companies who do pay, but judging by the recent OpenSSL debacle, it's not great.

I know this is probably not the right forum, but I was hoping to engage the collective wisdom to come up with a third option, one which would guarantee payment of some kind for open source development.


Red Hat is not the only free software company, only the most widely known. Most core Linux developers are paid these days, check out the workplace statistics LWN publishes with almost every kernel release.


We're all listening if you want to propose one. Just... I at least have the caveat in mind that insisting that an alternative could be better is not the same as proposing an alternative. Or even proof that an alternative exists.

Micropayments have really only seen success for things with very broad userbases (like musicians on Patreon). Right now what we've got is essentially patronage.


"The Truecrypt developers supposedly left because it wasn't interesting/fun for them anymore."

I thought the Truecrypt developers quit because the US Government forced them to surrender their private keys.


The Truecrypt developers left for an unspecified reason. Anyone claiming otherwise is speculating.


You're thinking of Lavabit, not TrueCrypt.


No, there have been speculations/allegations/rumours regarding TrueCrypt and the developers' keys. GPG keys used for signing releases, if I remember correctly. But as another comment here mentions, that is also (afaik, too) just hearsay, speculation, and could very well be made up (well-intentioned or otherwise).


To me, it appears that your argument presumes something that you believe to be false. Do you believe the devs left because of losing interest, or because of the funding of the audit, or both? Do you have a reference for the devs leaving because of anger over the funding?


I had a source for the devs leaving because it was no longer fun, which I can't find now because of buggy HN features.

Are you implying that "the devs lost interest" is mutually exclusive from "the devs were facing burnout from dealing with a hostile community"? To me, the latter is causal to the former, but you might disagree.


I think, in this context, it's not important _why_ developers leave - it's inevitable.

In a commercial setting, there is financial motivation for a company to hire and train new developers, but in the case of an open source project this may not be the case.

How do we motivate people to fix or continue development of open source projects? Is it possible at all?


>How do we motivate people to fix or continue development of open source projects? Is it possible at all?

Hm. Maybe it is impossible to motivate people to continue development of open-source projects. There are very few long-lived open-source projects, and very little open-source code. You're probably right that motivating people by anything other than cold, hard cash is impossible -- after all, what do people want in life besides money?


Sex. You know what you have to do.


Can someone tell me why so much attention is paid to Truecrypt? Is it that popular? I thought OpenSSL or Mozilla's crypto engine were the most widely used.


TL;DR: It's a relatively easy way to create encrypted virtual drives.

---------------------------------------------

Details:

TrueCrypt creates a container file - via a GUI - which is complete gibberish (and which you can name anything, even something innocuous like .txt)... until you unlock it. It is then a mounted drive.

At time of creation of the Truecrypt volume you:

* set the drive size - it can be very small (ex: 1mb) to very large (many GB iirc)

* set the access credentials. This can be a password, or it can be a password and a key-file - that is, a file you must pass in addition to the password. The file can be any kind of file: a jpg, an MS Word doc, an OS iso, a .pem or .ppk... it doesn't matter. I think the hash generated by the file is essentially a second passphrase (there's a rough sketch of the idea after this list). Change the contents of the file and you change the hash... it doesn't work. Pick your favorite picture among the tens of thousands you have backed up several places online... and it's much less obvious than the one or two .ppk files you have tucked away in your LastPass account.

* you can have a "hidden volume." Say, for example, somebody drugs you and hits you over the head with a wrench to get your TrueCrypt volume access credentials. You give them the credentials, they unlock and mount the drive... and inside is your grandma's banana bread recipe... not your bitcoin wallet details. You gave them the passphrase to a "dummy volume." If you follow the instructions correctly regarding this, you can have the actually valuable information hidden within the TrueCrypt volume. There's no indication of whether any particular volume has a hidden volume in it or not. So there's a kind of "plausible deniability" about what's there.
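To illustrate the key-file idea from the list above (strictly a conceptual sketch, not TrueCrypt's actual key-derivation scheme): the file's content gets hashed and mixed with the password, so changing one byte of the file changes the resulting key.

    import hashlib

    def derive_key(password, keyfile_path=None):
        # Conceptual only: real tools use proper KDF parameters and per-volume
        # salts, and TrueCrypt's actual keyfile mixing works differently.
        material = password.encode()
        if keyfile_path:
            with open(keyfile_path, "rb") as f:
                # Any file works: a jpg, a .doc, an ISO. Its digest acts like a
                # second passphrase; change one byte and the derived key changes.
                material += hashlib.sha256(f.read()).digest()
        return hashlib.pbkdf2_hmac("sha256", material, b"illustrative-salt", 100_000)

    key = derive_key("correct horse battery staple", "vacation-photo.jpg")  # placeholder keyfile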


Any electric anthill operator worth his salt knows that banana bread recipes aren't real secrets. A convincing dummy volume has a high-traffic mostly-empty bitcoin wallet, with enough coins remaining that the agents can claim it was recovered empty, and everybody goes home happy.


I'm imagining this being finished yesterday and them waiting a day to release it.


We had it finished earlier, but it was undergoing review. Matt and Kenn _did_ want to release it yesterday, I suggested stalling just a day =P


Thank you for your work!



