What is the checksum of the source tree that I can use to verify that I have exactly the copy they audited?
I looked at the full report PDF and saw no mention of downloads, binaries, source trees or checksums.
> The iSEC team reviewed the TrueCrypt 7.1a source code, which is publicly available as a zip archive (“truecrypt 7.1a source.zip”) at http://www.truecrypt.org/downloads2. The SHA1 hash of the reviewed zip archive is 4baa4660bf9369d6eeaeb63426768b74f77afdf2.
The Phase II report (today's release) claims to be auditing 7.1a, so I assume it's exactly the same version and ZIP file.
Last June, they published "a verified TrueCrypt v. 7.1 source and binary mirror", including file hashes, on GitHub: https://github.com/AuditProject/truecrypt-verified-mirror
I just cloned that repo and inspected the source ZIP; the SHA1 sum matches what they quote in the report.
That doesn't mean Truecrypt is perfect. The auditors did find a few glitches and some incautious programming -- leading to a couple of issues that could, in the right circumstances, cause Truecrypt to give less assurance than we'd like it to.
/shrug. It's certain that if you're operating inside a mandatory profile you are at the mercy of whoever created it... but that doesn't mean people don't try to circumvent restrictions placed on them by their administrators.
It's also not clear to me just how common Mandatory Profiles actually are.
Also, please refer to this stackexchange thread:
Well right, they're not a fork so it's not the same kind of problem if they use the name. They're a fansite.
There is a lot of discussion on the license on the VeraCrypt forums.
It's completely reasonable to ask people to stay away from the TrueCrypt name in my opinion, so that his software's reputation would not be tarnished if amateurs or problem-causing people took over the TrueCrypt name.
I wish the TrueCrypt end was handled better, but its unique situation has some educational value.
TrueCrypt, the binary application, may break without further development, but its on-disk format will continue to thrive as more applications pick it up as a means of supporting a cross-platform encryption format.
No significant issues were found in either phase of the TrueCrypt audit. If you're using it today (or have used it in the past), I don't think you have anything to worry about.
But it is an unmaintained piece of software, and for that reason I would migrate away from it. If I were setting up a new laptop today, I wouldn't consider installing it. If I had an existing laptop using it, I would think about transitioning when I had some spare time.
Not a hair-on-fire problem, though.
Where, exactly, do you suggest that NSA could have tainted Bitlocker in a manner that wouldn't also implicate Truecrypt?
Truecrypt has now been proven to have no such flaws.
It's unlikely that Bitlocker will ever be proven in the same way.
Anyone that has a legitimate fear of that kind of attack, the bad guys with guns coming in and seizing your stuff, would use Truecrypt over Bitlocker out of an abundance of caution.
Compromised hardware is another matter entirely. The old rule is, "if an attacker has physical access to your machine, it's not your machine anymore", and it's no less true here.
NSA: "Hi, can you please let us see your source code."
(The conversation below happens in your head):
"Do I want to spend the next 15 years and all my money and life trying to stay out of jail?"
"Do I want the IRS to know about trips to the Cayman Islands?"
"Do I want my wife to know about my mistress?"
Me/You: "Sure no problem!"
I know it's nice to think that we would take the moral high ground.
Try doing it alone with just a nice government official.
Or they could use one of their 0days to break into your machine and grab your data while it's decrypted in memory, or just steal your password.
Do you mean "NSA has implanted backdoors into Win8 that would enable them to directly enroll a Win8 box into a C&C"? If so: virtually nobody who does professional vulnerability research, myself included, agrees.
The distinction matters here, because if all you're saying is the former thing, that impacts Truecrypt just as much as Bitlocker, because it's equally true of every operating system, including Linux, FreeBSD and OpenBSD.
That's the sort of player we're dealing with, and that's why Bitlocker isn't just unprovably secure: it is best to simply assume that there are deliberate backdoors, whether any particular employee knows of them or not.
Between China and the USA, there's almost certainly hardware backdoors in place for some systems. IIRC there has been evidence of router tampering for ISP hardware shipping across borders.
It would be performance-impacting, though.
Are there any known exploitations of weakness in XTS that might work their way back to TrueCrypt?
Most folks use TrueCrypt's disk encryption as a "file container", when they effectively want to encrypt files. Do you know of any tools (cross-platform or otherwise) that would provide the same UX as TrueCrypt while not relying on FDE?
I'm not sure what the best answer is to your second question. I wouldn't use TC for that. I would typically use PGP, but I don't think it has a good UX for general consumption. There may be room for someone to build a better product.
You can't be seriously deferring to that? The end-of-the-line announcement was so completely out of sync with all previous communications from the TC devs, so un-TrueCrypt like, that everything it said should be taken with a huge grain of salt. _Especially_ any advice pertaining to better managing one's security needs.
TrueCrypt's developers chose not to graciously turn their beloved creation over to a wider Internet development community, but rather, ... to attempt to kill it off by creating a dramatically neutered 7.2 version ... - Steve Gibson
I have no idea what's going on with TrueCrypt. Speculations include a massive hack of the TrueCrypt developers, some Lavabit-like forced shutdown, and an internal power struggle within TrueCrypt. I suppose we'll have to wait and see what develops. - Bruce Schneier
Steve Gibson isn't a cryptographer, and your quote doesn't respond to my question.
Bruce Schneier I'll have to accept as a totally valid rebuttal to my implied point. This is something that really irritates me about Schneier, for what it's worth. He's also behind a meme that elliptic curve cryptography is less safe than RSA, when the reality is almost the opposite.
Given the context of the thread, how is this not using developer's recommendation as an endorsement? What TC dev said bears absolutely zero weight regardless of whether it aligns with your opinion or not.
My professional opinion is my own. The developer's opinion is also relevant (to people who don't buy into nonsensical conspiracy theories), so I mentioned it.
Skip to 2 minutes in if you don't want to hear the drawn out intro
Again, this is just my perception.
But it did work fine about five years ago, and I do not think the situation has become worse.
> The second field, source device, describes either the block special device or file that contains the encrypted data. Instead of giving the source device explicitly, the UUID is supported as well, using
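To make the quoted crypttab(5) text concrete, a hypothetical /etc/crypttab entry; the UUID and the mapping name here are made up:

```
# <name>    <source device>                             <key file>  <options>
cryptroot   UUID=0a1b2c3d-4e5f-6789-abcd-ef0123456789   none        luks
```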
More of a direct competitor to Truecrypt.
"How I compiled TrueCrypt 7.1a for Win32 and matched the official binaries" ―
That always seems a bit shady and suspicious
Similar to what happened with lavabit. I doubt the devs did anything, more likely they were asked to do something they felt was shady. They could've seen the storm that was brewing on the horizon.
The best course of action was to do nothing, given they couldn't talk about it if they were served. Walking away was the only option.
This is pure speculation. Logical, but still speculation.
"Linux" only appears once in the report, and then only in a footnote.
1. Download it for your OS : https://www.grc.com/misc/truecrypt/truecrypt.htm
2. Verify the hash : https://defuse.ca/truecrypt-7.1a-hashes.htm
The weaknesses of XTS mode are mostly limitations imposed by full-disk encryption. You should think of XTS as fine for FDE, but you also shouldn't put too many expectations on FDE. FDE is great if you leave your powered-off laptop in the back of a cab. It's not great if federal agents distract you and steal your powered-on-and-logged-in laptop at the library.
You should definitely use FDE, but you should also separately encrypt the things that are really important to you.
Or, in my case, leave the laptop in a rather nice leather messenger bag on the seat in the bus. Very reassuring that the average opportunistic thief (or as I like to imagine, the skint teenager) basically has to wipe the hard drive and reinstall an operating system.
Now, on Linux, has anyone here been able to get luksSuspend and luksResume working with suspend to RAM on a Debian/Ubuntu system? That would be golden.
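For reference, the approach people usually try is a systemd sleep hook that calls luksSuspend before the machine sleeps. This is only an untested sketch (the device name cryptroot is an assumption), and the hard part is the "post" side: after luksSuspend, I/O to the encrypted device blocks, so whatever prompts for the passphrase on resume can't itself live on the encrypted root, which is exactly why this is tricky to set up on Debian/Ubuntu:

```
#!/bin/sh
# /usr/lib/systemd/system-sleep/luks-suspend  (sketch, untested)
# systemd passes "pre" before sleep and "post" after wake as $1.
case "$1" in
    pre)
        # Wipe the volume key from kernel memory; I/O now blocks.
        cryptsetup luksSuspend cryptroot
        ;;
    post)
        # Needs to prompt for the passphrase, so it must run from
        # something not stored on the suspended device.
        cryptsetup luksResume cryptroot
        ;;
esac
```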
The Truecrypt developers supposedly left because it wasn't interesting/fun for them anymore. I believe that. Funding this audit required ~$65k in donations, probably more than the Truecrypt project ever saw. If you were the developer of a project that you knew was solid, and you knew had no backdoors, how would you feel about people essentially maligning you being able to generate more cash than you've ever seen for your side project? That'd make me want to quit too.
At the end of the day, which is preferable -- a TrueCrypt that was never audited professionally, or a TrueCrypt with active developers?
How can we ensure security in open-source software without driving the developers away in the future?
I think one way would be to match every single audit donation with a donation to the upstream developers. If it's worth spending a dollar to audit software, it's worth spending a dollar to keep that project alive and show the developer you care. It would have taken twice as long to get the audit funded, but maybe then the developer wouldn't have been hounded away.
Personally, though, I think this audit was a colossal waste of time and resources. All it told us was something every truecrypt user was assuming already, and it cost us all Truecrypt. What guarantee is there any of the new developers are going to be as trustworthy as the original developers, or as skilled?
It would be really nice if there was some common container format you could mount natively on these operating systems without the need for 3rd party software. I think I remember reading about LUKS supporting TrueCrypt's container format - but that, of course, only works on Linux. And frankly, I don't see Apple and Microsoft either supporting the TrueCrypt container format natively or getting together to define a new one.
Plus, BitLocker is not available on Vista Business and Windows 7 Pro. So if you don't want to use Windows 8, again, you are kind of screwed.
 At least it was/is the only software that I know of which allows you to do that. If there are alternatives, I would very much like to hear about them!
"LUKS" stands for "Linux Unified Key Setup"; it is an on-disk format, and its reference implementation is found in cryptsetup.
The LUKS on-disk format is supported on Windows through DoxBox.
cryptsetup also supports TrueCrypt's on-disk format.
People usually say "LUKS" when they mean cryptsetup, since it makes no difference to them, but I think it's important to know the difference.
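For example, recent versions of cryptsetup can open a TrueCrypt container directly via its tcrypt support. A sketch; the container path, mapping name, and mount point are all made up:

```
# Open a TrueCrypt container with cryptsetup (needs root)
cryptsetup open --type tcrypt /path/to/container.tc tcvol
mount /dev/mapper/tcvol /mnt/tcvol

# ...and close it again when done:
umount /mnt/tcvol
cryptsetup close tcvol
```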
Just use PGP, probably via the gpg software. You can set it up so that each machine knows the others' public keys, or you can take a shortcut and copy the same private key to all your machines. Encrypt and decrypt as you like. This is "hacker" advice, however; gpg is not a consumer-oriented tool. You'll have to read some documentation and keep your wits about you. But that was true of TC as well, wasn't it?
Just to be clear, I appreciate your suggestion, and in many cases it is entirely sufficient.
But for me, right now, TrueCrypt hits kind of a sweet spot in terms of portability, convenience and security.
If there's a replacement for this use case I'd really like to know about it.
If you're using FDE for your system disk, it is desirable to know that, in the event of the OS becoming unbootable, you can open it using any other OS. Especially in the case of laptops, where you won't have any guarantees about what other machines are available.
As I said - it's a whole lot better than nothing. The "insecure" thing is mainly to remind myself that it isn't the be-all and end-all.
Why fund an audit then? Why beat this particular dead horse?
Can you imagine being that open source developer, putting hours of your (incredibly valuable, if you're a talented crypto dev) time into a project that gets relentlessly shit on, and then watching a hunt for backdoors raise more money than you've ever seen for the project itself?
How would that make you feel? How would you feel when you thought about working on the code? When you thought about fixing bugs reported by people who were paid to look for them by people who openly distrust you?
That isn't a rhetorical question. I'd like you to imagine the situation the Truecrypt devs were in, try to empathize with them, and tell us about it.
I'm not trying to say Truecrypt should never have been audited, or that open-source projects in general should never be audited. But I do think this particular case was badly handled, and I don't think that's such a horrible thing to say. Things are mishandled all the time. We can always learn from our mistakes. Let's learn from this experience and make the next cross-platform crypto tool better for everyone.
I'm really having a hard time seeing the problem here.
Now can you answer my question?
* As far as I can tell, the audit project began without the consent of the Truecrypt developers. You can argue that this is necessary for some security reason, but it is also a rude gesture to an open-source project.
* The crowdfunding drive contained no provisions for handling the results of the audit (developer resources, etc.). The existing developers were assumed to be both capable and motivated to drop whatever they were working on in their free time to fix whatever the auditors found.
* The atmosphere around the audit push from the beginning seemed very negative. There was a distinct "Truecrypt is super sketchy" vibe in the HN comments, etc., surrounding it initially. I think that vibe was good for the audit's fundraising, which meant that helping the audit directly hurt the Truecrypt brand. This dovetails with...
* Singling out Truecrypt as the target of an extensive, expensive audit seems somewhat absurd. There are several projects that could have benefited much more from an audit that weren't audited, such as OpenSSL, Linux's RNG, OpenVPN, any other TLS implementation, Tor, maybe TextSecure (though I don't know if TextSecure had enough of a userbase at the time to justify it, it certainly does now). Why were these projects not audited? I think it's because the Truecrypt devs tried to be anonymous, and low-profile, whereas these other projects are all backed by people with faces who are much harder to throw something at than Truecrypt was.
istruecryptauditedyet.com is a real page now, but what was it before? A giant NO on the screen? Is there an 'istextsecureauditedyet.com' even though TS uses novel protocols and crypto? Why not?
So, I think:
* The fundraiser should have supported Truecrypt while raising money for an audit
* The fundraiser should have incorporated costs to fix anything found as a result of an audit, with money going back to Truecrypt if nothing substantial was found
* The audit push should have been conducted with the consent, or better yet, solicitation, from Truecrypt developers
* The audit should have been conducted in a more respectful way, emphasizing the positive gain from auditing specifically Truecrypt rather than the negative aspects of Truecrypt's existing development style
In general, open source is a massive donation of time and effort and should be treated like the gift that it is instead of being taken for granted. I think people took Truecrypt for granted, and that's probably part of the reason its developer quit; it's a very common reason for open-source developers to quit, and I find it impossible to believe it was not a factor at all.
If you would truly feel happy if your project was shit on, donations effectively stolen from you, and a ton of work dumped on you that you did not ask for, but now must immediately address lest you be attacked for "ignoring the results of the audit," you are a special type of human to say the least.
Also, file encryption / decryption fails massively for some things. What I want is something that implements a filesystem. As otherwise things with many files get... annoying. Not to mention less secure.
If there's a better way to do this then let me know.
You can also use GnuPG or openssl to encrypt a file or a disk image.
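For a single file, here's a sketch using openssl's enc command. The -pbkdf2 flag needs OpenSSL 1.1.1 or later, the filenames are made up, and the inline passphrase is only so the example runs non-interactively; omit -pass to be prompted instead:

```shell
# Create a test file, encrypt it with a passphrase-derived AES-256 key,
# then decrypt it to a second file.
echo "my secret notes" > notes.txt
openssl enc -aes-256-cbc -pbkdf2 -salt -pass pass:example -in notes.txt -out notes.txt.enc
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:example -in notes.txt.enc -out notes.dec.txt
```

Note that unlike a TrueCrypt container, this gives you per-file encryption only; there's no mountable filesystem, which is the limitation discussed below.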
Lately I took a look at actual filesystems for cross-platform use. I didn't find a great one, but depending on which OS you want to burden with an extra driver, pretty good ones include exFAT, NTFS, and UDF specifically version 2.01. Like you, I am curious whether better answers are available.
It works well as a TrueCrypt replacement for me personally, but has a couple of warts. Secure rm is slow on cleanup, and the user needs to be careful about the hidden files it uses to keep track of volume metadata.
That's not Windows-compatible, as far as I know.
Still interesting though.
Also, (as far as I know) all public evidence for your assertion is a purported email from one of the devs to Matt Green. He might just have been polite in that mail...
I'm amused by the idea that the actual words of the TC developers are less credible than whatever random thoughts are bubbling around a message board. But, HN, what are you gonna do.
There's no hint that the audit is for historic purposes.
As for your second paragraph, I already said that the TC developer might just have been polite in that mail. Technical people are often not able to stand up for themselves, especially if it comes to money.
Said every entity that has ever been audited in the history of time.
Of course you assume your partners' accounting books are legit, and you assume your vendors' IT controls are tight, and you assume your encryption software isn't backdoored. There is no guaranteed relation between your assumptions and reality until someone qualified checks it out.
Would you have bet money that Truecrypt was backdoored?
I don't think many people would have. Because they didn't really believe it. They thought an audit would be a "good assurance," but I really doubt many of them considered donating the money to Truecrypt, and the relative value of each.
An audit doesn't prove the absence of bugs. It doesn't change the fact that you should assume vulnerability and use defense in depth. It doesn't actually tell anyone that they're more or less safe.
And why Truecrypt? Why not any other crypto program? Why not one that was maintained and had a future? Wouldn't it be more worth it to donate that money to ensure that the people who use GPG know how safe they are? Or OpenSSL?
I think the real reason why Truecrypt got so much attention is that it A.) ran on Windows + had visible source and thus was possible to be audited, and B.) had anonymous devs shrouded in interesting secrecy. If Truecrypt were Linux-only and developed by Jorge Schmorge in his off-time from Startuply, it wouldn't have attracted any burning desires to audit it.
Can you explain more what convinced you to spend time and money on this? What makes this more valuable than 70 Watsi patients?
Also, could you respond to my comment here: https://news.ycombinator.com/item?id=9312559
By allocating resources you are implicitly deciding that the target of your allocation is the most important thing to allocate your resources towards. I don't see how this could not be the case.
For whatever reasons the TrueCrypt developers chose not to do that, those are their own reasons. But when it comes to issues of security, you don't make assumptions and you don't trust without verifying. If, like you are implying, the developers took offense to that and quit out of spite, that's no one's fault but their own.
So no, you can't blame the people who wanted or performed the audit. The audit did not "cost us" TrueCrypt in any way.
Truecrypt's primary installation base was Windows. Where's the audit for that?
I don't think the developer quit out of spite. I think they quit because they felt underappreciated, overwhelmed with un-fun requests related to the audit, and were very rudely awakened to the fact that it's easier for a project that implies Truecrypt is backdoored to raise money than for Truecrypt to raise money.
Is a third option even possible? Micropayments? Reputation?
Sleepycat did it with BerkeleyDB (they were bought by Oracle, though, and I don't know if Oracle changed this arrangement), the same goes for MySQL (which, of course, is now owned by Oracle, too). There are a few more, I guess, that I do not know about.
OPSI is backed by a company and uses a model where OPSI itself is free software (in both ways), but new features are made available to paying customers only initially, until development of these features has been paid for.
Also, there are cases of companies making software open source, but making money by doing one-off customization or (obviously) support contracts.
I am not trying to poo-poo the open source development model, or decry the evils of commercialism. I think we can all benefit from finding an alternative.
So you can argue that it's unsustainable, in the way that you can argue that the moon is made of cheese. It's just not true.
Besides, FoundationDB wasn't open source, was it? It was just free of charge, which isn't a business model at all.
The people who _did_ write the code were not directly compensated for it. Indirectly - perhaps.
This is my point. It is not [currently, in its present form] sustainable, due to the fact that people and corporations who use open source software do not [generally] pay for it.
Except, of course, for all of the companies that do, which is every company that uses Red Hat and many that use Ubuntu, Snort, and numerous other, smaller projects. Red Hat and Canonical may not have created Linux, but what we have today would not have been possible without their (highly paid) contributions. If Linus and the other original developers were gone tomorrow, Red Hat could easily keep Linux going for the foreseeable future.
The point you're trying to make is that people don't pay for open source. And you're right. But companies do pay for support for open source, and they do pay for valuable cloud features that tie into open source software for a small fee.
We don't really know the percentage of companies who do pay, but judging by the recent OpenSSL debacle, it's not great.
I know this is probably not the right forum, but I was hoping to engage collective wisdom to come up with a third option, which would guarantee payment of some kind for an open source development.
Micropayments have really only seen success for things with very broad userbases (like musicians on Patreon). Right now what we've got is essentially patronage.
I thought the Truecrypt developers quit because the US Government forced them to surrender their private keys.
Are you implying that "the devs lost interest" is mutually exclusive from "the devs were facing burnout from dealing with a hostile community"? To me, the latter is causal to the former, but you might disagree.
In a commercial setting, there is financial motivation for a company to hire and train new developers, but in the case of an open source project this may not be the case.
How do we motivate people to fix or continue development of open source projects? Is it possible at all?
Hm. Maybe it is impossible to motivate people to continue development of open-source projects. There are very few long-lived open-source projects, and very little open-source code. You're probably right that motivating people by anything other than cold, hard cash is impossible -- after all, what do people want in life besides money?
Truecrypt creates a file (e.g. a .txt), via a GUI, which is complete gibberish... until you unlock it. It is then mounted as a drive.
At time of creation of the Truecrypt volume you:
* set the drive size - it can be very small (e.g. 1 MB) to very large (many GB, iirc)
* set the access credentials. This can be a password, or a password plus a key-file, i.e. a file you must supply in addition to the password. The file can be any kind of file: a JPG, an MS Word doc, an OS ISO, a .pem or .ppk... it doesn't matter. I think the hash generated from the file essentially acts as a second passphrase; change the contents of the file and you change the hash, and it no longer works. Pick your favorite picture among the tens of thousands you have backed up in several places online, and it's much less obvious than the one or two .ppk files you have tucked away in your LastPass account.
* you can have a "hidden volume." Say, for example, somebody drugs you and hits you over the head with a wrench to get your Truecrypt volume access credentials. You give them the credentials, they unlock and mount the drive... and inside is your grandma's banana bread recipe, not your bitcoin wallet details. You gave them the passphrase to a "dummy volume." If you follow the instructions correctly, you can have the actually valuable information hidden within the Truecrypt volume. There's no indication of whether any particular volume has a hidden volume in it or not, so there's a kind of "plausible deniability" about what's there.
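To illustrate the keyfile idea in the second bullet, here's a simplified sketch. This is not TrueCrypt's actual key-derivation algorithm (TrueCrypt mixes keyfiles into the password through its own keyfile pool), just the general "the keyfile's hash acts like a second passphrase" concept:

```python
import hashlib

def derive_key(password: bytes, keyfile_bytes: bytes = b"") -> bytes:
    """Toy derivation: hash the keyfile and mix it into the password.

    Illustrative only -- not TrueCrypt's real algorithm. The fixed salt
    is for the demo; real systems use a random per-volume salt.
    """
    material = password
    if keyfile_bytes:
        # Any change to the keyfile changes its hash, and therefore the key.
        material += hashlib.sha256(keyfile_bytes).digest()
    return hashlib.pbkdf2_hmac("sha256", material, b"demo-salt", 100_000)
```

The point the parent makes follows directly: flip a single byte of the keyfile and the derived key is completely different, so the volume won't open.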