Hacker News
True Goodbye: ‘Using TrueCrypt Is Not Secure’ (krebsonsecurity.com)
360 points by panarky on May 29, 2014 | 237 comments

That's LavaBit 2.

I've been a long-time TC user, and if there's one trait it has, it's quality and a high degree of polish. Now, looking at the diff and the screenshot of that in-app "Not secure" message, the polish is just not there. It feels like something that was slapped together in a rush, or by someone who's not an original developer. The SF page alone is a big red flag. If you compare its nearly hysterical tone and ridiculous BitLocker advice to the tone and content of the actual app, they don't add up at all.

This leaves us with a handful of discrepancies between the last good state of the project and what's out there now. So it's either someone else's hackjob or it is original and the discrepancies are intentional. Then, factor in the .exe sig match, and it pretty much leaves only the latter option - the original devs made an absurdly non-TC-like release. The question is "why?"

The whole message on the site makes no sense and I think that's on purpose. What likely happened is the US gov found the TC authors, then used their weight to try and get them to back door the binaries. Authors didn't want to, but couldn't publicize the letters without going to jail, so they made up the most ridiculous story for why they were giving up on the project, the best possible outcome so that they wouldn't go to jail and wouldn't subject users to the required back door.

This was my first thought when I read it and saw how it ended so abruptly. Conspiracy theories notwithstanding, it's clear someone or some agency got to the developers, and they just pulled the ejection seat for their own legal protection.

Can't say I blame them. I've had the feds show up where I lived once, and I nearly shit a brick when I realized what was going on. You never want the weight of the government beating down on your back.

This is the most plausible explanation I have seen so far.

This is a pretty confusing case, and it's hard to make much of it; LavaBit 2 is of course a possibility. But while we're making these theories, I wanna float my wild theory. Considering that (1) the TrueCrypt authors go to great lengths to keep their identities hidden, and (2) it turns out TrueCrypt is not free/open software -- TrueCrypt is actually a project by some spooky 3-letter agency.

But anyway, thoughts on alternatives? DiskCryptor (http://en.wikipedia.org/wiki/DiskCryptor) sounds like a nice one, and it's FLOSS. Thoughts on it?

I really don't think we need to be running to TrueCrypt alternatives quite yet. If Phase II of the audit comes back showing TrueCrypt as insecure, then it's time to start worrying about that, but given that everyone was happy to keep using TrueCrypt up until 1 day ago even though it hasn't been updated in over 2 years, I don't think there's any big rush to switch to something else even if ongoing development stops.

"given that everyone was happy to keep using TrueCrypt up until 1 day ago"

The same was true for OpenSSL a few weeks ago.

One of the most plausible theories is that the TrueCrypt developers found a gaping security hole (à la OpenSSL) and realised that releasing a fix for it would reveal the bug and compromise every TrueCrypt partition in existence, so they chose to kill the project rather than risk the safety of all of that currently encrypted data.

If that scenario is true, there's a big rush to switch to something else before the bug is discovered by other researchers.

Their warning about it being insecure is clearly a joke, as their alternate suggestions (particularly for Linux and Mac) are ones that no one should ever take seriously. TrueCrypt passed Phase I of the audit, and Phase II of the audit is coming down the pipe soon enough. Also keep in mind that TrueCrypt is one of the most widely-used cross-platform drive encryption tools and has been for some time. It seems far more likely to me that TrueCrypt is secure than that a more obscure competitor is, no matter what the developers say in a weird, cryptic message that everyone assumed was their site being hacked.

That would make sense, especially with all the truecrypted drives in evidence lockers and whatnot - in other words, things that can't be properly upgraded.

If that was the case, why not fix the bug and then tell everybody to upgrade to the new fixed version ASAP?

by fixing the bug you tell everybody what/where the bug is and if anybody has a copy of someone else encrypted disk (think external backup, amazon etc..) they can decrypt it.

That's effectively the same thing as releasing details of the bug. It would take time to take the patch and figure out the bug from it, but it would be fairly easily done for a determined attacker.

TrueCrypt authors go to great to keep their identities hidden

I donated at least two times to them via PayPal. How anonymous could they be if they got funds via PayPal? Not very, in this day and age. I would imagine it's trivial for the US government to find their true names based on this fact alone.

Since this thread is all about paranoia, how did you verify that anyone ultimately received the funds?

Obviously, I could not. truecrypt.org had a donate button which redirected to a paypal page, and I donated. I'm assuming they wouldn't set up a donation method that merely gave the funds to paypal.

What's their PayPal identity?

If TrueCrypt got taken over, the fact that they recommended BitLocker should tell us something (about BitLocker).

There's another possibility that I haven't seen mentioned yet.

It could be that the developer(s) want to get out anyway. The fact that they've been doing it for so long indicates that they've got some strong feelings about privacy.

Perhaps they're staging the whole thing as their way out, to make a political statement, pushing us further to do something about the government's stand on privacy.

I just don't quite understand the panic about microsoft not supporting XP anymore. It's not like that was a surprise announcement or even that the deadline was just met. It was April 8th....and TrueCrypt just now shut down in panic? ...Because XP support stopped??? WTF is going on?

It's not even like support means anything, other than that they will no longer improve or fix it, i.e., there's still time to migrate away as XP degrades. It says nothing about whether XP is secure in and of itself.

This whole incident is about as weird as weird gets.

I saw a post that suggested it might be a canary, i.e., an event that must be interpreted as a certain action having taken place... a negative message... an absence of an indication that everything is OK. But that also seems odd, since I am not quite sure that, if the TrueCrypt people were who we all want to believe they were, they would suggest using BitLocker. BitLocker??? That suggestion alone smells like rotten fish just by its association with MS and the US government.

You should put money on the fact that it really was someone associated with certain ever increasingly fascist governments of, likely the USA or Israel, that compromised TrueCrypt in a way that set off an "auto-destruct" sequence. It's probably a reaction to the Snowden compromise, with increased funding and efforts to regain domination that he exposed by becoming even more totalitarian through an "Operation Kristallnacht" sprint against civilian institutions.

Another theory is that some component of the development environment to compile TrueCrypt requires XP. Remember the guy that tried to compile the TC source to match the binary?


He needed to get some older version of Visual Studio and a very specific combination of service packs and updates in order to get to matching (nearly) the entire binary.

Could it be that the devs really wanted to keep developing in XP because some part of their dev chain required it, but with support being discontinued they of course couldn't run XP and consider that computer "secure" to develop on?

Obviously, if this theory has some grain of truth to it, it can only be part of the explanation for the TrueCrypt weirdness that's been going on today.
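The binary-matching exercise mentioned above boils down to hashing a locally built executable and the official release and comparing the digests, which is why the exact toolchain mattered so much. A minimal sketch of that comparison (the function names and file paths here are hypothetical, not from the actual analysis):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def binaries_match(built_path, released_path):
    """Byte-for-byte equality implies the build was reproduced exactly."""
    return sha256_of(built_path) == sha256_of(released_path)
```

In practice the comparison had to exclude the embedded Authenticode signature, which only the original developers could produce, so "nearly matching" was the best an outsider could achieve.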

If I had to bet on it, I would say it's most likely they got Lavabitten. But then still, the XP remark seems like a very odd choice if it's intended to function as a canary of sorts.

I don't think that the EOL of XP has much to do with it, because they could simply air gap their Windows XP installation to continue development in a secure fashion.

Unless, of course, something more shadowy is going on.

> Another theory is that some component of the development environment to compile TrueCrypt requires XP

This seems unlikely: the screenshots in your link clearly show the source could be built on Windows 7. The trouble Xavier mentions is with the updates to VS2008 SP1, not Windows service packs.

>But that also seems odd, since I am not quite sure that, if the TrueCrypt people were who we all want to believe they were, they would suggest using BitLocker. BitLocker??? That suggestion alone smells like rotten fish just by its association with MS and the US government.

That's the canary.

Not only is BitLocker backdoored, and most TrueCrypt users would know that, but they also say it is available on all Windows versions, which is plainly not the case.

Isn't a canary something that is 'removed', not 'added'? As in, the canary warns by being absent.

Not in this case [1], or rather, not always. And if you consider where the term arose (canaries in mines), it's really an issue of state more than it is absence. If the state of the canary changes it's cause for alarm.

In this situation, a warrant canary (or maybe an NSL canary, as the case might be) is a subtle indication that everything is no longer in a known-good state and something has been compromised.

Although given some of the comments by danielweber, I'm increasingly less inclined to believe this was a canary. Developers throwing in the towel might be a more plausible situation, no matter how disappointing that may be.

[1] http://en.wikipedia.org/wiki/Warrant_canary

Just another theory, but consider that the TrueCrypt team was always anonymous, and considering that their licence is somewhat prohibitive for an open-source project, it's not totally bonkers to assume that some agency or big corporation developed TrueCrypt in secret and made it public to gain widespread assurance that there are no bugs in the code.

Maybe they migrated away from XP and have assurance that BitLocker is safe for them, or switched to something entirely different and have no interest in or funding for developing it anymore...

> bitLocker??? Alone that suggestion smells like rotten fish

The Bitlocker recommendation does seem strange. But when you look around the Windows ecosystem, there isn't much else that could be recommended. What would you recommend Windows people use, other than Bitlocker?

I find it interesting that it is only Microsoft Windows users really affected by this. What does that tell you about it as an ecosystem/platform?

It tells me all kinds of things that we've all known since the 90s.

Having tried Windows 8 recently, I find it appalling how little it has changed. Even aesthetically it is a diabolical failure. There is no consistency and nothing makes sense. The removal of progress bars from system update functions is UX stupidity, and having to reboot after every minor change is just embarrassing.

TrueCrypt 7.1a is still safe to use. So is GPG.

how do you conclude that any TrueCrypt version is secure in light of this disclosure?

Because this happened to 7.2. Chances are, this is a warrant canary, and people will take the last known good 7.1 (i.e. the one that is being audited) and build their forks from there.

1. 7.2 just disabled things. It didn't introduce a vulnerability.

2. Warrant canary makes no sense. There is nothing for a warrant to grab.

3. Forking is legally troublesome. Just because you can see the source doesn't mean you can distribute the source.

>1. 7.2 just disabled things. It didn't introduce a vulnerability.

I had a look at the diff; I have neither the time nor crypto-specific skill to do an audit, but there are plenty of code changes that aren't warnings in there.

>2. Warrant canary makes no sense. There is nothing for a warrant to grab.

No, but they could have theoretically been asked to introduce a backdoor from a Lavabit-style order.

>3. Forking is legally troublesome. Just because you can see the source doesn't mean you can distribute the source.

Do you think that will stop people? Even if there is some legal issue, the Streisand Effect always overcomes it. So what if a TrueCrypt fork can't use Github because of that issue? Is the world suddenly lacking good hosting providers, perhaps in Switzerland or similar? Has everyone forgotten how to set up a public git repo themselves? Somehow, I think not. So what if many devs will probably have to contribute anonymously? With a product such as TrueCrypt, they probably should anyway.

been asked to introduce a backdoor from a Lavabit-style order

This makes no sense. Lavabit was compelled to turn over evidence it told the government it had, which is straightforward law. There is nothing "Lavabit-style" about "distribute a back door or else." You would need explicit legislation to allow that. If the developer's cat was kidnapped to force him to put in a backdoor, there is nothing "Lavabit-style" about that.

You do hedge with the word "theoretically," but "theoretically" this could be a message from the aliens.

Do you think that will stop people?

It will stop the smart people, and you need smart people to work on it. Otherwise you would only be allowed to distribute it by illicit means, and you could never trust it. It would be as trustworthy as warez sites. Why even bother with that nonsense?

Much saner to just build something new, having learned from TrueCrypt's experiences. For example, something that starts from the command line and then has a GUI on top, instead of the reverse.

> "There is nothing "Lavabit-style" about "distribute a back door or else." You would need explicit legislation to allow that."


No legislation authorizes that program, among many others. I think what the parent was referring to as "Lavabit-style" was the "compromise your users or face legal action" move. Turning over customer data or introducing a backdoor are both means to the same end, as far as the NSA is concerned.

Is there something at that Bullrun link that shows the USG uses legal methods (or even illegal methods) to force product makers to insert backdoors into their equipment against the product makers' will?[1]

The NSA inserting backdoors into the product without the cooperation or maybe even knowledge of the vendor -- while troubling for any number of reasons -- is vastly different. Especially when giving advice to developers about how to stay in bounds with the law. If the metagame becomes "the government can legally force you to insert backdoors into your product" any developer faced with this threat might believe it, when he should know it's bunk.

[1] RSA allegedly got paid $10 million to make a change the NSA wanted. RSA customers should demand an answer, but that's not forcing the RSA. People get paid to do things all the time.

Not there, but in the historical record for sure.

>Hushmail stated that the Java version is also vulnerable, in that they may be compelled to deliver a compromised java applet to a user.


>Hushmail turned over cleartext copies of private email messages associated with several addresses at the request of law enforcement agencies under a Mutual Legal Assistance Treaty with the United States.; e.g. in the case of U.S. v. Tyler Stumbo.

>if a court order has been issued by the Supreme Court of British Columbia compelling us to reveal the content of your encrypted email, the "attacker" could be Hush Communications, the actual service provider.

Wow, Hushmail was pretty silly with their claims:

the company that provides Hushmail, states that it will not release any user data without a court order from the Supreme Court of British Columbia,

This is malarkey. Someone in the US could say "I'll fight all the way to the Supreme Court!!!" but you would be a fool to trust your business to their determination. Especially if they say it about a subpoena, which means they haven't even retained a lawyer to ask about this. (If your business plan depends on being able to wage a legal battle, you really shouldn't be scrambling through the yellow pages for a lawyer when you get your first subpoena.)

Back to the topic, I'll have to point out that this still isn't evidence of a company being required to backdoor a product. Hushmail, the same company that thinks it can fight a subpoena for third-party data all the way to the Supreme Court, said "well, we might be compelled to backdoor our product." This is just more repeating of the meme without evidence. It's unfortunate because some developer who remembers Hushmail might take their ill-informed legal opinion as reality.

Of course, Hushmail had access to cleartext copies of the messages. That's the killer. The government has the right to evidence about third parties in your possession. (Canada derives from British law tradition like the US. The government's right to all evidence is a concept that goes back centuries. If you can show that Canada broke from this tradition I would be most interested.)

> No legislation authorizes that program, among many others.

No legislation forbids it either. If the NSA is legally able to search/seize a given piece of information why would you think they should not be allowed to forensically analyze it?

It's not a panic about XP. The presented reasoning is:

- XP is the only widely used version of Windows with no built-in FDE

- XP is finally, truly, EOL

So TrueCrypt no longer needs to exist.

(I'm not arguing that- that is what the message is saying)

My current favorite insane/facetious conspiracy theory on this:

    "WARNING: Using TrueCrypt is not secure as ..."
     WARNING: Using TrueCrypt is (n)ot (s)ecure (a)s ...
                    TrueCrypt is (n)ot (s)ecure (a)s ...
                                 (n)ot (s)ecure (a)s

                    TrueCrypt is (n)   (s)      (a)

I love this game!

    "WARNING: Using TrueCrypt is not secure as ..."
    "WARNING: Using True(C)rypt (i)s not secure (a)s ..."

                        (C)     (i)             (a)

    "WARNING: Using TrueCrypt is not secure as ..."
    "WARNING: Using TrueCrypt i(s) n(o)t (s)ecure as ..."

                               (s)  (o)  (s)
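The letter-picking game above is just a subsequence search, and it's worth seeing how trivially it succeeds on any short phrase. A toy sketch (the helper name is made up, not from any of the posts):

```python
def contains_subsequence(text, letters):
    """True if the letters appear in order (not necessarily adjacent) in text."""
    it = iter(text.lower())
    # `ch in it` advances the iterator, so matches must occur left to right.
    return all(ch in it for ch in letters.lower())

phrase = "WARNING: Using TrueCrypt is not secure as"
for word in ("nsa", "cia", "sos"):
    print(word, contains_subsequence(phrase, word))  # all three print True
```

Almost any three-letter acronym can be "found" this way, which is the point the replies below are making.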

Not that I think it was intentional, but something that reads as ungrammatical, like the OP's version, with no gaps in the spacing, is a much stronger find than picking random letters from the words to prove a point.

It's grammatically correct, read the whole sentence. It just looks odd when you take half the sentence out of context.

Another interesting thing is that in the diff between 7.1a and 7.2, the author changed U.S. to United States several times. Some people think this is to draw attention to it being related to the US government.

But other people point out that Visual Studio changed its default name for America from the former to the latter.

>>> s="""If you have files encrypted by TrueCrypt on Linux:

Use any integrated support for encryption. Search available installation packages for words encryption and crypt, install any of the packages found and follow its documentation"""

>>> ''.join(a[0] for a in s.split())


That short string contains "beast of the apocalypse" [1] plus some other letters ('iuififwaifafi')

[1] https://en.wikipedia.org/wiki/The_Beast_(Revelation) - "The second beast is described in Revelation 13:11-18 and is also referred to as the false prophet."

When you harbor some pet assumption, the entire world conspires to verify your assumption.

More random tidbits of folk wisdom. This doesn't help us further along on any idea (for/against/neither). Merely a blank truism.

Well the very next sentence continues "This page exists only to help migrate existing data encrypted by TrueCrypt" which could easily be read as a prepared response to the person brandishing the NSL who is now annoyed that the authors have suddenly decided that right NOW is an 'entirely natural' time to end the project.

Obligatory Half-Life 3 is confirmed, 9/11 was an inside job, etc. etc.

Yeah, seriously, take off your tin-foil hat; why would the NSA be interested in an encryption product?

Oh, wait, it's entirely feasible.

This seems highly suspicious, especially the recommendation of BitLocker, a product we have little to no evidence does what it says and after PRISM, have no reason to trust[2]; not to mention it being limited to a (very small subset of) Windows platforms vs. TrueCrypt's cross-platform functionality. If this was legit[1], it'd probably be directing people to one of the other TrueCrypt-like programs.

[1]The new version posted is almost certainly compromised; don't download it, or at the very least, run it in a VM on non-networked hardware you can reimage when you're done.

[2]Edit: Forgot this before, but BitLocker is definitely completely broken as it sends your recovery key to MS anyway ( https://twitter.com/TheBlogPirate/status/471759810644283392/... ).

There's a lot of FUD in your statement there.

BitLocker in its "click click next" incarnation stores keys in the cloud, but it is fairly trivial to install it in a manner that uses the TPM or external media for key storage.

For example, NIST publishes guidelines for a FIPS-compliant BitLocker configuration that cover the different operating modes: http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140sp/1...

If you have the technical chops / willingness to download TrueCrypt, you should have the ability to spend 10 minutes googling for instructions on a customized configuration of BitLocker.

My point is that as it's closed source, we still don't know whether it sends the key to MS anyway (even if the user asks not to link it to their hotmail account). Given MS' complicity in PRISM, it's not a leap of trust I'm willing to make.

Truecrypt has been around for a decade, and only now is someone getting around to doing a real audit. The people behind Truecrypt are completely unknown, and may well be the NSA for all we know. So do you trust them?

I personally trust an open-source and audited system much more than a closed-source system shipped with Windows (apparently).

It isn't audited yet though.

That's a fairly trivial check to make. I would be fairly shocked to find out that people have discovered multiple sidechannel attacks against Bitlocker to leak keys out (if you have physical access to the machine while the disk is decrypted), and didn't manage to catch the fact that the key was getting sent over the network.

The more plausible potential for Bitlocker being broken is that there is some subtle flaw in their crypto implementation.

Bitlocker is also closed source. Truecrypt is (was?) open source.

A screenshot that says "We have the recovery key", but zero indication of how they got it? The previous slide could not possibly be something related to dumping RAM, could it?

Or perhaps an optional Microsoft-account feature to back up your encryption keys. Something that most normal users would want, just like they want it on Apple devices? Because a lot of common users aren't going to want FDE if it means "oh and lose your key and say goodbye to your data".

As far as I know, there's only been speculation on what PRISM is. Nothing that suggests it couldn't be a frontend to CALEA or warrant-based systems.

As far as I know, there's only been speculation on what PRISM is. Nothing that suggests it couldn't be a frontend to CALEA or warrant-based systems.

Subsequent Snowden releases made it clear that the NSA has many sources of information that are only "legal" because they said so, including intra-datacenter and international fiber taps, zero-day exploits, and physically modified equipment (see photos of network gear being intercepted and bugged).

In other words, this paragraph might have been reasonable a year ago, but is now grossly out of date.

None of that was part of PRISM.

Let's assume that when people say "PRISM" they mean "digital espionage", and when people say "NSA", they also mean CIA, NRO, FBI, DEA, etc.

No, BitLocker explicitly sends the key to MS for non-domain systems; as such, I would guess it potentially still does for those on a domain too, it'd just be kept quieter.


Bitlocker is not trustworthy as an overall method of FDE.

That's an optional feature -- it asks you if you want to backup your key to your online account.

So you know that if you click 'no', it definitely isn't sent anyway (perhaps with some "MS/NSA use only" bit set to distinguish it from user-accessible ones)?

Thought not.

You previously said it "definitely sends your recovery key to MS." Sounds like you don't actually mean that.

It's fine and perfectly reasonable not to trust closed-source code, but no reason to spread half-truths about it.

The very ability to send it to MS is worrying; doing it automatically is more so. If they were honest about the key, it'd say "put this on a flash drive/hardcopy in a safe deposit box".

Lift with your knees, not your back. Those goalposts are heavy.

You'd make an amazing PM.

"Hey, how should we deal with resetting people's passwords and keys when they forget them?"

"Tell them to get a safe deposit box"

"And when they're traveling or really need a report and the bank's closed?

"They shouldn't have lost their keys. Stupid lusers."

"We could just make it upload all the secret goodies to us for safe keeping."

"Hey, who are you and how did you get in this meeting?"

"I'm the new intern. From the CIA."

"Oh, okay, yeah, let's do that."

Your link says "There are several locations in which your BitLocker recovery key might have been saved." and then mentions "Your Microsoft account online.". Do you know under which conditions this happens?

At least when it is enabled in Windows 8 with a computer linked to a hotmail account, but I would never rule out other conditions too.

> after PRISM

People seem to keep forgetting this (I'm sure it's simply unintentional), but PRISM was and still is nothing much more than an automated warrant/NSL compliance system.

You're basically saying that Microsoft is complicit in divulging information in response to specific requests made under specific legislative authorities, which was standard hat since even before Smith v. Maryland.

well, I can see where you're coming from, but automation changes the nature.

License plates on cars weren't a big deal; they were primarily used to identify stolen cars and track drivers breaking the law. Then automation entered the picture, and it became feasible to track the movements of everyone, aggregate it in a huge database, and claim "they might be criminals later".

PRISM is more of the same. They could have compelled Microsoft to do this long before, of course, but PRISM is one of those compromise-everything initiatives. Meaning that even if the possibility existed before, it definitely exists now.

So it's not unjustified to bring it up.

But here automation is only automating the hard part (doing the collection correctly and in accordance with company policies).

Remember, with PRISM each and every request has to be approved by the company in question before it proceeds, which is still a manual step. So the license plate reader example doesn't apply directly; rather, it might be like a license plate reader that only works when activated by a remote magistrate, only for the one car permitted by that activation, but can continue scanning that one car's license plate from then on wherever it's seen in the city until the permission expires.

Huh. I thought they had to justify the query against PRISM's database, not justify the collection itself.

Are you sure it's the latter, and not the former?

Would this be a Lavabit-like situation? The government asking for a backdoor, and the developers refusing it.

Suddenly (while an audit is underway), they quit everything, change the assemblies and the website, so users can move to another product... It seems weird that after 10 years of hard work, they suddenly quit without further explanation.

No, there are big differences from Lavabit.

Lavabit was a service, TrueCrypt is a product.

Lavabit had access to all their customers' data, and told investigators that they had it. It's completely straightforward law that, given a subpoena, Lavabit must turn over evidence to the government.

TrueCrypt is a product. They do not have access to customer data. There is no requirement for TrueCrypt to "help out the government" in this case.

If you want to hang a conspiracy theory on this news[1], find some hook besides Lavabit.

[1] And I can't fault the conspiracy theorists for trying to find some explanation over this, because the damn thing is so weird and unusual.

You're correct to point out the useful distinction that TrueCrypt is a product.

But what makes you think U.S. law treats them any differently, assuming TrueCrypt's creators and maintainers can be identified?

Here's my article from 8 years ago talking about how the FBI was demanding that makers of certain products include backdoors for FedGov surveillance:

http://news.cnet.com/FBI-plans-new-Net-tapping-push/2100-102... The FBI has drafted sweeping legislation that would...force makers of networking gear to build in backdoors for eavesdropping... FBI Agent Barry Smith distributed the proposal at a private meeting last Friday with industry representatives and indicated it would be introduced by Sen. Mike DeWine, an Ohio Republican, according to two sources familiar with the meeting...

Your use of "demand" is misleading. Your own words at the time say "drafted sweeping legislation." Did that legislation pass?

Anyone can "draft legislation." I can draft legislation right now. That doesn't make it U.S. law. Getting it passed is the hard part.

Phone companies are required to enable wiretaps. But that happened through the public legislative process, and the legislation even lets the phone company bill the government for costs to comply. (Your linked article explicitly points out CALEA.)

We are talking about a government that has, in the recent past, sent nastygrams to people telling them that not only did they have to comply with the orders in the letter, but that it would be a crime to consult a lawyer about the letter.

So you, a non-lawyer developer, get one of these letters. You are pretty damn sure it is a bluff (didn't that clause in NSLs get shot down? Pretty sure I heard something about that... Something about Nicholas Merrill?). What if you are wrong though? What if this is a different kind of letter that you and the rest of the general public are currently unfamiliar with? What if the government has found a new way to create such a clause? Is "pretty damn sure" a high enough standard of sureness for you to call their bluff and talk to a lawyer anyway? How much do you value your freedom, and how much do you value your work?

Not being willing to call their bluff and contact a lawyer means that you are not able to question or interpret anything else in that letter as well. The best you can do is ask the government to interpret the letter for you, and tell you exactly what you need to do in order to comply.

The next best option is likely to burn what they want to the ground.

This is pretty much why I said "If you want to hang a conspiracy theory on this news[1], find some hook besides Lavabit."

Linking an abuse like you describe to Lavabit only harms developers, who if they were to receive such an illegal demand might remember "wait, Lavabit was required to install back doors, right? I guess I have to, as well!"

I'm not even talking about Lavabit. They have done this to others (it was unconstitutional at the time, but had not yet been declared as such). They could do it again. Only the most selfless person would be able to bring it to the public's attention.

Until the current regime is dismantled, we cannot rule out the possibility that these abuses are ongoing. To label it as a conspiracy theory is just shameless apologetics.

They have done this to others.

What's "this"? Is it "the USG compels vendors to install back doors into their software products they ship to others, under threat of jail time and/or fine and/or vacation at Gitmo"? To whom was this done?

NSLs are nasty in many ways. That doesn't mean they are nasty in any way you can imagine.

I'll repeat my question, which you ignored in favor of quibbling with a tangential point: What makes you think U.S. law treats makers of products any differently, assuming TrueCrypt's creators and maintainers can be identified?

If you want examples of FBI surveillance untethered to the law, we can provide those. Look at the video of the public forum I hosted with Ladar (of Lavabit) in SF last fall. Look at warrantless cell tracking, which I was the first to disclose circa 2006, and which is now the subject of significant litigation. Look at the warrantless use -- not just by the bureau but other police agencies as well -- of physical GPS tracking devices. How about surreptitious black bag jobs to install key loggers to extract PGP passphrases before this was authorized by the 2001 Patriot Act?

Here's another from last summer, which I was the first to disclose:

http://www.cnet.com/news/fbi-pressures-internet-providers-to... "The U.S. government is quietly pressuring telecommunications providers to install eavesdropping technology deep inside companies' internal networks to facilitate surveillance efforts..."

Huh! Where does the FBI get the legal authority to do that? Shouldn't, you know, Congress set the rules here after openly debating them in a public hearing?

Again, all these points are tangential to the question of FedGov product backdoors. (Note I'm expressing no opinion here about what's going on with TrueCrypt.) This survey I did in 2007 is probably worth repeating: http://news.cnet.com/Will+security+firms+detect+police+spywa...

I'm no longer doing this kind of reporting (and left to found the SF-area startup http://recent.io instead) but I hope someone tries to replicate it today with a broader set of companies.

I'll repeat my question, which you ignored in favor of quibbling with a tangential point: What makes you think U.S. law treats makers of products any differently, assuming TrueCrypt's creators and maintainers can be identified?

Something must be wrong, because this is 100% the question I believe I responded to. I will attempt to do so again now:

* Statute gives the government the right to compel certain service providers to actively assist in wiretapping. Example law: CALEA

* There is no U.S. law that gives the government the right to compel arbitrary third-parties to modify their products to make wiretapping easier.

You give a long list of bad things the USG has done, but none of them involve vendors being compelled to modify products.

(In another domain, banks have to report transactions over 10K, but that's completely the result of statute, the Bank Secrecy Act.)

> There is no U.S. law that gives the government the right to compel arbitrary third-parties to modify their products to make wiretapping easier

This is an interesting claim. It would be more interesting if the U.S. government publicly said its interpretation of the law is the same as yours. It has not. :)

> There is no requirement for TrueCrypt to "help out the government" in this case.

That's what the publicly available laws say, but America has secret interpretations of laws now. We know, for example, that every Internet service is, in theory, free to provide tools that would put user data out of reach of anyone, with or without a warrant. And yet, nobody has.

Nobody except Silent Circle, a new entrant built on the premise of providing truly secure communication, which has decided to domicile its company in Switzerland. So, what to make of all the CEO-level complaining, but no end-to-end encryption tools and no web-of-trust?

If a major Internet portal provided end-to-end secure mail, real-time communications, and secure storage we would know that, yes, there is no legal or extralegal obligation to keep us all naked in the panopticon. But so far all the indicators are in the wrong direction.

> publicly available law

If you have a business in America, you signed the Patriot Act. That's probably the law they are using for coöperation :)

This is something to be precise about. You won't find language in the USA PATRIOT Act that tells businesses they may not provide privacy and security tools for their customers. Similarly, CALEA requires access to carriers' networks, and equipment providers must support "lawful intercept" of a certain percentage of traffic, but there is nothing that would prevent networks from providing easy tools for end-to-end encryption and web-of-trust.

The fact that nobody is doing that is a kind of probe. Do we really live in a free country, or is pervasive monitoring a condition being imposed on us, with no choice of services that would prevent it?

Yes and recommending Bitlocker is how they are trying to tip everyone off that this message is compromised.

Reminiscent of Jeremiah Denton, the American prisoner of war in Vietnam, who was forced to appear before the cameras to say how well he was being treated - but used the opportunity to blink out "TORTURE" in Morse code, to place those words in context.


See also, POWs 'flipping the bird' in propaganda photographs: http://www.usspueblo.org/Prisoners/The_Digit_Affair.html

If so, I would at least have expected them to point people towards something that gives comparable security, rather than a backdoored product.

Suppose someone has a girlfriend who is a vegetarian chef. She was kidnapped. The kidnappers forced her to make a phone call saying that everything is fine. She says "don't forget to buy some spam". At least she could have recommended something healthy?

That's the point: by recommending BitLocker, they are communicating a message to us that they are under duress.

Truecrypt is dead, long live ChipCrypt: a Truecrypt fork with TRESOR and scrypt built in.

TRESOR is a technique that keeps the volume key strictly in the CPU registers and not in RAM. This completely prevents RAM-freezing (cold boot) and related attacks. A running computer that is locked can no longer be trivially decrypted by dumping its RAM.

Scrypt is a memory-hard key derivation function that makes even trivial passwords very hard to brute-force. An scrypt-derived key is roughly 20,000 times harder to crack than the equivalent PBKDF2-derived key of the same password.
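To make the comparison concrete, Python's standard hashlib exposes both KDFs. This is only a sketch with illustrative interactive-login parameters (not the ones TrueCrypt or any fork would necessarily use); the password and salt are made up:

```python
import hashlib
import os

password = b"correct horse battery staple"
salt = os.urandom(16)

# scrypt: memory-hard. With N=2**14, r=8, p=1 each guess forces the
# attacker to touch about 16 MB of RAM (128 * r * N bytes), which is
# what makes cheap massively-parallel cracking hardware ineffective.
scrypt_key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

# PBKDF2-HMAC-SHA256: compute-hard only, uses almost no memory,
# so it parallelizes cheaply on GPUs and ASICs.
pbkdf2_key = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000, dklen=32)

print(len(scrypt_key), len(pbkdf2_key))  # 32 32: both 256-bit keys
```

Both calls yield a 256-bit key suitable as a volume master key; the difference is purely in what an offline guessing attack costs.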

The TrueCrypt license is not GPL compatible but it allows redistribution in source form as long as the software is not called "TrueCrypt".

Who's up for it?

"TRESOR is a technique that keeps the volume key strictly in the CPU registers and not in RAM. This completely prevents RAM freezing and related attacks."

Until an interrupt happens, and the handler saves the registers on the stack. (sad trombone)

Actually, they use the internal CPU debug registers, which aren't automatically saved - in fact, their whole purpose is to trigger a context save when the hardware breakpoint is hit and invoke the debugger.

If breakpoints are disabled, using eight 1-bit flags contained in the registers, you have a whopping 6x32-8=184 bits of storage on x86 and 376 bits of storage on x64.
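Taking the comment's assumption (six usable debug registers, minus eight one-bit breakpoint-control flags) at face value, the arithmetic checks out:

```python
# Commenter's assumption: 6 usable debug registers, with 8 one-bit
# flags sacrificed to keep hardware breakpoints disabled.
REGS, FLAG_BITS = 6, 8

x86_bits = REGS * 32 - FLAG_BITS   # 32-bit registers
x64_bits = REGS * 64 - FLAG_BITS   # 64-bit registers

print(x86_bits, x64_bits)  # 184 376
```

Note the practical consequence: 376 bits is enough to hold a 256-bit AES key entirely on-chip on x64, while 184 bits on x86 is not.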

These two (TRESOR and scrypt) are incompatible, unless you allow for some time period where keys can be extracted. The point of scrypt is to be sequentially memory-hard: it mixes password and salt in a huge amount of RAM and makes it impossible to not use RAM for this operation.

I would say "complementary" not "incompatible". Once you have derived the key you can clear the RAM and keep the key on the CPU chip only. So there's a vulnerability window of a few seconds during which the key (or key related material) is stored in RAM, after which it can no longer be recovered.

In sharp contrast with regular TrueCrypt where the master volume key is stored in RAM all the time. It's already standard operating procedure for law enforcement to time raids when the computers are in use and make a dump of the machine RAM.

Agreed. Only doing operations involving transferring the key to RAM when the user is present should mitigate it, perhaps with a prominent WARNING dialog showing when memory is vulnerable, and overwriting the memory locations used for the key when done.

Truth be told, if RAM is a security liability for the few seconds it takes to enter the password and derive the key, then you can't give any security guarantees whatsoever no matter what algorithm you use, scrypt, PBKDF2 or any other.

It's not like you can punch the password into the CPU directly, you need to use some sort of input device which has drivers which keep state in RAM, use DMA and IRQs etc. Any attacker capable of reading RAM during this phase is also capable of sniffing the password characters as they are typed on the keyboard.

If we keep thinking in tiny slices of time, even a shoulder-surfing attack (with cameras, in the case of a passphrase :)) can be a problem... We are not talking about how to prevent a breach during the time the user is typing the password (a torture attack is still a valid one, and no technology will ever prevent it). But I like good ideas: once mounted, the password cannot be retrieved, as it's stored in the chip itself... I don't know if the test pins on those processors make this vulnerable, but it's worth a try. And if Windows goes into sleep/hibernate, you lose the password and have to mount again to restore operation. This is possible as we mount the drive as removable media.

I guess a stupid way to do that would be to prompt "This next step requires you to disconnect from the internet" and loop as long as the network is pingable or something, then prompt when it's safe to reconnect.

It would be stupid, yes :)

If someone has remote control of your computer, they'd install a sniffer that picks up the password and sends it back next time there's an internet connection.

"Who's up for it?" Neither Google (no hits) nor Wikipedia [0] knows of it. You might want to provide a link to the software's homepage, as well as additional information.

[0] https://en.wikipedia.org/wiki/Comparison_of_disk_encryption_...

"BitLocker, the proprietary disk encryption program that ships with every Windows version since Vista."

This is misleading - in the Windows 7 product line, BitLocker is available only in Ultimate and Enterprise. Even Windows 7 Professional users cannot use BitLocker without upgrading to Ultimate. Very unfortunate.

There has been a suggestion that the SourceForge post is a canary triggered in a "self-destruct" sequence. I am wondering if the suggestion to use BitLocker, which as you point out is not available to everyone, is also a signal.

I don't want to suggest reading into it something that doesn't exist, but why would the TrueCrypt people, if they are what we want them to be, suggest BitLocker of all things? No better alternative, and better than nothing??? I don't know how I feel about that.

Their "suggestions" for Linux and Mac OS seem just as fishy, almost as if done on purpose to raise suspicion and hint that something bad is going on:



> Very unfortunate.

I wouldn't actually say that, but I see what you mean.

TrueCrypt encrypted volume format is well known and there are tools out there that can create TrueCrypt volumes and open them.

There is tcplay[1]. This project can create and open TrueCrypt volumes.

There is cryptsetup[2]. This is a Linux-native solution for block device encryption and supports opening TrueCrypt volumes.

The above two projects are command-line based, and there is a third project called zuluCrypt[3] that gives a GUI front end to the two projects.

I am not aware of any alternative solutions on Windows or OS X that support the TrueCrypt encrypted format, but adding support for it should not be that hard.

This may be the end of the line for TrueCrypt as a project, but its encrypted volume format may still be used as a "universal cross-platform encrypted volume format".

Somebody should file a bug report in projects that deal with block device encryption on Windows and OS X and ask them to support this format, as I think the format should live on: it's the only one that is widely used and supported.

[1] https://github.com/bwalex/tc-play

[2] https://code.google.com/p/cryptsetup/

[3] https://code.google.com/p/zulucrypt/
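For anyone adding support for the format, the top-level layout is simple: a TrueCrypt volume starts with a 512-byte header whose first 64 bytes are the unencrypted KDF salt, with the remaining 448 bytes encrypted (the magic string "TRUE" only appears after successful decryption, which is how a candidate password is verified). A minimal sketch of carving out those two pieces (the full KDF and XTS decryption steps are omitted):

```python
import io

HEADER_SIZE = 512   # first sector of a TrueCrypt container
SALT_SIZE = 64      # unencrypted KDF salt at offset 0

def split_header(container) -> tuple:
    """Return (salt, encrypted_header) from the start of a volume."""
    header = container.read(HEADER_SIZE)
    if len(header) != HEADER_SIZE:
        raise ValueError("container too small for a TrueCrypt header")
    return header[:SALT_SIZE], header[SALT_SIZE:]

# In-memory stand-in for a real container file:
salt, enc_header = split_header(io.BytesIO(bytes(1024)))
print(len(salt), len(enc_header))  # 64 448
```

The same routine works on a real file opened with `open(path, "rb")`; tcplay and cryptsetup's TCRYPT mode implement the remaining key-derivation and decryption stages on top of exactly this split.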

Well, if we are going to speculate, I'll offer a guess: the crowd funded security audit made the developers lose their enthusiasm.

I believe I read in another thread that TrueCrypt did not get many donations. I'd be a bit depressed if I worked long and hard on a project that people seemed to appreciate, but not enough to crack open their wallets and toss a few bucks my way, and then some third party comes along and quickly raises $70k to audit my code.

Yep. And reading the pdf for phase 1 of the audit, worth about $40k, the findings didn't seem very impressive. Specifically the readability portion where they give a critique of naming conventions in the code. I could see the developers figuring for that money they could've done a lot more good with it.

I believe I read in another thread that TrueCrypt did not get many donations

Given the anonymity of the developer(s), how would anyone know this?

However, the downside of being anonymous is that it makes it hard to ask for donations (in places other than your website).

If, and it's a big if, the reason for them throwing in the towel like this was compensation, I think it would have been handled a lot of other ways.

My money is on an NSL and this was their way of telling us about it.

That also seems very possible.

Somebody who has been following TrueCrypt closely seems to think the project lost momentum and they just decided to call it quits. Their comment is on Slashdot and the link is: http://it.slashdot.org/comments.pl?sid=5212985&cid=47115785

Makes the most sense yet. The question that was asked in response to that Slashdot post was why anyone would choose to quit the way they did (unprofessional and so on), and this doesn't sway me. If I lost my star dev and couldn't follow the code myself I might (speculation, as ever) well be petulant enough for this mess. There may be any number of factors behind it, ranging from animosity within the team to fear or self-doubt.

I don't see this as particularly unprofessional or petulant.

The owner/manager of the project didn't want to keep maintaining it and is redirecting users to an alternative that will work for almost all use cases. As of last month, there were reported flaws in TrueCrypt and there's nothing that forces a maintainer of a free project to keep going.

I'm left almost a little annoyed that the conversation isn't "RIP TrueCrypt project, thank you goes out to the maintainers for helping people feel more secure for years."

You can't just post an announcement like that in the current climate of suspicion and expect everyone to just ignore the possible implications.

Alright, I'll play. How long of an explanation are the users of a free service entitled to receive before the maintainer can happily go his/her way without follow-ups? Development of Truecrypt stopped, and as it's open-source, someone else could keep hacking away on it.

Entitled to receive? None. Read my post again. I'm saying that they should have expected that the announcement would be received with suspicion.

Not saying it is (unprofessional or petulant). I'm saying these are but two possibilities. Of many.


"Sorry. This URL has been excluded from the Wayback Machine." :-)

That's funny. The normal technique for telling the Wayback Machine not to archive would result in the message "Page cannot be crawled or displayed due to robots.txt.". How do you get "excluded" this way?

I don't know about the message, but you can ask for your site to be removed from the Wayback Machine.

"[Matthew] Green last year helped spearhead dual crowdfunding efforts to raise money for a full-scale, professional security audit of the software."

"'I think the TrueCrypt team did this,' Green said in a phone interview. 'They decided to quit and this is their signature way of doing it.'"

"I'm a little worried that the fact we were doing an audit of the crypto might have made them decide to call it quits."

That is nonsense. The TrueCrypt developers turned over code and assisted in the initial audit. iSEC found no serious issues.

Granted, they only evaluated the bootloader under the first contract, but if someone were going to slip in a backdoor, or if a serious crypto bypass were possible, it would likely have been there.

Which TrueCrypt developers? AFAIK we don't know them yet … and why was it necessary to turn over code for an alleged open source project?

iSEC found no serious issues, but that was only phase 1 of the audit.

It wasn't "Open Source™." The source, however, was available, and you could compile it yourself if you wished.

(It was somewhat unusual to compile on Linux, since it started as a Windows GUI project and was then ported to Linux.)

The project was in contact with TrueCrypt developers (someone who could sign messages using the release keys, and provide images of the build machine, which is otherwise nearly impossible to reproduce).

They reviewed the bootloader which is one of the most complex parts and touched all the crypto and found no serious issues. That suggests that the rest of the mundane code is probably in pretty good shape.

Of course it is not a guarantee that things are perfect, but it suggests that the developers a) knew what they were doing b) had no issues with an external audit.

If the audit made them call it quits, then having them cough up the source and abandon the project for someone else to pick up is the perfect outcome.

We already have the source.

We didn't before, though, did we? Or am I mistaken about that?

The source tarball was always available for download on truecrypt.org.

I think that's why they are quitting: they didn't want the audit to find something. But it's just speculation like any other.

It's more likely that they were angry that the audit got a lot of funds and they didn't.

In OSS often the people who do the original work get nothing and all the money goes to pundits, packagers, and consultants.

If that were true, and they were so sure of the quality of their code, then they'd keep going, wait for the all-clear, and say: "Look, we've been doing this for 10 years, our system is now independently audited, will you please support us..."

I suspect that would have brought in a few dollars in the current climate.

That might make sense in a world of perfectly rational unemotional robots.

In the real world, if you worked for years trying to make people safe, and you felt (correctly or not) that you were being disrespected while others were being respected for picking at your nits, you might say "fine, fuck you all, have fun," too.

To be clear, I don't know what's going on. A "rage quit" is the most likely scenario IMHO, but this is all very weird.

EDIT: On reflection, a rage quit fits my priors, which say "there's no such thing as free-as-in-beer[1] software." So you and I should both be a little skeptical when I find it the likely explanation.

[1] It does exist, but it's the exception, and each project where it works has its own particular quirks that make it work. The successful ones typically rely on someone having a particular and unusual mental setup that doesn't mind free loaders. Stallman and de Raadt develop software for their own use, and the rest of us can use it, too.

Off-topic, but considering the sheer amount of stuff on GitHub, I'm not sure it's about free loaders - whether or not it's useful, the fact that so much code is published demonstrates that many people are okay with it being used by others.

They are humans, not all of whom are objectivists. In fact, Matthew Green himself has mentioned the possibility of insulting the developers:


  Aren't you worried you'll insult the Truecrypt developers?

  I sure hope not, since we're all after the same thing.
  Remember, our goal isn't to find some mythical back door
  in Truecrypt, but rather, to wipe away any doubt people
  have about the security of this tool.

  But perhaps this will tick people off. And if you're one
  of the developers and you find that you're ticked, I'll
  tell you exactly how to get back at us. Up your game.
  Beat us to the punch and make us all look like fools.
  We'll thank you for it.

The sad state of OSS is that it's much more profitable to find security holes, fork existing good code bases, write articles, write blogs, teach, go on stage in conferences than to do some original work.

I wonder what would happen if all original workers united and did the same as the Truecrypt people...

I object to the implicit criticism of the value of the audit here: independent audits drastically increase the value of security systems, because having someone without an existing mental model scrutinize code is a very good way to catch bugs, so users correctly consider an audited system more trustworthy. The audit is not more valuable than the original code, but it is much more valuable than the types of derivative work you mentioned.

the "current climate" is one where security defects lead to highly visible public witch hunts and shaming and then the forking of code (see: OpenSSL). OpenSSL only just today got funding and an additional two developers. Despite pretty much the entire world using and depending on it for years now. Despite the fact that it's open source and anyone anywhere could have taken the time to conduct an audit.

If the system works fine, and the auditors are paid and they have certified that the system works, where is the need for any more financial support? I agree that the work should be recognized, but there is no need to pay to support work that has already been completed, as anyone who's ever applied for a research grant already knows.

Not just OSS, though, if you read the history of new products or discoveries the same pattern regularly occurs.

Just that there are no known 'people who do the original work' in the case of TrueCrypt.

I doubt that's the reason - Green says that the audit is going ahead. The other fact that makes this unlikely is that Green's team has the source, so his team can poke around all it likes anyway. Ergo, quitting is no guarantee that something won't be found.

But quitting and hence most likely ending the project will substantially increase the chance that Green decides spending the rest of the money auditing it is a waste of cash.

Are there any decent alternatives to TrueCrypt for Windows that aren't Bitlocker?


I just want to point out that TrueCrypt is open(ish) source and the license was modified in the latest update in such a way that could be interpreted as allowing forks (and it's unlikely that the TrueCrypt devs would ever actually deanonymize to enforce any license violations anyway). Considering that we've all been happy to keep using TrueCrypt for these last 2 years without even minor updates and the Phase II audit is going on as planned, I think it's a bit premature to be looking into alternatives at this point. I would give it a better than average chance of being forked, and it's already stable, well-tested, cross-platform software that can be used in the meantime. There are mirrors of version 7.1a available to anyone who didn't already have TrueCrypt installed.

DiskCryptor (GPLv3) - https://diskcryptor.net/wiki/Main_Page

The author seems to be an anonymous Russian-speaking person.

Another option is FreeOTFE - http://sourceforge.net/projects/freeotfe.mirror/

It is compatible with Linux's LUKS and dm-crypt, but it doesn't support system partition encryption.

FreeOTFE (which was developed by a woman named Sarah Dean... who by the way also had a large TrueCrypt archive) is no longer developed. She stopped responding to email about a year ago and her domains went away too.


The Way Back Machine has an old copy of her FreeOTFE website (freeotfe.org): https://web.archive.org/web/20130531062457/http://freeotfe.o...

How about AxCrypt for file encryption?


(I phrase this as a question because it'd be great if we could have some HN skepticism on this thing. Personally, I think everything basically checks out: open source, free, there's a name, phone number, address, picture etc.)

I worked on a small software package for a financial firm that used AxCrypt for encryption. It wasn't bad from a program integration perspective. I can't personally verify the cryptographic security of it. Like codeulike said, it's a per-file based encryption. No virtual disk services.

Looks like its just a 'right-click ... encrypt this file' sort of thing. Doesn't appear to do whole disk encryption or encrypted virtual drives.

Yep, that's why I mentioned file encryption specifically. :)

My use case is wanting to have an extra layer of paranoia before I upload anything important to the cloud.

AxCrypt looks solid for your purpose. Obviously if you are whoever it is who replaced Bin Laden you can't even trust your keyboard -- it might be reporting what you type to the NSA -- but for ordinary users it is a good choice. Plus, it is mature and unlikely to destroy your data through a bug.

For non-full-disk encryption I just make AES-encrypted files using 7zip. Considering just about everyone has 7zip installed, it's actually less of a pain in the ass than you'd think.

The only downside I see is that 7zip seems to be almost abandonware at this point. The installer linked at the top of their page is almost 4 years old and there's a recent beta but they haven't moved a beta to stable in a very long time.

Is there a good way to mount an encrypted 7z archive as a filesystem, for interactive usage?

That was the advantage of TrueCrypt, IMO. Not for FDE (I'm not a huge fan of FDE anyway), but because it created an interactive partition rather than forcing you to decrypt an entire archive to storage, work on it, and then re-encrypt the whole thing and cleanse the storage device you decrypted to.

It seems like putting something together that does that (which is really decrypt-to-RAM) wouldn't be that hard, but there are like a million file encryption tools and very few that allow for interactivity.

It looks like WinArchiver can mount 7z and other archive formats. There's not much documentation about it. It does not say if it supports mounting encrypted archives. It does not say if it supports write-access either. The 7z format is normally solid, so unless you're accessing files at the beginning of the archive, it could be pretty slow anyway.

I wrote a blog posting with similar products to TC:


Nothing really turn-key, but you can mix and match a couple of different products and get the same results.

https://www.deslock.com/ Proprietary but I used their free stuff for years

Well, this is good for me. I currently use a TrueCrypt-encrypted exFAT volume for backups. My motivation is now to move this to an open source system (probably dm-crypt). This and RDP are the only reasons I'm hanging onto Windows, and that's purely out of apathy. The suggestion of using BitLocker is a bit insulting (this might just be comedy value from TC though). Every other bit of software I use is portable or in a Linux VM already.

So my weekend project is now to move all my stuff to Debian.

RDP: Is there anything freerdp cannot do that you need from day to day? I'm working with Windows, but run Linux without any issues so far.

We have an RDP gateway which is an awful pile of shit to deal with as it uses HTTPS initiated MSRPC as the transport layer. FreeRDP doesn't work properly with that yet as there are all sorts of odd configuration and encryption things that are almost impossible to line up properly when your ops team don't actually know what they're doing or how to find out stuff for you.

Ugh; I remember having that problem a couple of years ago. In the end I had to split a room of thin clients in half, and have them each point at 1 of 2 hosts. Defeating the point of load balancing entirely :/

There's an RDP client under Linux, and you can use nx as a server, it's nearly as efficient as RDP.

I'm connecting to a Windows machine from my Linux host rather than the other way round. I can connect to the Linux machine quite happily with SSH as everything I do on them is on the terminal anyway (apart from a few things which I don't want to do remotely anyway).

Here's my theory (step-by-step):

1. Truecrypt is a gigantic pain-in-the-side for US intelligence agencies.

2. Intelligence agencies brainstorm about the best way to deal with the situation.

3. Taking over and tampering with the current code is deemed unrealistic. The user base of Truecrypt is very sophisticated and even minor changes to the source code would be scrutinized.

4. "How can we get people to stop using Truecrypt?" "We can discredit the project - get people to voluntarily stop using it because they don't trust it".

Your (1) partly fails because they'd just toss you in jail until you hand over the key. If they think you're a terrorist, that jail might be overseas with no access to lawyers. If they think you're a paedophile, they'll just leak that info (and thus your life is destroyed).

Also, "Truecrypt properly used is a gigantic pain" and although I have nothing to support it I reckon many people use it incorrectly. Has anyone done any research?

See "deanonymizing alt.anonymous.messages" for examples of people doing crypto wrong.

Or they could simply use rubber-hose cryptanalysis.


Actually I disagree. The NSA is all about spying. If they can't decrypt what you do without going to you and asking you for the keys (or throwing you in jail) then I would say it -is- a major pain for them. Remember we're talking about an agency who routinely targets one person in the hope to find dirt on others.

On the other hand, the NSA is all about spying. If they can't encrypt their data and keep it secret, they can never get the upper hand.

(The most valuable information is info your enemy doesn't know you know. Information your enemy knows you know is not as powerful)

Previous discussion from earlier today: https://news.ycombinator.com/item?id=7812133

Out of curiosity, wouldn't the open-source TrueCrypt be better than the closed BitLocker? (assuming, of course, that TrueCrypt was not already compromised)

I don't believe BitLocker can be considered secure. Cross-post (mine) from the previous thread:

> Not "backdoors" but it seems Microsoft stores the BitLocker decryption key, based on this leaked slide (just found on twitter): https://twitter.com/TheBlogPirate/status/471759810644283392/...

> edit: Confirmed Microsoft stores your recovery key on their servers if you're not connected to a domain: http://windows.microsoft.com/en-AU/windows-8/bitlocker-recov...

> Not "backdoors" but it seems Microsoft stores the BitLocker decryption key, based on this leaked slide (just found on twitter): https://twitter.com/TheBlogPirate/status/471759810644283392/....

That looks dubious to me - if you were creating a slide, would you use 90-degree-rotated text?

> edit: Confirmed Microsoft stores your recovery key on their servers if you're not connected to a domain: http://windows.microsoft.com/en-AU/windows-8/bitlocker-recov....

"There are several locations in which your BitLocker recovery key might have been saved" - emphasis mine - doesn't this mean they'll store the key if you log into "Your Microsoft account"? I have to admit I don't use Windows 8 so I don't know if this is mandatory or not? What about Windows 7?

Previous poster's confirmation is misleading -- it does not appear to be mandatory. I just went through the process on my Windows 8 machine, and you can choose to save the recovery key to a file or print it out in lieu of storing it with your Microsoft Account (http://imgur.com/J0zk6I5).

Two caveats:

(1) It's possible Microsoft could choose to store the keys online regardless of what the user picks, but it's certainly not their official stance.

(2) Microsoft does automatically store keys online if you're using "Device Encryption" in Windows 8.1 (http://arstechnica.com/information-technology/2013/10/window...). This uses BitLocker code but is distinct from using BitLocker itself -- i.e. if you do a vanilla BitLocker encryption, your system should not send the keys to MS without a user (or admin) explicitly telling it to.

Using a Microsoft account for login is not mandatory on Windows 8.

It's not mandatory to use a Microsoft account for login, but they've been hiding the option not to in increasingly obscure locations in the installer. At one point, the easiest way to do it was apparently to just unplug your internet connection!

Maybe Microsoft asks or somehow prompts the user to back up their encryption keys? I mean, a user-focused OS certainly wouldn't include such a feature, right?

The question was not whether or not they might do it but whether or not they always did it. If a user is required to submit their encryption key to a third party for storage then the encryption solution is limited by their trust in the third party.

In the Greenwald book (No Place to Hide) he describes how Microsoft made a real effort with Skype and their Dropbox clone to open them up to the NSA.

Now I have read these revelations, I feel that not trusting Microsoft is a sensible default.

> Now I have read these revelations, I feel that not trusting Microsoft is a sensible default.

You make it sound as if trusting Microsoft was ever an option to begin with...

There are pros and cons to both closed source and open source. Open source is nice because the community can audit the code and see for themselves, but closed-source is nice because a company generally has the resources to maintain and build software correctly.

Both of these are hypothetical, however. We've seen tons of vulnerabilities from both. IMHO open source works a lot better on paper, but once projects get very large, auditing them is really hard - which definitely cuts down on the number of eyes looking at them.

>>closed-source is nice because a company generally has the resources to maintain and build software correctly.

These two things are orthogonal, IMHO.

My post is "on paper". In reality, projects are squeezed for nickels and dimes.

My point is that while open-source should be better, right now it seems that everything is equally not good enough.

Surely it would also take very little effort to implement an alternative to TrueCrypt? What's the big deal?

If you are being sarcastic, you should be aware of Poe's law, which means we can't tell.

Security is hard. Encryption is hard. And don't call me Shirley!

Well, implement one. While getting the encryption right is possible, there's the pesky problem of presenting the decrypted data to Windows as a volume that works as well.

Well, peer-reviewed encryption libraries would take care of a huge proportion of that.
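(For what it's worth, key derivation is exactly the kind of thing a vetted primitive handles for you. A minimal sketch using Python's stdlib PBKDF2 - the salt size and iteration count here are illustrative placeholders, not TrueCrypt's actual parameters:)

```python
import hashlib
import os

def derive_header_key(password: bytes, salt: bytes,
                      iterations: int = 100_000) -> bytes:
    """Derive a 64-byte volume-header key from a password.

    Uses PBKDF2-HMAC-SHA512 from the stdlib. The iteration count
    and salt length are illustrative, not TrueCrypt's real values.
    """
    return hashlib.pbkdf2_hmac("sha512", password, salt, iterations, dklen=64)

salt = os.urandom(64)  # a real container stores this in the clear in its header
key = derive_header_key(b"correct horse battery staple", salt)
print(len(key))  # 64
```

The point being: the hard, easy-to-botch part (the KDF, the cipher) is a few lines against a reviewed library; the part that's actually a lot of work is the volume plumbing the parent comment mentions.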

http://dokan-dev.net/en/ might work on Windows for implementing something you might otherwise do via FUSE. Can't speak for Windows, as I've not used it for a decade or more.

I do believe that open-source solutions are better when concerned about privacy and security. But why would a user of a mostly closed-source OS bother about the open-source nature of one component of that OS?

That would be my opinion, but then I'd probably bet everything I own on BitLocker containing a backdoor of some sort.

If the TrueCrypt devs are done with the project, is there anything legally preventing others from restarting development on it? I know the software had an oddball license that wasn't always well-received.

My point here being, with the source being available, why do we need to assume that TrueCrypt is history, other than perhaps a lack of people willing to work on it (which I assume could change now that there is an immediate need for some)?

Sounds like what happened to LavaBit (some sort of gov't pressure).

When you know literally nothing, it doesn't sound like anything.

E4M - Encryption For the Masses is free software that TrueCrypt was based on. It's free to fork as well.


Let's just fork the sucker and call it a day. The developer is anonymous, and if he wanted to stop people he would have to reveal his identity, and that is not going to happen.

Is this a warrant canary?

A warrant is how the government compels you to turn over evidence you have. TrueCrypt does not have any access to its users' keys or data.

If we really really stretch, TrueCrypt had server logs that would show who downloaded it that they might be compelled to turn over. This would be an overreaction to such a request.


Reading through the posted diff, a couple of things stood out to me.

1. The release date string went from "February 7, 2012" in 7.1a to "5/2014" in 7.2. Might be nothing, but it made me wonder if someone other than the original author changed it due to the date style change - I'd've expected it to be changed to "February 2014".

I also wonder why no specific day was given - makes me wonder if the release was automated and the author didn't know exactly when it would happen (possibly a dead man's switch triggered it?). Again, could be nothing.

2. Pretty much every reference to truecrypt.org has been removed - even the licence now states "Your Product [...] must not present any Internet address containing the domain name truecrypt" (instead of truecrypt.org), and there is no requirement to link to it anymore. It might just be a change in licencing stance to encourage forks, or, if the release was made under duress (NSL/threats/blackmail etc.), it might be a way to try and signal that truecrypt.org can no longer be trusted.

Edit: Something is bugging me about this line on the site: "The development of TrueCrypt was ended in 5/2014 after Microsoft terminated support of Windows XP." It might just be how the author writes, but my reaction on reading "was ended" was that something external forced it to stop rather than it being a choice.

Also, why mention XP's EOL? The message doesn't say support for TC stopped _because_ of EOL, just after, and I can't think why the end of XP support would affect TC greatly.

Maybe it is because Bruce Schneier uses and recommends it: https://www.schneier.com/cgi-bin/mt/mt-search.cgi?tag=TrueCr...

Any opinion on "Tomb"? http://www.dyne.org/software/tomb/ It tries to be a nice LUKS wrapper with container and key files.

I'm not particularly looking forward to a slew of poorly coded alternatives to TC, or to endless discussions about whether or not something actually is FDE.

Maybe the developers were Americans and they decided to bail before they get caught for exporting cryptographic software.

> Maybe the developers were Americans and they decided to bail before they get caught for exporting cryptographic software.

...or shutting down TrueCrypt was part of the plea-bargain. Either that or gitmo one way from the land of the formerly free.


Wasn't that settled in like 1996?

From Wikipedia [0]:

As of 2009, non-military cryptography exports from the U.S. are controlled by the Department of Commerce's Bureau of Industry and Security.[9] Some restrictions still exist, even for mass market products, particularly with regard to export to "rogue states" and terrorist organizations. Militarized encryption equipment, TEMPEST-approved electronics, custom cryptographic software, and even cryptographic consulting services still require an export license[9](pp. 6–7). Furthermore, encryption registration with the BIS is required for the export of "mass market encryption commodities, software and components with encryption exceeding 64 bits" (75 F.R. 36494). In addition, other items require a one-time review by or notification to BIS prior to export to most countries.[9] For instance, the BIS must be notified before open-source cryptographic software is made publicly available on the Internet, though no review is required.[10] Export regulations have been relaxed from pre-1996 standards, but are still complex.[9] Other countries, notably those participating in the Wassenaar Arrangement,[11] have similar restrictions.[12]

[0] http://en.wikipedia.org/wiki/Export_of_cryptography_from_the...

[9] http://www.access.gpo.gov/bis/ear/pdf/ccl5-pt2.pdf

[10] http://www.bis.doc.gov/encryption/pubavailencsourcecodenofif...

[11] http://www.wassenaar.org/participants/index.html

[12] http://www.wassenaar.org/guidelines/docs/Initial%20Elements%...

I can't think of a reason why they'd remove sources for earlier versions from SF. I mean, in this day and age somebody is going to upload them again to the internet.

Considering the licensing, it's likely that the developers wanted to keep development to themselves and never turn it over to someone else. If the ragequit theory is correct, it's fairly reasonable for them to remove the previous versions to make a fork slightly more difficult, especially since it would be an illegitimate fork.

It doesn't add up. You can't (easily) enforce licensing while remaining anonymous, so why bother with removing the earlier versions?

If ragequitting, why bother with making a new release instead of just removing _everything_ and changing the webpage? (Presumably, you already have a copy of the SW if you have encrypted volumes.)

Also, note "you _should_ migrate data". Could this imply that cold storage is not secure?

The kind of person I can see maintaining Truecrypt for a decade I can also see making this decision. They're upset and are completely done with the project. They take down all the old versions, knowing that the licensing means they're largely useless for purposes of forking (unless you want to violate the licensing, which causes questions to the validity of the fork). Despite their frustration with the project, they still deeply care about crypto and keeping their users secure, so they publish some brief recommendations on alternatives, and release a read-only version of their software to support it.

There are any number of possible vulnerabilities that could exist. It's definitely a plausible possibility. I could fairly easily believe that they were contacted by a researcher who was about to publish a major flaw in AES, or in one of the other algorithms in use.

There's a number of relatively plausible theories. I wouldn't be surprised if we don't find out for 20 years what the actual reason for this was, when the developer is on their deathbed.

Wasn't one of TrueCrypt's selling points its ability to provide 'plausible deniability'? That's not so common in other crypto products.

Would it be feasible to use encrypted docker containers as a cross-platform encrypted container solution?

How would that work on Windows? Run Linux in a VM and share over Samba?

So what's the (Windows compatible and open source) software to transition systems to?

As far as I am aware, there isn't one offering the same functionality.

Which is what makes it scary, because people all over the world are now left with much less user-friendly choices for encryption.

Even if there isn't one, I guess you can run Linux in a VM, mount encrypted volumes from it, and export them as a network share.

Now that this party is over, does anyone know any wrappers for gpg-zip offering some of TrueCrypt's convenience, at least for the limited case of keeping directories conveniently encrypted and usable?
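gpg-zip itself (a tar-through-gpg wrapper that ships with GnuPG; newer releases call it gpgtar) gets most of the way there. As a sketch, a hypothetical thin Python wrapper that just assembles the command lines - the recipient and paths are made-up examples, and actually running it requires gpg-zip on your PATH:

```python
import subprocess

def encrypt_cmd(directory, output, recipient):
    # gpg-zip tars the directory and pipes the archive through gpg;
    # --gpg-args forwards extra options (here the recipient) to gpg itself.
    return ["gpg-zip", "--encrypt", "--output", output,
            "--gpg-args", "-r " + recipient, directory]

def decrypt_cmd(archive):
    # Decrypts and untars into the current directory.
    return ["gpg-zip", "--decrypt", archive]

def run(cmd):
    # Fails loudly if gpg-zip is missing or gpg rejects the recipient key.
    subprocess.run(cmd, check=True)

print(encrypt_cmd("notes/", "notes.gpg", "alice@example.org"))
```

It's nowhere near TrueCrypt's mount-as-a-drive workflow, though - you get encrypt-whole-directory / decrypt-whole-directory, nothing in between.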

Is there any reason this cannot just be forked?

The license.

Perhaps TrueCrypt was an NSA scam all along, 'retiring' before they're found out? Worse still, a Russian/Chinese scam!

Why is that worse?

America has the rule of law, and the NSA had to go rogue to do what it did; once their program was outed it was all over the news and everyone was discussing it. Americans enjoy real rights.

If China or Russia had the same capabilities, there would not be even a theoretical backlash against this being discovered, since far less freedom is par for the course there. The protection isn't even promised in theory.

For the same reason, Google (with its nominal "don't be evil" mission) turning out to collect wifi data with its Street View cars is not as bad as if Microsoft (with no such policy or mission) had done the same thing.

It might be limited in its actions against Americans but that doesn't make me feel any better.

Also, the program may have been outed, but it looks like some people are keen to keep it running and possibly make it actually legal.

(Oh, I was just answering your literal question, very narrowly.)

Fair enough. You weren't wrong.

Why isn't BitLocker open source? If the new CEO wants to show he's serious about user privacy, I think opening up BitLocker and letting everyone look inside would be a great start.

One of the reasons I like iPhone is the idea that the security system and drive encryption is not hopelessly broken. It would be great to have the same level of confidence in BitLocker.

> One of the reasons I like iPhone is the idea that the security system and drive encryption is not hopelessly broken.

Where can you get the source of that?

There is no source for that. Data on iOS devices is not even fully encrypted, and is accessible to law enforcement agencies, etc.

iPhone security system is open-source?
