I've been a long-time TC user, and if there's one trait it has, it's quality and a high degree of polish. Now, looking at the diff and the screenshot of that in-app "Not secure" message, the polish just isn't there. It feels like something slapped together in a rush, or by someone who isn't an original developer. The SF page alone is a big red flag. Compare its nearly hysterical tone and ridiculous BitLocker advice to the tone and content of the actual app, and they don't add up at all.
This leaves us with a handful of discrepancies between the last good state of the project and what's out there now. So it's either someone else's hack job, or it's original and the discrepancies are intentional. Factor in the .exe signature match, and that pretty much leaves only the latter option: the original devs made an absurdly non-TC-like release. The question is "why?"
Can't say I blame them. I've had the feds show up where I lived once and I nearly shit a brick when I realized what was going on. You never want the weight of the government beating down on your back.
But anyway, thoughts on alternatives? DiskCryptor (http://en.wikipedia.org/wiki/DiskCryptor) sounds like a nice one, and it's FLOSS. Thoughts on it?
The same was true for OpenSSL a few weeks ago.
One of the most plausible theories is that the TrueCrypt developers found a gaping security hole (à la OpenSSL) and realised that releasing a fix for it would reveal the bug and compromise every TrueCrypt partition in existence, so they chose to kill the project rather than risk the safety of all of that currently encrypted data.
If that scenario is true, there's a big rush to switch to something else before the bug is discovered by other researchers.
I donated to them at least twice via PayPal. How anonymous could they be if they received funds via PayPal? Not very, in this day and age. I would imagine it's trivial for the US government to find their true names based on this fact alone.
It could be that the developer(s) want to get out anyway. The fact that they've been doing it for so long indicates that they've got some strong feelings about privacy.
Perhaps they're staging the whole thing as their way out, to make a political statement, pushing us further to do something about the government's stand on privacy.
It's not even like support means anything, other than that they will no longer improve or fix it, i.e., there's still time to migrate away as XP degrades. It says nothing about whether XP is secure in and of itself.
This whole incident is about as weird as weird gets.
I saw a post that suggested it might be a canary, i.e., an event that must be interpreted as a certain action having taken place: a negative message, an absence of an indication that everything is OK. But that also seems odd, since I'm not sure that the TrueCrypt people, if they were who we all want to believe they were, would suggest using BitLocker. BitLocker??? That suggestion alone smells like rotten fish, just by its association with MS and the US government.
You should put money on the fact that it really was someone associated with certain ever increasingly fascist governments of, likely the USA or Israel, that compromised TrueCrypt in a way that set off an "auto-destruct" sequence. It's probably a reaction to the Snowden compromise, with increased funding and efforts to regain domination that he exposed by becoming even more totalitarian through an "Operation Kristallnacht" sprint against civilian institutions.
He needed to get some older version of Visual Studio and a very specific combination of service packs and updates in order to get to matching (nearly) the entire binary.
Could it be that the devs really wanted to keep developing in XP because some part of their dev chain required it, but with support being discontinued they of course couldn't run XP and consider that computer "secure" to develop on?
Obviously, if this theory has some grain of truth to it, it can only be part of the explanation for the TrueCrypt weirdness that's been going on today.
If I had to bet on it, I would say it's most likely they got Lavabitten. But then still, the XP remark seems like a very odd choice if it's intended to function as a canary of sorts.
Unless, of course, something more shadowy is going on.
This seems unlikely: the screenshots in your link clearly show the source could be built on Windows 7. The trouble Xavier mentions is with the updates to VS2008 SP1, not Windows service packs.
That's the canary.
Not only is BitLocker backdoored, and most TrueCrypt users would know that, but they also say it is available on all Windows versions, which is emphatically not the case.
In this situation, a warrant canary (or maybe an NSL canary, as the case might be) is a subtle indication that everything is no longer in a known-good state and something has been compromised.
Although given some of the comments by danielweber, I'm increasingly less inclined to believe this was a canary. Developers throwing in the towel might be a more plausible situation, no matter how disappointing that may be.
Maybe they migrated away from XP and have assurance that Bitlocker is safe for them, or switched to something entirely different and have no interest in or funding for developing it anymore...
The Bitlocker recommendation does seem strange. But when you look around the Windows ecosystem, there isn't much else that could be recommended. What would you recommend Windows people use, other than Bitlocker?
2. Warrant canary makes no sense. There is nothing for a warrant to grab.
3. Forking is legally troublesome. Just because you can see the source doesn't mean you can distribute the source.
I had a look at the diff; I have neither the time nor crypto-specific skill to do an audit, but there are plenty of code changes that aren't warnings in there.
>2. Warrant canary makes no sense. There is nothing for a warrant to grab.
No, but they could have theoretically been asked to introduce a backdoor from a Lavabit-style order.
>3. Forking is legally troublesome. Just because you can see the source doesn't mean you can distribute the source.
Do you think that will stop people? Even if there is some legal issue, the Streisand Effect always overcomes it. So what if a TrueCrypt fork can't use Github because of that issue? Is the world suddenly lacking good hosting providers, perhaps in Switzerland or similar? Has everyone forgotten how to set up a public git repo themselves? Somehow, I think not. So what if many devs will probably have to contribute anonymously? With a product such as TrueCrypt, they probably should anyway.
This makes no sense. Lavabit was compelled to turn over evidence it told the government it had, which is straightforward law. There is nothing "Lavabit-style" about "distribute a back door or else." You would need explicit legislation to allow that. If the developer's cat was kidnapped to force him to put in a backdoor, there is nothing "Lavabit-style" about that.
You do hedge with the word "theoretically," but "theoretically" this could be a message from the aliens.
Do you think that will stop people?
It will stop the smart people, and you need smart people to work on it. Otherwise you would only be able to distribute it by illicit means, and you could never trust it. It would be as trustworthy as warez sites. Why even bother with that nonsense?
Much saner to just build something new, having learned from TrueCrypt's experiences. For example, something that starts from the command line and then has a GUI on top, instead of the reverse.
No legislation authorizes that program, among many others. I think what the parent was referring to as "Lavabit-style" was the "compromise your users or face legal action" move. Turning over customer data or introducing a backdoor are both means to the same end, as far as the NSA is concerned.
The NSA inserting backdoors into the product without the cooperation or maybe even knowledge of the vendor -- while troubling for any number of reasons -- is vastly different. Especially when giving advice to developers about how to stay in bounds with the law. If the metagame becomes "the government can legally force you to insert backdoors into your product" any developer faced with this threat might believe it, when he should know it's bunk.
RSA allegedly got paid $10 million to make a change the NSA wanted. RSA customers should demand an answer, but that's not forcing RSA. People get paid to do things all the time.
>Hushmail stated that the Java version is also vulnerable, in that they may be compelled to deliver a compromised java applet to a user.
>Hushmail turned over cleartext copies of private email messages associated with several addresses at the request of law enforcement agencies under a Mutual Legal Assistance Treaty with the United States; e.g., in the case of U.S. v. Tyler Stumbo.
>if a court order has been issued by the Supreme Court of British Columbia compelling us to reveal the content of your encrypted email, the "attacker" could be Hush Communications, the actual service provider.
Hush Communications, the company that provides Hushmail, states that it will not release any user data without a court order from the Supreme Court of British Columbia.
This is malarkey. Someone in the US could say "I'll fight all the way to the Supreme Court!!!" but you would be a fool to trust your business to their determination. Especially if they say it about a subpoena, which means they haven't even retained a lawyer to ask about this. (If your business plan depends on being able to wage a legal battle, you really shouldn't be scrambling through the yellow pages for a lawyer when you get your first subpoena.)
Back to the topic, I'll have to point out that this still isn't evidence of a company being required to backdoor a product. Hushmail, the same company that thinks it can fight a subpoena for third-party data all the way to the Supreme Court, said "well, we might be compelled to backdoor our product." This is just more repeating of the meme without evidence. It's unfortunate because some developer who remembers Hushmail might take their ill-informed legal opinion as reality.
Of course, Hushmail had access to cleartext copies of the messages. That's the killer. The government has the right to evidence about third parties in your possession. (Canada derives from British law tradition like the US. The government's right to all evidence is a concept that goes back centuries. If you can show that Canada broke from this tradition I would be most interested.)
No legislation forbids it either. If the NSA is legally able to search/seize a given piece of information why would you think they should not be allowed to forensically analyze it?
- XP is the only widely used version of Windows with no built-in FDE
- XP is finally, truly, EOL
So TrueCrypt no longer needs to exist.
(I'm not arguing that- that is what the message is saying)
"WARNING: Using TrueCrypt is not secure as ..."
WARNING: Using TrueCrypt is (n)ot (s)ecure (a)s ...
TrueCrypt is (n)ot (s)ecure (a)s ...
(n)ot (s)ecure (a)s
TrueCrypt is (n) (s) (a)
"WARNING: Using TrueCrypt is not secure as ..."
"WARNING: Using True(C)rypt (i)s not secure (a)s ..."
(C) (i) (a)
"WARNING: Using TrueCrypt is not secure as ..."
"WARNING: Using TrueCrypt i(s) n(o)t (s)ecure as ..."
(s) (o) (s)
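For what it's worth, the "NSA" reading above is mechanical enough to script; a throwaway Python check (the slice indices are mine, picking out the words "not secure as"):

```python
# Extract the acrostic from the in-app warning text.
msg = "WARNING: Using TrueCrypt is not secure as ..."
words = msg.split()

# words[4:7] are "not", "secure", "as"; take their initials.
print("".join(w[0] for w in words[4:7]))  # prints "nsa"
```

The "CIA" and "SOS" readings above show the same trick works on almost any word choice, which is the best argument that all of these are coincidence.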
Use any integrated support for encryption. Search available installation packages for words encryption and crypt, install any of the packages found and follow its documentation"""
>>> ''.join(word[0] for word in s.split())
That short string contains "beast of the apocalypse"  plus some other letters ('iuififwaifafi')
 https://en.wikipedia.org/wiki/The_Beast_(Revelation) - "The second beast is described in Revelation 13:11-18 and is also referred to as the false prophet."
Oh, wait, it's entirely feasible.
The new version posted is almost certainly compromised; don't download it, or at the very least, run it in a VM on non-networked hardware that you can reimage after you're done.
Edit: Forgot this before, but BitLocker is definitely completely broken as it sends your recovery key to MS anyway ( https://twitter.com/TheBlogPirate/status/471759810644283392/... ).
BitLocker in its "click click next" incarnation stores keys in the cloud, but it is fairly trivial to install it in a manner that uses the TPM or external media for key storage.
For example, NIST publishes guidelines for FIPS-compliant BitLocker configuration covering the different operating modes:
If you have the technical chops / willingness to download TrueCrypt, you should have the ability to spend 10 minutes googling for instructions on a customized configuration of BitLocker.
The more plausible potential for Bitlocker being broken is that there is some subtle flaw in their crypto implementation.
Or perhaps an optional Microsoft-account feature to back up your encryption keys. Something that most normal users would want, just like they want it on Apple devices? Because a lot of common users aren't going to want FDE if it means "oh and lose your key and say goodbye to your data".
As far as I know, there's only been speculation on what PRISM is. Nothing that suggests it couldn't be a frontend to CALEA or warrant-based systems.
Subsequent Snowden releases made it clear that the NSA has many sources of information that are only "legal" because they said so, including intra-datacenter and international fiber taps, zero-day exploits, and physically modified equipment (see photos of network gear being intercepted and bugged).
In other words, this paragraph might have been reasonable a year ago, but is now grossly out of date.
Bitlocker is not trustworthy as an overall method of FDE.
It's fine and perfectly reasonable not to trust closed-source code, but no reason to spread half-truths about it.
"Hey, how should we deal with resetting people's passwords and keys when they forget them?"
"Tell them to get a safe deposit box"
"And when they're traveling or really need a report and the bank's closed?
"They shouldn't have lost their keys. Stupid lusers."
"Hey, who are you and how did you get in this meeting?"
"I'm the new intern. From the CIA."
"Oh, okay, yeah, let's do that."
People seem to keep forgetting this (I'm sure it's simply unintentional), but PRISM was and still is nothing much more than an automated warrant/NSL compliance system.
You're basically saying that Microsoft is complicit in divulging information in response to specific requests made under specific legislative authorities, which was standard practice since even before Smith v. Maryland.
Recording license plates on cars wasn't a big deal when it was primarily used to identify stolen cars and track drivers breaking the law. Then automation entered the picture, and it became feasible to track the movements of everyone, aggregate it all in a huge database, and claim "they might be criminals later".
PRISM is more of the same. They could have compelled Microsoft to do this long before, of course, but PRISM is one of those compromise-everything initiatives, meaning that even if the possibility existed before, it definitely exists now.
So it's not unjustified to bring it up.
Remember, with PRISM each and every request has to be approved by the company in question before it proceeds, which is still a manual step. So the license plate reader example doesn't apply directly; Rather it might be like a license plate reader that only works when activated by a remote magistrate, only for the one car permitted by that activation, but can continue scanning that one car's license plate from then on wherever it's seen in the city until the permission expires.
Are you sure it's the latter, and not the former?
Suddenly (while an audit is underway), they quit everything and change the assemblies and the website so that users will move to another product... It seems weird that after 10 years of hard work, they suddenly quit without further explanation.
Lavabit was a service, TrueCrypt is a product.
Lavabit had access to all their customers' data, and told investigators that they had it. It's completely straightforward law that, given a subpoena, Lavabit must turn over evidence to the government.
TrueCrypt is a product. They do not have access to customer data. There is no requirement for TrueCrypt to "help out the government" in this case.
If you want to hang a conspiracy theory on this news, find some hook besides Lavabit.
 And I can't fault the conspiracy theorists for trying to find some explanation over this, because the damn thing is so weird and unusual.
But what makes you think U.S. law treats them any differently, assuming TrueCrypt's creators and maintainers can be identified?
Here's my article from 8 years ago talking about how the FBI was demanding that makers of certain products include backdoors for FedGov surveillance:
The FBI has drafted sweeping legislation that would...force makers of networking gear to build in backdoors for eavesdropping... FBI Agent Barry Smith distributed the proposal at a private meeting last Friday with industry representatives and indicated it would be introduced by Sen. Mike DeWine, an Ohio Republican, according to two sources familiar with the meeting...
Anyone can "draft legislation." I can draft legislation right now. That doesn't make it U.S. law. Getting it passed is the hard part.
Phone companies are required to enable wiretaps. But that happened through the public legislative process, and the legislation even lets the phone company bill the government for costs to comply. (Your linked article explicitly points out CALEA.)
So you, a non-lawyer developer, get one of these letters. You are pretty damn sure it is a bluff (didn't that clause in NSLs get shot down? Pretty sure I heard something about that... Something about Nicholas Merrill?). What if you are wrong though? What if this is a different kind of letter that you and the rest of the general public are currently unfamiliar with? What if the government has found a new way to create such a clause? Is "pretty damn sure" a high enough standard of sureness for you to call their bluff and talk to a lawyer anyway? How much do you value your freedom, and how much do you value your work?
Not being willing to call their bluff and contact a lawyer means that you are not able to question or interpret anything else in that letter as well. The best you can do is ask the government to interpret the letter for you, and tell you exactly what you need to do in order to comply.
The next best option is likely to burn what they want to the ground.
Linking an abuse like you describe to Lavabit only harms developers, who, if they were to receive such an illegal demand, might remember: "wait, Lavabit was required to install back doors, right? I guess I have to as well!"
Until the current regime is dismantled, we cannot rule out the possibility that these abuses are ongoing. To label it as a conspiracy theory is just shameless apologetics.
What's "this"? Is it "the USG compels vendors to install back doors into their software products they ship to others, under threat of jail time and/or fine and/or vacation at Gitmo"? To whom was this done?
NSLs are nasty in many ways. That doesn't mean they are nasty in any way you can imagine.
If you want examples of FBI surveillance untethered to the law, we can provide those. Look at the video of the public forum I hosted with Ladar (of Lavabit) in SF last fall. Look at warrantless cell tracking, which I was the first to disclose circa 2006, and which is now the subject of significant litigation. Look at the warrantless use -- not just by the bureau but other police agencies as well -- of physical GPS tracking devices. How about surreptitious black bag jobs to install key loggers to extract PGP passphrases before this was authorized by the 2001 Patriot Act?
Here's another from last summer, which I was the first to disclose:
"The U.S. government is quietly pressuring telecommunications providers to install eavesdropping technology deep inside companies' internal networks to facilitate surveillance efforts..."
Huh! Where does the FBI get the legal authority to do that? Shouldn't, you know, Congress set the rules here after openly debating them in a public hearing?
Again, all these points are tangential to the question of FedGov product backdoors. (Note I'm expressing no opinion here about what's going on with TrueCrypt.) This survey I did in 2007 is probably worth repeating:
I'm no longer doing this kind of reporting (and left to found the SF-area startup http://recent.io instead) but I hope someone tries to replicate it today with a broader set of companies.
Something must be wrong, because this is 100% the question I believe I responded to. I will attempt to do so again now:
* Statute gives the government the right to compel certain service providers to actively assist in wiretapping. Example law: CALEA
* There is no U.S. law that gives the government the right to compel arbitrary third-parties to modify their products to make wiretapping easier.
You give a long list of bad things the USG has done, but none of them involve vendors being compelled to modify products.
(In another domain, banks have to report transactions over 10K, but that's completely the result of statute, the Bank Secrecy Act.)
This is an interesting claim. It would be more interesting if the U.S. government publicly said its interpretation of the law is the same as yours. It has not. :)
That's what the publicly available laws say, but America has secret interpretations of laws now. We know, for example, that every Internet service is, in theory, free to provide tools that would put user data out of reach of anyone, with or without a warrant. And yet, nobody has.
Nobody except Silent Circle, a new entrant premised on providing truly secure communication, who have decided to domicile their company in Switzerland. So, what to make of all the CEO-level complaining, but no end-to-end encryption tools and no web of trust?
If a major Internet portal provided end-to-end secure mail, real-time communications, and secure storage we would know that, yes, there is no legal or extralegal obligation to keep us all naked in the panopticon. But so far all the indicators are in the wrong direction.
If you have a business in America, you signed the Patriot Act. That's probably the law they are using for coöperation :)
The fact that nobody is doing that is a kind of probe. Do we really live in a free country, or is pervasive monitoring a condition being imposed on us, with no choice of services that would prevent it?
TRESOR is a technique that keeps the volume key strictly in the CPU registers and not in RAM. This completely prevents RAM-freezing and related attacks. A running computer that is locked can no longer be trivially decrypted by dumping its RAM.
Scrypt is an advanced password derivation function that makes even trivial passwords very hard to bruteforce. A scrypt derived key is 20000 times harder to crack than the equivalent PBKDF2 derived key of the same password.
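For anyone unfamiliar with the API difference, here's a minimal sketch using Python's stdlib. The cost parameters (100,000 iterations, n=2**14) are illustrative defaults of mine, not anything TrueCrypt or a successor actually uses:

```python
import hashlib

password = b"correct horse battery staple"
salt = b"a-16-byte-salt!!"

# PBKDF2-HMAC-SHA256: the cost is CPU-only, so GPUs and ASICs
# can parallelize brute-force guessing cheaply.
k_pbkdf2 = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000, dklen=32)

# scrypt: the n and r parameters force large, sequential memory
# access (here 128 * r * n = 16 MiB per guess), which is what makes
# hardware brute force so much more expensive per attempt.
k_scrypt = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

print(k_pbkdf2.hex())
print(k_scrypt.hex())
```

Both functions are deterministic for a given password and salt, which is all a volume header needs; the difference is entirely in the attacker's cost per guess.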
The TrueCrypt license is not GPL compatible but it allows redistribution in source form as long as the software is not called "TrueCrypt".
Who's up for it ?
Until an interrupt happens, and the handler saves the registers on the stack. (sad trombone)
If breakpoints are disabled, minus the 8 one-bit flags contained in the registers, you have a whopping 6x32-8 = 184 bits of storage on x86 and 376 bits of storage on x64.
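Mirroring the arithmetic in the comment above (six usable debug registers, minus the eight one-bit breakpoint flags; the register accounting is the commenter's, not verified against the TRESOR paper):

```python
# DR0-DR7 minus the two reserved registers leaves six usable.
usable_regs = 6
flag_bits = 8  # one-bit breakpoint flags, per the comment above

print(usable_regs * 32 - flag_bits)  # 184 bits available on x86
print(usable_regs * 64 - flag_bits)  # 376 bits available on x64
```

Either figure comfortably holds a 128- or 256-bit AES key, which is the point.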
In sharp contrast with regular TrueCrypt where the master volume key is stored in RAM all the time. It's already standard operating procedure for law enforcement to time raids when the computers are in use and make a dump of the machine RAM.
It's not like you can punch the password into the CPU directly, you need to use some sort of input device which has drivers which keep state in RAM, use DMA and IRQs etc. Any attacker capable of reading RAM during this phase is also capable of sniffing the password characters as they are typed on the keyboard.
If someone has remote control of your computer, they'd install a sniffer that picks up the password and sends it back next time there's an internet connection.
This is misleading - Windows 7 product line has Bitlocker only for Ultimate and Enterprise. Even Windows 7 Professional users cannot use Bitlocker without upgrading to Ultimate. Very unfortunate.
I don't want to suggest reading into it something that doesn't exist, but why would the TrueCrypt people, if they are what we want them to be, suggest using BitLocker of all things? No better alternative, and better than nothing??? I don't know how I feel about that.
I wouldn't actually say that, but I see what you mean.
There is tcplay. This project can create and open TrueCrypt volumes.
There is cryptsetup. This is a Linux-native solution for block device encryption that supports opening TrueCrypt volumes.
The above two projects are command-line based, and there is a third project called zuluCrypt that gives a GUI front end to the two of them.
I am not aware of any alternative solutions on Windows or OS X that support the TrueCrypt encrypted format, but adding support for it should not be that hard.
This may be the end of the line for TrueCrypt as a project, but its encrypted volume format may still be used as a "universal cross-platform encrypted volume format".
Somebody should file bug reports in projects that deal with block device encryption on Windows and OS X and ask them to support this format, as I think the format should live on: it's the only one that is widely used and supported.
I believe I read in another thread that TrueCrypt did not get many donations. I'd be a bit depressed if I worked long and hard on a project that people seemed to appreciate, but not enough to crack open their wallets and toss a few bucks my way, and then some third party comes along and quickly raises $70k to audit my code.
Given the anonymity of the developer(s), how would anyone know this?
However, the downside of being anonymous is that it makes it hard to ask for donations (in places other than your website).
If, and it's a big if, the reason for them throwing in the towel like this, was compensation, I think it would have been handled a lot of other ways.
My money is on an NSL and this was their way of telling us about it.
The owner/manager of the project didn't want to keep maintaining it and is redirecting users to an alternative that will work for almost all use cases. As of last month, there were reported flaws in TrueCrypt and there's nothing that forces a maintainer of a free project to keep going.
I'm left almost a little annoyed that the conversation isn't "RIP TrueCrypt project, thank you goes out to the maintainers for helping people feel more secure for years."
"Sorry. This URL has been excluded from the Wayback Machine." :-)
"'I think the TrueCrypt team did this,' Green said in a phone interview. 'They decided to quit and this is their signature way of doing it.'"
"I’m a little worried that the fact we were doing an audit of the crypto might have made them decide to call it quits.”
Granted, they only evaluated the bootloader under the first contract, but if you were going to slip in a backdoor, or if a serious crypto bypass were possible, that's likely where it would have been.
iSEC has not found serious issues but that was only phase 1 of the audit.
(It was somewhat unusual to compile on Linux, since it started as a Windows GUI project and was then ported to Linux.)
They reviewed the bootloader which is one of the most complex parts and touched all the crypto and found no serious issues. That suggests that the rest of the mundane code is probably in pretty good shape.
Of course it is not a guarantee that things are perfect, but it suggests that the developers a) knew what they were doing b) had no issues with an external audit.
In OSS often the people who do the original work get nothing and all the money goes to pundits, packagers, and consultants.
I suspect that would have brought in a few dollars in the current climate.
In the real world, if you worked for years trying to make people safe, and you felt (correctly or not) that you were being disrespected while others were being respected for picking at your nits, you might say "fine, fuck you all, have fun," too.
To be clear, I don't know what's going on. A "rage quit" is the most likely scenario IMHO, but this is all very weird.
EDIT: On reflection, a rage quit fits my priors, which say "there's no such thing as free-as-in-beer software." So you and I should both be a little skeptical when I find it the likely explanation.
It does exist, but it's the exception, and each project where it works has its own particular quirks that make it work. The successful ones typically rely on someone having a particular and unusual mental setup that doesn't mind free loaders. Stallman and de Raadt develop software for their own use, and the rest of us can use it, too.
Aren't you worried you'll insult the Truecrypt developers? I sure hope not, since we're all after the same thing. Remember, our goal isn't to find some mythical back door in Truecrypt, but rather, to wipe away any doubt people have about the security of this tool.

But perhaps this will tick people off. And if you're one of the developers and you find that you're ticked, I'll tell you exactly how to get back at us. Up your game. Beat us to the punch and make us all look like fools. We'll thank you for it.
I wonder what would happen if all original workers united and did the same as the Truecrypt people...
The author seems to be an anonymous Russian-speaking person.
Another option is FreeOTFE - http://sourceforge.net/projects/freeotfe.mirror/
It is compatible with Linux's LUKS and dm-crypt, but it doesn't support system partition encryption.
The Way Back Machine has an old copy of her FreeOTFE website (freeotfe.org): https://web.archive.org/web/20130531062457/http://freeotfe.o...
(I phrase this as a question because it'd be great if we could have some HN skepticism on this thing. Personally, I think everything basically checks out: open source, free, there's a name, phone number, address, picture etc.)
My use case is wanting to have an extra layer of paranoia before I upload anything important to the cloud.
The only downside I see is that 7zip seems to be almost abandonware at this point. The installer linked at the top of their page is almost 4 years old and there's a recent beta but they haven't moved a beta to stable in a very long time.
That was the advantage of TrueCrypt, IMO. Not for FDE (I'm not a huge fan of FDE anyway), but because it created an interactive partition rather than forcing you to decrypt an entire archive to storage, work on it, and then re-encrypt the whole thing and cleanse the storage device you decrypted to.
It seems like putting something together that does that (which is really decrypt-to-RAM) wouldn't be that hard, but there are like a million file encryption tools but very few ones that allow for interactivity.
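The shape of that tool is roughly this (a toy sketch of the decrypt-to-RAM pattern; the XOR "cipher" below is a stand-in for real crypto, emphatically not something to use):

```python
import hashlib
import io

def toy_keystream(key: bytes, n: int) -> bytes:
    """Counter-mode keystream from SHA-256. Illustration only, NOT real crypto."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """XOR against the keystream; applying it twice round-trips the data."""
    return bytes(a ^ b for a, b in zip(data, toy_keystream(key, len(data))))

key = b"demo key"
on_disk = toy_cipher(b"quarterly report: secret", key)

# The interactive part: decrypt into an in-memory buffer, edit there,
# re-encrypt from RAM -- the plaintext never touches the storage device.
buf = io.BytesIO(toy_cipher(on_disk, key))
buf.seek(0, io.SEEK_END)
buf.write(b" (edited)")
on_disk = toy_cipher(buf.getvalue(), key)

print(toy_cipher(on_disk, key))  # prints b'quarterly report: secret (edited)'
```

The hard parts TrueCrypt actually solved are everything this sketch skips: presenting the buffer as a mountable filesystem, random access without decrypting the whole volume, and keeping the OS from paging the plaintext to disk behind your back.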
Nothing is really turn-key, but you can mix and match a couple of different products and get the same results.
So my weekend project is now to move all my stuff to Debian.
1. Truecrypt is a gigantic pain-in-the-side for US intelligence agencies.
2. Intelligence agencies brainstorm about the best way to deal with the situation.
3. Taking over and tampering with the current code is deemed unrealistic. The user base of Truecrypt is very sophisticated and even minor changes to the source code would be scrutinized.
4. "How can we get people to stop using Truecrypt?" "We can discredit the project - get people to voluntarily stop using it because they don't trust it".
Also, "Truecrypt properly used is a gigantic pain," and although I have nothing to support it, I reckon many people use it incorrectly. Has anyone done any research?
See "deanonymizing alt.anonymous.messages" for examples of people doing crypto wrong.
(The most valuable information is info your enemy doesn't know you know. Information your enemy knows you know is not as powerful)
> Not "backdoors" but it seems Microsoft stores the BitLocker decryption key, based on this leaked slide (just found on twitter): https://twitter.com/TheBlogPirate/status/471759810644283392/...
> edit: Confirmed Microsoft stores your recovery key on their servers if you're not connected to a domain: http://windows.microsoft.com/en-AU/windows-8/bitlocker-recov...
That looks dubious to me - if you were creating a slide, would you use 90-degree-rotated text?
> edit: Confirmed Microsoft stores your recovery key on their servers if you're not connected to a domain: http://windows.microsoft.com/en-AU/windows-8/bitlocker-recov....
"There are several locations in which your BitLocker recovery key might have been saved" - emphasis mine - doesn't this mean they'll store the key if you log into "Your Microsoft account"? I have to admit I don't use Windows 8, so I don't know whether that's mandatory. What about Windows 7?
(1) It's possible Microsoft could choose to store the keys online regardless of what the user picks, but it's certainly not their official stance.
(2) Microsoft does automatically store keys online if you're using "Device Encryption" in Windows 8.1 (http://arstechnica.com/information-technology/2013/10/window...). This uses Bitlocker code but is distinct from using Bitlocker itself -- i.e. if you do a vanilla Bitlocker encryption, your system should not send the keys to MS without a user (or admin) explicitly telling it to.
Now that I have read these revelations, I feel that not trusting Microsoft is a sensible default.
You make it sound as if trusting Microsoft was ever an option to begin with...
Both of these are hypothetical, however. We've seen tons of vulnerabilities from both. IMHO, open source works a lot better on paper; once projects get very large, auditing them is really hard, which definitely cuts down on the number of eyes looking at them.
These two things are orthogonal, IMHO.
My point is that while open-source should be better, right now it seems that everything is equally not good enough.
http://dokan-dev.net/en/ might work on Windows for implementing something you might otherwise do via FUSE. Can't speak for Windows as I've not used it for a decade or more.
My point here being, with the source being available, why do we need to assume that TrueCrypt is history, other than perhaps a lack of people willing to work on it (which I assume could change now that there is an immediate need for some)?
If we really really stretch, TrueCrypt had server logs that would show who downloaded it that they might be compelled to turn over. This would be an overreaction to such a request.
1. The release date string went from "February 7, 2012" in 7.1a to "5/2014" in 7.2. Might be nothing, but the change in date style made me wonder if someone other than the original author changed it - I'd've expected it to be changed to "February 2014".
I also wonder why no specific day was given - makes me wonder if the release was automated and the author didn't know exactly when it would happen (possibly a dead man's switch triggered it?). Again, could be nothing.
2. Pretty much every reference to truecrypt.org has been removed - even the licence now states "Your Product [...] must not present any Internet address containing the domain name truecrypt" (instead of truecrypt.org), and there is no requirement to link to it anymore. It might just be a change in licencing stance to encourage forks, or, if the release was made under duress (NSL/threats/blackmail etc.), it might be a way to try and signal that truecrypt.org can no longer be trusted.
Edit: Something is bugging me about this line on the site: "The development of TrueCrypt was ended in 5/2014 after Microsoft terminated support of Windows XP." It might just be how the author writes, but my reaction on reading "was ended" was that something external forced it to stop rather than it being a choice.
Also, why mention XP's EOL? The message doesn't say support for TC stopped _because_ of the EOL, just after it, and I can't think why the end of XP support would affect TC greatly.
...or shutting down TrueCrypt was part of a plea bargain. Either that or a one-way trip to Gitmo from the land of the formerly free.
As of 2009, non-military cryptography exports from the U.S. are controlled by the Department of Commerce's Bureau of Industry and Security. Some restrictions still exist, even for mass market products, particularly with regard to export to "rogue states" and terrorist organizations. Militarized encryption equipment, TEMPEST-approved electronics, custom cryptographic software, and even cryptographic consulting services still require an export license (pp. 6–7). Furthermore, encryption registration with the BIS is required for the export of "mass market encryption commodities, software and components with encryption exceeding 64 bits" (75 F.R. 36494). In addition, other items require a one-time review by or notification to BIS prior to export to most countries. For instance, the BIS must be notified before open-source cryptographic software is made publicly available on the Internet, though no review is required. Export regulations have been relaxed from pre-1996 standards, but are still complex. Other countries, notably those participating in the Wassenaar Arrangement, have similar restrictions.
If ragequitting, why bother with making a new release instead of just removing _everything_ and changing the webpage? (Presumably, you already have a copy of the SW if you have encrypted volumes.)
Also, note "you _should_ migrate data". Could this imply that cold storage is not secure?
There are any number of possible vulnerabilities that could exist. It's definitely plausible. I could fairly easily believe that they were contacted by a researcher who was about to publish a major flaw in AES, or in one of the other algorithms in use.
There are a number of relatively plausible theories. I wouldn't be surprised if we don't find out the actual reason for 20 years, when the developer is on their deathbed.
Which is what makes it scary, because people all over the world are now left with much less user-friendly choices for encryption.
If China or Russia had the same capabilities, there would not be any backlash against this being discovered, even in theory, since far less freedom is par for the course there - it isn't even the ideal that's promised.
This is the same reason it was better that it was Google (with its nominal "do no evil" policy) that turned out to be collecting wifi data with its street cars - not as bad as if Microsoft (with no such policy or mission) had done the same thing.
Also, the program may have been outed, but it looks like some people are keen to keep it running and possibly make it actually legal.
One of the reasons I like the iPhone is the idea that its security system and drive encryption are not hopelessly broken. It would be great to have the same level of confidence in BitLocker.
Where can you get the source of that?