TrueCrypt Audit Back On Track After Silence and Uncertainty 112
itwbennett writes: In October 2013 Cryptography professor Matthew Green and security researcher Kenneth White launched a project to perform a professional security audit of TrueCrypt, partly prompted by the leaks from Edward Snowden that suggested the NSA was engaged in efforts to undermine encryption. Their report, published in April 2014, covered the first phase of the audit. Phase two was supposed to involve a formal review of the program's encryption functions, with the goal of uncovering any potential errors in the cryptographic implementations—but then the unexpected happened. In May 2014, the developers of TrueCrypt, who had remained anonymous over the years for privacy reasons, abruptly announced that they were discontinuing the project and advised users to switch to alternatives. Now, almost a year later, the project is back on track.
Um, (Score:2)
Re:Um, (Score:5, Insightful)
Re: (Score:3)
I don't think the devs were helping anyway. They remain anonymous to this moment, at least to all of us.
Re:Um, (Score:5, Interesting)
What did the TrueCrypt developers have to do with the audit of TrueCrypt?
Is there a point in continuing to audit a platform whose entire developer team has abandoned it whilst urging all users to seek other encryption tools? At this point the audit is probably going to be interesting (related to the aforementioned dev abandonment), but not exactly useful... If you are still using Truecrypt, you have already been warned.
Re:Um, (Score:5, Informative)
As such, a warrant would let people continue to use it, secure in the fact that it actually works as required.
It also lets people fork it.
Frankly, I have been severely disappointed with BestCrypt, which I had hoped would end up as the replacement for TrueCrypt. (multiple problems with getting the regular operating system to recognize the 'mounted' drives)
Re: (Score:2)
Yes. Because some people think that TrueCrypt was killed BECAUSE it was actually secure and the NSA wanted them to de-secure it.
As such, a warrant would let people continue to use it, secure in the fact that it actually works as required.
It also lets people fork it.
Frankly, I have been severely disappointed with BestCrypt, which I had hoped would end up as the replacement for TrueCrypt. (multiple problems with getting the regular operating system to recognize the 'mounted' drives)
Given that the authors were anonymous is it postulated that the NSA hunted them down and was ready to doxx them for not complying? What leverage could they possibly have had?
Leverage? (Score:2)
Re: (Score:3)
If they lived in the U.S., it would be comply or go to prison. If they lived outside the U.S., work for us or GITMO baby! Even if the NSA couldn't actually enforce it, the current nebulous state of U.S. legal enforcement powers would make anyone with a bulls-eye on their head nervous.
Re: (Score:2)
It's likely that for people working on a project like TrueCrypt, the earlier arguments are not going to work. They know bad people are going to use their software. It just comes with the territory. It's impossible to build a tool that will help a free speech activist in China (or the U.S. for that matter) without also being able to help an Al Qaeda group or child pornographers. It starts to get a whole lot harder to keep to your principles when your freedom and personal safety start being threatened. That's
Re: (Score:2)
It is far easier to hide a well-engineered backdoor than it is to find it. No matter how good the auditors are, and even if you trust them 100%, there is no way they can be certain to uncover a backdoor if one exists. At this point, with the parting statement of the developers, only a fool would trust Truecrypt with anything important.
Re:Um, (Score:5, Interesting)
[Backdoors are hard to find.] At this point with the parting statement of the developers only a fool would trust Truecrypt with anything important.
Let's see: only a fool trusts things that actively lose data. (i.e., bitrot, or email systems used by important people. If it's important, have 2+ independent copies.)
So let's posit that TC is "sane", that it doesn't actively corrupt your data (Actual disk bitrot is another matter.)
Is it secure? (Ignoring keyloggers, CPU tampering, OS-level file I/O interception, not to mention on-bus DMA controllers that have direct access to physical memory, and other out-of-band things? You could argue they need to detect this, but they aren't an A/V vendor, and you do halfway have to trust your hardware. Oh, see CC PIN hacking via an IR camera [technologyreview.com] to watch your hardware "betray" you.)
Well, given a correct encryption key, things work correctly; given seemingly any incorrect key, things don't -- a very good start. So they need to protect the working in-memory key (because it's game-over if not.) They erase it if enough idle time has passed and try to keep it from being swapped out to disk. Process memory isolation is great, but in both cases the OS itself can do whatever it wants. So you have to trust the OS, at least a bit.
So, what everybody actually means: is the encryption secure? Can someone who doesn't know my password read my data due to stupid password handling, bad encryption routine choices (ROT-26), or leaky code around good routines? (Say perfect AES file encryption, but the unencrypted source file moved to the recycle bin, never mind any corruptible buffer or stack overflows. [That's an example; TC doesn't encrypt single files.]) Are there password collisions, i.e., passwords that are actually case-insensitive, or silently truncated after 2 characters?
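A toy demonstration of why that last pitfall matters (hypothetical code, not TrueCrypt's actual key derivation): if a password is silently cut to its first 2 characters before hashing, distinct passwords collide and the effective keyspace collapses.

```python
import hashlib

def bad_kdf(password: str) -> str:
    """A deliberately broken key-derivation sketch: silent truncation."""
    truncated = password[:2]  # the bug: only the first 2 chars matter
    return hashlib.sha256(truncated.encode()).hexdigest()

# Two very different passwords produce the same key material:
print(bad_kdf("hunter2") == bad_kdf("human-rights-activist"))  # True
```

With this bug, an attacker only needs to search the tiny space of 2-character prefixes, no matter how long your passphrase is.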
I suspect that you're (humans) the weakest link because of the XKCD wrench [xkcd.com], an easily guessed password [networkworld.com], or your likes/habits that could lead to your password. If you can't type your password it's not going to work, and you have to remember how to type it.
It seems to boil down to: do you trust the vendor to act in good faith every step of the way? Let's see: -anonymous vendor, +access to source code that compiles to the released binary, +routine usage that makes sense, +updates over time, -weird final message. Personally, I trust them more than MS's native BitLocker, which is sane but has an (understandable) enterprise AD key-recovery function. (It's not your data but the company's, and they keep keys to continue reading it.) But is BL actually secure? Dunno, can't tell; we have to trust MS completely on that.
If it (TC v7.1) was good to use the day before sunset, it was good to use the day after too, until known problems arise or loss of OS support kills it. But YMMV -- trust whom you see fit. So, being curious: what are you using, if not TC?
Re: (Score:1)
Given the developers' last message we actually DON'T know if 7.1 was safe to use or not. That is the issue: they could have been compromised well before then, they could have discovered a systemic weakness, or they could have just had enough of developing it. Without that information you don't have a fucking clue whether it was safe at sunset the day before or completely fucked over by the NSA already.
Re: (Score:2)
Re: (Score:3, Interesting)
we DON'T know if 7.1 was safe to use or not.
Isn't that kinda the point of a security audit?
Really, my personal tin-foil take (and I don't actually know, I'm just guessing from the reported results and my internal biases) is that the TC authors were "given an offer they couldn't refuse" and forced to hand over control of the website and code-signing keys to someone else.
THAT they did -- but they were not told NOT to trash the brand beforehand. So in my happy little fantasy world they put up that weird final notice and gleefully handed over the control keys to the code, knowing that no one would ev
Re: (Score:2)
No matter how good the auditors are a security audit doesn't guarantee no vulnerabilities or back doors. If it did software security would be a lot easier.
Re: (Score:2)
Re: (Score:3)
Maybe the first clue should be the "other encryption tools" they urged people to use? "Don't use this open source tool, use a closed source tool from Microsoft located in Redmond, Washington, US - home country of the NSA." You cannot take that message seriously; it's so absurd that the only purpose of it would be to utterly destroy their credibility. So far we're in agreement. There are three cases where they might do that:
1. There is already a backdoor and they've been under a gag order for years, but dec
Re: (Score:2)
And hope the auditors haven't gotten a compelling visit themselves.
Hmm? (Score:3, Insightful)
Are these auditors trustworthy? At least if it's crowdsourced it's an open process.
Re:Hmm? (Score:5, Funny)
They are the most trustworthy auditors the NSA, CIA, FBI, and the PTA could find.
Re: (Score:1)
I know I've been holding out for crypto endorsed by the PTA.
Re: Hmm? (Score:1)
Finally, someone is thinking of the children.
Re:Hmm? (Score:5, Interesting)
The suddenness of the TC team's departure (having throughout TC's history promised never ever to have any backdoor) coupled with the U.S. gov (FBI)'s inability to crack a South American's business computer after a full year of trying, suggests that their departure was a consequence of U.S. government pressure.
Recent disclosures by Mr. Snowden (Feb. 2015) make it clear that more than mere analysis of the TC code is necessary: the NSA's newly discovered ability to implant code-compromising elements in devices' firmware suggests just how difficult it might be for any analysis to confirm that TC is secure. TC could be perfect, but if HD firmware is able to read and share passwords then clearly much more work has to be done; their analysis must address topics beyond the TC code itself.
I'm proud to have helped the crowd-source effort and wish this new team well.
Re:Hmm? (Score:5, Interesting)
... TC could be perfect, but if HD firmware is able to read and share passwords then clearly much more work has to be done ... their analysis must address topics beyond the TC code itself.
I disagree. Taking your point to its logical conclusion, the TC auditors should audit every computer on Earth, and all the software running on those computers.
That is very clearly beyond the scope of auditing TC.
I do think the TC auditors should publish a caution of some sort about ~the computer that runs TC~ but beyond that, it would be out of scope.
Re: (Score:2)
Why bother? The NSA has already done most of them. It would just be redundant.
Re: (Score:2)
Re: (Score:2)
The suddenness of the TC team's departure (having throughout TC's history promised never ever to have any backdoor) coupled with the U.S. gov (FBI)'s inability to crack a South American's business computer after a full year of trying, suggests that their departure was a consequence of U.S. government pressure.
I'm not saying it didn't happen, because nobody knows. But the connection to the US Government here is only as strong as the connection to the Chinese Government, the Russian Government, or the Martian Government. You're just waving your hands while being anti-American.
Re: (Score:2)
Passwords never touch the HDD so there is no way the firmware could read them. They are only ever stored in ram, along with the decrypted keys.
The real danger is that the firmware sends a rootkit instead of the real MBR at boot time, and compromises the whole OS.
Re: (Score:3)
That isn't an either-or thing, more like belt and suspenders. Having crypto-experts review it reduces the risk of subtle compromises going unnoticed, having the general public review it reduces the risk of the reviewers being compromised. To be honest though, I feel the value of a crowdsourced review would be really low. I expect an NSA backdoor to be subtle and highly unlikely to be found by a casual review by developers not particularly specializing in security and code audits. On the other hand it can't
Re: (Score:1)
I'd rather trust the guys who get paid to do this every day rather than the armchair psychoanalysts who think it's easy to roll your own crypto. Maybe if among the crowd were Rijmen, Schneier, Shamir, Merkle, Diffie, Elgamal, Biham, et al I'd trust the crowd.
And Theo de Raadt.
Re: (Score:3)
It's already opensource?
TrueCrypt is not open source software. (Score:5, Interesting)
TrueCrypt isn't open source software, in spite of the author incorrectly claiming it is. More detail is here, which the author could have learned in 2 minutes of Googling: http://en.wikipedia.org/wiki/T... [wikipedia.org] ... for your amusement, I have quoted it below:
TrueCrypt was released under the "TrueCrypt License" which is unique to the TrueCrypt software. It is not part of the pantheon of widely used open source licenses and is not a free software license according to the Free Software Foundation (FSF) license list, as it contains distribution and copyright-liability restrictions. As of version 7.1a (the last full version of the software, released Feb 2012), the TrueCrypt License was Version 3.0.
Discussion of the licensing terms on the Open Source Initiative (OSI)'s license-discuss mailing list in October 2013 suggests that the TrueCrypt License has made progress towards compliance with the Open Source Definition but would not yet pass if proposed for certification as Open Source software.
According to current OSI president Simon Phipps:
As a result of its questionable status with regard to copyright restrictions and other potential legal issues, the TrueCrypt License is not considered "free" by several major Linux distributions and is therefore not included in Debian, Ubuntu, Fedora, openSUSE, or Gentoo.
The wording of the license raises doubts whether those who use it have the right to modify it and use it within other projects. Cryptographer Matthew Green noted that "There are a lot of things [the developers] could have done to make it easier for people to take over this code, including fixing the licensing situation", and speculates that since they didn't do those things (including making the license more friendly), their intent was to prevent anyone from building on their code in the future.
End of life and license version 3.1
The 28 May 2014 announcement of discontinuation of TrueCrypt also came with a new version 7.2 of the software. Among the many changes to the source code from the previous release were changes to the TrueCrypt License — including removal of specific language that required attribution of TrueCrypt as well as a link to the official website to be included on any derivative products — forming a license version 3.1.
On 16 June 2014, the only alleged TrueCrypt developer still answering emails replied to an email by Matthew Green about the licensing situation. He is not willing to change the license to an open source one, believes that TrueCrypt should not be forked, and thinks that if someone wants to create a new version they should start from scratch.
Re: (Score:1)
On 16 June 2014, the only alleged TrueCrypt developer still answering emails replied to an email by Matthew Green about the licensing situation. He is not willing to change the license to an open source one, believes that TrueCrypt should not be forked, and thinks that if someone wants to create a new version they should start from scratch.
While in principle I usually agree that the author's copyright should be respected, if the author is going to remain anonymous then how is he/she going to sue for an infringement? I suppose they could sell/pass the rights to TrueCrypt to a non-anonymous entity that could turn up in court and sue. Otherwise, I respectfully suggest that the anonymous developer fuck off.
Re: TrueCrypt is not open source software. (Score:1)
How can they prove that I am not a member?
Pissed off developers needed money (Score:2, Interesting)
I suspect TrueCrypt's real fate was decided by the fundraising for it. TrueCrypt promoted donations on its website to continue development. I was tempted to donate a big wad of cash, but only after an audit.
The fundraiser for the AUDIT of TrueCrypt got a lot more money than the fundraising for TrueCrypt itself, I suspect, and so the developers said f*** it and pulled the plug in disgust.
Fair enough, their work deserved money and they weren't getting it.
Re: (Score:1)
This use of the term "open source" to describe something under a license that's not only unapproved by OSI but known to be subject to issues is unacceptable.
I find it rather unacceptable that some douches want a monopoly on the definition of "open source." In common language it means absolutely nothing more than "the source code is open," in other words, you can download it and look at it. So TrueCrypt absolutely is open source software.
I'm also rather annoyed that they claim it isn't possible to continue development simply because it isn't possible to strip TrueCrypt of its license and apply the GNU GPL instead. Fuck the GPL, just use BSD code or code under
Re: (Score:2)
Seriously, talk about pretentious - sorry, OSI you don't get to decide who gets to use the term, you don't have a monopoly on it.
Words have meanings (Score:2)
The vast majority of people who use the term "open source software" use it with roughly the same meaning as OSI does, which is all that matters. You can confirm this with a quick Google search. Also, note that many organizations that require something to be "open source software" will point to the OSI definition.
By the commonly-used definition of "open source software", you MUST be able to fork the project and maintain your own version. You cannot legally do that with TrueCrypt, therefore, by defin
Re: (Score:3)
Did you even read the link you posted? It merely backs up exactly what I said - they don't have a monopoly on it, but claim to anyway.
The fact that it is a global non-profit doesn't give them exclusive rights to the term.
The fact that they support and promote the open source movement doesn't give them exclusive rights to the term.
The fact that they maintain a definition which they created doesn't give them exclusive rights to the term.
The fact that they maintain a list of licenses which c
Re: TrueCrypt is not open source software. (Score:1)
You're confusing open source and free. It's open source. You can see the source, use it, change it, and fork it, subject to certain restrictions, such as not calling it Truecrypt. Seems fair to me. Whether this or that distro has decided to include it or not tells us very little.
Re: (Score:2)
As a result of its questionable status with regard to copyright restrictions and other potential legal issues, the TrueCrypt License is not considered "free" by several major Linux distributions and is therefore not included in Debian, Ubuntu, Fedora, openSUSE, or Gentoo.
While this is true of the others, it is not true of Gentoo [gentoo.org]. Gentoo's policy [gentoo.org] seems to be that while the base system should not depend on non-FOSS components, having them present in the main tree is fine. (This might be partly because it's pretty easy to filter which licenses you want on your system using ACCEPT_LICENSE.)
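For illustration, Gentoo's license filtering looks roughly like this in make.conf (a hypothetical sketch; the `truecrypt-3.0` license label is an assumption, so verify the exact name against Gentoo's license list before using it):

```shell
# /etc/portage/make.conf -- sketch of Portage's ACCEPT_LICENSE filtering.
# Accept only licenses in the @FREE group (FSF/OSI-approved):
ACCEPT_LICENSE="-* @FREE"

# To additionally allow TrueCrypt despite its non-free license
# (assumed label; check the licenses/ directory of the Gentoo tree):
#ACCEPT_LICENSE="-* @FREE truecrypt-3.0"
```

Packages whose license is not matched by ACCEPT_LICENSE are simply masked, which is how non-FOSS software can sit in the main tree without ending up on a user's system unrequested.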
Uh, ALL those companies are NCC (Score:3, Informative)
"Instead, phase two of the audit will be handled by Cryptography Services, a team of consultants from iSEC Partners, Matasano, Intrepidus Group, and NCC Group."
Uh, all those companies *are* NCC Group. They've got some fantastic talent, but it's a bit of an odd way of putting it. NCC owns iSEC Partners, Matasano and Intrepidus.
Really Glad to see this (Score:5, Interesting)
I really would like to see Truecrypt live and usable again. Just in terms of having a great and useful interface/featureset Truecrypt was and hopefully will again be the best crypto out there. Assuming it audits well of course.
Truecrypt inside BTsync would be amazingly powerful.
Pug
Re: (Score:2, Informative)
I assume you've seen VeraCrypt and CipherShed? I know VeraCrypt fixed some of the issues highlighted by the first part of the TrueCrypt Audit.
Re: (Score:2, Funny)
Draw a Venn diagram and find out who's in the intersection of "feds" and "meth heads", then save yourself the trouble by handing over your data to them.
Riiiiight. (Score:5, Insightful)
So an audit performed by a closed group of corporates who have, no doubt, been thoroughly vetted and have never, ever, ever gotten a phone call from anyone in a suit offering them the choice of a bag of cash to play ball, or an increased probability of "accidents" and "unfortunate data leaks."
Given the farewell address we got from the TC devs, which I'm sure most of us remember, and the laughable suggestions of "alternatives," there are two strong possibilities for why the project was shuttered:
1. The developers all suffered a massive psychotic break at the same time.
2. A canary so big and obvious that it's more of a "warrant roc."
They may have ended the "silence", but the "uncertainty" is still alive and well, AFAIC.
What it really reveals (Score:3)
This is good, or bad, depending on the tightness of your tin foil, but I think it reveals something far more important about encryption: we, the average users, are powerless to verify or truly trust any encryption solution offered. To realize that an audit of the code for a single-purpose program can only be done by a very small set of people shows that even with open source we're still just trusting others to safeguard our data. The need for encryption and the mathematical and coding complexity required to understand what we are using to safeguard our data is simply beyond our ability to check that it even makes sense at a basic level.
I'm not so sure I welcome our mathematical overlords.
Re:What it really reveals (Score:4, Insightful)
This is good, or bad, depending on the tightness of your tin foil, but I think it reveals something far more important about encryption: we, the average users, are powerless to verify or truly trust any encryption solution offered. To realize that an audit of the code for a single-purpose program can only be done by a very small set of people shows that even with open source we're still just trusting others to safeguard our data. The need for encryption and the mathematical and coding complexity required to understand what we are using to safeguard our data is simply beyond our ability to check that it even makes sense at a basic level.
We - even IT power users and programmers - are mostly powerless to verify not only encryption programs, but the underlying OS as well. As Shuttleworth said, if you use our OS, you have to trust us, because we have root [stackexchange.com].
Re: (Score:2)
Re: (Score:1)
You're naive at best to disagree and use that as a supporting argument. Even you are aware you are trusting source you aren't verifying ( and very few are, or capable)
Re: (Score:2)
I had a high-security scenario ... [and] was happy enough that everything was traced back the sources enough to make me feel secure.
So you've compiled "everything" from source code? Then you're all good to go -- the code will be exactly what the compiler produced, but NOT necessarily what the source code actually says.
Huh? See Reflections on Trusting Trust [bell-labs.com], from back in the pre-NSA-scandal days, where one special guy could easily log into any Unix system: "I could log into that system as any user."
He's not BSing or joking, either.
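A toy illustration of the trusting-trust idea (hypothetical code, not real compilers): a trojaned "compiler" that backdoors the login program and re-inserts itself whenever the compiler is recompiled, so even a fully clean compiler *source* doesn't save you.

```python
def clean_compile(source: str) -> str:
    """Stand-in for honest compilation: the binary mirrors the source."""
    return f"binary({source})"

def trojan_compile(source: str) -> str:
    """Thompson-style trojaned compiler (toy model)."""
    if "login" in source:
        # Miscompile login: silently accept a hard-coded master password.
        return clean_compile(source + " + accept('backdoor-pw')")
    if "compiler" in source:
        # Miscompile the compiler: propagate the trojan into the new
        # binary, even though the compiler's source is clean.
        return clean_compile(source) + " [trojaned]"
    return clean_compile(source)

# Auditing this source reveals nothing -- yet the rebuilt compiler
# still carries the trojan, and login still gets the backdoor:
print(trojan_compile("compiler v2 (fully audited, clean source)"))
print(trojan_compile("login v5"))
```

The point of the paper is exactly this: source-level audits bottom out at whatever binary toolchain you used to build, which is why "I compiled everything myself" is weaker assurance than it sounds.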
Re: (Score:2)
Luckily for us, we were inside your ISP's server, and were able to provide you with corrected hashes from the official URLs, and various original sources. It wasn't that hard, we just dumped your traffic through special transparent proxies that scanned the response data for the original hashes, and replaced them.
True, you didn't build everything from source, but you were happy enough that everything traced back to "the" sources to make you feel secure. That's a lot more protection than anything from a com
Re: (Score:2)
Re: (Score:2)
Nonsense, it is absolutely do-able to have a realistic understanding of your actual security. The impossibility of secrecy does not refute the usefulness of true information.
And I agree, there are few things more secure than the best available open offerings. But well-financed law enforcement and security agencies are outside of that security. That the attack vectors are not revealed as such in the media is meaningless when the necessary capabilities are known to be possessed by them, and where their tactics
Re: (Score:2)
Ah, yes. We know that we know nothing, so have learned everything. Wait, we know everything, so we learned nothing. Nonono. We learned everything, so we know nothing. Okay, okay, we know nothing, and... and... I keep getting stuck right there. We can't really know anything. Encryption relies on trust, and nothing can be trusted. There is no method of verification of anything.
The obvious implication is that encryption cannot protect us from over-arching conspiracy. You don't need to adjust your tin foil for tha
Re: What it really reveals (Score:1)
No different to trusting Microsoft to not send your data to them when they do virus checks. You're already trusting the food chain, doctors etc.
Re: Truecrypt's dead; what do you recommend then? (Score:1)
DiskCryptor - open source
https://diskcryptor.net/wiki/Main_Page
Unlike TrueCrypt, which was designed for empty volumes, DiskCryptor was designed from the ground up for data volumes already in use and is therefore faster. Vera is based on TC and uses the same lame, imho, empty-volume theme.
vera crypt (Score:1)
I installed VeraCrypt on a new laptop. It took a few minutes to mount a volume (160GB on SSD). I uninstalled VC and installed TrueCrypt. Maybe NSA can decrypt it - I don't care - I'm not their target. I just need to protect projects that I'm working on from laptop thieves.
Re: (Score:2, Insightful)
Maybe NSA can decrypt it - I don't care - I'm not their target.
Don't be silly. You are their target. Everyone that fits into one of these groups is a target:
1) Not an American citizen
2) Is an American citizen
Re: (Score:2)
Re: (Score:1)
I suspect you're being sarcastic, but the reason for the delay VeraCrypt has when opening TrueCrypt volumes is explained on the VeraCrypt website. VeraCrypt uses different encryption algorithms and parameters by default than TrueCrypt, and there's no way to "detect" what algorithm was used. Therefore, VeraCrypt first tries to use its default algorithms to open the encrypted volume. If that fails, then it iteratively tries other combinations of encryption algorithms until it finds one that works. It takes it
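The trial-decryption behaviour described above can be sketched roughly like this (hypothetical code; the cipher and hash names are illustrative, not VeraCrypt's actual internals). With no algorithm identifier stored in the volume header, the only way to "detect" the scheme is to try each combination until the header decrypts to something valid:

```python
from itertools import product

CIPHERS = ["AES", "Serpent", "Twofish", "AES-Twofish"]        # illustrative
KDF_HASHES = ["SHA-512", "Whirlpool", "SHA-256", "RIPEMD-160"]  # illustrative

def try_open(password, header, cipher, kdf_hash):
    """Stand-in for deriving a key and checking the header magic/CRC.
    A real implementation runs an expensive PBKDF2 here, which is why
    every failed combination costs noticeable wall-clock time."""
    return header == (cipher, kdf_hash)

def mount(password, header):
    # Iterate over every (cipher, KDF hash) pair until one decrypts
    # the header successfully; return None if nothing matches.
    for cipher, kdf_hash in product(CIPHERS, KDF_HASHES):
        if try_open(password, header, cipher, kdf_hash):
            return f"mounted with {cipher}/{kdf_hash}"
    return None

print(mount("pw", ("Twofish", "SHA-256")))  # mounted with Twofish/SHA-256
```

The worst case (and every wrong-password attempt) pays for the full product of combinations, which is exactly why mounting a legacy TrueCrypt volume under different defaults can take minutes.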
Re: (Score:2)
Another rumor ... (Score:1)
... was that TC was actually developed by the NSA. There's a webpage somewhere arguing for this, partly based around the insistence on anonymity by the (allegedly only 3) developers, on the dubious code provenance and suspicious registration of the "TrueCrypt Foundation", and also on the sheer amount of work it would take to put out TC releases across three platforms and keep them tested and maintained. This would normally take a small company of developers to produce, certainly more than three.
When the
Re: (Score:2)
Once trust is lost, you can't get it back. There is no way to trust the people who are telling you to trust the audit. NSA *could* be anywhere. That doesn't mean that they ARE anywhere, but I can't see any way to trust any software or audit process. (unless you are one of the extremely rare people who can personally audit the code).
If you had a piece of code that *you* knew was completely secure, how could you convince me of that?
Re: (Score:3)
Once trust is granted, all is lost.
One credulous enough to grant trust today, might renew that trust tomorrow.
If you think trust was lost because "NSA," then you might just be credulous enough to be convinced. Maybe not by me, but by a person commanding enough resources and enough parallel constructions that relate to your own life.
The only way not to be deceived by trust is not to trust. Trust lost is trust longing to be re-found.
Wrong order (Score:2)
First, clean-room reverse engineering / rewriting.
Then an audit of the resulting tool. What good does an audited TrueCrypt do if you cannot continue developing it because the license is non-free? So first rewrite it under a free license.
Re: (Score:2)
except no serious software developer will start a project with a new license based on abandonware.
Repost. (Score:1)
I think /. needs to audit their own posts ...
http://it.slashdot.org/story/1... [slashdot.org]