Encryption

First Phase of TrueCrypt Audit Turns Up No Backdoors

msm1267 (2804139) writes "An initial audit of the popular open source encryption software TrueCrypt turned up fewer than a dozen vulnerabilities, none of which so far points toward a backdoor surreptitiously inserted into the codebase. A report on the first phase of the audit was released today (PDF) by iSEC Partners, which was contracted by the Open Crypto Audit Project (OCAP), a grassroots effort that not only conducted a successful fundraising effort to initiate the audit, but also raised important questions about the integrity of the software.

The first phase of the audit focused on the TrueCrypt bootloader and Windows kernel driver; architecture and code reviews were performed, as well as penetration tests including fuzzing interfaces, said Kenneth White, senior security engineer at Social & Scientific Systems. The second phase of the audit will look at whether the various encryption cipher suites, random number generators and critical key algorithms have been implemented correctly."
  • Wow (Score:4, Informative)

    by cold fjord ( 826450 ) on Monday April 14, 2014 @07:00PM (#46751747)

    Wow, a code audit. What a great idea for a FOSS project. [openbsd.org]

    • by Anonymous Coward

There's a difference between an internal audit and an external audit.

  • by Anonymous Coward

Hard to understate just how important this audit is...

    • Re: (Score:3, Informative)

      Hard to understate

      It's not really important at all.

      There, that was easy.

      Or, assuming the AC meant "overstate":

      Without this audit the lives of every person on this planet are doomed to end in fiery death when the Earth plummets into the Sun in 2017!

      Also easy.

Technically, if an NSA backdoor existed in the codebase, you would be prevented from reporting it by an NSA letter, subject to immediate imprisonment and confiscation.

    So, what we can say is that it's clean, insofar as they are permitted to report.

    Verify, then trust.

    • Re: (Score:1, Insightful)

      by Anonymous Coward

Technically, if an NSA backdoor existed in the codebase, you would be prevented from reporting it by an NSA letter, subject to immediate imprisonment and confiscation.

      So, what we can say is that it's clean, insofar as they are permitted to report.

      Verify, then trust.

      "Finally, iSEC found no evidence of backdoors or otherwise intentionally malicious code in the
      assessed areas" - so I guess they are permitted to lie.

      • by 2fuf ( 993808 )

        > I guess they are permitted to lie

        one doesn't need permission to do it anyway

    • by Anonymous Coward

      ITT: People who (a) don't know how US law actually works and (b) assume that everyone in the world is bound by US law.

    • by masonc ( 125950 ) on Monday April 14, 2014 @07:31PM (#46751955) Homepage

      The code is being audited in America. That's pretty funny.
How about an audit in a country where the NSA cannot tell the auditors to shut up?

      • The code is being audited in America.

Is there something preventing an audit elsewhere? Is it illegal to send the source code overseas? And how are these audits done? There aren't a lot of details in TFA. Is it like a big Wiki where anybody can look at the code and report what they find, or are the auditors vetted, with specific sections assigned to them?

        I'm asking seriously. I'm not a developer, so I don't know. But I worry about security and snooping.

        • by Threni ( 635302 )

          The source code is available here:

          http://www.truecrypt.org/downl... [truecrypt.org]

Nothing to stop anyone anywhere from looking. And I don't see how an "NSA letter", even to someone in the USA, would stop them from exercising their First Amendment rights and writing whatever they wanted, or from adding comments to the code and posting them somewhere, etc.

          • Thanks for the info. That's what I'd assumed, and hoped.

            So I'm not sure where this idea that these audits are "American only" or that there is something preventing someone from pointing out a vulnerability comes from.

            Generally, I trust stuff that has lots of eyes on it.

    • by techno-vampire ( 666512 ) on Monday April 14, 2014 @07:34PM (#46751983) Homepage
Tell me this: if the NSA did put a backdoor in the package and if this audit found it, how would the NSA know about it in time to prevent it from being reported? Sending a security letter to the auditors would just be considered proof that there was a backdoor to be hidden. The auditors may have been forced not to reveal anything about it to the general public, but you can bet that the people over at TrueCrypt would have found out about it and eliminated it as soon as possible, although they'd probably have had to pretend that they found the flaw themselves to protect both themselves and the auditors.
Um, put an agent inside iSEC, although we know the NSA would be above that. Spying is not their job.

The NSA was _able_ to put in back doors. According to the report, the build environments were not safe enough and well enough controlled, or verified, to _prevent_ back doors. Given the NSA's strong interest in having one, and their level of skill, I'm afraid I'd have to assume that they did, indeed, create one. Whether a system that is at risk of such a back door is good enough for personal or even business use is something you'd have to decide on a personal basis.

        It does seem a good step in the right direction f

NSA letters, if my occasional skimming on the topic is correct, are gag orders about themselves as well. There are apparently ways to respond publicly and legally to these without revealing one has been received, but it involves not talking rather than talking.
    • by vux984 ( 928602 ) on Monday April 14, 2014 @07:34PM (#46751989)

Technically, if an NSA backdoor existed in the codebase, you would be prevented from reporting it by an NSA letter, subject to immediate imprisonment and confiscation.

      Two responses.

First, I suspect if they were confronted with an NSL they could go the Lavabit route and simply suspend the audit project with no explanation. IANAL, but I don't think the NSA can compel them to falsify the audit results.

      Second, if they are smart, they can have it audited multi-nationally with independent auditors to make it harder for any government gag orders to stick.

      • by Charliemopps ( 1157495 ) on Monday April 14, 2014 @08:14PM (#46752243)

The problem with the NSA is we have no idea what their capabilities are, technologically or legally. They are clearly violating the constitution already, and there seems to be no one willing or capable of stopping them. So if they did come to you with an NSL, no matter how ridiculous or unconstitutional it was, what choice would you have? You could go to the media, but how embedded in the media are they? Do they have standing NSLs with all the media organizations out there? You could go outside the country, but those newspapers are governed by their own country's version of the NSA, which is working in close relationship with ours. This really is a global totalitarian secret police state. They haven't started herding people into camps or anything, but really... what's to stop them?

        • by vux984 ( 928602 ) on Monday April 14, 2014 @08:26PM (#46752325)

          Do they have standing NSLs with all the media organizations out there?

          I think there'd be less Snowden leak coverage if there were. :)

You could go outside the country, but those newspapers are governed by their own country's version of the NSA, which is working in close relationship with ours

Like China & Russia? Governments want their own security as much as their own intelligence agencies want to break it... there are too many pieces moving in opposite directions for there to be a credible global coverup of a transparent audit of open source software.

        • Oh just post it on Slashdot. We'll do the rest.

        • The problem with the NSA

          The problem with the NSA is the same as all other problems: They Exist.

          Government agencies have long since proven they can't be trusted with secrecy. [wikipedia.org] A secret oversight committee just moves the problem around.

        • by Anonymous Coward

          ...what's to stop them?

          Fear of American Citizens who have not yet disarmed.

They are clearly violating the constitution already and there seems to be no one willing or capable of stopping them.

          They are only "violating" the cartoon version of the constitution. The real Constitution is doing ok, at least for this issue.

          This really is a Global totalitarian secret police state. They haven't started herding people into camps or anything, but really... what's to stop them?

          Do you have any links to info about those "FEMA death camps" you care to share?

        • The problem with the NSA is we have no idea what their capabilities are, technologically or legally.

Well, if you read all NSA-related legislation, you should have a good idea of what their LEGAL capabilities are.

Which, unfortunately, means reading basically ALL legislation passed since the NSA was founded, since a rider could have been inserted into unrelated legislation quite easily.

          They are clearly violating the constitution already and there seems to be no one willing or capable of stopping them.

          There are sever

        • "They haven't started herding people into camps or anything" Uhmmm... never heard of Gitmo?
If you find a back door, you publish it IMMEDIATELY, and let the NSA find out that you know about it by reading it on Google News.

      Their security letter doesn't do much at that point.

      What do you do, find the bug and then go ask them if you can publish it?

No, you think you've found a possible security hole and you email your friend Mike and ask him to look it over and see what he thinks. The NSA intercepts the email, and immediately sends you the security letter.

    • [Citation Needed]

      Seriously, where do you get this? You aren't allowed to disclose that you got certain specific requests. Where do you get from there to a ban on free speech in general?

Since Snowden's revelation about the NSA's clandestine $10 million contract with RSA, I hope that as well as checking that the code implements some known encryption algorithm properly, they also confirm that the algorithm itself is mathematically unadulterated (by the NSA or whoever).

    • Re:also (Score:5, Insightful)

      by Shakrai ( 717556 ) * on Monday April 14, 2014 @07:32PM (#46751971) Journal

      Since Snowden's revelation about the NSA's clandestine $10 million contract with RSA,

      If you're on NSA's radar you've got bigger problems than TrueCrypt's trustworthiness or lack thereof. The NSA doesn't have to have a back door into AES (or the other algorithms) when they have an arsenal of zero day exploits, side channel attacks, social engineering, and TEMPEST techniques at their disposal. The average user should be far more concerned about these attack vectors (from any source, not just NSA) than the security of the underlying encryption algorithm.

      The Diceware FAQ [std.com] sums up the problem rather succinctly: "Of course, if you are worried about an organization that can break a seven word passphrase in order to read your e-mail, there are a number of other issues you should be concerned with -- such as how well you pay the team of armed guards that are protecting your computer 24 hours a day."

      • Re:also (Score:5, Interesting)

        by rahvin112 ( 446269 ) on Monday April 14, 2014 @08:01PM (#46752169)

Oh hell, they'll just sneak into your home in the middle of the night and plant a hardware bug or keylogger in your computer.

One of law enforcement's favorite tactics is to install cameras in your residence facing where you normally use your computer. They caught a child pornographer this way; his use of TrueCrypt didn't help, because they had video of him entering the password and simply entered it once they seized the computer.

TrueCrypt cannot reasonably protect you from law enforcement or state-sponsored spying like the NSA's. It might protect you from some non-tech police agency in some shithole country being able to access it, but then they just use the standard non-tech password extraction method.

        Obligatory XKCD. http://xkcd.com/538/ [xkcd.com]

        • by phorm ( 591458 )

One of law enforcement's favorite tactics is to install cameras in your residence facing where you normally use your computer

          At that point I'm pretty sure there should be a warrant involved...

There was, and that's the entire point. You can't win against the state. The state can take action by force; the warrant is a check on that system, but regardless, no matter what you do and what technical precautions you take, the state, if patient and cautious, can easily acquire the information to breach those protections. It can range from the camera put in your house to the $5 wrench. Those advocating for TrueCrypt to protect you from the state are simply wrong that it can.

            • by phorm ( 591458 )

Well, here's the thing. They had enough on the guy to get the warrant to plant the camera. With no encryption (or, in the case of Heartbleed, broken encryption), they can likely find ways to snarf all that information without a warrant, in which case it could (more easily) become a case of "find people fitting profiles we don't like, then sift through all this information and look for something that sticks"

      • Re:also (Score:4, Insightful)

        by Kjella ( 173770 ) on Monday April 14, 2014 @10:21PM (#46752881) Homepage

        If you're on NSA's radar you've got bigger problems than TrueCrypt's trustworthiness or lack thereof.

In case you've been living under a rock for the last year, the target of the NSA is everyone. Not that they put you on the same level as the Chinese military, of course, but nobody's under their radar, and if they can grab your data or metadata easily they will, because you could be a terrorist, or at least the friend of a friend of a friend of a terrorist. It's not that the average joe would stand a chance if they threw everything in their arsenal at us, but those "zero day exploits, side channel attacks, social engineering, and TEMPEST techniques" don't come free, and using them greatly increases the chances of exposing them. The question is more like "Does the NSA grab all the TrueCrypt containers used as backup on Dropbox/GDrive/whatever and rifle through everyone's data?" than "If the NSA really wants the contents of my laptop, would this really stop them?"

The NSA doesn't target you any more than a fisherman targets every tuna.

They are doing a dragnet; if you become a person of interest ... THEN they have this big collection of data on you to use, but before that, you're just another random datapoint that they aren't expending resources on ... or wasting their precious exploits on.

        • by Shakrai ( 717556 ) *

The metadata argument wears thin on me. If my phone number is two or three levels removed from a terrorist, I really don't see why it's objectionable that the Government take a cursory look at my call logs. They'll quickly find that I'm a rather boring sort, whose connection with the terrorist was likely limited to ordering the same takeout, and my privacy isn't significantly impacted by having someone review my call logs after obtaining a court order.

          Traditional police investigative techniques would

          • And when somebody at NSA examines or leaks your metadata, and your wife finds out about the emails to your mistress, or your employer finds out about the emails to a competitor about possibly taking a job there, or somebody finds out about your emails to a $DISEASE support group, or your fondness for Albanian furry porn, no matter how legal, you may have problems, or at least embarrassment.

            If your call logs are actually secure barring a court order, you're fine. If they leak (something like LOVEINT or a

            • by Shakrai ( 717556 ) *

              That argument is bogus, insofar as an employee at Verizon could just as easily leak my call logs, yet few people take exception to Verizon storing such data.

      • Snowden basically walked out of the NSA with all their secrets; who's to say a few dozen or hundred other contractors didn't do the same thing before him? Everything the NSA knew or had access to before 2013 was most likely available in blackhat circles through clandestine leaks.

        Any backdoors in TrueCrypt would be a security disaster, and the NSA has already proven itself willing and able to put backdoors in highly trusted security software. It's also proven itself incapable of keeping secrets.

        Worrying abou

  • by FuzzNugget ( 2840687 ) on Monday April 14, 2014 @07:27PM (#46751929)
    This is why open source is so important.
    • by jones_supa ( 887896 ) on Tuesday April 15, 2014 @01:42AM (#46753837)
      No. This is why thorough code audits are important.
      • by AmiMoJo ( 196126 ) *

        Yes, but who will audit the audit? Because it is open source we can meta-audit, much like how Slashdot meta-moderates. Otherwise the audit would be useless to us, much like a corporation paying for an audit of itself and presenting that to the public as proof of its good work.

        • Yes ... you can meta-audit ... how's OpenSSL working for you?

          Open source is only useful if someone looks AND has the skills to understand it.

          Just being open source doesn't mean dick and you fanboys really should get that through your head. You all stand around waxing on about how 'many eyes' see it ... assuming SOMEONE ELSE is looking ... and no one actually is because ... because ... 'its open source! anyone can look!!@$!@%!@%&'

When are you guys going to actually come back to reality? OSS is great fo

      • Yeah, because it's so easy for the public to audit closed-source software.

    • This is why open source is so important.

How so? TrueCrypt is neither Open Source nor Free Software. It's freeware (i.e. proprietary).

TrueCrypt: Free open-source disk encryption software for Windows 7/Vista/XP, Mac OS X, and Linux

        How so? TrueCrypt is neither Open Source or Free Software. It's freeware (ie. proprietary).

        Right, TrueCrypt is not "Open Source", it's "open-source".

  • by Anonymous Coward on Monday April 14, 2014 @07:30PM (#46751945)

    The first phase of the audit focused on the TrueCrypt bootloader and Windows kernel driver. Not really surprising that they didn't find any critical security issues in those parts. The high value bugs should be in the crypto parts and how they are implemented.

    • by epyT-R ( 613989 ) on Monday April 14, 2014 @10:52PM (#46753091)

The crypto is implemented in the driver, as well as the bootloader. The application known as truecrypt just flips their configuration bits around, loads keys into RAM, and tells the driver when to mount/dismount containers, etc. The bootloader needs to know enough to mount the system partition and hook into the BIOS so that the regular OS bootloader can take over using its normal calls. Once it loads the kernel and related drivers, truecrypt.sys takes over handling container IO.

The separate formatting utility probably contains some crypto code too, since it's used to create containers.

  • by Anonymous Coward

Isn't it possible to just have your backdoor inserted by the compiler?

  • memset() is bad? (Score:5, Interesting)

    by Anonymous Coward on Monday April 14, 2014 @07:45PM (#46752055)

    I've been coding in C a long time and one of the medium security faults makes no sense to me:
    "Windows kernel driver uses memset() to clear sensitive data"
    The reasoning they give is:
    "...However, in a handful of places, memset() is used to clear potentially sensitive data. Calls to memset() run the risk of being optimized out by the compiler."

    WTF?!?
I suppose a smart compiler can optimize out a memset() if it's directly preceded by a calloc() or something, but I have never had any compiler just ignore my request to memset().
    What am I missing here?

    • Re:memset() is bad? (Score:4, Informative)

      by Anonymous Coward on Monday April 14, 2014 @08:17PM (#46752271)
Great article, including the OpenSSL bug(s) he pointed out... I was expecting something esoteric, but it turned out to be really straightforward, i.e. the type of error you make at 2 a.m.: taking the size of the pointer instead of the actual size of the buffer.

        • was expecting something esoteric but turned out to be really straightforward

          I think you failed to notice that the page talks about two separate bugs. In the first one, the memset() really is completely removed by optimization.

          the type of error you make at 2am, taking the size of the pointer instead of the actual size of the buffer

          I'd argue that's an error one might make any time of the day. The sizeof() operator is ambiguous. Consider the following example:

          #include <stdio.h>

          int main(void) {
              char a[100];
              char *b = a;
              printf("address of a is %p\n", (void *)a);
              printf("address of b is %p\n", (void *)b);
              printf("size of a is %zu\n", sizeof(a)); /* 100: the whole array */
              printf("size of b is %zu\n", sizeof(b)); /* 4 or 8: just the pointer */
              return 0;
          }

    • Re: (Score:2, Informative)

      by Anonymous Coward

      https://www.securecoding.cert.org/confluence/display/cplusplus/MSC06-CPP.+Be+aware+of+compiler+optimization+when+dealing+with+sensitive+data

    • Re:memset() is bad? (Score:5, Informative)

      by canajin56 ( 660655 ) on Monday April 14, 2014 @08:20PM (#46752295)

      As a special case, MSVC++ removes memset(array,value,sizeof(array)) if array isn't read again before the end of its scope.

      For example

      void Foo()
      {
          char password[MAX_PASSWORD_LEN];
          InputPassword(password);
          ProcessPassword(password);
          memset(password, 0, sizeof(password));
      }

      The MS compiler will delete the memset. In Windows you should use RtlSecureZeroMemory to zero out memory you want to keep secure.
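
      A minimal sketch of the fixed version, reusing the hypothetical InputPassword/ProcessPassword helpers and MAX_PASSWORD_LEN constant from the example above (SecureZeroMemory is the user-mode macro form of RtlSecureZeroMemory, pulled in via windows.h):

      #include <windows.h>

      #define MAX_PASSWORD_LEN 64 /* placeholder size for this sketch */

      void InputPassword(char *buf);         /* hypothetical */
      void ProcessPassword(const char *buf); /* hypothetical */

      void Foo(void)
      {
          char password[MAX_PASSWORD_LEN];
          InputPassword(password);
          ProcessPassword(password);
          /* Documented never to be optimized away, so the wipe
             survives dead-store elimination. */
          SecureZeroMemory(password, sizeof(password));
      }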

    • Re:memset() is bad? (Score:4, Interesting)

      by philcolbourn ( 1150439 ) on Monday April 14, 2014 @08:22PM (#46752307)
Say you store a password in a memory buffer. Use it. Then overwrite it with a call to memset. If this buffer is never used again, a compiler may think this is a wasted write and optimise out this call to memset.
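
      A minimal sketch of that pattern (the consumer function is hypothetical); a compiler that can prove the buffer is never read after the memset() may legally delete the call under the "as-if" rule:

      #include <string.h>

      void use_key(const unsigned char *key); /* hypothetical consumer */

      void handle_secret(void)
      {
          unsigned char key[32];
          /* ... fill key with secret material, then use it ... */
          use_key(key);
          /* Dead store: key is never read again, so an optimizing
             compiler is free to drop this entire call. */
          memset(key, 0, sizeof(key));
      } /* key's lifetime ends here; the secret may still sit in RAM */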
    • If you call memset on some allocated memory and then free that memory, what (apart from clearing sensitive data from physical RAM) functional difference does removing the call to memset make? None?

      • by rdnetto ( 955205 )

        If you call memset on some allocated memory and then free that memory, what (apart from clearing sensitive data from physical RAM) functional difference does removing the call to memset make? None?

        The longer the data remains in memory, the wider the window to read it via some other exploit. (Also, anything running as root could potentially access it.) This is precisely what happened with Heartbleed.

        • But the program performs functionally the same.
          That's the rule followed when doing compiler optimisations.

          memset has nothing to do with Heartbleed by the way, nor does any compiler optimisation.

You also don't guarantee the original data is overwritten. If your application is paged out of RAM before the call to memset, when it gets loaded back into RAM it can be pointing to a different physical memory location. You're now overwriting.... something completely different.

          • by rdnetto ( 955205 )

            But the program performs functionally the same.
            That's the rule followed when doing compiler optimisations.

            memset has nothing to do with Heartbleed by the way, nor does any compiler optimisation.

            The program will generate the same output yes, but the security implications are not the same.
            This is actually tangentially related to heartbleed - if the memory had been zeroed when freed, the scope of the exploit would have been greatly reduced, as only currently allocated blocks would have been vulnerable. Furthermore, the most common reason for using custom mallocs in security-critical applications is to do exactly that - to zero all memory immediately upon freeing.

            Zeroing memory like this is a common p [viva64.com]
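
            A minimal sketch of that zero-on-free idea (function names hypothetical; a real implementation would also worry about alignment and about making the wipe non-elidable, as discussed above):

            #include <stdlib.h>
            #include <string.h>

            /* Allocate with a size header so the wrapper knows how much to wipe. */
            void *secure_malloc(size_t n)
            {
                size_t *p = malloc(sizeof(size_t) + n);
                if (p == NULL)
                    return NULL;
                *p = n;
                return p + 1;
            }

            void secure_free(void *ptr)
            {
                if (ptr != NULL) {
                    size_t *p = (size_t *)ptr - 1;
                    size_t n = *p;
                    memset(p, 0, sizeof(size_t) + n); /* wipe header + payload */
                    free(p);
                }
            }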

            • This is actually tangentially related to heartbleed - if the memory had been zeroed when freed, the scope of the exploit would have been greatly reduced, as only currently allocated blocks would have been vulnerable

              The blocks holding the certificate private key are always allocated, so always vulnerable.

              This is completely incorrect. Until it is freed (or realloc'ed), the address returned by malloc will point to the same data, regardless of whether it is in the L1 cache, RAM, or paged to disk. Were this not the case, each program would need to implement its own MMU.

So virtual memory is completely useless, because paging to disk doesn't free up physical RAM for other processes?

              Perhaps you should have read the article linked in the article you linked. http://www.viva64.com/en/k/004... [viva64.com]

              There is SecureZeroMemory() function in the depths of Win32 API. Its description is rather concise and reads that this function overwrites a memory region with zeroes and is designed in such way that the compiler never eliminates a call of this function during code optimization.

              So don't use memset to zero memory.

There is still the risk that one process reads data from RAM that another process was using, unless the OS zeroes out the memory before allocating it.
That's so

    • by fisted ( 2295862 )
No idea why the paper talks about the compiler optimizing it out; that's obviously wrong. However, in the next paragraph, it reveals that swap space is the reason. You might, after the page fault and swap-in, initialize the buffer via memset -- however, this doesn't erase the previous data from swap space. Apparently, some "secure" memset-like routine does that.
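
      One common mitigation for the swap problem (not something the report says TrueCrypt does) is to pin sensitive buffers in physical RAM with mlock(2), so they are never written to swap in the first place; a minimal POSIX sketch:

      #include <string.h>
      #include <sys/mman.h>

      int with_locked_secret(void)
      {
          unsigned char secret[4096];

          /* Pin the pages holding the buffer so they are never swapped out. */
          if (mlock(secret, sizeof(secret)) != 0)
              return -1;

          /* ... load and use the secret ... */

          /* Wipe before unlocking; in real code use a non-elidable wipe,
             since this memset is itself a dead store. */
          memset(secret, 0, sizeof(secret));
          munlock(secret, sizeof(secret));
          return 0;
      }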
    • WTF?!?

      WTF indeed.

There seems to be a major trend towards making compilers create code that is as different as possible from what the programmer wrote without being so different that the programmer actually notices. One might assume it's a secret NSA plot to defeat security measures in all software everywhere. You know, if one were incredibly paranoid, that is.

      It's hard to say whether this is justified behavior. As an example, consider this code from a link an AC posted [viva64.com]:

      int
      crypto_pk_private_sign_digest(....)
      {

That, and memset on Windows doesn't zero by default, as an optimization, until the page is hit (or some such pattern that I don't fully recall)

There's a specific kernel API for zeroing memory because memset, even if called, may choose not to do anything. ZeroMemory is the generic way; SecureZeroMemory removes the 'option' of skipping the zeroing and always does it.

      Using memset to scrub memory on Windows, then not doing anything with it that requires the memory to actually be in active use

    • What you're missing is that some compilers get very aggressive about removing code when optimizing. I don't have the C standard here, but the C++ standard says the compiler can do anything as long as it keeps volatile variable access and calls of I/O library routines the same, in the same order. This means that, if you have a chunk of memory and memset() it and nothing of that chunk is referenced for an I/O operation or volatile variable access, it can go.

      Whether this is a good thing is debatable. I'm
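
      A portable workaround exploits exactly this rule: route the wipe through a volatile function pointer, so the compiler cannot prove what the call does and must keep it. This is a common idiom (similar in spirit to OpenSSH's explicit_bzero); a sketch, not anything from the TrueCrypt code:

      #include <string.h>

      /* The compiler must assume a volatile pointer's value can change
         at any time, so the call through it cannot be proven dead. */
      static void *(*volatile memset_fn)(void *, int, size_t) = memset;

      static void secure_memzero(void *p, size_t n)
      {
          memset_fn(p, 0, n);
      }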

The next question to answer is: Can Heartbleed compromise TrueCrypt?

  • by kbg ( 241421 ) on Monday April 14, 2014 @08:52PM (#46752467)

The backdoor is not in the source; it is in the MSVC++ compiler. The NSA is not stupid: putting the backdoor in the source itself would be risky. It would be much wiser to put the backdoor in the MSVC++ compiler itself.

    • One way to detect a backdoored compiler to a fairly high certainty is diverse double-compiling [dwheeler.com], a method described by David A. Wheeler that bootstraps a compiler's source code through several other compilers. For example, GCC compiled with (GCC compiled with Visual Studio) should be bit for bit identical to GCC compiled with (GCC compiled with Clang) and to GCC compiled with (GCC compiled with Intel's compiler). But this works only if the compiler's source code is available. So to thwart allegations of a backdoor in Visual Studio, perhaps a better choice is to improve MinGW (GCC for Windows) or Clang for Windows to where it can compile a working copy of TrueCrypt.
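
      The final step of the DDC check is nothing more exotic than a bit-for-bit comparison of the two stage-two binaries (cmp would do); a trivial C sketch of that comparison, with the file names hypothetical:

      #include <stdio.h>

      /* Returns 1 if the two files are bit-for-bit identical. */
      static int files_identical(const char *pa, const char *pb)
      {
          FILE *a = fopen(pa, "rb"), *b = fopen(pb, "rb");
          int ca = 0, cb = 0, same = (a != NULL && b != NULL);

          while (same && ca != EOF) {
              ca = fgetc(a);
              cb = fgetc(b);
              same = (ca == cb);
          }
          if (a) fclose(a);
          if (b) fclose(b);
          return same;
      }

      int main(void)
      {
          /* Hypothetical stage-2 outputs: GCC built via two different
             trusted bootstrap compilers. */
          return files_identical("gcc-stage2-via-msvc",
                                 "gcc-stage2-via-clang") ? 0 : 1;
      }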
      • We know there's a difference between Windows containers and Linux containers, that being the ~64KB of random data at the end of the header for a Windows container instead of ~64KB of 0's in a Linux container.

        This difference is not a result of some difference in the source code of Truecrypt when compiled under Windows. Where could the backdoor be?

Crede quod habes, et habes. ("Believe that you have, and you have.")
--
I do not speak for the truth of foreigners.
