Security / Encryption

TrueCrypt Cryptanalysis To Include Crowdsourcing Aspect

msm1267 (2804139) writes "A cryptanalysis of TrueCrypt will proceed as planned, said organizers of the Open Crypto Audit Project, who announced the technical leads of the second phase of the audit and said there will be a crowdsourcing aspect to phase two. The next phase of the audit, which will include an examination of everything including the random number generators, cipher suites, crypto protocols and more, could be wrapped up by the end of the summer."
This discussion has been archived. No new comments can be posted.

  • While we're on the topic of crowdsourcing and truecrypt, how about we get someone to rebuild it open sourced?
    • Re:Crowdsourcing (Score:5, Insightful)

      by cheater512 ( 783349 ) <nick@nickstallman.net> on Monday June 02, 2014 @05:04PM (#47149653) Homepage

      Why? It is already open sourced.

      • I think eedwardsjr meant "make it free software" even though he/she typed "open source".

        • Re:Crowdsourcing (Score:5, Interesting)

          by NReitzel ( 77941 ) on Monday June 02, 2014 @05:29PM (#47149857) Homepage

          Well,

          Since the TrueCrypt devs have decided to drop their project, and the project has been open source from day one, I'm going to suggest this is a good time for a fork.

          It would (will) be educational to see who goes to court to stop it.

          • Re:Crowdsourcing (Score:5, Interesting)

            by rahvin112 ( 446269 ) on Monday June 02, 2014 @06:49PM (#47150507)

            It's open source, but not FOSS.

            You can't fork it. The license is actually highly restrictive. The only options are a total reimplementation under the GPL or a BSD license, or to keep using the last version in perpetuity.

            • by Anonymous Coward

              Where do you get this? When I read the license, it reads as largely less restrictive than GPL 3.0. Section III of the license discusses exactly what is required to create derivative products. Basically, you have to make sure that no one will confuse it with TrueCrypt, you have to make the source available, and you can't change the license.

              The only problem I can see with it from the perspective of the people around here is that it wasn't spawned by Stallman.

            • Re:Crowdsourcing (Score:4, Insightful)

              by vux984 ( 928602 ) on Monday June 02, 2014 @07:41PM (#47150853)

              You can't fork it.

              Are you sure?

              The license is actually highly restrictive.

              Only insofar as lawyers don't like the wording, as it's a bit ambiguous on some fine details; but it's not as restrictive as you seem to think.

              Moreover, for the license to actually be a problem someone would have to come forward, establish they actually have copyright standing, and then sue you over making a fork.

              So realistically, what are the risks? That some anonymous devs who shut down the project and have advocated that everyone switch to alternative systems are going to come out of the woodwork to sue you for copyright infringement and 'damages', despite your best efforts to follow their license, which DOES actually allow forking, and despite the fact that you wouldn't be charging for copies? So there are no profits to sue for, and then there is the acute impossibility of you 'damaging' their interests, given that they discontinued the original project and burned it to the ground.

              I honestly don't understand the fear. I mean, sure, there is a risk there, but if you incorporate a nonprofit, continue to give it away for free, and retain the terms of the license, the risk is small.

              Even if the authors did come out of the woodwork and sue you, so what? So your non-profit shuts down - worst case. Far more likely you just walk away with little more than a cease and desist and/or a small fine, and that's assuming the court even finds against you, which (given the ambiguity of the license and your attempt to adhere to it as best you can) isn't all that likely in the first place.

              Yet the lawyers say it's 'highly restrictive' and 'dangerous' to anyone who goes near it -- the same lawyers who approved the non-compete clauses that now have Silicon Valley under a class action? Where was their sage advice about risk then?

              • Re:Crowdsourcing (Score:5, Informative)

                by xeoron ( 639412 ) on Monday June 02, 2014 @10:25PM (#47151833) Homepage
                As of last weekend, it is in the process of being forked. New community site here [truecrypt.ch]
            • by epyT-R ( 613989 )

              The law doesn't define reality. It's unlikely they will come forward to sue, so the license is just a letter telling us how angry they will be.

            • Re:Crowdsourcing (Score:4, Insightful)

              by Pieroxy ( 222434 ) on Tuesday June 03, 2014 @01:28AM (#47152531) Homepage

              Who is going to stop you? The authors are anonymous so who could claim to be the copyright holder to come and stop you?

            • by Anonymous Coward

              It was already forked at least twice. For example, RealCrypt.

        • AFAIK, it is neither free software nor open source. If you cannot fork it, it's not open source, even if the source code is published.

          Open source means open to read, modify, and redistribute, not merely open to read.

      • Why? It is already open sourced.

        The TrueCrypt source is also - by most accounts - a huge ungodly mess that hasn't seen a significant update in at least the past two years.

        • A lot of GNU tools haven't been updated in around two decades yet no one feels like they need to be rewritten.

          I was shocked to find out the other day that the cron most Linux distributions use was last updated in 1993.

          • by mpe ( 36238 )
            A lot of GNU tools haven't been updated in around two decades yet no one feels like they need to be rewritten.

            If it ain't broke don't try to "fix" it.

            I was shocked to find out the other day that the cron most Linux distributions use was last updated in 1993.

            How have the requirements of cron changed in the last 20, even 40, years?
            • Not much is broken with TrueCrypt, judging from the audit's initial results.

            • A lot of GNU tools haven't been updated in around two decades yet no one feels like they need to be rewritten.

              If it ain't broke don't try to "fix" it.

                I was shocked to find out the other day that the cron most Linux distributions use was last updated in 1993.

              How have the requirements of cron changed in the last 20, even 40, years?

              Where is my microsecond scheduling, you insensitive clod?

          • by Anonymous Coward

            A lot of GNU tools haven't been updated in around two decades yet no one feels like they need to be rewritten.

            I was shocked to find out the other day that the cron most Linux distributions use was last updated in 1993.

            And I am shocked that people have to reinvent the wheel over and over. Not to mention skipping regression checks and bringing out a new and 'better' version which lacks features. There is a point when 'simple' tools are done and just do their job. IIRC tcpwrapper is in the same boat and being droppe

        • Re:Crowdsourcing (Score:5, Informative)

          by Kjella ( 173770 ) on Monday June 02, 2014 @07:19PM (#47150715) Homepage

          The TrueCrypt source is also - by most accounts - a huge ungodly mess that hasn't seen a significant update in at least the past two years.

          Not seen a significant update in at least two years, check. But a huge, ungodly mess? Nah. It's 4.45 MB uncompressed; subtract 491 kB of bitmaps and icons, 902 kB of user guide, 117 kB of license and readme texts in several versions, 250 kB of string localization, and 150 kB of resource, project and solution files, and you're talking approximately 2.5 MB of code, divided into several logical directories. I skimmed the main files and they look decently formatted and commented, on the longish side but with plenty of whitespace. I think probably under 100 kLOC total, a lot of it standard cryptographic primitives, installer, GUI and so on. Once you've made sure they don't contain any funny business, the actual logical core seems to be more like 20-30 kLOC, quite manageable for one man to grasp.
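          As a rough cross-check of that estimate, here is a minimal sketch in Python that counts non-blank lines in the code files of an unpacked source tree (the directory name is a hypothetical placeholder, not an official path):

              from pathlib import Path

              CODE_EXTENSIONS = {".c", ".cpp", ".h", ".S", ".asm"}
              root = Path("truecrypt-7.1a-source")   # hypothetical unpacked source tree

              total = 0
              for path in root.rglob("*"):
                  # Count only non-blank lines in C/C++/assembly files.
                  if path.is_file() and path.suffix in CODE_EXTENSIONS:
                      text = path.read_text(errors="ignore")
                      total += sum(1 for line in text.splitlines() if line.strip())

              print(total, "non-blank lines of C/C++/assembly")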

          • by epyT-R ( 613989 )

            Yeah, it seems reasonably well done, compared with today's 50 MB 'utilities' and their huge runtimes and crazy dependencies.

          • Re:Crowdsourcing (Score:5, Informative)

            by WaywardGeek ( 1480513 ) on Monday June 02, 2014 @10:23PM (#47151819) Journal

            It's actually just a bit over 110 kLOC, but you were close. The crypto code is mostly very good. The GUI code must have been written by someone else, because it totally sucks, IMO. I was just porting it to wxgtk3.0 today from wxgtk2.8, and of course all the crypto compiled without even a warning, other than some AES code I need to look into. The GUI was a freaking nightmare. They implemented their own string class. How stupid is that? Well, they didn't just implement a string class, but they implemented a directory string class, a filename string class, a "volume" string class, a "volume info" string class, and about a dozen other string classes, most of which don't actually have any useful functionality, and just require all kinds of casting operators. Stupid stupid stupid...

            I haven't looked at the firewall between the GUI and crypto code yet. Obviously there's a fuse driver in Linux and I would not expect it to link with the GUI code at all, but I need to check. Given that the crypto code rocks, and the GUI code sucks, it's critical that they be in separate processes. That would be needed in any case, since you can't trust all that GUI library code living in the same process as the crypto core.

            • by mpe ( 36238 )
              The GUI was a freaking nightmare. They implemented their own string class. How stupid is that? Well, they didn't just implement a string class, but they implemented a directory string class, a filename string class, a "volume" string class, a "volume info" string class, and about a dozen other string classes, most of which don't actually have any useful functionality, and just require all kinds of casting operators.

              Sounds like the GUI came from a completely different project. Possibly even on a different p
            • by g00ey ( 1494205 )
              What would you say about those who claim that the deniable encryption doesn't work because the parts of an encrypted volume that hold actual data have lower entropy than the parts that hold the random data? I cannot understand that claim since, as far as I understand it, encryption algorithms such as AES use probabilistic encryption [wikipedia.org] and should produce output with entropy as high as random data. Usually high-entropy data is associated with data that is hard to compress (especially when discussing lossy compression of v
              • Re:Crowdsourcing (Score:4, Interesting)

                by WaywardGeek ( 1480513 ) on Tuesday June 03, 2014 @06:31AM (#47153355) Journal

                According to this security analysis [privacy-cd.org], there is a 64K-ish block in the header that is filled with random data in Windows, but encrypted 0's in Linux. There's no simple way to ensure the Windows header is indistinguishable from true random data, but the Linux version should be OK. As for the rest of the unused portion of the volume, I haven't checked the code. If it's using a pseudo-random number generator that isn't cryptographically strong, then it may be distinguishable. However, the entropy argument seems wrong to me. If the unused portion has measurably lower entropy than true random data, then the random number generator in question must have been compromised.
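                To make that last point concrete, here is a minimal sketch (assuming the third-party Python "cryptography" package, and using AES-CTR purely for illustration rather than TrueCrypt's XTS mode) showing that ciphertext from a sound cipher measures just as "random" as output from the OS RNG:

                    import collections, math, os
                    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

                    def entropy_bits_per_byte(data):
                        # Shannon entropy of the byte distribution; 8.0 is the maximum.
                        counts = collections.Counter(data)
                        n = len(data)
                        return -sum(c / n * math.log2(c / n) for c in counts.values())

                    key, iv = os.urandom(32), os.urandom(16)
                    zeros = bytes(1 << 20)                       # 1 MB of zero bytes
                    enc = Cipher(algorithms.AES(key), modes.CTR(iv)).encryptor()
                    ciphertext = enc.update(zeros) + enc.finalize()

                    print("os.urandom :", round(entropy_bits_per_byte(os.urandom(len(zeros))), 3))
                    print("AES-CTR(0s):", round(entropy_bits_per_byte(ciphertext), 3))
                    # Both should print roughly 7.99x bits/byte; a measurable gap
                    # would point to a broken cipher or RNG, not to hidden data.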

        • by epyT-R ( 613989 )

          Just because something hasn't been updated doesn't automatically mean it's broken. Everyone's hopped on to this nonsensical upgrade treadmill. Software doesn't 'wear out.' If it's not buggy, it will stay buggy. If it's working, it will stay working.

          As far as supported vs unsupported software goes, you should be assuming your system can be compromised and planning accordingly anyway, whether you get updates or not.

          • by epyT-R ( 613989 )

            errr... "If it's not buggy, it will stay not buggy." sorry.. Obviously, if software around it changes, bugs can crop up, but technically that's not a failure of the existing software.

          • by Desler ( 1608317 )

            Software doesn't 'wear out.' If it's not buggy, it will stay buggy. If it's working, it will stay working.

            Only true if you never upgrade any part of the system it runs on. Any upgrade to the OS or its dependencies (the dependencies of those dependencies, ad infinitum) and you risk introducing bugs.

          • Just because something hasn't been updated doesn't automatically mean it's broken. Everyone's hopped on to this nonsensical upgrade treadmill. Software doesn't 'wear out.' If it's not buggy, it will stay buggy. If it's working, it will stay working.

            As far as supported vs unsupported software goes, you should be assuming your system can be compromised and planning accordingly anyway, whether you get updates or not.

            That's true for something like an ASCII text editor where the requirements are dead simple. However, when encryption, and in particular fancy-tricks encryption like deniability, is part of the requirements, you bet your ass that problems will appear out of nowhere. Humans make mistakes, and humans make software, so humans make software with mistakes. Just because it passed every practical review and test the first time around doesn't make it future-proof. With the source code and enough time, someone wi

      • I take it to mean crowdsourcing the attempts to verify the integrity of TrueCrypt.

        White said the next phase of the cryptanalysis, which will include an examination of everything including the random number generators, cipher suites, crypto protocols and more could be wrapped up by the end of the summer. Some of the work, White said, could be crowdsourced following a model used by Matasano, known as the Matasano Crypto Challenges. The now-defunct challenges were a set of more than 40 exercises demonstrating

      • Sorry. It is hard to convey in written words, but the emphasis in my sentence is on the word 'rebuild'. I would rather they rebuild it open source than closed source. Since I am implying a rewrite, it would be their prerogative.
  • If the TrueCrypt devs really gave up because they think it is pointless, then they should open source the code (BSD, Apache2, GPL, MIT). There is no reason not to, unless they had contributors who passed away.

    So finally, was the duress canary activated or not? If it is "still there" according to that tweet, that should mean it was not activated.

    Btw, tc-play is not a solution, because it is Linux/BSD only.

    • by Threni ( 635302 )

      > So finally, was the duress canary activated or not? If it is "still there" according
      > to that tweet, that should mean it was not activated.

      If it was clear that it had been activated, then it would breach the NSL and the authors would be at risk of legal action. Therefore, you will not see a clear warrant canary.

      There was no info in that tweet, and even Matthew Green doesn't know what they were talking about. It was just clickbait to take you to a site with old news.

      • by gweihir ( 88907 )

        I think by now things are clear enough: The alternatives immediately after it happened were defacement or canary. As a defacement would have been cleaned up by now, it has to be canary. And yes, the developers would go to prison if they made that really clear, so a minimum of independent intelligence is required to see it.

        • by muridae ( 966931 )

          If it is a NSA/NSL canary, then the devs are restricted in what they can say about why they are abandoning the project. The logical choice, and the easiest lie to remember, is that "we are just tired of developing it."

          Which, unfortunately, is also the same exact thing they would say if they were just giving up on developing it. So the only real clues are the content of the current web page, and the changes made to the new 7.2 TrueCrypt. That they suggest using BitLocker without a TPM chip (I never thought I

          • by gweihir ( 88907 )

            That they suggest using BitLocker without a TPM chip (I never thought I'd be suggesting the use of a pre-made TPM chip; honest) and that the solution involves upgrading to the pro version of windows . . . it doesn't pass the smell test. Serious crypto guys wouldn't suggest those tools when drunk, much less just because they are quitting.

            Indeed. Or the fact that for OS X, they give "encryption: none" as selection. Or the slap-dash look. (My guess is the look is specifically to suggest a defacement in order to get maximum press exposure and make the message even clearer when a few days later it becomes clear it is not a defacement.)

            All quite clear, but the reasoning required seems to exceed the effective intelligence some people have.

    • by gweihir ( 88907 )

      It looks more and more like a not-too-pretty negative canary: like a website self-defacement that is automatically triggered if a number of people fail to do some things regularly. Really open-sourcing things needs work; the ridiculous travesty of the original website can be put up automatically.

  • According to Ken Thompson, if you don't also analyze all the tools involved in the software build and load process at the machine code level, you still can't really trust the code [bell-labs.com]. That means compilers, linkers, loaders, etc. Someone who knows what they are doing, and has enough motivation to go through the effort, could insert code into a compiler that does whatever they want when your code is built with it, and hides itself at the source level.

    These days CPUs are sophisticated enough that you probably w

    • by jcochran ( 309950 ) on Monday June 02, 2014 @05:22PM (#47149811)

      You just might want to look into 'Diverse Double-Compiling' as a method of countering the attack described by Ken Thompson in 'Reflections on Trusting Trust'. A paper on DDC is at http://www.acsa-admin.org/2005... [acsa-admin.org]
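      In outline, DDC is just two compiles and a byte-for-byte comparison. A minimal sketch in Python, where the compiler names and file paths are hypothetical placeholders rather than a real toolchain:

          import hashlib, subprocess

          def sha256(path):
              with open(path, "rb") as f:
                  return hashlib.sha256(f.read()).hexdigest()

          def compile_with(compiler, source, output):
              subprocess.run([compiler, source, "-o", output], check=True)

          # Stage 1: build the compiler-under-test from its published source using a
          # second, independently produced ("trusted") compiler.
          compile_with("trusted-cc", "compiler.c", "stage1")

          # Stage 2: rebuild the same source with the stage-1 result.
          compile_with("./stage1", "compiler.c", "stage2")

          # If stage2 is bit-for-bit identical to the binary you were given, that binary
          # corresponds to the published source (assuming a deterministic build).
          print("match" if sha256("stage2") == sha256("untrusted-cc-binary") else "MISMATCH")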

      • by T.E.D. ( 34228 )
        Interesting idea. But I see two problems there:
        1. It doesn't do anything about the same issue with linkers, the OS's executable loader, your CPU, etc. I suppose you could also try to apply the same concept to them, but then you get to my next issue...
        2. If your problem is that you don't know if you can trust your compilers, a solution that starts with "first, go get a trusted compiler" is kind of an infinitely recursive solution.
        • It does address the issues you mentioned. As for the tool chain (compiler, linker, loader, etc), that is addressed by making them diverse. The term 'compile' means the entire chain from source to binary which includes the entire tool chain. As for the CPU issue, there's nothing in the source that mandates that you must create a binary for the same CPU as you're executing on. So do DDC on multiple CPU families (Intel, ARM, PPC, etc) and compare the final results. And the beauty of DDC is you can do it even i

          • by muridae ( 966931 )

            And for compiling something like a basic C compiler, one could feasibly write their own using ASM from a base of something like CC500 (a 600ish line C compiler). Use said custom compiler to build something like PintOS (full code review possible by one person, I had to do so in collegiate OS courses) on a micro that is running nothing but your compiler from a RS232 port that you are monitoring with a logic analyzer (to watch out for stray data from the 'host' computer at this point). This gets you up to OS a

        • by gweihir ( 88907 )

          The OS, loader and CPU are minor issues, as they do not have the power to analyze your code. And getting a compiler you trust is simple: Write it yourself. Unless the compiler you use to do that was specifically designed to attack your compiler, it will be ineffective.

          Incidentally, the risk of this attack actually happening in practice is very low, as it is exceedingly difficult to implement and as soon as it has bugs the risk of discovery is pretty big.

          • by T.E.D. ( 34228 )

            The OS, loader and CPU are minor issues, as they do not have the power to analyze your code.

            You mean are not supposed to have. You are being awfully trusting here of un-analyzed code.

            And getting a compiler you trust is simple: Write it yourself.

            In what universe is that simple? Writing a functional C compiler takes on the order of man-decades. C++ is a factor of 10 longer [stackoverflow.com]. For a large team, you'd have to somehow be able to trust your entire team (and your network security!) for that entire time. For a single person, it would be a lifetime's work (or more).

            • by gweihir ( 88907 )

              Analyzing code is a high-effort task that needs a big database and lots and lots of cycles.

              As to the effort of writing a simple, crappy C compiler: If you need more than a month, then you are just not a good programmer.

    • Actually, this isn't true.
      Because an encrypted container created with "weak" encryption won't be able to be decrypted by a build that doesn't have the same weakness.

      It's also the reason BitLocker isn't a replacement - I can't use BitLocker on Linux. I use TrueCrypt containers to store stuff in the cloud and access them from a variety of machines.

      What it really needs is some tidying up: forget about whole-disk encryption and concentrate on making sure the install is safe from tampering.

      • by gweihir ( 88907 )

        You cannot make encryption weak by compiling it badly. You can cause leaks of key material, but only at times the material is actually in use and for disk encryption that attack is irrelevant.

        • by muridae ( 966931 )

          Why not? Assume, for discussion, a malicious compiler. It looks for common code used in encryption and changes parts of the code (see Reflections on Trusting Trust). Identifying the keys should not be that hard with known algorithms, so go for that. Then just replace all keys with 0xDEADBEEF or another known pattern of bits. Voilà: encrypted data that can be opened only with code compiled via the corrupt compiler, or by the attacker who knows what bit pattern was used.

          This would also be why verifying that T

          • by gweihir ( 88907 )

            Simple: The result of the encryption has to be bit-identical for things to work. Attacks on crypto by corrupting the cipher are not practical for widely distributed software. And even if you corrupt the cipher, it has to be bijective in order to work at all. Not easy.

        • "But only at times the material is actually in use". .....

          Or when such key material is encrypted into the file with a master key.....

          • by gweihir ( 88907 )

            At that time, it is obviously "in use".

            • I'm not sure you understood me correctly.
              You create an encrypted container using the password "superstrongnoonecanaccesspassword".

              Then your container has "thiscontainerspassword=superstrongnoonecanaccesspassword" put into it,
              encrypted with
              AllTheFedsHaveThisDecryptKey.

              Like Bitlocker.

              • by gweihir ( 88907 )

                Ah, I see. That is not how it works in any sane design. In a sane design, the password is unknown to the container and protects a master key that has been generated in a cryptographically strong fashion. In a sane design, there also is no space where you could put a copy of the master key or password "protected" in the fashion you describe.

                Of course, BitLocker may just do that, exactly as you describe, and not be a sane design. One more reason to insist that crypto-software is open, the metadata and design is
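                For illustration only, a minimal sketch of that design (using the third-party Python "cryptography" package with made-up parameters; this is not TrueCrypt's actual header format): the password only ever derives a key-encryption key, which wraps a randomly generated master key.

                    import os
                    from cryptography.hazmat.primitives import hashes
                    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
                    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

                    def create_header(password: bytes):
                        master_key = os.urandom(32)              # this key encrypts the volume data
                        salt, nonce = os.urandom(16), os.urandom(12)
                        kek = PBKDF2HMAC(hashes.SHA512(), 32, salt, 200_000).derive(password)
                        wrapped = AESGCM(kek).encrypt(nonce, master_key, None)
                        # Only salt, nonce and the wrapped key are stored; neither the password
                        # nor the raw master key ever appears in the header.
                        return {"salt": salt, "nonce": nonce, "wrapped_key": wrapped}, master_key

                    def unlock(header, password: bytes):
                        kek = PBKDF2HMAC(hashes.SHA512(), 32, header["salt"], 200_000).derive(password)
                        return AESGCM(kek).decrypt(header["nonce"], header["wrapped_key"], None)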

                • It's perfectly sane if you're the NSA or affiliated with them, not so sane if you are using products they've tampered with.

                  The point with the compile chain/tools is that the compiler can be modified to build in exactly that kind of feature (there's an example from Bell Labs, I think, that did something very similar, since C compilers are compiled by previous versions of themselves).

                  It's far more ubiquitous than it should be. For example, these guys
                  http://www.phoenixintelligence... [phoenixintelligence.com]
                  have a ton of hardware installed at micro

    • by gweihir ( 88907 )

      He is wrong and it was only ever a strong hypothesis on his part. Newer research shows that it is a lot easier to build in a way that excludes compiler backdoors: http://www.dwheeler.com/trusti... [dwheeler.com]

      The idea is fascinating. It basically says that if you have a really crappy and simple compiler that can compile your code and that you can trust, you can propagate that trust to a really good and complicated compiler. Writing a crappy and simple C compiler can be done in a few weeks.

      • He is wrong and it was only ever a strong hypothesis on his part. Newer research shows that it is a lot easier to build in a way that excludes compiler backdoors: http://www.dwheeler.com/trusti... [dwheeler.com]

        The idea is fascinating. It basically says that if you have a really crappy and simple compiler that can compile your code and that you can trust, you can propagate that trust to a really good and complicated compiler. Writing a crappy and simple C compiler can be done in a few weeks.

        Yeah but how are you going to compile your compiler?

        • Hand-compile, then hand-assemble, and finally poke opcodes into RAM with front-panel switches.

          No, I'm not kidding. [stackoverflow.com]

            The person I replied to said it could be done in a few weeks. What you suggest takes months to years.
            And front-panel switches for RAM? WTF are you even talking about? Why would you even need to do that if you've already done everything by hand?

            • "compile by hand" and "assemble by hand" means "write out the results on paper".

              After that, you have to get the machine code into core. That's what the front panel [wikipedia.org] is for.

              Is this somehow new to you? Are you really that young, and that unfamiliar with computing history?

              Of course, if you have a functional operating system you think you can trust, you can poke the machine code into a file using a binary editor (that you think you can trust), and then execute that file as the compiler.

              Read about bootstrapping [wikipedia.org].

          • I have loaded a test program, 47,683 different opcodes, into a pdp11 with no terminal; just to run a test. :shudder:

        • by T.E.D. ( 34228 )

          Presumably the "crappy and simple" compiler is written directly in machine language. That's the only way the GP's thesis works out.

          However, I highly doubt that you can write even a crappy and simple C compiler in machine language "in a few weeks". Unless "few" is measured in the hundreds.

          (FWIW, I don't claim to be the world's foremost expert in this stuff, but I did my master's thesis on compiler construction, so I do at least know a little bit on this topic).

        • by gweihir ( 88907 )

          With itself. Duh.

      • by lgw ( 121541 )

        Sounds interesting, but the abstract for that DDC paper is gibberish.

        In the DDC technique, source code is compiled twice: once with a second (trusted) compiler (using the source code of the compiler's parent), and then the compiler source code is compiled using the result of the first compilation. If the result is bit-for-bit identical with the untrusted executable, then the source code accurately represents the executable.

        Is this saying "write your own compiler, then use it to compile GCC, then use that to compile GCC"? I.e., the normal process for bootstrapping GCC to a new architecture?

        Meh, better be running those compiles on a completely trusted OS (which you built how?), on a completely trusted processor (the masks were checked how?). I guess it's a good idea, since the more diverse you go on platforms, the more likely you'd be to find one that's trus

        • by gweihir ( 88907 )

          It is not gibberish, the thing is just complicated. Look into the thesis to really understand what is going on. As he also has a formal proof that this works, the level of confidence I have in it is very high.

          • by T.E.D. ( 34228 )
            "Gibberish" is a bit much. But the GP is exactly right on all other counts. What is being proposed is a ridiculous amount of work, relies on perfect security thereafter, and only addresses a single vector of many that were mentioned in Thompson's talk. It reads a lot more like a "go back to sleep, all is well" paper than an actual practical solution. If I'm wrong, then surely people are doing what he suggested right now. So who are they?
            • by gweihir ( 88907 )

              This is a PhD thesis. Have a look at it. It is actually feasible. Of course it is a lot of work initially, but just once.

          • by lgw ( 121541 )

            Formal proofs of correctness mean nothing when it comes to real-world code, even before accounting for malicious project members.

            • by gweihir ( 88907 )

              Have a look at this one. It is a bit different. We are not talking about code verification here. This is a proof that the approach works, including a trace from a proof-checker and all steps. Can be verified by anybody competent in maybe a week or so.

  • Truecrypt fork (Score:2, Informative)

    by Anonymous Coward

    The beauty of open source is that good projects never die.

    http://truecrypt.ch/

  • by Anonymous Coward

    Will they digitally sign a copy of the source they reviewed?
    What encryption will be used for the signature? Will anyone trust it?
    ??????

  • Even if they 'approve' the code, who will trust it? I know I won't. The ship has sailed; use that money for something useful.

    • by Anonymous Coward

      The program (at 7.1a) is still completely useful for an individual or business to scramble personal/business records, in case the computer is lost or stolen, or the overnight cleaning lady is snoopy, etc.

    • Re:Pointless (Score:5, Interesting)

      by dave562 ( 969951 ) on Monday June 02, 2014 @06:01PM (#47150137) Journal

      This is what we are seeing in the field. A number of large financial institutions and government organizations who we deal with on a regular basis have already told us that they are no longer going to use TrueCrypt.

      Most of them are moving towards SecureZip from PKware because it supports AES-256 and is FIPS 140 compliant. Others seem to be okay with 7Zip's "encrypted zip" feature (also AES-256). Others are looking at random packages that I have never heard of before last week, like BestCrypt. Of course there are others who want to go with Symantec's PGP.

      This has proven to be a major pain in the ass. For all of its warts, TrueCrypt was the de facto standard for secure data exchange. Now we are seeing a Balkanization of encryption software, and organizations are moving in different directions.

      Personally I think that TrueCrypt is good enough for transferring data on an external USB drive and protecting it against accidental or intentional theft (by anyone other than the NSA). However it is going to be impossible to convince others of that, and I cannot state it with 100% certainty so I am not even trying to have that conversation within the business context.

      As long as Client X is demanding encryption tool Z, that is fine. We will use that tool and let them shoulder the risk. After all, they are telling us what to use, not the other way around.

      • Re:Pointless (Score:4, Insightful)

        by epyT-R ( 613989 ) on Monday June 02, 2014 @08:33PM (#47151169)

        Why would these organizations switch to unknowns? If they trusted TrueCrypt up to this point, why not continue to trust it? These closed-source applications could be backdoored and there's no way of really finding out. If you think source auditing is difficult, try auditing a binary.

        It was never possible to trust TrueCrypt or anything else with 100% certainty.

      • by Anonymous Coward

        BestCrypt is made by Jetico, a Finnish crypto software/hardware company that's been around since the early '90s. Their OTFE is top notch and the Linux version is full featured with a GUI. Both binary and source code packages for Linux can be downloaded for free, though they don't advertise it. In fact, BestCrypt was used in the Bill Clinton White House. Check them out: www.jetico.com

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Why did you trust it in the first place? You trust unaudited code because the author says it's fine, but won't trust audited code that the author abandoned?

      • by nurb432 ( 527695 )

        Who said my organization didn't already audit it? And there is more going on here than a simple project abandonment.

        • by muridae ( 966931 )

          So what if there is? Assuming that your organization did audit 7.1 and found no problems, what makes it a risk now? Sure, you wouldn't want to migrate to 7.2 in a year's time, and any fork from 7.1 would require a new audit; but I would hope that if you put that much effort into it, you would audit 7.2, or any further fork version, internally as well, which would leave you with either a 'this is clean' or 'this is fishy' answer.

          I don't doubt that many large organizations are looking at directions to migrat

        • by Desler ( 1608317 )

          Where's the audit and the methodology, then?

  • Is there a method for individuals to legally canary themselves if they get NSL-ed (which wouldn't surprise me in the least for this audit)?
    • by feufeu ( 1109929 )

      What about moving out of the US of A? I hear there are other countries with running water now...
