Security Encryption

OpenSSL Revalidated Following Suspension 51

lisah writes "Despite what looks like an organized effort to prevent it, OpenSSL has been revalidated by an independent testing agency for its ability to securely manage sensitive data and is ready for use by government agencies like the Department of Defense. According to the Open Source Software Institute, which has been overseeing the validation process for the last five years (a process that typically takes only a few months), the idea of an open source SSL toolkit didn't sit right with proprietary vendors of similar products. A FUD campaign was launched against OpenSSL that resulted in a temporary suspension of its validation. Developers and volunteers refused to give up until the validation was reinstated, and Linux.com has the story of the project's long road to success." Linux.com and Slashdot are both owned by OSTG.
  • Not necessarily... (Score:5, Insightful)

    by lisah ( 987921 ) on Friday February 09, 2007 @12:36PM (#17949562)
    Well, it's not necessarily "meaningless." It would be great to see more governmental agencies choosing open source options but, from what I understand, when it comes to managing sensitive data, only software that is tested and proven to be reliable and secure can be used -- hence OpenSSL's validation process. Sure, it's important to use the tools wisely but, without the FIPS validation, this open source tool can't be used by the government in the first place.
    • by lymond01 ( 314120 ) on Friday February 09, 2007 @12:40PM (#17949632)
I believe he means it's technically meaningless while others are arguing that it's bureaucratically important. The specs for validation aren't so difficult to meet, but if you don't go through the process, no one wants to use your software.

      Like us: if you don't have that MCSE on your resumé, we don't want you.

      Oh, wait. Yes we do...
  • Huh? (Score:5, Insightful)

    by teridon ( 139550 ) on Friday February 09, 2007 @12:57PM (#17949918) Homepage
    Since all of OpenSSL's source code has passed the testing process, now developers can focus on compiling binary libraries and submitting those for validation

    Someone please explain to me why binaries aren't good enough for the first review, then later they are? Who says the new source code is "secure"?

    Why didn't they require source code review for vendor products?

    • by mpapet ( 761907 )
I don't know all the ins and outs of the process, but it's my understanding that, depending on the level of certification and any issues that arise during it, a source code review is necessary to clarify concerns and issues.
    • Re: (Score:3, Insightful)

      Someone please explain to me why binaries aren't good enough for the first review, then later they are? Who says the new source code is "secure"?

      I don't think it's a matter of one being better than the other. Certification of one thing doesn't mean related items are also certified. Just because the source code is now certified doesn't mean that all the libraries that can potentially be built by that source code are now automatically certified as well. (If B derives from A, and A is certified, it doesn't
      • It also means that if an organization has some requirement for a rather uniquely configured version of OpenSSL that they can build it themselves from certified sources and be comfortable with using it

        They may be comfortable using it, but if they were a government agency with a requirement to use validated code it doesn't look like they could use a "uniquely configured version".

        Taken from the CMVP validation list on the CMVP website.

        (When built, installed, protected and initialized as specified in the provid

    • by mattp ( 68791 )
      Some level of source code review is required in all FIPS levels for all validations.
      • by Panaflex ( 13191 ) *
        That's true - but it's mostly to prove that the finite state diagram properly describes the module operation and that the key initialization, known answer tests and zeroization are properly implemented. It's not a complete source code review. Lastly, only the certifying lab (not NIST) checks the code.
    • Re: (Score:3, Insightful)

      by Panaflex ( 13191 ) *
FIPS 140-2 (and, even more so, 140-1) was designed for certifying what is commonly known as "black-box" hardware encryption modules. Some given assumptions in the security model include:
      1. Physical security of the boxes
      2. Prevention of attacks
      3. Disclosure of usage, known-good protocols & keys (a security policy)
      4. Testing of (P)RNGs
      5. Known-answer tests of FIPS-approved algorithms (AES, DES, etc...)

      The movement towards software-only modules has brought a whole series of issues to head - meaning that some whol
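A known-answer test (item 5 in the list above) simply runs an approved algorithm against a published test vector and compares the output; a mismatch means the module must refuse to enter its operational state. A minimal sketch using the openssl command line (the SHA-1 digest of the string "abc" is a standard published example vector):

```shell
#!/bin/sh
# Minimal known-answer test sketch: hash the fixed input "abc" and
# compare the result against the published SHA-1 test vector.
expected="a9993e364706816aba3e25717850c26c9cd0d89d"
actual=$(printf 'abc' | openssl dgst -sha1 | awk '{print $NF}')

if [ "$actual" = "$expected" ]; then
    echo "SHA-1 known-answer test: PASS"
else
    echo "SHA-1 known-answer test: FAIL" >&2
    exit 1   # a real module would zeroize and refuse to operate here
fi
```

A real FIPS module runs such self-tests for every approved algorithm at power-up, before any cryptographic service is offered.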
  • m$^8 (Score:1, Interesting)

    by Anonymous Coward
    Was this also sponsored by microsoft or was it some other biggie this time?

    Oh wait, there are no other hostile biggies.
  • by Cerebus ( 10185 ) on Friday February 09, 2007 @01:15PM (#17950204) Homepage
    1. FIPS 140 validations taking a long time is not unusual.

2. OpenSSL was validated as *source*. All other FIPS 140 validations are of *object code* or devices. This is the first cryptomodule to be validated in source form, which contributed to the time taken to validate.

3. The original OpenSSL cert was suspended because there was a small bit of crypto code that resided outside the security boundary. Confusion between sponsor, lab, and NIST contributed to the suspension. See #2.

    4. Claims of vendor FUD are overblown. NSS, another Open Source cryptomodule, already has FIPS 140-1 certification (for version 3.6; 3.11 will be entering FIPS 140-2 eval soon).
    • by Nevyn ( 5505 ) *

2. OpenSSL was validated as *source*. All other FIPS 140 validations are of *object code* or devices. This is the first cryptomodule to be validated in source form, which contributed to the time taken to validate.

This is very misleading. The OpenSSL code was submitted as source, but the lab still evaluated it as a binary blob (after compiling/installing it using the instructions provided). The lab did not evaluate the source any more than they did for NSS or the MS crypto libs, etc.

  • by rs232 ( 849320 ) on Friday February 09, 2007 @01:16PM (#17950236)
    "We called it the FUD campaign," he says. "There were all kinds of complaints sent to the CMVP including one about 'Commie code.'"

'While OSSI was not able to review each complaint the CMVP received, the ones they did see often contained redacted, or blacked-out, data about who had filed the complaint. Some documents, however, did reveal the complainant information, and Weathersby says that is how the OSSI became aware that, in some cases, proprietary software vendors were lodging the complaints'
  • This is an excellent example of how large and deep the cesspool is in government contracting.

The competitors intentionally draw out the certification process for the newcomer to exhaust them and drive the competition away. This is just one relatively small (albeit critical) library/suite of applications.

For any of you entrepreneurial developers thinking you're onto the next great thing that gov'ts will buy, please consider this story carefully. A long career at the top of an agency you wis
    • by mattp ( 68791 )
I can assure you that the certification process was not created by the companies that are having their products validated. I think that validating OpenSSL presented a unique problem, as the writers of the standard did not foresee a source-only module.
      • My point was the deep pockets competitors abuse the process to get their desired result: No new competition.
  • If OpenSSL is validated in both binary and source form while proprietary implementations of SSL are only validated as binaries, one could reasonably conclude that proprietary versions are not really fully validated.

    Certainly once validity of visible source code is established it should be possible to relatively easily continue to demonstrate validity of that code. Meanwhile in the case of proprietary versions it is possible to make source changes that change the behavior of binary product in ways difficult
Disclaimer: I work for Red Hat, so I am very familiar with competitors' efforts to spread FUD about open source software, but I don't believe any nefarious activities were at work here. The NSS Project is an open source SSL toolkit that received FIPS 140 certification in 2002. Five years ago! The open source Crypto++ project was certified in 2003.

    So if (as the sensationalist headline proclaims) "the idea of an open source SSL toolkit didn't sit right with proprietary vendors of similar products." They've had 5 y

    • by Anonymous Coward
So a complaint about "commie code" being present in the source isn't FUD in your opinion? You're a lot more tolerant of that crap than I would be.
Operating systems vary in how they implement their message digest command (if they provide one at all). If your system is missing a digest command, you can use the openssl utility to generate one-time hashes. OpenSSL supports the SHA1, MD5 and RIPEMD160 algorithms, and accepts one or more files as arguments.
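A quick sketch of that usage (the file path here is just an example; on some OpenSSL 3.x builds RIPEMD-160 requires the legacy provider):

```shell
# Create a throwaway file and hash it; openssl dgst takes an
# algorithm flag followed by one or more filenames.
printf 'hello' > /tmp/demo.txt

openssl dgst -sha1 /tmp/demo.txt   # SHA-1 digest of the file
openssl dgst -md5  /tmp/demo.txt   # MD5 digest of the file
# openssl dgst -ripemd160 /tmp/demo.txt  # may need the legacy provider
```

Piping works too: `printf 'hello' | openssl dgst -sha1` prints the same digest without a temporary file.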
