Slashdot is powered by your submissions, so send in your scoop



NIST Opens Competition for a New Hash Algorithm

Invisible Pink Unicorn writes "The National Institute of Standards and Technology has opened a public competition for the development of a new cryptographic hash algorithm, which will be called Secure Hash Algorithm-3 (SHA-3), and will augment the current algorithms specified in the Federal Information Processing Standard (FIPS) 180-2. This is in response to serious attacks reported in recent years against cryptographic hash algorithms, including SHA-1, and because SHA-1 and the SHA-2 family share a similar design. Submissions are being accepted through October 2008, and the competition timeline indicates that a winner will be announced in 2012."
  • by Briden ( 1003105 ) on Friday November 09, 2007 @03:03PM (#21298871)
I prefer the bubble bag method for making hash
  • Why does the government promote creating new encryption methods when encrypting data so clearly means you have something to hide and are therefore guilty? I mean COME ON!
    • by rock217 ( 802738 ) on Friday November 09, 2007 @03:16PM (#21299135) Homepage Journal
      Encryption implies that you can reconstruct the original string from the encoded form. Methods like MD5, SHA-1, etc. are one-way algorithms that cannot be reversed* in a realistic amount of time.

      * - Rainbow tables
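A minimal Python sketch of the distinction drawn above, using the standard library's hashlib (the password string here is made up for illustration):

```python
import hashlib
import os

# A digest is one-way: there is no decode step to recover "hunter2" from it.
digest = hashlib.sha256(b"hunter2").hexdigest()

# Rainbow tables work by precomputing digests of likely inputs. A random
# per-password salt makes each precomputed table useless, since the attacker
# would need a separate table for every possible salt value.
salt = os.urandom(16)
salted_digest = hashlib.sha256(salt + b"hunter2").hexdigest()

# The salt changes the digest entirely (barring an astronomically
# unlikely collision).
assert digest != salted_digest
```

This is why a leaked table of unsalted password hashes is dangerous while a salted one is far less so: the table lookup only amortizes across passwords that share a salt.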
      • Re: (Score:3, Informative)

        When hashing a data set larger than the resulting digest, it cannot be reversed at all. However, you can find collisions, which is handy if you want to subvert the PKI hierarchy that protects web transactions.

      • by Kjella ( 173770 )
        It can only be reversed for the weird case where you're trying to make a few bits (30-40 bits of a password) into many (160/256/384/512 bits), which is of course impossible and the quality of the hash method doesn't matter. The only thing rainbow tables do is try to make a universal lookup table, and the only thing that helps is (cryptographic) salt. Most of the time hashes are used to make checksums of e.g. your Linux distro, which obviously can't be reversed.
      • Re: (Score:3, Informative)

        by Comatose51 ( 687974 )
        Rainbow tables won't help you get the old message back since, pretty much by the pigeonhole principle, there is more than one plaintext that can generate the same hash. Breaking a hash algorithm usually involves finding a plaintext that generates a specific hash, thus fooling the victim into thinking that plaintext was the original one.

        Or imagine this: you have a simple hash function that takes all the letters in a message, turns them into numbers based on their place in the alphabet, and adds them
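The truncated toy example above can be completed as a sketch (the function name `toy_hash` is made up):

```python
def toy_hash(message: str) -> int:
    """Toy hash: map each letter to its alphabet position (a=1 .. z=26) and sum."""
    return sum(ord(c) - ord("a") + 1 for c in message.lower() if c.isalpha())

# Collisions are trivial: any anagram produces the same sum...
assert toy_hash("listen") == toy_hash("silent")

# ...and so do unrelated strings, since many plaintexts map to one hash value.
assert toy_hash("ab") == toy_hash("c")  # 1 + 2 == 3
```

Real hash functions are vastly harder to collide, but the pigeonhole argument is the same: a fixed-size digest cannot uniquely represent arbitrarily long inputs, so "reversing" a hash can only ever mean finding *some* preimage, not *the* original.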
      • Methods like md5, sha1, etc are one way algorithms that cannot be reversed* in a realistic amount of time.

        Nitpick: they can't be reversed in any amount of time, because for any given hash, there are probably an infinite number of strings which hash to that hash.

  • Once I develop the winning uber hash function, what do I get? I can't find in the timeline where it mentions a large cash prize with strippers jumping out of a cake. Some balloons, too.

    Where is the link in the story to this part? Anyone?
  • Jung qb V jva? (ROT13 for "What do I win?")
  • by hellergood ( 968199 ) on Friday November 09, 2007 @03:20PM (#21299187)
  • I put a SHA-1 based KDF in 802.16 because NIST SP800-56 told me to.


  • With hash values getting longer and longer, wouldn't it be more economical to just use the identity function as the hash?

    Here's your grain of salt...

    • Re: (Score:2, Insightful)

      by marcello_dl ( 667940 )
      • Hashes are used directly in essentially all forms of signatures and integrity verification: you hash the data being represented and then sign or protect the hash value. HMACs are (or should be) used with strong keys for protecting the integrity of communications. As such, hashes should be fast and resistant to determined attack with massive computational resources. Given the birthday effect, collisions are expected once the message pool reaches roughly the square root of the number of possible hash values.

        The attacks against SHA-1 have reduced the
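The birthday bound mentioned above is easy to demonstrate on a deliberately weakened hash; here SHA-256 is truncated to 16 bits so a collision is cheap to find (`tiny_hash` is a made-up name for this sketch):

```python
import hashlib

def tiny_hash(data: bytes) -> bytes:
    # Truncate SHA-256 to 16 bits so a collision is observable in a loop.
    return hashlib.sha256(data).digest()[:2]

seen = {}
for attempts in range(70000):
    h = tiny_hash(attempts.to_bytes(8, "big"))
    if h in seen:
        break  # found two inputs with the same truncated digest
    seen[h] = attempts

# A 16-bit hash has 65536 possible values, yet the first collision
# typically appears after only a few hundred inputs -- roughly
# sqrt(65536) = 256, as the birthday bound predicts.
```

The same square-root rule is why a 160-bit hash like SHA-1 offers at most ~80 bits of collision resistance even before any cryptanalytic shortcut is found.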

    • by Surt ( 22457 )
      A 1024-byte hash is not large compared to the gigabyte file it signs.
  • Part of the reason for the long delay is to allow the CIA and NSA to evaluate all contenders for how crackable/backdoorable they would be.
    • The NSA has approved AES for encrypting secret data (128+ bits) and top secret data (192+ bits). Unless they are playing a very deep denial and deception game, it stands to reason that they can't find a way through it either.
      • by JustNiz ( 692889 )
        Of course the NSA are going to claim they can't crack any encryption they've already cracked. That way, nothing new comes along to blow away all their hard work. They're also hoping such statements will lead criminals to choose an encryption scheme the NSA has already broken.
        • That would require an unbelievable degree of denial and deception.

          Top Secret is defined as information whose release would cause exceptionally grave damage to the US. They would not trust such secrets to a cipher if, at the time they made that decision, they had discovered a weakness that would let them break it. Thus, if they are behaving rationally, the only way this could happen is if they were lying. For them to do that successfully would require having something classified at a level no higher than Con
  • So, what requirements should a submission fulfill? I can't find them!

  • All I got to say is...

    SHA right!
    • by Unnngh! ( 731758 )
      SHA3 = SHA1(data) xor SHA2(data)

      My patent trumps your patent!
    • You must be trying to be funny, because that would be significantly weaker than SHA-2. First of all, I hope you mean adding, because concatenation directly publishes the weaker hash. If you are adding, I would do so modulo the larger hash, because otherwise you might get 257 bits of SHA-2. But mainly, if you look at cryptanalysis techniques, combining in another algorithm normally weakens the original algorithm. It is much better to just add more calculations or more rounds. Even then I would ra
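One concrete problem with the joke combiner is visible even before any cryptanalysis: the digests aren't the same length. A sketch using hashlib (`xor_combine` is a made-up name):

```python
import hashlib

def xor_combine(data: bytes) -> bytes:
    """The joke combiner above: XOR SHA-1 (20 bytes) into SHA-256 (32 bytes).

    Past byte 20 there is nothing to XOR against, so the tail of the
    SHA-256 digest is published verbatim -- and in general such ad hoc
    combiners are no stronger than their components.
    """
    sha1 = hashlib.sha1(data).digest()    # 20 bytes
    sha2 = hashlib.sha256(data).digest()  # 32 bytes
    mixed = bytes(a ^ b for a, b in zip(sha1, sha2))
    return mixed + sha2[len(sha1):]

out = xor_combine(b"example")
# The last 12 bytes are exactly the last 12 bytes of the plain SHA-256 digest.
assert out[20:] == hashlib.sha256(b"example").digest()[20:]
```

This is the point the reply makes about concatenation: any part of the output that one component doesn't cover is just that component's digest exposed directly.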
  • The linked NIST report mentions only the work of Prof. Yang with which no one has yet found a collision, but a team from Graz University of Technology (Austria) has proposed a significantly faster algorithm for producing SHA-1 collisions and is running a BOINC project to find one [].
  • There are already good hash functions out there that don't share the basic design of SHA. I've been using whirlpool for applications where security is important. (Good old md5 is fine for applications that don't involve security.) The problem is getting these newer hash functions widely implemented. For instance, here [] is my request to get the perl Digest::Whirlpool module packaged for debian/ubuntu. Until better hashes are conveniently packaged, authors of applications actually have a disincentive to move t
  • Just get 1000 monkeys and 1000 numeric keypads. Where's my prize?
  • I have an anecdote to share.

    Recently I was asked to provide some info about the quality of a PRNG used in one of our programs.
    One of the questions was how well it does on the NIST Statistical Test Suite [].

    So, I head over to the NIST site and download [] the latest version for Windows, dated March 22, 2005.

    First thing that I notice is that it does not compile under Visual Studio 2005.
    OK, I understand, they only had about two and a half years to fix this which is obviously not enough for an organization
