
A Competition To Replace SHA-1

SHA who? writes "In light of recent attacks on SHA-1, NIST is preparing for a competition to augment and revise the current Secure Hash Standard. The public competition will be run much like the development process for the Advanced Encryption Standard, and is expected to take 3 years. As a first step, NIST is publishing draft minimum acceptability requirements, submission requirements, and evaluation criteria for candidate algorithms, and requests public comment by April 27, 2007. NIST has ordered Federal agencies to stop using SHA-1 and instead to use the SHA-2 family of hash functions."
  • Good News (Score:4, Interesting)

    by Ckwop ( 707653 ) * on Wednesday January 24, 2007 @09:32AM (#17736662) Homepage

    The amount of research done into hash functions is nothing like the amount that goes into ciphers. I'm not really sure why this is the case, because hashes are much more important than ciphers: hashes are used in MACs to protect the integrity and authenticity of a message.

    Ask yourself this: is it more important that somebody can read your SSH connection, or that somebody can hijack the channel? The reasons for wanting a good hash function suddenly become very clear.

    It's true that hashes are becoming less important as a result of AEAD modes. But they have uses far beyond MACs, and it's good to see a competition from NIST to stoke research into those primitives.
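
    To make the MAC point concrete, here is a minimal sketch using Python's standard hmac and hashlib modules (the key and message are made-up placeholders):

        import hashlib
        import hmac

        key = b"shared secret key"            # placeholder; real protocols derive this during a handshake
        message = b"example protocol record"  # placeholder payload

        # The sender tags the message with a keyed hash...
        tag = hmac.new(key, message, hashlib.sha256).digest()

        # ...and the receiver recomputes the tag and compares in constant time.
        # A break in the underlying hash can translate into forgeable tags.
        assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())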

    Simon.

  • by Srin Tuar ( 147269 ) <zeroday26@yahoo.com> on Wednesday January 24, 2007 @09:36AM (#17736702)

    Does anyone know whether common protocols and formats such as TLS, SSH, X.509 certs, etc. are being updated to use newer hash functions?

    It's easy to change parts of a self-contained system, such as password hashes, but common protocols require interoperability and standards compliance.

    This is actually a fairly interesting situation, where NIST certification and platform interoperability may actually be at odds with each other.
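
    For X.509 at least, it's easy to check what a given certificate uses today. A sketch with the third-party pyca/cryptography package (an assumption, not the standard library; "cert.pem" is a placeholder path):

        from cryptography import x509

        with open("cert.pem", "rb") as f:
            cert = x509.load_pem_x509_certificate(f.read())

        # Prints e.g. "sha1" for a legacy certificate, "sha256" for a modern one.
        print(cert.signature_hash_algorithm.name)
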
  • by bhima ( 46039 ) <(Bhima.Pandava) (at) (gmail.com)> on Wednesday January 24, 2007 @09:40AM (#17736738) Journal
    The general consensus among the experts in cryptology is that a competition is far more effective than other methods of designing algorithms. Presumably the 3 years is a function of how long the world can wait as compared to how long the experts need to crack it. The thing that makes me wonder is why they waited so long to begin it.

    Characterizing this process as a "serial security-by-obscurity strategy" is completely wrong, because by the very nature of the competition the algorithm is known from the start.

  • How about SHA-512? (Score:4, Interesting)

    by ngunton ( 460215 ) on Wednesday January 24, 2007 @09:43AM (#17736770) Homepage
    Anybody know if SHA-512 is mathematically vulnerable to the same kind of attack as SHA-1 (only presumably requiring more computing power)? Or is it really a different kind of beast?
  • How frustrating! (Score:3, Interesting)

    by Paul Crowley ( 837 ) on Wednesday January 24, 2007 @09:45AM (#17736798) Homepage Journal
    The unfortunate thing is that in order to win this competition, the hash function submitted will have to be pretty conservative - very much like the SHA-512 family. There isn't time to analyze anything really new and have enough confidence in its security to bless it as the new standard forevermore. But if these attacks (and especially the recent attacks on the whole Merkle-Damgård structure) show us anything, it is that the old way turns out not to be the best way to build hash functions, and that more innovative ideas (e.g. Daemen et al.'s "belt and mill" proposal RadioGatún) should be the way forward.

    What we need is for NIST to pull the rug out from under everyone near the end and say "thanks for putting huge amounts of energy and hard work into designing and attacking all these hash functions; now you can all make new proposals based on what we've all learned, and we'll start over again!"
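
    For readers who haven't met it, here is a toy sketch of the Merkle-Damgård iteration under discussion, in Python (the compression function is a stand-in built from SHA-256, and the padding is simplified):

        import hashlib

        BLOCK = 64  # block size in bytes, as in SHA-1/SHA-256

        def compress(state: bytes, block: bytes) -> bytes:
            # Stand-in compression function, for illustration only.
            return hashlib.sha256(state + block).digest()

        def merkle_damgard(msg: bytes, iv: bytes = b"\x00" * 32) -> bytes:
            # Simplified strengthening pad: 0x80, zeros, then the 8-byte message length.
            length = len(msg).to_bytes(8, "big")
            msg += b"\x80"
            msg += b"\x00" * (-(len(msg) + 8) % BLOCK) + length
            state = iv
            for i in range(0, len(msg), BLOCK):
                state = compress(state, msg[i:i + BLOCK])
            return state  # the final chaining value is the digest

        print(merkle_damgard(b"abc").hex())

    The generic attacks on this structure exploit exactly this one-block-at-a-time chaining, which is what designs like RadioGatún move away from.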
  • One Word.... (Score:5, Interesting)

    by tomstdenis ( 446163 ) <tomstdenis@gma[ ]com ['il.' in gap]> on Wednesday January 24, 2007 @09:46AM (#17736814) Homepage
    WHIRLPOOL.

    It's a balanced design, an SPN to boot.

    The big problem with the SHAs [and their ilk] is that they're all UFNs [unbalanced Feistel networks]; in particular, they're source-heavy, which means the branch/diffusion is minimal (e.g. it's possible to make inputs collide and cancel out differences).

    SPNs [substitution-permutation networks] like WHIRLPOOL are balanced in their branch/diffusion.

    Best of all, WHIRLPOOL is already out there. Just sign the paper!
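
    (If you want to try it: Python's hashlib exposes WHIRLPOOL only when the underlying OpenSSL build provides it, so the sketch below checks availability rather than assuming it.)

        import hashlib

        try:
            h = hashlib.new("whirlpool")  # OpenSSL-dependent; may not exist
            h.update(b"The quick brown fox jumps over the lazy dog")
            print(h.hexdigest())  # 512-bit digest, 128 hex characters
        except ValueError:
            print("this OpenSSL build does not provide WHIRLPOOL")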

    Tom
  • by Anonymous Coward on Wednesday January 24, 2007 @10:13AM (#17737058)
    Yeah, and it probably takes a year or two for a project like this to be fully thought out, organized, blessed by the appropriate governmental approval agencies, &c, &c, &c. It's entirely possible that this contest has its origins in Schneier's suggestion.
  • by Fahrenheit 450 ( 765492 ) on Wednesday January 24, 2007 @11:35AM (#17737992)
    That's great. Except for one thing...
    Hashes are used all over the place in cryptography. That digital signature you generated? You didn't sign the message, you signed a hash of the message. That key you just exchanged? There was likely a hash involved in that process. Hashes are one of the basic building blocks of cryptographic protocols and systems, and while the recent weaknesses aren't too much to worry about yet as they aren't really practical or directly applicable, their presence is troubling.

    And far more interesting (to me at least) are the attacks like Joux's multicollisions and Kelsey and Kohno's Hash Herding/Nostradamus attacks.
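
    As a sketch of the sign-the-hash point, using the third-party pyca/cryptography package (an assumption, not the standard library; the message is a placeholder):

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import padding, rsa

        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        message = b"pay the bearer $100"  # placeholder

        # sign() hashes the message with SHA-256 before applying RSA, so a
        # collision in the hash lets two different messages share one signature.
        signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())
        private_key.public_key().verify(
            signature, message, padding.PKCS1v15(), hashes.SHA256()
        )  # raises InvalidSignature if the check fails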
  • I was wondering the same thing, and apparently so were a few other people. There's another discussion of it further up in the thread, and the quote that seems to be the final answer isn't too hot on the idea. Here it is [slashdot.org] (quoting here from another source):
    "...attempts to increase the security of hash functions by concatenating the outputs of two independent functions don't actually increase their theoretical security. For example, defining H(x) = SHA1(x) || RIPEMD160(x) still gives you only about 160 bits of strength, not 320 as you might have hoped. The reason is because you can find a 2^80 multicollision in SHA1 using only 80*2^80 work at most, by the previous paragraph. And among all of these 2^80 values you have a good chance that two of them will collide in RIPEMD160. So that is the total work to find a collision in the construction."
    What this means to me is both 'yes' and 'no.' Yes, using multiple hash algorithms protects against the failure of one algorithm; it avoids putting all your eggs in one basket. However, using multiple algorithms doesn't, in itself, offer any greater security than just using a single algorithm with a longer hash, assuming the algorithm is good. (By 'good,' I mean that it doesn't offer any ways of finding collisions that are significantly faster than brute force.)

    Mathematically, using multiple algorithms may not offer much of an advantage, but practically - where you may by necessity have to work with algorithms that have flaws (because you have to pick from algorithms that are well-agreed-upon standards), or that may be discovered to have flaws in the future - it seems like a good way to hedge one's bets. Aside from the added complexity, there don't seem to be any compelling reasons not to do it, if time and computational power allow.
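
    For concreteness, the construction under discussion, H(x) = SHA1(x) || RIPEMD160(x), in Python (hashlib exposes "ripemd160" only when the local OpenSSL build provides it, so that part is an assumption):

        import hashlib

        def concat_hash(data: bytes) -> str:
            sha1 = hashlib.sha1(data).hexdigest()
            ripemd160 = hashlib.new("ripemd160", data).hexdigest()  # OpenSSL-dependent
            # 320 bits of output, but per the Joux argument above only
            # about 160 bits of collision resistance.
            return sha1 + ripemd160

        print(concat_hash(b"hello"))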
  • by Anonymous Coward on Wednesday January 24, 2007 @03:48PM (#17741996)
    It's not a troll, morons, it's the truth. Rijndael was THE LEAST SECURE algorithm that made the finals. It uses the same fundamental concepts as Serpent, but takes shortcuts for speed. Serpent was the most conservative cipher in the contest, and Twofish the most flexible. Either of them would have made sense. Rijndael was just fast on 686-class hardware (and not even much faster than Twofish), and should not have been chosen to be AES. If you don't know shit about crypto, then don't mod posts about it.
  • by iamcf13 ( 736250 ) on Wednesday January 24, 2007 @06:09PM (#17744130) Homepage Journal
    ...is provably collision-resistant.

    http://senderek.de/SDLH/ [senderek.de]

    'Proof' by Ron 'RSA' Rivest...

    http://diswww.mit.edu/bloom-picayune/crypto/13190 [mit.edu]

    SDLH is simple, and secure to any desired number of bits of security, once set up properly.

    Factoring the modulus in SDLH is the only way to break it.

    For that you need a state-of-the-art integer factoring algorithm (currently the General Number Field Sieve [wikipedia.org] or Shor's Algorithm [wikipedia.org]).
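
    (A toy sketch of the SDLH construction described at those links: interpret the message as a big integer e and output g^e mod n, where n = pq must be hard to factor. The primes below are tiny placeholders, so this instance is trivially breakable.)

        # Toy parameters -- a real deployment needs a modulus nobody can factor.
        p, q = 1000003, 1000033
        n = p * q
        g = 2

        def sdlh(message: bytes) -> int:
            e = int.from_bytes(message, "big")
            return pow(g, e, n)

        # Any collision sdlh(a) == sdlh(b) reveals a multiple of the order of g
        # modulo n, which is enough to factor n -- hence "collision implies factoring".
        print(sdlh(b"hello"))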

    Case closed.
