Encryption Privacy IT

Schneier: We Don't Need SHA-3

Trailrunner7 writes with this excerpt from Threatpost: "For the last five years, NIST, the government body charged with developing new standards for computer security, among other things, has been searching for a new hash function to replace the aging SHA-2 function. Five years is a long time, but this is the federal government and things move at their own pace in Washington. NIST soon will be announcing the winner from the five finalists that were chosen last year. Despite the problems that have cropped up with some versions of SHA-2 in the past and the long wait for the new function, there doesn't seem to be much in the way of breathless anticipation for this announcement. So much so, in fact, that Bruce Schneier, a co-author of one of the finalists, not only isn't hoping that his entry wins, he's hoping that none of them wins. ... It's not because Schneier doesn't think the finalists are worthy of winning. In fact, he says, they're all good and fast and perfectly capable. The problem is, he doesn't think that the world needs a new hash function standard at all. SHA-512, the stronger version of the SHA-2 function that's been in use for more than a decade, is still holding up fine, Schneier said, which was not what cryptographers anticipated would be the case when the SHA-3 competition was conceived. 'I expect SHA-2 to be still acceptable for the foreseeable future. That's the problem. It's not like AES. Everyone knew that DES was dead — and triple-DES was too slow and clunky — and we needed something new. So when AES appeared, people switched as soon as they could. This will be different,' Schneier said via email."
  • Re:Future proofing (Score:5, Interesting)

    by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Tuesday September 25, 2012 @06:44AM (#41447701) Homepage Journal

    To be fair, the NSA don't seem to have caused problems with the S-Boxes, and differential cryptanalysis doesn't seem to have worked too well against them. On the other hand, COPACOBANA et al were glorified 1940s-era Colossus machines - cracking codes via a massively parallel architecture. To me, that's the scary part. Turing's work on cryptography and massively parallel code breakers was 100% applicable to the design of DES because the keylength was so incredibly short. You could build enough machines to effectively break it.

    How many DES engines do you think someone could have crammed onto a wafer in the 1980s? (Remember, each die can have multiple engines, and then the dies that work can be hooked together.) Link up a bunch of such wafers and you end up with a crypto engine from hell. It would have been VERY expensive, but I would imagine it perfectly plausible that a sufficiently determined and rich organization (the NSA might have been one such) could have built such a machine when the rest of us still thought the 6502 was a really neat idea.
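    As a back-of-the-envelope illustration of the scale involved (the engine count and per-engine rate below are purely assumed figures, not numbers from any real design), a rough Python sketch of the expected exhaustive-search time looks like this:

        # Rough estimate of exhaustive DES key search time.
        # Both numbers below are illustrative assumptions.
        DES_KEYSPACE = 2 ** 56                      # 56-bit effective key length

        engines = 10_000                            # assumed: total DES engines across all wafers
        keys_per_second_per_engine = 1_000_000      # assumed: one million key tests per second each

        total_rate = engines * keys_per_second_per_engine
        expected_seconds = (DES_KEYSPACE / 2) / total_rate   # on average, half the keyspace is searched
        print(f"expected search time: {expected_seconds / 86400:.0f} days")
        # With these assumptions: roughly 42 days, and the time falls linearly as
        # more engines are added. The EFF's much later Deep Crack machine
        # demonstrated the idea in practice, searching the keyspace in days.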

    Doesn't mean anyone ever did. People could have reached Mars in the 1980s, so "could have" and "did" are obviously very different things. What people actually did is anyone's guess, though "nothing" sounds about right.

    Had they built such a device, though, then near-real-time breaking of DES would have been possible at the time it was in mainstream use. Certainly, there were claims circulating that such devices existed, but a claim like that without proof is hard to accept. All I can say is that it's demonstrably not impossible, merely unlikely.

    Back to SHA-2. Are we in the same boat? Are there ways to build something today, even if nobody is likely to have actually built it yet, that could endanger SHA-2? (To me, THAT is the measure of security, not whether anyone actually has, because they're not likely to tell you when they have.) Quantum computing is the obvious threat, since 512 bits is a lot of security, too much to attack in parallel with a classical architecture. Quantum computing, though, should let you scale up non-linearly. The question is whether it's enough. (I'm assuming here that there are no issues with preimages or timing that can be exploited to reduce the problem to a scale QC can solve even if classical machines can't.)
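    The non-linear scaling from quantum computing is usually quantified via Grover's algorithm, which turns a brute-force preimage search over 2^n possibilities into roughly 2^(n/2) evaluations. A quick sketch (assuming Grover is the relevant quantum attack) shows how much margin a 512-bit output still leaves:

        # Classical vs. Grover-assisted preimage search for a 512-bit hash output.
        output_bits = 512
        print(f"classical preimage search: 2^{output_bits} evaluations")
        print(f"Grover-assisted search:    2^{output_bits // 2} evaluations")
        # Even with the quadratic speedup, 2^256 evaluations is far beyond any
        # physically plausible machine, so the practical worry is structural
        # weaknesses that shave bits off the effective security, not raw search.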

    There have been a few murmurs that suggest SHA's security isn't as strong as the bitlength implies. Would that be enough? If Japan can build a vector machine the size of a US football stadium, then it is not physically impossible to scale a machine to those sizes. Nobody has scaled a quantum computer beyond a few bits, but I repeat: I don't care what people have publicly done, it is what people have the capacity TO build, publicly or not, that matters.

    If you're not 100% certain that not even a quantum computer on such a scale, one where every node is designed at the hardware level to perform JUST the task of breaking the hash, could succeed, then the hash is not safe for 20+ years. It may be unlikely, but there's nothing to say it might not be vulnerable right now. There's nothing physically impossible about it (as shown), it's merely a hard problem. And hard problems get solved. What you need in a crypto hash is something you can be sure WILL be impossible to break in a 20-year window, which means what you need is a crypto hash that is beyond anything whose components can be prototyped today. For a 30-year window, it needs to be beyond detailed theory. A 50-year window can be achieved if it's beyond any machine ANY existing theory can describe.

    (It takes time to go from theory to prototype to working system to working system on the right scale. The intervals seem to be fairly deterministic in each subject. I believe this to indicate a mathematical model that underpins things like Moore's Law and which is independent of field. Know that model and you know when Moore's Law will fail. Moore's Law is merely the equivalent of Hooke's Law for computing; failure is inevitable.)

  • Re:Future proofing (Score:5, Interesting)

    by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Tuesday September 25, 2012 @06:53AM (#41447729) Homepage Journal

    Very true. Which is why I'm anxious that SHA-3 have as little as possible (ideally nothing) in common with SHA-2, be it algorithmically or in terms of the underlying mathematical problems that are assumed to be hard.

    I would have preferred Blue Midnight Wish to still be in the running (well, it has a cool name, but more importantly it has a very different design).

    I ALSO wish Bruce and the others would pay attention to those of us on the SHA-3 mailing list advocating a SHA-3a and SHA-3b, where -3a has the best compromise between speed and security, and -3b makes absolutely b. all compromise and is as secure as you can get. Why? Because that meets Bruce's objections. -3a may well be broken before SHA-2 is so threatened that it is unusable, because of all the compromises NIST want to include. -3b, because it refuses to bow to such compromises, should remain secure for much longer. You can afford to stick it in the freezer and let it sit there for a decade, because it should still be fresh BECAUSE no compromises were made. By then, computers would be able to run it as fast as, or faster than, -3a can be run now.

    So I have ZERO sympathy with Schneier. He is complaining about a problem that he is, in part, responsible for making. Other views WERE expressed, he thought he knew better, but his path now leads to a solution he believes is useless. So, to NIST, Bruce, et al, I say "next time, leave your bloody arrogance at home, there's no room for it, doubly so when you've got mine to contend with as well".

  • Re:Too slow? (Score:5, Interesting)

    by kasperd ( 592156 ) on Tuesday September 25, 2012 @07:02AM (#41447769) Homepage Journal

    As I understood, it has to be slow

    This is a common misconception. The source of this misconception is the way people have tried to protect weak passwords through the use of hashing. If you take a strong password and hash it with a hash function satisfying all the requirements for a cryptographic hash function, then that password is well protected. That construction doesn't work for weak passwords. If you apply a salt while hashing, you move the threshold for the strength of passwords which can be brute forced. This is quite clearly an improvement over plain hashing. I know of no cryptographer who has disputed that it is a good idea to use salts when hashing passwords.
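    As a minimal sketch of the construction being discussed (using Python's standard hashlib; the parameters are illustrative, and a real system should use a dedicated password-hashing function):

        import hashlib
        import hmac
        import os

        def salted_hash(password: bytes) -> tuple[bytes, bytes]:
            """Store a password as (salt, SHA-512(salt || password)). Illustrative only."""
            salt = os.urandom(16)                              # unique random salt per password
            digest = hashlib.sha512(salt + password).digest()
            return salt, digest

        def verify(password: bytes, salt: bytes, expected: bytes) -> bool:
            digest = hashlib.sha512(salt + password).digest()
            return hmac.compare_digest(digest, expected)       # constant-time comparison

        salt, stored = salted_hash(b"correct horse battery staple")
        assert verify(b"correct horse battery staple", salt, stored)

    The salt blocks precomputed tables and forces an attacker to brute-force each password separately, which is exactly the threshold shift described above.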

    But there are still some passwords which are too weak to be protected by a salted hash. This has led to some people saying that the hash function is insecure because it is too fast. What they should have been saying is that the password mechanism is insecure because it is using the wrong kind of primitive. This is an important distinction. Even if the hash function satisfies all the requirements of a cryptographic hash, a salted hash cannot protect a weak password.

    When building cryptographic systems you often need to apply different classes of primitives. Common primitives are hash functions, block ciphers, asymmetric encryption, and digital signatures. Examples of primitives in each of these four classes are SHA512, AES128, RSA, and RSA (yes, RSA falls into two of these classes; there are other primitives which fall into only one of the two). If you want to send an encrypted and signed email, you typically use all four classes of primitives.

    To protect semi-weak passwords better than a salted hash can, you really need a new class of primitive. For lack of a better term I'll call that primitive a slow function. Claiming that a hash function is insecure because it is fast would be like claiming AES128 is secure because you can derive the decryption key from the encryption key.

    The formal properties I would say a slow function should have are first of all that it is a (mathematical) function mapping bitstrings to bitstrings, and that it requires a certain amount of computing resources to compute the full output from any single input. Properties that I would not require a slow function to have include collision resistance and fixed-size outputs. Those are properties you expect from a hash function, which is a different kind of primitive.
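    A minimal sketch of a slow function in this sense might be iterated hashing, where the iteration count is the knob that sets the cost. This only illustrates the interface (bitstrings in, bitstrings out, tunable cost); it is not a vetted design like PBKDF2, scrypt, or bcrypt:

        import hashlib

        def slow_function(data: bytes, iterations: int = 200_000) -> bytes:
            """Illustrative slow function: iterated hashing, so that one evaluation
            costs roughly `iterations` hash invocations."""
            out = data
            for _ in range(iterations):
                out = hashlib.sha512(out).digest()
            return out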

    People have tried to squeeze both kinds of properties into a single primitive, which if they succeeded, would be both a cryptographic hash and a slow function. But they haven't always paid attention to the differences in requirements. And often the result has been called a hash function, which confuses people, since it is different from a cryptographic hash.

    One nice property of slow functions as I would define them is that given multiple candidate functions, you can just compute all of them and concatenate the results. And you will have another slow function, which is guaranteed to be at least as strong as the strongest of the functions you started with.
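    Assuming two candidate slow functions f and g, the concatenation construction is trivial to write down; whatever the internal details, an attacker has to pay the cost of evaluating both:

        from typing import Callable

        SlowFn = Callable[[bytes], bytes]

        def concat_slow_functions(f: SlowFn, g: SlowFn) -> SlowFn:
            """Concatenate the outputs of two candidate slow functions; the result
            costs at least as much to evaluate as the costlier of the two."""
            def combined(data: bytes) -> bytes:
                return f(data) + g(data)
            return combined

        # Example, reusing the illustrative slow_function above at two different costs:
        h = concat_slow_functions(lambda d: slow_function(d, 200_000),
                                  lambda d: slow_function(d, 100_000))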

    Once you have all the primitives you need, you can combine them into a cryptographic system that achieves some goal. If you want to protect passwords, I think you are going to need both a slow function and a hash function. For the combined system, you can actually give a formal proof of security. The proof of course assumes that the underlying primitives satisfy the promised criteria. I guess the password protection you would implement given the above primitives would compute the slow function on the password, and then compute a hash of the password, the salt, and the output of the slow function, as in the sketch below.
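    Under those assumptions, a sketch of that construction (reusing the illustrative slow_function and SHA-512 pieces above; none of this is a standardized scheme) might be:

        import hashlib
        import os

        def protect_password(password: bytes) -> tuple[bytes, bytes]:
            """Sketch: store (salt, hash(password || salt || slow(password)))."""
            salt = os.urandom(16)
            slow_output = slow_function(password)              # illustrative slow function above
            digest = hashlib.sha512(password + salt + slow_output).digest()
            return salt, digest

    Because the final hash still covers the password and salt directly, a total failure of the slow function degrades the scheme to an ordinary salted hash rather than breaking it outright, which is the fallback argument below.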

    Additionally, you could prove that regardless of the strength of the slow function, the password is as well protected as it would have been with just a salted hash. By handling those as separate primitives, you can reason about the security assuming the failure of one of the primitives. Such a construction would eliminate the main argument against some of the existing slow functions, which is that they haven't had sufficient review.

  • Re:Too slow? (Score:5, Interesting)

    by kasperd ( 592156 ) on Tuesday September 25, 2012 @07:16AM (#41447815) Homepage Journal

    Claiming that a hash function is insecure because it is fast would be like claiming AES128 is secure because you can derive the decryption key from the encryption key.

    That obviously should have said "Claiming that a hash function is insecure because it is fast would be like claiming AES128 is insecure because you can derive the decryption key from the encryption key."

    Put differently. If you use AES when you should have used RSA, you don't blame AES for being insecure. If you use a SHA512 when you should have used a slow function, you shouldn't blame SHA512 for being insecure.

    When you reason about the security of cryptographic systems, you usually show how many times a certain primitive must be invoked to break the system. And if that number is large (usually meaning 2^128 or more), then you consider the system to be secure. It is not a threat if the primitive itself is fast, because once you have to execute it 2^128 times, it will still take "forever".
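    A quick calculation makes "forever" concrete. Assuming an attacker who can somehow perform 10^15 invocations of the primitive per second (a deliberately generous figure), 2^128 invocations still take on the order of 10^16 years:

        invocations = 2 ** 128
        rate = 10 ** 15                          # assumed: 10^15 invocations per second
        seconds_per_year = 365.25 * 24 * 3600
        print(f"{invocations / rate / seconds_per_year:.1e} years")   # roughly 1e16 years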

    But for protection of weak passwords you can't give such a guarantee. Those can be broken with a relatively small number of invocations of the primitive. At that point the resources it takes to compute the primitive matter, and adding another requirement to a primitive means it is no longer the same kind of primitive.

  • Re:Future proofing (Score:5, Interesting)

    by plover ( 150551 ) on Tuesday September 25, 2012 @07:55AM (#41447997) Homepage Journal

    Bruce's argument is essentially "the devil you know." Five years ago it seemed like SHA-2 might succumb to an attack. However, it's been five years, and those attacks never materialized on SHA-512. That's five more years of convincing evidence that SHA-2 is fairly secure. None of the SHA-3 finalists have had the same level of scrutiny. Sure, they've been looked at seriously, but nothing like the widespread amount of attention that an actual standard gets.

    Another consideration is practicality. If a new standard is published, millions of dollars will be expended changing systems around the world. Will all that money have been well spent? If there was no cryptographic reason for the change, all that money and effort was wasted.

    And what about security? Will every replacement be perfect? I personally doubt it; mistakes are made, and people screw up implementations all the time. An organization that hired a cryptographer to design and implement a secure solution in 2007 might feel they can do the work themselves today. But we know that cryptographic protocols are notoriously finicky when it comes to unintended information leakage and other subtle security failures. If a secure design is modified in any way, the potential to introduce a security bug means the risk of change is much higher than the risk of sticking with SHA-2.
