Encryption Mozilla Security The Internet

Deprecation of MD5 and SHA1 -- Just in Time? (threatpost.com) 87

mitcheli writes: If you're hanging on to the theory that collision attacks against SHA-1 and MD5 aren't yet practical, two researchers from INRIA, the French Institute for Research in Computer Science and Automation, have provided reason for urgency. They demonstrated a new series of transcript collision attacks centered on the SHA-1 and MD5 implementations in TLS 1.1, 1.2 and 1.3, along with IKEv1 and v2, and SSH 2. They say, "Our main conclusion is that the continued use of MD5 and SHA1 in mainstream cryptographic protocols significantly reduces their security and, in some cases, leads to practical attacks on key protocol mechanisms" (PDF). Of course, Mozilla officially began rejecting new SHA-1 certificates as of the first of the year. And as promised, there have been some usability issues. Mozilla said on Wednesday that various security scanners and antivirus products are keeping some users from reaching HTTPS websites.
  • hashmd5(data) is weak.
    hashsha1(data) is weak.
    hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.

    • hashmd5(data) is weak.
      hashsha1(data) is weak.
      hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.

      hashsha1(data) . hashmd5(data) could possibly be better, but what you're suggesting sounds worse to me.
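      In Python terms, the two constructions being compared look roughly like this (hashlib stands in for the pseudocode above; this is a sketch, not a recommendation):

        import hashlib

        def nested(data):
            # sha1 over the md5 digest: if md5(a) == md5(b), the outer sha1 sees
            # identical input, so the final digests collide as well
            return hashlib.sha1(hashlib.md5(data).digest()).hexdigest()

        def concatenated(data):
            # sha1 || md5: a forger needs a single pair of inputs that collides
            # under both functions at once
            return hashlib.sha1(data).hexdigest() + hashlib.md5(data).hexdigest()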

      • by WhiteKnight07 ( 521975 ) on Friday January 08, 2016 @10:41AM (#51262091)
        Actually concatenating hashes together doesn't do much for security at all. In fact it does almost nothing. See: http://link.springer.com/chapt... [springer.com]
        • by cc1984_ ( 1096355 ) on Friday January 08, 2016 @04:02PM (#51264673)

          Very interesting article. However, it seems to be saying that a concatenation of an X bit hash and a Y bit hash is no better than a third hash of length X+Y bits. My original point was that md5 . sha1 would be better than md5 or sha1 alone, but even then I'm no crypto expert, so I'm prepared to be proved wrong on that.

        • You seem to have missed the bit that says:

          We also discuss the potential impact of our attack on several published schemes. Quite surprisingly, for subtle reasons, the schemes we study happen to be immune to our attack.

          In other words, the paper says "Here is a devastating attack on paired hash functions that happens not to work on real-world uses of paired hash functions". In fact RIPEMD-160, the longest-lived unbroken hash function [valerieaurora.org], gets its security precisely from being a cascaded hash (and is immune to the attack described in the paper).

      • by Anonymous Coward

        hashsha1(data) . hashmd5(data) could possibly be better, but what you're suggesting sounds worse to me.

        Sounds pretty dumb.

        Then I'll just split the concatenated hash and find a hash collision in either one of em.

      • by Anonymous Coward

        salt = newuuid();
        final = salt + hashsha1(data + salt);

        For storing hashes of data, this is a no-brainer. It also makes the second hash function unnecessary, providing a small, but measurable, performance boost.
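        A minimal Python sketch of that scheme, with os.urandom standing in for newuuid() and a 16-byte salt chosen purely as an example:

          import hashlib, hmac, os

          def store(data):
              salt = os.urandom(16)              # random per-record salt
              return salt + hashlib.sha1(data + salt).digest()

          def verify(data, stored):
              salt, digest = stored[:16], stored[16:]
              return hmac.compare_digest(hashlib.sha1(data + salt).digest(), digest)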

        • by Bengie ( 1121981 )
          Even better, although I used SHA512 myself.

          salt = cryptorand.getbytes(64);
          final = salt + HMACSHA1(data, salt);
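          Roughly, in Python (SHA512 is used here because the comment mentions it; the 64-byte salt length mirrors the pseudocode):

            import hashlib, hmac, os

            def store(data):
                salt = os.urandom(64)            # 64 random bytes used as the HMAC key
                return salt + hmac.new(salt, data, hashlib.sha512).digest()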
          • Even better, although I used SHA512 myself.

              salt = cryptorand.getbytes(64);
              final = salt + HMACSHA1(data, salt);

            What good is the HMAC in this case?

        • salt = newuuid();
          final = salt + hashsha1(data + salt);

          For storing hashes of data, this is a no-brainer. It also makes the second hash function unnecessary, providing a small, but measurable, performance boost.

          Using salts is like publicly apologizing for wrongdoing.

          I've tried over the years to make people understand the simple truth that only exponents can protect information. Making something 1000 or 1000000 times harder "sounds" like a worthwhile accomplishment, but in reality it means very little, especially if what you are protecting is worth anything to anyone.

          If you feel you need to use salts to protect something you are doing that something a disservice by continuing with your current course of act

          • Using salts is like publicly apologizing for wrongdoing.

            The main purpose of salting a hash is to protect against chosen plaintext attacks.

      • hashsha1(data) . hashmd5(data) could possibly be better, but what you're suggesting sounds worse to me.

        That's exactly what SSLv3 did. The TLS folks then decided they could drop the dual hash because, you know, SHA-1 is so good that we don't need the extra safety level. It's one of the rare cases when the crypto in protocol version n+1 is actually weaker than in version n.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      You shouldn't let amateurs design cryptographic systems either...

    • by Anonymous Coward on Friday January 08, 2016 @10:42AM (#51262101)

      hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.

      This is not widely believed by crypto-security folks to be more secure.

      See e.g. https://crypto.stanford.edu/~xb/crypto06b/blackboxhash.pdf ---

      We studied the problem of combining multiple hash functions into a single function
      that is collision resistant whenever at least one of the original functions is.
      The hope was that, given a number of plausibly secure hash constructions (e.g.,
      SHA-512 and Whirlpool), one might be able to hedge one’s bet and build a new
      function that is at least as secure as the strongest of them. The combination
      should be space efficient in that the final output should be smaller than the
      concatenation of all hashes.

      We showed that no such efficient black-box combination is possible assuming each hash function is evaluated once.

      • If I had to deal with "n" hash algorithms, I'd probably just do it the easy way -- take, store, and sign with each hashing algorithm. Of course, this does add computation steps, not to mention more space for hashes to be stored. However, if one wants parallel chain links for top security, this is the surest bet.

        If I wanted to guess and assume I'd have -some- protection, I'd store the size of the file, and the hashes XOR-ed together. However, like the parent poster stated, there isn't a function that guarant
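        A small sketch of the "store every digest" idea from the first paragraph (the particular set of algorithm names is just an example):

          import hashlib

          def all_digests(data):
              # one digest per algorithm; verification requires every one to match
              return {name: hashlib.new(name, data).hexdigest()
                      for name in ("md5", "sha1", "sha256")}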

      • I have not fully read the above paper, but the initial summary seemed to be investigating the results of 'compressing' the concatenated results of the two hashes of the data, without investigating the properties of one algorithm hashing the other's hash of the data.

    • by Anonymous Coward

      this is wrong and dangerous. you should feel ashamed for posting this.

    • If the first function is breakable, the whole chain is breakable because hashes are deterministic.

      Always use input constraints. Most (all?) hash breaks involve the ability to shovel arbitrary garbage into the hash function. For example, if you want to forge an HTML document with the same MD5 as another HTML document, you embed your carefully crafted junk in a HTML comment, so that it isn't a visible part of the document. If your electronic policies disallow invisible HTML elements, and your human policie

    • hashsha1(hashmd5(data)) is strong

      A hash is a many-to-one function. If you combine many hashes this way, you'll map a huge range of different data onto the same output. To me it seems this would make collisions easier to find, not harder. For starters, the output of sha1 has a fixed length, which greatly limits the range of the final output.

      Combining different hashes is a good idea, but one weak link in the chain will probably ruin it all. I'd rather combine the outputs of different hashes into one long string to keep them independent.

      • by Threni ( 635302 )

        "If you combine many hashes this way, you'll map a huge range of different data onto the same output....Combining different hashes is a good idea, but one weak link in the chain will probably ruin it all."

        OP's not suggesting you combine an arbitrary number of arbitrary hash algorithms though, but specifically those two, once each.

    • That sounds like nonsense, doesn't it?
      SHA1(data) is a constant for a given "data"
      MD5(data) is a (different) constant for a given "data"

      SHA1(MD5(data)) is thus SHA1 of a constant, which gives you exactly zilch in terms of improving (in)security. At least it is not worse.

      Trying to improve on a "broken" cryptographic function by combining simply does not work, especially if the theory is not well understood.
      For example, applying two different cryptographic functions one after the other on some data ( f(g(data

    • hashmd5(data) is weak.
      hashsha1(data) is weak.
      hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.

      While this construction is wrong, I never understood what the problem is with the general approach. The PRF in TLS is derived this way, except the hashes are not stacked but rather XOR'd together.

      All the justifications for doing away with this that I've ever read have been political rather than technical statements: this approach is not approved, untested, uncertain, etc.

      Assuming a sufficiently large block size what is the problem with a crapload of hash algorithms? Obviously defective algorithms pull down effective

    • by kiwix ( 1810960 )
      If the issue is with collisions (which is the case in this attack), then you just have to break the innermost hash function. When hashmd5(data1) == hashmd5(data2), you also have hashsha1(hashmd5(data1)) == hashsha1(hashmd5(data2)).
    • hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.

      If hashmd5(text1) == hashmd5(text2), then hashsha1(hashmd5(text1)) == hashsha1(hashmd5(text2)).

  • Catch 22 (Score:5, Interesting)

    by Geoffrey.landis ( 926948 ) on Friday January 08, 2016 @10:46AM (#51262129) Homepage

    Wow, looks like Firefox has some real problems.
    From the link quoted: https://blog.mozilla.org/secur... [mozilla.org]

    How to tell if you’re affected
    If you can access this article in Firefox, you’re fine.

    So, if your Firefox is affected, they won't tell you about it. They'll only tell you if your Firefox is not affected.

    Later, same blog post:

    What to do if you’re affected
    The easiest thing to do is to install the newest version of Firefox. You will need to do this manually, using an unaffected copy of Firefox or a different browser, since we only provide Firefox updates over HTTPS.

    So, if your Firefox is affected, you can't upgrade it: you need to have the working version of Firefox to download a working version of Firefox.

    What a Catch 22! You can't know about the problem unless you already have fixed the problem, and you can't fix the problem... unless you have already fixed the problem.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      People will notice: they will not be able to use their bank, for example. They will probably try reinstalling Firefox or, worse for Mozilla, install Chrome.

      • People will notice: they will not be able to use their bank, for example. They will probably try reinstalling Firefox,

        But, per the article, they can't upgrade to the more recent Firefox if what they have is the old version of Firefox.

        or, worse for Mozilla, install Chrome

        Exactly! When the fix for the problem is "use somebody else's product", I'd call this a real problem for Firefox. Wait, isn't that what I just said?

        Wow, looks like Firefox has some real problems.

  • Is MD5 the original name of MC5, who is known for "kicking out the jams"?
    • That's great. A Slashdot newbie making rap references. Be it noted that on this date you endeared yourself to Slashdotters everywhere!
  • by DERoss ( 1919496 ) on Friday January 08, 2016 @10:56AM (#51262195)

    For use in encryption or for verifying that a file is authentic, SHA1 and MD5 should definitely be avoided.

    When transmitting a file over a LAN, WAN, or the Internet, however, SHA1 and MD5 are still useful to ensure that the file has not been corrupted (e.g., packets lost). Also, those two hashes can be used to determine if two files in the same system are the same.
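    As a concrete example, a streaming SHA-1 checksum of a transferred file might look like this in Python (the chunk size is arbitrary):

      import hashlib

      def file_sha1(path, chunk_size=1 << 20):
          h = hashlib.sha1()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(chunk_size), b""):
                  h.update(chunk)
          return h.hexdigest()                   # compare against the sender's checksum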

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      When transmitting a file over a LAN, WAN, or the Internet, however, SHA1 and MD5 are still useful to ensure that the file has not been corrupted (e.g., packets lost).

      That's error checking though, not cryptography. They're not saying these hashes are useless, just not a good idea in security.

      For use in encryption or for verifying that a file is authentic, SHA1 and MD5 should definitely be avoided.

      ... Also, those two hashes can be used to determine if two files in the same system are the same.

      That kind of sounds like you contradicted yourself there. (Maybe some minor semantic difference)

      • by Anonymous Coward

        I think he's trying to say that for data de-duplication purposes those hashes might still be useful.
        You'll probably want to do a bit-wise compare to be sure.

        • Would it be better to use a large CRC as opposed to a cryptographically secure hash for deduplication work? CRCs are a lot easier to compute.

          • by Bengie ( 1121981 )
              Unless you have a cryptographically strong hash, you have to compare the data. CRCs are meant only to let you know if data has been corrupted, not whether the data is unique relative to other data. A CRC may be faster to compute than SHA2-256, but having to read and compare all of the data afterwards is much slower.
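              A sketch of that two-step approach, digest first to find candidates and a byte-wise comparison to confirm (sha256 is an arbitrary choice here):

                import filecmp, hashlib

                def same_file(path_a, path_b):
                    digest = lambda p: hashlib.sha256(open(p, "rb").read()).digest()
                    # cheap digest check first; full byte-wise compare only on a match
                    return digest(path_a) == digest(path_b) and filecmp.cmp(path_a, path_b, shallow=False)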
    • by TheCarp ( 96830 )

      Of course, everything depends on use case. Just being able to find collisions doesn't break all potential uses. It breaks specific use cases where the attacker has a known target and time to work.

      It doesn't break use cases where the attacker has an unknown target and little time. It wouldn't break an authentication protocol based on a hash challenge response, since the attacker is asked to offer up his hash, which is checked.

      Doesn't matter if he can generate arbitrary collisions then, because he has no targ

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Someone earlier said "You shouldn't let amateurs design cryptographic systems".

      You shouldn't let amateurs design transmission protocols either.
      SHA1 and MD5 aren't ideal for transmission. There are simpler hashes that are better at dealing with the type of errors that you typically get from transmissions.
      With the correct method you can even repair a couple of incorrectly transmitted bits without having to resend the file.

      • Someone earlier said "You shouldn't let amateurs design cryptographic systems".

        You shouldn't let amateurs design transmission protocols either.

        You shouldn't let amateurs do most things - including porn.

    • I'd use CRC-64 or CRC-128 over MD5 anyway, just because it takes a lot fewer cycles to compute CRCs than to do the more advanced work needed for a cryptographically secure hash.

      I would probably abandon MD5 and SHA1 altogether. If it doesn't need to be cryptographically secure, CRCs do the job. If it does need to be secure, I'd use SHA-3 with whatever length was needed for the job at hand. MD5 is "neither fish, nor fowl" and like MD4, just needs to be moved away from.
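      For what it's worth, Python's standard library only ships CRC-32 (zlib.crc32) rather than CRC-64/128, but it shows how cheap a CRC pass over a file is:

        import zlib

        def file_crc32(path, chunk_size=1 << 20):
            crc = 0
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    crc = zlib.crc32(chunk, crc)
            return crc & 0xFFFFFFFF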

      • SHA-3 is extremely new, and while it's passed multiple reviews, it hasn't undergone the same scrutiny that SHA2 has undergone. Trusting SHA3 at this point may be somewhat naive, especially since even the best attacks against SHA2, which has undergone much more intensive review, don't reduce its strength very far.

    • by GuB-42 ( 2483988 )

      MD5 is vulnerable to full collision attacks
      SHA-1 is vulnerable to freestart collision attacks
      Neither of them is vulnerable to preimage attacks

      It means that SHA-1 is still safe in the vast majority of applications. Freestart collisions are almost useless in practice; they are only worrisome because they may lead the way to full collision attacks.
      MD5 is safe for storing passwords or checking that a file has not been tampered with by a third party. For example, if you download a program, ask the developer (who y

      • And TLS 1.0 and 1.1 both use md5(data).sha1(data) to sign the initial handshake. And since concatenating hashes provides no real additional security, this lets an attacker muck around with the initial TLS exchange, perform a protocol downgrade attack by defeating TLS_FALLBACK_SCSV, and/or choose the symmetric cipher used in the TLS session. While it's not a full plaintext recovery of the TLS session contents, it is certainly not a good thing.
  • Two different issues (Score:5, Informative)

    by kiwix ( 1810960 ) on Friday January 08, 2016 @11:03AM (#51262241)
    The summary mixes two different issues... SHA-1 is being phased out for certificate signatures, but this is not what the SLOTH attack is about.
    SLOTH is about the use of MD5 and SHA1 inside the TLS protocol, to sign or MAC the key-exchange messages.
    (Disclaimer: I'm one of the authors of the paper)
  • So this is a problem concerning the use of weak hashes for cryptographic verification. But how "harmful" are hash collisions when all you're looking for are file transmission errors or bit rot [wikipedia.org]? Do we need a stronger hash function when dealing with compressed archives? Has anybody succeeded in creating two valid zip archives with the same MD5 sum?
    • ZFS uses CRC-64 hashes to check for bit rot for performance reasons.

      As for zip archives, maybe we need to move to a signature model where we use the hash and the file size. It is difficult to make two files with the same hash. It is a lot more difficult for them to be the same size.
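      In code, pairing a digest with the file size is straightforward (sha256 here is just an example choice, not what any archive format mandates):

        import hashlib, os

        def fingerprint(path):
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            return (os.path.getsize(path), digest)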

    • I don't develop cryptography and have only ever used MD5 for file integrity checks. Although my brother was talking the other day about a compressed encryption algorithm that sounded amazingly difficult to implement on even the best modern hardware.

      • There are many encryption and compression algorithms out there. I've seen some people try to combine the two. However, there is a vast distance between rolling your own scheme and making something resistant to modern attacks. Older Mac programs did their own custom algorithms by using a random function seeded with what the user typed in as a password, some other off-the-cuff algorithm, or just two rounds of DES for performance reasons.

        Problem is that you don't want the bulk cipher as the weak link, when it is qu

        • That's the thing with encryption: there are always trade-offs to weigh. Poor performance that makes brute-force attacks too costly versus the reward, so long as a side channel isn't found that renders it completely useless?

          Someone using a unique, proprietary, and undisclosed system of encryption, if it is crafted as well as any standard, may actually fare better so long as they are not a huge public company.

          • That is the rub. If someone has a unique algorithm that is quite good, and it is kept secret, it would add to the security, especially if the attacker didn't have any access to the endpoints for side channel attacks, so they only have ciphertext to go on. However, there are a lot of attacks that one has to deal with, and eventually the algorithm may be found to be weaker than expected. Skipjack comes to mind as an example of how an algorithm turned out to be considerably weaker (though not completely broken) once it was declassified.
