Deprecation of MD5 and SHA1 -- Just in Time? (threatpost.com) 87
mitcheli writes: If you're hanging on to the theory that collision attacks against SHA-1 and MD5 aren't yet practical, two researchers from INRIA, the French Institute for Research in Computer Science and Automation, have provided reason for urgency. They demonstrated a new series of transcript collision attacks centered on the SHA-1 and MD5 implementations in TLS 1.1, 1.2 and 1.3, along with IKEv1 and v2, and SSH 2. They say, "Our main conclusion is that the continued use of MD5 and SHA1 in mainstream cryptographic protocols significantly reduces their security and, in some cases, leads to practical attacks on key protocol mechanisms" (PDF). Of course, Mozilla officially began rejecting new SHA-1 certificates as of the first of the year. And as promised, there have been some usability issues. Mozilla said on Wednesday that various security scanners and antivirus products are keeping some users from reaching HTTPS websites.
You shouldn't use one hash. (Score:2, Interesting)
hashmd5(data) is weak.
hashsha1(data) is weak.
hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.
Re: (Score:1)
hashmd5(data) is weak.
hashsha1(data) is weak.
hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.
hashsha1(data) . hashmd5(data) could possibly be better, but what you're suggesting sounds worse to me.
Re:You shouldn't use one hash. (Score:5, Informative)
Re:You shouldn't use one hash. (Score:4, Insightful)
Very interesting article. However, it seems to be saying that a concatenation of an X bit hash and a Y bit hash are no better than a third hash of length X+Y bits. My original comment was in respect that md5 . sha1 would be better than md5 or sha1 alone, but even then I'm no crypto expert so I'm prepared to be proved wrong on that.
Re: (Score:2)
You seem to have missed the bit that says:
We also discuss the potential impact of our attack on several published schemes. Quite surprisingly, for subtle reasons, the schemes we study happen to be immune to our attack.
In other words, the paper says "Here is a devastating attack on paired hash functions that happens not to work on real-world uses of paired hash functions". In fact RIPEMD-160, the longest-lived unbroken hash function [valerieaurora.org], gets its security precisely from being a cascaded hash (and is immune to the attack described in the paper).
Re: (Score:1)
hashsha1(data) . hashmd5(data) could possibly be better, but what you're suggesting sounds worse to me.
Sounds pretty dumb.
Then I'll just split the concatenated hash and find a hash collision in either one of them.
Re: (Score:2)
What would you do with the one hash collision if the other one doesn't match?
Re: (Score:1)
salt = newuuid();
final = salt + hashsha1(data + salt);
For storing hashes of data, this is a no-brainer. It also makes the second hash function unnecessary, providing a small, but measurable, performance boost.
Re: (Score:3)
salt = cryptorand.getbytes(64);
final = salt + HMACSHA1(data, salt);
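A runnable version of that pseudocode, as a sketch using Python's standard library (`os.urandom` and the `hmac` module stand in for the `cryptorand`/`HMACSHA1` calls above):

```python
import hashlib
import hmac
import os

def hash_with_salt(data: bytes) -> bytes:
    """Return salt || HMAC-SHA1(key=salt, msg=data), as in the sketch above."""
    salt = os.urandom(64)  # 64 bytes from a cryptographic RNG
    mac = hmac.new(salt, data, hashlib.sha1).digest()
    return salt + mac

def verify(data: bytes, stored: bytes) -> bool:
    """Split off the stored salt, recompute the MAC, compare in constant time."""
    salt, mac = stored[:64], stored[64:]
    expected = hmac.new(salt, data, hashlib.sha1).digest()
    return hmac.compare_digest(mac, expected)

stored = hash_with_salt(b"example data")
print(verify(b"example data", stored))  # True
print(verify(b"tampered", stored))      # False
```

Because the salt is stored alongside the MAC, verification needs no extra state; `hmac.compare_digest` avoids leaking where the comparison fails.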
Re: (Score:2)
Even better, although I used SHA-512 myself.
salt = cryptorand.getbytes(64);
final = salt + HMACSHA1(data, salt);
What good is the HMAC in this case?
Re: (Score:2)
salt = newuuid();
final = salt + hashsha1(data + salt);
For storing hashes of data, this is a no-brainer. It also makes the second hash function unnecessary, providing a small, but measurable, performance boost.
Using salts is like publically apologizing for wrongdoing.
I've tried over the years to make people understand the simple truth: only exponents can protect information. Making something 1,000 or 1,000,000 times harder "sounds" like a worthwhile accomplishment, but in reality it means very little, especially if what you are protecting is worth anything to anyone.
If you feel you need salts to protect something, you are doing that something a disservice by continuing with your current course of action.
Re: (Score:2)
Using salts is like publically apologizing for wrongdoing.
The main purpose of salting a hash is to protect against precomputed dictionary (rainbow table) attacks.
Re: (Score:2)
hashsha1(data) . hashmd5(data) could possibly be better, but what you're suggesting sounds worse to me.
That's exactly what SSLv3 did. The TLS folks then decided they could drop the dual hash because, you know, SHA-1 is so good that we don't need the extra safety level. It's one of the rare cases when the crypto in protocol version n+1 is actually weaker than in version n.
Re: (Score:2, Insightful)
You shouldn't let amateurs design cryptographic systems either...
Re:You shouldn't use one hash. (Score:5, Informative)
hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.
This is not widely believed by crypto-security folks to be more secure.
See e.g. https://crypto.stanford.edu/~xb/crypto06b/blackboxhash.pdf ---
We studied the problem of combining multiple hash functions into a single function
that is collision resistant whenever at least one of the original functions is.
The hope was that, given a number of plausibly secure hash constructions (e.g.,
SHA-512 and Whirlpool), one might be able to hedge one’s bet and build a new
function that is at least as secure as the strongest of them. The combination
should be space efficient in that the final output should be smaller than the
concatenation of all hashes.
We showed that no such efficient black-box combination is possible, assuming each hash function is evaluated once.
Re: (Score:1)
If I had to deal with "n" hash algorithms, I'd probably just do it the easy way -- take, store, and sign with each hashing algorithm. Of course, this does add computation steps, not to mention more space for hashes to be stored. However, if one wants parallel chain links for top security, this is the surest bet.
If I wanted to guess and assume I'd have -some- protection, I'd store the size of the file, and the hashes XOR-ed together. However, like the parent poster stated, there isn't a function that guarant
Re: (Score:2)
I have not fully read the paper above, but its initial summary seems to investigate 'compressing' the concatenated results of the two hashes of the data, without investigating the properties of hashing the result of one algorithm with the other.
Re: (Score:1)
this is wrong and dangerous. you should feel ashamed for posting this.
Re: (Score:2)
If the first function is breakable, the whole chain is breakable because hashes are deterministic.
Always use input constraints. Most (all?) hash breaks involve the ability to shovel arbitrary garbage into the hash function. For example, if you want to forge an HTML document with the same MD5 as another HTML document, you embed your carefully crafted junk in a HTML comment, so that it isn't a visible part of the document. If your electronic policies disallow invisible HTML elements, and your human policie
Re: (Score:2)
I know you were joking, but in case anyone was actually planning to try this: gzip isn't a monomorphism.
Re: (Score:2)
Aargh! Sorry, I meant gzip isn't a function. The same input may have many different gzipped representations.
Re: (Score:2)
hashsha1(hashmd5(data)) is strong
A hash is a many-to-one function. If you chain hashes this way, you'll map a huge range of different data onto the same output. To me it seems this would make collisions easier to find, not harder. For starters, the output of sha1 has a fixed length, which greatly limits the range of the final output.
Combining different hashes is a good idea, but one weak link in the chain will probably ruin it all. I'd rather combine the outputs of different hashes into one long string to keep them independent.
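Concatenating the digests as described is a one-liner (a minimal sketch using Python's hashlib):

```python
import hashlib

def dual_hash(data: bytes) -> str:
    """Concatenate independent SHA-1 and MD5 hex digests (40 + 32 chars).
    An attacker now has to collide both functions on the same pair of inputs."""
    return hashlib.sha1(data).hexdigest() + hashlib.md5(data).hexdigest()

print(len(dual_hash(b"hello")))  # 72
```

Note that, per the Boneh/Boyen result quoted elsewhere in this thread, this costs the full combined output length; no shorter combiner can guarantee the security of the stronger component.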
Re: (Score:1)
"If you combine many hashes this way, you'll map a huge range of different data onto the same output....Combining different hashes is a good idea, but one weak link in the chain will probably ruin it all."
OP's not suggesting you combine an arbitrary number of arbitrary hash algorithms though, but specifically those two, once each.
Re: (Score:1)
That sounds like nonsense, doesn't it?
SHA1(data) is a constant for a given "data"
MD5(data) is a (different) constant for a given "data"
SHA1(MD5(data)) is thus the SHA1 of a constant, which gives you exactly zilch in terms of improving (in)security. At least it is not worse.
Trying to improve a "broken" cryptographic function by combining simply does not work, especially if the theory is not well understood.
For an example, applying two different cryptographic functions after each other on some data ( f(g(data
Re: (Score:2)
hashmd5(data) is weak.
hashsha1(data) is weak.
hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.
While this particular construction is wrong, I never understood what the problem is with the general approach. The PRF in TLS is derived this way, except the hashes are not stacked but XOR'd together.
All the justifications I've ever read for not doing away with this have been political rather than technical: the approach is not approved, untested, uncertain, etc.
Assuming a sufficiently large block size, what is the problem with a crapload of hash algorithms? Obviously defective algorithms pull down effective
Re: (Score:2)
Re: (Score:3)
hashsha1(hashmd5(data)) is strong, and unlikely to be attacked successfully unless your key data is too short.
If hashmd5(text1) == hashmd5(text2), then hashsha1(hashmd5(text1)) == hashsha1(hashmd5(text2)).
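This can be demonstrated with a toy weak inner hash (MD5 truncated to one byte, a hypothetical stand-in for a broken function, chosen purely so a collision is cheap to brute-force): any collision in the inner hash passes straight through the outer SHA-1.

```python
import hashlib

def weak_hash(data: bytes) -> bytes:
    """Stand-in for a broken inner hash: MD5 truncated to a single byte."""
    return hashlib.md5(data).digest()[:1]

def composed(data: bytes) -> str:
    """sha1(weak_hash(data)) -- the construction under discussion."""
    return hashlib.sha1(weak_hash(data)).hexdigest()

# Brute-force a collision in the weak inner hash (guaranteed within 257 inputs).
seen = {}
for i in range(100000):
    msg = str(i).encode()
    h = weak_hash(msg)
    if h in seen:
        a, b = seen[h], msg
        break
    seen[h] = msg

print(a != b)                      # True: two distinct inputs
print(composed(a) == composed(b))  # True: the outer SHA-1 cannot separate them
```

The outer hash only ever sees the inner digest, so once the inner function collides, no outer function, however strong, can tell the inputs apart.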
Catch 22 (Score:5, Interesting)
Wow, looks like Firefox has some real problems.
From the link quoted: https://blog.mozilla.org/secur... [mozilla.org]
How to tell if you’re affected
If you can access this article in Firefox, you’re fine.
So, if your Firefox is affected, they won't tell you about it. They'll only tell you if your Firefox is not affected.
Later, same blog post:
What to do if you’re affected
The easiest thing to do is to install the newest version of Firefox. You will need to do this manually, using an unaffected copy of Firefox or a different browser, since we only provide Firefox updates over HTTPS.
So, if your Firefox is affected, you can't upgrade it: you need to have the working version of Firefox to download a working version of Firefox.
What a Catch 22! You can't know about the problem unless you already have fixed the problem, and you can't fix the problem... unless you have already fixed the problem.
Re: (Score:2, Interesting)
People will notice; they will not be able to use their bank, for example. They will probably try reinstalling Firefox or, worse for Mozilla, install Chrome
Re: (Score:2)
People will notice; they will not be able to use their bank, for example. They will probably try reinstalling Firefox,
But, per the article, they can't upgrade to the more recent Firefox if what they have is the old version of Firefox.
or, worse for Mozilla, install Chrome
Exactly! When the fix for the problem is "use somebody else's product", I'd call this a real problem for Firefox. Wait, isn't that what I just said?
Wow, looks like Firefox has some real problems.
MD5? (Score:1)
Re: (Score:2)
It Depends on Why You Are Using Hash Codes (Score:4, Informative)
For use in encryption or for verifying that a file is authentic, SHA1 and MD5 should definitely be avoided.
When transmitting a file over a LAN, WAN, or the Internet, however, SHA1 and MD5 are still useful to ensure that the file has not been corrupted (e.g., packets lost). Also, those two hashes can be used to determine if two files in the same system are the same.
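For that non-adversarial integrity check, the usual pattern is a chunked digest (a sketch using Python's hashlib; the chunk size is an arbitrary choice):

```python
import hashlib

def file_digest(path: str, algorithm: str = "md5", chunk_size: int = 65536) -> str:
    """Hash a file incrementally so large files never need to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Equal digests make accidental corruption overwhelmingly unlikely; against a deliberate tamperer, though, MD5 and SHA-1 no longer offer any guarantee.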
Re: (Score:2, Interesting)
When transmitting a file over a LAN, WAN, or the Internet, however, SHA1 and MD5 are still useful to ensure that the file has not been corrupted (e.g., packets lost).
That's error checking though, not cryptography. They're not saying these hashes are useless, just not a good idea in security.
For use in encryption or for verifying that a file is authentic, SHA1 and MD5 should definitely be avoided.
... Also, those two hashes can be used to determine if two files in the same system are the same.
That kind of sounds like you contradicted yourself there. (Maybe some minor semantic difference)
Re: (Score:1)
I think he's trying to say that for data de-duplication purposes those hashes might still be useful.
You'll probably want to do a bit-wise compare to be sure.
Re: (Score:2)
Would it be better to use a large CRC as opposed to a cryptographically secure hash for deduplication work? CRCs are a lot easier to compute.
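A common compromise is to use the cheap checksum only as a prefilter and fall back to the byte-wise compare suggested above (a sketch using Python's `zlib.crc32`):

```python
import zlib

def crc32(data: bytes) -> int:
    """Unsigned CRC-32 of a byte string."""
    return zlib.crc32(data) & 0xFFFFFFFF

def is_duplicate(a: bytes, b: bytes) -> bool:
    """Cheap CRC prefilter first; only matching CRCs trigger the full compare."""
    if crc32(a) != crc32(b):
        return False  # different CRCs -> definitely different data
    return a == b     # CRCs match -> confirm byte-wise, since CRCs do collide

print(is_duplicate(b"block", b"block"))  # True
print(is_duplicate(b"block", b"other"))  # False
```

The CRC rejects almost all non-duplicates without reading both blocks twice, and the final equality check removes any risk from CRC collisions.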
Re: (Score:3)
Re: (Score:3)
Of course, everything depends on use case. Just being able to find collisions doesn't break all potential uses. It breaks specific use cases where the attacker has a known target and time to work.
It doesn't break use cases where the attacker has an unknown target and little time. It wouldn't break an authentication protocol based on a hash challenge response, since the attacker is asked to offer up his hash, which is checked.
Doesn't matter if he can generate arbitrary collisions then, because he has no targ
Re: (Score:2, Informative)
Someone earlier said "You shouldn't let amateurs design cryptographic systems".
You shouldn't let amateurs design transmission protocols either.
SHA1 and MD5 aren't ideal for transmission. There are simpler hashes that are better at dealing with the type of errors that you typically get from transmissions.
With the correct method you can even repair a couple of incorrectly transmitted bits without having to resend the file.
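The single-bit repair described above is what error-correcting codes such as Hamming(7,4) provide (a minimal sketch, not a production codec):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]    # parity over positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]    # parity over positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]    # parity over positions 4, 5, 6, 7
    error_pos = s1 + 2 * s2 + 4 * s3  # 1-indexed position; 0 means no error
    if error_pos:
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[4] ^= 1                           # simulate a single-bit transmission error
print(hamming74_decode(code) == data)  # True: the flipped bit was repaired
```

The three parity bits form a syndrome that spells out the exact position of a single flipped bit, so the receiver can repair it without a retransmission.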
Re: (Score:3)
Someone earlier said "You shouldn't let amateurs design cryptographic systems".
You shouldn't let amateurs design transmission protocols either.
You shouldn't let amateurs do most things - including porn.
Re: (Score:2)
I'd use CRC-64 or CRC-128 over MD5 anyway, just because it is a lot fewer cycles to compute CRCs than to do the more advanced work needed for a cryptographically secure hash.
I would probably abandon MD5 and SHA1 altogether. If it doesn't need to be cryptographically secure, CRCs do the job. If it does need to be secure, I'd use SHA-3 with whatever length was needed for the job at hand. MD5 is "neither fish, nor fowl" and like MD4, just needs to be moved away from.
Re: (Score:2)
SHA-3 is extremely new, and while it has passed multiple reviews, it hasn't undergone the same scrutiny as SHA-2. Trusting SHA-3 at this point may be somewhat naive, especially since even the best attacks against SHA-2, which has undergone much more intensive review, don't reduce its strength very far.
Re: (Score:1)
Re: (Score:2)
MD5 is vulnerable to full collision attacks
SHA-1 is vulnerable to freestart collision attacks
Neither is vulnerable to preimage attacks
It means that SHA-1 is still safe in the vast majority of applications. Freestart collisions are almost useless in practice, they are only worrisome because they may lead the way to full collision attacks.
MD5 is safe for storing passwords or checking that a file has not been tampered with by a third party. For example, if you download a program, ask the developer (who y
Re: (Score:2)
Two different issues (Score:5, Informative)
SLOTH is about the use of MD5 and SHA1 inside the TLS protocol, to sign or MAC the key-exchange messages.
(Disclaimer: I'm one of the authors of the paper)
Still useful outside a cryptographic context? (Score:2)
Re: (Score:1)
ZFS uses CRC-64 hashes to check for bit rot for performance reasons.
As for zip archives, maybe we need to move to a signature model where we use both the hash and the file size. It is difficult to make two files with the same hash; it is a lot more difficult to also make them the same size.
Re: (Score:3)
Re: (Score:2)
I don't develop cryptography and have only ever used MD5 for file integrity checks. Although my brother was talking the other day about a compressed encryption algorithm that sounded amazingly difficult to implement even on the best modern hardware.
Re: (Score:1)
There are many encryption and compression algorithms out there, and I've seen some people try to combine the two. However, there is a vast distance between doing that and making something resistant to modern attacks. Older Mac programs rolled their own custom algorithms: seeding a random function with what the user typed in as a password, some other off-the-cuff scheme, or just doing two rounds of DES for performance reasons.
Problem is that you don't want the bulk cipher as the weak link, when it is qu
Re: (Score:2)
That's the thing with encryption: there are always trade-offs to weigh. Is poor performance worth it to make brute-force attacks too costly relative to the reward, so long as a side channel isn't found that renders it completely useless?
Someone using a unique, proprietary, and undisclosed system of encryption, if crafted as well as any standard, may actually fare better, so long as they are not a huge public company.
Re: (Score:1)
That is the rub. If someone has a unique algorithm that is quite good, and it is kept secret, it would add to the security, especially if the attacker has no access to the endpoints for side-channel attacks and only ciphertext to go on. However, there are a lot of attacks to deal with, and eventually the algorithm may be found to be weaker than expected. Skipjack comes to mind: it was considerably weakened (but not completely broken) once it was declassified.
Re: (Score:1)
The old version of PGP (pre /dev/random) used to have something similar as well, where one typed in some keystrokes, and the timing was used to seed the RNG. I wish more programs had this as an option (KeePass does), mainly because it adds a decent source of randomness that is extremely hard to duplicate.
Re: (Score:2)
You are going to great lengths to subvert the security process by falsely clinging to a security-through-obscurity paradigm. If your system is secure, you can post all of the details of said system and it will be no less secure. Relying on keeping details of the implementation private is security through obscurity, and it only serves to limit the ability of the auditor to thoroughly and properly evaluate your scheme. No competent security professional believes t
Re: (Score:2)
The company I'm working for is undergoing some transitions and we needed a way to uniquely identify users without showing their personal information to everyone. I designed a system that uses some personal information (SSN, DOB), concatenates it into a string (whose composition is known to only a select few), and encrypts it using SHA1.
Your problem is not preimage resistance of SHA1 but the piss poor entropy of SSN + DOB.
If you want to do something like this at least use HMAC-SHA1 and try to keep the secret from falling into the wrong hands.
My question is this: Given this article and assuming that the hashes fell into the hands of someone who wanted to decrypt them, how long would it take to do so? Would it be hours? Months? Decades? Would having multiple hashes tell the would-be cracker anything about the structure of the decrypted string which might help shorten the decryption time? Or would they have to brute force each string until they got one decrypted?
None of this is relevant when the hashed data contains no meaningful entropy in the first place.
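The HMAC suggestion above can be sketched with Python's standard hmac module (the field values and the key source here are hypothetical; the key must live server-side, never alongside the resulting identifiers):

```python
import hashlib
import hmac

# Hypothetical server-side secret; in practice load it from a secrets manager,
# never hard-code it or store it next to the derived identifiers.
SECRET_KEY = b"load-me-from-a-secrets-manager"

def user_token(ssn: str, dob: str) -> str:
    """Derive a stable pseudonymous ID from low-entropy PII.
    Without SECRET_KEY, an attacker cannot brute-force SSN+DOB offline,
    which a bare hash of such guessable inputs would permit."""
    msg = f"{ssn}|{dob}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha1).hexdigest()

t1 = user_token("123-45-6789", "1980-01-01")
t2 = user_token("123-45-6789", "1980-01-01")
print(t1 == t2)  # True: deterministic, so it still works as a join key
```

The separator between the fields prevents ambiguous concatenations, and determinism preserves the identifier's use as a lookup key across systems.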
Re: (Score:2)