
BLAKE2 Claims Faster Hashing Than SHA-3, SHA-2 and MD5 134

hypnosec writes "BLAKE2 has recently been announced as a new alternative to the existing cryptographic hash algorithms MD5, SHA-2 and SHA-3. With applications in cloud storage, software distribution, host-based intrusion detection, digital forensics and revision control tools, BLAKE2 performs much faster than MD5 on Intel 32- and 64-bit systems. The developers of BLAKE2 insist that even though the algorithm is faster, there are no loose ends when it comes to security. BLAKE2 is an optimized version of BLAKE, one of the five SHA-3 finalists."
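In modern Python (3.6+), BLAKE2 ships in the standard hashlib module as blake2b and blake2s, so a minimal sketch of computing a digest — and a keyed MAC, which BLAKE2 supports natively — looks like this:

```python
import hashlib

# blake2b is the variant optimized for 64-bit platforms;
# blake2s targets smaller (8- to 32-bit) platforms.
digest = hashlib.blake2b(b"hello world").hexdigest()
print(digest)

# BLAKE2 supports keyed hashing directly, so no separate HMAC
# construction is needed to build a MAC.
mac = hashlib.blake2b(b"hello world", key=b"secret key").hexdigest()
```

The default blake2b digest is 64 bytes; a shorter output can be requested with the `digest_size` parameter.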
This discussion has been archived. No new comments can be posted.

  • Re:links to NIST (Score:2, Informative)

    by nazsco ( 695026 ) on Tuesday December 25, 2012 @02:47PM (#42389101) Journal

    Time to encode is not always related to time to decode, so it would only benefit dictionary attacks.

  • Re:links to NIST (Score:2, Informative)

    by Anonymous Coward on Tuesday December 25, 2012 @02:57PM (#42389207)

    There is no "decode" - breaking hashes is done by brute force (or by using dictionaries + modifiers).

  • Re:links to NIST (Score:5, Informative)

    by Anonymous Coward on Tuesday December 25, 2012 @03:01PM (#42389227)

    It is optimized for runtime speed, though they also claim a slightly lower memory footprint.
    The chance of hash collisions is about the same for any unbroken cryptographic hash.

    Keccak has good performance in dedicated hardware, but was one of the slower SHA-3 candidates in software implementations. Compared to MD5, the best SHA-3 implementation is between 2 and 4 times slower on recent Intel processors, or 5 times slower on a recent ARM CPU.

    Note that NIST has stated that any of the five SHA-3 finalists would have been an acceptable SHA-3. Keccak was selected in part because its construction is very different from the SHA-2 family's, decreasing the risk that both are broken at once. That does not mean Keccak is superior to BLAKE for all purposes: if software speed is the main concern, it is almost certainly worse.

  • Re:Missing the point (Score:4, Informative)

    by magic maverick ( 2615475 ) on Tuesday December 25, 2012 @03:18PM (#42389381) Homepage Journal

    Nah, you are. Hashes are used for a lot more than just passwords. Yes, for passwords a fast generic hash function like SHA-2 or SHA-3 (let alone MD5) is not such a good option. But for verifying that a downloaded executable or other file has not been modified, it's mostly fine. Just don't use MD5, because it's completely broken: it is possible, for example, to construct two distinct PDF files with the same hash.

    For password hashing use Blowfish/Bcrypt [codahale.com].
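As a sketch of the download-verification use case described above (the function name and chunk size are illustrative, and SHA-256 stands in for whichever digest a distributor actually publishes):

```python
import hashlib

def file_digest(path, algo="sha256", chunk=1 << 16):
    """Stream a file through a hash so large downloads never sit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Usage: compare against the digest published on the project's release page.
# expected = "..."  # copied from the distributor
# assert file_digest("installer.exe") == expected
```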

  • Re:Faster? (Score:4, Informative)

    by Anonymous Coward on Tuesday December 25, 2012 @03:22PM (#42389415)

    Faster hashing is better. The good guys have to pay a cost of X for hashing at b bits. The attacker has to, ideally, pay a cost of about 2^b*X for a preimage (or about 2^(b/2)*X for a collision, due to the birthday paradox, but that is still exponential). Halving X helps the good guys, because hashing is now faster for everyone. It doesn't help the attacker, because you should have chosen b so big that halving X makes absolutely no difference: 2^b is so large that the attack is completely impractical even if X is just the time for a single CPU cycle.

    Here's another way of looking at it: trying to impede the attacker by increasing X is the wrong idea because you ALSO have to pay for X. Instead, you want to focus on b because here your cost grows slowly with b while the attacker's cost grows exponentially with b. So smaller X is better, that is, faster hashes are better.
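The arithmetic behind this argument is easy to check. The numbers below are hypothetical (a one-nanosecond hash, a 256-bit digest), but they show why halving X is irrelevant once b is large:

```python
X = 1e-9                            # hypothetical cost: one nanosecond per hash
b = 256                             # digest size in bits
attacker_seconds = (2 ** b) * X     # idealized preimage attack cost
universe_age_seconds = 4.3e17       # roughly 13.7 billion years

# Even after halving X, the attack remains hopelessly impractical:
# the ratio to the age of the universe is still astronomically large.
print(attacker_seconds / 2 / universe_age_seconds)
```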

  • Re:Time (Score:5, Informative)

    by Gaygirlie ( 1657131 ) <gaygirlie@Nospam.hotmail.com> on Tuesday December 25, 2012 @03:24PM (#42389429) Homepage

    So use a generic hashthis() function (or class, whatever), and then you don't have to replace sha3() or blake2() or whatever throughout your code; merely modify the hashthis() function to use the new algorithm instead. Forward thinking, amirite?

    Otherwise yes, but switching would make all the existing hashes unverifiable. If you make it easy to change the hashing method like that, then you also have to track which data was hashed with which method.
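A minimal sketch of such a wrapper (the names are hypothetical); tagging each digest with the algorithm that produced it addresses the tracking problem the reply raises:

```python
import hashlib

CURRENT_ALGO = "sha3_256"   # swap for "blake2b" etc. in one place

def hashthis(data: bytes) -> str:
    """Hash with the current algorithm, tagging the result with its name."""
    return CURRENT_ALGO + "$" + hashlib.new(CURRENT_ALGO, data).hexdigest()

def verify(data: bytes, tagged: str) -> bool:
    """Verify against whichever algorithm the stored digest was made with."""
    algo, _, digest = tagged.partition("$")
    return hashlib.new(algo, data).hexdigest() == digest
```

Old records keep their old tags and remain verifiable; only newly written digests use the new algorithm.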

  • Re:links to NIST (Score:5, Informative)

    by marcansoft ( 727665 ) <hectorNO@SPAMmarcansoft.com> on Tuesday December 25, 2012 @03:25PM (#42389439) Homepage

    For password hashing, that is correct. However, cryptographic hash functions are not designed for such use (and yes, all the websites and services out there using plain hashes for passwords are Doing It Wrong, even if they are using a salt). You can build a good password hashing scheme out of a cryptographic hash function (for example, PBKDF2), but plain hash functions are not suitable (precisely because they are too fast).

    Fast cryptographically secure hash functions are a Good Thing, so you can hash a given block of data (and thus compute its digest and e.g. verify its digital signature) as fast as possible. This speeds up things like GPG, SSL, Git*, good old sha1sum checksums, etc. If you then want to use such a hash function as the core of a password hashing scheme, you can compensate for the extra speed by simply increasing the number of iterations. Making a hash function slower is always easy.

    *Git is pretty much stuck with SHA-1 for now, but future incompatible versions of the repo format could conceivably switch to a faster hash function if it made sense.
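Python's standard library exposes PBKDF2 directly as hashlib.pbkdf2_hmac, so a sketch of a password-hashing scheme built from a fast hash looks like this (the iteration count and salt size are illustrative, not recommendations):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000):
    """Derive a slow, salted hash from a fast one via PBKDF2."""
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, dk

def check_password(password: str, salt: bytes, iterations: int, dk: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, dk)  # constant-time comparison
```

Raising the iteration count is exactly the "compensate for the extra speed" knob the comment describes.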

  • Re:links to NIST (Score:5, Informative)

    by Anonymous Coward on Tuesday December 25, 2012 @03:37PM (#42389523)

    A common misconception, but untrue. Cryptographic hashes can be used for many purposes - for storing passwords, to sign documents, to verify file integrity, as a short unique identifier for content, etc.

    The only one of these for which slower is better is password storage.
    If you are using e.g. cryptographic hash functions with a 512-bit output, it is irrelevant for practical security that one hash function is a billion times faster than another. Either way, you will not find a collision in the lifetime of the universe without finding a weakness in the hash.

    Now, this applies when we can assume that any needed keys are of good randomness. This is not the case with passwords, because human-chosen passwords have low enough entropy that they can be guessed.
    However, it is very simple to make a slow hash from a fast one by using a key derivation function like PBKDF2 [wikipedia.org], which iterates the hash function a configurable number of times, often hundreds of thousands.

    In short, many applications benefit from a fast hash algorithm, and none suffer, because you can always easily make a slower one.
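As an illustration of "you can always easily make a slower one", naively iterating a fast hash looks like this (a toy sketch only; real systems should prefer a vetted construction like PBKDF2, bcrypt or scrypt):

```python
import hashlib

def slow_hash(data: bytes, rounds: int = 100_000) -> bytes:
    """Toy example: iterate a fast hash to multiply the attacker's cost.

    Each extra round costs the attacker as much as it costs us, so the
    per-guess cost scales linearly with `rounds`.
    """
    h = data
    for _ in range(rounds):
        h = hashlib.blake2b(h).digest()
    return h
```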
