Encryption Security Technology

New AES Attack Documented 236

avxo writes "Bruce Schneier covers a new cryptanalytic related-key attack on AES that is better than brute force with a complexity of 2^119. According to an e-mail by the authors: 'We also expect that a careful analysis may reduce the complexities. As a preliminary result, we think that the complexity of the attack on AES-256 can be lowered from 2^119 to about 2^110.5 data and time. We believe that these results may shed a new light on the design of the key-schedules of block ciphers, but they pose no immediate threat for the real world applications that use AES.'"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Yawn (Score:3, Insightful)

    by Shikaku ( 1129753 ) on Wednesday July 01, 2009 @05:56PM (#28550501)

    So instead of taking 1 million years to brute force, it will take .9 million years?

    I totally made up those numbers but that's about the difference.

    • Re:Yawn (Score:5, Informative)

      by a_n_d_e_r_s ( 136412 ) on Wednesday July 01, 2009 @06:55PM (#28551301) Homepage Journal

      Given that the new attack lowers the work by a factor of 2^9 = 512 (a reduction of about 99.8%): if it before took 1 million years, it now only takes roughly 2000 years.

      Remember: for every bit an attack shaves off, the time it takes to break the cipher is halved.
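That halving rule is trivial arithmetic (a sketch; the 2^128-to-2^119 figures are the ones discussed in this thread, and the million-year baseline is made up for illustration):

```python
# Each bit removed from an attack's complexity halves the work.
baseline_bits = 128   # effort figure used as the baseline in this thread
attack_bits = 119     # complexity of the new related-key attack

speedup = 2 ** (baseline_bits - attack_bits)
print(speedup)                      # 512
print(round(1_000_000 / speedup))   # a million years shrinks to ~1953 years
```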

      • by snikulin ( 889460 ) on Wednesday July 01, 2009 @07:28PM (#28551685)

        Security []

        • by froon ( 1160919 ) on Wednesday July 01, 2009 @08:42PM (#28552403)

          If you want your secrets to remain secret past the end of your life expectancy, then, in order to choose a key length, you have to be a futurist. You have to anticipate how much faster computers will get during this time. You must also be a student of politics. Because if the entire world were to become a police state obsessed with recovering old secrets, then vast resources might be thrown at the problem of factoring large prime numbers.

          So the length of the key that you use is, in and of itself, a code of sorts. A knowledgeable government eavesdropper, noting Randy's and Avi's use of a 4096-bit key, will conclude one of the following:

          -Avi doesn't know what he's talking about. This can be ruled out with a bit of research into his past accomplishments. Or,

          -Avi is clinically paranoid. This can also be ruled out with some research. Or,

          -Avi is extremely optimistic about the future development of computer technology, or pessimistic about the political climate, or both. Or,

          -Avi has a planning horizon that extends over a period of at least a century.

          -- Neal Stephenson, Cryptonomicon

    • Re: (Score:3, Insightful)

      by Eivind ( 15695 ) []
      Pay special attention to the reaction of the "slashdotter" to "minor weakness found", and compare it to your reaction.
      Remember, attacks always get better, never worse. The first attack that weakens an algorithm *is* a big deal.

      Oh, and reducing complexity from 2^128 to 2^110 isn't, as it may appear, a reduction of 10% in time-to-break; in fact it's a reduction by a factor of 2^18, or about 260,000. So it's more like: if before it took a million years, now it takes four.

  • by techno-vampire ( 666512 ) on Wednesday July 01, 2009 @05:57PM (#28550509) Homepage
    TFA refers (as does the summary) to complexity of 2^119, and possibly lowering it to 2^110.5. Could somebody rephrase that in a way that people like me, who aren't cryptography specialists can understand what they're talking about?
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      I believe the complexity is a rough measure of how long it should take to break the code. So in this case, a reduction from 2^119 to 2^110.5 is approximately 360 times faster (that is, a 2^119 complexity attack takes 360 times as long as a 2^110.5 complexity attack).

    • Re: (Score:3, Informative)

      by wealthychef ( 584778 )
      Normally, "complexity" in computer science refers to how long it takes to do a given task, given the size of the task. It's usually expressed as O(blah), read, "Order of blah". For example, an O(n^2) ("order n squared") complexity means that if it takes "m" minutes to finish a problem of size x, then it will take 16m minutes to finish a problem of size 4x. I'm not familiar with the term "complexity" being used in this context and with these specific numbers.
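A toy illustration of that scaling (a sketch; the nested loop just stands in for any quadratic algorithm):

```python
# O(n^2): quadrupling the input size multiplies the work by 16.
def quadratic_work(n: int) -> int:
    return sum(1 for _ in range(n) for _ in range(n))   # n * n "operations"

print(quadratic_work(50))    # 2500
print(quadratic_work(200))   # 40000, i.e. 16x the work for 4x the input
```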
      • It's usually expressed as O(blah)...

        Yeah; I know, and I'd not have wondered if they'd expressed it that way. It's good to know that I'm not the only reader that doesn't understand the terminology.

      • Re:Complexity (Score:4, Interesting)

        by Kjella ( 173770 ) on Wednesday July 01, 2009 @07:52PM (#28551921) Homepage

        I'm not familiar with the term "complexity" being used in this context and with these specific numbers.

        Because it's not a problem that scales with n; it's an attack on one particular value of n. Ideally, brute forcing an n-bit cipher has complexity O(2^n). For 256-bit AES, they've found an attack that instead of the ideal 2^256 attempts takes 2^119 attempts. But you can't say O(2^119), because that is equal to O(1), and any function of n would be false since it doesn't apply to other n. I guess you could say an attack with "complexity O(2^(n*119/256)) for n=256", but you're likely to confuse a hundred times as many people as you enlighten.

    • Re:Complexity (Score:5, Informative)

      by vux984 ( 928602 ) on Wednesday July 01, 2009 @06:18PM (#28550809)

      Could somebody rephrase that in a way that people like me, who aren't cryptography specialists can understand what they're talking about?

      Sure I'll rephrase it for you. "Don't worry."

      What? You wanted something deeper without having to know anything? AES-256 was thought to require 2^256 units of time to brute force. So 2^119 time complexity means essentially that the new attack takes 2^119 units of time to complete, which is a lot better, and they think they might be able to optimize it down to 2^110 units of time.

      What a 'unit of time' is, is a computing-science hand-wave, because it doesn't really matter what it is. When comparing algorithms for large problems, you are interested in how one compares relative to another, not how much absolute time it will take on a Commodore 64 or an Intel i7, or whether it's programmed in Smalltalk vs C. Those details, while important in their own right, aren't really relevant to the comparison of the algorithms themselves.

      A 2^110 algorithm is significantly better than a 2^119 algorithm for 'large problems' regardless of what we set the unit of time to be, and in turn 2^119 is much better than 2^128.

      In practice, the unit of time is rooted in how long it takes a computer to do 'an operation'. So it might be milliseconds or nanoseconds, or whatever. And the upshot is that even 2^110 is STILL a gazillion years, even if it's programmed in C on an i7 and every i7 on the planet is contributing to the effort...

      Hence... "Don't worry."

      It's mathematically very interesting, but for the moment, it's nothing to "worry" about.
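The unit-independence of the comparison is easy to demonstrate (a sketch; the one-operation-per-picosecond rate is an arbitrary assumption):

```python
# The ratio between two attacks doesn't depend on what a 'unit of time' is.
ops_old, ops_new = 2**119, 2**110
for unit_seconds in (1e-12, 1e-9, 1e-3):       # ps, ns, ms per operation
    ratio = (ops_old * unit_seconds) / (ops_new * unit_seconds)
    assert ratio == 512                         # always 2**9; the unit cancels

# In absolute terms, even the improved attack is hopeless:
years = 2**110 * 1e-12 / (3600 * 24 * 365.25)   # one op per picosecond
print(f"{years:.1e} years")                     # ~4.1e13 years
```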

      • What? You wanted something deeper without having to know anything?

        Not quite. I wanted to learn enough to understand what they were talking about, even if I couldn't follow the math they used in the proof, and that you (and a few others) have given me. Thank you.

    • by jrl87 ( 669651 )

      Basically they're looking for weaknesses in the encryption, or a way to break it. The basic idea is that if you have an x-bit key for your encryption system, then you should be able to generate 2^x different keys. So for instance if you had 4-bit encryption, then you would have 4 bits that you could assign a value to. That is, you have something like _ _ _ _ where each _ can be either a 1 or a 0. When you work out the number of unique ways you can make this assignment, you get 16, or 2^4.

      To break
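The counting argument above can be spelled out in a few lines (a sketch using the same 4-bit toy example):

```python
from itertools import product

# An x-bit key has 2**x possible values; enumerate them for x = 4.
bits = 4
keys = list(product([0, 1], repeat=bits))
print(len(keys))           # 16 == 2**4
print(keys[0], keys[-1])   # (0, 0, 0, 0) (1, 1, 1, 1)
```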

    • It means it just got roughly 400 times faster.
      • No, 119 to 110.5 is the amount they predict this attack could be improved with more work.

        If it reduces an attack on 256-bit AES to 2^119 complexity, then it's 2^137 times better, and 2^137 times better is half a metric asston.
        • Re: (Score:2, Funny)

          by jcwayne ( 995747 )

          ... 2^137 times better is half a metric asston.

          I measure algorithmic complexity in imperial asstons, you insensitive clod.

    • Re:Complexity (Score:5, Informative)

      by Joce640k ( 829181 ) on Wednesday July 01, 2009 @06:38PM (#28551085) Homepage
      It means you only have to test 2^119 possible keys to break 256-bit AES - still far beyond what's ever going to be feasible (do the math - give everybody on the planet a million PCs running at 1THz and see how long it takes to do 2^119 things, then figure out where you're going to get that much electricity from)

      Interesting to note is that AES-128 is immune to this attack - it's now the strongest variant of AES. Everybody (like me) who thought the 256-bit and 192-bit were a waste of time now has a reason to be smug about it.

      Reason: Both AES-192 and AES-256 are just AES-128 internally but they mess around with the key data between each loop of the encryption process. The new attack only works on the "messing around" part of the process so AES-128 is unaffected.
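The parent's back-of-envelope can be run as plain arithmetic (a sketch; the 7 billion people and one-key-test-per-cycle rate are assumptions):

```python
# Everyone on Earth gets a million PCs at 1 THz, one key test per cycle.
ops_per_sec = 7e9 * 1e6 * 1e12          # 7e27 tests/second
years = 2**119 / ops_per_sec / (3600 * 24 * 365.25)
print(f"{years:.1f} years")             # ~3 years of wall-clock time
```

Interestingly, the wall-clock time comes out short; it's the 7e15 machines and the electricity to feed them that are the fantasy, which is exactly the parent's point.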
      • Thank you. However, one question remains: is that the maximum number of keys, or the average number. Either way, of course, it's a huge number, but it does make a difference.
        • That's the maximum, the average would be half that.

          ...unless they're expecting you, in which case they can put the key towards the end of the keyspace to make it take longer. You could of course anticipate that move and start searching at the end but in return they could anticipate that and put it at the beginning. It's a Sicilian mind game.
          • What I'd do in a case like that is put it somewhere near the middle. That way, it's about the same time no matter which end you start at, and if you do start in the middle, you still have to guess if mine is above or below your starting point giving yourself a 50/50 chance of starting out by going in the wrong direction.
  • by Anonymous Coward on Wednesday July 01, 2009 @05:57PM (#28550513)

    Crypto is broken. It's not IF, but WHEN. That's why crypto is pointless to use. This is why I use open source, and even keep all doors unlocked. It's pointless to try and protect property, real or intellectual/imaginary.

  • Complexity. (Score:5, Funny)

    by girlintraining ( 1395911 ) on Wednesday July 01, 2009 @05:57PM (#28550517)

    For those who don't have a degree in oh-shit-that's-a-big-number, can someone give a comparative analysis of what "2^119" complexity means? I mean what else is "2^119" hard to solve? And yes, the math nerds are undoubtedly either dying of laughter or yelling at the screen for my abuse of powers of two... I don't care.

    • Re:Complexity. (Score:5, Informative)

      by xZgf6xHx2uhoAj9D ( 1160707 ) on Wednesday July 01, 2009 @06:05PM (#28550631)

      AES-128 uses keys which are 128 bits long. That means in order to "break AES" (in order to decrypt something you don't have the key to), all you have to do is try all possible keys of length 128 until you find one that works. That means you would have to try 2^128 different combinations, which is a lot.

      What these people have done is find some clever way to break AES trying only 2^119 combinations. Effectively this means AES is no better than if it had used 119-bit keys instead of 128-bit keys. Sometimes you'll hear this expressed colloquially as something like "AES has 119 bits of security", referring to how many combinations of keys you have to try before you find the one that works.

      2^119 is a massively large number. Trying 2^119 combinations is still terribly far outside of the realm of what all of the world's most powerful supercomputers combined could hope to do. This is an attack of theoretical interest, not practical interest.

      • Re:Complexity. (Score:5, Insightful)

        by cpu_fusion ( 705735 ) on Wednesday July 01, 2009 @06:11PM (#28550725)

        Pardon me, but isn't the article about AES-256? So this is a much more significant drop in the number of bits.

        Of course, I've only read the summary. This is slashdot, natch.

        • Re: (Score:2, Insightful)

          Oh dear, you're absolutely right. This is about AES-256. That's quite a significant attack indeed (though still not enough to make it practical).
          • Re: (Score:2, Informative)

            by dermoth666 ( 1019892 )

             I believe the probability being halved has something to do with the birthday paradox. It's been a while since I could explain this properly; if you wish to find out, just search for it on Google... This page seems to have a good explanation too:


            • Re: (Score:3, Interesting)

              by BitterOak ( 537666 )

              I believe the probability being halved has something to do with the birthday paradox.

              Actually, that just applies to secure hash functions (like MD5 and SHA, and the like) and not to block ciphers. If AES-256 can be cracked with only 2^119 calculations, that is a HUGE drop in security.

              The reason that hash functions really only give you half as many bits of security as you have bits in the digest is that a hash is considered broken if you can find two messages which have the same hash. Since you can vary both messages, you only have to try 2^(n/2) as many, just like the birthday "paradox".
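The 2^(n/2) birthday bound is easy to see empirically on a deliberately truncated hash (a sketch; the 24-bit truncation of SHA-256 and the sequential messages are assumptions made so it runs in milliseconds):

```python
import hashlib

# With an n-bit digest, a collision among random-ish messages shows up
# after roughly 2**(n/2) attempts, not 2**n (the birthday "paradox").
def truncated_hash(msg: bytes, bits: int = 24) -> int:
    digest = hashlib.sha256(msg).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

seen = set()
tries = 0
while True:
    tries += 1
    h = truncated_hash(tries.to_bytes(8, "big"))
    if h in seen:
        break
    seen.add(h)

print(tries)   # on the order of 2**12 = sqrt(2**24), far below 2**24
```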

        • It's a time/memory/data trade-off. If you have 2^128 AES-256-encrypted copies of a known plaintext, you can already find one of the keys using 2^128 time. You just encrypt using 2^128 random keys and find collisions.

          This attack does better by a factor of 2^9 = 512 if the keys are related in some way known to (or maybe chosen by) the attacker. They think they can get another factor of about 2^8.5 out of it with a more careful analysis. Even so, this is only a theoretical weakness.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        Except IYRTFA, the attack only works on AES-192 and AES-256. AES-128 is unaffected, which would seem to imply that, oddly, AES-128 could be stronger than AES-256 and AES-192 in some circumstances.

        • Re: (Score:3, Informative)

          This is an attack on key schedules; the key schedule of AES-128 is different from that of AES-192 and AES-256, thus rendering it impervious to this particular attack. As the authors note, this sheds new light on key schedule design, much in the same way that differential cryptanalysis shed light on S-box design.
        • by Skuto ( 171945 )

          Not really. The thing is that a break against AES-256 can use up to 2^256-1 operations and be considered faster than brute force. The same technique wouldn't count as a break against AES-128, because that is brute forceable in 2^128 anyway.

          If there's an attack against AES-256 that takes 2^200 operations, it's considered a break. But this is still more effort than the one needed to just brute force AES-128. So AES-256 would still be more secure.

      • Re:Complexity. (Score:5, Informative)

        by quercus.aeternam ( 1174283 ) on Wednesday July 01, 2009 @06:16PM (#28550793) Homepage

        Two things:

        First, they are talking about AES-256.

        Second, I find it useful to think about how much faster that is. In this case, it means it is 2^137 times faster than a pure brute force attack, which certainly seems impressive. Fortunately, as you mentioned, this is still far too difficult to be applied.

        Just for fun, google this: 2^119 picoseconds in millenia

        • Re: (Score:3, Informative)

          Just for fun, google this: 2^119 picoseconds in millenia.

          Google says:

          2.10607966 × 10^13 millenia -- which equals about 21,000,000 billion years

          With the new attack the figure is 2^110.5 tries, which is

          (2^110.5) picoseconds = 5.81727815 × 10^10 millenia

          a mere 58,000 billion years.

          Looking at it another way, 119 - 110.5 = 8.5, which is the reduction of the exponent, giving 2^8.5 = 362. So knocking off 8.5 bits reduces the amount of time+effort by that ratio.

          "Picosecond" is the assumed
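The Google figures above check out (a sketch reproducing them; "one key test per picosecond" is the assumed rate):

```python
# Convert a count of key tests, at one test per picosecond, into millennia.
def picoseconds_to_millennia(ops: float) -> float:
    seconds = ops * 1e-12
    years = seconds / (3600 * 24 * 365.25)
    return years / 1000

print(f"{picoseconds_to_millennia(2**119):.2e}")    # ~2.11e13 millennia
print(f"{picoseconds_to_millennia(2**110.5):.2e}")  # ~5.82e10 millennia
print(f"{2**8.5:.0f}")                              # ~362, the speedup ratio
```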

        • Re:Complexity. (Score:5, Interesting)

          by AlHunt ( 982887 ) on Wednesday July 01, 2009 @07:21PM (#28551597) Homepage Journal

          >Just for fun, google this: 2^119 picoseconds in millenia

          And for even more fun - 64 minutes after the parent posted, the post itself was the first result.

        • by Kjella ( 173770 )

          I think that's the only kind of number that makes sense without using the 2^x notation: 137 bits broken, 119 bits of strength left. It's lost over half the bitstrength. If it'd been the same for 128-bit keys, it'd be well into crackable ranges. That's really the big issue here: why would this NOT affect 128-bit AES? Did they do something really, really smart in that version that leaves it immune? Or is it rather that we don't want to set off every alarm bell there is?

        • Just for fun, wolfram-alpha it (doesn't exactly roll off the tongue, does it?). That is what it's designed to do, after all - plus I found the results a little more useful than google.
      • Re: (Score:3, Interesting)

        by Anonymous Coward

        2^119 is a massively large number.


        Meh. I've seen bigger.

        • Re: (Score:2, Funny)

          by Anonymous Coward
          that's what she said.
      • by bmajik ( 96670 )

        Not only is 2^119 a big number... it's around 6e35.

        Wolfram says that the milky way galaxy weighs around 6e45 grams.

        assuming 1 mol of electrons in 1 gram of galaxy, you're talking about 3e69 electrons in the whole galaxy.

        I suppose it might be possible within the confines of our galaxy to do something 2^119 times. We do have enough electrons. That's an important upper bound to not violate.

      • If I'm reading the paper correctly, it appears that AES-128 is unaffected. (Please correct me if I'm wrong!)
    • I mean what else is "2^119" hard to solve?

      Finding a file which has an MD5 hash of either 000000000000000000000000000000XX or 000000000000000000000000000001XX for some pair of hexadecimal digits XX.

      Computing the 2^100th bit of Pi (approximately -- the BBP algorithm has some factors of log thrown in, so I've dropped a factor of 2^19 to account for those).

      Sorting a list of 31 elements using bogo-sort.
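The bogo-sort quip is roughly right, and easy to check (a sketch; expected bogo-sort work is taken as about n*n!, counting the n comparisons per shuffle):

```python
from math import factorial, log2

# Expected shuffles for bogo-sort on n distinct items is about n!,
# and each shuffle costs about n comparisons.
print(log2(factorial(31)))        # ~112.7 bits of work in shuffles alone
print(log2(31 * factorial(31)))   # ~117.6 bits counting comparisons: near 2^119
```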

    • Re: (Score:2, Interesting)

      by godrik ( 1287354 )
      A couple of years ago (2003), a cryptographic system was considered secure if breaking it required more than 2^80 computations. I do not know what the current standard is, and I have no clue how to find it.
      However, 2^110 is still way too large for us at the moment.
      To give an estimate: suppose you have one million processors clocked at 10GHz (which nobody has nowadays); then you can do 10^6 * 10^10 = 10^16 ~= 2^53 computations per second. To crack AES using this machine you'll need 2^110 / 2^53 = 2^57 seconds to do
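Running that estimate through (a sketch; the million-processor, 10 GHz machine is the parent's hypothetical):

```python
import math

# One million processors at 10 GHz, one AES trial per cycle.
ops_per_sec = 1e6 * 1e10                 # 1e16 trials/second
print(math.log2(ops_per_sec))            # ~53.2, so about 2**53 per second
years = 2**110 / ops_per_sec / (3600 * 24 * 365.25)
print(f"{years:.1e} years")              # ~4.1e9 years for 2**110 trials
```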
      • The attack doesn't work in AES-128 or AES-192. It only works on AES-256.

      • CPU power doubles about every 18 months (Moore's Law), so every 3 years a new computer can brute-force 2 more bits in the same time frame.

        Thus to get down from 119 bits to 80: 39 bits, divided by 2 bits, times 3 years, or about 60 years until this attack on AES becomes practical.

        However, given that Moore's Law is starting to get hard to continue with, it might take some more time than that. By that time the new crypto standard will probably use 1024 bits to be safe.
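The projection above reduces to one line of arithmetic (a sketch; the 80-bit feasibility threshold and a perfectly continuing Moore's Law are the assumptions):

```python
# One bit of brute-force reach gained every 18 months (one doubling).
feasible_today_bits = 80   # "unreasonably solvable" threshold used in this thread
attack_bits = 119
years = (attack_bits - feasible_today_bits) * 1.5
print(years)   # 58.5: roughly 60 years, if Moore's Law held that long
```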

    • Well I'm not particularly a math geek, but 2^10 = 1KB, 2^20 = 1MB, 2^30 = 1GB, and so on. So if you were storing 2^120 bytes, it would basically be a trillion trillion terabytes. Is that right? Someone feel free to check my math.

      I mean, that doesn't give an explanation of the problem, so it doesn't really answer your question. But maybe it gives you an idea of scale? I guess by lowering the complexity of the attack by 2^8.5 it means that an encryption key that would take you 300 years to crack, you mig

    • Well, it's the expected length of time you'd need to spend tossing a coin before you got 119 heads in a row. A very long time.
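The coin-flip framing is quantifiable: the expected number of fair-coin flips before the first run of k heads is 2^(k+1) - 2, so k = 119 means about 2^120 flips. A quick simulation checks the formula at k = 5 (the seed and trial count are arbitrary):

```python
import random

def flips_until_run(k: int, rng: random.Random) -> int:
    """Flip a fair coin until k consecutive heads; return total flips."""
    run = flips = 0
    while run < k:
        flips += 1
        run = run + 1 if rng.random() < 0.5 else 0
    return flips

rng = random.Random(0)
avg = sum(flips_until_run(5, rng) for _ in range(20000)) / 20000
print(avg)   # close to 2**6 - 2 == 62
```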

    • Re: (Score:2, Interesting)

      by rawler ( 1005089 )

      Basically, it says how long it would take to be sure to crack it.

      To give a comparison: most machines on the net today are clocked at ~4GHz, that is 4,000,000,000 cycles per second. Imagine a CPU built for only one thing, where one CPU cycle means one key tested (in reality it's more like 1000s of cycles per test or something like that). That would mean the CPU crunched about 345,000 billion tests per day. With that monster CPU, it would still take about 5 billion billion years to be sure

      • To give a comparison, most machines on the net today is clocked at ~4Ghz, that is 4 000 000 000 instructions per second

        Uh, no they're not. But for the sake of mathematical illustration, I'll allow it.

    • ok, someone needs to check my math and logic because I'm basically asleep at the moment, but:

      2^110.5 = 1.8x10^33
      2^119 = 664613997892457936451903530140172288 = 6.6x10^35

      In atoms:
      Avogadro's number is 6x10^23.
      1 mole of iron contains 6x10^23 Fe atoms, and has a mass of about 56 grams. So, 2^119 atoms are 2^119 / 6x10^23 ~= 1.1x10^12 moles. This means that 2^119 atoms of iron would have a mass of about 60 megatonnes (1 tonne = 10^6 g), which is (very roughly) the mass of a solid cube of iron 200 meters to the edge.
      So, if you take a 200 m
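Redoing that iron arithmetic in code (a sketch; one atom per key tested and pure solid iron are the assumptions):

```python
AVOGADRO = 6.022e23
MOLAR_MASS_FE = 55.85   # g/mol
DENSITY_FE = 7.87       # g/cm^3

grams = 2**119 / AVOGADRO * MOLAR_MASS_FE
print(f"{grams / 1e12:.1f} megatonnes")          # ~61.6 (1 Mt = 1e12 g)

side_m = (grams / DENSITY_FE / 1e6) ** (1 / 3)   # cm^3 -> m^3, then cube root
print(f"{side_m:.0f} m")                         # a cube about 200 m on a side
```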

  • but they pose no immediate threat for the real world applications that use AES.

    Funny how news of just about every major break of an existing cryptography system or secure hash method has started out with just about those same exact words.

  • Quantum Computers (Score:3, Insightful)

    by religious freak ( 1005821 ) on Wednesday July 01, 2009 @06:05PM (#28550643)
    Yeah, this is interesting math, but I don't think our cryptographic scheme is in danger until quantum computers become a stable and reliable source of heavy computing. Then we're all in trouble. How do you create a key, when the entire large number method is made obsolete by quantum computing? I haven't looked into it much, but I don't think anyone has found an answer yet.

    To my knowledge quantum cryptography is still limited to very close distances, while cracking a crypto key is obviously not affected by this limitation.
    • Quantum computers won't do much to break block ciphers, they'll only be useful against ciphers which rely on finding factors of large numbers.

      The real-world applications for quantum computers are very limited.
      • The real-world applications for quantum computers are very limited.

        No, the real-world scenarios where quantum computers are a major improvement are limited. They can be used for everything normal computers are as well.

    • by Xtravar ( 725372 )

      The government will require licenses to run quantum computers, and they'll be so fast that the FBI can run spyware on them without you noticing performance degradation! If you are innocent you have nothing to hide!!!

    • Re:Quantum Computers (Score:4, Informative)

      by mathimus1863 ( 1120437 ) on Wednesday July 01, 2009 @07:31PM (#28551713)
      Parent is slightly off on the Quantum computing comment. Quantum computers can break cryptographic protocols based on the difficulty of integer factorization (RSA/PGP/GPG/PKI/SSL/TLS), and discrete-logarithms (all of the above plus elgamal, elliptic curves). However, AES is a block cipher which relies on neither of these pure-math problems.

      The only advantage of QCs in breaking AES is that Grover's Algorithm can be applied for random guessing of the encryption key. AES-256 has 2^256 possible encryption keys. It takes a classical computer an average of n/2 guesses to find the right key, or 2^255 operations. However a QC running Grover's Algorithm does it in an average of approx sqrt(n) "guesses." This means that it takes about 2^128 operations to get the AES-256 key using a quantum computer.

      As previous posters have mentioned, 2^128 is still far out of our reach. And to subvert QCs for this type of problem, all we have to do is double our key length to get the same security. Perhaps if we find a way to combine Grover's Algorithm with this new AES vulnerability, we can get it down to 2^60 to 2^64, but that is still extremely prohibitive. Additionally, that's a big "if," since Grover's Algorithm is intended for pure-guessing problems.
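The Grover arithmetic in the parent can be stated in two lines (a sketch; it treats Grover's speedup as an exact square root of the search space):

```python
from math import log2, sqrt

# Grover's algorithm searches N possibilities in about sqrt(N) steps,
# which halves the effective key length in bits.
def grover_bits(key_bits: int) -> float:
    return log2(sqrt(2**key_bits))

print(grover_bits(256))   # 128.0: brute force on AES-256 with a QC
print(grover_bits(119))   # 59.5: the speculative combination with this attack
```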
      • I defer to your knowledge, since it looks like you've got more than I do, judging by your comment.

        However, I was under the impression that due to the superpositional state of a qubit (i.e. being both 1 and 0) adding one extra qubit essentially doubles the effective power of the quantum computer (in terms of the very specific operations of finding factors for large numbers and searching data). So, once we are more effective at entangling qubits, we don't really have a defense against a brute force attac
      • How exactly does Grover's Algorithm help in this situation? With Grover's, you have an unsorted list and want to look up the position of an arbitrary element in the list. This takes O(n) sequentially, but O(sqrt(n)) with Grover's. It has absolutely nothing to do with guessing. In brute-forcing a block cipher, you have a large number of keys and you need to try each one sequentially. There's no lookup involved at all. I'm afraid I'm gonna have to call bullshit, something I need to do all too much in QC
    • Re: (Score:3, Informative)

      by evilviper ( 135110 )

      How do you create a key, when the entire large number method is made obsolete by quantum computing?

      There are several methods of public-key encryption which are secure against quantum computers. Try Lamport Signatures for a start: []

    • Re: (Score:3, Interesting)

      There are some physicists I know (I was one once...) who work on quantum computers. They don't think they will ever be faster at cracking than classical computers.

      There are 2 reasons.

      First, a quantum computer's construction complexity grows exponentially with the number of qubits, i.e. a quantum computer with n qubits has construction complexity of O(2^n); so with Moore's Law in place, the number of qubits goes up only linearly with time... This is leaving out the extra qubits you need for error correction with decoherence that makes a
  • 2^119 is... (Score:5, Interesting)

    by AnotherBlackHat ( 265897 ) on Wednesday July 01, 2009 @06:13PM (#28550757) Homepage

    For those who are asking "what's 2^119 complexity mean?"

    2^64 is about as hard a problem as we can reasonably solve these days.
    2^80 is about as hard a problem as we can unreasonably solve. I.e. we can do it, but it would take the budget of a country for several years to do.
    A can of soda has about 2^83 molecules in it.
    2^119 is still way beyond anything we can reasonably do, but isn't so hard that we can rule out any theoretical possibility of solving it.
    A house sized computer built of solid nano-compute units, each a few hundred molecules on a side, with a cycle time of about 10 petahertz could do it in less than a lifetime.
    Perhaps possible but I wouldn't worry about it.
    2^256 is so hard that it may not even be theoretically possible to solve - or maybe you could if you're willing to destroy a few solar systems, and wait a few million years.
    While cracking 2^256 may not be theoretically impossible, it would be easier to look everywhere the information you want might be hidden - including inside the mind of your opponent - even if he's dead.

    • Re: (Score:3, Interesting)

      by Tycho ( 11893 )

      2^40 would take very little time on a home PC, an afternoon or maybe a day.

      40 bits is also the size of the keyspace used by HDCP for HDMI and DVI, for "encrypted" HD displays. I don't feel like doing the math, but determining all of the 40-bit keys used in HDCP could probably be done in a short time on a reasonable home PC, using a man in the middle attack. However, for copying HD video, one would still probably get better quality by showing Macrovision, the MPAA, and the Blu-Ray consortium that using BD+

  • I don't know if the size of the hash has anything to do with the package or the resultant file, but what about simply doubling (or greater) the hash?

    I did try to read the article, but got a bit lost when I began to read stuff like, "A basic boomerang distinguisher [12] is applied to a cipher E_K which is
    considered as a composition of two sub-ciphers: E_K = E1 ∘ E0. The first sub-cipher is supposed to have a differential [...], and the second one to have a..."

    Man, it
    • by Sique ( 173459 )

      I don't know if the size of the hash has anything to do with the package or the resultant file, but what about simply doubling (or greater) the hash?

      Here comes the irony: the attack is possible because AES-256 and AES-192 are "extended" versions of AES-128. While AES-128 still goes strong, the extended versions are attacked, and their complexity is reduced to 2^119 (possibly lower). AES-128 remains at 2^128.

      • by Tycho ( 11893 )

        I'm a bit confused here. I remember reading an article on hardware hard-drive encryption, or perhaps something else, but my understanding was that AES-256 was just performing AES-128 encryption on the data two times in a row. The logic was that it circumvented various legal restrictions on AES keys longer than 128 bits while still being just as hard to crack as single AES encryption with a 256-bit key. I'm uncertain whether I remember it right; I may be confusing it with DES, however.

  • For some reason that SSL link to the paper uses the name of a server that doesn't have the SSL certificate so Firefox complains. Replacing the hostname gives this link that Firefox doesn't complain about: []

  • by Anonymous Coward on Wednesday July 01, 2009 @07:39PM (#28551789)

    The usual threat model for a cipher is either a "chosen plaintext attack" (CPA) or a "chosen ciphertext attack" (CCA). In both of those, you have a lot of plaintext-ciphertext pairs all encrypted under the same key, and your job is to use that info against the cipher. Not necessarily to actually compute the key (which would totally destroy the cipher) but even to be able to infer anything about it statistically (for example, to have a better than random chance of guessing whether a new plaintext/ciphertext pair was encrypted with the same key).

    This attack is a related-key attack, which traditionally means that you get to see the same plaintext encrypted under enormous numbers (like 2^119 in this case) of different but related keys, rather than under the same key (or a "small" number of keys like a few trillion). This is a threat model that most ciphers aren't designed against and it's instead countered by designing the application to not rely on it. For example, don't use the cipher as a hash function by using the plaintext as a key and encrypting some constant. Properly designed crypto applications don't let attackers access the keys, and they generate their keys randomly rather than letting them be related. I don't think related-key attack resistance was part of the specification given to entrants of the AES contest, and IIRC the AES standard doesn't claim such resistance.

    Nonetheless, the designers of Rijndael (the cipher that is the basis of AES) designed Rijndael to be "ideal", which among other things meant it was supposed to resist related-key attacks; this was above and beyond the AES requirements.

    This new discovery finds that the AES cipher in fact does not meet Rijndael's design goals. Rijndael's design goals, however, exceeded the requirements stated in the AES standardization process, and any applications using AES are supposed to only use the characteristics of AES stated in the standard. So, even if this attack were of low enough complexity to be practical, it STILL should not affect valid AES applications, unless they are relying on characteristics that AES was never promised to have.

  • 2^256 = 115 792 089 237 316 195 423 570 985 008 687 907 853 269 984 665 640 564 039 457 584 007 913 129 639 936 (78 digits, 26 groups of three)
