MIT Research: Encryption Less Secure Than We Thought

A group of researchers from MIT and the University of Ireland has presented a paper (PDF) showing that one of the most important assumptions behind cryptographic security is wrong. As a result, certain encryption-breaking methods will work better than previously thought. "The problem, Médard explains, is that information-theoretic analyses of secure systems have generally used the wrong notion of entropy. They relied on so-called Shannon entropy, named after the founder of information theory, Claude Shannon, who taught at MIT from 1956 to 1978. Shannon entropy is based on the average probability that a given string of bits will occur in a particular type of digital file. In a general-purpose communications system, that’s the right type of entropy to use, because the characteristics of the data traffic will quickly converge to the statistical averages. ... But in cryptography, the real concern isn't with the average case but with the worst case. A codebreaker needs only one reliable correlation between the encrypted and unencrypted versions of a file in order to begin to deduce further correlations. ... In the years since Shannon’s paper, information theorists have developed other notions of entropy, some of which give greater weight to improbable outcomes. Those, it turns out, offer a more accurate picture of the problem of codebreaking. When Médard, Duffy and their students used these alternate measures of entropy, they found that slight deviations from perfect uniformity in source files, which seemed trivial in the light of Shannon entropy, suddenly loomed much larger. The upshot is that a computer turned loose to simply guess correlations between the encrypted and unencrypted versions of a file would make headway much faster than previously expected. 'It’s still exponentially hard, but it’s exponentially easier than we thought,' Duffy says."
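To make the comparison concrete, here is a small illustrative sketch (not from the paper; the bias p and length n are arbitrary choices) contrasting the Shannon-entropy estimate of guessing difficulty with the number of guesses an optimal guesser actually needs against a slightly non-uniform source:

```python
# Illustrative sketch: for a memoryless source of n biased bits, compare the
# Shannon-entropy estimate of guessing difficulty (~2^(n*H)) with the number
# of guesses an optimal guesser needs to reach 50% success probability.
import math
from itertools import product

p, n = 0.9, 16  # P(bit = 0) = 0.9, string length 16 (arbitrary choices)

H_shannon = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # bits per symbol
H_min = -math.log2(max(p, 1 - p))                             # min-entropy per symbol
print(f"Shannon entropy: {n * H_shannon:.1f} bits, min-entropy: {n * H_min:.1f} bits")

# Probability of every n-bit string under the biased source, most likely first.
probs = sorted(
    (p ** bits.count(0) * (1 - p) ** bits.count(1) for bits in product((0, 1), repeat=n)),
    reverse=True,
)

# The optimal guesser tries strings in decreasing order of probability.
cumulative, guesses = 0.0, 0
for q in probs:
    cumulative += q
    guesses += 1
    if cumulative >= 0.5:
        break

print(f"Guesses for 50% success: {guesses}")
print(f"Naive 'typical set' estimate: ~{2 ** (n * H_shannon) / 2:.0f}")
```

With these parameters the guesser reaches even odds after roughly 17 guesses, versus the ~90 suggested by treating the source as 2^(n·H) equally likely typical strings; that kind of gap is what the paper quantifies.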

  • Re:good news for NSA (Score:4, Informative)

    by Bob the Super Hamste ( 1152367 ) on Wednesday August 14, 2013 @03:10PM (#44567493)
    But at the same time:

    "It’s still exponentially hard."

  • by Anonymous Coward on Wednesday August 14, 2013 @03:22PM (#44567611)

    It is (as given in the paper) the "National University of Ireland, Maynooth" and NOT simply "University of Ireland". "The constituent universities are for all essential purposes independent universities, except that the degrees and diplomas are those of the National University of Ireland with its seat in Dublin". I'm from Ireland and had no clue WTF "University of Ireland" was going to be, and had it not been for the MIT connection I would have assumed it was one of those places you send a few dollars to get a "fake" degree. When and if it's truncated you might see "NUI", "NUIM" or "NUI Maynooth".

  • Re:Huh? (Score:3, Informative)

    by Anonymous Coward on Wednesday August 14, 2013 @03:25PM (#44567631)

    There is no "cipher equivalent", unless you're doing something stupid like using ECB mode. [wikipedia.org]
    No modern encryption scheme works by simple one-to-one substitution; you use a nonce [wikipedia.org] or an IV [wikipedia.org] with a chaining mode so that even if the same plaintext appears several times, either in the same document or over multiple messages, it will "never" (neglible chance) encode to the same value twice.
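    As a minimal sketch of that point (assuming the third-party Python `cryptography` package; AES-GCM here stands in for any nonce-based mode), encrypting the same plaintext twice with fresh random nonces gives completely different ciphertexts:

    ```python
    # Minimal sketch: same key, same plaintext, fresh random nonce each time ->
    # different ciphertexts, so repeated plaintexts leak nothing by repetition.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)
    aesgcm = AESGCM(key)
    plaintext = b"attack at dawn"

    ct1 = aesgcm.encrypt(os.urandom(12), plaintext, None)  # 96-bit random nonce
    ct2 = aesgcm.encrypt(os.urandom(12), plaintext, None)  # fresh nonce, same plaintext

    print(ct1.hex())
    print(ct2.hex())
    print("identical?", ct1 == ct2)  # virtually always False
    ```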

  • Re:Huh? (Score:5, Informative)

    by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Wednesday August 14, 2013 @03:32PM (#44567701)

    As usual, the paper [arxiv.org] makes more sense than the press release, but is less grandiose in its claims.

    It's a fairly technical result that finds that some appeals to the asymptotic equipartition property [wikipedia.org] lead to overly strong claims compared to a more precise analysis.

  • by Geirzinho ( 1068316 ) on Wednesday August 14, 2013 @03:45PM (#44567799)

    How is this in principle different from the known plaintext attacks (https://en.wikipedia.org/wiki/Known-plaintext_attack [wikipedia.org])?

    These assume that the attacker knows both the encrypted version of the text and the original it was based on, and tries to glean information from their correlation.

    Modern ciphers are designed to resist even chosen-plaintext attacks, where the analyst can pick arbitrary plaintexts and obtain the corresponding ciphertexts.

  • Re:good news for NSA (Score:5, Informative)

    by Shaiku ( 1045292 ) on Wednesday August 14, 2013 @06:07PM (#44568841)

    I read the article. The impression I got was that breaking the encryption will still take the same time today as it would have taken yesterday, but it turns out that the metric used to demonstrate an algorithm's effectiveness at hiding information was inadequate for electronic communication. In a nutshell, the latest math explains that most encryption systems are vulnerable to side-channel attacks, even if you might not have realized it. But side-channel attacks have been employed for a long time, so those who do security already knew this anecdotally.

  • by cryptizard ( 2629853 ) on Thursday August 15, 2013 @12:12AM (#44571037)
    Pretty sure what they are saying here is that having a lot of Shannon entropy in your key is not enough for security. The paper seems to be deliberately obtuse though, which is really annoying. I am a cryptographer and it doesn't make a whole lot of sense to me right away. They note that if you draw a word from some stochastic process then the difficulty in guessing that word may not be very high, even if the entropy is high. This is completely intuitive and known.

    Imagine you have an algorithm that generates an n-bit secret key. First, it flips a random bit b. If b = 0, it just outputs a string of n zeroes as the key. If b = 1, it outputs n random bits. The Shannon entropy of this process is about n/2 + 1 bits, which seems good, but cryptographically it is terrible because half the time it just uses a fixed key of all zeroes. Instead of Shannon entropy, cryptographers use a different measure called min-entropy, which is determined by the probability of the single most likely outcome. In the above case the min-entropy is only about one bit, which properly reflects how bad that algorithm is (see the quick numerical check after this comment).

    It's late, and I might be missing something, but it doesn't seem like anything that wasn't known before. In particular, they talk about distributions with high entropy that are not uniform, and in cryptography you always assume you have uniform randomness. It has been known for quite a while that many things are not even possible without uniform randomness; for instance, encryption itself cannot be done without it.
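    As a quick numerical check of the toy key generator above (an illustrative sketch; n = 128 is an arbitrary choice), Shannon entropy comes out near n/2 + 1 bits while min-entropy is barely 1 bit:

    ```python
    # Toy key generator: with probability 1/2 output the all-zeros key,
    # otherwise output n uniformly random bits.
    import math

    n = 128

    # The all-zeros key has probability 1/2 + 2^-(n+1); every other key has 2^-(n+1).
    p_zero = 0.5 + 2.0 ** -(n + 1)
    p_other = 2.0 ** -(n + 1)

    H_shannon = -(p_zero * math.log2(p_zero) + (2 ** n - 1) * p_other * math.log2(p_other))
    H_min = -math.log2(p_zero)  # min-entropy depends only on the most likely key

    print(f"Shannon entropy: {H_shannon:.1f} bits")  # ~ n/2 + 1 = 65 bits
    print(f"Min-entropy:     {H_min:.1f} bits")      # ~ 1 bit
    ```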
