Encryption

Math Advance Suggests RSA Encryption Could Fall Within 5 Years

Posted by Soulskill
from the which-will-be-about-the-time-encryption-becomes-useless-anyway dept.
holy_calamity writes "The two encryption systems used to secure the most important connections and digital files could become useless within years, reports MIT Technology Review, due to progress towards solving the discrete logarithm problem. Both RSA and Diffie-Hellman encryption rely on there being no efficient algorithm for that problem, but French math professor Antoine Joux has published two papers in the last six months that suggest one could soon be found. Security researchers who noticed Joux's work recommend companies large and small begin planning a move to elliptic curve cryptography, something the NSA has called best practice for years. Unfortunately, key patents for implementing elliptic curve cryptography are controlled by BlackBerry."
  • by Anonymous Coward on Tuesday August 06, 2013 @05:19PM (#44491135)
    ...the NSA will have a backdoor into that epileptic curve whatnot too.
  • by Anonymous Coward on Tuesday August 06, 2013 @05:25PM (#44491207)

    Hmm ... considering Blackberry/RIM's precarious hold on existence, I have a hunch those patents will be in other hands very soon.

  • by EvanED (569694) <evaned&gmail,com> on Tuesday August 06, 2013 @05:29PM (#44491253)

The RSA encryption has been deprecated for years now. Hell, back in 2000 we were saying that DES was insecure, and triple-DES was just a stop-gap. Everyone's been switching to AES for a while now. This isn't news.

    Your first sentence sounds weird to me, and it isn't supported by your second. AES can't be a suitable replacement for RSA because AES is a secret-key system and RSA is a public-key one.

    I'm not a crypto person, but RSA and elliptic-curve systems are the only two public-key systems I can think of. (There are others that allow secure exchange of a secret key, but that's different.)

  • by qubex (206736) on Tuesday August 06, 2013 @05:45PM (#44491411) Homepage

Based on my limited understanding, proving P = NP would not necessarily and automatically provide a way of constructing the reductions. It might. But there are proofs in computational complexity theory that establish complexity bounds without exhibiting the algorithms that achieve them, nor do they (currently, visibly) give any indication of how such an algorithm might be arrived at.

    Besides, proving P = NP would have a vast number of consequences that would echo across mathematics and the more fundamental sciences. To harp upon the security implications is as short-sighted as fretting that all-out thermonuclear war would negatively affect the postal delivery service.

  • by Anonymous Coward on Tuesday August 06, 2013 @05:46PM (#44491421)

    Yeah. They have taxes for that.

  • by dmbasso (1052166) on Tuesday August 06, 2013 @05:47PM (#44491433)

    They do. It is called taxes.
    You pay to be spied on, good deal!

  • by Anonymous Coward on Tuesday August 06, 2013 @05:50PM (#44491487)

    Actually in some ways it would be really really exciting and almost certainly a really good thing in the long run, because there are a lot of important, currently-intractable problems that become tractable if P=NP.

Proving that P=NP doesn't make anything tractable, unless you use the ridiculous definition where tractable is the same as polynomial time. What would have practical applications is if someone found a very fast algorithm for solving all NP problems. Whether P=NP isn't closely related to the question of whether such an algorithm exists. ML has exponential-time type checking, yet compiling ML programs doesn't take that long. Polynomial time is not the same as practical; it fails in both directions.

  • by ceoyoyo (59147) on Tuesday August 06, 2013 @06:39PM (#44492009)

    The story is talking about the possibility of a mathematical breakthrough that would make solving the discrete logarithm problem (and possibly the integer factorisation problem) much, much easier. RSA relies on it being much easier to raise something to an integer power than to find a discrete logarithm (inverse operations). If you figure out how to make the two operations of similar difficulty then any encryption scheme based on them is hopelessly broken for any key size.
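The asymmetry ceoyoyo describes can be seen in a toy sketch (not real crypto: the prime and exponent below are made-up values, kept tiny so the demo finishes quickly). Raising to a power is cheap; the best *generic* way to invert it, the discrete logarithm, is a search over the whole group:

```python
p = 1_000_003          # small prime, purely illustrative
g = 5                  # base
x = 123_456            # the "private" exponent

y = pow(g, x, p)       # easy direction: O(log x) multiplications

def brute_force_dlog(g, y, p):
    """Recover an exponent k with g**k % p == y by trying each one."""
    acc = 1
    for k in range(p):
        if acc == y:
            return k
        acc = acc * g % p
    return None

k = brute_force_dlog(g, y, p)  # hard direction: up to p multiplications
assert pow(g, k, p) == y
```

A real key uses a prime thousands of bits long, where the forward direction is still fast but this kind of search is hopeless; the mathematical breakthrough in question would be an algorithm dramatically better than searching.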

  • by pipedwho (1174327) on Tuesday August 06, 2013 @06:46PM (#44492097)

    Likely example: the horrific clusterfuck of an abomination known as "little-endian binary". I don't know for sure it came about due to patent reasons, but I can't think of any sane reason why it would have ever come into existence otherwise.

    From a purely machine theoretical standpoint, having the low order byte in the lowest memory location makes as much or more sense than the other way around.

Streaming transmission is a different matter, and in some instances can benefit from being able to receive the MSB first. This is especially true if the data gets acted upon in real time and the MSB is required earlier during the calculation. However, in many other cases, LSB-first network byte order can be more advantageous (or at least not a disadvantage). So the decision to use either is really based on the algorithms chosen for the network traffic itself.

In creating interface code to opposite-endian systems, it's easier to think about avoiding translation and keeping both in the same format. But I've personally never had trouble with this, since I've always used reversed buffers where direct use of reversed multi-byte arithmetic was useful.

    However, it stands to reason that the designers of the first little-endian processors didn't consider this a problem, as most byte traffic needs to be buffered and checked before it can be used in calculations, and that obviates the need for having network byte order being same-endian. Since these were all designed in the early days, I see no reason to assume that the choice to go with little-endian would have been any sort of compromise to the state of the art.
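The two layouts being contrasted above are easy to see in a few lines of Python: the same 32-bit value packed little-endian versus big-endian (the latter being the conventional "network byte order" of the IP protocols).

```python
import struct

value = 0x12345678
le = struct.pack('<I', value)   # little-endian: low-order byte first
be = struct.pack('>I', value)   # big-endian: high-order byte first

print(le.hex())   # 78563412 -- LSB at the lowest address
print(be.hex())   # 12345678 -- MSB at the lowest address

# Reversing the buffer is the "translation" an opposite-endian
# interface has to perform:
assert struct.unpack('>I', le[::-1])[0] == value
```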

  • by ceoyoyo (59147) on Tuesday August 06, 2013 @06:56PM (#44492209)

    You misunderstand the difference between throwing hardware at a problem and coming up with a more efficient algorithm.

    RSA doesn't specify a key length. I can use a key that's 64 bits long (used originally but insecure today) or 1 megabit long (secure against known classical algorithms for the age of the universe no matter how much hardware you throw at it). As hardware gets better I can encrypt things using longer keys, in the same amount of time. It takes you MUCH MORE time to decrypt that, even with the better hardware. So long as you keep increasing key length as hardware gets faster, the encryption actually gets BETTER with better hardware.

The article is talking about a breakthrough in mathematics that could make solving discrete logarithms much faster. If it made them anywhere near as fast as exponentiation then it wouldn't take me much longer to decrypt your message than it took you to encrypt it, regardless of key length.

DES is insecure because it uses fixed-length keys that became practical to brute force. RSA doesn't have that problem. The situations are entirely different, and the potential breaking of RSA is much more interesting, and much more of an accomplishment.
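The scaling argument above can be sketched with a crude cost model (the exponents here are illustrative assumptions, not measured constants): legitimate use of the key costs polynomially in the key length, while brute force costs exponentially, so growing the key widens the gap in the defender's favor.

```python
def modexp_cost(bits):
    """Rough model: square-and-multiply does O(bits) multiplications
    of bits-sized numbers, each ~O(bits^2), so ~bits^3 bit operations."""
    return bits ** 3

def brute_force_cost(bits):
    """Model of exhaustively searching a bits-sized key space."""
    return 2 ** bits

for bits in (64, 128, 256):
    # attacker-to-defender work ratio explodes as keys grow
    print(bits, modexp_cost(bits), brute_force_cost(bits))
```

A mathematical shortcut for the discrete log would replace the exponential term with something polynomial, and no amount of key growth could restore the gap.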

  • Swype fail (Score:4, Insightful)

    by raymorris (2726007) on Tuesday August 06, 2013 @08:29PM (#44492897)
    For encryption, that's fine. For signatures and hashes, cascading WEAKENS it. An attacker only has to crack ANY of the algorithms to crack the whole. To prove that to yourself, try it with one of the algorithms defined as:

    function Weak(msg) {
        return 1;
    }

    Compare these two:

    Weak(MD5(msg))
    MD5(msg)
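The point is easy to verify directly. A deliberately broken outer hash (standing in for the cracked algorithm in the cascade) makes the composition collide for *any* two messages, even though the inner hash still distinguishes them; the message strings below are invented for the demo.

```python
import hashlib

def weak(data: bytes) -> bytes:
    # A pathologically broken hash: every input maps to the same output.
    return b"\x01"

def md5(data: bytes) -> bytes:
    return hashlib.md5(data).digest()

a = b"pay alice $1"
b = b"pay mallory $1000000"

# MD5 alone still distinguishes the two messages...
assert md5(a) != md5(b)

# ...but wrapping it in the weak hash makes ANY two messages collide,
# so a signature over weak(md5(msg)) is worthless.
assert weak(md5(a)) == weak(md5(b))
```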
  • by raymorris (2726007) on Tuesday August 06, 2013 @08:48PM (#44493007)
    That's a common misconception. The actual law is:

    You cannot patent the laws of nature, including the laws of physics and mathematics.

    A car MAKES USE of the laws of physics, but it may be patentable if it's a new invention. You cannot patent X + 1 = X - (-1) because that's mathematical truth, it existed before you noticed it. Just as you can patent a new invention that USES the laws of physics, you can patent some system that uses math. In this case, a system for securely delivering secret messages across a public network. Of course it still has to be a new and useful invention in order to be patentable.
  • by phantomfive (622387) on Tuesday August 06, 2013 @11:27PM (#44493891) Journal

    I don't need to be a math major to figure out that if I have a car that can go 200 MPH it'll get there twice as fast as a car that can only do 100 MPH.

You would have been better off as a math major. To understand the issue, realize that a car going 200 MPH needs much more power than a car going 100 MPH, and a car going 400 MPH needs more still. Similarly, with some algorithms the solution becomes harder and harder as the dataset grows, often exponentially (or even factorially).
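Those growth rates diverge far faster than the car analogy suggests; even for small n, exponential and factorial costs dwarf polynomial ones (the sizes below are arbitrary examples).

```python
import math

# polynomial vs exponential vs factorial growth at small sizes
for n in (10, 20, 30):
    print(n, n ** 3, 2 ** n, math.factorial(n))
```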
