Math Advance Suggests RSA Encryption Could Fall Within 5 Years
holy_calamity writes "The two encryption systems used to secure the most important connections and digital files could become useless within years, reports MIT Technology Review, due to progress towards solving the discrete logarithm problem. Both RSA and Diffie-Hellman encryption rely on there being no efficient algorithm for that problem, but French math professor Antoine Joux has published two papers in the last six months that suggest one could soon be found. Security researchers who noticed Joux's work recommend companies large and small begin planning to move to elliptic curve cryptography, something the NSA has said is best practice for years. Unfortunately, key patents for implementing elliptic curve cryptography are controlled by BlackBerry."
We need to keep this secret (Score:5, Funny)
Re:We need to keep this secret (Score:5, Informative)
You mean like SSL is broken and nobody talks about it?
First there was BEAST in 2011, which was fixed. But the situation in 2013 is not better!
https://www.globalsign.com/blog/is-ssl-broken.html [globalsign.com] (and links therein, especially the last two)
https://www.imperialviolet.org/2013/02/04/luckythirteen.html [imperialviolet.org]
http://blog.cryptographyengineering.com/2013/02/attack-of-week-tls-timing-oracles.html [cryptograp...eering.com]
List of all attacks: http://armoredbarista.blogspot.de/2013/01/a-brief-chronology-of-ssltls-attacks.html [blogspot.de]
Elliptical Curve (Score:2, Informative)
http://en.wikipedia.org/wiki/Elliptic_curve_cryptography
RE: Elliptical curves (Score:4, Interesting)
https://www.schneier.com/essay-198.html
Re: Elliptical curves (Score:4, Informative)
Without taking a position on whether the NSA has been involved in elliptic curve work (though I will point out that they have nearly as much motivation to make things hard for, say, the USSR/China [depending on era] to crack as they do to make things easy for themselves to crack), did you read your link? It isn't really talking about elliptic curve crypto at all.
It's describing a potential flaw in a random-number generator whose algorithm is based around elliptic curve crypto. Even if every worry presented by the article is true, that means absolutely nothing about whether elliptic curve is secure against the NSA.
(Actually it almost suggests that it is, because if EC was breakable then the NSA wouldn't have as much motivation to get a known key into the RNG standard.)
Re: (Score:2)
OpenSSL? (Score:5, Interesting)
OpenSSL [openssl.org] has had a good working implementation of ECDSA/ECDH for years: http://wiki.openssl.org/index.php/Elliptic_Curve_Cryptography [openssl.org]
What exactly does BlackBerry have chained down that we don't have an open solution for?
Re: (Score:2, Informative)
They claim patents on various ECDSA/ECDH implementations. There really isn't more to say; we here at Slashdot know how evil patents are. :)
To avoid the patent issues it's best to implement as specified in: http://tools.ietf.org/html/rfc6090
Abstract
This note describes the fundamental algorithms of Elliptic Curve
Cryptography (ECC) as they were defined in some seminal references
from 1994 and earlier. These descriptions may be useful for
Re: (Score:2)
I.e., if you implement these RFC 6090 "pre-patent" methods, the first obvious improvement you would think of is point compression. That is one of the obvious implementation techniques that were patented.
Re: (Score:2, Interesting)
Except that's not true. The patent often claimed to cover point compression specifies fields of characteristic 2 (not the large prime fields most implementations actually use); the Certicom patents cover a ton of characteristic-2 field stuff.
It's also the case that point compression was described in publication at least as far back as 1986, making it too old to be patentable in general:
Quoting DJB:
Miller in 1986, in the paper that introduced elliptic-curve cryptography, suggested compressing a public key
The grease weasels strike back! (Score:3)
a common misconception (Score:4, Insightful)
You cannot patent the laws of nature, including the laws of physics and mathematics.
A car MAKES USE of the laws of physics, but it may be patentable if it's a new invention. You cannot patent X + 1 = X - (-1) because that's mathematical truth, it existed before you noticed it. Just as you can patent a new invention that USES the laws of physics, you can patent some system that uses math. In this case, a system for securely delivering secret messages across a public network. Of course it still has to be a new and useful invention in order to be patentable.
Re: (Score:2)
haven't read the patents, but (Score:2)
A new configuration of gears and levers, doing something new, is patentable.
A new configuration of logic operations, equations, and interfaces, doing something new, is patentable.
The question is "is i
Key patents controlled by Blackberry (Score:4, Insightful)
Hmm ... considering Blackberry/RIM's precarious hold on existence, I have a hunch those patents will be in other hands very soon.
Re: (Score:2)
Elliptic curve cryptography per se isn't patented, just the most efficient ways of using it. So, worst-case, we might end up with some horrific eternal kludge whose only reason for existing was to provide a commercially-safe route around patents not set to expire for another 2-3 years.
Likely example: the horrific clusterfuck of an abomination known as "little-endian binary". I don't know for sure it came about due to patent reasons, but I can't think of any sane reason why it would have ever come into existe
Re: (Score:2)
I blame little-endian on a sort of backwards compatibility. Whether it's a pointer (index register) to an 8-bit or 16-bit value, you're pointing at the low-order byte. I've always suspected that made the first 16-bit Intel processors easier to build, in some way.
But if little-endian is a "horrific clusterfuck of an abomination" (and I'm not saying it isn't), what do you call PDP "middle-endian" - the Antichrist? I mean seriously, sometimes there is just no excuse.
Re:Key patents controlled by Blackberry (Score:4, Informative)
I blame little-endian on a sort of backwards compatibility.
Little-endian simplifies CPU circuits that perform multi-word operations. It's really that simple. End of discussion.
Re: (Score:2)
Somehow big-endian happened just as often. I'm not buying "simplifies" unless there's some legacy that you're trying to find some simple way to add to. After all, the physical order of the circuit traces related to a register need not relate to the logical order of the bytes way off in main memory.
Re: (Score:2)
After all, the physical order of the circuit traces related to a register need not relate to the logical order of the bytes way off in main memory.
Registers. Cute.
No argument about registers matters, because registers don't have the concept of endianness attached.
Re:Key patents controlled by Blackberry (Score:4, Insightful)
Likely example: the horrific clusterfuck of an abomination known as "little-endian binary". I don't know for sure it came about due to patent reasons, but I can't think of any sane reason why it would have ever come into existence otherwise.
From a purely machine theoretical standpoint, having the low order byte in the lowest memory location makes as much or more sense than the other way around.
Streaming transmission is a different matter, and in some instances can benefit from being able to receive the MSB first. This is especially true if the data gets acted upon in real time and the MSB is required earlier in the calculation. However, in many other cases, LSB-first network byte order can be more advantageous (or at least not a disadvantage). So the decision to use either is really based on the algorithms chosen for the network traffic itself.
In creating interface code to opposite-endian systems, it's easier to think about avoiding translation and keeping both in the same format. But, I've personally never had trouble with this since I've always used reversed buffers where direct use of reversed multi-byte arithmetic was useful.
However, it stands to reason that the designers of the first little-endian processors didn't consider this a problem, as most byte traffic needs to be buffered and checked before it can be used in calculations, and that obviates the need for network byte order to be same-endian. Since these were all designed in the early days, I see no reason to assume that the choice to go with little-endian would have been any sort of compromise to the state of the art.
Re: (Score:2)
PS. I'm not defending patents here, just pointing out that patents were unlikely to be a motivation in the choice of Intel and others to go with little-endian byte order.
Re:Key patents controlled by Blackberry (Score:5, Interesting)
There are plenty of sane reasons to use little endian. It means the same pointer can point to a value as 8-bits, 16-bits, 32-bits and so on, and it will get the right value as long as the value does not overflow.
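That property is easy to demonstrate; here is a minimal Python sketch (the `<` prefix in `struct` forces a little-endian layout regardless of the host CPU):

```python
import struct

# Encode 42 as a 4-byte little-endian integer: the low-order byte comes
# first, so a 1-, 2-, or 4-byte read from the same starting address
# yields the same value as long as the value doesn't overflow.
raw = struct.pack("<I", 42)                      # b'\x2a\x00\x00\x00'
assert raw[0] == 42                              # read as 8-bit
assert struct.unpack("<H", raw[:2])[0] == 42     # read as 16-bit
assert struct.unpack("<I", raw)[0] == 42         # read as 32-bit
```

On a big-endian layout the three reads would disagree, which is the point.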
Big-endian only exists because Latin languages write their numbers wrong -- text is written left-to-right but numbers are written right-to-left. This mess has also caused the middle-endian date and time formats currently in use. ISO tries to fix the date format, but unfortunately does it by standardizing exactly the big-endian way that feels so alien to humans.
If computers had been invented by someone writing either left-to-right or right-to-left consistently, big-endian would not have occurred to them. They would naturally write the smallest value first, just like the Arabic inventors of the numbers. Alas...
(Yes I am a convert. I used to think little endian was a sick joke.)
Re:Key patents controlled by Blackberry (Score:5, Interesting)
huh? numbers are written in exactly the same order they would be expressed in words - "fifty-one thousand, three hundred and forty-eight" == 51,348
being trained from a young age to read numbers like that, i have no idea whether it really would have been just as easy to learn to read the digits in reverse order ("8 4 3 1 5") but it doesn't seem so to me. In fact it seems completely unnatural - the kind of thing you might do just to prove you can rather than because it's any better or more efficient.
it's really only americans who do this (MDY). and maybe the japanese because of the "Operation Blacklist" post-WWII occupation led by Gen. MacArthur. Everyone else uses DMY or YMD because the middle-endian american date format is alien - people naturally order things from either smallest to largest (or least significant to most significant) or from largest to smallest (most to least).
YYYY-MM-DD seems perfectly natural to me, not at all alien. I was raised on D-M-Y but figured out for myself that YYYYMMDD is the only format that sorts properly (and then later learnt there was an ISO standard for it).
YYYYMMDD also has the advantage of being unambiguous - you don't have to guess whether whoever wrote the date is american and if so, whether they're using a sane or insane date format or not - for days >12, it's easy to figure out: 8-13-2013 can't be anything but 13th August....but 8-7-2013 could be July 8 (sane) or August 7 (insane).
worse, there's absolutely no way to tell except to look for other dates on the same page (or journal/ledger/book/web site/etc) and check to see if any of them have day numbers > 12.
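The sorting claim is easy to check: ISO YYYY-MM-DD strings sort chronologically under a plain lexicographic sort, while the same dates written D-M-Y do not. A quick Python illustration:

```python
# ISO dates: lexicographic order equals chronological order.
iso = ["2013-08-13", "2012-12-31", "2013-07-08"]
assert sorted(iso) == ["2012-12-31", "2013-07-08", "2013-08-13"]

# The same dates as D-M-Y sort into the wrong chronological order:
# 31-12-2012 is the earliest date but sorts last.
dmy = ["13-08-2013", "31-12-2012", "08-07-2013"]
assert sorted(dmy) == ["08-07-2013", "13-08-2013", "31-12-2012"]
```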
Explains BBRY 7% stock jump yesterday? (Score:3)
Re: (Score:2)
I wouldn't be surprised if (Score:2)
the NSA had already solved the discrete logarithm problem...
[...] elliptic curve cryptography, something the NSA has said is best practice for years.
...or cracked elliptic curve cryptography.
What patents? (Score:2)
You can't patent math.
Re:What patents? (Score:5, Informative)
You can't patent math.
As TFS states, it's the implementation that is patented. Not sure which ones belong to BlackBerry, but Google Patents turns up a number of related patents on a cursory search. [google.com]
Re: (Score:2)
the math you're forgetting is that if you've got $6 billion in the bank you can patent whatever the fuck you like and sue almost everyone else into bankruptcy.
Wasn't this the premise of the movie Sneakers? (Score:3)
Go patents on math! (Score:2)
Patents have now become an issue of national security.
Re: (Score:3)
Patents have been an issue of national security for a while. Several countries, including the US, have secret patents. It takes someone wiser than me to explain how that promotes the progress of science and useful arts.
Re: (Score:3)
Not if you believe in "intellectual property" rather than "[...] To promote the Progress of Science and useful Arts, by securing for limited Times [...]"
This is why the "intellectual property" meme is so perniciously evil - it completely transforms the purpose and intent of copyrights and patents.
I've been working on RSA for over ten years... (Score:4, Interesting)
I'm surprised to see other people going in the direction I've been going for about 3 years now. Really. I thought I was quite alone in my path, LOL.
I need to read this paper still, but if it's taking the same path I did, then it's not as peachy as some think.
I'm only an amateur, so take this from the point of view of someone who kicks back with a beer and enjoys solving impossible computational problems.
I don't think it's that close to being broken... I think it'll take a huge computing effort (think multi-terabyte databases) to generate the tables across the PQ space required so that existing problems can be used to quickly find paths and intersections. At the beginning you're looking at only a VERY SMALL speedup from modern sieving, but once the tables get generated (years of effort) you'll eventually see faster and faster improvements. At least, that's with my algorithm, which I'm sure is far from perfect and only works on a certain set of primes right now. Which is about 20%. Which is far from optimal.
So yeah, progress. But I'm unconvinced that this will work for all primes.
I'm going to read the paper now... which I'm sure is far better than what I've been doing.
Re: (Score:2)
Okay, here's the slides from a talk:
https://www.cryptolux.org/mediawiki-esc2013/images/c/cd/Joux-EM-multiuser-ESC2013.pdf [cryptolux.org]
And a paper which is related:
http://eprint.iacr.org/2013/095.pdf [iacr.org]
Basically, from my first read, this is just a better sieve, a system which should find smooth numbers faster by choosing better starting points and using faster tests. I wouldn't call it a general break in RSA, and while it might certainly be a better sieve than GNFS, it's no silver bullet either. I can't imagine anyone bre
Re: (Score:2)
Thanks for that, I found it separately also, and read a few of the papers referenced. I tend to agree that this madness is a bit overblown. Reducing the time by 15% is really impressive overall, but when our anticipated sieving times for a typical 2k-4k keysize are measured in months and years that isn't a huge difference.
Re: (Score:2)
Yes, it's all computation. Once you have the databases, it's done. Unfortunately, GNFS sieves are only applicable to a single pair of primes, so you can't reuse the data for other problems.
http://en.wikipedia.org/wiki/General_number_field_sieve [wikipedia.org]
I doubt this article should be taken too seriously (Score:4, Interesting)
There does not appear to exist a single piece of evidence that the DLP (discrete logarithm problem) will yield to algorithms running in polynomial time. The recent work of Antoine Joux that they are referring to (one of which I assume to be http://arxiv.org/pdf/1306.4244v1.pdf [arxiv.org]) provides improvements to quasi-polynomial algorithms for breaking the DLP. But there is no reason to believe that these improvements can lead to a polynomial-time attack. And as long as that does not happen, those attacks can still be defeated by increasing the key size.
Comment removed (Score:3)
The real math secret (Score:2)
This is a correct answer. Do you know why?
Re:The real math secret (Score:4, Funny)
1 + 1 = 3
This is a correct answer. Do you know why?
It was calculated in Excel?
Diffie-Hellman encryption (Score:2)
Time to get the patent lawyers out (Score:3)
Correct me if I'm wrong but you are not allowed to patent mathematical processes. "Elliptic curve cryptography" sure sounds like a "mathematical process" to me and a pack of especially smart and vicious patent lawyers should be able to blast RIM and Blackberry away in short order (by patent litigation standards which is aeons long). Sounds like a job for Amazon whose entire business model, the one they make money on anyway, depends upon the integrity of SSL which depends upon Diffie-Hellman and RSA for key exchange, if my flawed memory serves. Gotta blow the dust off my SSL book....
Game Theory (Score:2)
Yes, we need to check everything. That being said, this feels like game theory. Don't you get the sense that the NSA wants us to doubt the technology? If cryptography were widely used, most of what the NSA does would be made obsolete.
So what? (Score:2)
By its very nature, encryption is a never-ending arms race.
Antoine Joux (Score:3)
Re:RSA = out of date (Score:5, Insightful)
Your first sentence sounds weird to me, and it isn't supported by your second. AES can't be a suitable replacement for RSA because AES is a secret-key system and RSA is a public-key one.
I'm not a crypto person, but RSA and elliptic-curve systems are the only two public-key systems I can think of. (There are others that allow secure exchange of a secret key, but that's different.)
Re:RSA = out of date (Score:5, Informative)
You need upvotes, but I'm out of modpoints.
You are very correct. Take, for instance, OpenVPN. It uses RSA to exchange a random AES session key. RSA and AES/DES/3DES have different uses, and replacing RSA with AES is simply not possible.
read the paper (Score:2, Informative)
http://arxiv.org/abs/1306.4244
Re: (Score:2)
AES can't be a suitable replacement for RSA
True, but he isn't making any such claim.
Re:RSA = out of date (Score:5, Interesting)
There is another promising public key encryption method known as NTRUEncrypt (http://en.wikipedia.org/wiki/NTRUEncrypt). It's lattice based, and apparently it will still be effective in a post quantum computing world where RSA/Elliptic curve methods will fail.
Re: RSA = out of date (Score:3)
Deprecated - I don't think that word means what you think it means. RSA can't be deprecated when there isn't a replacement. Elliptic curve cryptography has only really become a realistic replacement fairly recently, and nobody outside of government rushed to give Blackberry lots of money to use it because there wasn't a compelling reason to do so. Governments DID, which suggests to the conspiracy theorist that they might know of such a reason.
It's been long recommended you don't use short keys with RSA (whi
Re: RSA = out of date (Score:2)
IEEE Std 1363a defines a method to do Elliptic Curve Crypto that is not patent-encumbered.
Re:RSA = out of date (Score:5, Informative)
You still can't replace an outdated public-key encryption key system with a symmetric system. Because, in real life, usage scenarios and key exchange systems actually matter - in fact, they are the most crucial aspect of the whole thing, otherwise we'd use true random one-time pads and be safe from any attack with any level of computing power forever.
Re:RSA = out of date (Score:5, Informative)
I didn't say that you said that AES could replace RSA: I said that your AES/DES analogy didn't support your statement that RSA is or should be deprecated. That may sound like I'm nitpicking here, but I'm really not: it's pretty fundamental to my point. And the reason is this:
This absolutely need not be true. RSA for instance is based in part around a hardness assumption: that given a very large number n which is the product of p and q, it is far harder to find p and q from n than it is to find n from p and q. Assume for the sake of argument that this is the only hardness assumption RSA depends on. (If the summary isn't misleading it apparently also depends on the hardness of discrete log, but I don't know how.)
If the hardness assumption holds, then RSA as such will never be insecure. Why? Suppose you say "here is a computer capable of factoring a number n with b bits." I'll say "OK, fine; I'll use 100*b bits (or something)"; because multiplying is so much easier than factoring, your computer will still be able to carry out that task but it won't be able to crack my key.
In other words, if the hardness assumption holds, RSA doesn't have a specific difficulty: it can scale with computational power. That's why you see people using 2048-bit keys now instead of 512-bit keys a couple of decades ago.
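A toy illustration of that multiply-vs-factor asymmetry (naive trial division, chosen only to make the scaling visible; real attackers use far better algorithms, but the gap persists):

```python
def trial_factor(n: int) -> int:
    """Smallest prime factor by trial division. The loop count grows
    roughly like sqrt(n), so each extra key bit multiplies the
    attacker's work while multiplication stays cheap."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n

p, q = 10007, 10009       # small primes; multiplying them is instant
n = p * q                  # 100,160,063
assert trial_factor(n) == p   # already ~5,000 division attempts
```

The defender just picks bigger p and q as hardware improves; multiplication cost grows gently while the factoring cost explodes.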
The only things that the age of an algorithm says about its security are (1) whether the difficulty cannot scale with computational power (true of DES, not true of RSA) and (2) that being out longer gives people more time to find flaws in its assumptions.
But here's the thing: #2 isn't necessarily bad and doesn't necessarily speak against the algorithm. It is conceivable that the assumptions just fundamentally hold. If they do, being out longer will not impact the security at all. If anything, being out longer with no one discovering anything should give a higher assurance that an algorithm is secure than a newer one would.
I don't think I've ever heard a blanket statement about RSA being insecure -- only things like certain key sizes or certain implementations or PRNGs being insecure. (Wikipedia also lists a couple of side-channel and plain-text attacks, but those are also arguably quality-of-implementation issues, and similar attacks exist for EC systems.) The intro to the Wikipedia article says nothing about RSA being insecure. "Deprecated" and "discouraged" both fail to appear on the page.
The strongest statement against RSA I've heard is just that EC is better.
Except that the DES vs AES case is not even close to being the same case, as Adam Van Ymeren said [slashdot.org] in response to you, and then I elaborated on elsewhere [slashdot.org] and above.
The reason it's not even close is that DES does not scale with computational power, because it has a fixed key size.
Re: (Score:2)
What you have described is true when both parties hold the relevant keys and believe that the keys have not been compromised - that is, when I can trust that I really have your public key and not one substituted by an adversary. To solve this problem of key distribution the DH key exchange algorithm is normally used, and this relies on the hardness of discrete logs. If the DH problem is weak (which now appears to be the case) then RSA would be broken in the sense that you could not exchange keys to use
Re: (Score:3)
Maybe it's time for an algorithm challenge, similar to how AES got decided and the latest hash algorithm got chosen.
Of course, secure asymmetric algorithms are a lot harder to design than symmetric ones.
I wonder about, instead of naming one, naming three. That way, if in the future one gets compromised, the broken one would just not be used, or for very sensitive stuff, all three can be cascaded (not for bit length, but to keep things signed or encrypted in case one gets severely weakened.)
cascaded signatures, hashes are weaker, not strong (Score:2)
function Weak(msg) {
return 1;
}
Compare these two:
Weak(MD5(msg))
MD5(msg)
Swype fail (Score:4, Insightful)
function Weak(msg) {
return 1;
}
Compare these two:
Weak(MD5(msg))
MD5(msg)
Re: (Score:2)
There are always parallel signatures:
RSA(m1)
ECC(m1)
Lattice(m1)
If all three verify, then the message (or realistically, a message hash) is good.
As for hashes, I've wondered about using this method, where one gets multiple hashes of the message via different algorithms, then XORs all of them. In this method, the resulting hash from all three should be as strong as the strongest link, because one couldn't tell the part from one algorithm from another.
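A sketch of the XOR combiner being described, using three real hash functions from Python's standard library (note the claim is debatable: concatenating digests is the usual conservative combiner, and XOR-combining is not generally proven to be as strong as the strongest component):

```python
import hashlib

def xor_combined_hash(msg: bytes) -> bytes:
    """XOR three independent 256-bit digests of the same message,
    as the comment proposes."""
    d1 = hashlib.sha256(msg).digest()
    d2 = hashlib.sha3_256(msg).digest()
    d3 = hashlib.blake2s(msg).digest()
    return bytes(a ^ b ^ c for a, b, c in zip(d1, d2, d3))

h = xor_combined_hash(b"message")
assert len(h) == 32                          # still a 256-bit digest
assert xor_combined_hash(b"message") == h    # deterministic
```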
Re: (Score:2)
Re:RSA = out of date (Score:5, Informative)
Wow, that is so wrong.
RSA is an asymmetric (aka public-key) cipher because it requires two keys: one to encrypt, one to decrypt. AES, DES, and 3DES are symmetric ciphers because you use the same key to encrypt and decrypt.
RSA and EC (elliptic curve) encryption are useful if you want to send data to someone without the hassle of secretly sharing a key ahead of time - e.g., I can encrypt a message using the public key and only the private key can decrypt it. Or I can use my private key to encrypt a message, and the public key can be used to decrypt it (the latter is often used to sign stuff, except the message is typically a hash instead of the original message).
The reason you use AES, DES, or 3DES is that public-key encryption is hideously slow. In the case of RSA, you're exponentiating one horrendously large number by another horrendously large number (and if your message is long, that number is very big).
That's why every practical public-key encryption scheme encrypts the message with a fast symmetric cipher like AES, then encrypts the key (much shorter) with RSA or EC. If I want to send you a document, I encrypt it with AES, then use your public key to encrypt the AES key I used.
It's also why signing uses a hash - it's easier to encrypt the hash than the message. And verification just means recomputing the hash, decrypting the encrypted hash with the public key, and comparing the original hash with the just-computed one.
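That hash-then-sign flow can be sketched with textbook-sized RSA numbers (a toy: demo primes, no padding; real signatures use full-size keys and a padding scheme such as PSS):

```python
import hashlib

p, q = 61, 53                          # classic demo primes
n, e = p * q, 17                       # public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent

def sign(message: bytes) -> int:
    # Hash first, then "encrypt" the (reduced) hash with the private key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Recompute the hash and compare it to the signature "decrypted"
    # with the public key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

assert verify(b"hello", sign(b"hello"))
```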
The breakthrough in math would be a way to factor a large number quickly, which is what RSA relies on for security: it's easy to multiply two big numbers, but very time-consuming to factor the product.
Re: (Score:2)
Wow, that is so wrong.
How is it wrong? He's not referring to the operating principles, only to the fact that RSA and DES are about equally dated. You've just wasted time and space providing him with information he already had.
Re: (Score:2)
Adam Van Ymeren said it well [slashdot.org]. An algorithm's age doesn't necessarily speak to how secure it is. DES is considered insecure because it has a fixed key size that can be brute-forced, not because it is a fundamentally weak crypto system.
By contrast, the same objection does not apply to RSA, at least AFAIK: the key size can be scaled arbitrarily, so as computing resources grow so can the difficulty of the pr
Re: (Score:2)
It's not only the key, which is too small. The block size is also too small. DES has a block size of only 64 bits. Even the 128-bit block size of AES is a bit on the short side. My rule of thumb for how many blocks of data you can safely encrypt with a single key is two raised to one quarter of the blo
Re: (Score:2)
Re: (Score:2)
The thing I'm not sure about right now is whether the RSA method itself is becoming insecure or if standard-size keys can simply be brute-forced. If it's a question of key size, then why not use larger keys?
The last time I checked, it is possible to increase the size of RSA keys quite a bit. Most frontends for PGP/GPG only allow keys up to 4096-bit to be created but several years ago I was able to generate valid key pairs up to 11296-bit. I had to modify the GPG source code and recompile it before it would
Re: RSA = out of date (Score:5, Insightful)
The story is talking about the possibility of a mathematical breakthrough that would make solving the discrete logarithm problem (and possibly the integer factorisation problem) much, much easier. RSA relies on it being much easier to raise something to an integer power than to find a discrete logarithm (inverse operations). If you figure out how to make the two operations of similar difficulty then any encryption scheme based on them is hopelessly broken for any key size.
Re: (Score:3)
If it's a question of key size, then why not use larger keys? The last time I checked, it is possible to increase the size of RSA keys quite a bit.
Because large RSA keys do unfortunate things to performance. You double the key length, and it takes 6-7 times longer to run the decryption [javamex.com].
Does that matter for just doing key exchange? (Score:2)
I thought the bulk of encryption was usually done with some streaming cipher.
Re: (Score:2)
Sure. But you still need RSA for the initial handshaking and the larger key will mean you go down under large numbers of incoming connections (e.g. /.) much sooner.
discrete logarithm ~ integer factorization (Score:2)
Re:RSA = out of date (Score:5, Informative)
The RSA encryption is
c = m^e (mod n), where m is message, c is ciphertext, e is public exponent, and n is p*q
Decryption is
m = c^d (mod n) where d is the private exponent.
The process of computing d given m, n, and c is exactly the discrete logarithm problem. Given n and e, which are public, you can pick an arbitrary m and generate a corresponding c.
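Plugging the standard textbook numbers into those formulas (p = 61, q = 53, the small example commonly used to illustrate RSA):

```python
p, q = 61, 53
n = p * q                              # 3233
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent = 2753

m = 65                                 # message
c = pow(m, e, n)                       # c = m^e mod n = 2790
assert pow(c, d, n) == m               # m = c^d mod n recovers the message
```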
Re: (Score:2, Funny)
Why elliptic curves when we can go back to good old fashioned original RSA that uses prime number factoring as the problem? No patent nonsense to worry about there.
We used up all the good prime numbers during the Internet boom years under Clinton.
Re:Why not go back to original RSA? (Score:5, Interesting)
Why elliptic curves when we can go back to good old fashioned original RSA that uses prime number factoring as the problem? No patent nonsense to worry about there.
Sometimes the past needs to remain in the past...
Although prime factoring is considered a hard problem, the sparse distribution of prime numbers (~x/ln(x)) makes RSA increasingly inefficient in that superlinearly large moduli (to match large primes) need to be used to increase security linearly.
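That ~x/ln(x) density is the Prime Number Theorem estimate; a quick sieve shows how close it is in practice:

```python
import math

def prime_count(limit: int) -> int:
    """pi(limit): count primes <= limit with a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            # Cross off every multiple of i starting at i*i.
            sieve[i * i :: i] = bytearray(len(range(i * i, limit + 1, i)))
    return sum(sieve)

assert prime_count(100) == 25                  # pi(100) = 25
x = 100_000
actual, estimate = prime_count(x), x / math.log(x)
assert abs(actual - estimate) / actual < 0.15  # estimate within ~10% here
```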
Lest nostalgia continue to be your guide, the original RSA was also found to be broken and needed to be patched to avoid the following insecurities:
1. Messages corresponding to small input values could be simply inverted ignoring the modulus operation (just doing numerical root estimation to invert the exponentiation). The larger the modulus, the more "insecure" messages there are.
2. Encryption is deterministic so is subject to dictionary attacks.
When people say they are using RSA today, they are usually using RSA-OAEP (optimal asymmetric encryption padding), which patches these two specific vulnerabilities of RSA.
FYI, the original RSA was patented (although later RSA labs decided to not enforce the patent and let it expire). This patent nonsense around RSA was a big issue in its day...
Re: Why not go back to original RSA? (Score:2)
The discrete logarithm problem isn't technically the same as prime number factoring, but approaches that work on one tend to inspire approaches that work on the other. A breakthrough algorithm for one is likely to lead to a breakthrough algorithm for the other.
Re:RSA is outdated, but... (Score:5, Insightful)
Based on my limited understanding, proving P = NP would not necessarily and automatically provide a manner of constructing reductions. It might. But there are proofs in computation theory that demonstrate limit complexities but do not provide the algorithms that might implement them, nor do they (currently, visibly) provide any indication of how that algorithm may be arrived at.
Besides, proving P = NP would have a vast number of consequences that would echo across mathematics and the more fundamental sciences. To harp upon the security implications is as short-sighted as fretting that all-out thermonuclear war would negatively affect the postal delivery service.
Re: (Score:3, Informative)
What, exactly, does proving P = NP have to do with the price of tea in China? We knew when RSA was created that advances in computation power would eventually make it feasible for us to decrypt its contents. We even know what that boundary is... and we're coming up on it now.
No encryption algorithm is immune to the fact that the faster you can run an algorithm, the sooner you'll get a result. That's all encryption is. I don't need to be a math major to figure out that if I have a car that can go 200 MPH it'll
Re: (Score:2)
Re: (Score:3)
We knew when RSA was created that advances in computation power would eventually make it feasible for us to decrypt its contents. We even know what that boundary is... and we're coming up on it now.
No, we did not know any such thing. Advances in computation power can be defeated by increasing the key length of RSA, indefinitely. RSA cannot be made useless just by making regular computers run faster.
Re: RSA is outdated, but... (Score:5, Insightful)
You misunderstand the difference between throwing hardware at a problem and coming up with a more efficient algorithm.
RSA doesn't specify a key length. I can use a key that's 64 bits long (used originally but insecure today) or 1 megabit long (secure against known classical algorithms for the age of the universe no matter how much hardware you throw at it). As hardware gets better I can encrypt things using longer keys, in the same amount of time. It takes you MUCH MORE time to decrypt that, even with the better hardware. So long as you keep increasing key length as hardware gets faster, the encryption actually gets BETTER with better hardware.
The article is talking about a breakthrough in mathematics that could make solving discrete logarithms much faster. If it became anywhere near as fast as exponentiation, then it wouldn't take me much longer to decrypt your message than it took you to encrypt it, regardless of key length.
DES is insecure because it uses fixed-length keys, which became practical to brute force. RSA doesn't have that problem. The situations are entirely different, and the potential breaking of RSA is much more interesting, and much more of an accomplishment.
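The asymmetry above can be put in rough numbers. This is a sketch, not real cryptography: the 2048-bit modulus stand-in, the toy ~40-bit semiprime, and the naive trial-division "attack" are all illustrative assumptions. The defender's cost (modular exponentiation) scales with the key's bit length, while even this crude attack's cost grows exponentially in it.

```python
# Defender's cost: square-and-multiply exponentiation needs a number of
# modular multiplications proportional to the exponent's bit length.
e = 65537
n_big = (1 << 2048) - 159            # stand-in for a 2048-bit modulus (not a real key)
c = pow(3, e, n_big)                 # ~17 squarings plus a few multiplies: instant
mults = 2 * e.bit_length()           # generous upper bound on multiplications

# Attacker's cost: naive trial division on a ~40-bit semiprime, counting
# loop steps instead of wall-clock time to keep the result deterministic.
def trial_divide(n):
    steps, d = 0, 3                  # n is odd, so start at 3
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, steps
        d += 2
    return n, steps

p, q = 1000003, 1000033              # two ~20-bit primes
factor, steps = trial_divide(p * q)

assert factor == p
print(mults, "defender multiplications vs", steps, "attacker steps")
```

Doubling the key size adds only a constant factor to the defender's exponentiation, but squares the attacker's trial-division work; that gap is what a mathematical breakthrough, rather than faster hardware, would threaten to close.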
Re: (Score:2)
Advances in computation power alone will never break encryption. Ever. There is no boundary; an encryption scheme can always just use larger keys.
The article is discussing advances in mathematics. Mathematics is more powerful than any computer. Unfortunately, its results are also much less predictable: encryption could be broken by mathematics in 5 minutes, or never at all.
Re:RSA is outdated, but... (Score:5, Insightful)
I don't need to be a math major to figure out that if I have a car that can go 200 MPH it'll get there twice as fast as a car that can only do 100 MPH.
You would have been better off as a math major. To understand the issue, realize that a car going 200 MPH needs much more than twice the power of a car going 100 MPH, and a car going 400 MPH needs more still. Similarly, with some algorithms the problem gets harder and harder as the dataset grows; often exponentially (or even factorially).
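A quick numerical sketch of that growth (the cubic and exponential cost functions here are stand-ins, not drawn from any particular cipher): a machine twice as fast gains a constant factor, but for exponential costs, one extra bit of input doubles the work right back.

```python
# Compare a polynomial cost (n**3) with an exponential one (2**n)
# as the input size n grows.
for n in (20, 40, 60, 80):
    print(f"n={n:2d}  n^3={n**3:>8,d}  2^n={2**n:,d}")

# Doubling hardware speed halves the time; going from n to n+1
# doubles the exponential cost, wiping that gain out immediately.
assert 2 ** 41 == 2 * 2 ** 40
assert 2 ** 80 // 2 ** 40 == 2 ** 40   # +40 bits: a trillion-trillion-fold jump
```

This is the sense in which the 200 MPH car analogy breaks down: faster hardware is a linear speedup against a super-linear wall.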
Re: (Score:3)
Based on my limited understanding, proving P = NP would not necessarily and automatically provide a way of constructing the reductions. It might. But there are proofs in computation theory that establish complexity bounds without providing the algorithms that would achieve them, nor do they (currently, visibly) give any indication of how such an algorithm might be found.
You are technically correct, but certainly the quickest and most direct proof is to show a general solution for an NP-complete problem that runs in P time. And while proving P=NP would not necessarily provide the manner of constructing reductions in the general case, solving any NP-complete problem in P time does absolutely provide automatic solutions for *all* NP-complete problems in P time since, by definition, all NP-complete problems are reducible to each other. And factoring is an NP-complete problem
Re: (Score:3, Informative)
You don't know that for certain; it is conceivable (if seemingly unlikely) that the easiest proof and the first found could be non-constructive.
(Remember, to prove that a problem is in P you not only have to come up with a P algorithm for it but then you have to prove that the algorithm is actually in P. It could be that any algorithm for a (currently-cons
Re: (Score:2)
Re: (Score:3, Insightful)
Actually in some ways it would be really really exciting and almost certainly a really good thing in the long run, because there are a lot of important, currently-intractable problems that become tractable if P=NP.
Proving that P=NP doesn't make anything tractable, unless you use the ridiculous definition where tractable is the same as polynomial time. What would have practical applications is if someone found a very fast algorithm for solving all NP problems. Whether P=NP is not really closely related to the question of whether such an algorithm exists: ML has exponential-time type checking, yet compiling ML doesn't take that long. Polynomial time is not the same as practical; it fails in both directions.
Re: (Score:2)
Actually in some ways it would be really really exciting and almost certainly a really good thing in the long run, because there are a lot of important, currently-intractable problems that become tractable if P=NP.
Proving that P=NP doesn't make anything tractable, unless you use the ridiculous definition where tractable is the same as polynomial time. What would have practical applications is if someone found a very fast algorithm for solving all NP problems. Whether P=NP is not really closely related to the question of whether such an algorithm exists: ML has exponential-time type checking, yet compiling ML doesn't take that long. Polynomial time is not the same as practical; it fails in both directions.
Factoring is in NP... If P=NP, factoring is in P...
Factoring could be in P anyways as well...
Re: (Score:2)
The wonderful thing about a constructive solution to P=NP is that you can use the solution to optimize itself. Suppose it runs in polynomial order N. Just encode the statement "is there a program of length R that solves P=NP in polynomial order N-1?" as a SAT instance, and use the order-N solver on it. Keep decreasing N until you can't anymore.
Then take a specific computer model, and encode optimality conditions as a SAT. You could then use the best known solution to solve that.
Sure, it might be more than a
Re: (Score:2)
He should have said "Proving that P=NP doesn't necessarily make anything tractable"
Re:RSA is outdated, but... (Score:5, Interesting)
(One way this could fail is the following: factoring, I think, sits in a no-man's land between P and NP-complete, not known to be in P nor known to be NP-complete. If NP collapses into P then so must factoring, but it could be that factoring has some weird-ass O(n^23) algorithm or something while every NP-complete problem can't be done in less than, say, O(n^6000).)
Consider this: performing 2^256 operations is physically impossible (based on the quantum-mechanical minimum energy to do anything, and the total amount of energy in the universe). 100-digit numbers are about 330 bits in size. If factoring n-bit numbers required n^30 operations, then factoring just one 330-bit number would be physically impossible.
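A back-of-the-envelope check of that arithmetic (both the 2^256 ceiling and the hypothetical n^30 algorithm are assumptions taken from the comment, not established facts):

```python
import math

BITS = 330       # ~100 decimal digits
DEGREE = 30      # hypothetical n**30 factoring algorithm
CEILING = 256    # assumed physical limit: 2**256 operations, ever

ops_log2 = DEGREE * math.log2(BITS)   # log2 of 330**30
print(f"330**30 ~ 2**{ops_log2:.0f} operations (ceiling: 2**{CEILING})")

# ~2**251: within a small constant factor of the assumed ceiling, so even
# this "polynomial" algorithm is hopeless at slightly larger inputs.
assert CEILING - 10 < ops_log2 < CEILING
```

So a degree-30 polynomial algorithm sits right at the edge of the stated physical bound for a mere 100-digit number; any meaningfully larger input pushes it over.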
Re: (Score:2)
There was a story I read way back where the ridiculously-advanced AIs manipulated artificial pocket universes with computationally-friendly physics so they could solve NP problems in much-less-than-P-time from the perspective of the "real" universe. The dominant AIs were the ones with the best pocket universe designs/resources.
Although I'd guess we're a long ways from having to worry about that. :)
Re:Ah what does it matter... (Score:4, Insightful)
Yeah. They have taxes for that.
Re:Ah what does it matter... (Score:4, Insightful)
They do. It is called taxes.
You pay to be spied on, good deal!
Re:Ah what does it matter... (Score:5, Funny)
It's a great deal if you're an exhibitionist!
Re: (Score:3)
I care more about thieves gaining access to my bank account than hackers or the NSA.
Re: (Score:2)
Is there someone you think is better qualified?
Re: (Score:2)
Re: (Score:2)