Encryption Security

RSA slightly broken 137

Posted by CmdrTaco
from the lotsa-bits-in-there dept.
ccshan writes "Adi Shamir, the "S" in "RSA", discovers a factorization method that renders 512-bit keys vulnerable to cryptanalysis today. " (it's at the NYT: you know what that means)
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    RC5 is a symmetric cipher. This is an attack against RSA, which is a public-key cipher and has nothing to do with RC5.
  • by Anonymous Coward
    No shit. I've always been a bit terrified to contemplate just how advanced the government's knowledge about things like this is ever since I found out that the design of DES was optimized against a technique (differential cryptanalysis, I think) that wasn't "discovered" by the public academic community for another 20 years!

    And this is in mathematics(!), a field where dollars don't drive discoveries. What do they know about cloning, chip technology and other areas where good funding is the most significant obstacle to discoveries?

  • by Anonymous Coward
    I was in a rare lecture Adi Shamir gave at our school in Israel via video conference last WEEK, in which he calmly explained to us how come RSA is so unbreakable and how it works exactly!!!

    You can imagine my surprise as I saw this article at the NY Times. This guy was EXTREMELY convincing and there was no way to tell from his face he had already developed a way to crack his own system! WHAT A POKER FACE!!!

    The lecture by itself was a fascinating experience, but now it has become

    I wish I asked him if he thinks RSA would ever be

    try67
  • by Anonymous Coward
    Well, let's see here... Deep Crack breaks DES... Shamir's method breaks RSA, so my guess is the synergistic effect here will be rather... minimal. Two totally different problems, you see...

    And QC is hardly about to burst onto the scene... maybe in 20 years it will be ready for a non-trivial problem, but maybe not. Right now it can factor the number 4 with a little help. That's quite a ways from breaking a 1024-bit modulus. Also, at best QC gives you a square-root order of speedup, which is impressive, but if you double your key length it's sorted....
  • by Anonymous Coward
    I highly doubt we'll ever find a truly unbreakable algorithm. As time goes on, we learn more about the mathematics that make algorithms work. And then we can spot vulnerabilities that we couldn't see before.

    And remember -- the algorithm isn't always (or, I believe, even usually) the weakest link in the chain.
  • Unfortunately, there's been LESS public research on ECC encryption than on factoring techniques. So you might just be switching to a system that has flaws that haven't been discovered yet.

  • by Anonymous Coward
    And to complete your answer..

    In the US, domestic versions of SSL clients and servers are limited by the U.S. Govt. to 168-bit symmetric keys and 1024-bit asymmetric keys. Export-grade software is limited to 40- (or in some cases 56-) bit symmetric keys, and 512-bit asymmetric keys.
  • by Anonymous Coward

    Remember that the NSA is the world's largest employer of mathematicians. It shouldn't be surprising that they have been able to make discoveries faster than anyone else.
  • QC is not at all like conventional crypto -- it takes advantage of two correlated properties of a photon (or other particle), usually the x and y spins of a photon. Correlated properties are subject to Heisenberg's principle, so unless you know the right way to set your polarizer, you get garbage, and the two sides know someone is listening in because they're getting garbage too. However, making this work requires a communications channel which preserves the spins of photons -- which conventional networks do not, and which is technically very hard, so don't expect QC to be something you can use for some time.
  • by Anonymous Coward on Sunday May 02, 1999 @09:00PM (#1907318)
    You are thinking of Euclid's proof for the existence of infinitely many primes.

    Take all the known primes, multiply them together and add 1. Call this x. x is not divisible by any known prime, so either x itself is a new prime, or some unknown prime divides x.

    Note that there is no way to ensure x itself is a prime. In particular, there is no known formula for generating another prime from any quantity of known primes.
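    That construction is easy to check with a few lines of Python (a sketch added for illustration); the classic counterexample showing x need not itself be prime is 2*3*5*7*11*13 + 1 = 30031 = 59 * 509:

```python
from math import prod

def is_prime(n):
    # Trial division -- fine for numbers this small.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

known = [2, 3, 5, 7, 11, 13]           # "all the known primes"
x = prod(known) + 1                    # 30031
assert all(x % p != 0 for p in known)  # no known prime divides x
print(x, is_prime(x))                  # 30031 False -- an unknown prime (59) divides it
```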
  • there is no way to factor large prime numbers

    10 INPUT "Prime number to be factored"; A
    20 PRINT "The factors are:","1",A

    The difficulty, as I understand it, has more to do with determining whether a given large number is prime, or the product of two primes (and if so, which ones), or something like that.

    I don't really understand the math that makes the cool public key stuff work, but I do know that the special thing about prime numbers is that they don't have factors (beyond the trivial ones produced by my 2-line BASIC program above). Contrast 7 (prime) with 6 (non-prime), for example. Two times three makes six, but what, besides one times seven, makes seven?
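    For what it's worth, a real (if naive) version of that program is still only a few lines; this Python sketch uses trial division, which is exactly what becomes hopeless for 512-bit products of two primes:

```python
def factor(n):
    # Trial division: returns the prime factorization of n, smallest first.
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever remains is prime
    return factors

print(factor(6))     # [2, 3]
print(factor(7))     # [7] -- only the trivial factors exist
print(factor(3233))  # [53, 61] -- a toy RSA-style semiprime
```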

  • El Gamal, ECC, Diffie-Hellman... Read the FAQ at
  • You know, this explains why the government stopped bitching about RSA and dumped any legal action against the PGP folks all of the sudden a couple of years back. I suppose NSA figured this out and decided to keep it to themselves....


  • There go 512-bit keys in PGP. I'd best revoke my key and rework it on the Linux box.

    But this means we could see a way of speeding up the RC5-64 contest!!!


  • Mind-bending stuff, really.

    Check out; They're working on a simulator for these things...

    There's a Scientific American on the subject at: enfeld.html
  • No matter what they say, FlexLM does not use cryptography. They simply compute a hash over the license data and some vendor- and product-related values, then compare it for equality with the password field (with some subtleties).

    BTW, RSADSI is probably the only cryptography company that at one time had proprietary cryptography systems trusted by the community (RC2 and RC4). Rivest effect, I guess.

  • I'm a bit rusty on this, but I seem to recall that if P=NP (which some believe true, but has never been proven) then ALL problems that make public key encryption work become vulnerable at the same time.

    In other words, we already know there is a simple way to go from factoring a big number to breaking ECC. Note that here "simple" means N time or something equally simple in computer terms. It does not mean we know what the simple procedure is, only that we already know there is some way to go from factoring to breaking ECC.

    Or as my professor in that ECC course I took a couple years ago liked to say, "Until that unlikely day when someone proves a theorem on chaos theory, this is unbreakable."

  • by sjames (1099)

    The public key is used to protect a 128 bit symmetric key (one time pad) in PGP. The symmetric key encrypts the plaintext. This is because the RSA method takes too long to use directly on the plaintext. Thus, we already have 2048 bit keys protecting 128 bit plaintexts.
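    The arithmetic behind that wrapping step can be sketched with textbook RSA and deliberately tiny primes (my own illustration; real PGP key formats and padding are far more involved):

```python
# Toy RSA wrapping a session key -- tiny primes, illustration only.
p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (2753)

session_key = 1234                   # stand-in for a random symmetric key
wrapped = pow(session_key, e, n)     # sender: encrypt with the public key
assert pow(wrapped, d, n) == session_key  # recipient recovers the key
# The long plaintext is then encrypted with the fast symmetric cipher;
# slow RSA only ever touches this one short value.
```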

  • by sjames (1099)

    In this case, the symmetric key used is a 128-bit key chosen at random (and thus, in this case, it is a one-time pad). The algo used is 3DES. "One time pad" just means the key is meant to be used only once. In practice, most OTP keys are for symmetric algos.

  • Are they crackable in the same short period of time?

    Based on the similarity of the problem, probably. The prudent move is to assume yes, and use longer keys. In fact, use longer keys for any public key system (1024 should be safe for a while, but use 2048 for anything really important).

  • ...I found out that the design of DES was optimized against a technique (differential cryptanalysis, I think) that wasn't "discovered" by the public academic community for another 20 years!

    This was allegedly the reason why "they" didn't want to allow IBM to release how they designed and verified Lucifer (DES's precursor) and DES itself. They incidentally got this clue, and the "big guys" were afraid that someone else might get another clue about how to break/analyse other crypto-systems (and find their weak points) more easily. This has been known for a long time.

  • In theory, they could do it. But first they'd need the machine that the article describes. In other words, your keys are still safe until someone actually builds one (even the guy who came up with the algorithm hasn't done that yet).

    That said, you're probably only safe for about a week after Shamir reveals the algorithms (which I don't think he's actually done yet). Then, well, I'd look into stronger security.
  • There is one major problem with this premise.

    As the length of the key increases, so does the amount of pad needed to encode short messages. For example, if your key was 8kb long and you encrypted a 2kb e-mail message, you would have 6kb of padding. This means, among other things, that you have 6kb of known plaintext, and also, the large amount of pad would affect the output of the cipher so that the result isn't as random or difficult to decipher as it theoretically should be with that key length.

    This is why increasing key length can only happen for a limited time, and we need to keep looking for better algorithms.

  • Ah, I see what you are getting at - rereading my original post I can see how it might be misinterpreted. My point was that this "hole" in RSA was only found because the algorithm is public (a good thing). An encryption algorithm that wasn't public might well contain an equivalent hole, but since it wouldn't be subject to scrutiny it would continue to appear secure. As far as "no one would trust ...", I'm not so certain - many companies use FlexLM to license their software and I've never seen any mention of the algorithm FlexLM uses. (Admittedly I may be wrong here; I've not done exhaustive research to find out what sort of crypto FlexLM uses - it may well be published somewhere.) My overall point was that with published algorithms one can have confidence in their quality because people are actively trying to break them, whereas with proprietary algorithms you just have to trust the vendor.
  • by YogSothoth (3357) on Sunday May 02, 1999 @12:40PM (#1907333) Homepage
    Interesting article - it shows again how important it is to only trust those algorithms that are open, published and subject to the scrutiny of the community. I am always amazed by the number of companies who really believe that keeping their encryption algorithms or security holes secret somehow makes them more secure. On a related note, I've often wondered if someday someone will find a fast, general algorithm for factoring. Such an algorithm might exist but be as yet undiscovered, or it may be the case that brute force is about as good as can ever be done. Fascinating stuff, cryptanalysis - I strongly recommend reading Applied Cryptography by Bruce Schneier to anyone who is interested in this sort of thing.
  • [...]Why not use random noise as the pad[...]

    Is this overly easy or am I missing something?

    Random numbers are far better than known plaintext. Random numbers are also really hard to get on current machines. What we have is pseudo-random (see the man page for random), and on some-OSes we-did-some-stuff-that-may-be-pretty-random (see /dev/random). Pseudo-random isn't so much better than known plaintext, in part because sometimes it ends up being known plaintext (like if a program uses the "normal" method of setting the random number seed to the current time, you will almost know the random text), and in part because the streams some RNGs make can be recognised.

    Still, pseudo-random would be better than known-not-random.

    Also, the original message had a small flaw: eight times the current 2K *bit* key size is 2K *bytes*, which isn't way huger than many things you may want to protect. Still, if we have to keep expanding the key space every few years it will become a real problem.
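    The seed-from-the-clock problem is easy to demonstrate with Python's stdlib PRNG (a sketch; any deterministic generator behaves the same way):

```python
import random

def keystream(seed, nbytes=16):
    # Deterministic "random" bytes: fully determined by the seed.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(nbytes))

# If the seed is a coarse timestamp, an attacker who can guess the
# rough time can regenerate the entire "random" stream.
guessed_seed = 925689600   # e.g. a Unix timestamp guessed to the second
assert keystream(guessed_seed) == keystream(guessed_seed)
# Same seed, same stream: pseudo-random padding becomes known plaintext.
# An OS entropy pool (/dev/random) avoids this failure mode.
```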

  • I'm not terribly surprised to hear about a good algorithm being not quite as good as we expected, but it does disappoint me a little. Assuming you're not one of the bad guys, who doesn't want to see an uncrackable algorithm emerge?

    It was good to see DES last for so long with no real "cracks" that weren't massive brute force. Even today, triple DES looks like it will be pretty strong for a long time.

    At least with the openness of the algorithms in question, we'll at least be educated as to how strong they really are, under current technology at least.
  • Why not simply prepend a length field to the message and pad with strongly random data? This is not vulnerable to known-plaintext attack, and the random pad can only be discerned from the "real data" by decrypting the length field. Alternatively, one could use Ron Rivest's MAC-based chaffing-and-winnowing scheme on a packetised form of the message prior to encryption. There are a number of solutions to this problem.
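    A minimal sketch of that length-field-plus-random-pad idea (my own illustration, with an assumed 64-byte padded size; Rivest's chaffing-and-winnowing paper describes the MAC-based alternative):

```python
import secrets
import struct

BLOCK = 64  # assumed fixed padded size, for illustration

def pad(message: bytes) -> bytes:
    # 4-byte big-endian length, then the message, then strong random filler.
    if len(message) > BLOCK - 4:
        raise ValueError("message too long for one block")
    filler = secrets.token_bytes(BLOCK - 4 - len(message))
    return struct.pack(">I", len(message)) + message + filler

def unpad(block: bytes) -> bytes:
    (length,) = struct.unpack(">I", block[:4])
    return block[4:4 + length]

msg = b"attack at dawn"
padded = pad(msg)            # this whole block is what gets encrypted
assert len(padded) == BLOCK and unpad(padded) == msg
# The filler is unpredictable, so it yields no known plaintext; only after
# decrypting the length field can padding be separated from real data.
```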
  • I read the article, and I'm a big fan of the RSA system, but the article tells you nothing about the proposed machine, just that there's going to be one. Sure, I can see that the NYT might want to water things down a bit for their readers, but they could at least link to somewhere with more tech info.

    Currently, with packages like Maple and Mathematica, on a decent box you can factor numbers that are reasonably big, but nowhere near the range of what RSA uses. This special "machine" Shamir says he's thought up...I need the details!!!

  • Actually, if I remember correctly from what I have read, SSL uses a 128-bit symmetric encryption algorithm, such as IDEA, 3DES, et cetera.

    This is different from the asymmetric encryption methods used for PGP, and their strength is measured differently as well.

    A good page to get very basic knowledge on this -- which is incidentally where I got this from -- can be found at the following site. erek/certify/
  • Does anyone know what method this is based on? In Sneakers, the scientist came up with something new, faster than the NFS (number field sieve, I think). Shamir's machine sounds like a massively parallel NFS cracker, but I don't see any details to confirm this.
  • Correct, the EFF has. To join the search, visit The Great Internet Mersenne Prime Search (GIMPS); they are the ones you would like to join, unless you've got a Cray.

    The prize is $50,000 for the first 1-million-digit prime, $100,000 for the first 10 million, and so on up to 1 billion digits. It all goes to you.

    /* Steinar */
  • There is no `bug' in RSA. The algorithm is still working well and good. The point is: it (like all other public-key algorithms, to my knowledge) depends on the difficulty of factoring a large number. Increase the size of this number, and you're safer. That's no secret, and never has been.

    Repeat: RSA has not been broken. Shamir has only found a better way of factoring numbers (and he hasn't even built the machine he's been talking about).

    /* Steinar */
  • There is, of course a step three:

    3. Take the number modulo 2**n-1 (the number being tested)

    Otherwise, you could never end up with zero, of course.

    /* Steinar */
  • The exact formula is given on GIMPS' pages; see my last comment.

    I believe the formula is: Start with 4.

    Do this n - 2 times (n = the exponent):
    1. Square the number.
    2. Subtract two.

    If (after n iterations) you end up with zero, it's a prime.
    /* Steinar */
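    That procedure is the Lucas-Lehmer test; in its standard form the loop runs n - 2 times and the "modulo 2**n - 1" reduction is applied at every step. A Python sketch:

```python
def lucas_lehmer(p: int) -> bool:
    # Lucas-Lehmer test for the Mersenne number 2**p - 1 (p an odd prime).
    m = (1 << p) - 1
    s = 4                          # start with 4
    for _ in range(p - 2):         # p - 2 iterations in the standard form
        s = (s * s - 2) % m        # square, subtract two, reduce mod 2**p - 1
    return s == 0                  # zero at the end means 2**p - 1 is prime

print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
# [3, 5, 7, 13, 17, 19] -- 11 is missing: 2**11 - 1 = 2047 = 23 * 89
```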
  • I'm not sure what you mean. The RSA algorithm is well-known and has been studied by cryptanalysts and cryptographers exhaustively. Nobody, not even suits at large corporations, would trust a cryptosystem with an unknown algorithm.

    The closed part of most closed cryptosystems is the implementation. Of course, nobody should trust a crypto implementation for which they do not have the source, lest the code be mailing your private keys back to the manufacturer/NSA/whomever.


  • NYT: New York Times... but you have to sign up to read the article

  • by craw (6958)
    You are correct that the AES is a symmetric algorithm that will hopefully replace the ancient DES (which must be around 20 yrs old by now). I should have been more explicit. I simply put up my original post because I thought that ppl might be interested in this project; that's why I start off with FWIW.
  • by craw (6958) on Sunday May 02, 1999 @02:01PM (#1907347) Homepage
    FWIW, the (NIST) is currently evaluating several candidates for the new Advanced Encryption Standard (AES) []. This standard will presumably become the officially approved US Gov encryption methodology. One nice thing about this project is that most of the candidates have made their algorithms available for public scrutiny. Furthermore, it appears that there is concern about IP issues (e.g., patents).

    This was taken from the AES site:

    A process to develop a Federal Information Processing Standard (FIPS) for Advanced Encryption Standard (AES) specifying an Advanced Encryption Algorithm (AEA) has been initiated by NIST. NIST is currently soliciting candidate algorithms for inclusion in the AES. It is intended that the AES will specify an unclassified, publicly-disclosed encryption algorithm available royalty-free worldwide, that is capable of protecting sensitive government information well into the next century. It is also hoped that this standard will be as widely accepted as the Data Encryption Standard (DES) in the private and public sectors.

    I looked at one of the algorithms, but it just made my head hurt.:)
  • by Andreas Bombe (7266) on Sunday May 02, 1999 @01:05PM (#1907349)
    No, there is no known plaintext. You don't encrypt the message, you encrypt a key for conventional cryptography and encrypt the message with that one. And the key you encrypt is (and should be, or you have a problem) a random number of which nothing is known about by an attacker.
  • I love the title of this newsbit; an encryption algorithm being slightly broken is like being slightly pregnant.

    What, me worry?

  • ElGamal

    We need a catchy, or at least pronounceable name,
    if we want widespread adoption.
  • Isn't that one of the main requirements? That
    the "apparatus" being patented has to work?
    That way they stopped people patenting perpetual motion machines and so on...
  • "I am always amazed by the number of companies who really
    believe that keeping their encryption algorithms or security holes secret
    somehow makes them more secure."

    It buys them time...
    If they publish their weak code or their security
    holes, they are in trouble right away. Otherwise
    they are in trouble, maybe, at some unspecified future time. "They" can live with that. One way
    they get money, one way they do not.

    Another thing to consider is the soft job market.
    The people who wrote your code don't work there anymore, by and large. This is the downside of the ease of finding high-tech jobs, but I digress.
  • Hmm.. I think I may have just accidentally sent a blank message. Oh well. Preview should be mandatory here too.

    Anyway, what does this mean for my https server? I only allow 128-bit or better clients, but does this mean anyone sniffing the packets between me and the clients would (given this nifty new hardware) potentially be able to see everything that is being sent? (Or at least with a far greater degree of ease than before?) I'm afraid I'm not up on my crypto..
  • by parallax (8950) on Monday May 03, 1999 @12:24AM (#1907355) Homepage
    Each bit added to a number makes it exponentially harder to factor using a classical computer. Factoring techniques will improve, but there is no reason to believe anyone will ever develop a polynomial time factoring algorithm for a classical (non-quantum) machine.

    Factoring large numbers in polynomial time is one of only two known algorithms for quantum computers. A modest quantum computer, if it could be built, would crack every prime-factorization cryptosystem currently in use (which is everything, essentially).

    Researchers here at the Media Lab have already built a two quantum-bit bulk spin-resonance quantum computer. There are significant technical challenges to building a quantum computer large enough for cryptanalysis, but there is currently no reason to believe that this is impossible. If it can be done, it will almost certainly be done in the next five to ten years; major governments and private companies all over the world are pouring lots of money into this effort.

    RSA is good, but we need to start developing cryptography algorithms which aren't factoring based (or reducible to prime factoring), and we need to start NOW. If there isn't a strong, widely-available non-factoring crypto implementation if/when the quantum computing breakthrough happens, all hell is going to break loose.

    You're worried about the banks dropping a few transactions on Y2K? How about someone spoofing the Federal Reserve banking network and bringing the global economy to its knees?

  • The article says "if it works." And the computer to run this new calculation hasn't been built yet, so no need to worry. But the article also stated that this machine would be cheap to build, so it will be exciting to see where it goes :) 1024-bit keys will still be considered secure for some time, though. So our privacy is not yet threatened :)
  • Wasn't somebody having a contest with $$$ to the first person to find a prime number with one million digits or something?
  • I don't see how having a large plaintext will make your symmetric key more vulnerable.. yeah.. encrypting a 56-bit symmetric key with a 4000+ bit RSA key doesn't give you any more security than a 56-bit symmetric key you agreed to in person.

    Having a large text gives them more to look at... but I don't think it makes decryption faster... maybe there's something with that differential cryptanalysis.
    Besides.. you could switch symmetric keys every once in a while if necessary.
  • They explain at that (as another mentioned here already) the secret key for regular encryption is the only thing encrypted with the public key, so you don't have to worry about how big your message is.
    The rsa FAQ is pretty interesting
  • Perhaps you are thinking of certain symmetric ciphers like DES or IDEA. Both have fixed key lengths. However, many other algorithms don't suffer from such a limitation. A good example is Blowfish, which (IIRC) has a variable key length from 64 to 448 bits.

    Check out Applied Cryptography for examples of other variable-length ciphers.

    Whether increasing key length is a good idea or not, though... 128-bit keys are secure against any brute force search for a very long time: even at a billion billion billion tries/sec (i.e., 10^27, or roughly 2^90), exhausting the keyspace takes on the order of 10^4 years. Increasing a key further than this is silly. Symmetric algorithms are invulnerable to advances in factoring and related math; they are vulnerable only to weaknesses in their underlying design.
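    The back-of-the-envelope arithmetic is worth writing out (the exact figure depends entirely on the guessing rate you assume):

```python
# How long to exhaust a 128-bit keyspace by brute force?
keys = 2 ** 128                       # about 3.4e38 candidate keys
rate = 10 ** 27                       # "a billion billion billion" tries/sec
seconds_per_year = 365.25 * 24 * 3600
years = keys / rate / seconds_per_year
print(f"{years:.2e} years")           # on the order of 1e4 years
# Even at that absurd rate, an exhaustive search outlasts civilizations;
# at a more plausible 1e9 tries/sec it is roughly 1e22 years.
```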


  • "Mercury Rising" is a better analogy, except in this case it was not an autistic kid who "broke" it. AFAIK. :-)
  • Actually SSL uses a combination of both assymetric and symmetric.

    The symmetric key is used for encrypting the data itself. However, obviously both sides need to have the key for it to work. The first step in SSL is a key exchange mechanism. Typically a symmetric RC4 key is generated by the client, then sent to the server encrypted with the server's public RSA key. Other ciphers are available as well but the RSA / RC4 combination is the most common.
  • I'm not sure if I would trust any of the new PK systems until they have been exhaustively analyzed.

    My memory is a bit fuzzy, but I remember there were a large number of PK systems before RSA that were proposed and then cracked in rapid succession.
  • The Chor-Rivest knapsack system fell to a lattice-reduction attack a while back.
    Well, that's rather ignorant. Why not use random noise as the pad and have a field in the message that would indicate the true length of the plaintext? Throw away everything else after that length once decoded. The field would be encoded also, so you couldn't easily pick out the plaintext length.

    Is this overly easy or am I missing something?

  • by Rayban (13436) on Sunday May 02, 1999 @12:26PM (#1907366) Homepage
    If I'm not mistaken, RSA becomes free in September 2000 (patent expiry). Maybe this is RSA's way of "planned obsolescence". :)

    Because it relies on the product of two large primes, RSA will always be vulnerable to advances in factoring technology. In about five years, even the 1024-bit keys may become easily breakable. Until a real method of factoring numbers faster than is currently possible becomes available, we are going to find that RSA is still the most powerful encryption algorithm available (security vs. speed). I would recommend that the minimum key length in bits be increased to at least 8 times the maximum for easily broken keys.

    In this case, 512 is easy to break, so 4096 should be the standard. This should yield a number of years of security. And, as always, make sure you give your keys one-year expiry dates.

  • There is something I feel I should point out. It's not just any large prime; those are really simple to come by. IIRC, a Mersenne prime is a prime number of the form (2^n)-1. There is a rather simple formula for getting primes from another prime. I forget what it is, but it's something like taking a prime, multiplying it by 2, then adding 1. Although this hasn't been working for me too well so far. If anyone else knows this formula, please post it. It's gonna drive me crazy to figure it out. Then again, I could be 100% off.
  • You are actually right. I thought that X was definitely a prime, though. The problem is it's not necessarily (or even probably) the next prime. I.e., 2*3+1 = 7, but you miss 5. Arg! Too much math in my head (AP Calc BC exam is next Thursday).
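    The half-remembered rule is the "2p + 1" construction; a quick Python sketch (my own illustration) shows why it kept failing -- the chain starting at 2 dies after 47:

```python
def is_prime(n: int) -> bool:
    # Trial division -- fine for numbers this small.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

chain, p = [2], 2
while is_prime(2 * p + 1):   # keep doubling-and-adding-one while prime
    p = 2 * p + 1
    chain.append(p)
print(chain)                 # [2, 5, 11, 23, 47]; next is 95 = 5 * 19
```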
  • Sneakers was one of the dumbest movies I've ever seen. Right up there with "Hackers".
  • Quantum cryptography I presume.
  • You know what this means!?

    They knew when they published, they've watched half a hundred stumbling steps and laughed, and finally, after selling bogus security for 15 years, they laugh all the way to the bank and disprove themselves, so we can pay them more money to fix what they knew was broken.

  • "As soon as information, be it atoms or bits, is out of
    your posession or shared amoung two or more people, then it's no longer
    secret, authentic, or trustable. "

    And we used to laugh at people who protected their images, would not allow photographs for fear of loss of the soul?

    Were we? Are we? clever? Ever?
  • Big difference here...Shamir's new thing just reduces (by a lot, albeit) the computation required to factor RSA. Assuming this just reduces factoring difficulty, a bunch of other public key cryptosystems won't be affected (like ElGamal, which deals with the difficulty of the discrete log problem instead of prime factorization). The Sneakers story revolved around a little box which (in a matter of seconds) cracked _any_ data scrambled with _any_ algorithm.

    CJK, frantically trying to finish up his crypto semester project (due tomorrow), which deals with discrete log public key crypto :)
  • ECC isn't necessarily tied to any given cryptosystem (there are several cryptosystems which use ECC). It's just that the math operations are done over elliptic curves instead of some other finite field...

    (and I'm not sure that the statement about no patents is necessarily true)

    But yes, ECC _is_ cool...but needs more research. Lots of people would be a little wary of implementing it, since EC cryptosystems haven't gotten as much scrutiny as of yet.

  • Dude, regardless of advances being made...quantum computing is *NOWHERE CLOSE* to being practical. Factoring single-digit numbers won't get you very far. If you equate 'about to burst onto the scene' with 'several decades', then _maybe_ I might be able to not laugh out loud :)

  • I'm not sure how the weakness of one particular algorithm (RSA) implies vulnerability in 'modern cryptographic techniques'.

    Sure, this shows a weakness in RSA (but if you are using 512-bit keys right now, you are living dangerously as far as what's recommended).

    Shamir's discovery doesn't imply any weaknesses in the entire class of public key cryptosystems based around discrete logs. (or other public key cryptosystems not based around prime factorization, such as some knapsack types and McEliece, based on algebraic coding)

    The one time pad is impractical for all but a very narrow group of scenarios.

  • There are many public key cryptosystems which do not get their strength from the difficulty of factoring prime numbers.

  • ? What you're saying makes no sense. You write as if the government only uses encryption based around one principle. It's not like the government says "ok, we're going with discrete logs" and standardizes on that... they pick standards, and the standards use whatever method they happen to use. There are definitely (and have always been) cryptosystems in use within the government that are based on a variety of principles. So, like I said, this makes the Sneakers idea way off and your comment about 'what the government was using at that time' bogus too..

  • Hehe...just because you don't know about them doesn't mean there aren't a load of them :)

    You have ElGamal, based on discrete logs...(and the discrete log problem generalizes easily to allow for many many many variants on the same principle). DSA is based on discrete logs, too.

    Most EC cryptosystems are based on an analog of the discrete log problem over a finite field... generally a more intractable problem than the general discrete log problem (except for a few degenerate cases, in which it reduces to the problem of discrete log in a finite field).

    McEliece, based on algebraic coding theory...

    I think there is one cryptosystem based on the knapsack problem which is yet unbroken (most knapsacks are shown to be weak)

  • Why are you saying 'we need to start developing cryptography algorithms which aren't factoring based, and we need to start NOW' as if we didn't already have plenty of such algorithms? Look into ElGamal or any other discrete log cryptosystem. ElGamal has the bonus of having an already-expired patent...

  • I fear any cryptographer who decides to implement an algorithm based on how cool its name sounds.

    This is kind of silly... it's easy to make up additional or new names as part of your marketing which are catchy/easy to remember.

  • Uh, do you realize how slow 4096-bit key generation is? Sure, computers are always getting faster...but you have to realize that the state of the art machine is not the 'baseline' platform. You also have to realize that these things need to work with cpu-poor devices, like smart cards. EC cryptosystems help this out to a degree, but they are no panacea either.

    Sensible key sizes are a much better solution. Using the max key size 'just because it's there' is just wasteful (and slow as hell).

  • cryptogeek> So how is the situation _better_ than
    cryptogeek> this? First, these advances only apply
    cryptogeek> to public-key encryption, not
    cryptogeek> secret-key encryption schemes like
    cryptogeek> DES.

    Isn't all encryption using a fixed-length
    key solvable by a non-deterministic machine
    in polynomial time? If a password was a maximum
    of x characters with y characters in its alphabet,
    wouldn't a non-deterministic machine just have
    to spawn off y^x copies of itself and try each
    combination in parallel? Or am I missing
    something here?
  • No kidding. I'm stuck trying to decide if this makes life more interesting/entertaining or just downright scary.
  • Adi Shamir has not published this paper yet. It is being presented at Eurocrypt on Tuesday.
  • Actually, last I heard the MIT Quantum Computer was a *long* way from Prime Time. I don't remember the details, but the techniques that work for a two qubit quantum computer will not work for anything much larger -- and the two qubit version was difficult enough.

    There's no real rush yet. I think we have a number of years to go before Quantum Computing threatens cryptography.

  • The situation is both worse and better than this. If quantum computing could only be used to factor large numbers, the world wouldn't change much. We would stop using RSA encryption (which is only as secure as factoring is hard) and start using schemes based on the difficulty of the discrete log (such as El Gamal). However, Shor's algorithm can also be used to perform discrete logarithms in polynomial time, thus blowing away most of the remaining public-key algorithms and some very important key exchange schemes (particularly Diffie-Hellman). We might still be able to salvage some sort of public key scheme out of lesser known problems (like the knapsack problem), but it would take a lot of work.

    So how is the situation _better_ than this? First, these advances only apply to public-key encryption, not secret-key encryption schemes like DES. Second, quantum mechanics also opens up new possibilities for key exchange that were not available before. In particular, quantum mechanics can be used to distribute random key material for a one-time pad over a public medium. There's a good overview of the process in the Oct. 1992 Scientific American, but the main idea is this: Quantum entities (photons, electrons, fundamental particles) change when observed. Therefore, someone can send out the random key material in the form of a stream of photons, and the receiver can tell if they were observed in transit.

    This is a Good Thing, cryptographically speaking, because one-time pads are proven to be _unbreakable_. Furthermore, this type of key exchange has already been done, over distances as long as 30km (I believe).

    So quantum computing would change things, certainly, but it's not the end of the world.

    (For those interested, Schneier's _Applied Cryptography_ and the _Handbook of Applied Cryptography_ by Menezes, van Oorschot and Vanstone are good general references. As mentioned above, the quantum key distribution method can be found in Scientific American, Oct. 1992. Peter Shor's home page is here []. There's lots of information on quantum computing on the web, but a good place to start is here [].)
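    The El Gamal alternative mentioned above fits in a dozen lines of Python. This is a toy sketch with deliberately tiny, hypothetical parameters (real deployments use primes of hundreds of digits):

    ```python
    import random

    # Toy El Gamal over a small prime field -- illustrative only.
    p = 467          # public prime modulus (2 is a generator mod 467)
    g = 2            # public generator

    x = random.randrange(2, p - 1)   # private key
    h = pow(g, x, p)                 # public key: g^x mod p

    def encrypt(m, h):
        k = random.randrange(2, p - 1)      # fresh ephemeral secret
        return pow(g, k, p), (m * pow(h, k, p)) % p

    def decrypt(c1, c2, x):
        s = pow(c1, x, p)                   # shared secret g^(kx)
        return (c2 * pow(s, p - 2, p)) % p  # divide by s (Fermat inverse)

    c1, c2 = encrypt(123, h)
    print(decrypt(c1, c2, x))  # 123
    ```

    Security rests on the difficulty of recovering x from h = g^x mod p, i.e. the discrete log problem, not on factoring.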

  • Caveat: We are now outside my area of expertise. If anyone knows more about these matters, please feel free to correct my mistakes.

    My reply would be that quantum computers are not non-deterministic. A non-deterministic machine is one that is allowed to make guesses, or choose between multiple possibilities. To say that a non-deterministic machine can solve something in polynomial time usually includes the unspoken assumption that the machine _always guesses correctly_. So, yes, secret key encryption is probably breakable by a non-deterministic machine in linear time-- the machine simply guesses each bit of the key correctly. However, I don't think anyone has actually built a non-deterministic machine.

    A quantum computer, on the other hand, can be in multiple states simultaneously-- until you look at it. When you look at it, however, it collapses into one of its component states RANDOMLY. For example, suppose you have a quantum bit ("qubit"). This bit can be in either of the two classical states: |1>, or |0>. It can also be in any combination of the two, such as (a * |1> + b * |0>). (The coefficients a and b, by the way, can be complex numbers.) When you look at the bit, however, it will collapse into one of the two classical states randomly, with the probabilities given by the squared magnitudes of a and b.

    So, as opposed to a non-deterministic machine, a quantum machine can look at all options simultaneously but will only show you one outcome. It may not be the outcome you want, and once you look at it _all the other outcomes are gone_. The trick of quantum computing is to cleverly massage the values of a and b until you can be sure that it will collapse into an outcome you want. (This is the essence of Shor's algorithm.)

    If you want to know more, look at
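    The random collapse described above is easy to simulate classically for a single qubit. One detail worth making explicit: the outcome probabilities are |a|^2 and |b|^2, not a and b themselves. A sketch:

    ```python
    import random

    def measure(a, b):
        """Collapse a qubit a|0> + b|1> into a classical bit.

        Measurement yields 0 with probability |a|^2 and 1 with |b|^2;
        valid amplitudes satisfy |a|^2 + |b|^2 == 1.
        """
        p0 = abs(a) ** 2
        return 0 if random.random() < p0 else 1

    # Equal superposition: both amplitudes 1/sqrt(2), so each outcome is 50/50.
    amp = 2 ** -0.5
    samples = [measure(amp, amp) for _ in range(10_000)]
    print(sum(samples) / len(samples))  # hovers around 0.5
    ```

    Note what the simulation cannot capture: a classical machine has to pick one branch per run, while the quantum computer carries all the amplitudes along until the final measurement.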
  • True, it's not going to be household stuff for a while, but the physics behind it is very solid, and it has been done.

    Over pretty large distances too...

  • Actually, if you stretch your definitions a little, somebody has found an efficient factoring algorithm: Peter Shor's algorithm will factor a number in polynomial time; trouble is, it only works on a quantum computer.

    It's been proven that P ⊆ QP (that is, the set of all classical polynomial-time problems is a subset of the set of all quantum polynomial-time problems). Also, there are certain things that can be done on quantum computers more efficiently than a classical computer could (not just speculation; there are known problems like this).

    I wish I knew more about this stuff.

  • It's neat to think that there is a psychic connection with other slashdots

    To be honest, I always thought that the movie 'sneakers' would remain fiction, with any attack done the 'brute' force way, which really made these cyphers intractable.

    I thought that the story was well written, and this just proves it.
  • Scientific American carried a story about a cypher using 'quantum mechanics'

    Theoretically this cypher is the cypher to end all cyphers. I say 'theoretically' because that's the way it is for all codes, until perseverance and science break them.

    But the quantum mechanical cypher is pretty convincing. Anyone have good links?
  • The article says if it works. And the computer to use this new calculation hasn't been built yet,
    that he is aware of. The governments have a way of keeping things real secret when it's in the interest of 'national security'. Ah well, maybe I'm being paranoid.
  • Shamir has designed on paper a parallel machine that can crack 512 bit keys in a 'reasonable' amount of time.

    All this heralds is that computing power has achieved another milestone, and that it's gotten easier/faster to factor numbers - and thus crack crypto.

    Let it be pointed out, though, that the difficulty increase for each bit increase is exponential, not linear, so 768/1024/2048 bit RSA keys should be safe for quite some time...

    maybe 10 years? HHOS.
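    The per-bit difficulty growth can be made concrete with the standard heuristic running-time estimate for the general number field sieve, L[1/3, (64/9)^(1/3)]. Strictly speaking this is sub-exponential rather than exponential in the bit length, but it still climbs fast enough to put 768/1024/2048-bit keys in a different league from 512. A rough sketch (relative units only, constants omitted):

    ```python
    import math

    def gnfs_cost(bits):
        """Heuristic GNFS work estimate exp(c * (ln n)^(1/3) * (ln ln n)^(2/3))
        for an n-bit modulus, with c = (64/9)^(1/3). Relative units only."""
        ln_n = bits * math.log(2)
        c = (64 / 9) ** (1 / 3)
        return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

    for b in (512, 768, 1024):
        print(b, f"{gnfs_cost(b):.2e}")
    ```

    Going from 512 to 1024 bits multiplies the estimated work by roughly seven orders of magnitude, which is why doubling the key length is such an effective response.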

  • I think that this is the requisite response to a question about unbreakable systems. We have it and it's the one time pad. It has the same problem as all symmetric systems in that you need to secretly share the key with the added bonus of having to keep a key as long as the message, and never being able to reuse a key. It's a nightmare for key distribution.

    Since what you are combining with is totally random, there is no way to crack this system: any one decryption is as likely as any other. I'm looking forward to Tuesday to see what the paper turns out to be.
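    The pad operation itself is just XOR with truly random key material as long as the message; the same function encrypts and decrypts. A minimal sketch:

    ```python
    import secrets

    def otp_xor(data: bytes, pad: bytes) -> bytes:
        """XOR a message with a pad of equal length (encrypts AND decrypts)."""
        assert len(pad) == len(data), "pad must be exactly as long as the message"
        return bytes(m ^ k for m, k in zip(data, pad))

    msg = b"attack at dawn"
    pad = secrets.token_bytes(len(msg))  # truly random, used once, never reused
    ct = otp_xor(msg, pad)
    assert otp_xor(ct, pad) == msg       # applying the pad again decrypts
    ```

    The "never reuse" comment is the whole key-distribution nightmare: every message consumes pad material that both parties must have exchanged secretly beforehand.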

  • I am a Physics major at ISU (occasionally) and would like to point out that quantum computing is a field that is still considered 'pure research'. It is thought of as pure because of the remoteness of its application. Using a quantum computer in the laboratory may help you simultaneously solve a two-bit problem, but the more possible states you need to represent, the larger the molecule that contains the representative atoms has to be. DNA molecules may one day solve this (if they can get a similarly extensible molecule to carry out the computations), but I am much more encouraged by the use of DNA strands to solve NP-complete problems in reasonable time frames.
    Traveling salesman, smallest network, and factoring are all problems that can be converted into one another in a linear (or is it quadratic?) amount of time. Hence: you solve one, you solve 'em all. No one has proven NP-complete problems to be hard to solve. It is a conjecture that any of these things is difficult (granted, it is a conjecture that has held up for a darned long time under intense scrutiny). RSA and anything dependent on factoring large numbers or other hard problems may be trivially easy to break. If they are, we just haven't figured out the method yet.
    If they aren't, it is entirely possible that we will never be able to prove that they aren't. One way or the other, I wouldn't hold my breath for quantum computing to make these things suddenly open up.
    See also (before the NSA bans publication):
    Bruce Schneier's book: Applied Cryptography
    2nd edition
    Your Squire [mailto]
  • Anything that is dependent on the difficulty of factoring (the product of large primes) is going to suffer from these new techniques. The fates of quite a few encryption schemes are tied to RSA because of the general dependence on the conjecture that factoring is difficult (given that it has been a bitch so far, it still hasn't been proven to be difficult). I don't even remember if factoring has been proven to be NP-complete.
    P.S. I believe the NSA is the single largest employer of mathematicians in the world. Does that scare anyone? I'm not frightened about the potential abuses... I am bothered by the idea of that many mathematicians in one place...
  • Just because you have encrypted your regular key with 1024-bit RSA doesn't mean that your regular (that is, symmetric) key is safe. If you use a long enough plaintext, the symmetric key becomes vulnerable. The RSA part is just a way to share symmetric keys without letting anyone else see them. It doesn't make a whole lot of sense to use a 4000+ bit RSA key to disguise a 56-bit symmetric key for the encryption of the Library of Congress. For passing mail back and forth this is almost never a concern, but when companies start talking about streaming megabytes of data...
  • The problem is that Mersenne's formula does not reliably generate primes, i.e. a portion of the numbers returned by (2^n)-1 [where n is itself prime] are _not_ prime. Hence all Mersenne candidates generated via (2^n)-1 have to be tested to make sure they're bona fide, hence expensive computations required :)
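    That testing is comparatively cheap for Mersenne candidates specifically: the Lucas-Lehmer test decides the primality of 2^p - 1 exactly. A sketch:

    ```python
    def lucas_lehmer(p: int) -> bool:
        """Deterministic primality test for Mersenne numbers M_p = 2**p - 1.

        Requires the exponent p itself to be prime. A prime exponent does
        NOT guarantee a prime result: 2**11 - 1 = 2047 = 23 * 89.
        """
        if p == 2:
            return True  # M_2 = 3 is prime; the loop below needs p > 2
        m = (1 << p) - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

    print([p for p in (2, 3, 5, 7, 11, 13) if lucas_lehmer(p)])  # [2, 3, 5, 7, 13]
    ```

    Note how 11 drops out: its Mersenne number is composite even though 11 is prime, which is exactly the point the parent comment makes.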

  • The AES is for a new symmetric encryption algorithm, to replace the tremendously outmoded DES. This has nothing to do with public key cryptography.
  • where did you get this info? certainly not the nyt article. you can't write this off as just more computational power -- there is some slick algorithm at work here for factoring in hardware. i doubt any machine could reasonably attempt trial division on numbers of that size. and reducing the time to factor a 140-digit number to the time required to factor 80 digits is seriously incredible. if it works.
  • i realize that there is no trial division in today's factoring landscape; my point was that the factoring "equivalent" of brute forcing a keyspace is trial division. i have never heard of factoring implemented in hardware, which is what really intrigues me: does he implement the gnfs or ecm, or what?
  • Disregarding cost and/or availability (you'd need
    fiber for transmitting photons, right?), there's
    still a huge problem here.

    If, for example(!), the US government (or rather,
    No Such Agency), wants to make using crypto
    impossible (and I mean QC here), they simply
    install `photon-watchers' on all the main backbones.

    So, you get a one-time pad... but you see it was
    observed. You then either communicate in clear
    (as the key isn't secure anyway) or you
    encrypt with the key, knowing full well that someone
    can decrypt it...

    Which would imply that quantum cryptography kills
    crypto no matter what... I *hope* I'm wrong there!
  • Quoted from []:
    Astronomers should know about it. RSA is typically performed using 512 bit prime numbers. There are approximately 3.778e151 such prime numbers. Using the advanced storage technology available to the NSA, it should be possible to store a 512 bit number in a single hydrogen atom. A typical universe (e.g. ours) contains approximately 1e90 hydrogen atoms. If the NSA has hidden 3.778e61 universes in an inconspicuous little building in Maryland, astronomers should notice some deviations in the gravity field in the area.
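    The 3.778e151 figure in the joke checks out: by the prime number theorem there are roughly N/ln N primes below N, and plugging in N = 2^512 reproduces it:

    ```python
    import math

    # Prime number theorem: pi(N) ~ N / ln(N), here for N = 2**512.
    bits = 512
    n = 2.0 ** bits                        # ~1.34e154, within float range
    primes_below = n / (bits * math.log(2))
    print(f"{primes_below:.3e}")           # 3.778e+151
    ```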
