Encryption Security

More on Bernstein's Number Field Sieve

Russ Nelson writes "Dan Bernstein has posted a response, entitled Circuits for integer factorization, to the earlier story 'Bernstein's NFS analyzed by Lenstra and Shamir.' He notes that the question of the cost of factorization is still open, and that it may in fact be inexpensive to factor 1024-bit keys. We don't know, and that's what his research is intended to explore."
  • So using 4096 bit encryption wasn't as paranoid as everybody told me...

    Quantum Computers, Advances in Number Theory; looks like this decade will become interesting.

    BTW, could the admin of http://cr.yp.to please check the server logs for any visitors from nsa.gov?
    • So using 4096 bit encryption wasn't as paranoid as everybody told me...
      Actually, from my reading of the analysis of DJB's original statement, you're closer to being at risk.
      BTW, could the admin of http://cr.yp.to please check the server logs for any visitors from nsa.gov?
      Which would prove what, exactly? The NSA doesn't get AOL CDs like the rest of us?
      • OK, you're right. Or as DJB puts it: "This is revisionist history, not a technical dispute."

        But you're wrong concerning the AOL CDs. One of NSA's missions is "protection of U.S. information systems". So no AOL allowed...

    • Quantum Computers, Advances in Number Theory; looks like this decade will become interesting

      I'll say! Arthur Andersen has advanced number theory further than anyone had imagined it could go!

  • If you're worried about your 1024-bit keys being broken, switch to using 4096-bit keys. Until quantum computers are developed, factorisation will remain superpolynomial (the best known classical algorithms are sub-exponential, but nowhere near polynomial time), and 4096-bit keys will be safe until the military discovers how to harness quantum computing.
    • Re:So what? (Score:2, Informative)

      by tomstdenis ( 446163 )
      Or you can be smart and stop using RSA. In the realm of things, the DLP over GF(p) is harder to solve than the IFP.

      Even harder is the DLP over elliptic curves. With proper curves [e.g. the NIST-supplied ones] the best attacks on the math take square-root work, which means a 256-bit ECC key will take roughly 2^128 work to solve for the private key [from the public key], provided the cryptosystem is set up correctly [yada yada]. (A rough sketch of that square-root estimate follows below.)

      Tom
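      A rough back-of-envelope sketch of that square-root figure, in Python. It only models the generic Pollard-rho cost of about 2^(n/2) group operations against an n-bit curve; it says nothing about any particular curve, and the key sizes listed are just common examples:

          # Generic-attack (Pollard rho style) work estimate for an n-bit elliptic
          # curve group: roughly 2^(n/2) group operations, assuming no special
          # structure in the curve can be exploited.
          def generic_ec_attack_bits(curve_bits):
              """Security level in bits against generic square-root attacks."""
              return curve_bits / 2

          for curve_bits in (160, 192, 224, 256, 384, 521):
              print(f"{curve_bits}-bit curve -> ~2^{generic_ec_attack_bits(curve_bits):.0f} group ops")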
      • Usually you just want p=2, in which case GF(2^m) embeds in GF(2^n) whenever m divides n.
        • Yes, you can do ECC over GF(2^m), but you can also do it over GF(p).

          [cheap plug]
          My (free, portable, fairly packed) libtomcrypt lib takes the NIST curves over GF(p) since they are simpler to work with...

          http://libtomcrypt.sunsite.dk

          [/cheap plug]

          Using the other field structure, GF(2^m), is more important in hardware or on, say, low-end microcontrollers.

          Tom
      • In the realm of things, the DLP over GF(p) is harder to solve than the IFP.

        True, but arguably not significant. With the NFS, both problems have the same asymptotic difficulty. Further, the DLP isn't much harder than the IFP in practice. Solving instances of the N-bit DLP is approximately as hard as solving the IFP with N+30 bits. (A back-of-envelope sketch of the asymptotics follows at the end of this comment.)

        Even harder is the DLP over elliptic curves.

        Very much the case at present, though more and more curves are found to be vulnerable to variants of known attacks, and more attacks are being discovered all the time. There were some interesting papers at the ANTS-V and ECHIDNA conferences held earlier this month in Sydney.

        Paul
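        For the curious, here is that back-of-envelope sketch in Python. The heuristic NFS running time is L_N[1/3, c] = exp((c + o(1)) * (ln N)^(1/3) * (ln ln N)^(2/3)) with c = (64/9)^(1/3), and (as noted above) the DLP variant has the same asymptotic shape. Dropping the o(1) and every constant factor makes this purely illustrative, not a cost-of-attack estimate, but it does show that 30 extra bits barely move the estimate:

            # Heuristic GNFS complexity L_N[1/3, (64/9)^(1/3)] with the o(1) term dropped.
            # Purely illustrative; absolute values mean nothing, only the growth does.
            from math import exp, log

            def gnfs_work(bits, c=(64 / 9) ** (1 / 3)):
                ln_n = bits * log(2)                     # natural log of N for a modulus of `bits` bits
                return exp(c * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3))

            for bits in (512, 1024, 1024 + 30, 2048):
                print(f"{bits:5d} bits -> ~2^{log(gnfs_work(bits), 2):.1f} operations")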

        • True, but arguably not significant. With the NFS, both problems have the same asymptotic difficulty. Further, the DLP isn't much harder than the IFP in practice. Solving instances of the N-bit DLP is approximately as hard as solving the IFP with N+30 bits.

          While in practice the time estimates are the same, the space estimates are not. Over a single large prime group the DLP takes p times more memory to solve [where p is the number of bits in the group modulus]. Also, the matrix ops in the final stage are not over bits but over p-bit integers.

          This makes the last stage "practically" way more difficult than it is for the IFP.

          As for types of curves [no joke here], I'm not an ECC expert. I just use what NIST gives out, as those are the more traditional types of curves.

          Naturally one must keep an eye on the horizon for new attacks, but for the time being ECC appears to be way harder than RSA or DH [over GF(p)].

          One little rant add-on: you can do DH over ECC, despite what most PGP drones want you to think. This really p's me off. Add ECC support to GPG already!

          Tom
    • 4096-bit keys will be safe until the military discovers how to harness quantum computing.

      History is littered with the bodies of people who think that a cryptosystem is unbreakable, or essentially unbreakable. Just because we can't break something now doesn't mean a solution won't be found next year, or even next month.

      There's a reason that most cryptosystems aren't provably secure: they aren't secure. It's all a gamble that the next mathematical breakthrough in whatever field will be far enough down the road not to matter to you.

      Thinking anything else, or becoming complacent with a seemingly secure system, is little better than no cryptography at all.

    • Heh, how do you know they don't have it? Do you think they would announce it right away? The military has always been ahead of the rest of the world.

      Anyway, I once read about a theory that suggested using a graph-theory-based encryption scheme. The idea was interesting, particularly because - as others mentioned - we simply have no proof that factorization is an NP-complete task, while on the other hand we _have_ proof that graph theory has some problems like this (e.g. finding Hamiltonian cycles, IIRC).
  • by gweihir ( 88907 ) on Sunday July 28, 2002 @08:27PM (#3969208)
    ... this is. I especially like the mixture of theoretical, practical and as-yet-unknown aspects of the whole problem.

    My impression is that so far DJB has done a good job of being honest and clear, although "the press" is sadly lacking in experts these days and often will not even notice that they have not understood the problem. I have to admit that I did not quite follow Lenstra-Shamir-Tomlinson-Tromer, but I think DJB's original proposal is still the best source on what is going on. No real surprises so far for practical purposes, but I will follow this closely.

    Incidentally I don't fear for my 4096/1024 bit ElGamal/DSA gpg key in the near future. I am confident that installing a keyboard sniffer without me noticing is far easier than breaking that key.
    • Incidentally I don't fear for my 4096/1024 bit ElGamal/DSA gpg key in the near future. I am confident that installing a keyboard sniffer without me noticing is far easier than breaking that key.

      I concur. I find this device [thinkgeek.com] far more threatening than any cluster of machines at the NSA.

    • ElGamal/DSA are based on discrete logs, not factorization. So I'd say you're safe from DJB for now.

      But it is suspected that factoring is AS HARD as the DL problem; no proof exists yet.

      It should be assumed, however, that if the integer DL is solved, all PK crypto (RSA and other factoring-based ciphers included) would fall, with ECC and GF(2^x) DLs falling shortly after... this is just the general sentiment in the field. FYI

      JLC
      • ElGamal/DSA are based on discrete logs, not factorization. So I'd say you're safe from DJB for now.

        That is the second reason I feel pretty safe. Just trying to see whether anybody notices...

        I agree that if somebody gets a foot in the door on DL anywhere, they can likely open the whole door, and presumably some other doors too. Not my field of expertise, though.

  • Isn't public key encryption basically security through really extreme obscurity?
    • Re:Security ... (Score:3, Interesting)

      by Russ Nelson ( 33911 )
      No. Security through obscurity means that your task is made easier on every decryption by knowing something which is intended to be a secret. Needing to factor a number every time isn't obscurity. It's just plain hard.
      -russ
      • Well, every security implementation is technically security through obscurity unless you're using a perfected implementation of biometric authorization. Even passwords and large-number factorization are simply based on someone not knowing something (the password or the prime factors), and don't really prevent the wrong person from accessing resources.
    • All encryption requires secrets. Good algorithms require that only the keys are secret, and everything else (especially the algorithms themselves) can be published without compromising security.

      Security Through Obscurity is bad because every copy of (say) MS Media Player has the same secret. If that secret is discovered, then /everything/ using that secret is compromised.

      Asymmetric encryption (e.g. Diffie-Hellman) avoids this problem by having a different secret for each end user. If the secret is discovered, then only that user is compromised.

      The point is that Security Through Obscurity depends upon widely distributed secrets (e.g. the exact behaviour of binary code), whereas competent encryption doesn't. (A toy Diffie-Hellman sketch follows below.)
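      To make the "different secret for each end user" point concrete, here is a toy Diffie-Hellman exchange in Python. The parameters are deliberately far too small for real use (a real deployment would use a standardized large prime group or an elliptic curve); this only shows that each party keeps its own private value and still arrives at the same shared key:

          # Toy Diffie-Hellman key agreement. Each party keeps a private exponent;
          # only the public values g^a mod p and g^b mod p ever cross the wire.
          # WARNING: toy-sized parameters, illustration only, not secure.
          import secrets

          p = 2**127 - 1            # a known Mersenne prime, far too small for real use
          g = 5                     # arbitrary toy generator choice

          a = secrets.randbelow(p - 3) + 2      # Alice's private secret
          b = secrets.randbelow(p - 3) + 2      # Bob's private secret

          A = pow(g, a, p)          # Alice publishes A
          B = pow(g, b, p)          # Bob publishes B

          # Each side combines its own secret with the other's public value.
          shared_alice = pow(B, a, p)
          shared_bob = pow(A, b, p)
          assert shared_alice == shared_bob     # same key, different private secrets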
    • Re:Security ... (Score:3, Informative)

      No. The phrase "security through obscurity" generally means that the security algorithm is kept a secret. The lock on your house is a good example - how the lock works is open, public information. The particular key that you use, however, is not. To unlock your door, someone must either learn what key you use or exploit weaknesses in the lock. A standard locksmith text, some standard locksmith tools, and a little time is all that's required to pick the standard lock. How the lock works is not obscured; what key you use is.

      Since everyone knows how the lock works, everyone can see what the weaknesses are and everyone can try to correct them (this has been done over the last couple of centuries). If you used a non-standard lock with a mode of operation that only you knew, that would be security through obscurity. No one can learn to pick it by reading standard locksmith texts. The weaknesses would have to be learned through trial and error by someone on your front porch. Any lockpicker would have to learn not only what key you used but how to use it in the lock.

      Public key encryption generally uses well-known algorithms - this is what makes it not obscure. The keys are secret, just as with obscure systems. However, everyone can see whether or not the system is a strong one.

      Go read Cryptonomicon for a fictional introduction to the perils of obscured cryptosystems, as well as the benefits of open ones. Then read Applied Cryptography for the real (i.e. technical and mathematical and correct) deal. Or just take my word for it like a happy little slashdotizen. :)

      BTW, modern locks are the result of a couple hundred years of improvements and refinements; they are currently about as good as they need to be, given weaknesses of the door itself, the convenience of regularly using the mechanism, the cost of the lock and keys, etc. Cryptosystems are nowhere near that level yet. The internet is still a country town, in the sense that everyone leaves their house unlocked and their keys in the ignition of their car. Once everyone's been hacked a couple of times (and I do mean everyone), adoption of crypto will become widespread, and we'll begin to see sufficient standardization to make crypto comparable to real-life locks. Government officials don't whine about locks the way they do about crypto for good reason - they aren't much more than courtesy and inconvenience devices. Give me a good sledgehammer and I'll be in your door in five minutes. Give me a power saw and I'll make my own door. Betcha the standardized crypto will be similar.
      • If you used a non-standard lock with a mode of operation that only you knew, that would be security through obscurity. No one can learn to pick it by reading standard locksmith texts. The weaknesses would have to be learned through trial and error by someone on your front porch. Any lockpicker would have to learn not only what key you used but how to use it in the lock.

        To carry the analogy a bit further, your lock would probably be very difficult to pick quickly, no matter how skilled the person trying to pick it is at picking other locks - at least until he figured out how it worked. Once the principle of operation of your lock is known, though, the person trying to pick it can develop a standard approach to picking that style of lock.

        The problem then is that there's a very good chance that your new lock design will be easier to pick for somebody who knows how it works than a standard lock would be. Chances are you've made a mistake and left in some vital flaw in your design that makes it easier to pick. After all, your lock would not have the decades of dedicated people trying to figure out how to pick it that are needed to find and correct the weaknesses in its design. And if you try to sell your lock to the public, anyone who's interested in learning how to pick it will be able to buy one, take it apart, and spend as much time as he wants trying to figure out how it works.

  • I didn't really think there was any need for anything better than 128 bit encryption. It would take a lot of factoring that is practically impossible by human standards to figure out the key for a 32 bit encrypted code, and this site [stack.nl] seems to tell me that 128 bit encryption is nearly impossible to break by any standards.

    On that same site, they are saying that most encryption cracking comes in the form of key snooping, trojan horses, and packet sniffing, so simply increasing the cipher strength probably won't do much.
    • Re:Why? (Score:5, Informative)

      by God! Awful ( 181117 ) on Sunday July 28, 2002 @09:28PM (#3969378) Journal

      I didn't really think there was any need for anything better than 128 bit encryption. It would take a lot of factoring that is practically impossible by human standards to figure out the key for a 32 bit encrypted code, and this site [stack.nl] seems to tell me that 128 bit encryption is nearly impossible to break by any standards.

      128-bit private key encryption is considered virtually unbreakable. 128-bit public key encryption is not. AES is an example of private key encryption; RSA is an example of public key encryption.

      -a
      • I believe you're confusing public/private with symmetric/asymmetric. Public/private terminology is used with asymmetric encryption techniques like RSA, but not with symmetric techniques like AES or DES.

        • I believe you're confusing public/private with symmetric/asymmetric. Public/private terminology is used with asymmetric encryption techniques like RSA, but not with symmetric techniques like AES or DES.

          I most certainly am not. Do a little research and you will find that the terms "public key cryptography" and "private key cryptography" are commonly used to refer to asymmetric and symmetric encryption respectively. E.g. see result of google search [google.ca].

          -a
          • You are right, although I would like to point out that the terms "public-key cryptography" and "secret-key cryptography" (note "secret" instead of "private") are preferred because of the lower risk of confusion.
    • 128 bit encryption is for symmetric (i.e. private) key encryption. It is nearly unbreakable. The problem is that you have to arrange a key with the person to whom you're sending the message, which could be difficult, especially if you don't know him or her.

      That's where public key crypto comes in. Knowing that my PGP public key ID is 0x84B0FDB8, you could send me a message that only I could decode, unless the NSA has a few very interesting secrets. Since I have to publish information (namely how to encrypt the message), this type of code is inherently easier to break. RSA can be broken by factoring a large number, and a 128-bit number poses little challenge for even the Sieve of Eratosthenes on a fast computer, so longer ones are needed. The question is, how much longer?

      Which is why I use DH/DSS, which operates with the (slightly) harder problem of discrete logarithms.
      • {{{
        RSA can be broken by factoring a large number, and a 128-bit number poses little challenge for even the Sieve of Eratosthenes on a fast computer, so longer ones are needed.
        }}}

        I hope you regret writing that.

        For 128 bits (39 digits), you almost certainly need a better-than-O(n^(1/3)) algorithm, or an amazing constant factor in something like a Fermat method with sieves (a la Knuth).
        Fortunately P-1 and Rho are arguably O(n^(1/4)), and ECM is better still (arguably sub-exponential). However, if you have prior knowledge that the number is unlikely to have small factors, then at 39 digits you may as well just roll out a QS, which is sub-exponential and excels when there are no small factors. (A bare-bones Rho sketch follows at the end of this comment.)

        Satoshi Tomabechi's PPSIQS would do the job in the blink of an eye.

        Mike Scott's QS, accompanying his Miracl library, would do the job too.

        THL.
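        As a small illustration of the "arguably O(n^(1/4))" methods mentioned above, here is a bare-bones Pollard Rho in Python (no Brent cycle-finding, no tuning). It is fine for a modest semiprime like the toy one below; as noted above, a hard 39-digit number with no small factors is really a job for QS or ECM:

            # Bare-bones Pollard rho factorization. Expected cost is on the order of
            # sqrt(smallest prime factor) modular multiplications.
            from math import gcd
            import random

            def pollard_rho(n):
                if n % 2 == 0:
                    return 2
                while True:
                    x = random.randrange(2, n)
                    y, c, d = x, random.randrange(1, n), 1
                    while d == 1:
                        x = (x * x + c) % n           # tortoise: one step
                        y = (y * y + c) % n           # hare: two steps
                        y = (y * y + c) % n
                        d = gcd(abs(x - y), n)
                    if d != n:                        # d == n is a rare failure; retry
                        return d

            p, q = 104729, 1299709                    # two small well-known primes (toy example)
            n = p * q
            f = pollard_rho(n)
            print(n, "=", f, "*", n // f)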
    • The average computer can crack a 128 bit RSA key in about a month. Of course, the average Slashdotter's computer can break a 128 bit key in about a week, and, in 1999, a beowulf cluster broke a 128 bit key in a matter of a few hours. Of course, with 1024 bit encryption, it is far easier just to stick a keystroke logger onto somebody's computer, but, considering that in 1999 a beowulf cluster was able to brute force a 128 bit key in a matter of hours, it is probably possible to break a 1024 bit key in about a year these days (unless, of course, you're the NSA, in which case you can break it in a couple of days).
      • The average computer can crack a 128 bit RSA key in about a month.

        Actually, the average computer can crack a 128-bit RSA key in about a minute. Assuming, that is, it's using an algorithm developed in the last twenty years or so. 128 bits is only 43 digits, and factoring 43-digit integers these days really is very easy.

        in 1999, a beowulf cluster broke a 128 bit key in a matter of a few hours.

        References please. This is a new one on me. What kind of key? At this size a RSA key, or a discrete log over the integers, is essentially trivial and a 3DES or AES key is essentially impossible, AFAIK. State of the art for the ECDLP (elliptic curve discrete logarithm problem) is still a few bits short - again, AFAIK.

        Paul

        • 128 bits is only 43 digits

          Sigh. I really ought to check before relying on memory, rather than afterwards. A 128-bit integer actually has 39 decimal digits.

          Paul
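          A quick check of the digit count in Python (any 128-bit integer has floor(128 * log10(2)) + 1 = 39 decimal digits):

              from math import log10

              bits = 128
              print(len(str(2**bits - 1)))        # 39 digits in the largest 128-bit integer
              print(int(bits * log10(2)) + 1)     # same answer via logarithms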

    • You're thinking of keys for symmetric ciphers - the kind where the same key does both the encryption and decryption. For public-key algorithms, more key bits are needed to achieve the same level of security.

      According to Bruce Schneier in Applied Cryptography, Second Edition, a 56-bit symmetric key is roughly as secure as a 384-bit asymmetric key, and a 128-bit symmetric key is roughly as secure as a 2304-bit asymmetric key. (Table 7.9 on page 166; there's a copy of the same table on the page you linked to.)

      Even with a symmetric algorithm, though, breaking a 32-bit key is not "practically impossible by human standards". 56 bits, the key size used by DES, is considered to be insecure since it can be broken in a few days on a PC. I'm not sure where you got your information on that page you referred to, since in the last paragraph the author says "For today's secrets, a 1024-bit [key] is probably safe and a 2048-bit key definitely is." And "today" in this case is January 26, 1997, when the page was last modified.

      I'd recommend at least 4096 bits for new keypairs. It may or may not be overkill, but modern computers are fast enough that the time it takes to cipher with a longer key is still insignificant in the course of normal usage. (A rough timing sketch follows below.)
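      A rough way to see the per-operation cost in Python: this just times the built-in modular exponentiation with a random odd modulus as a stand-in for the private-key operation. A real RSA implementation (CRT, a C bignum library) is much faster, so the absolute numbers are meaningless; only the growth from 1024 to 4096 bits is the point:

          # Rough timing of one full-size modular exponentiation per key size.
          # Stand-in for an RSA private-key operation; only relative growth matters.
          import secrets
          import timeit

          for bits in (1024, 2048, 4096):
              n = secrets.randbits(bits) | (1 << (bits - 1)) | 1   # random odd modulus of `bits` bits
              base = secrets.randbits(bits) % n
              exp = secrets.randbits(bits) | (1 << (bits - 1))     # full-size exponent
              t = timeit.timeit(lambda: pow(base, exp, n), number=10) / 10
              print(f"{bits:5d}-bit: ~{t * 1000:.1f} ms per exponentiation")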

      • by ^BR ( 37824 )
        I'd recommend at least 4096 bits for new keypairs. It may or may not be overkill, but modern computers are fast enough that the time it takes to cipher with a longer key is still insignificant in the course of normal usage.
        This may be true for something like PGP, because you only send so many emails a day, but for an https server it will make a huge difference in the connection rate you'll be able to sustain, the RSA computation being by far where most of the CPU is spent. 4096-bit keys are not viable in a web context. By the way, many toolkits support RSA key sizes only up to 2048 bits.
  • We already learned from "Sneakers" what one of these things costs:
    • one cleaned up record
    • a Winnebago (burgundy interior)
    • a date with an armed woman named Mary
    • a trip to Europe (plus Tahiti)
    • peace on earth, good will towards man.
    Oh ... He means performance costs. Never mind.
  • by hazyshadeofwinter ( 529262 ) <arto&telusplanet,net> on Monday July 29, 2002 @01:25AM (#3970086) Homepage
    Yeah, I once thought it was secure, but it looks like now I might have to replace my rot13 encryption with rot26, or even rot52...
  • Cost model (Score:4, Funny)

    by blair1q ( 305137 ) on Monday July 29, 2002 @02:58AM (#3970254) Journal
    Computation time multiplied by the cost of the computer?

    His department comptroller must love him. "No, you can't have a new plastic spoon, because it costs 11 cents and you will be using it for 0.8 years and that's...2.8 million dollar-seconds...we'll buy you a new $40 silver spoon every day and let you use it to stir your coffee for three seconds per...that's only 35K dollar-seconds..." It's pathological.

    Okay, if you fully depreciate the computer to the moment you start the computation, or better yet, market-price it, then watch the price as the computation continues along (could drop 10-20% in a few weeks for a given top-end PC type machine), then you're calculating the average replacement cost of the machine over the life of the computation.

    It still seems a little verschimmelt (moldy). The quasi-rent on such a machine is really the depreciation over the term of the computation.

    Need to think more on what cost means to someone who's trying to steal all your base. They probably stole the computer, anyway.

    --Blair
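    For anyone who wants to check the spoon arithmetic, the dollar-second figure is just price multiplied by seconds of use, which is exactly the price-times-time product being poked at here:

        # Dollar-seconds = price of the item times how long it is in service.
        SECONDS_PER_YEAR = 365.25 * 24 * 3600

        plastic = 0.11 * 0.8 * SECONDS_PER_YEAR                 # 11-cent spoon used for 0.8 years
        days = 0.8 * SECONDS_PER_YEAR / 86400                   # ~292 days in that period
        silver = 40 * 3 * days                                  # $40 spoon, 3 seconds of stirring per day
        print(f"plastic spoon: {plastic / 1e6:.1f} million dollar-seconds")   # ~2.8
        print(f"silver spoons: {silver / 1e3:.0f}K dollar-seconds")           # ~35K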
  • http://www.bearnol.pwp.blueyonder.co.uk/Math/Factor/Factorfa.html
    and
    http://www.bearnol.pwp.blueyonder.co.uk/Math/Factor/Factorwa.html
  • It just sounds way too complicated. What does cost factorization have to do with cute little bears anyway?
  • I think the most interesting part of Bernstein's answer is here [cr.yp.to]. The last paragraph states:
    I am astonished that anyone would publish this obvious use of mesh routing as if it were an interesting new result; I am annoyed that my grant proposal has been characterized as presenting slower algorithms than it actually does; and I am particularly disappointed in Lenstra and Shamir for publishing their paper after I informed them in plain language that their claim of an "improvement" was false.
    And on the same page he cites an email he sent to Arjen Lenstra in April which explains what this all is about in a much more understandable way.
