Security Math

Factors Found in 200-Digit RSA Challenge 184

Posted by Zonk
from the really-big-numbers dept.
diodesign writes "The two unique prime factors of a 200-digit number have been discovered by researchers at Bonn University (Germany) and the CWI (Netherlands). The number is the largest integer yet factored with a general-purpose algorithm and was one of a series of such numbers issued as a challenge by the security company RSA Security in March 1991 in order to track the real-world difficulty of factoring such numbers, used in the public-key encryption algorithm RSA. RSA-200 beats the previous record number 11281+1 (176 digits, factored on May 2nd, 2005), and RSA-576 (174 digits, factored on December 3rd, 2003)."
This discussion has been archived. No new comments can be posted.

  • Hooray! (Score:2, Interesting)

    by mfh (56)
    Now the longest /. UID in history can be created!
    Seriously though... any idea when our databases will enable INTs this high with indexing and normalization? I'd like to see a kind of infinite row-id length at some point, and while MySQL's varchar type goes to 255 characters, maybe that's really enough? (ducks from incoming Bill Gates quotes)
    • Mathematica, a mathematics package, can handle 2,100,000,000+ bit numbers (I don't have the exact figure), but I don't think that that's the sort of thing you're looking for.
    • Java's BigInteger class handles integers of any size--although its speed is nothing to be envied.
    • Oracle's VARCHAR2 type goes up to 4000 chars, while the LOB types (BLOB, CLOB and NCLOB) support up to about 4GB.

      For what it's worth though, think about what you're asking. If you have an integer id that increases sequentially from 1, you can already have a huge number of entries in the table before you start running out of ids for them. For any non-trivial table, chances are you're going to run out of disk space long before you run out of ids.
      • Do tell me what kind of mathematical operations you would plan to perform with that varchar field?

        It's all fine and dandy that you have a way to store it, but you really don't have much way to work with it, do you?
        • You load it into memory and use any of a number of large number math libraries to multiply/divide/factor, whatever.
    • You can get 1 GB of varchar in PostgreSQL. Autoincrement goes up to 64-bit integers.
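    For what it's worth, Python's built-in int is arbitrary precision too, so no special class or column-size gymnastics are needed just to hold a big number (a minimal sketch):

```python
# Python ints grow as needed; there is no 32/64-bit ceiling to hit.
a = 2 ** 1000 + 1      # a 1001-bit integer
b = 3 ** 500           # roughly a 793-bit integer
product = a * b        # still exact: no overflow, no rounding
print(product.bit_length())
```

    The same exactness is what the large-number math libraries mentioned below rely on.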
  • Prime Numbers (Score:2, Interesting)

    by JaxWeb (715417)
    Not really anything to do with this specifically, but this story does have something to do with primes so I will bring it up. I wrote something about prime numbers which might interest a few Slashdot readers.

    Prime Numbers [hopto.org]
    • Not really anything to do with this specifically, but this story does have something to do with goatse, so I will bring it up.

      Prime numbers for you [surfeu.fi]

    • Not a bad start, but you haven't covered all the fundamental topics yet. Don't forget the sieve of Eratosthenes and Mersenne and Fermat numbers. Also, you should mention the prime number theorem, Riemann hypothesis and Goldbach conjecture even if you can't really prove anything.

      A few corrections as I read through. Your definition of divisibility lacks rigor and needlessly invokes the division algorithm. Try instead: let a,b be integers, then b divides a if there exists an integer c such that a=bc. T

      • Hey, thank you for the comments.

        I wrote this for people who don't know mathematics, so I do not feel I should go into more complicated topics. The sieve of Eratosthenes would be do-able, but I don't really think there is much of interest to prove in it. Mersenne and Fermat numbers both get a bit complicated when they get interesting, so I will not include those either. They are interesting topics though, I agree.

        I don't think I could explain the proof of the Prime Number Theorem, and I do not have enoug
        • I've made those changes. I couldn't find the "counting numbers" bit. Hmm. The divisors bit was mainly in the "What" section I see - I must have forgotten to change those when I changed the rest.

          Anyway, I've credited you as pilkul on my maths page, http://jax.hopto.org/maths/ [hopto.org]. Is that what you would like to be known as?

          Thank you again for taking the time to read this.
          Well, the useful and somewhat nonobvious part of the sieve of Eratosthenes is the fact that you only need to check divisibility by primes up to the square root. So it's very easy to tell whether a number less than 100 is prime: you only need to see whether 2, 3, 5, 7 divide it.
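          The square-root bound described here is a one-liner to exploit; a minimal trial-division sketch in Python (the helper name is mine, not from the thread):

```python
def is_prime(n):
    """Trial division, testing divisors only up to the square root of n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:   # past sqrt(n), any factor would have a cofactor below it
        if n % d == 0:
            return False
        d += 1
    return True

# For n < 100, this never tests a divisor larger than 9.
print([n for n in range(90, 100) if is_prime(n)])  # [97]
```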

          One interesting and easy-to-prove thing to say about Mersenne and Fermat numbers is the following: let a,b be integers greater than 1, then a^b-1 is composite if it is not a Mersenne number, and a^b+1 is composite if it is not a Ferm

  • Hmmm (Score:3, Funny)

    by Anonymous Coward on Tuesday May 10, 2005 @04:47PM (#12492345)
    Slashdot is a prime factor in how much time I waste day to day.
  • 55 CPU years (Score:4, Insightful)

    by Inkieminstrel (812132) on Tuesday May 10, 2005 @04:49PM (#12492368) Homepage
    The article says it took 55 CPU years to factor the number, though they did it in parallel for about a year and a half. I'd hate to imagine the teams that we don't hear about who are, say, 30 CPU years into the problem who just found out it's already been done.
  • by neiffer (698776) on Tuesday May 10, 2005 @04:51PM (#12492394) Homepage
    I tried to do it on my TI-85 and I keep getting an error!
    • I tried to do it on my TI-85 and I keep getting an error!

      Turn your calculator upside down on the step just before the error.

    • Try it on a TI-89. It will do it, in theory. TI themselves admit, however, that it may take longer than the life of the calculator, the user, and/or the planet to come up with the answer.
  • by winkydink (650484) * <sv.dude@gmail.com> on Tuesday May 10, 2005 @04:52PM (#12492404) Homepage Journal
    May 9, 2005: The two unique prime factors of a 200-digit number have been discovered by researchers at Bonn University.

    Note: we need a source on this. All we have now is an anonymous edit on Wikipedia from someone at Cal State Fullerton.


    An anonymous edit in Wikipedia. Now there's a source for you!
    • Well, it is pretty easy to test--just multiply the two factors. It's a bit hard to fake factors.
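      Indeed; with the published factors (quoted later in this thread), the check is a single multiplication in any bignum-capable language, e.g. Python:

```python
# The two announced RSA-200 factors, as quoted in the thread.
p = int("35324619344027701212726049781984643686711974001976"
        "25023649303468776121253679423200058547956528088349")
q = int("79258699544783330333470858414800596877379758573642"
        "19960734330341455767872818152135381409304740185467")
n = p * q
print(len(str(n)))  # 200 decimal digits, as advertised
```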
      • That's the beauty of math. It doesn't matter who you are, or what sources you have. If you're right, you're right.
      • Indeed, factoring is in the class of problems that are seemingly hard to do (super-polynomial time on the best general algorithm known) but easy to check (polynomial time). The classic problems of this form are called NP-Hard [wolfram.com], and many are NP-Complete [wolfram.com]. Factoring has not yet been proved NP-Hard or NP-Complete, but it is assumed to be hard, and that is the basic assumption of RSA public-key cryptography. This result does not change that, it just encourages us to boost our key sizes if we haven't lately.

        And, using per [perl.org]

    • hey, he points out that these are "unique prime factors." Clearly a knowledgeable source, distinguishing between unique primes and those cheap redundant non-unique prime factors . . .

      :)

      hawk

  • Waste of time! (Score:5, Insightful)

    by Tom7 (102298) on Tuesday May 10, 2005 @04:52PM (#12492411) Homepage Journal
    I think these RSA challenges are a waste of computer power. It's trivial to compute how much computing resources it will take to factor numbers using an existing algorithm on paper, and you get a more accurate estimate than you get from sampling experimentally. I'm all for the development of new, faster algorithms and implementations, but the challenge should be for the development and demonstration of these algorithms, not the brute-force months-on-end search that ensues.
    • Re:Waste of time! (Score:5, Insightful)

      by Anonymous Coward on Tuesday May 10, 2005 @05:03PM (#12492515)
      If someone claims to have found a better factoring method, then it's easier for RSA to check that p,q < n and p*q = n, than to read his algorithm and analysis and award a prize based on that. (Imagine how many crackpots would apply with 100 pages long "proofs").
    • Re:Waste of time! (Score:1, Insightful)

      by Anonymous Coward
      Except that the majority of the factoring algorithms aren't just an algorithm. They change greatly with various parameters used to start, seed or otherwise define the algorithm.

      Add to that little tricks such as using multiple algorithms for different parts of the solution area [because some algorithms work better under different conditions] and even the "paper" estimate becomes hazy.

      That's ignoring the advances in computing processing, communication, and programming which have large practical effects on t
    • Re:Waste of time! (Score:5, Insightful)

      by Vellmont (569020) on Tuesday May 10, 2005 @05:04PM (#12492529)

      It's trivial to compute how much computing resources it will take to factor numbers using an existing algorithm on paper, and you get a more accurate estimate than you get from sampling experimentally

      You can certainly make a decent estimate of how long it will take, but you're never going to get a close approximation of the real-world performance of your implementation until you actually write the code and run it.

      The other side is that theoretical calculations are nice, but there's nothing quite like actual verification. It's much easier for a programmer to justify using larger key lengths when someone has actually cracked smaller key lengths rather than using calculations based on estimates of computing power.
      • Re:Waste of time! (Score:3, Insightful)

        by Tom7 (102298)
        I said it's worthwhile to make good implementations. I think we should do this. Then, based on our understanding of the code's behavior (and probably some short-lived experiments), we can extrapolate to find the expected time to completion. This is also better for randomized algorithms than actually running them, since by running them we only get one point randomly sampled from the probability distribution. They clearly knew that the experiment would take approximately 1.5 years to run, otherwise they wouldn't

        • What we should not do is, once we figure out how long something is going to take, to actually run it if the answer is totally pointless. This last step is a waste of time.

          A waste of whose time? The computer's time? They only used an Opteron for the sieving. It's never stated how many computers were used for the rest of the cracking. Once you've written the code, it's not much harder to actually perform the experiment. Like I said, actual cracked keys are far easier to justify to a programmer than theo
          • Re:Waste of time! (Score:3, Insightful)

            by Tom7 (102298)
            It took them a year and a half of computer time on, I believe, a cluster. I am arguing that that computational resource was wasted (plus the cost of powering and maintaining the cluster, etc.).

            Like I said, actual cracked keys are far easier to justify to a programmer than theoretical calculations. Actual cracked keys can be trusted 100%. Calculations of performance from unknown researchers can be trusted much less than that.

            I strongly disagree. A theoretical analysis is better because you can prove that
    • But if there are no real-world-implementations of the algorithms, what good are they?
      Also, and more importantly, these challenges show just how good public-key cryptography is, and what is technically feasible to break.
      • I guess I didn't make myself clear. (clarification post [slashdot.org]) I whole-heartedly endorse the implementation of algorithms. But once we know it's going to take 1.5 years to run, we shouldn't bother actually running them for 1.5 years!

        these challenges show just how good public-key cryptography is, and what is technically feasible to break.

        We can know that by paper-and-pencil calculations, once we know how fast our implementation is. And we can know it 1.5 years sooner!
    • Of course it is a waste of time. That is the exact goal of such a challenge. It is good PR for RSA if it takes a long time; they want to show the world their encryption can only be broken using LOTS of computer power. If someone finds a new FAST algorithm, obviously he will win the contest (in spectacular time, because it is fast) and RSA will need to think of better encryption.
    • Yeah, all that computing power could be used on better things.... like posting on /. Or Quake 3. Or finishing Duke Nukem Forever! Don't they know there are hungry children all over the world! Won't someone please think of the children?!
  • by wcitech (798381) on Tuesday May 10, 2005 @04:55PM (#12492434)
    The Air Force has a practical use for this discovery, because when these numbers are fed into an infinite improbability drive, oncoming surface-to-air missiles will be changed into a sperm whale and a harmless bowl of petunias.
  • Now that's what I call "long division."
  • by GillBates0 (664202) on Tuesday May 10, 2005 @04:57PM (#12492460) Homepage Journal
    My plans for world domination have been foiled.

    $ factor --version
    factor (GNU sh-utils) 2.0.11

    $ factor 10000000000000000000
    10000000000000000000: 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5

    $ factor 100000000000000000000
    factor: `100000000000000000000' is not a valid positive integer
    Try `factor --help' for more information.

    On a positive note, I was short only by 179 digits.
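    That vintage of GNU factor was limited to machine-word integers; a Python stand-in with no size ceiling is easy, though trial division is only practical when the factors are small (hypothetical code, not what coreutils does internally):

```python
def factor(n):
    """Prime factorization by trial division; fine for smooth numbers,
    hopeless for a 200-digit RSA modulus."""
    out = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            out.append(d)
            n //= d
        d += 1
    if n > 1:
        out.append(n)
    return out

print(factor(10 ** 20))  # twenty 2s followed by twenty 5s
```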
  • by G4from128k (686170) on Tuesday May 10, 2005 @05:02PM (#12492501)
    Factoring numbers looks harder than it is. At first glance, it looks like adding digits makes the factoring problem exponentially harder. The question is: what is the base of the exponent? A naive analysis suggests that adding one binary digit makes the number twice as big and thus makes the factoring problem twice as hard. Such analyses are where we get estimates proclaiming it will take a computer the life of the universe to factor an X-digit number.

    If adding one bit to the number makes the problem twice as hard, then the base of the exponent for the execution time is 2. But what if the base is not 2, but only 1.01? Then adding 200 bits to the number only makes the problem 7 times harder (1.01 ^ 200). The scary part is that we can't prove a lower limit on the base of the exponent for factoring. It could be 1.1, 1.01, 1.001, or 1.0001. This means that any crypto based on prime factors has an unknown vulnerability in it.

    For now, prime factoring is hard, tomorrow, it might not be.
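    The arithmetic in that claim is easy to reproduce (the bases here are illustrative guesses, not measured values):

```python
# Cost multiplier from adding 200 bits, for various per-bit growth factors.
for base in (2.0, 1.1, 1.01):
    print(base, base ** 200)
# 1.01 ** 200 is only about 7.3: a tiny base makes 200 extra bits cheap.
```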
    • so there's still a chance that 'Sneakers' might come true?
    • The scary part is that we can't prove that factoring has a lower limit to the base of the exponent. It could be 1.1, 1.01, or 1.001, or 1.0001. This means that any crypto based on prime factors has an unknown vulnerability in it.

      According to what you said, what you really mean is that it is possible that any crypto based on prime factorization has an unknown vulnerability. You can't say that for certain unless it is proven that there is no lower limit on the base of the exponent. (Again, I'm just using your p

    • Not to mention that the density of prime numbers decreases as the numbers get larger (iirc the number of primes up to n is on the order of n/log(n)).

      Because of this, it is possible to reasonably factor numbers made of primes of up to 200-300 bits.
    • The base is superfluous. Factoring is approximately linear in the key size as a number, but said to be 'exponential' in the number of digits in the key. The algorithms that exist must search the vast majority of the keyspace. Because we use a binary number system, adding a single digit is said to increase the running time by a factor of 2, because the keyspace has doubled. The base has nothing to do with it.
      • The base is superfluous. Factoring is approximately linear in the key size as a number, but said to be 'exponential' in the number of digits in the key.

        Agreed. I was referring to the base of the exponent in the exponential formula for the factoring time. If the running time of the algorithm is t = A * B^N, then A is a speed constant (decreasing with Moore's law), N is the number of bits in the number, and B is the (perhaps misnamed) base of the exponent. For a brute-force algorithm, B = 2. For a better
    • "For now, prime factoring is hard, tomorrow, it might not be."

      It must be tomorrow here, because i can do prime factoring in a heartbeat.

    • But what if the base is not 2, but is only 1.01? Then, adding 200 bits to the number only makes the problem 7 times harder (1.01 ^ 200).

      Adding a bit always means base 2. Perhaps you meant to say "adding a digit"? Now apart from the basic problem that there is no such digit representing 1.01 (or .01), the difficulty of factoring is not changed by choosing a lower base. All your post is saying to us is that the numbers get bigger quickly in comparison to the number of digits added if these digits are

      • Funny - I was confused reading his post too - he said "adding 1 bit" and then added "10"....

        For a second there I felt pretty stupid...

        Wait... Still... Feeling it....
    • At first glance, it looks like adding digits makes the factoring problem exponentially harder. The question is: what is the base of the exponent.

      This is an interesting analysis, but unfortunately completely wrong. The thing is that the Number Field Sieve algorithm's complexity is sub-exponential in number length. (To be precise, it's O(exp(c*log(n)^(1/3)*loglog(n)^(2/3)+o(1))).)

      A naive analysis suggests that adding one binary digit makes the number twice as big and thus makes the factoring problem

        This is an interesting analysis, but unfortunately completely wrong. The thing is that the Number Field Sieve algorithm's complexity is sub-exponential in number length. (To be precise, it's O(exp(c*log(n)^(1/3)*loglog(n)^(2/3)+o(1))).)

        Thanks for the info! What is the value of c? Does it have a lower bound? This is what I am talking about. All of these t = O(???) equations have constants in them that may be lower than people think.

        That said, it doesn't seem that the factoring problem will becom
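        For the general number field sieve, the constant in that L-notation is believed to be c = (64/9)^(1/3) ≈ 1.923; plugging it in gives a feel for how the cost grows (a heuristic sketch only: the hidden constants and the o(1) term are ignored):

```python
import math

def gnfs_cost(bits, c=(64 / 9) ** (1 / 3)):
    """Heuristic GNFS work estimate exp(c * ln(n)^(1/3) * ln(ln(n))^(2/3))."""
    ln_n = bits * math.log(2)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Sub-exponential: doubling the modulus size multiplies the cost by far
# less than squaring it would.
print(gnfs_cost(1024) / gnfs_cost(512))
```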
  • Notation? (Score:3, Insightful)

    by man1ed (659888) on Tuesday May 10, 2005 @05:03PM (#12492520) Homepage Journal
    Does anyone know what the notation "11281+1" means?
    • Re:Notation? (Score:5, Informative)

      by iMaple (769378) on Tuesday May 10, 2005 @05:07PM (#12492553)
      Does anyone know what the notation "11281+1" means?

      It means 11282 :) .
      There seems to be a typo in the article post. (A typo on Slashdot... that's news. I mean, just one typo? That's cool.) It's probably due to some filter. It should say 11^281 + 1.
      • Re:Notation? (Score:3, Insightful)

        by petermgreen (876956)
        ^ is kind of a dirty-hack notation for where you can't superscript

        my guess is that someone copy-pasted it and in doing so lost the superscript (it should be noted that Slashdot doesn't allow superscripting, at least in comments)
        • ^ is kinda a dirty hack notation where you can't superscript

          If anything is dirty about using it as the power operator, it's that the usage comes from the BASIC programming language. For straight ASCII use (no superscripts), one could instead use ** from FORTRAN, but that's less well known.
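          Python inherited ** from that lineage as well, and since its integers are unbounded the number under discussion is easy to inspect (a quick sanity check of the corrected notation):

```python
n = 11 ** 281 + 1
# 11^281 has about 281 * log10(11) ≈ 292.6, i.e. 293, decimal digits.
print(len(str(n)))  # 293
```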
      • Aww, really? I thought it meant 1^1281 + 1. I was like "wtf 2 is a prime number, and that takes me all of no time to calculate."
    • I think it means [2^(11281)] + 1. But I could be wrong.
    • it's 11^281+1 (Score:1, Informative)

      by Anonymous Coward
  • I guess using integer factoring algorithms is out of vogue these days. I wonder how well A* works for factoring.
  • by fxer (84757) on Tuesday May 10, 2005 @05:30PM (#12492796)
    ...the punchline is

    3,532,461,934,402,770,121,272,604,978,198,464,368,671,197,400,197,625,023,649,303,468,776,121,253,679,423,200,058,547,956,528,088,349
    and
    7,925,869,954,478,333,033,347,085,841,480,059,687,737,975,857,364,219,960,734,330,341,455,767,872,818,152,135,381,409,304,740,185,467

    tip your waitresses! :)
  • Now I can use that number in my encryption! ..oh wait...nevermind.
  • trying to crack RSA challenges:
    1. those with a university's budget and hardware and what might be called an academic interest or mild economic incentive. I'd put hackers and organized criminals in the same category as far as budget and power available. This crowd took 14 years to crack a 200-digit public key
    2. the NSA and its equivalents in the UK, Russia, and China [maybe Israel? they have some good academic papers on highly parallel factoring methods]. They had far greater incentive and, in the NSA's case, far greater reso
    • by MoonBuggy (611105) on Tuesday May 10, 2005 @06:07PM (#12493162) Journal
      Well, the fact that the researchers in group 1 stated that the factorising took 55 CPU years (based on a 2.2GHz Opteron) pretty much sorts things out for the others. We can realistically assume that anyone with a few-million-dollar reason could devote 100+ CPUs to this, so basically you have to hope that your data will be outdated in 6 months or so.

      Alternatively if you take advantage of Sun's rent-a-cluster for $1/CPU hour you'd get change from $500,000 and get your results faster too, but then you have to pay again for the next problem that needs cracking, so it's probably more economical to purchase a smaller cluster.
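      The back-of-envelope arithmetic behind those figures works out roughly like this (the 100-CPU cluster and the $1/CPU-hour rate are the commenter's assumptions):

```python
cpu_years = 55                       # reported effort on a 2.2 GHz Opteron
wall_months = cpu_years / 100 * 12   # ~6.6 months of wall-clock time on 100 CPUs
grid_cost = cpu_years * 365 * 24     # CPU-hours at $1 each: $481,800
print(wall_months, grid_cost)
```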
  • ... people who miss the point would be saying "of course there are collisions in hashes". Well, now I'm going to say the obvious in the same tone.

    Of course there are factors in a composite number. Nothing new to see here. Move along.
  • FWIW: ANY factorization of a number into two primes is unique, as any number is uniquely factored into primes.

    For a second source, see Mathworld [wolfram.com].
  • Real Discovery! (Score:5, Interesting)

    by kenp2002 (545495) on Tuesday May 10, 2005 @06:20PM (#12493264) Homepage Journal
    I have found a way to crack any code in a matter of minutes. It's simple!!! It works plenty of times!!

    Find out where the subject lives that encrypted the data. (1-3 days)

    Break into their home. (10 minutes)

    Look under their keyboard (1 minute)

    Read their private and public key off the notecard taped under the keyboard. (2 minutes.)
    Optionally: Steal the notecard and leave a fake one with the wrong key written down.
    Laugh maniacally... Done!!!

    To date, when doing security sweeps at my various clients' sites, 80% of staff had their password somewhere in their cube, 50% had their PGP keys under the keyboard, and 10% had pen drives marked "Passwords" hanging off a thumb tack on their cube wall. Who cares about better encryption; physical security (or perhaps mental security is a better choice) is where we need to focus.

    And remember, network admins! Have your users spayed or neutered :)
    • This might not work. According to The Music of the Primes, Rivest (the 'R' in RSA) threw away the prime factors of one of the RSA challenge numbers.
  • Verify yourself! (Score:4, Informative)

    by graf0z (464763) on Tuesday May 10, 2005 @07:12PM (#12493711)
    To verify the factorization [crypto-world.com] just type

    echo "35324619344027701212726049781984643686711974001976\
25023649303468776121253679423200058547956528088349 *\
79258699544783330333470858414800596877379758573642\
19960734330341455767872818152135381409304740185467" | bc

    You should get RSA-200 [wikipedia.org].

    Btw: the previous world record was not 11^281+1 itself (which obviously has more than 281 decimal digits), but a 176-digit factor of it called "c176":

    echo "8428398995380842661984668205419427509438600\
    88703946121840940131686719691460399191375953 *\
    11981208699381274324213719517435209389491006\
    236671100986363096780488054684807819312870741" | bc
    /graf0z.
  • by Kozar_The_Malignant (738483) on Tuesday May 10, 2005 @07:59PM (#12494084)
    The time to do it once may or may not be meaningful. Do it six more times and average the results.
  • I'd prefer a more useful unit of measurement. How long is it in bits?
    • There's a useful table containing both measurements for all the RSA numbers (old and new) here [wikipedia.org].

      RSA-200 is 663 bits long. It's interesting to contrast it with RSA-640 (640 bits long). RSA-640 is shorter, so it should be easier to factor. And unlike RSA-200, RSA-640 carries a cash prize of US$20,000 for its factorisation. So a puzzling question is why the team took on RSA-200 rather than RSA-640.
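      The digits-to-bits conversion is just a log factor, which is also why the two naming schemes (decimal digits for the old challenge numbers, bits for the new ones) are so easy to mix up (a quick sanity check):

```python
import math

digits = 200
approx_bits = digits * math.log2(10)  # ~664.4; RSA-200 is actually 663 bits
print(approx_bits)
```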
