Factors Found in 200-Digit RSA Challenge 184
diodesign writes "The two unique prime factors of a 200-digit number have been discovered by researchers at Bonn University (Germany) and the CWI (Netherlands). The number is the largest integer yet factored with a general-purpose algorithm and was one of a series of such numbers issued as a challenge by the security company RSA Security in March 1991 in order to track the real-world difficulty of factoring such numbers, used in the public-key encryption algorithm RSA. RSA-200 beats the previous record number 11^281+1 (176 digits, factored on May 2nd, 2005), and RSA-576 (174 digits, factored on December 3rd, 2003)."
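For context on why these challenges matter: RSA's security rests on the gap between multiplying two primes (instant) and recovering them from the product. A minimal sketch with small, well-known primes standing in for RSA-200's two ~100-digit factors (naive trial division, hopeless at challenge sizes):

```python
def trial_factor(n):
    """Naive trial division: fine for toy numbers, useless at RSA sizes."""
    if n % 2 == 0:
        return 2, n // 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return n, 1  # n is prime

p, q = 104729, 1299709       # the 10,000th and 100,000th primes
n = p * q                    # easy direction: one multiplication
print(trial_factor(n))       # hard direction: tens of thousands of divisions
```

Real factoring records use the general number field sieve, not trial division, but the asymmetry is the same: growing the factors a few digits barely slows multiplication while making recovery astronomically harder.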
55 CPU years (Score:4, Insightful)
Re:55 CPU years (Score:5, Insightful)
This would mean that their algorithms or heuristics are superior, which would be beneficial to everyone, including the researchers who "won".
The good thing about research like this is that no one really loses.
Waste of time! (Score:5, Insightful)
Re:Waste of time! (Score:5, Insightful)
Re:Waste of time! (Score:1, Insightful)
Add to that little tricks such as using multiple algorithms for different parts of the solution space [because some algorithms work better under different conditions], and even the "paper" estimate becomes hazy.
That's ignoring the advances in processing power, communication, and programming, which have large practical effects on the actual implementation.
Notation? (Score:3, Insightful)
Re:Waste of time! (Score:5, Insightful)
It's trivial to compute on paper how much computing resources it will take to factor numbers using an existing algorithm, and you get a more accurate estimate than you get from sampling experimentally.
You can certainly make a decent estimate of how long it will take, but you're never going to get a close approximation of the real-world performance of your implementation until you actually write the code and run it.
The other side is that theoretical calculations are nice, but there's nothing quite like actual verification. It's much easier for a programmer to justify using larger key lengths when someone has actually cracked smaller key lengths rather than using calculations based on estimates of computing power.
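The "on paper" estimate the parent comments are debating usually means plugging the modulus size into the heuristic running-time formula for the general number field sieve, exp(((64/9)^(1/3) + o(1)) · (ln N)^(1/3) · (ln ln N)^(2/3)). A rough sketch (ignoring the o(1) term and every implementation constant, which is exactly why real runs still matter):

```python
import math

def gnfs_cost(digits):
    """Heuristic GNFS effort estimate L_N[1/3, (64/9)^(1/3)] for an
    N with the given number of decimal digits (o(1) term dropped)."""
    ln_n = digits * math.log(10)
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Relative effort: RSA-200 (200 digits) vs. RSA-576 (174 digits).
ratio = gnfs_cost(200) / gnfs_cost(174)
print(f"RSA-200 is roughly {ratio:.0f}x harder than RSA-576 by this heuristic")
```

The formula says nothing about sieving parameters, memory, or the linear-algebra step, so the paper estimate and the wall-clock time can diverge substantially, which is the point both sides of this thread are circling.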
Re:Notation? (Score:3, Insightful)
My guess is that someone copy-pasted it and in doing so lost the superscript (it should be noted that Slashdot doesn't allow superscripting, at least in comments).
Re:Waste of time! (Score:3, Insightful)
What we should not do, once we figure out how long something is going to take, is actually run it if the answer is totally pointless. That last step is a waste of time.
Re:Waste of time! (Score:3, Insightful)
Like I said, actual cracked keys are far easier to justify to a programmer than theoretical calculations. Actual cracked keys can be trusted 100%. Performance calculations from unknown researchers can be trusted much less than that.
I strongly disagree. A theoretical analysis is better because you can prove that it works for all cases (not just the one you experimentally verify), and because you get a more accurate picture for a probabilistic algorithm.