RC5-72 Clients Available on distributed.net

Yoda2 writes "From the distributed.net site... 'The RC5-72 project is now officially up and running, as of 03-Dec-2002! You will need to download a new client in order to participate. Our FAQ-O-matic has been updated with the beginnings of a new RC5-72 section.' Also, there is a $10,000 prize for the winner, but as with the other RC5 projects, the owner of the computer that finds the key does not get all of the money."
This discussion has been archived. No new comments can be posted.

  • I predict (Score:5, Funny)

    by Anonymous Coward on Tuesday December 03, 2002 @06:05PM (#4805254)
    That it will take 3 years, 2 months, 12 days, 4 hours, 17 minutes and 10 seconds to crack it.
  • Why bother? (Score:5, Insightful)

    by Anonymous Coward on Tuesday December 03, 2002 @06:09PM (#4805300)
    It's an outdated, unused cipher with a completely unused keysize. Do something useful, like protein folding or Golomb rulers. (Not SETI@Home, I said useful ;-)
    • WTF is a golomb ruler?
    • I find dnet's OGR info disclosure FAQ answer [distributed.net] to be far too ambiguous to even think of doing OGR processing for them. I get the feeling they're using the public for free research, so they can have a piece of information to sell to some government defense contractor...
      • Re:Why bother? (Score:3, Informative)

        ambiguous? in what way? They are merely covering their backs because there are certain technical difficulties in verification; hence, OGR-24 is not "completed" even though little work is handed out. Because of the branch search method they are using for the calculations, 2 stubs can be scored with different results! Thus, someone has to mathematically verify which one is correct. In this case, d.net is claiming they cannot be sure the current OGR is indeed the most optimized just because nothing has been returned saying otherwise.
    • (Not SETI@Home, I said useful ;-)

      Hey, don't knock SETI@Home--this is the first time it actually has a better chance of succeeding than the crypto challenge does!

      What would be bad-ass is if aliens came down from outer space with superpowerful quantum computers and totally schooled us in RC5-72 and SETI@Home at the same time!

    • I was more interested in the cost to the environment, so I did some research. What I found disturbed me enough to send a letter to the distributed.net people asking them to cease this pointless consumption of energy. What follows is a portion of that letter.

      -------

      Here's the executive summary: CPUs consume more electricity when actively computing than they do when idle. To solve the RC5-72 challenge may require an additional 2 million tons of coal be burned in order to produce the additional electricity required. That's over 200 full coal trains. 9.2 billion pounds of additional carbon dioxide will be produced and released. The details follow.

      I sent a letter to my buddies during a discussion of relaunching our team to attack the RC5-72 challenge. It showed a simplistic estimation of the energy costs required for me to participate in the challenge. I know that my CPU uses more energy to perform math calculations than it does to sit idle. It has since occurred to me that not only would I be burning an extra megawatt-hour or two of electricity over the course of the contest by participating, but so would all the other participants.

      I've researched things a bit more since then. The distributed.net speed page [distributed.net] shows an Athlon 1GHz Thunderbird averaging 3,540,087 keys/sec, or 12,744,313,200 keys/hour, during the RC5-64 contest. A hardware vendor's page [zalman.co.kr] shows an active Athlon 1GHz Thunderbird CPU consumes an extra 10 watts above its standby level. This is only the difference between an active CPU and an idle CPU, and does not account for any other standby power savings that may or may not take place. That means a 1GHz Athlon Thunderbird participating in the contest can either sit idle or test about 1,275 million keys at a cost of one additional watt-hour. Since the RC5-64 contest tested 15,769,938,165,961,326,592 keys, at this rate that is 12,368,578,953 additional watt-hours used. That means about 12 gigawatt-hours (GWh) of additional electrical energy were produced and consumed over the last four years just to solve the contest.

      This Los Alamos National Laboratory web page [lanl.gov] provided lots of data regarding coal and electrical generation. Referring to only the 1998 figures, I found that U.S. electric generators required 10,311 BTU to generate one kilowatt-hour. If the contest required 12 GWh of additional electricity, it must have taken about 123,732 million BTUs to generate it. Bituminous coal yields 24 million BTU per ton; sub-bituminous coal yields only 17 million BTU per ton. In 1998, the US was mining and burning about a 47%/53% mix, averaging out to about 20.5 million BTU/ton. Therefore about 6,036 tons of coal had to be burned in order to generate that much electricity. Over sixty railroad cars of coal. Looking at the CO2 problem, at the reported U.S. average of 208 lbs of CO2 produced per million BTU generated by burning coal, the contest was responsible for the production and emission of about 26 million pounds of carbon dioxide.

      When it comes to the RC5-72 contest the numbers get even worse, since according to the RC5-72 speed page [distributed.net] the number of keys per second drops to about 72% of the RC5-64 cracking speed for the Athlon 1GHz Thunderbird. Assuming that this 72% ratio is similar across most architectures, extrapolating the contest to RC5-72 should require about 2^8 (256) times as much of everything to solve, which at 72% efficiency works out to about 356 times the RC5-64 figures. 12 GWh * 356 is about 4.3 terawatt-hours. 6,036 tons * 356 is over 2 million tons of coal. More than 210 full trains. 26 million pounds * 356 is about 9.2 billion pounds of carbon dioxide that will be produced.

      Now, these numbers are pretty much long-range projections made from some small, narrow observations. Not every CPU will consume 10 additional watts when busy. And not every CPU would otherwise drop to an idle or standby state. But some computers will be left on and cracking keys rather than hibernating or being powered off, which could save 116 watts or more. And some may consume more than 10 extra watts when active, such as a Pentium III 667 MHz, which consumes 34 watts operating but only 5 watts when it can drop to standby. [jemai.or.jp]

      Also, only about 56% of our electricity is generated by burning coal: the rest is produced by nuclear power, or burning natural gas, fuel oil or biomass; about 10% is produced by renewable resources. The key could be found tomorrow, or it could be found 15 years from now. So my estimates are still just that: estimates. I could be wrong by orders of magnitude, but even so, the fact is that the RC5-72 contest is going to increase electricity consumption. Over the course of its life, the RC5-72 contest might be responsible for burning only 100 tons of coal, or it might cause the burn of 4 billion tons of coal.

      -------
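
      For anyone who wants to check the letter's numbers, here is the same arithmetic as a rough Python sketch. Every constant is a figure quoted in the letter (the key rate, the 10 W active-vs-idle delta, 10,311 BTU/kWh, the 20.5 million BTU/ton coal mix, 208 lbs of CO2 per million BTU), so the output is the same order-of-magnitude estimate, not an independent measurement.

      # Back-of-the-envelope version of the letter's arithmetic.
      # Every constant below is quoted from the letter; treat the output as an estimate.
      KEYS_PER_SEC = 3_540_087                    # Athlon 1GHz Thunderbird on RC5-64
      EXTRA_WATTS = 10                            # active-vs-idle CPU power delta
      RC5_64_KEYS = 15_769_938_165_961_326_592    # keys tested in the RC5-64 contest
      BTU_PER_KWH = 10_311                        # heat input per kWh generated (1998 US average)
      BTU_PER_TON = 20.5e6                        # blended bituminous/sub-bituminous coal
      LB_CO2_PER_MBTU = 208                       # lbs of CO2 per million BTU from coal

      keys_per_wh = KEYS_PER_SEC * 3600 / EXTRA_WATTS            # ~1,275 million keys per extra Wh
      extra_kwh = RC5_64_KEYS / keys_per_wh / 1000               # ~12.4 million kWh (~12 GWh)
      tons_coal = extra_kwh * BTU_PER_KWH / BTU_PER_TON          # ~6,000 tons of coal
      lbs_co2 = extra_kwh * BTU_PER_KWH / 1e6 * LB_CO2_PER_MBTU  # ~26 million lbs of CO2

      print(f"RC5-64: {extra_kwh/1e6:.1f} GWh, {tons_coal:,.0f} tons coal, {lbs_co2/1e6:.0f}M lbs CO2")

      # RC5-72: 2^8 = 256x the keyspace at ~72% of the RC5-64 keyrate => ~356x everything
      scale = 256 / 0.72
      print(f"RC5-72: {extra_kwh*scale/1e9:.1f} TWh, {tons_coal*scale/1e6:.1f}M tons coal, "
            f"{lbs_co2*scale/1e9:.1f}B lbs CO2")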

      And for those of you who are still reading and haven't been bored by all the numbers, I think it would have cost me about $850.00 worth of electricity to personally participate. The prize is $10,000, $1,000 of which goes to distributed.net, $8,000 goes to a charitable organization of distributed.net's choosing (the EFF, I think) and $1,000 goes to the person whose machine found the winning key.

      That's an $850 investment for a 1/165,000,000,000 chance of winning $1,000 in the next 10 years. That's discounting

      • rising electric costs
      • devaluation of the dollar due to inflation
      • the chances that RSA will still be in business and able to pay the $10,000 reward in 10 years.
      • I think my money would be MUCH safer invested in lottery tickets, where I've heard that investments pay out about $0.11 on the dollar (average.)

  • by Yoda2 ( 522522 ) on Tuesday December 03, 2002 @06:10PM (#4805307)
    if they randomly cracked it in a week?
  • by TerryAtWork ( 598364 ) <research@aceretail.com> on Tuesday December 03, 2002 @06:10PM (#4805310)
    We KNOW it'll take a lot of computers a long time to crack the code.

    These cycles would be a lot better spent on something constructive like the protein folding project.
  • Stats Page... (Score:5, Informative)

    by httpamphibio.us ( 579491 ) on Tuesday December 03, 2002 @06:11PM (#4805317)
    the current stats page doesn't seem to be linked from the main page anywhere... anyway, here's the link [distributed.net].
    • From bovine's .plan:

      Updates to various portions of our website will be continuing
      throughout the coming days. Stats for RC5-72 are not yet available,
      but all RC5-72 results submitted with v2.9001.477 or above will be
      reflected once they do come online.


      I guess they still are in development or something
    • by Decibel ( 5099 ) on Tuesday December 03, 2002 @06:21PM (#4805407) Journal
      Although that link does work, RC5-72 stats are not yet available; we're still working some bugs out.
      • I spent 3 months using 160 Sun workstations to crack keys on RC5-64 only to quit when the stats became "temporarily unavailable" (which IIRC, was at least a couple months!!!!) and my enjoyment of seeing how I fared against the rest of the participants disappeared.

        That was in early 1999, and I never returned.

        I don't give my cycles away anymore, but if I did, it'd be to something that would help advance science and have nothing to do with aliens or cryptography.

        siri
  • by bnenning ( 58349 ) on Tuesday December 03, 2002 @06:11PM (#4805318)
    RC5-64 took 4 years, and this has a keyspace that's 256 times larger. Even if we assume that computers are 4 times faster now than the average speed at which RC5-64 keys were processed, we're still looking at 256 years to completion. It doesn't seem like it makes any sense to start until computers are at least 20 times faster.
    • But do you think the average computer will stay at this speed for the next 256 years? :)
      At the keyrate of the final days of RC5-64, it would take even less time than the predicted duration of RC5-64 when that contest started (which used RC5-56 keyrates as a measurement). This is inaccurate since the cores had to be rewritten to support RC5-72, but the developers said they could get 99% of the speed of RC5-64 into RC5-72.
        At the keyrate of the final days of RC5-64, it would take even less time than the predicted duration of RC5-64 when that contest started (which used RC5-56 keyrates as a measurement).


        If I'm reading their charts [distributed.net] right, the rate at the end of RC5-64 was around 250 Gkeys/sec. That's roughly 2^38, so to search half the keyspace of RC5-72 at the same rate would take 2^33 seconds, or around 270 years. Until computers get a lot faster, any work done on RC5-72 will just be a drop in the very large bucket.

        • The interesting thing about brute forcing keys and Moore's law, is that if every 18 months processor speed doubles, you can accurately predict when the key will be broken.

          If right now we can do 2^38 keys/sec, or about 2^63 keys/year, in 18 months we'll be able to do 2^64 keys a year. 18 months later, 2^65. Still only a tiny fraction of the keyspace, but it gets bigger every 18-month period.

          From 2^63, or 1/512th of the keyspace, it takes 8 18-month periods to get to 2^71, meaning that in 12 years, we can cover half the keyspace in the course of a year.

          RC5-72 will be broken within 12 years. In theory.
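
          Here is that projection as a small Python sketch, assuming the ~2^38 keys/sec figure quoted above and a clean 18-month doubling, and ignoring growth in the number of participants; real keyrates won't be anywhere near this tidy.

          rate = 2.0 ** 38 * 365.25 * 24 * 3600    # keys per year at today's speed (~2^63)
          done, years = 0.0, 0.0
          while done < 2.0 ** 71:                  # half of the 2^72 RC5-72 keyspace
              done += rate * 1.5                   # keys tested over one 18-month period
              rate *= 2                            # Moore's law: speed doubles each period
              years += 1.5
          print(years)                             # -> 12.0 years
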
          • by Crag ( 18776 ) on Tuesday December 03, 2002 @08:01PM (#4806234)
            The principle of Optimum Slackitude points out that because of Moore's Law, the overall cost in time or money can be decreased by waiting to begin. If current numbers predict 12 years to exhaust the keyspace, and we wait 18 months to start, then that first 18 months' worth of effort will have to be made up at the end, but 12 years later computers will be 2^8 or 256 times faster. That first 18 months' worth of effort will only take 2-3 days to make up at the end of the project.

            I think that's probably what people object to about starting this project now instead of in a couple years.
    • by athakur999 ( 44340 ) on Tuesday December 03, 2002 @06:27PM (#4805460) Journal
      Especially since one of the goals of the project (from this page [distributed.net]) was to show that the US policy dictating the maximum keysize was out of date. That policy has since been changed and there is, AFAIK, no restriction on keylength anymore (but you still can't export to "bad" countries).

      The "Because it's fun" one is bizarre too. I'm sure it was fun writing the client and developing all the server side stuff. But if you just run the client in the background and get any excitement of that then you need to get out more ;)

      But, as always, it's their computers and if they want to run this contest more power to them.

    • by jerryasher ( 151512 ) on Tuesday December 03, 2002 @06:31PM (#4805493)
      If it's a distributed solution, don't you also have to consider the sheer numbers of processors participating? There are more folks participating in the project now than four years ago, and many of these folks have more computers.

      Five years from now, it may be that your house is participating, your cars are, as well perhaps as your shirts and underwear.

      In sixteen years, shortly before skynet takes over, the smart dust in your living room may decide to participate as well. (Most likely the dust will not participate, but will instead form themselves into a golem and try to kill you, but maybe...)
    • It doesn't seem like it makes any sense to start until computers are at least 20 times faster.

      Or you could just get 20 times as many people to run the client. There are LOTS of unused CPU cycles in the world. Probably 99.999% of all CPU cycles are doing nothing but spinning in main{} right now. Let's put 'em to work!
      • Or you could just get 20 times as many people to run the client. There are LOTS of unused CPU cycles in the world. Probably 99.999% of all CPU cycles are doing nothing but spinning in main{} right now. Let's put 'em to work!

        Or let's put 'em to sleep! I used to leave all my computers running all night just to crack RC5. I noticed a significant drop in my power bill when I started turning machines off. (Also, my laptop battery started lasting 2.5 hours instead of 40 minutes.)

        Maybe if they started paying for my cycles, I'd reconsider, but I'd still have to look at peak power prices first.

        Tim

        • As I think you know, you raise a valid question: the social good of the answers to these distributed processing questions (aliens-p, better drugs, better understanding of math) vs. the social good of better energy conservation.

          And then you also bring in the economic problems of understanding altruism (folks do pay the costs of participating in these low-payoff projects; why do they?).

          The problem of free riders. Jeez Tim, when we find the aliens one hour too late, just because you didn't turn your computer on, I hope you're the first one up against the wall.

          And then there's the problem of how much sense common sense makes: I think you want to look at average power prices, not peak.
    • Assuming that computers continue to get twice as fast every 1.5 years, and that we start with a set of computers, right now, that could crack RC5-64 in a year, then the RC5-72 project should be finished in 12 years.

      2^(t/1.5) = 256

      (t/1.5) = log_2 (256) = 8

      t = 8 * 1.5 = 12.

      • by Darkforge ( 28199 ) on Tuesday December 03, 2002 @08:45PM (#4806545) Homepage
        2^(t/1.5) = 256

        (t/1.5) = log_2 (256) = 8

        t = 8 * 1.5 = 12.

        Uhm, not quite. That's how long it will take before our machines are 256 times faster, which is a very different question. (It would be tempting to just multiply this number by 4, the number of years it took to solve RC-64, but that would merely tell us how long it would take the computers of 2014 to solve RC-72 [answer: 48 years].)

        You need a more nuanced answer that takes into account your exponential progress as you're ramping up to full speed.

        Let C be the Moore doubling time. Let P be the number of computations required to solve RC-64. Let X be the instantaneous speed at which you can solve problems, in units of P/year. So for t = 4

        1/2 x/C t^2 = 1

        so x = (C/8 years) P/year

        Given that, we can calculate t in this equation:

        1/16years^2 t^2 = 256

        t^2 = 4096 years^2

        t = 64 years

        • What the hell are you talking about? RC-72 requires 256 times as much computation as RC-64, so once computers are 256 times faster, the faster computers will be able to solve RC-72 in the same length of time that current computers could solve RC-64. The fact that the previous poster forgot to take into account that work can be done while you're ramping up to speed means that the correct answer should be smaller than the previous poster's answer, not larger.

          In fact, if we do as you did and let C be the Moore doubling time, let P be the number of computations to solve RC-64, let t be time in years, with 0 being now, let f(t) be the instantaneous computation speed in units of P/year, and let f0 be the time it would take to solve RC-64 now if computers remained at constant speed, then we have f(t) = (1/f0)*2^(t/C).

          Starting now, we can solve RC-72 in time t, such that the integral of f(x) from 0 to t is 256.

          The integration gives us C/(f0*ln(2))*(2^(t/C)-1) = 256.

          So, if we plug in C=1.5 and f0=1, then t=10.3 years.
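
          A quick numeric check of that closed form, using exactly the C, f0 and 256x figures from the post above (a sketch, nothing more):

          import math

          C, f0, target = 1.5, 1.0, 256.0           # doubling time, current RC5-64 time, RC5-72/RC5-64 work ratio

          # Solve (C / (f0 * ln 2)) * (2**(t/C) - 1) = target for t
          t = C * math.log2(target * f0 * math.log(2) / C + 1)
          print(round(t, 1))                        # -> 10.3 years

          # Crude numerical integration of f(x) = (1/f0) * 2**(x/C) as a sanity check
          dt, total, x = 0.001, 0.0, 0.0
          while total < target:
              total += (1.0 / f0) * 2 ** (x / C) * dt
              x += dt
          print(round(x, 1))                        # -> ~10.3 years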

    • And, of course, that was when distributed.net was really the only game in town when it came to the whole "turn idle CPUs into something productive" thing. The prize money was probably a good incentive too. :)
      Now of course, we have SETI@Home, the various protein folding projects, all stuff that many people would argue is a "better" use of time.
      Plus, and feel free to correct me if I'm wrong, wasn't the original point of the RC5 projects to show how weak limited-length keys were?
  • by jericho4.0 ( 565125 ) on Tuesday December 03, 2002 @06:13PM (#4805332)
    While you're at it, would you mind cracking this
    $1$cFtzhvlv$waP1EXtATPrxZYz1W/4kv1

    Ideally before the end of the semester, thanks.

  • Choices. (Score:3, Interesting)

    by Night0wl ( 251522 ) <iandow@@@gmail...com> on Tuesday December 03, 2002 @06:14PM (#4805341) Homepage Journal
    I'm torn as to whether or not I want to participate in this, or Folding@Home.
    I ran the RC5-64 Project for a long time. I liked it; in my year(s?) of participating I developed a habit of defending it, explaining it, and had grown to care for it.
    But when the end of RC5-64 came along I was left idle. I believe that some good can come of these distributed projects, but I've never made the effort to install F@H on my assorted boxen, my own little garden.
    I'm well versed in the cow though, and could be back on RC5 quickly...

    argh, choices, choices.
    • F@H? Come on! At least be supportive and run the United Devices client. F@H has issues, primarily with code stability. Structural chemists do not necessarily make good coders. :)
    • Choose both.

      Seriously. You can run both. I've been running d.net and f@h for a couple weeks now with nothing bad happening.

      Personally, I'm none too excited about RC5-72, because, in the end, we won't really know any more than we do now. But OGR-25 actually has some value, so I'm sticking with it and, for the moment, doing a few RC5's just for the heck of it.

      You won't climb the stats mountain as quickly doing both, but I'm not a stats hound exactly either.
    • This is like playing the lottery. If I can get a couple thousand out of my spare cpu cycles then I'm all for it. Plus, I don't have to go to 7-11 and buy a ticket.
  • Team Slashdot? (Score:1, Informative)

    Team Slashdot was 5th overall for RC5-64... maybe it's time to step it up a notch, except that there is no Slashdot.org team yet. :-/
    • I always wondered about this 'team' distributed computing.

      Wooo... I'm not doing shit, and I'm on a team of people not doing shit.

      And the rankings... If your team wins, that means either one of or both of the following:

      a) you have more computer than you need
      b) you don't use the computer you have

      Myself, I waste my idle cycles. Completely wasted. That brand new 2.53ghz P4 is sitting at home, powered on, doing absolutely NOTHING.
    • Someone form a team, and I'll donate 3 CPUs to the effort:

      P3 866
      P3 933
      P4 240

      - Jim
  • by Anonymous Coward
    Come on people -- why waste the incredible amount of processing time this is going to require? There are much better uses for cycles -- from cancer research to finding unique, undiscovered numbers (primes, etc.)

    At some point, there's just no point...
    • I agree. That is why I set my distributed.net client to work on the OGR-25 project exclusively.

      OGR-25 has some really cool potential and will be helpful to the world if solved. Basically it is a project to find the "Optimal Golomb Ruler of 25 marks" -- which means finding the shortest possible ruler with 25 marks on which no two pairs of marks measure the same distance. It's a number theory problem.

      For instance, if you have a ruler with marks at one inch and at three inches (plus the end at zero), you can measure 1", 2" and 3" distances using only those marks.

      The applications of this principle are numerous. One that comes to mind is the optimal antenna that has the least number of parts (aka smallest) that can transmit and receive the largest bandwidth (range). Very cool.
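
      For illustration only, here is the property being searched for as a few lines of Python; the real OGR-25 search is a much heavier branch-and-bound over "stubs", not a brute check like this:

      from itertools import combinations

      def is_golomb(marks):
          # A Golomb ruler: every pair of marks measures a different distance.
          dists = [b - a for a, b in combinations(sorted(marks), 2)]
          return len(dists) == len(set(dists))

      print(is_golomb([0, 1, 3]))      # True: measures 1, 2 and 3
      print(is_golomb([0, 1, 3, 7]))   # True: 1, 2, 3, 4, 6, 7
      print(is_golomb([0, 1, 3, 4]))   # False: 1-0 and 4-3 both measure 1
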
  • This is stupid (Score:2, Flamebait)

    by CvD ( 94050 )
    There are so many other uses for CPU cycles than this. I've looked at the site, and none of the reasons they are doing this is really worth the massive amounts of electricity that go into this BS: an idle CPU uses a lot less than a busy one.

    - To do something with all this computing power

    There are other interesting more useful things to do with computing power.

    - To prove that small-bitsize encryption is insufficient

    I think they got the message the first 2 times.

    - To explore the feasibility of cooperative networked multiprocessing

    You mean they're still not convinced after all those years?

    - Because it's fun

    Yeah, okay... I guess everyone has their little projects. :-) I'll give them that.

    - Because you can win money!

    Um... yeah... you can win money with the lottery too. This might give you a slightly higher chance, but you'll have to wait many years to find out if you've won or not.

    - To get to know more people

    You don't need to waste CPU cycles and electricity in this manner to meet people. Running an RC5 client is not necessary to use IRC. :-)

    Cheers,

    Costyn.
    • And it *will* shorten the life of your CPU, no matter what anyone says.

      You may or may not care if the machine sitting on your desk works in 5-10 years. Some upgrade and throw the old one out; some, like me, keep it around as a printserver/router, etc.

      But they should at least mention that keeping your CPU (and memory bandwidth, HDD) at close to 100% utilization constantly *WILL* shorten its life.
  • Let's get all those China residents to load it up... so what if their gov't will kill them for being a spy or something... but you must admit, all those people would give them great chances.
  • by jki ( 624756 ) on Tuesday December 03, 2002 @06:26PM (#4805452) Homepage
    as we ran the cyberian rc5-56 [cyberian.org] effort some years ago (the good guys of distributed.net beat us that time :)) - it was very fun and interesting. Ever since then we have every now and then been looking for something similar to do. However, neither I personally nor the rest of us saw much point in putting a competing effort on rc5-64 or rc5-72. Distributed.net puts in a great effort there :)

    Anyway, I think all of these efforts would benefit from some real competition. You can't believe how rewarding it was to race with distributed.net and the other efforts and to see who can develop best optimized code - for example.

    But to build that spirit of competition (without doing duplicate work) between the efforts, we would need some fresh and new (reasonable, interesting) idea for: what to crunch? Any ideas there? I am sure the guys at distributed.net and the multiple other efforts would love to see the same "fighting spirit" again as well :) And as a result, I believe everyone's code will be optimized much faster, new ideas will be created faster, and more people will be interested in joining... but: what to crunch, what would be really really interesting? :)

    • by Anonymous Coward
      The real question is, how do we get people like you talking with people who run projects like protein folding? I'm betting that if their project were properly managed (open sourced cores for all to improve, availability on many platforms for all the geeks who care about this sort of thing) and some hand optimizing work were done, then their processing rate would go up tenfold.

      I really hope to see this some day, because I will move from d.net to cure for cancer when I believe that the medical software is making as efficient use of my processor as d.net is now.
      • The real question is, how do we get people like you talking with people who run projects like protein folding?

        ...I think the key is to build a framework that feeds friendly but ruthless competition for the effort organizers as well. That's what seems to be missing today. :)

  • it appears..the spoon ran away.
  • Hmm... (Score:1, Redundant)

    Other than yet another encryption crack or useless mathematics, what else is there around to waste my server's cycles on? Preferably something useful, like cancer research or something. Oh, and SETI isn't considered to be useful IMHO.

    • Re:Hmm... (Score:2, Interesting)

      by greymond ( 539980 )
      Cancer Research - if you install the google bar you will be helping the google team with the folding@home project for simulating the folding of proteins.

      - somehow this is supposed to help with cancer, but I am no doctor and have no idea what protein folding means - I just draw pretty pictures all day, but this makes me feel warm and fuzzy inside after passing by the bums on the street.
    • Re:Hmm... (Score:3, Informative)

      by rhombic ( 140326 )
      Like this [ox.ac.uk] ?
      • Ah yes, the United Devices client for Cancer Research. I already run that one on my main (windows) PC, I'm not looking for something on my Linux server.

  • Also... (Score:5, Funny)

    by serlaten ( 619839 ) on Tuesday December 03, 2002 @06:41PM (#4805583)

    if the correct key is found by a P2 300 MHz laptop, floating around the Pacific on a small raft, before its batteries are empty, Taco Bell will give free tacos to all.

  • opening the page in Chimera gave me bloated font size. Doesn't happen on any other pages. Is this a strange bug in Chimera? Slashcode? Anyone else have this problem?
  • by EschewObfuscation ( 146674 ) on Tuesday December 03, 2002 @07:34PM (#4805990) Journal
    OK, granted that this project may be a waste of computing power (assuming that they're not going to be just sitting there wasting cycles anyway), but I saw a lot of people suggesting that users instead participate in the folding@home project. That got me to thinking...

    I'm not against folding@home, but I don't think that the number crunching approach to solving protein folding is ever really going to give us the breakthroughs we want. We need to address the issue of folding theoretically and find simpler behavioral theories with which to approach the problem. I know a lot of work is currently being done on the physics front with spin glasses and other complex systems models.

    The difference between these two approaches is the difference between the current encryption cracking projects, and a Sneakers-like approach to actually find a mathematical solution to the large number factoring problem.
    • You'd be right if the goal were just to find a folded state, but well, it's not :) In this case it's to simulate the actual folding process, matching the models (which they are working on) with experimental results.
      • > In this case it's to simulate the
        > actual folding process

        My understanding is that the folding model they're using is based on molecular kinetics. I agree that we need further investigation of the area, and it may absolutely be that physical chemistry is the only way to address the problem.

        But the similarity of the problem space to other complex systems would seem to indicate that there may be another way. All I'm saying is that as long as we are forced to address the problem with horsepower, as opposed to refining a theoretical approach (possibly a radical approach that does not come directly from the underlying molecular mechanics), the solutions are going to remain one-shot deals. There are just so *many* proteins we're interested in that a long-running @home type project just will not be able to address them all.

        If I'm incorrect, and this is the actual point of the research, I apologize... It just didn't look that way from their website (based on a recent, and cursory, examination).
  • by Anonymous Coward
    Quite frankly, I don't care about RC5. It's a useless project at this point, given how long it's taken to crack RC5-64. As far as I'm concerned if they want stuff done for them they can start cutting me checks for every block of keys done. Why bother with 72? Why not just go to say... 1024 to guarantee it'll take them an extremely long time to finish and justify their continued existence?
  • OGR stats shot up after RC5-64 was done...but I can't help but wonder what the graph is going to look like in a week.

    Here's a curious question/criticism...how come the OGR-24 project isn't completed yet? It doesn't look like there's really anyone working on it...and that should have been done long ago.

    And I'm running RC5-72 because it's winter up here and my apartment is cold. :)
  • Ahahahahahaha... from the stats page [distributed.net]:

    The odds are 1 in 5,137,904,802 that we will wrap this thing up in the next 24 hours. (This also means that we'll exhaust the keyspace in 5,137,904,802 days at yesterday's rate.)

    Kick ass! I can't wait for the year 14,078,453 so that I can spend my $10,000. I wonder what I'll be able to buy...
  • The new client will not play nicely with the buff-in.ogr and buff-out.ogr you've been using with previous clients. You can mail your buff-out.ogr as an attachment to flush@distributed.net, rather than just deleting it, or you can flush before you upgrade.

    I didn't find this in a cursory view of the readme's, but did find it in the FAQ-o-matic (and with a helpful comment from Jeff Palmer on the mail list).
  • by Insanity ( 26758 ) on Tuesday December 03, 2002 @08:45PM (#4806542)
    Someone needs to convince me why I should continue supporting dnet. I was with RC5-64 from January 1999 all the way to the end, and in that time, I've witnessed the entire organization stagnate and become irrelevant.

    OGR-24 started over two years ago, the distributed calculations were done in a matter of months, and yet we have no results returned for what should be an interesting project. Were previous methods of finding OGRs optimal? Did we find a new one, and if so, what is it? You'd think that with all the calculations done, two years would be enough time to process the results. In fact, you'd think that the results processing code would be written BEFORE using countless cpu-years on such a project. To me, this indicates a disturbing lack of professionalism, if not outright laziness. Or perhaps the entire project is flawed, and dnet wishes it would just go away. Why did we need RC5-72 pushed out the door in a matter of weeks when dnet already has OGR-25 running?

    Look at the frequency of .plan updates. There are stats and client features that have been talked about for years and not implemented.

    And instead of laying down the infrastructure, resolving old issues, and finishing previous projects, the organization hacks together an RC5-72 project as quickly and with as little work as possible, and launches it.

    It's getting very hard to take dnet seriously.

  • by mraymer ( 516227 ) <mraymer@nOsPaM.centurytel.net> on Tuesday December 03, 2002 @09:39PM (#4806844) Homepage Journal
    For r64, dnet provided very detailed stats of every part of the project. That alone makes it an excellent reason to join, as I'm sure the r72 stats page will be just as detailed when it goes up. I've never seen a distributed computing project that made it as fun to look at the fruits of your labor. Sure, I'll admit that cracking an old form of encryption isn't the most productive thing in the world, but hey, that doesn't mean it isn't FUN!!! :)

    Oh, lastly, I thought I'd mention I was one of the many people that submitted this story before it finally made it. I'm glad it was more than just me... :)

    • Agreed, the stats are what drives a lot of people. IMHO, dnet doesn't have the best stats though. If you're mainly doing it for the stats, there are other projects with much better stats.

    • LOL thanks for all the karma guys!

      NOTE: to increase performance on the client, be sure to manually select the core it uses! I've noticed that if you don't do this, it will automatically run a microbench to determine the fastest core EVERY time, and that will cause you to lose precious keys.

  • Okay, we know that it takes a long time to crack by brute force. Great. More interesting would have been a distributed genetic algorithm search for an arbitrary encryption key.

  • by Kris_J ( 10111 ) on Tuesday December 03, 2002 @10:59PM (#4807325) Homepage Journal
    What's the key rate for a Playstation 2? I'm thinking of getting the Linux kit, but if it can't manage at least 50% of the old server sitting next to me it's really not worth it.
  • it heated up my Dell Inspiron 8000 laptop so much over time that one of the cooling fans burned out. And I thought I'd help the process... what an ungrateful little cow that RC5 is...
  • Instead of brute-forcing cryptographic algorithms, how about wasting your free CPU cycles on pure math?

    GIMPS [mersenne.org] is the Great Internet Mersenne Prime Search - a distributed effort to check which numbers of the form 2^P - 1 are prime, for prime exponents P.

    See the project's web site for more details. With a bit of luck, this research will never be of any practical use, as pure math was intended to be...
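
    The core of that search is the Lucas-Lehmer primality test. Here is a toy Python version, only practical for tiny exponents; GIMPS runs the same recurrence on multi-million-digit numbers with heavily optimized FFT multiplication:

    def lucas_lehmer(p):
        # 2**p - 1 (p an odd prime) is prime iff s == 0 after p-2 iterations,
        # starting from s = 4 with s -> s*s - 2 (mod 2**p - 1).
        m = 2 ** p - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

    for p in [3, 5, 7, 11, 13, 17, 19]:
        print(p, 2 ** p - 1, lucas_lehmer(p))   # 2**11 - 1 = 2047 = 23 * 89, so it prints False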
