Encryption

Blazing Fast Password Recovery With New ATI Cards

An anonymous reader writes "ElcomSoft accelerates the recovery of Wi-Fi passwords and password-protected iPhone and iPod backups by using ATI video cards. Support for ATI Radeon 5000 series video accelerators allows ElcomSoft to perform password recovery up to 20 times faster compared to Intel's top-of-the-line quad-core CPUs, and up to two times faster compared to enterprise-level NVIDIA Tesla solutions. Benchmarks performed by ElcomSoft demonstrate that ATI Radeon HD5970 accelerated password recovery works up to 20 times faster than the Core i7-960, Intel's current top-of-the-line CPU."
This discussion has been archived. No new comments can be posted.
  • by ShadowRangerRIT ( 1301549 ) on Tuesday March 16, 2010 @10:02AM (#31495872)
    This isn't really about GPUs, it's an advert for ElcomSoft products. The whole summary is in marketing-speak for crying out loud.
    • by ShadowRangerRIT ( 1301549 ) on Tuesday March 16, 2010 @10:03AM (#31495916)
      And for the curious, TFA is no better. They're calling it a benchmark so they can advertise more effectively, that's all.
      • Re: (Score:2, Insightful)

        ... The whole summary is in marketing-speak for crying out loud.

        And for the curious, TFA is no better. They're calling it a benchmark so they can advertise more effectively ...

        You must be new here.

    • Re: (Score:3, Interesting)

      by Sir_Sri ( 199544 )

And a bit of an underhanded advert for ATI. 'Password recovery' is an inherently parallel problem that really likes the sort of math GPUs do, and not so much the sort CPUs do. The ATI 5000 series are the fastest GPUs available at retail right now; it doesn't take a genius to put 2 and 2 together here. Anyone who knows anything about NVIDIA's workstation parts knows they are not radical departures from their current retail chips, so saying your new fancy retail part is twice as fast as the workstation v

Well, I've not RTFA, but if they can get double the performance of a Tesla system using much cheaper video cards (as I recall Tesla is expensive, which isn't saying much ~ I refuse to google if I won't RTFA), isn't that something to talk about?

      BAH, now you've got me bothered to RTFA... guess I should go do work instead?

    • Re: (Score:3, Funny)

      by jank1887 ( 815982 )

      come on. It CLEARLY states that "An anonymous reader" wrote that summary.

    • by Lord Ender ( 156273 ) on Tuesday March 16, 2010 @01:57PM (#31499672) Homepage

      As an IT security guy, I found this to be informative, actually. When analyzing the security of a system or organization, I need to know not just what is theoretically possible, but what can be done with already-existing software and hardware.

      This article gives me some idea as to what attacks are currently practical (and for what key lengths).

      When research or engineering achievements come from the commercial (rather than academic) sector, it isn't really reasonable to expect an academic tone. They're tooting their own horn, but they are doing it about something important.

    • Re: (Score:3, Insightful)

      by node 3 ( 115640 )

Having skimmed TFA (actually, TF Press Release), it doesn't sound like there's anything really interesting here other than that GPUs are faster at parallel calculations than CPUs. This is already known.

      Cracking WPA and iPod/iPhone backups is still not a feasible task. Instead of 20 billion years (or whatever), it'll now only take 1 billion? Saying "20 times faster" makes it sound like you can already reliably crack these things, and now instead of a few hours, it's only a few minutes. But unless I missed it (and

  • Portrayal (Score:5, Insightful)

    by Dan East ( 318230 ) on Tuesday March 16, 2010 @10:04AM (#31495918) Journal

    I like the way this is portrayed in a totally positive light, as if a person, upon forgetting the password to their device, is going to go out and buy one of these video cards, install it in a machine capable of supporting it (PSU wattage, bus speed, OS, etc), purchase the proprietary "password breaker" software (sold by the company that authored this "story"), all just to recover their password. I think the typical usage for this type of setup is of a more nefarious sort.

    • Re: (Score:3, Interesting)

      by mcgrew ( 92797 ) *

      You remember that Elcomsoft was the company Dmitry Skylarof was (is?) with? He's the guy who got thrown in a US jail for something he did in Russia that was completely legal in Russia.

      • Re: (Score:2, Redundant)

        by jonbryce ( 703250 )

        No, the US jury found him not guilty.

        • Re:Portrayal (Score:4, Informative)

          by ElectricTurtle ( 1171201 ) on Tuesday March 16, 2010 @10:23AM (#31496218)
          Being found not guilty does not mean he didn't spend time in jail. Not everybody is released on their own recognizance pending trials.
          • No, but short of capital crimes, or high risk flight, bail is usually an option.
            • Re: (Score:3, Informative)

              Foreign nationals such as Dmitry Skylarof are usually classified 'high risk of flight' because they are expected to run back to their country if given half a chance, so, yeah, not out of the ordinary.
            • Re:Portrayal (Score:4, Insightful)

              by Rene S. Hollan ( 1943 ) on Tuesday March 16, 2010 @05:01PM (#31502072)

              Try posting bail when no one else has access to your money or collateral and no one is willing to advance you a loan for that purpose. You first have to get to your lawyer (assuming you have one, and not a public defender who won't give a crap), have him draw up (or use a boilerplate) power of attorney form so s/he can access your funds, have a notary witness your signature at the jail (often not possible since the only physical (non-video) visitor you can have is your lawyer), and take that to your bank during business hours.

              A debit/credit card might work, and you might indeed have it on your person when you are arrested. But, it will be safely stored with your personal possessions, and not provided to anyone other than upon filing in a release form, that your jailer may not approve (generally the deputy overseeing the jail module where you are held). Have you got your debit/credit card number memorized? The expiration date? The code on the back?

              Things that can take a few minutes over the phone can take many days when one is in jail.

        • Re:Portrayal (Score:4, Informative)

          by russotto ( 537200 ) on Tuesday March 16, 2010 @10:45AM (#31496572) Journal

          No, the US jury found him not guilty.

          No, the charges against Sklyarov were dropped and he was released as part of a deal in which Elcomsoft agreed to accept US jurisdiction. The US jury then found Elcomsoft not guilty.

        • by mcgrew ( 92797 ) *

          Yeah, after spending four months in jail. Lot of good it does you to be found not guilty when you're incarcerated anyway.

      • Dmitriy Skliarov is the more correct phonetic spelling. /. still does not accept UTF-8, it's retarded.

    • Yeah like selling one time password solutions to IT bosses when someone gets ahold of their SAM.....

    • Re: (Score:2, Troll)

      by wvmarle ( 1070040 )
      It all depends on your point of view.
      One man's "password recovery" is another man's "password cracking".
      Just like the same person being a "freedom fighter" and "terrorist/insurgent" at the same time.
      It all depends on your point of view.
      • by elrous0 ( 869638 ) *
I'll take the point of view of the 99.999% of people who buy (or more likely pirate) this software, and say that its primary use will be nefarious.
    • I tend to 'recover my password' for my wireless APs with my little friend the paper clip. You're absolutely right that the more common use of something like this is going to be cracking.
    • by Minwee ( 522556 )

      a person [...] is going to go out and buy one of these video cards, install it in a machine capable of supporting it (PSU wattage, bus speed, OS, etc), purchase the proprietary "password breaker" software (sold by the company that authored this "story"), all just to recover their password. I think the typical usage for this type of setup is of a more nefarious sort.

      I think you're right. Someone could use this kind of setup to play Crysis.

    • by Yvanhoe ( 564877 )
      Yeah, they may be people who just moved in and who want to have *gasp* internet access ! The vicious bastardly devils ! The thieves of non-thievable property ! the... the... TERRO-PIRATES !
      • Last time I checked, free unfettered internet access still isn't a god given right. If you want a service, you need to pay for it.
I like the way crowbar makers advertise their product in a positive light. As if anyone, upon realizing that they need to pull hundreds of deeply sunken nails, is going to go out to the store and buy a heavy 2 foot crowbar. I think the typical usage for crowbars like this is of a more nefarious sort.

  • GPUs (Score:5, Interesting)

    by Thyamine ( 531612 ) <thyamine@NoSPaM.ofdragons.com> on Tuesday March 16, 2010 @10:05AM (#31495946) Homepage Journal
    This isn't the first story about how crazy fast GPUs are for crunching. I know very little about that level of hardware, but why aren't we incorporating these types of things into CPUs? Is the coding/assembly so different that it doesn't translate? Do they only do certain kinds of processing really well (it is a GPU after all), so it couldn't handle other more 'mundane' OS needs?
    • Re: (Score:3, Informative)

      by godrik ( 1287354 )

      It is in progress in fact. That was the point of intel 80 cores prototype.

I find it funny that over time we keep going through the cycle: external processor -> co-processor -> integrated into the CPU die -> external processor

you mean the one that horribly failed and wasn't even close to performing as well as a graphics card?

        oh, right.

      • The effect is a long-standing and well documented [catb.org] observation about this industry. I guess Moore's Law is antithetical to satori.

      • Re:GPUs (Score:4, Interesting)

        by ShadowRangerRIT ( 1301549 ) on Tuesday March 16, 2010 @10:40AM (#31496442)
        That's not really the same thing. The Intel 80 core prototype was still a CPU at heart, they just made improvements to communication. GPUs are quite different. GPUs are designed as primarily floating point processors (though newer ones can do low precision integer math with similar efficiency), but more importantly, they are vector processors with virtually no support for conditional statements and optimized for sequential access to memory instead of random access. They're halfway between dedicated circuitry and a general purpose CPU; what they can do, they do *very* well, and they can generalize a little, but tasks they weren't designed for need to be rewritten to accommodate their quirks, and eventually reach a point of diminishing returns. Integrating GPUs into the CPU will allow more programs to use it (and possibly speed processing and enable new scenarios where the CPU and GPU need to communicate frequently), but for run of the mill computing tasks, the relatively inflexible design of GPUs is a problem.
        • by godrik ( 1287354 )

          well, It was not really a CPU at the heart. It was more a complex network of heterogeneous computing unit with classical CPUs but also DSPs, vector float processing units...

          It was of course a prototype and never reached the amrket, but merging CPU-type and GPU-type on the same chip seems definitely to be in Intel's and AMD's roadmap.
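ShadowRangerRIT's description of GPUs as vector processors with virtually no support for conditionals can be sketched with NumPy; this is a CPU-side analogy of the programming style, not actual GPU code, and the array values are arbitrary:

```python
import numpy as np

x = np.arange(8, dtype=np.float32)

# Scalar/CPU style: a data-dependent branch per element.
scalar = [xi * 2 if xi > 3 else xi + 100 for xi in x]

# Vector/GPU style: evaluate both branches across the whole vector,
# then select with a mask -- the way SIMD/GPU hardware typically
# replaces per-element branching.
vector = np.where(x > 3, x * 2, x + 100)

print(scalar)  # values 100..103, then 8, 10, 12, 14
print(vector)  # same values, computed without any per-element branch
```

Rewriting conditionals as select operations like this is exactly the kind of restructuring "tasks they weren't designed for" require.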

    • Re: (Score:3, Informative)

      by Anonymous Coward

GPUs are better at doing certain calculations generally, and are very good at parallel processing, seeing as graphics can be broken down to be processed in parallel very quickly. For this, GPUs have a ton of cores. So in a way processors are indeed starting to follow with multicore systems, but it is nowhere near the number GPUs use. High end GPUs now have 480+ processor cores on a card these days, that's a lot more than 4-core Intels ;). But if you had a ton of cores on the processor, each additional on

      • ...most things must be done linearly, not parallel.

        Or to be a bit more precise: Humans can't think about parallelism well. Certain obvious, discrete tasks can be split up, but having whole threads of execution constantly communicating and touching shared resources overwhelms the capacity of most programmers. You could write a massively multi-threaded program to do a lot of stuff that is currently done linearly, but you'd risk a whole lot of crashes and deadlocks from the inevitable bad code and you wouldn't get the full increase in speed since the thread an

    • Re: (Score:3, Interesting)

      by imgod2u ( 812837 )

To some degree, CPUs have been moving to be more GPU-like for a long time. SIMD (SSE, AltiVec, NEON) is a GPU-style feature that made its way to CPUs. Ditto for deep, parallel pipelines. Remember the Pentium 4? That was a huge step in the GPU direction.

      There are two problems with that approach:

1. Code that isn't pure number-crunching doesn't run well on such a compute model.
2. The model is almost entirely memory-starved. GPUs have up to a GB of high-speed, dedicated RAM on the card itself. CPUs have to live

As an aside, GPUs have up to 2GB [slashgear.com] and soon to be 4GB. The rest of what you said is dead on.

You are right though, the concept of a GPU/CPU hybrid seems to be a possible end result if the combination can be run successfully. I suspect there is a lot of very tough engineering involved in getting such a concept working.

        • by imgod2u ( 812837 )

The biggest problem will be heat. GPUs currently consume and dissipate upwards of 200W. Likewise for CPUs. Getting a single die or even package to consume and dissipate that much power and heat will be a challenge not just for the silicon designers but for the system guys as well.

Yes, GPUs are very different. They are designed to do a lot of very similar calculations on an extremely large set of vector data. That's also pretty much all they do; they aren't nearly as good at logic like a traditional CPU is.

    • The coding / assembly is so different that it doesn't translate, and they only do certain kinds of processing well.

    • Re: (Score:3, Informative)

      The last sentence nails it. They only do certain types of operations well, and the frequency with which I upgrade GPUs compared to CPUs - or more specifically, the fact that I very rarely replace both at the same time - leads me to believe I'm better off having them separate. Maybe there are parts of the GPU which could be incorporated into the CPU, and I think that might be what the Core i3/5/7 processors are doing with GMA integration.

      • Coming soon: the i3-387DX, i5-587DX and i7-787DX GPU co-processors, and motherboards with GPU coprocessor sockets next to the CPU.

        Happy days.

    • They tend to be specialised processors, designed specifically for graphics related tasks. Those tasks happen to be computationally very similar to other tasks such as protein folding. Though they will be poor performers or possibly totally incapable of certain tasks your CPU has to do.

That said, I'm waiting for the first CPU to build in a GPU so we don't even need a separate graphics chip on our motherboards any more for the already integrated graphics output.

      • What are you waiting for? The new Intel processors already have integrated on-die GPUs. The next generation will have the GPU and the CPU completely integrated.

        • You see how well I follow current hardware :)

          When buying a computer these days I go for the cheapest/slowest specced hardware which is way more than what I need (watch videos, troll /., e-mail, general browsing, some web/general programming, standard office work).

          And actually what I'm waiting for to buy a new box is for the old one to die. 5 year old hardware is still fast enough for pretty much everything that I do.

          The CPU speed problem is solved and done with for all but the most demanding applications

Slow down, my friend. ICs aren't created equal, and you can't just throw everything on one die. Processes can be vastly different and what is optimal for one IC isn't optimal for another. For example, you can't manufacture standard DRAM on a CPU process - RAM processes are vastly different as they need to create capacitors. WiFi needs specific RF circuitry and often requires external ICs with exotic processes for switching/etc (GaAs or silicon on sapphire). Bluetooth will probably still need some funky sili

          • OK so now the GPU is on the CPU die. The memory controller is there already, right? It should be a no-brainer to integrate small stuff like ethernet. Bluetooth/wifi may be a bit harder due to the necessary aerial. Now all that's left is to integrate the RAM on the die and we're there. A one-chip computer. No need for complex motherboards any more. At that moment the whole hardware issue is solved.

            We've had ICs like that for ages, mainly in portable devices which have everything and the kitchen sink in a single package. However, I'm not sure it would be ideal for a general-purpose PC to have everything in the CPU for a bunch of reasons, mainly how it would raise costs and give you less flexibility regarding peripherals. It seems more reasonable to have bluetooth, network, etc. in the motherboard or plug-in cards, so you don't have to replace your CPU when a newer version comes out (or when you use yo

    • Re:GPUs (Score:5, Informative)

      by SuperMog2002 ( 702837 ) on Tuesday March 16, 2010 @10:36AM (#31496372)

      Is the coding/assembly so different that it doesn't translate? Do they only do certain kinds of processing really well (it is a GPU after all), so it couldn't handle other more 'mundane' OS needs?

      Yes, exactly. CPUs are built from the ground up to do scalar math really, really fast. That lends itself well to doing tasks that must be performed in sequence, such as running an individual thread. However, they've only recently gained the ability to do more than one thing at a time (dual core processors), and even now high end CPUs can only do six calculations at once (6 core processors).

      Meanwhile, GPUs are built to do vector math really, really fast. They can't do individual adds anywhere near as fast as a CPU can, but they can do dozens of them at the same time.

      Which type of processor is best for which job depends entirely on the nature of the math involved and how parallelizable the task is. In the case of 3D graphics, drawing a frame involves tons of vector arithmetic work, which is why your 1 GHz GPU will run circles around your 3 GHz CPU for that task (and is also where the GPU gets its name from). In the case mentioned in the article, password cracking is highly parallelizable: you've gotta run 100 million tests, and the outcome of any one test has zero influence on the other tests, so the more you can run at the same time, the better. By running it on the GPU, each individual test will take a bit longer than running it on the CPU would, but you'll be able to run dozens simultaneously instead of just a few, and will thus get your results much faster.

CPUs certainly have their place, though. Some tasks simply must be done in sequence and cannot be easily divided up into separate parallel tasks. The CPU will get these done much faster, since running them on the GPU would incur the speed penalty without realizing any benefit.

      I've simplified it a bit for the sake of explanation, but that's the gist of it. Hope that helps!
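SuperMog2002's "100 million independent tests" point can be sketched as a toy in a few lines. This is purely illustrative: MD5, a tiny 3-letter keyspace, and a thread pool standing in for GPU cores; it is not ElcomSoft's method, and WPA key derivation is far slower per guess.

```python
import hashlib
from itertools import product
from multiprocessing.dummy import Pool  # thread pool standing in for GPU lanes

# Hypothetical toy target: the MD5 of a 3-letter lowercase "password".
target = hashlib.md5(b"gpu").hexdigest()

def try_candidate(chars):
    # Each test depends only on its own candidate, so the keyspace can be
    # split across as many workers (or GPU cores) as you have.
    guess = "".join(chars)
    return guess if hashlib.md5(guess.encode()).hexdigest() == target else None

alphabet = "abcdefghijklmnopqrstuvwxyz"
with Pool(4) as pool:
    hits = [h for h in pool.map(try_candidate, product(alphabet, repeat=3)) if h]

print(hits)  # ['gpu']
```

Since no test's outcome influences any other, doubling the workers roughly halves the wall-clock time, which is why the problem maps so well onto hundreds of GPU cores.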

    • by 0123456 ( 636235 )

      I know very little about that level of hardware, but why aren't we incorporating these types of things into CPUs?

      Because most people don't want their CPU consuming 300W of power when idle?

It's all about IP. It wouldn't be horribly difficult to put a GPU and CPU on the same die. BUT, Intel doesn't GPU manufacturers getting into the x86 business, and GPU makers certainly aren't going to give Intel any of their technology and get cut out of the market. Intel's attempts at GPUs have been less than spectacular. Good enough for Word and Excel. Not good at modern gaming.

So for the foreseeable future CPUs and GPUs will be treated as separate entities.

      • Intel doesn't GPU manufacturers getting into the x86 business

        I think you accidentally a word.

        Also, how can Intel prevent GPU manufacturers from getting into the x86 business, if AMD is already in both?

    • 3D rendering involves lots of integer math, and there are huge portions of any given render that do not depend on each other. For example, the scene may involve calculating the vectors from thousands of vertices and faces of polygons towards hundreds of light sources. That is millions of operations that are essentially independent. Another phase of a render will require calculating the intersection of each view vector (and more if you use FSAA) with a polygon in the scene.

      So, modern GPUs are a special ca

  • I think we all know what they really mean. ;)

(Anyway, I'm also impressed by the power shown by the GPUs. It's a good demonstration that some of the new technologies (CULA? CUDA?) that allow "regular" programmers to use this power really will speed some things up.)

  • by Anonymous Coward on Tuesday March 16, 2010 @10:09AM (#31496002)

    Hey Editors,

    You forgot a link to the buying page [elcomsoft.com]
    For as low as 1.399,- € you can start cracking^Wrecovering passwords today.

    • Re: (Score:3, Informative)

      by cOldhandle ( 1555485 )
In case anyone wants to play around with this tech without paying (or rolling your own): I tried out this free (as in beer) Windows software yesterday: http://golubev.com/rargpu.htm [golubev.com] It seemed to work very effectively - I was able to brute-force 5-character lowercase-only passwords on RAR files in a couple of minutes on a GTX260. It also has some advanced options to specify mutations of strings to try, and to use word lists.
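The anecdote above is plausible back-of-envelope: a 5-letter lowercase keyspace is tiny, and "a couple of minutes" implies a test rate on the order of 10^5 per second (the two-minute figure below is an assumption read from the anecdote, not a measurement):

```python
from string import ascii_lowercase

# Size of the keyspace the parent brute-forced:
keyspace = len(ascii_lowercase) ** 5   # 26^5 five-letter lowercase passwords
print(keyspace)                        # 11881376

# "A couple of minutes" on a GTX260 then implies roughly this test rate
# (assuming ~2 minutes; RAR key derivation is deliberately slow, hence
# far below the raw hash rates a GPU can reach):
rate = keyspace / 120
print(f"~{rate:,.0f} tests/sec")       # ~99,011 tests/sec
```

Each extra character or a mixed-case/symbol alphabet multiplies that keyspace enormously, which is the real takeaway.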
    • Re: (Score:3, Informative)

      by elrous0 ( 869638 ) *
      Agreed, looks more like the kind of "story" we'd see posted by kdawson, not Taco.
  • Huh? (Score:2, Informative)

    Is this supposed to be a good thing? Sounds like someone's password encryption algorithm needs some upgrading to me.

  • Out of curiosity... (Score:2, Interesting)

    by Anonymous Coward

    I keep hearing stories about using GPUs for non-GPU computations, but has anybody here tried it?

    What does your screen look like while a program like this is running?

    • Re: (Score:3, Informative)

      by cbope ( 130292 )

Normal. Running GP-GPU or CUDA apps has no effect on output to the screen. We do it for medical image processing.

    • I keep hearing stories about using GPUs for non-GPU computations, but has anybody here tried it?

      Yes many people do it and have for years.

      What does your screen look like while a program like this is running?

Why do you assume that the screen looks different?

      • Re: (Score:2, Funny)

        by Anonymous Coward

        Good point. Why would I assume a graphics card operation would have any effect on graphics? I've only ever used mine to take ice off the windshield.

        • Except for the fact that the part doing these calculations has nothing to do with the parts of the GPU that are handling outputting to the screen?

Why do you assume that the screen looks different?

because when you run a CPU-intensive application your PC becomes really slow, so if you use your GPU instead, the screen should become "slower" too, but probably you wouldn't even notice

      • What does your screen look like while a program like this is running?

Why do you assume that the screen looks different?

        He is still running a Voodoo series add-on card that takes over the video output when it is in use?

    • Re: (Score:2, Informative)

      by Anonymous Coward

      The display buffer for a 1920x1200 screen with 24-bit colour takes less than 7MB. Even a fairly low-end graphics card will have at least 128MB of memory. In other words, there's plenty of memory for a program running on a GPU without needing to piss on the display buffer.

      If your screen is just displaying a bunch of 2D windows, then the 100s of cores in your GPU will be sitting idle. Again, computations running on the GPU will have no impact on what you see.
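The AC's framebuffer arithmetic checks out; a quick sanity check:

```python
# 1920x1200 at 24-bit colour (3 bytes per pixel) is indeed well under 7 MB.
width, height, bytes_per_pixel = 1920, 1200, 3
framebuffer = width * height * bytes_per_pixel
print(framebuffer)                       # 6912000 bytes
print(f"{framebuffer / 2**20:.2f} MiB")  # 6.59 MiB, vs 128+ MB on a low-end card
```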

      • Again, computations running on the GPU will have no impact on what you see.

        True, but if it did it'd be way cooler than the hourglass cursor! A program could use its calculation memory space as the display buffer for its own window while it runs. Auto-visualization!

      • Newer windowing systems no longer draw the screen as a single 2D object. This includes X (Compiz), OSX, and Windows.

    • Re: (Score:3, Informative)

      I run the Folding@home [stanford.edu] GPU client [stanford.edu] on my GeForce 8800 GTX. On Vista and later OSes (pre-Vista, the driver model wasn't well adapted to GPGPU and this leads to a polling driven communication scheme which is really inefficient), the effect on resources is unnoticeable aside from during games (where I kill the client to reduce jerkiness); the GPGPU work is lower priority and gets shunted aside from rendering, though the latency involved is a problem for graphics intensive games. For less demanding work and gene
    • There have been several documentaries about hacking [filmroster.com] over the years that demonstrate the use of GPU-based computations. It is soo bad.

    • Re: (Score:3, Funny)

      by Waffle Iron ( 339739 )

      What does your screen look like while a program like this is running?

      Well I haven't kept up with the latest developments, but if it's anything like the Sinclair ZX80 I'm posting from, the screen goes blank gray when you start actively computing. Then it returns to normal when the answer is ready.

  • boo (Score:5, Informative)

    by Anonymous Coward on Tuesday March 16, 2010 @10:16AM (#31496102)

    boo slashvertisement

  • by roman_mir ( 125474 ) on Tuesday March 16, 2010 @10:22AM (#31496192) Homepage Journal

That one ATI board gets 103K passwords per second versus only 4K on the latest quad-core Intel (which, by the way, is almost 26, not just 20, times faster.)

So that's wonderful. How many passwords are there in 1024-bit SSL encryption? 1024-bit asymmetric is equivalent to an 80-bit symmetric algorithm, so that's like 2^80 passwords, right?

    Let's say 100,000 passwords per second, that's 10^5.

Google says this: (2^80 / 10^5) / (3600 * 24 * 365 * 1000) = 383 347 863

That is, 383.3 million millennia (roughly 383 billion years) to go through every password in 2^80 possibilities.

In reality, of course, not every combination is used; many passwords can be eliminated by heuristics, and it also helps to have a good dictionary file handy from which to generate the most likely password combinations. That probably cuts the time down from that astronomical figure to something much more ATI friendly. Of course, we need to use a stronger cipher.
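The estimate can be double-checked in a couple of lines, using the 10^5 passwords-per-second rate quoted above for one board:

```python
# Back-of-envelope check of the 2^80 keyspace estimate, at the quoted
# ~100,000 password tests per second for a single HD5970.
keyspace = 2 ** 80
rate = 100_000
years = keyspace / rate / (3600 * 24 * 365)
print(f"{years:.3e} years")  # prints 3.833e+11 years
```

So even a 20x (or 26x) speedup leaves an 80-bit exhaustive search hopeless; only weak key derivation or weak passwords make these attacks practical.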

    As a final note: at last I understand why Hugh Jackman needed the 7 monitor setup, each one must have been used as an output device for the video card it was connected to. Obviously the video cards were the actual power behind all that hacking!

  • What is with the spin talk here in the title? Basically this just says I need to use better passwords. Speak truths....
  • Finally...someone who understands!

    I wanted to get one of those professional car door jimmy kits (the ones with a jimmy for just about every model of vehicle!) that tow truck supply vendors sell "just in case I get locked out of my car", but they had these outrageous demands that I "prove" that I was a legit tow outfit or garage.

    The locksmith supply was much the same way when I tried to buy a lockpick set, "just in case I get locked-out of my house".

    You can bet I'll be getting this software. I must've

  • From 1986 [aminet.net]:

    Executes the cellular automata game of LIFE in the blitter chip. Uses a 318 by 188 display and runs at 19.8 generations per second. Author: Tomas Rokicki

  • http://it.slashdot.org/story/09/01/15/1334222/GPUs-Used-To-Crack-WiFi-Passwords-Faster [slashdot.org]

    This seems to be an update of last year's story, just to mention that the HD5000 series is now supported, and it's faster on the newer, faster video cards.

  • What's the difference between "recovering a password" and hacking into a phone? Shouldn't the summary read "use GPU to break into stolen smart phones."
  • This is a blatant advertisement. Who's responsible for letting junk like this through? Has your account been hacked, CmdrTaco (or should we now call you CmdrSPAM)? It's bad enough stories are often duplicates and days/weeks old. This is just sh*tty spam.
