Blazing Fast Password Recovery With New ATI Cards 215
An anonymous reader writes "ElcomSoft accelerates the recovery of Wi-Fi passwords and password-protected iPhone and iPod backups by using ATI video cards. Support for ATI Radeon 5000-series video accelerators allows ElcomSoft to perform password recovery up to 20 times faster than Intel's top-of-the-line quad-core CPUs, and up to two times faster than enterprise-level NVIDIA Tesla solutions. Benchmarks performed by ElcomSoft demonstrate that password recovery accelerated by an ATI Radeon HD5970 runs up to 20 times faster than on a Core i7-960, Intel's current top-of-the-line CPU."
Stop with the advertising (Score:5, Interesting)
Re:Stop with the advertising (Score:4, Informative)
Re: (Score:2, Insightful)
... The whole summary is in marketing-speak for crying out loud.
And for the curious, TFA is no better. They're calling it a benchmark so they can advertise more effectively ...
You must be new here.
Re: (Score:3, Interesting)
And a bit of an underhanded advert for ATI. 'Password recovery' is an inherently parallel problem that really likes the sort of math GPUs do, and not so much the sort CPUs do. The ATI 5000 series are the fastest GPUs available at retail right now; it doesn't take a genius to put two and two together here. Anyone who knows anything about NVIDIA's workstation parts knows they are not radical departures from their current retail chips, so saying your fancy new retail part is twice as fast as the workstation version of the other guy's last-gen part is stating the obvious.
Re: (Score:2)
Anyone who knows anything about NVIDIA's workstation parts knows they are not radical departures from their current retail chips so saying your new fancy retail part is twice as fast as the workstation version of the other guys last gen part is stating the obvious.
I agree with what you're saying, but Nvidia's current GPUs are about two or three generations old.
They did a die shrink, but it's the same as their previous-generation chip. They've been
recycling chips with new part numbers while they fix the bump problems.
Doesn't Fermi get released like next week?
Re: (Score:2)
Well, I've not RTFA, but if they can get double the performance of a Tesla system using much cheaper video cards (as I recall Tesla is expensive, which isn't saying much; I refuse to google if I won't RTFA), isn't that something to talk about?
BAH, now you've got me bothered to RTFA... guess I should go do work instead?
Re: (Score:3, Funny)
come on. It CLEARLY states that "An anonymous reader" wrote that summary.
Re:Stop with the advertising (Score:4, Interesting)
As an IT security guy, I found this to be informative, actually. When analyzing the security of a system or organization, I need to know not just what is theoretically possible, but what can be done with already-existing software and hardware.
This article gives me some idea as to what attacks are currently practical (and for what key lengths).
When research or engineering achievements come from the commercial (rather than academic) sector, it isn't really reasonable to expect an academic tone. They're tooting their own horn, but they are doing it about something important.
Re: (Score:3, Insightful)
Having skimmed TFA (actually, TF Press Release), it doesn't sound like there's anything really interesting here other than that GPUs are faster at parallel calculations than CPUs. This is already known.
Cracking WPA and iPod/iPhone backups is still not a feasible task. Instead of 20 billion years (or whatever), it'll now only take 1 billion? Saying "20 times faster" makes it sound like you can already reliably crack these things, and now instead of a few hours, it's only a few minutes. But unless I missed it (and
Portrayal (Score:5, Insightful)
I like the way this is portrayed in a totally positive light, as if a person, upon forgetting the password to their device, is going to go out and buy one of these video cards, install it in a machine capable of supporting it (PSU wattage, bus speed, OS, etc), purchase the proprietary "password breaker" software (sold by the company that authored this "story"), all just to recover their password. I think the typical usage for this type of setup is of a more nefarious sort.
Re: (Score:3, Interesting)
You remember that Elcomsoft was the company Dmitry Sklyarov was (is?) with? He's the guy who got thrown in a US jail for something he did in Russia that was completely legal in Russia.
Re: (Score:2, Redundant)
No, the US jury found him not guilty.
Re:Portrayal (Score:4, Insightful)
Try posting bail when no one else has access to your money or collateral and no one is willing to advance you a loan for that purpose. You first have to get to your lawyer (assuming you have one, and not a public defender who won't give a crap), have him draw up (or use a boilerplate) power of attorney form so s/he can access your funds, have a notary witness your signature at the jail (often not possible since the only physical (non-video) visitor you can have is your lawyer), and take that to your bank during business hours.
A debit/credit card might work, and you might indeed have it on your person when you are arrested. But it will be safely stored with your personal possessions, and not provided to anyone except upon filing a release form, which your jailer may not approve (generally the deputy overseeing the jail module where you are held). Have you got your debit/credit card number memorized? The expiration date? The code on the back?
Things that can take a few minutes over the phone can take many days when one is in jail.
Re: (Score:3, Informative)
Bail bondsmen can't help you if you can't post collateral or pay the bond fee.
The problem isn't not having the resources to post bail. (Well, that is a problem, but a different one.) The problem is not being able to execute the steps to do so.
Re:Portrayal (Score:4, Informative)
No, the charges against Sklyarov were dropped and he was released as part of a deal in which Elcomsoft agreed to accept US jurisdiction. The US jury then found Elcomsoft not guilty.
Re:Portrayal (Score:5, Informative)
Dude, I was there. Defcon 9.
He didn't "enter a hostile country" unless you think the USA hates everybody and is hostile to all.
Dmitriy broke no US laws and broke no Russian laws. No US entity had complained about his activities before his arrest. He had every right to think he'd not be bothered.
But he angered a powerful and amoral US corporation named Adobe, so they had their government lackeys detain him. When Adobe took a horrible blog-beating and a nearly instantaneous sales hit, they asked the fedguv to drop the charges, and the USA said "no, you turned him in, but you don't prosecute the DMCA, we do; he stays in jail for a year until we eventually get around to trying him and finding him not guilty". The worm turned on its master. Very funny for everyone but Dmitriy's wife and infant children.
What did Dmitriy do that brought corporate wrath down on him? He revealed in a public forum that Adobe's e-book cipher, which they were shopping to authors as "hard encryption", was ROT-13. I was there when he did it. That's right, Adobe was telling authors that their technology would prevent duplication of their books, but their copy-protection was ROT-13. It's beyond parody.
Dmitriy revealed to e-book authors that Adobe had ripped them off. For that, he was held in durance vile.
Why did he do it? Not for the challenge, it was trivial! He did it so people could back up their legally purchased e-Books and so that blind people could read e-books. For that, he was held.
Re: (Score:2)
Yeah, after spending four months in jail. Lot of good it does you to be found not guilty when you're incarcerated anyway.
Re: (Score:2)
Dmitriy Skliarov is the more correct phonetic spelling. /. still does not accept UTF-8, it's retarded.
Re: (Score:2)
Yeah, like selling one-time-password solutions to IT bosses when someone gets ahold of their SAM...
Re: (Score:2, Troll)
One man's "password recovery" is another man's "password cracking".
Just like the same person being a "freedom fighter" and "terrorist/insurgent" at the same time.
It all depends on your point of view.
Re: (Score:2)
If you don't have "Save Settings" / "Load Settings" buttons somewhere in the interface, you should upgrade. I don't use them myself, but I know they are there.
Re: (Score:2)
I think you're right. Someone could use this kind of setup to play Crysis.
Re: (Score:2)
I like the way crowbar makers advertise their products in a positive light. As if anyone, upon realizing that they need to pull hundreds of deeply sunken nails, is going to go out to the store and buy a heavy two-foot crowbar. I think the typical usage for crowbars like this is of a more nefarious sort.
GPUs (Score:5, Interesting)
Re: (Score:3, Informative)
It is in progress, in fact. That was the point of Intel's 80-core prototype.
I find it funny that over time we keep repeating the cycle: external processor -> co-processor -> integrated into the CPU die -> external processor.
Re: (Score:2)
You mean the one that failed horribly and wasn't even close to performing as well as a graphics card?
Oh, right.
Re: (Score:2)
The effect is a long-standing and well documented [catb.org] observation about this industry. I guess Moore's Law is antithetical to satori.
Re:GPUs (Score:4, Interesting)
Re: (Score:2)
Well, it was not really a CPU at heart. It was more a complex network of heterogeneous computing units: classical CPUs, but also DSPs, vector float processing units...
It was of course a prototype and never reached the market, but merging CPU-type and GPU-type cores on the same chip definitely seems to be on Intel's and AMD's roadmaps.
Re: (Score:3, Informative)
My understanding is that even DX10+ compliant GPUs still suffer badly when conditional branching occurs. They can do it, but it basically causes them to throw away everything.
That's entirely up to the implementation. Today's generation of GPUs doesn't pay much heed to conditional branching, but the upcoming Fermi from nVidia, for instance, does introduce branch prediction and tracking. The API supports conditionals and loops.
As for Larrabee, while it was designed as a GPU in some ways, I got the impression it still hewed to CPU roots. It was integer based, not floating point based
*boggle* No it wasn't. The thing was a bunch of 486-class CPUs, each with a gigantic 512-bit SIMD (read: vector floating point) unit attached. It obviously was not made to do anything but the most rudimentary CPU tasks. Hell, it doesn't even support branch prediction
Re: (Score:3, Informative)
GPUs are better at doing certain calculations generally, and are very good at parallel processing, since graphics work can be broken down and processed in parallel very quickly. For this, GPUs have a ton of cores. So in a way processors are indeed starting to follow with multicore systems, but it is nowhere near the number GPUs use. High-end GPUs now have 480+ processor cores on a card these days; that's a lot more than Intel's 4 cores ;). But if you had a ton of cores on the processor, each additional on
Re: (Score:2)
...most things must be done linearly, not parallel.
Or to be a bit more precise: Humans can't think about parallelism well. Certain obvious, discrete tasks can be split up, but having whole threads of execution constantly communicating and touching shared resources overwhelms the capacity of most programmers. You could write a massively multi-threaded program to do a lot of stuff that is currently done linearly, but you'd risk a whole lot of crashes and deadlocks from the inevitable bad code and you wouldn't get the full increase in speed since the thread an
Re: (Score:3, Interesting)
To some degree, CPUs have been becoming more GPU-like for a long time. SIMD units (SSE, AltiVec, NEON) are GPU-style features that made their way to CPUs. Ditto for deep, parallel pipelines. Remember the Pentium 4? That was a huge step in the GPU direction.
There are two problems with that approach:
1. Code that isn't pure number-crunching doesn't run well on such a compute model.
2. The model is almost entirely memory-starved. GPUs have up to a GB of high-speed, dedicated RAM on the card itself. CPUs have to live
Re: (Score:2)
As an aside, GPUs have up to 2GB [slashgear.com], soon to be 4GB. The rest of what you said is dead on.
You are right, though: a GPU/CPU hybrid seems to be a possible end result if the combination can be made to work. I suspect there is a lot of very tough engineering involved in getting such a concept working.
Re: (Score:2)
The biggest problem will be heat. GPUs currently consume and dissipate upwards of 200W. Likewise for CPUs. Getting a single die, or even a single package, to handle that much power and heat will be a challenge not just for the silicon designers but for the system guys as well.
Re: (Score:2)
Yes, GPUs are very different: they are designed to apply a lot of very similar calculations to an extremely large set of vector data. That's also pretty much all they do; they aren't nearly as good at logic as a traditional CPU is.
Re: (Score:2)
The coding / assembly is so different that it doesn't translate, and they only do certain kinds of processing well.
Re: (Score:3, Informative)
The last sentence nails it. They only do certain types of operations well, and the frequency with which I upgrade GPUs compared to CPUs - or more specifically, the fact that I very rarely replace both at the same time - leads me to believe I'm better off having them separate. Maybe there are parts of the GPU which could be incorporated into the CPU, and I think that might be what the Core i3/5/7 processors are doing with GMA integration.
Re: (Score:2)
Coming soon: the i3-387DX, i5-587DX and i7-787DX GPU co-processors, and motherboards with GPU coprocessor sockets next to the CPU.
Happy days.
Re: (Score:2)
They tend to be specialised processors, designed specifically for graphics-related tasks. Those tasks happen to be computationally very similar to other tasks, such as protein folding, though GPUs will be poor performers at, or possibly totally incapable of, certain tasks your CPU has to do.
That said, I'm waiting for the first CPU to build in a GPU so we don't even need a separate graphics chip on our motherboards any more for the graphics output.
Re: (Score:2)
What are you waiting for? The new Intel processors already have integrated on-die GPUs. The next generation will have the GPU and the CPU completely integrated.
Re: (Score:2)
You see how well I follow current hardware :)
When buying a computer these days I go for the cheapest/slowest specced hardware which is way more than what I need (watch videos, troll /., e-mail, general browsing, some web/general programming, standard office work).
And actually what I'm waiting for to buy a new box is for the old one to die. 5 year old hardware is still fast enough for pretty much everything that I do.
The CPU speed problem is solved and done with for all but the most demanding applications
Re: (Score:2)
Slow down, my friend. ICs aren't created equal, and you can't just throw everything on one die. Processes can be vastly different, and what is optimal for one IC isn't optimal for another. For example, you can't manufacture standard DRAM on a CPU process: RAM processes are vastly different, as they need to create capacitors. WiFi needs specific RF circuitry and often requires external ICs with exotic processes for switching, etc. (GaAs or silicon on sapphire). Bluetooth will probably still need some funky sili
Re: (Score:2)
OK so now the GPU is on the CPU die. The memory controller is there already, right? It should be a no-brainer to integrate small stuff like ethernet. Bluetooth/wifi may be a bit harder due to the necessary aerial. Now all that's left is to integrate the RAM on the die and we're there. A one-chip computer. No need for complex motherboards any more. At that moment the whole hardware issue is solved.
We've had ICs like that for ages, mainly in portable devices which have everything and the kitchen sink in a single package. However, I'm not sure it would be ideal for a general-purpose PC to have everything in the CPU for a bunch of reasons, mainly how it would raise costs and give you less flexibility regarding peripherals. It seems more reasonable to have bluetooth, network, etc. in the motherboard or plug-in cards, so you don't have to replace your CPU when a newer version comes out (or when you use yo
Re:GPUs (Score:5, Informative)
Is the coding/assembly so different that it doesn't translate? Do they only do certain kinds of processing really well (it is a GPU after all), so it couldn't handle other more 'mundane' OS needs?
Yes, exactly. CPUs are built from the ground up to do scalar math really, really fast. That lends itself well to doing tasks that must be performed in sequence, such as running an individual thread. However, they've only recently gained the ability to do more than one thing at a time (dual core processors), and even now high end CPUs can only do six calculations at once (6 core processors).
Meanwhile, GPUs are built to do vector math really, really fast. They can't do individual adds anywhere near as fast as a CPU can, but they can do dozens of them at the same time.
Which type of processor is best for which job depends entirely on the nature of the math involved and how parallelizable the task is. In the case of 3D graphics, drawing a frame involves tons of vector arithmetic work, which is why your 1 GHz GPU will run circles around your 3 GHz CPU for that task (and is also where the GPU gets its name from). In the case mentioned in the article, password cracking is highly parallelizable: you've gotta run 100 million tests, and the outcome of any one test has zero influence on the other tests, so the more you can run at the same time, the better. By running it on the GPU, each individual test will take a bit longer than running it on the CPU would, but you'll be able to run dozens simultaneously instead of just a few, and will thus get your results much faster.
CPUs certainly have their place, though. Some tasks simply must be done in sequence and cannot be easily divided up into separate parallel tasks. The CPU will get these done much faster, since running them on the GPU would incur the speed penalty without realizing any benefit.
I've simplified it a bit for the sake of explanation, but that's the gist of it. Hope that helps!
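To make the parallelizable/sequential distinction concrete, here is a toy CPU-side sketch in plain Python (the 4-letter passwords and MD5 hashing are illustrative assumptions, not anything from the article; real tools and GPU kernels are far more elaborate). The point is that the keyspace splits into chunks whose searches never depend on one another:

```python
import hashlib
from itertools import product
from string import ascii_lowercase

def crack_chunk(target_hash, first_letter, length=4):
    """Search only the candidates that start with first_letter."""
    for rest in product(ascii_lowercase, repeat=length - 1):
        candidate = first_letter + "".join(rest)
        if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

def crack(target_hash, length=4):
    # Each chunk is fully independent, so each could run on its own
    # core or GPU lane; here they simply run one after another.
    for letter in ascii_lowercase:
        found = crack_chunk(target_hash, letter, length)
        if found:
            return found
    return None

target = hashlib.md5(b"gpus").hexdigest()
print(crack(target))  # gpus
```

Because no chunk's outcome influences any other, the only coordination needed is "stop when someone finds it", which is why this workload maps so cleanly onto hundreds of GPU lanes.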
Re: (Score:2)
I know very little about that level of hardware, but why aren't we incorporating these types of things into CPUs?
Because most people don't want their CPU consuming 300W of power when idle?
Re: (Score:2)
It's all about IP. It wouldn't be horribly difficult to put a GPU and CPU on the same die. BUT, Intel doesn't want GPU manufacturers getting into the x86 business, and GPU makers certainly aren't going to give Intel any of their technology and get cut out of the market. Intel's attempts at GPUs have been less than spectacular: good enough for Word and Excel, not good at modern gaming.
So for the foreseeable future, CPUs and GPUs will be treated as separate entities.
Re: (Score:2)
I think you accidentally a word.
Also, how can Intel prevent GPU manufacturers from getting into the x86 business, if AMD is already in both?
Re: (Score:2)
3D rendering involves lots of vector math, and there are huge portions of any given render that do not depend on each other. For example, the scene may involve calculating the vectors from thousands of vertices and polygon faces towards hundreds of light sources. That is millions of operations that are essentially independent. Another phase of a render will require calculating the intersection of each view vector (and more if you use FSAA) with a polygon in the scene.
So, modern GPUs are a special ca
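The per-vertex independence described above can be sketched in a few lines (toy randomly generated data, with NumPy's vectorized ops standing in for the GPU's vector hardware):

```python
import numpy as np

# Toy data: thousands of unit-length vertex normals and a single light
# direction. Every vertex's diffuse term is independent of every other
# vertex, which is exactly the shape of work GPUs are built for.
rng = np.random.default_rng(0)
normals = rng.normal(size=(10_000, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

light = np.array([0.0, 0.0, 1.0])   # light shining along the z axis

# One vectorized op stands in for 10,000 independent dot products.
diffuse = np.clip(normals @ light, 0.0, 1.0)
print(diffuse.shape)   # (10000,)
```

On a GPU, each of those dot products can land on its own lane; nothing about vertex 7,000 needs to wait for vertex 6,999.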
I like the use of the word "Recovery" (Score:2)
I think we all know what they really mean. ;)
(Anyway, I'm also impressed by the power shown by the GPUs. It's a good demonstration that some of the new technologies (CULA? CUDA?) that allow "regular" programmers to use this power really will speed some things up.)
Slashvertisement (Score:5, Funny)
Hey Editors,
You forgot a link to the buying page [elcomsoft.com]
For as low as 1.399,- € you can start cracking^Wrecovering passwords today.
Huh? (Score:2, Informative)
Is this supposed to be a good thing? Sounds like someone's password encryption algorithm needs some upgrading to me.
Out of curiosity... (Score:2, Interesting)
I keep hearing stories about using GPUs for non-GPU computations, but has anybody here tried it?
What does your screen look like while a program like this is running?
Re: (Score:3, Informative)
Normal. Running GP-GPU or CUDA apps has no effect on output to the screen. We do it for medical imaging processing.
Re: (Score:2)
I keep hearing stories about using GPUs for non-GPU computations, but has anybody here tried it?
Yes many people do it and have for years.
What does your screen look like while a program like this is running?
Why do you assume that the screen looks different?
Re: (Score:2, Funny)
Good point. Why would I assume a graphics card operation would have any effect on graphics? I've only ever used mine to take ice off the windshield.
Re: (Score:2)
Except that the part doing these calculations has nothing to do with the parts of the GPU that handle output to the screen?
Re: (Score:2)
Why do you assume that the screen looks different?
Because when you run a CPU-intensive application your PC becomes really slow, so if you use your GPU instead, the screen should become "slower" too. But probably you wouldn't even notice.
Re: (Score:2)
What does your screen look like while a program like this is running?
Why do you assume that the screen looks different?
He is still running a Voodoo series add-on card that takes over the video output when it is in use?
Re: (Score:2, Informative)
The display buffer for a 1920x1200 screen with 24-bit colour takes less than 7MB. Even a fairly low-end graphics card will have at least 128MB of memory. In other words, there's plenty of memory for a program running on a GPU without needing to piss on the display buffer.
If your screen is just displaying a bunch of 2D windows, then the 100s of cores in your GPU will be sitting idle. Again, computations running on the GPU will have no impact on what you see.
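The parent's framebuffer arithmetic checks out, assuming the usual 3 bytes per pixel for 24-bit colour:

```python
# Checking the framebuffer arithmetic above: a 1920x1200 display at
# 24-bit colour is 3 bytes per pixel.
width, height, bytes_per_pixel = 1920, 1200, 3
framebuffer_mb = width * height * bytes_per_pixel / 2**20
print(round(framebuffer_mb, 2))  # 6.59 -- comfortably under 7MB
```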
Re: (Score:2)
Again, computations running on the GPU will have no impact on what you see.
True, but if it did it'd be way cooler than the hourglass cursor! A program could use its calculation memory space as the display buffer for its own window while it runs. Auto-visualization!
Re: (Score:2)
Newer windowing systems no longer draw the screen as a single 2D object. This includes X (Compiz), OSX, and Windows.
It's pretty cool. (Score:2)
There have been several documentaries about hacking [filmroster.com] over the years that demonstrate the use of GPU-based computations. It is soo bad.
Re: (Score:3, Funny)
What does your screen look like while a program like this is running?
Well I haven't kept up with the latest developments, but if it's anything like the Sinclair ZX80 I'm posting from, the screen goes blank gray when you start actively computing. Then it returns to normal when the answer is ready.
boo (Score:5, Informative)
boo slashvertisement
103000 passwords per second. So? (Score:3, Informative)
The ATI board gets 103K passwords per second, and the latest quad-core Intel only 4K (which, by the way, is almost 26 times faster, not just 20).
So that's wonderful. How many passwords are there in 1024-bit SSL encryption? 1024-bit asymmetric is roughly equivalent to an 80-bit symmetric key, so that's like 2^80 passwords, right?
Let's say 100,000 passwords per second, that's 10^5.
Google says this: (2^80 / 10^5) / (3600 * 24 * 365) = 3.83 × 10^11
383 billion years to go through every password in 2^80 possibilities.
In reality, of course, not every combination is used; many passwords can be eliminated by heuristics, and it also helps to have a good dictionary file handy from which to generate the most likely password combinations. That probably cuts the 383 billion years down to something much more ATI-friendly. Of course, we need to use a stronger cypher.
As a final note: at last I understand why Hugh Jackman needed the 7 monitor setup, each one must have been used as an output device for the video card it was connected to. Obviously the video cards were the actual power behind all that hacking!
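The back-of-envelope arithmetic above is easy to recheck as a quick script (using the article's ~10^5 guesses per second for a single board):

```python
# Rechecking the back-of-envelope arithmetic: exhausting an 80-bit
# keyspace at ~10^5 guesses per second (one HD5970, per the benchmark).
keyspace = 2 ** 80
rate = 100_000                      # passwords per second
seconds_per_year = 3600 * 24 * 365

years = keyspace / rate / seconds_per_year
print(f"{years:.2e} years")  # about 3.83e+11 years for the full space
```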
Re: (Score:2)
no, definitely, you can buy 100,000,000 of them.
Then it's only about 3,800 years. With a bulk discount they'll cost maybe $10 per card; that's $1,000,000,000, a billion? Chump change for any government. Spend 100 times more money, get results in decades.
Re: (Score:2)
At 103,000 attempts per second, that's... 421 years. Oh.
(Yes I know it's not going to take until the entire set has been attempted to crack a password.)
Re: (Score:2)
yeah, but the letters are case-insensitive and the only additions are numbers, no special characters... that makes for about six months on average (300ish days in the worst case).
now... can you run these cards in a SLI configuration, and how many cards can you buy after you've cracked^H^H^H^H^recovered Warren Buffet's account password? :)
Re:103000 passwords per second. So? (Score:5, Insightful)
Still within the realm of cracking, especially if those passwords guard a few million dollars of assets. 421 years sounds like a lot until you add things like:
- Crossfire or SLI where you have multiple boards installed
- Set up half a dozen machines to work on the problem
- Apply a botnet to the problem
- Future improvements in technology
- Apply some heuristics to the guessing process
All of which can easily shave off at least 2 orders of magnitude and possibly 3 orders of magnitude. Which reduces that 421 years down to a few months (or worse).
8 character passwords are pretty much dead in the water now. Or at least they need to be phased out within the next few years. Or protected by rate-limiters which control how fast passwords can be tried. (Personally, I always assume that the attacker has the stored hash and can apply parallelism to the attack. Which means that rate limiters should not be relied on to prevent cracks.)
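The bullet points above compound multiplicatively, which is easy to see with some illustrative numbers (the individual factors below are my guesses, not measurements):

```python
# A sketch of how independent speedups stack multiplicatively.
# The individual factors are illustrative guesses, not measurements.
base_years = 421

speedups = {
    "dual-GPU board (CrossFire/SLI)": 2,
    "half a dozen machines": 6,
    "heuristics / smarter guess ordering": 50,
}

combined = 1
for factor in speedups.values():
    combined *= factor

remaining_days = base_years / combined * 365
print(combined)               # 600x, between 2 and 3 orders of magnitude
print(round(remaining_days))  # 256 -- "a few months (or worse)"
```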
Password Recover is the new Hacking? (Score:2)
Finally! (Score:2)
Finally...someone who understands!
I wanted to get one of those professional car door jimmy kits (the ones with a jimmy for just about every model of vehicle!) that tow truck supply vendors sell "just in case I get locked out of my car", but they had these outrageous demands that I "prove" that I was a legit tow outfit or garage.
The locksmith supply was much the same way when I tried to buy a lockpick set, "just in case I get locked-out of my house".
You can bet I'll be getting this software. I must've
Re: (Score:2)
You owe me one new Sarcasm Detector. My current one just overloaded.
ObRokicki (Score:2)
Executes the cellular automaton Game of Life in the blitter chip. Uses a 318-by-188 display and runs at 19.8 generations per second. Author: Tomas Rokicki
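For reference, the rule that demo runs in blitter hardware is simple enough to sketch in a few lines (a toy software version, nothing Amiga-specific; the wrap-around grid is my choice):

```python
from collections import Counter

# A toy software sketch of one generation of Conway's LIFE, the same
# rule the blitter demo runs in hardware (this grid wraps at edges).
def life_step(alive, width, height):
    """alive: set of (x, y) live cells; returns the next generation."""
    neighbours = Counter(
        ((x + dx) % width, (y + dy) % height)
        for (x, y) in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next turn with exactly 3 neighbours, or 2 if it is
    # already alive.
    return {c for c, n in neighbours.items()
            if n == 3 or (n == 2 and c in alive)}

# A "blinker" oscillates with period 2.
blinker = {(1, 0), (1, 1), (1, 2)}
print(life_step(life_step(blinker, 5, 5), 5, 5) == blinker)  # True
```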
ElcomSoft updates their Slashvertisements? (Score:2)
This seems to be an update of last year's story, just to mention that the HD5000 series is now supported, and it's faster on the newer, faster video cards.
What's difference? (Score:2)
Someone should revoke Taco's privileges... (Score:2)
Re: (Score:3, Funny)
Great! now when I go into the bank with my stack of Radeon cards they'll call security.
Re: (Score:2, Funny)
No, you're only doing them a favour by "recovering" their passwords.
Re: (Score:2)
How do you put commas and spaces into the combination for your luggage?
Re:My password is safe (Score:5, Funny)
Try resetting someone's password to 'obvious' when they call in with a 'forgotten password'. Then see how long you can string them along by saying "I've reset your password - the new one's obvious..."
Caller: "What? Like my surname?"
You: "No, it's obvious"
Caller "First name?"
You "No"
Caller "letmein?"
Yeah, it's been a bad day!
Re: (Score:2)
Yep, and you can be sure no one else will be using it...
In fact, no one will have any interest in it at all. Your 'secret' is safe!
Re: (Score:2)
Does that mean it's "dicktionary" based?
Obviously there's also no minimum length requirement...
Re: (Score:2)
Not every algorithm scales linearly.
Re: (Score:3, Informative)
Not really. GPUs are good at going really fast in a straight line. Throw so much as an "if" statement at them and they become about as fast as a P2. The closest you'd get to what you're describing is a Cell PCI-E card, or Intel's vapourware Larrabee.
Though if all you want is to use your old stuff on a new PC, you can get ISA/PCI card motherboards that run off the host's power/peripherals.