Protected Memory Stick Easily Cracked
Martin_Sturm writes "A $175 1GB USB stick designed to protect your data turns out to be very insecure. According to the distributor of the Secustick, the safety of the data is ensured: 'Due to its unique technology it has the ability to destroy itself once an incorrect password is entered.' The Secustick is used by various European governments and organizations to secure data on USB sticks. Tweakers.net shows how easy it is to break the protection of the stick. Quoting: 'It should be clear that the stick's security is quite useless: a simple program can be used to fool the Secustick into sending its unlock command without knowing the password. Besides, the password.exe application can be adapted so that it accepts arbitrary passwords.' The manufacturer got the message and took the Secustick website offline. The site gives a message (translated from Dutch): 'Dear visitor, this site is currently unavailable due to security issues of the Secustick. We are currently working on an improved version of the Secustick.'"
Well they could have been like other companies (Score:5, Insightful)
Re:Well they could have been like other companies (Score:4, Interesting)
N number of attempts... (Score:2)
So it's not a case of typing it wrong once and *poof* goes the data (note that they didn't find any physical evidence of things in there capable of physical destruction either). If you set it to 3 times, and you get it wrong 3 times yourself - oh well. Maybe you *could* set it to only once, though.. but if you do that,
Re: (Score:3, Interesting)
Depends on how much trouble you'll get in if law enforcement agents manage to get at the data... seeing as how that's the only *possible* use I can imagine these things would ever be put to.
Re: (Score:2)
Working from home, but needing to carry sensitive data.
Or consultants that have to travel, and carry sensitive documents.
Lots of legal reasons as well.
Re: (Score:2)
Re:Well they could have been like other companies (Score:5, Insightful)
There are a lot of situations where having a local copy of the data is a convenience, rather than a necessity, and this would allow the convenience without the risk of it being stolen. If it's accidentally destroyed, then it's an inconvenience, not a disaster.
Re: (Score:2)
Destroying the data on excessive retries is an effective way of preventing a sustained brute-force attack. This implementation is completely useless, to be sure, but the concept is a good one.
Re:Well they could have been like other companies (Score:5, Insightful)
Re: (Score:2)
I don't know about you, but I don't keep original copies of data on a USB key.
Quite - if the data is important enough to protect, it's important enough to backup. I've not had good experiences with reliability of USB sticks either - I've encountered two (in my limited experience) that had the habit of occasionally showing up as 'unformatted' when you plugged them into a PC. Mostly they worked, but sometimes they decided to vape themselves.
But then, it's a bit like expecting hard drives to never die, I guess.
Re:Well they could have been like other companies (Score:5, Insightful)
Re: (Score:3, Insightful)
The only way to secure data is to make it so that absolutely no one but the authorised people has access to it. You can keep data secure physically if you isolate it from any form of access. However, information does not work well if isolated like that; information has to be shareable to be useful, otherwise it's just dead data, worthless bits.
I have several pieces of information that are unhackable. T
Re:Well they could have been like other companies (Score:4, Insightful)
It appears that the system doesn't use a form of encryption unlocked by a key (entered by the user) to store the data - and that instead it simply requires a single instruction to the USB device to indicate whether the data ought to be accessible or not. That just sounds ludicrous.
If it had been developed in good faith, and this were a bug (rather than part of the design) and/or the result of a sophisticated exploit that would have been hard to predict, I would be sympathetic. As I would if they had clearly indicated its limitations (which they could have, but since they've taken the website down now, I'm guessing not).
What's particularly telling for me is that, while the company was quite happy to tout the supposed virtues of the product, they are clearly worried now that they have been found out. That represents a staggering failure by the designers of the software, their managers, the marketing and product design teams, the HR department who hired all these people of clearly very dubious virtue, and the senior management involved.
Either they are crooks (because they were complicit in touting such a crummy product that didn't really do what it claimed to do in a reasonable way) or they are all really, really dumb (and none of them asked pertinent questions of the other parties at any stage of product development).
Re: (Score:2)
Re:Well they could have been like other companies (Score:5, Funny)
Which is a shame for this company, because idiots are in such short supply these days...
Re: (Score:2)
Re:Well they could have been like other companies (Score:5, Interesting)
Well, not completely. A spokesperson for the product is reported saying:
This is quite a different statement from the one made near the start of the article.
Funny part is, all they did was run the program in a debugger, put a breakpoint after the clearly labelled "VerifyPassWord" function, and change the return value from 0 to 1. Pretty embarrassing. But the article went pretty easy on them after that. Really good read, by the way.
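For what it's worth, the anti-pattern boils down to a host-side yes/no check. Here's a minimal sketch of what that looks like - the function and device names are invented for illustration, this is not the actual password.exe code:

#include <windows.h>
#include <stdio.h>

/* Hypothetical stand-ins for what the host-side tool appears to do. */
BOOL VerifyPassWord(const char *entered);   /* compares input against a stored value */
void SendUnlockCommand(HANDLE device);      /* tells the controller to expose the flash */

void unlock_stick(HANDLE device, const char *entered)
{
    /* The whole decision is one boolean on the host. Break on VerifyPassWord,
       force its return value to 1 (or patch the jump that tests it), and
       SendUnlockCommand runs no matter what was typed. */
    if (VerifyPassWord(entered))
        SendUnlockCommand(device);
    else
        printf("Wrong password.\n");
}

If the password were instead used to derive the key that actually decrypts the data, flipping that return value would gain an attacker nothing but ciphertext.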
Re: (Score:3, Insightful)
If people don't bother breaking your security, they aren't that interested in your information in the first place.
If people who are interested in your secrets can break your security trivially, you might as well abstain from encryption altogether and save yourself the hassle.
Re: (Score:3, Insightful)
I'm usually amazed how many people would add more and more layers of "security" to their door, ignoring that the walls are made of paper. Sure, you, as the authorized user, have to go through time-consuming hassle to get to your data, but the attacker, who doesn't care about damage, can easily circumvent your security. He might leave yo
Re:Well they could have been like other companies (Score:5, Funny)
Re:Well they could have been like other companies (Score:5, Insightful)
Wait. The executable was compiled with debug symbols turned on? With functions with easy-to-understand names? I mean, I know it's only security-through-obscurity, but c'mon! At least up the ante a little bit
Re: (Score:2, Informative)
Re: (Score:3, Insightful)
Still, there's no reason why one has to use DLLs, either. You can put everything into one
Re: (Score:3, Insightful)
That would be the sensible way to go, since it's unlikely any other app would ever use that
That reminds me of Sony/BMG. (Score:3, Funny)
Duh.
Does that remind anyone else of "Most people don't even know what a rootkit is, so why should they care?"
Oh my god, some people are really projecting their own dumbness onto their customers. Such marketroids should really be sacrificed to the war against terror. Or cluebatted.
Re: (Score:2)
Re: (Score:2)
Nice one! (Score:5, Interesting)
Re: (Score:2)
I'm having a boring day. Bring on the lawsuit!
Re: (Score:2)
Just put - (Score:5, Informative)
Re: (Score:2)
Put TrueCrypt on a memory stick with an encrypted volume file and a good passphrase, and your data will be secure from pretty much anything. I have not heard of TrueCrypt being cracked yet.
I use an encrypted image generated by the Apple Disk Utility, which is capable of creating AES-128 encrypted DMGs. I don't know if AES-128 has been cracked yet, but even if it has, I rather doubt any thief will go to the trouble of trying to access my data. Of course I might be unlucky enough that my memory stick is stolen by a super-hacker who will go to the trouble of cracking my little DMG crypto image, but that seems highly unlikely.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Not a total solution, but it's at least a way to allow a corporation that wanted to use it to not open things up entirely. (Though in the large corps I've worked with, if you don't have a recovery key, you're out of the running... they want to be able to get into an employee's secured data after they've been terminated...)
Re: (Score:3, Informative)
Once TrueCrypt is installed on a machine (by an Administrator), every Limited User can use it without problems. I have it set up that way at home.
Running TrueCrypt requires a driver, and inserting that into the operating system requires Admin rights; once it's there, everyone is allowed to use it.
Linux... (Score:2)
No encryption... (Score:2)
The password.exe does, however, address a controller chip. Without the correct password, the controller chip will simply refuse to provide further access to the flash memory.
So if you're really wondering - I would imagine that the entire thing won't work with Linux, period.
Re: (Score:2)
huzzah! a linux version should soon be in the works!
Re: (Score:2)
Re: (Score:3, Insightful)
So even if the password control worked (which it doesn't), you could get at the contents by desoldering the flash chip and putting it in a reader. (Something hobbyists have been doing with HD-DVD drives to reverse engineer/modify the firmware.) And this is a supposedly intelligence-service recommended device for government use? Right, go on, pull the other one.
TrueCrypt (Score:5, Informative)
Most Slashdotters know you should not trust the built-in security on these devices.
The solution for real security on these devices is to use TrueCrypt [truecrypt.org].
It's not hard to use, though the more technical among us may need to help out the less technically inclined to get things rolling. Once it's set up, though, it's secure and easy to use.
Re:TrueCrypt (Score:5, Insightful)
MOD PARENT UP. (Score:2)
what have s/w engineers done to fix this problem? (Score:3)
The type of people who have the wherewithal to set up TrueCrypt are not the market this was aimed at. This seems like a product made for the techno-clueless PHB types who just want to buy something off the shelf they can stick in their magic computer box and have it "just work," and who see that high a price on a simple 1-gig USB stick not as an obvious ripoff, but as a measure of how much good computer magic it must surely contain.
So they designed a flawed product. Slashdot folks tend to complain about how companies keep coming out with crummy products. Is it realistic to expect one billion MS Windows (tm) users to get a CS degree so they understand its shortcomings and are able to recognize these crummy products? Maybe it is time to switch to Apple products so you don't have to worry about trojan horse AUTORUN.EXE flaws in Windows. Corporate IT professionals were hired to help free users from the burden of maintaining their PCs muc
Does everyone have to be an expert on everything? (Score:2)
Elsewhere in this thread, it's pointed out that you shouldn't have to be an expert in crash testing to be able to buy a car that's safe. I tend to agree. While I see your point about PHBs and throwing money at problems, I've also reached the stage in life where I have (some) money and little enough time to futz around with doing everything from scratch. When I was a kid, I personally, carefully, expertly assembled every round of ammunition I shot; nowadays I'm likely to grab a box of 9's at the sporting
Re: (Score:2)
But you don't have to be an expert to know the difference between buying a car from an established company with certificates on the windows, publicly available crash test ratings, and a legally mandated inspection every so often.. and buying a jumbled mass of bolts, pipes, wheels, and a barber's chair that some guy you've never heard of made in his garage and calls a "car
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The biggest problem with non-technical people like CEOs, CFOs, CTOs and the like is that they cannot understand what you mean when you say "use a secure passphrase." They think their SSN is secure; it takes an amateur 20 minutes and $30.00 to get someone's SSN from one of the big databases, given a name and address or phone number. Most executives' info is based off name + business name in these DBs.
They cannot understand that the
Re: (Score:2)
I know that there are ways to improve the technical side to the point where it can be trusted to be Fort Knox. The human factor is the limit, and if I knew a way to improve the human side of security, I'd be traveling from c
Re: (Score:2)
Re: (Score:2)
Of course, in practice, unless you do something silly like posting your method on a publicly accessible message board, no one's likely to figure out where you're pull
Re: (Score:2)
Re: (Score:2)
Even the 'portable' mode, as far as I remember.
Re: (Score:2)
This begs the question...... (Score:5, Interesting)
Re:This RAISES the question...... (Score:5, Informative)
Re: (Score:2, Informative)
Since there are a ton of these products out there, does any third party verify that they are as secure as they are claimed to be? Or are we truly at the mercy of the marketing spin that these companies put out?
According to TFA, the product was commissioned by the French government and is approved by the French intelligence service. It also is reportedly used in the defense and banking industries. One would hope that there would be some sort of verification by knowledgeable IT folks prior to approval by all these groups, but it appears that no one gave it a real examination.
Mod +1 erudite-sounding (Score:4, Funny)
Re: (Score:2)
1. They don't simply hand out their seal of approval like it's a "Vista compatible" sticker. They actually DO test your stuff.
2. They don't refrain from telling you if your product is actually flawed, and (what's worse), they don't even stay silent when you toss it on the market regardless.
3. Managers don't know jack about them, they don't care about security seals and lis
$175 for a flash drive? (Score:2)
Re: (Score:2)
1. That which doesn't cost anything has no value.
2. If there is no company behind it, we cannot sue anyone if it breaks (because we all know MS is close to bankruptcy because of those horrible lawsuits that follow their blunders).
Re: (Score:2, Funny)
Dumb design (Score:4, Interesting)
- Self destruct, great, so if you want to destroy someone's data, just grab their memory stick and intentionally use bogus passwords. Now that's brilliant. An MS with a built-in self-DoS.
- No security support in hardware, just desolder the actual memory and stick it into your favourite $15 MS. Brilliant.
- So smug in their design they don't even encrypt the data. Outstanding.
- Software apparently designed by a 12yo. Oh wait, a 12yo probably wouldn't have made it so dumb. Maybe it was a 6yo; were there identifiers named after SpongeBob characters?
Actually, the bigger problem is that so many govt agencies approved of this thing, apparently, without it going through any type of remotely rigorous testing and verification. As much as our US govt agencies get ripped for doing stupid stuff, it's clear that they don't have the market cornered on such activity.
Hey, I have a secure self destructing bridge to sell to
Re: (Score:2)
It should have been obvious (Score:4, Insightful)
Unique and secure are mutually exclusive.
It is not possible, through a feat of sheer genius, to make something that is both novel and demonstrably secure. It turns out that genius isn't a particularly rare commodity. With 6.5 billion people in the world, there are 6,500 people who are walking around with one-in-a-million levels of intellect. Any one of those people, on a good day, can beat any other person on earth in a battle of wits. Any one of the millions of people with one-in-a-thousand intellects probably can, too.
Security is the one aspect of technology where state of the art is better than something which advances state of the art. State of the art means nobody has yet, even on the best day they've ever had, been able to beat it. We've seen some recent examples where very narrow vulnerabilities have been found in hashing algorithms, which has forced the state of the art to change slightly to favor drop-in replacements. But by and large the state of the art has been remarkably stable over a long, long time. Anybody who claims to have something nobody else has probably has something worthless, if he has anything at all.
This is why product security is so bad. It's not possible to differentiate yourself based on security, without affecting other areas such as usability. There is considerable irony in this fact: a product that is carefully thought out and implemented using widely known techniques would have a good chance of being unique. The problem is selling the product. Lotus Notes is a good example. It has its strengths and weaknesses, but as of the early 90s it was the most secure email system in the world. In fact it still would be. But it wasn't the easiest to use or administer. Unfortunately their attempts to make the system more attractive were failures. It's never been more attractive than Exchange. But it's always been more secure.
There's Your Problem (Score:3, Funny)
Re:There's Your Problem (Score:5, Interesting)
They had added it to close a previous security problem I'd pointed out with their product that stored an internal customer id in a cookie to grant access to a web app - problem was, the customer ids were allocated sequentially, so anyone brute-forcing them would get access to all their customer data in minutes, including the address books of the entire top management team.... base64 "encrypting" the customer id was supposed to prevent anyone from trying that trick again... I left that company pretty much as soon as I could..
"Secure" Digital Already Cracked? (Score:2)
A surprise and a non-surprise. (Score:5, Insightful)
No surprise that the security is non-existent, but a nice surprise that tweakers.net[0] have people skilled enough to do a thorough technical review. Tip of the hat to the reviewers, and keep up the good work. Anyone can run 3D benchmarks and make graphs against the previous generation, but this requires a different level of technical know-how. It's always been my hope that the future would feature this type of review, using reverse-engineering techniques for in-depth technical reviews, as a norm not an exception.
[0] No disrespect to the people of tweakers.net, I mean in the sense of 'any popular review site'.
GnuPG. GnuPG. GnuPG. (Score:2)
I trust exactly one encryption product: GnuPG. It's had its pucker moments, such as the ElGamal signing key problem (IIRC - and I'm too lazy to look it up right now), but those problems get fixed and we move on. Given the choice of whether to trust a little hardware gimmick or a piece of Free Software that millions of people use, even if they don't realize it, I'll stick with the code. If/when problems arise, I believe that its developers will look out for my interests and not their bottom line.
Havi
French intelligence (Score:2, Funny)
Yet another reason (Score:2)
Validation/Verification of Security (Score:2, Insightful)
It looks like for $175, you get a 1 GByte USB key,
Target Audience! (Score:2)
On other days, we discuss things like "Linux may be too hard for Average Joe". That's because we use a statistical example of {Total Users}*{Skill Level of 68% of Users}.
This stick WILL be secure
There's only one problem: That's the wrong audience. When you label something "Top Secret"... you are thundering a challenge for the whole world to take their best shot. The rules change.
Maybe "The Best Hackers Money Can Buy" will always win. Fine.
But at a m
DUH! (Score:2)
Whether they fix that stick or not, after showing just how much clue they got abou
Lexar USB stick security was broken by @stake (Score:2)
This was also on slashdot: http://slashdot.org/article.pl?sid=04/09/14/1855232 [slashdot.org]
I wouldn't trust USB stick security unless there was a 3rd party assessment of the security from a reputable security firm and that assessment was published. Customers need to start demanding this. What track record do these companies have on security?
The bad thing about hardware is: how do you patch the security hole? All hardware these days should have the ability
Patching hardware security holes (Score:2)
If you want a safe, don't start with a greenhouse. Start with a metal box. Adding a layer of security on top of something insecure doesn't work well, as people can peel back layers. If you want something to REALLY be secure, start with something inherently secure. If you constantly need to patch something for security holes, your method was flawed from the start.
If the flash chip can be removed on its own, it can be put in something insecure. If you must use this scheme, make sure the in
Re: (Score:2)
"USB stick security" seems to be something that is largely unnecessary in the first place. You don't need any special security on the stick, assuming you have reasonably strong encryption available on the computers you use the stick with: you encrypt data, give it to the stick, and decrypt it when you take it off. There are vanishingly few cases where
What? They're not suing? (Score:3, Interesting)
I am also curious. . . What does the law in the Netherlands say regarding corporate mandates? Are Dutch corps allowed to put other things ahead of generating profit for shareholders?
-FL
Note this sentence in the second paragraph (Score:3, Interesting)
I think that says quite a lot for the French intelligence service. Unless they wanted an insecure device to be marketed as secure.... black helicopters at the ready.
Unfortunate (Score:2)
Someone also mentioned above that @stake found an issue with the way passwords were stored
Stupid is as stupid does (Score:4, Insightful)
(1) If you don't have encryption, GOOD ENCRYPTION, you can't protect squat.
(2) "Self Destruct" is interesting, but unless you have a custom micro-controller on the ram stick, AND an independent power supply, AND the device potted in epoxy, it is all just a made for TV gimmick.
(3) Password.exe? I didn't see this in the article, but what happens if one plugs it into a Mac, Linux, FreeBSD, etc? Does it just work or does it self destruct?
(4) With reference to #2, since the article showed that one could make the device read-only, would self-destruct no longer work? If so, it MUST be potted in epoxy.
(5) Does the "self destruct" operate on the PC or the RAM stick? We all know if it runs on the PC, it is doomed to fail.
If they want to REALLY do this (a rough sketch follows this list):
(1) Before everything else, encrypt the data. This buys the device time to operate, and basic security.
(2) Install a PIC or something that MUST receive an encoded heartbeat with some sort of hard-to-reproduce calculated byte pattern.
(3) Without a valid heartbeat, the PIC will simply not enable the flash device.
(4) With a valid heartbeat, the system must pass a valid password hash string to the PIC within a reasonable amount of time, or the data will be destroyed.
(5) After a number of failed attempts, the PIC will destroy the data.
(6) When the heartbeat stops, the PIC disables the flash. (It is presumed that the software clears the file system cache as well.)
(7) Pot the damned device in epoxy.
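A rough, purely illustrative sketch of items (2)-(6): the board-level primitives (enable_flash(), wipe_flash(), and so on) are assumed to exist on whatever microcontroller you pick, and the timing window from item (4) is left out.

#include <stdint.h>
#include <string.h>

#define MAX_FAILURES   3
#define HEARTBEAT_LEN  16
#define HASH_LEN       32

/* Board-specific primitives, assumed to be provided elsewhere. */
extern void enable_flash(void);
extern void disable_flash(void);
extern void wipe_flash(void);
extern void compute_expected_heartbeat(uint8_t out[HEARTBEAT_LEN]);

static const uint8_t stored_hash[HASH_LEN] = { /* hash of the real password */ 0 };
static uint8_t failures;

/* (3)/(6): no valid heartbeat, no flash. */
void on_heartbeat(const uint8_t response[HEARTBEAT_LEN])
{
    uint8_t expected[HEARTBEAT_LEN];
    compute_expected_heartbeat(expected);
    if (memcmp(response, expected, HEARTBEAT_LEN) != 0)
        disable_flash();
}

/* (4)/(5): a valid password hash unlocks; repeated failures destroy the data. */
void on_password_hash(const uint8_t hash[HASH_LEN])
{
    if (memcmp(hash, stored_hash, HASH_LEN) == 0) {
        failures = 0;
        enable_flash();
    } else if (++failures >= MAX_FAILURES) {
        wipe_flash();
    }
}

And item (1) is what makes the rest tolerable: because the data underneath is encrypted, even an attacker who lifts the flash chip off the board only gets ciphertext.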
dd (Score:2)
Re: (Score:2, Insightful)
A cheaper alternative that actually works (Score:5, Informative)
No self-destruct, but hard enough encryption for all but the most sensitive secret data.
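If anyone wants to roll their own along those lines, the heart of "encrypt it before it ever touches the stick" is just a few OpenSSL EVP calls. A bare-bones sketch (minimal error handling, and no key management - deriving the key and IV from a passphrase, e.g. with PBKDF2, is where the real work is):

#include <openssl/evp.h>

/* Encrypts 'len' bytes from 'in' into 'out' with AES-256-CBC.
 * 'out' must have room for len plus one extra block of padding.
 * Returns the number of ciphertext bytes written, or -1 on failure. */
int encrypt_buffer(const unsigned char *in, int len,
                   const unsigned char key[32], const unsigned char iv[16],
                   unsigned char *out)
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    int n = 0, total = 0;

    if (ctx == NULL || !EVP_EncryptInit_ex(ctx, EVP_aes_256_cbc(), NULL, key, iv))
        goto fail;
    if (!EVP_EncryptUpdate(ctx, out, &n, in, len))
        goto fail;
    total = n;
    if (!EVP_EncryptFinal_ex(ctx, out + total, &n))
        goto fail;
    total += n;
    EVP_CIPHER_CTX_free(ctx);
    return total;

fail:
    EVP_CIPHER_CTX_free(ctx);
    return -1;
}

Write the output of that to any dumb $15 stick and the worst a thief gets is a blob to stare at - which is more than the Secustick managed for $175.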
Re: (Score:2)
I imagine self-destruct was the lure. If they had bothered to encrypt the contents as well, bypassing the self-destruct would not have been the catastrophic failure it was. The crunchy-on-the-outside, chewy-on-the-inside security model fails again!
Re: (Score:3, Informative)
Not shipping with debug symbols is important, looks like just that happened here. It also reduces the file size greatly.
Those devs are very clueless.
Not debug symbols. (Score:2)
When you build Dynamic Link Libraries, you need to export the function names in order to be able to call them. This way you can call something like GetProcAddress() [microsoft.com], which takes as parameters a handle to a DLL and a string representing the name of the function you're interested in calling.
Here, have some sample code.
typedef ULONG (WINAPI * External_Function)(/* parameter list goes here */);
/* "target.dll" and "SomeExport" are placeholders for whatever DLL and export you're after. */
HMODULE targetDll = LoadLibrary(TEXT("target.dll"));
External_Function H4X0R = (External_Function)GetProcAddress(targetDll, "SomeExport");
Re: (Score:3, Informative)
So you can maintain the other SE's crappy code.
But maybe we should look to the security through obscurity methodology as an additional layer of protection.
That's what obfuscators are for.
Re: (Score:2)
But you do not want to give the same level of comfort to the guy who throws your code through some disassembler. That's just plain dumb. Leaving the debug symbols in code is not really a sign of smarts, since it us
Re: (Score:3, Insightful)
No (Score:3, Interesting)
And those guys rarely leave any clues in the code, often e
Re: (Score:2)
We're supposed to use descriptive variable names because when you're working with a team of people, it's more apparent to your teammates what a method called Ver
Re: (Score:2)
There's only one good way of making this safe: encrypt the data on the drive. But that makes the "protected memory stick" idea really pointless. AFAIK, there's no standard way of sending a password to a USB key. Since you're going to need special drivers for
Re: (Score:2)
If you had actual security (i.e., used reasonably strong encryption) to start with, the obscurity wouldn't add substantially to the security. Here, the only protection was security through insufficient obscurity, so yes, additional obscurity would have made it marginally less easy to access, but would have been nowhere near as good as actually using strong encryption on the data, and wouldn't have mad
Re: (Score:3, Insightful)
Repeat after me:
If the debug symbols in your executable have ANY EFFECT WHATSOEVER on the security of your product, your product is insecure.
Let's say that again:
Debug symbols are a good thing. They allow people to analyze the behavior of your software better. If analyzing the behavior of your software leads them to
Re: (Score:2)
Get on a train and say that.