Protected Memory Stick Easily Cracked

Martin_Sturm writes "A $175 1GB USB stick designed to protect your data turns out to be very insecure. According to the distributor of the Secustick, the safety of the data is ensured: 'Due to its unique technology it has the ability to destroy itself once an incorrect password is entered.' The Secustick is used by various European governments and organizations to secure data on USB sticks. Tweakers.net shows how easy it is to break the protection of the stick. Quoting: 'It should be clear that the stick's security is quite useless: a simple program can be used to fool the Secustick into sending its unlock command without knowing the password. Besides, the password.exe application can be adapted so that it accepts arbitrary passwords.' The manufacturer got the message and took the Secustick website offline. The site gives a message (translated from Dutch): 'Dear visitor, this site is currently unavailable due to security issues of the Secustick. We are currently working on an improved version of the Secustick.'"
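
To make the class of attack concrete: if the host-side software merely tells the stick to unlock, rather than using the password as key material, then any program that can issue that command gets the data. Below is a minimal sketch of the idea; the library name and function are entirely hypothetical (the article does not document the real vendor API):

```python
# Hypothetical sketch only: "secustick.dll" and "SendUnlock" are invented
# names to illustrate the attack class; the real interface is undocumented.
import ctypes

vendor = ctypes.WinDLL("secustick.dll")  # hypothetical vendor library

# If unlocking is a bare command with no cryptographic tie to the password,
# replaying that command is the whole "attack":
vendor.SendUnlock()  # device now exposes its files to the OS
```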
  • by insanemime ( 985459 ) on Friday April 13, 2007 @08:57AM (#18717041)
    At least they had the balls to admit that something was wrong and to take steps to fix it. It will be interesting to see if they recall the ones already sold.
  • Re:Truecrypt (Score:2, Insightful)

    by bcoff12 ( 584459 ) on Friday April 13, 2007 @09:07AM (#18717163)
    Yep, traveler mode + solid password + key files = oops I lost the USB stick with my password list on it, oh well.
  • Re:TrueCrypt (Score:5, Insightful)

    by Rob T Firefly ( 844560 ) on Friday April 13, 2007 @09:12AM (#18717215) Homepage Journal
    The type of people who have the wherewithal to set up TrueCrypt are not the market this was aiming for. This seems like a product made for the techno-clueless PHB types who just want to buy something off the shelf they can stick in their magic computer box and have it "just work," and who see that high a price on a simple 1-gig USB stick not as an obvious ripoff, but as a measure of how much good computer magic it must surely contain.
  • by antime ( 739998 ) on Friday April 13, 2007 @09:12AM (#18717217)
    What they admitted is that they have no idea what they are doing and have no idea what they are selling. You would have to be an idiot to buy anything security-related from a company like that.
  • by eddy ( 18759 ) on Friday April 13, 2007 @09:21AM (#18717319) Homepage Journal

    No surprise that the security is non-existent, but a nice surprise that tweakers.net[0] have people skilled enough to do a thorough technical review. Tip of the hat to the reviewers, and keep up the good work. Anyone can run 3D benchmarks and make graphs against the previous generation, but this requires a different level of technical know-how. It's always been my hope that the future would feature this type of review, using reverse-engineering techniques for in-depth technical analysis, as the norm, not the exception.

    [0] No disrespect to the people of tweakers.net, I mean in the sense of 'any popular review site'.

  • by FuzzyDaddy ( 584528 ) on Friday April 13, 2007 @09:33AM (#18717463) Journal
    I don't know about you, but I don't keep original copies of data on a USB key. I use it to transfer files from one computer to another, so wiping the data after unsuccessful attempts, in this context, strikes me as a good idea.

  • by lexarius ( 560925 ) on Friday April 13, 2007 @09:37AM (#18717511)
    Shouldn't stripping the debug symbols from the executable be sufficient? The problem is that people don't give up that easily. Having everything obviously labeled made the job quicker, but not having the labels won't stop a sufficiently skilled/bored hacker.
  • by rucs_hack ( 784150 ) on Friday April 13, 2007 @09:38AM (#18717513)
    It's not that silly. They saw a way to make money from the current delusion that data can be unbreakably secured.

    The only way to secure data is to make sure absolutely no-one but the authorised people has access to it. You can keep data secure physically if you isolate it from any form of access. However, information does not work well if isolated like that; it has to be shareable to be useful, otherwise it's just dead data, worthless bits.

    I have several pieces of information that are unhackable. That's because they are written to DVDs in unencrypted form, but the DVDs themselves are stashed away where no-one can find them.

    That is, alas, also a delusion, because if I died tomorrow no doubt someone could find them. However, so long as I'm around to protect them, they are safe.

    But that's the wrong way to think about it once you realise that there is money in pseudo-security. Customer paranoia can be a source of income, and a wise businessman will take advantage of it whenever possible.
  • by Opportunist ( 166417 ) on Friday April 13, 2007 @09:38AM (#18717515)
    If you're satisfied with a level of security that has been shown to be easily broken, you've proven that you don't need any security at all.

    If people don't bother breaking your security, they weren't that interested in your information in the first place.
    If people who are interested in your secrets can break it trivially, you may as well abstain from encryption altogether and save yourself the hassle.
  • by hey! ( 33014 ) on Friday April 13, 2007 @09:41AM (#18717545) Homepage Journal
    When they are harping on the device's unique technology.

    Unique and secure are mutually exclusive.

    It is not possible, through a feat of sheer genius, to make something that is both novel and demonstrably secure. It turns out that genius isn't a particularly rare commodity. With 6.5 billion people in the world, there are 6,500 people walking around with one-in-a-million levels of intellect. Any one of those people, on a good day, can beat any other person on earth in a battle of wits. Any one of the millions of people with one-in-a-thousand intellects probably can, too.

    Security is the one aspect of technology where the state of the art is better than something that advances the state of the art. State of the art means nobody has yet, even on the best day they've ever had, been able to beat it. We've seen some recent examples where very narrow vulnerabilities have been found in hashing algorithms, which has forced the state of the art to change slightly in favor of drop-in replacements. But by and large the state of the art has been remarkably stable over a long, long time. Anybody who claims to have something nobody else has probably has something worthless, if he has anything at all.

    This is why product security is so bad. It's not possible to differentiate yourself based on security without affecting other areas such as usability. There is considerable irony in this fact: a product that is carefully thought out and implemented using widely known techniques would have a good chance of being unique. The problem is selling the product. Lotus Notes is a good example. It has its strengths and weaknesses, but as of the early 90s it was the most secure email system in the world. In fact, it still would be. But it wasn't the easiest to use or administer. Unfortunately, their attempts to make the system more attractive were failures. It's never been more attractive than Exchange. But it's always been more secure.
  • by mykepredko ( 40154 ) on Friday April 13, 2007 @09:42AM (#18717549) Homepage
    Sorry, I don't have the time to research the device, but what kind of testing/validation of this product was done? If this was for a government originally, shouldn't it have had to demonstrate some kind of hacker-proof level of security? Was what was on the package marketing hype ("Protects your data from targeted attacks", which means nothing) or an indication that some kind of testing was done (e.g. "Meets MIL-1234 requirements for data security")?

    It looks like, for $175, you get a 1GByte USB key with a Windows access program on the flash in an unprotected partition, and a pretty box.

    From the description it sounds like the product was just marketing razzamatazz with no real substance to back up the claims - so why would somebody have bought it in the first place?

    myke
  • Re:TrueCrypt (Score:1, Insightful)

    by SuseLover ( 996311 ) on Friday April 13, 2007 @09:50AM (#18717629)
    I'm sorry, but secure encryption is a complicated subject and anyone who doesn't understand it should not rely on it to be secure. If you lack the basic skills to properly implement it then you have no business using it.
  • by @madeus ( 24818 ) <slashdot_24818@mac.com> on Friday April 13, 2007 @10:01AM (#18717745)

    it's not that silly.
    I contend it is not only silly, but sufficiently bad to warrant legal action, because whoever built it must have known how badly it was designed to start with.

    It appears that the system doesn't use a form of encryption unlocked by a key (entered by the user) to store the data - instead it simply requires a single instruction to the USB device to indicate whether the data ought to be accessible or not. That just sounds ludicrous.

    If it had been developed in good faith, and this were a bug (rather than part of the design) and/or the result of a sophisticated exploit that would have been hard to predict, I would be sympathetic. As I would if they had clearly indicated its limitations (which they could have done, but given they've taken the website down, I'm guessing not).

    What's particularly telling for me is that, while the company were quite happy to tout the supposed virtues of the product, they are clearly worried now that they have been found out. That represents a staggering failure by the designers of the software, their managers, the marketing and product design teams, the HR department who hired all these people of clearly very dubious virtue, and the senior management involved.

    Either they are crooks (because they were complicit in touting such a crummy product that didn't really do what it claimed to do in any reasonable way) or they are all really, really dumb (and none of them asked pertinent questions of the other parties at any stage of product development).
  • Funny part is, all they did was run the program in a debugger, put a breakpoint after the clearly labelled "VerifyPassWord" function


    Wait. The executable was compiled with debug symbols turned on? With functions with easy-to-understand names? I mean, I know it's only security-through-obscurity, but c'mon! At least up the ante a little bit ... many programmers are not skilled enough to disassemble a program with no symbol table. And the ones that are ... *shrug* rely on the security of your methods, not on the obscurity of your code. IOW, they should have used encryption, even with the self-destruct mechanism.
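
    For contrast, here is what "rely on the security of your methods" looks like in practice: derive the encryption key from the password itself, so there is no yes/no comparison for a debugger to patch; a wrong password simply fails to decrypt. A minimal sketch using the pyca/cryptography package, with illustrative (not vetted) parameters:

```python
# Minimal sketch: password-derived encryption instead of a patchable check.
# Uses the pyca/cryptography package; parameters are illustrative only.
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt(password: str, plaintext: bytes) -> bytes:
    salt = os.urandom(16)
    nonce = os.urandom(12)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt + nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt(password: str, blob: bytes) -> bytes:
    salt, nonce, ct = blob[:16], blob[16:28], blob[28:]
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    # A wrong password yields a wrong key; AES-GCM then raises InvalidTag.
    # There is no branch here that a debugger patch can flip to "accepted".
    return AESGCM(key).decrypt(nonce, ct, None)
```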
  • by mlwmohawk ( 801821 ) on Friday April 13, 2007 @10:06AM (#18717809)
    Like other posters, I am at a loss at where to start.

    (1) If you don't have encryption, GOOD ENCRYPTION, you can't protect squat.
    (2) "Self Destruct" is interesting, but unless you have a custom micro-controller on the ram stick, AND an independent power supply, AND the device potted in epoxy, it is all just a made for TV gimmick.
    (3) Password.exe? I didn't see this in the article, but what happens if one plugs it into a Mac, Linux, FreeBSD, etc? Does it just work or does it self destruct?
    (4) With reference to #2, since the article showed that one could make the device read-only, would self-destruct no longer work? If so, it MUST be potted in epoxy.
    (5) Does the "self destruct" operate on the PC or on the stick itself? We all know that if it runs on the PC, it is doomed to fail.

    If they want to REALLY do this (a rough sketch of the heartbeat idea follows the list):

    (1) before everything, encrypt the data. This buys the device time to operate and basic security.
    (2) Install a PIC or something that MUST have an encoded heart beat with some sort of hard to reproduce calculated byte pattern.
    (3) Without a valid heart beat, the PIC will simply not enable the flash device.
    (4) With a valid heart beat, the system must pass a valid password hash string within a reasonable amount of time to the PIC, or the data will be destroyed.
    (5) After a number of failed attempts, the PIC will destroy the data.
    (6) When the heart beat stops, the PIC disables the flash. (It is presumed that the software clears the file system cache as well.)
    (7) Pot the damned device in epoxy.
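
    A rough sketch of what the heartbeat idea (steps 2-6) could look like as a challenge-response protocol. It is modelled in Python for readability, even though the real thing would live in microcontroller firmware; all names and parameters are invented for illustration:

```python
# Rough model of the proposed heartbeat: the host must keep answering
# HMAC challenges derived from the password, or the controller cuts access.
# Pure illustration; a real PIC implementation would differ considerably.
import hmac
import hashlib
import secrets

class ControllerModel:
    MAX_FAILURES = 5

    def __init__(self, password_hash: bytes):
        self._secret = password_hash     # stored inside the potted controller
        self._nonce = secrets.token_bytes(16)
        self._failures = 0
        self.flash_enabled = False

    def challenge(self) -> bytes:
        self._nonce = secrets.token_bytes(16)
        return self._nonce

    def heartbeat(self, response: bytes) -> None:
        expected = hmac.new(self._secret, self._nonce, hashlib.sha256).digest()
        if hmac.compare_digest(response, expected):
            self._failures = 0
            self.flash_enabled = True    # step 4: valid beat, enable flash
        else:
            self._failures += 1
            self.flash_enabled = False   # step 6: bad/missing beat, disable
            if self._failures >= self.MAX_FAILURES:
                self._destroy()          # step 5: too many failed attempts

    def _destroy(self) -> None:
        self._secret = b"\x00" * len(self._secret)  # wipe key material

# Host side: derive the shared secret from the user's password.
def respond(password: str, nonce: bytes) -> bytes:
    secret = hashlib.sha256(password.encode()).digest()
    return hmac.new(secret, nonce, hashlib.sha256).digest()
```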

  • by TheRaven64 ( 641858 ) on Friday April 13, 2007 @10:09AM (#18717855) Journal
    It is unlikely that the only copy of sensitive data would be on the USB stick. If it is destroyed, you still have the original copy somewhere more secure than your pocket. If it's destroyed accidentally, it could be a lot less of a problem than if it fell into the wrong hands.

    There are a lot of situations where having a local copy of the data is a convenience, rather than a necessity, and this would allow the convenience without the risk of it being stolen. If it's accidentally destroyed, then it's an inconvenience, not a disaster.

  • Ah, didn't see that part, I guess. But still, with DLL functions you can name your function something like zxgvflqrt().

    Still, there's no reason why one has to use DLLs, either. You can put everything into one .exe file if you like.
  • by Viol8 ( 599362 ) on Friday April 13, 2007 @11:09AM (#18718697) Homepage
    "You can put everything into one .exe file if you like."

    That would be the sensible way to go, since it's unlikely any other app would ever use that .dll. But sensible isn't something that generally applies to Windows programs these days, as the amount of .dll splatter from even the simplest apps testifies.
  • by Anonymous Coward on Friday April 13, 2007 @11:17AM (#18718801)
    No, because you did your homework.
  • by AJWM ( 19027 ) on Friday April 13, 2007 @11:34AM (#18719057) Homepage
    Without the correct password, the controller chip will simply refuse to provide further access to the flash memory.

    So even if the password control worked (which it doesn't), you could get at the contents by desoldering the flash chip and putting it in a reader. (Something hobbyists have been doing with HD-DVD drives to reverse engineer/modify the firmware.) And this is a supposedly intelligence-service recommended device for government use? Right, go on, pull the other one.
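
    The point generalises: once the bare flash chip is in a reader, anything stored unencrypted is simply there. A few lines suffice to pull readable strings out of a raw dump, the way the `strings` utility would; the dump filename here is hypothetical:

```python
# Minimal sketch: carve printable ASCII runs out of a raw flash dump.
# "dump.bin" is a hypothetical file produced by reading the desoldered chip.
import re

with open("dump.bin", "rb") as f:
    image = f.read()

# Any unencrypted document text, passwords, etc. show up immediately.
for run in re.findall(rb"[\x20-\x7e]{8,}", image):
    print(run.decode("ascii"))
```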
  • by Elladan ( 17598 ) on Friday April 13, 2007 @12:33PM (#18719935)
    And this is a great example of why most people shouldn't be allowed near a security product without training. Training which includes getting their head slapped when they say things like this.

    Repeat after me:
    If the debug symbols in your executable have ANY EFFECT WHATSOEVER on the security of your product, your product is insecure.

    Let's say that again:
    Debug symbols are a good thing. They allow people to analyze the behavior of your software better. If analyzing the behavior of your software leads them to conclude that it's insecure, then it was insecure WHETHER OR NOT YOU HAD DEBUG SYMBOLS.

    A third time:
    A secure piece of software is secure whether or not you ship the debug symbols, the source code, and a giant manual explaining the design of the system in excruciating detail. If any of these things affects the security in any way, your design is broken.

    The fact is, every time something like this comes up, people start screaming "kill the messenger!" In this case, the messenger was the debug symbols. The message was, "this security product is a laughable toy."

    Come on, if they'd shipped you the software as non-compiled Python, would you have screamed, "What fools! Only hand-coded assembly can be secure!"? Ridiculous. A secure design can be implemented in any language whatsoever, with or without source code, object code, and symbols.

    If you are the least bit worried about supplying all those things with your software, you have no business calling it a security product.

    It's a toy.

    Case in point: this memory stick. It sure is a good thing they made it so easy to analyze their system with debugging symbols. If they hadn't, people might still be falsely believing it was a security product, and putting their valuable data in it. Now wouldn't that be a terrible thing?
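
    The point can be made mechanical: if access is gated by a host-side function that merely answers yes or no, an attacker who controls the host just replaces that function, symbols or no symbols. A toy illustration, with all names invented to mirror the VerifyPassWord-style design:

```python
# Toy illustration: a host-side yes/no gate is insecure by construction.
# All names are invented; this mirrors the VerifyPassWord-style design.
def verify_password(pw: str) -> bool:
    return pw == "hunter2"            # the "secret" check

def unlock(pw: str) -> str:
    if verify_password(pw):
        return "contents of the stick"
    raise PermissionError("wrong password")

# The attacker needs neither the password, the symbols, nor the source;
# controlling the process is enough to flip the gate:
verify_password = lambda pw: True     # equivalent to patching the jump
print(unlock("anything at all"))      # -> "contents of the stick"
```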
  • by Opportunist ( 166417 ) on Friday April 13, 2007 @12:46PM (#18720135)
    Not that bad, but it's just as good as leaving it lying around in plain view. The only difference is that you don't have to jump through hoops to get to your information.

    I'm usually amazed how many people will add more and more layers of "security" to their door, ignoring that the walls are made of paper. Sure, you, as the authorized user, have to go through time-consuming hassle to get to your data, but the attacker, who doesn't care about damage, can easily circumvent your security. He might leave your system in ruins, but does he care?

    Security is a matter of "wholeness". The whole "building" of your security has to be flawless; even a little gap is enough for an attacker. The attacker only has to find the weakest link and pry it open.

    That's why it's usually easier to attack data than to defend it. Especially if the human element is in play.
