Protected Memory Stick Easily Cracked 220
Martin_Sturm writes "A $175 1GB USB stick designed to protect your data turns out to be very insecure. According to the distributor of the Secustick, the safety of the data is ensured: 'Due to its unique technology it has the ability to destroy itself once an incorrect password is entered.' The Secustick is used by various European governments and organizations to secure data on USB sticks. Tweakers.net shows how easy it is to break the protection of the stick. Quoting: 'It should be clear that the stick's security is quite useless: a simple program can be used to fool the Secustick into sending its unlock command without knowing the password. Besides, the password.exe application can be adapted so that it accepts arbitrary passwords.' The manufacturer got the message and took the Secustick website offline. The site gives a message (translated from Dutch): 'Dear visitor, this site is currently unavailable due to security issues of the Secustick. We are currently working on an improved version of the Secustick.'"
Well they could have been like other companies (Score:5, Insightful)
Re:Truecrypt (Score:2, Insightful)
Re:TrueCrypt (Score:5, Insightful)
Re:Well they could have been like other companies (Score:5, Insightful)
A surprise and a non-surprise. (Score:5, Insightful)
No surprise that the security is non-existent, but a nice surprise that tweakers.net[0] have people skilled enough to do a thorough technical review. Tip-of-the-hat to the reviewers, and keep up the good work. Anyone can run 3D benchmarks and make graphs against the previous generation, but this requires a different level of technical know-how. It's always been my hope that the future would feature this type of review, using reverse-engineering techniques for in-depth technical reviews, as a norm, not an exception.
[0] No disrespect to the people of tweakers.net, I mean in the sense of 'any popular review site'.
Re:Well they could have been like other companies (Score:5, Insightful)
Re:Security through obscurity? (Score:3, Insightful)
Re:Well they could have been like other companies (Score:3, Insightful)
The only way to secure data is to ensure that absolutely no-one but the authorised people have access to it. You can keep data physically secure if you isolate it from any form of access. However, information does not work well if isolated like that; information has to be shareable to be useful, otherwise it's just dead data, worthless bits.
I have several pieces of information that are unhackable. That's because they are written to DVDs in non-encrypted form, but the DVDs themselves are stashed away where no-one can find them.
That is alas also a delusion, because if I died tomorrow no doubt someone could find them. However, so long as I'm around to protect them, they are safe.
That's not a good way to think if you realise that there is money in pseudo security. Paranoia of customers can be a source of income, and a wise businessman will take advantage of that whenever possible.
Re:Well they could have been like other companies (Score:3, Insightful)
If people don't bother breaking your security, they aren't that interested in your information in the first place.
If people who are interested in your secrets are able to get at them trivially, you might as well forgo encryption altogether and save yourself the hassle.
It should have been obvious (Score:4, Insightful)
Unique and secure are mutually exclusive.
It is not possible, through a feat of sheer genius, to make something that is both novel and demonstrably secure. It turns out that genius isn't a particularly rare commodity. With 6.5 billion people in the world, there are 6,500 people walking around with one-in-a-million levels of intellect. Any one of those people, on a good day, can beat any other person on earth in a battle of wits. Any one of the millions of people with one-in-a-thousand intellects probably can, too.
Security is the one aspect of technology where the state of the art is better than something which advances the state of the art. State of the art means nobody has yet, even on the best day they've ever had, been able to beat it. We've seen some recent examples where very narrow vulnerabilities have been found in hashing algorithms, which has forced the state of the art to change slightly to favor drop-in replacements. But by and large the state of the art has been remarkably stable over a long, long time. Anybody who claims to have something nobody else has probably has something worthless, if he has anything at all.
This is why product security is so bad. It's not possible to differentiate yourself based on security, without affecting other areas such as usability. There is considerable irony in this fact: a product that is carefully thought out and implemented using widely known techniques would have a good chance of being unique. The problem is selling the product. Lotus Notes is a good example. It has its strengths and weaknesses, but as of the early 90s it was the most secure email system in the world. In fact it still would be. But it wasn't the easiest to use or administer. Unfortunately their attempts to make the system more attractive were failures. It's never been more attractive than Exchange. But it's always been more secure.
Validation/Verification of Security (Score:2, Insightful)
It looks like that for $175, you get a 1GByte USB key, with a Windows access program on the Flash in a non-protected partition and a pretty box.
From the description it sounds like the product was just marketing razzamatazz with no real substance to back up marketing claims - so why would somebody have bought it in the first place?
myke
Re:TrueCrypt (Score:1, Insightful)
Re:Well they could have been like other companies (Score:4, Insightful)
It appears that the system doesn't use any form of encryption unlocked by a key (entered by the user) to store the data, and that instead it simply requires a single instruction to the USB device to indicate whether the data ought to be accessible. That just sounds ludicrous.
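A toy model makes the flaw concrete. The command and class names below are hypothetical (the real Secustick protocol is not public); the point is only that if the device obeys a single unauthenticated unlock command, the host-side password check can simply be skipped:

```python
# Toy model of the design flaw described above. Names are hypothetical;
# this is an illustration, not the actual Secustick protocol.
class NaiveStick:
    """A stick that trusts the host: one unauthenticated 'unlock' command."""
    def __init__(self):
        self.locked = True

    def handle(self, command):
        if command == "UNLOCK":  # no password or proof of knowledge required
            self.locked = False

def vendor_tool(stick, typed_password):
    # The password check lives entirely on the host side,
    # so an attacker's program can simply bypass it.
    if typed_password == "secret":
        stick.handle("UNLOCK")

stick = NaiveStick()
vendor_tool(stick, "wrong-guess")  # legitimate path fails as intended...
assert stick.locked
stick.handle("UNLOCK")             # ...but an attacker sends the command directly
assert not stick.locked
```

Because the device never verifies anything itself, every security decision can be forged by whoever controls the host.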
If it had been developed in good faith, and this were a bug (rather than part of the design) and/or the result of a sophisticated exploit that would have been hard to predict, I would be sympathetic. As I would if they had clearly indicated its limitations (which they could have, but since they've taken the website down now, I'm guessing not).
What's particularly telling for me is that, while the company were quite happy to tout the supposed virtues of the product, they are clearly worried about it now they have been found out. That represents a staggering failure by the designers of the software, their managers, the marketing and product design teams, the HR department who hired all these people of clearly very dubious virtue, and the senior management involved.
Either they are crooks (because they were complicit in touting such a crummy product that didn't really do what it claimed to do in a reasonable way) or they are all, really, really dumb (and none of them asked pertinent questions of the other parties at any stage of product development).
Re:Well they could have been like other companies (Score:5, Insightful)
Wait. The executable was compiled with debug symbols turned on? With functions with easy-to-understand names? I mean, I know it's only security-through-obscurity, but c'mon! At least up the ante a little bit.
Stupid is as stupid does (Score:4, Insightful)
(1) If you don't have encryption, GOOD ENCRYPTION, you can't protect squat.
(2) "Self Destruct" is interesting, but unless you have a custom micro-controller on the ram stick, AND an independent power supply, AND the device potted in epoxy, it is all just a made for TV gimmick.
(3) Password.exe? I didn't see this in the article, but what happens if one plugs it into a Mac, Linux, FreeBSD, etc? Does it just work or does it self destruct?
(4) With reference to #2, since the article showed that one could make the device read-only, would self-destruct no longer work? If so, it MUST be potted in epoxy.
(5) Does the "self destruct" operate on the PC or th ram stick? We all know if it runs on the PC, it is doomed to fail.
If they want to REALLY do this:
(1) Before everything else, encrypt the data. This buys the device time to operate and provides a baseline of security.
(2) Install a PIC or similar microcontroller that MUST receive an encoded heartbeat carrying some hard-to-reproduce calculated byte pattern.
(3) Without a valid heartbeat, the PIC will simply not enable the flash device.
(4) With a valid heartbeat, the system must pass a valid password hash string to the PIC within a reasonable amount of time, or the data will be destroyed.
(5) After a number of failed attempts, the PIC will destroy the data.
(6) When the heartbeat stops, the PIC disables the flash. (It is presumed that the software clears the file system cache as well.)
(7) Pot the damned device in epoxy.
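The scheme above can be sketched in software. This is a minimal toy model, not firmware: the class name, the HMAC-based heartbeat, and the failure threshold are all assumptions chosen for illustration, and a real device would also need tamper-resistant hardware (the epoxy) to back it up:

```python
import hashlib
import hmac
import os

class Pic:
    """Toy model of the proposed scheme: a microcontroller that gates
    flash access behind a keyed heartbeat and wipes its key material
    after too many bad password attempts. Hypothetical, for illustration."""

    MAX_FAILURES = 5  # assumed threshold

    def __init__(self, shared_key: bytes, password_hash: bytes):
        self._key = shared_key        # provisioned at manufacture
        self._pw_hash = password_hash
        self._failures = 0
        self._nonce = b""
        self.flash_enabled = False
        self.destroyed = False

    def challenge(self) -> bytes:
        # Fresh nonce per heartbeat, so old responses can't be replayed.
        self._nonce = os.urandom(16)
        return self._nonce

    def heartbeat(self, response: bytes) -> bool:
        # Host proves it holds the shared key by HMACing the nonce.
        expected = hmac.new(self._key, self._nonce, hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)

    def unlock(self, password: bytes) -> bool:
        if self.destroyed:
            return False
        if hashlib.sha256(password).digest() == self._pw_hash:
            self._failures = 0
            self.flash_enabled = True
            return True
        self._failures += 1
        if self._failures >= self.MAX_FAILURES:
            self._key = None          # "destroy": wipe the key material
            self.flash_enabled = False
            self.destroyed = True
        return False

# Legitimate host: answers the heartbeat, then unlocks with the password.
key = os.urandom(32)
pic = Pic(key, hashlib.sha256(b"hunter2").digest())
nonce = pic.challenge()
response = hmac.new(key, nonce, hashlib.sha256).digest()
assert pic.heartbeat(response)
assert pic.unlock(b"hunter2")
```

The crucial difference from the real product: every decision here happens inside the device model, so patching the host-side software gains the attacker nothing.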
Re:Well they could have been like other companies (Score:5, Insightful)
There are a lot of situations where having a local copy of the data is a convenience, rather than a necessity, and this would allow the convenience without the risk of it being stolen. If it's accidentally destroyed, then it's an inconvenience, not a disaster.
Re:Well they could have been like other companies (Score:3, Insightful)
Still, there's no reason why one has to use DLLs, either. You can put everything into one
Re:Well they could have been like other companies (Score:3, Insightful)
That would be the sensible way to go since its unlikely any other app would ever use that
Re:Does everyone have to be an expert on everythin (Score:1, Insightful)
Re:No encryption... (Score:3, Insightful)
So even if the password control worked (which it doesn't), you could get at the contents by desoldering the flash chip and putting it in a reader. (Something hobbyists have been doing with HD-DVD drives to reverse engineer/modify the firmware.) And this is a supposedly intelligence-service recommended device for government use? Right, go on, pull the other one.
Re:Security through obscurity? (Score:3, Insightful)
Repeat after me:
If the debug symbols in your executable have ANY EFFECT WHATSOEVER on the security of your product, your product is insecure.
Let's say that again:
Debug symbols are a good thing. They allow people to analyze the behavior of your software better. If analyzing the behavior of your software leads them to conclude that it's insecure, then it was insecure WHETHER OR NOT YOU HAD DEBUG SYMBOLS.
A third time:
A secure piece of software is secure whether or not you ship the debug symbols, the source code, and a giant manual explaining the design of the system in excruciating detail. If any of these things affects the security in any way, your design is broken.
The fact is, every time something like this comes up, people start screaming "kill the messenger!" In this case, the messenger was the debug symbols. The message was, "this security product is a laughable toy."
Come on, if they'd shipped you the software as uncompiled Python, would you have screamed, "What fools! Only hand-coded assembly can be secure!"? Ridiculous. A secure design can be implemented in any language whatsoever, with or without source code, object code, and symbols.
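This is Kerckhoffs's principle in miniature, and it's easy to demonstrate with a sketch. Everything below can be published, symbols, source, and all, because the security rests solely on the secret password (the function name and iteration count are just illustrative choices):

```python
import hashlib
import os

def derive_key(password: bytes, salt: bytes) -> bytes:
    # PBKDF2 makes each password guess deliberately expensive,
    # so the only attack left is brute-forcing the password itself.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

salt = os.urandom(16)
stored = derive_key(b"correct horse", salt)

# An attacker with the full source, the debug symbols, the salt, AND the
# stored key still learns nothing without guessing the password:
assert derive_key(b"correct horse", salt) == stored
assert derive_key(b"wrong guess", salt) != stored
```

If publishing this code weakened it, the design would be broken; since it doesn't, shipping symbols changes nothing.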
If you are the least bit worried about supplying all those things with your software, you have no business calling it a security product.
It's a toy.
Case in point: this memory stick. It sure is a good thing they made it so easy to analyze their system with debugging symbols. If they hadn't, people might still be falsely believing it was a security product, and putting their valuable data in it. Now wouldn't that be a terrible thing?
Re:Well they could have been like other companies (Score:3, Insightful)
I'm always amazed at how many people add more and more layers of "security" to their door while ignoring that the walls are made of paper. Sure, you, as the authorized user, have to go through time-consuming hassle to get to your data, but the attacker, who doesn't care about damage, finds it easy to circumvent your security. He might leave your system in ruins, but does he care?
Security is a matter of "wholeness". The whole "building" of your security has to be flawless; even a little gap is enough for an attacker. The attacker only has to find the weakest link and pry it open.
That's why it's usually easier to attack data than to defend it. Especially if the human element is in play.