Trust Is For Suckers: Lessons From the RSA Breach
wiredmikey writes "Andrew Jaquith has written a great analysis of lessons learned from the recent RSA Cyber Attack, from a customer's perspective. According to Jaquith, in the security industry, 'trust' is a somewhat slippery concept, defined in terms ranging from the cryptographic to the contractual. Bob Blakley, a Gartner analyst and former chief scientist of Tivoli, once infamously wrote that 'Trust is for Suckers.' What he meant is that trust is an emotional thing, a fragile bond whose value transcends prime number multiplication, tokens, drug tests or signatures — and that it is foolish to rely too much on it. Jaquith observed three things about the RSA incident: (1) even the most trusted technologies fail; (2) the incident illustrates what 'risk management' is all about; and (3) customers should always come first."
Trust is required (Score:5, Insightful)
Re: (Score:2, Insightful)
"Trust but verify" really means "don't trust"; otherwise you would not have to verify. Non-thinking people like the phrase because their idol said it. Why middle-class folks idolize someone who sold out the middle class, I do not understand.
Re: (Score:2)
Re: (Score:2)
What do you think got us here?
Re: (Score:1)
Re: (Score:2)
Spending or crazy-low taxes: it's one or the other, folks. At this point we need to cut spending, and when our economy is in decent shape, raise taxes to get the debt down.
Our problem was we spent and spent while cutting taxes and did not save up for the rainy days ahead.
Re: (Score:2)
Re:Trust is required (Score:4, Insightful)
Re: (Score:3, Insightful)
I don't hate anyone; I dislike people who work against folks I do like, though. I don't like it when people idolize those who work against them, either. "Trust, but verify" just makes no sense. "Never trust, always verify" at least makes good sense.
Re: (Score:2, Informative)
I don't hate anyone
Bullshit.
Re:Trust is required (Score:5, Informative)
No, what it means is that you don't blindly trust anybody, but you do verify periodically that the trust hasn't been abused. It's like granting a business the right to take money out of your checking account to cover expenses, say a CC company. You trust them not to put things on the bill which you didn't authorize. And you verify at least once a month that everything on the bill was authorized by you.
Same thing here, the problem with RSA was that people trusted them, but there was no particular manner of verifying that the trust was well placed.
Re: (Score:1)
Trust but verify means that some (most?) of the time you trust, but occasionally you verify to ensure that trustworthiness is still warranted.
Don't trust means you have to verify every time. Trust without verifying means that sooner or later you're going to get taken.
Really, it's not rocket science. (grin)
Re: (Score:2)
> Why middle class folks idolize someone who sold out
> the middle class I do not understand.
It's easy. We all have our own private reality.
Here's mine. Whatever any POTUS may have done is inconsequential compared to what our Gov't did in 1913. They set in motion everything that's happened since when they handed over our economy to a semi-accountable, quasi-governmental, 100% privately owned for-profit banking cartel.
Re: (Score:2)
I'd like to rebut your rebuttal, but I'm having trouble figuring out what you think you rebutted.
So long, and thanks for the red herring.
Re: (Score:2)
That's one point. But I don't know why you consider it more important than all the others.
Personally I consider two points crucial:
1) The Civil War, when BOTH sides centralized control of the government over the populace.
2) The Santa Clara v. Southern Pacific headnote, which got corporations to be considered legal persons.
Basically, though, when the frontier closed, increased governmental control over the citizenry started ramping up immediately. The increase was slow at first. But you could also pick the Constitution gi
Re: (Score:2)
> I don't know why you consider it more important than all the others.
Because we were talking about the economy.
But I agree completely with the milestones you picked for the general erosion of liberty. I hope one day we'll be able to point to events that put us back on track.
Re:Trust is required (Score:5, Insightful)
> trust is also required to have a functioning society.
Maybe, to a degree.
>"Trust but verify" is the best.
Indeed it is. Trust works when claims can be supported.
Problems happen when information is just not verifiable, such as in closed source products, secret negotiations, undisclosed business interests, or whenever information is withheld or misrepresented.
When "trust me" is all the verification a vendor offers, trust is for suckers.
Re: (Score:2)
closed source products
Again, it comes back to a lack of trust, in your customers and clients. Trust them with the source code, and verify they aren't misusing it.
Re: (Score:3)
Problems happen when information is just not verifiable, such as in closed source products, secret negotiations, undisclosed business interests, or whenever information is withheld or misrepresented.
You're mixing things up, either intentionally or because zealotry trumps reason in your thought processes. Not getting source code is not the same as being lied to, either by omission or by commission. You'll never have all the information about the making of a product available. You don't have the secret Coca Cola recipe, but that doesn't stop you from drinking coke. You don't know the composition of the various alloys your car is built of, but you do drive. You don't know the maintenance history of the pl
Re: (Score:2)
> You don't have the secret Coca Cola recipe, but that doesn't stop you from drinking coke
I have the ingredients, which opens up the product considerably. And actually does stop me from drinking it.
> Not getting source code is not the same as being lied to
I didn't say it was; why misrepresent my words? It is an example of unverifiability, not dishonesty.
> You don't know the composition of the various alloys your car is built of, but you do drive
I may not know that off the top of my head, but i
Re: (Score:3)
Re: (Score:2)
I don't think this is anything new. Corporations have been behaving like that for many decades now. What's changed is that you have fewer options and the corporations have much broader reach than they used to have. The places where you didn't have choices, the corporations were pretty transparent about ripping the customers off, and since there were no other options, there was little choice but to buy from them.
But, at least for folks living in cities, there was pretty much always a small business which one
Re: (Score:2)
I'm sorry, but I've never understood what "trust but verify" is supposed to mean. If you trust someone, then by definition, you think you don't need to verify it. The only time I verify anything is if I don't trust it!
Re: (Score:2, Funny)
I'll take your word that you've never understood what "trust but verify" means for the moment but I may look into your post history later to see if it's true.
Re: (Score:2)
I use it all the time.
I download software from people I have some trust with.
But I always run at least a cursory virus scan and always use custom install options if available.
I then try out the software, looking for problems and making sure it behaves as I was told it would.
If it starts communicating with the outside world where I think it should not, or installing services and drivers where they need not be, my "trust" gets revoked.
Just because I trust you with the keys to my house does not mean that will not chan
Re: (Score:2)
Re: (Score:2)
So every time you verify your vendor, you are suspending the trust you have in it. You are alternating between trusting and verifying. You are never doing both at the same time.
"Trust but verify" rule fools some people (Score:2)
I think it's actually a bad platitude, because "verify" is always implemented as a nested trust, and that trust often turns out to be serial, but the platitude glosses over that.
It goes like this: Is this person authorized to enter the building? Yes, probably, or else why would he be at the door? Well, let's verify: does he have a keycard? Yes, he has a keycard, and we trust the keycard. Why do we trust the keycard? Because only party X has the secret number hidden
Re: (Score:2)
"Require an amazing conspiracy" is closer to what trust means in terms of security than "trust but verify". But it is still too weak for a security context. And in some ways, it is the polar opposite of what "trust" means in context.
In security (of the mathematical, physical, or professional kind), a "trusted source" is a source that you are compelled to believe, because without their input, the security model would be impossible. Indeed, you want to have as few trusted sources as possible. For example,
Re: (Score:2)
about RNGs: "because it is impossible (in general) that it is not biased in some way"
Impossible to prove it's not biased.
Re: (Score:1)
Like Warren Buffett said... (Score:5, Insightful)
It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.
RSA was hacked, ultimately, because of short-term MBA thinking (I have one, so I know the type). If there's only a 10% chance of a serious security breach, then 90% of the time you can scrimp on security, and you won't merely get away with it, you'll be rewarded for "doing more with less". This same dynamic is often seen in both Wall Street and Washington.
I really wish we were required to read Nassim Nicholas Taleb's "Fooled by Randomness" and "Black Swan" in school, instead of Thomas Friedman's dreck. At least they couldn't say they weren't forewarned.
Re: (Score:2)
The bad news is that reputation nowadays is something you buy (from the mass media). That saying doesn't work anymore.
Re: (Score:2)
Yep, "legacy" PR based on controlling the message is fundamentally flawed and causes many problems too.
Re: (Score:3)
They make people read Friedman to get MBAs!? Well that goes a long way to explaining why the world's so fucked up.
Re: (Score:2)
It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.
The real problem is that the idiots who caused the hurt to RSA's reputation are not hurt themselves. They will be with Verisign next year, or somewhere else. If we don't watch the corporate-level merry-go-round, it will never stop.
Re: (Score:2)
Re: (Score:2)
Indeed, if only we could attach some sort of a mark which would let people know about the danger they face and how they should be audited. Obviously it should be a red A because of the threat they are to the business when auditors are on premises.
Re: (Score:2)
Yea, this comment led me to think about the issues:
http://news.ycombinator.com/item?id=2685975 [ycombinator.com]
From technician's point of view (Score:2)
I mostly agree with that, but not to the full extent. I lost faith in this company quite some time ago, once I saw their Authentication Management product (the software required to authenticate against tokens). It is clearly a crap-quality product made by MBAs for MBAs. It looks like it's been severely crippled by some cheap outsourced programmers (typical corporate attitude: "cutting costs"). This particular breach mainly confirmed my earlier opinion of RSA.
Losing reputation also takes quite a long tim
Re: (Score:2)
You mean Milton Friedman?
History (Score:5, Interesting)
Of the people who I've talked to with RSA tokens, most have said they're now actively planning a migration off of RSA tokens.
It isn't that they were hacked. Shit happens, even to the best of them. It was the lack of information and lack of transparency by RSA (EMC) on the whole event. Trust has been lost.
I'm not talking about public statements or mea culpas. I'm talking about why they weren't 100% open and upfront with existing customers right away. It gives the impression that EMC's execs were hoping no one would get hacked and it would all fade away over time. That they could just ride this out and weren't going to have to fork over a boatload of cash to replace everyone's tokens, thus not taking a hit on their stock or bonuses.
They were wrong, and now the price they are going to pay is not only replacing everyone's tokens, but a loss of trust and hence future business.
Re: (Score:2)
Of the people who I've talked to with RSA tokens, most have said they're now actively planning a migration off of RSA tokens.
It isn't that they were hacked. Shit happens, even to the best of them. It was the lack of information and lack of transparency by RSA (EMC) on the whole event. Trust has been lost.
I'm not talking about public statements or mea culpas. I'm talking about why they weren't 100% open and upfront with existing customers right away. It gives the impression that EMC's execs were hoping no one would get hacked and it would all fade away over time. That they could just ride this out and weren't going to have to fork over a boatload of cash to replace everyone's tokens, thus not taking a hit on their stock or bonuses.
They were wrong, and now the price they are going to pay is not only replacing everyone's tokens, but a loss of trust and hence future business.
I just got my quote for replacement tokens. They're giving me a 3 to 6 month estimate on when I'll actually have the new tokens. I can quote the whole chain from "Nothing was stolen" to "Nothing was stolen that could replicate a token" to "Yea, our bad."
Re: (Score:2)
Why are you even replacing them?
Would you not be better off moving to a competitors service?
Re: (Score:2)
Long term, anyone on RSA is probably better off moving to a competitors service. However, before you can do that, you need to figure out which is best, figure out how to implement it, get budget approval, buy it, test the change, and implement it. Just changing all the tokens is a relatively cheap and easy short-term fix - and you can still plan on moving later.
^^^ This. We're awaiting around 100k tokens, but the worldwide amount needing to be replaced far outweighs the supply.
Re: (Score:2)
Yea, likely a cover-up culture, another common problem.
Re: (Score:2)
Exactly my thoughts. They hoped the attackers would be competent enough not to get caught, and that the attacks would not be traced back to the broken SecurID. It seems the RSA hack was pretty simple, as the attackers were subsequently detected when they tried to use the data.
IMO, RSA has lost any and all credibility as a security solutions provider. Not only the completely unacceptable delay tactics, but also that this information could be hacked in the first place. Only terminally stupid or terminally greedy people leave
Re: (Score:1)
Of the people who I've talked to with RSA tokens, most have said they're now actively planning a migration off of RSA tokens.
It isn't that they were hacked. Shit happens, even to the best of them. It was the lack of information and lack of transparency by RSA (EMC) on the whole event. They were wrong, and now the price they are going to pay is not only replacing everyone's tokens, but a loss of trust and hence future business.
I don't think the worst thing is that they were hacked, I think the real incompetence is having the seeds stored on a public facing system, ready to be stolen if someone did get in.
A company of their stature should have known to air-gap this kind of information. I think this is equivalent to those web sites that store their customers' passwords in plain text.
Serious Definitional issues... (Score:5, Insightful)
Anyway, back to the matter at hand: This article seems like a particularly bad situation for the two sharply different definitions of "trusted" to come into collision without very, very careful elucidation.
On the one hand, you have the usual social usage of "trust": more or less "the belief that a person or device will do what it says/act in good faith/do what it says on the tin/etc."
On the other, you have the paranoid security wonk definition of "trusted": "the state of being a component of the security system whose overall integrity depends on your integrity as a component."
The two could hardly be more different while still occupying the same word. The former is socially valuable, and societies become dystopian hellholes without it; but it is a very poor ingredient upon which to build technologically secure systems. The second is an unfortunate necessity; but it is one of the marks of a good security system that it knows exactly what parts of the system are 'trusted' and what parts need not be. (A second, and important, mark of a good security system is that the set of 'trusted' systems has been culled as much as possible, and that no 'trusted' systems remain that you do not have good reason to 'trust' in the usual social sense.)
In the case of RSA, you really had a massive failure on both counts. In the social sense of "trust", RSA arguably oversold the security of their solution, was intensely cagey about the break-in until breaches at major defense contractors forced their hand, and generally fucked around as though they were trying to burn social trust. In the infosec sense, the fuckup was that, by retaining all token seed keys, RSA made themselves a 'trusted' component of every customer's security infrastructure. It is an architectural limitation of the RSA system that there must be a trusted system, with access to the seeds and an RTC, in order to validate authentication attempts. However, it is not a requirement that there be other online seed stores outside the customers' control. By making themselves an extraneous, excess trusted system, RSA weakened all their customers' security. Now that they are a 'trusted' component that no sensible people have social trust in, they are finding themselves written out of a fair few security architectures...
That is the real crux of the matter. From what I've heard (both publicly and informally from friends working in IT at largish RSA customers) the hack was some seriously sophisticated work, rather than somebody walking in through an unlocked door. However, it barely matters how tough their security is, because they never should have set themselves up as part of their customers' systems in the first place. Had the customers done the keyfill for the tokens themselves, it wouldn't have mattered whether RSA had been hacked or not.
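The point about seeds plus an RTC can be made concrete. SecurID's actual algorithm is proprietary, so this is only an illustration using the open TOTP construction (RFC 6238 style) with a made-up seed: anyone who holds a copy of the seed database can compute exactly the same codes as the token, which is why a stolen seed store is fatal.

```python
import hashlib
import hmac
import struct
import time

def totp(seed: bytes, t: int, step: int = 60, digits: int = 6) -> str:
    """Derive a time-based one-time code from a shared seed (RFC 6238 style).

    Whoever holds `seed` can compute the code for any time window.
    """
    counter = struct.pack(">Q", t // step)          # time window as 8-byte counter
    mac = hmac.new(seed, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative seed only; real seeds are per-token random secrets.
seed = b"example-seed-0123456789"
now = int(time.time())

# The legitimate token and an attacker with the stolen seed agree:
assert totp(seed, now) == totp(seed, now)
```

The validating server needs the same seed and a clock, which is the architectural point above: some system must be trusted with the seeds, but nothing forces that system to live at the vendor.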
Re: (Score:1)
The former is socially valuable, and societies become dystopian hellholes without it
why so?
Re: (Score:2)
My assumption is that there are two basic flavors of factors at work: One would be 'transaction costs' in the broad sense. Every dollar spent on extra lawyer hours to draw up ironclad contracts, loss-prevention guys watching for shoplif
Fresh Air! (Score:1)
A one page article. Ahhh relief.
Trust what? (Score:4, Interesting)
From my understanding, the RSA breach basically broke into the database that ties serial numbers to the internal "secret" that's used to generate OTPs. So go back to before the breach, and assume you're an RSA customer. To be their customer, you have to trust them. You can trust them to:
1) not keep a copy of your token seeds at all;
2) keep any copy they do hold secure; and
3) keep that copy available, so your tokens can be recovered or replaced.
Note that options 1 and 3 are mutually exclusive. Now, it would be nice to be able to choose your level of risk tolerance yourself and decide on #1 vs. #2 + #3, but there are a reasonable number of customers who actively dislike being forced to make choices. And there would be a whole lot of customers who would be really mad if, after RSA lost their database, they were told: "Sorry, all of your tokens are now useless keyrings. No choice but to replace them all."
To me it's like the evolution of passwords. In the beginning, if you forgot your password, your admin could tell you what it was. Then passwords got hashed, and your admin couldn't tell you what it was, but could reset it for you, and security was enhanced. Then passwords were used as encryption keys, and now your admin couldn't tell you what it was or reset it. If you forgot it, your data was gone. Once again, a security enhancement, but now a greater danger of data loss through forgetfulness.
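The hashed-password stage of that evolution can be sketched in a few lines (a minimal PBKDF2 example; the iteration count and salt size are illustrative choices, not a recommendation). The admin holding only the salt and digest can verify or reset a password, but can never recover it:

```python
import hashlib
import os

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Store only a salted, slow hash; the plaintext is unrecoverable."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the candidate password and compare."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest

salt, stored = hash_password("hunter2")
assert verify_password("hunter2", salt, stored)      # correct password checks out
assert not verify_password("wrong", salt, stored)    # wrong password is rejected
```

A "reset" is just generating a fresh salt and digest for a new password, which is exactly the property the grandparent wishes token secrets had.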
Re: (Score:2)
1 and 3 are contradictory, but close approximations can be made that are not.
The data could have been kept disconnected from any computer networks, and possibly even stored on tapes in some secure location, so #3 could easily be done. Then you just need to make sure no one breaks into the location where you store those tapes. That is what they like to call a solved problem, with cost going up as you add security.
Re: (Score:2)
Taking the data offline and securing it physically is just a prudent way to secure it. To me, that still falls under #2, trusting them to keep it secure, while #3 is keeping it available. RSA did, I assume, a reasonable job of keeping the data available, but failed to keep it secure.
But I would have to say you're exactly right on what security should be expected. There is some data that not only can, but really should be secured by taking it completely offline. Hopefully things like this will make pe
Re: (Score:2)
They could just let you change the secret. Then if you lost the DB you could make the tokens work again without recovering the data, just like using hashed passwords lets you reset lost passwords.
Re: (Score:2)
They won't even let you change the battery, changing the secret is right out. :)
Re: (Score:2)
And there lies the problem. How do you trust a device that you can't touch?
Re: (Score:2)
They could generate and store all keys on an offline server, under heavy lock-and-key. Then only employee misuse and social engineering are the attack vectors. It's a lot easier to protect against.
Remember Mission: Impossible? The 'secure server' room? Do something like that (but probably on a lesser level). I like to hack into stuff, but I'm sure as hell not going to crawl through vents unless it will be a big enough score to pay off governments.
More history (Score:1)
most trusted technologies? (Score:3)
"(1) even the most trusted technologies fail;"
Uh, dudes.
THE INTERNET IS NOT SECURE
If you hooked your database up to the Internet, then you are the fail.
I keep saying... (Score:2)
Two Words: Yubikey (Score:4, Informative)
Yubikey [yubico.com] has secure tokens that you can "seed" yourself, for use with your own authentication servers. The scam is that RSA made some people think there was no way to do this without RSA's auth servers, thereby fooling fools into using a less secure system with a mandatory recurring payment to RSA (to access the auth servers).
Additionally, I prefer the model that has RFID for physical access.
Relying on an outside source to hold your crypto keys just adds another point of failure. EVERYONE relying on them just creates THE BIGGEST point of failure possible... Every time I talked to security-minded folks who used RSA tokens, I asked them, "So, how secure are RSA's servers? You done any security audits on them lately?" The blank expressions were priceless.
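For the self-seeding idea, the open HOTP standard (RFC 4226) shows how little magic is involved. A sketch in Python: you generate the seed yourself, load it into the token and your own auth server, and no vendor ever holds a copy.

```python
import hashlib
import hmac
import os
import struct

def hotp(seed: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based one-time password per RFC 4226."""
    mac = hmac.new(seed, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Sanity check against the published RFC 4226 test vector:
assert hotp(b"12345678901234567890", 0) == "755224"

# Self-seeding: generate your own secret; only your token and your
# server ever see it.
seed = os.urandom(20)
assert hotp(seed, 0) == hotp(seed, 0)  # server and token stay in sync
```

The design point is simply that key generation and validation can live entirely inside the customer's perimeter; the vendor's auth service is a convenience, not a cryptographic necessity.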
Blake's 7 (Score:3)
Cally: My people have a saying: "A man who trusts can never be betrayed, only mistaken."
Avon: Life expectancy must be fairly short among your people.
Avon: Cally was murdered. So were most of her people.
Also, trusting marketing people is really stupid (Score:2)
They almost all lie. One of the jobs of a legal department in a large company is to ensure the marketing scum can promise you the moon and the stars and that when you find out what you actually got, you have no legal recourse.
The only way to deal with this is to a) have enough competence yourself to get suspicious early, and b) hire independent, competent outside experts that cannot easily be bought or intimidated to evaluate the product. The amount of lying going on in the security industry is staggering.
Uhhh.... (Score:2)
This is why we should re-instate hostage swaps (Score:2)
Sure, we'll buy your security solution. We'll just need a contract, an SLA, and your first born son and heir. No, you can't have mine - he's currently living with our biggest customer.
I think we'd see a bit more spending on the quality assurance department then, don't you?
Trust has to be earned (Score:2)
Trust is not something you gain by marketing or fancy words - it is defined by what you do consistently. Trust takes a long time to be built, but can be lost in an instant.
Breakthrough Authentication Technology CO for sale (Score:1)