
Trust Is For Suckers: Lessons From the RSA Breach

Posted by Soulskill
from the and-we're-back dept.
wiredmikey writes "Andrew Jaquith has written a great analysis of lessons learned from the recent RSA Cyber Attack, from a customer's perspective. According to Jaquith, in the security industry, 'trust' is a somewhat slippery concept, defined in terms ranging from the cryptographic to the contractual. Bob Blakley, a Gartner analyst and former chief scientist of Tivoli, once infamously wrote that 'Trust is for Suckers.' What he meant is that trust is an emotional thing, a fragile bond whose value transcends prime number multiplication, tokens, drug tests or signatures — and that it is foolish to rely too much on it. Jaquith observed three things about the RSA incident: (1) even the most trusted technologies fail; (2) the incident illustrates what 'risk management' is all about; and (3) customers should always come first."
  • Trust is required (Score:5, Insightful)

    by houstonbofh (602064) on Friday June 24, 2011 @03:08PM (#36558814)
    The problem is that trust is also required to have a functioning society. The higher the trust, the better a society can function; the lower the overall trust (more corruption), the less effective it is. I think "Trust but verify" is the best.
    • Re: (Score:2, Insightful)

      by h4rr4r (612664)

      "Trust but verify" means don't trust; otherwise you would not have to verify. Non-thinking people like the phrase because their idol said it. Why middle-class folks idolize someone who sold out the middle class, I do not understand.

      • Yeah, those shitty 80's. We're all much better off now!
        • by h4rr4r (612664)

          What do you think got us here?

          • Greed and idiocy, which is expected and part of the system and should be self correcting over the long term if you let it happen. What's going to really fuck us is spending, both from Republicrats and Demublicans.
            • by h4rr4r (612664)

              Spending and crazy low taxes, one or the other, folks. At this point we need to cut spending, and when our economy is in decent shape, raise taxes to get the debt down.

              Our problem was we spent and spent while cutting taxes and did not save up for the rainy days ahead.

      • It is ironic that the man who made "trust but verify" famous is much more often trusted than verified...
      • by houstonbofh (602064) on Friday June 24, 2011 @03:40PM (#36559196)
        A whole post of information and all you see is the quote at the end. You might want to read "The Last Centurion" by John Ringo for some good information on high vs low trust societies. Or not, since he might like people you hate.
        • Re: (Score:3, Insightful)

          by h4rr4r (612664)

          I don't hate anyone, I dislike people who work against folks I do like though. I don't like it when people idolize those who work against them either. "Trust, but verify" just makes no sense. "Never trust, always verify" at least makes good sense.

          • Re: (Score:2, Informative)

            by Anonymous Coward

            I don't hate anyone

            Bullshit.

      • Re:Trust is required (Score:5, Informative)

        by hedwards (940851) on Friday June 24, 2011 @03:48PM (#36559296)

        No, what it means is that you don't blindly trust anybody, but you do verify periodically that the trust hasn't been abused. It's like granting a business the right to take money out of your checking account to cover expenses, like say a CC company. You trust them not to put things on the bill which you didn't authorize. And you verify at least once a month that everything on the bill was authorized by you.

        Same thing here, the problem with RSA was that people trusted them, but there was no particular manner of verifying that the trust was well placed.

      • by Anonymous Coward

        Trust but verify means that some (most?) of the time you trust, but occasionally you verify to ensure that trustworthiness is still warranted.

        Don't trust means you have to verify every time. Trust without verifying means that sooner or later you're going to get taken.

        Really, it's not rocket science. (grin)

      • > Why middle class folks idolize someone who sold out
        > the middle class I do not understand.

        It's easy. We all have our own private reality.

        Here's mine. Whatever any POTUS may have done is inconsequential compared to what our Gov't did in 1913. They set in motion everything that's happened since when they handed over our economy to a semi-accountable, quasi-governmental, 100% privately owned for-profit banking cartel.

        • by HiThere (15173)

          That's one point. But I don't know why you consider it more important than all the others.

          Personally I consider two points crucial:
          1) The Civil War, when BOTH sides centralized control of the government over the populace.
          2) The Union Pacific addendum, which got corporations to be considered legal persons.

          Basically, though, when the frontier closed, increased governmental control over the citizenry started ramping up immediately. The increase was slow at first. But you could also pick the Constitution gi

          • > I don't know why you consider it more important than all the others.

            Because we were talking about the economy.

            But I agree completely with the milestones you picked for the general erosion of liberty. I hope one day we'll be able to point to events that put us back on track.

    • by flaming error (1041742) on Friday June 24, 2011 @03:27PM (#36559044) Journal

      > trust is also required to have a functioning society.
      Maybe, to a degree.

      >"Trust but verify" is the best.
      Indeed it is. Trust works when claims can be supported.

      Problems happen when information is just not verifiable, such as in closed source products, secret negotiations, undisclosed business interests, or whenever information is withheld or misrepresented.

      When "trust me" is all the verification a vendor offers, trust is for suckers.

      • closed source products

        Again coming back to a lack of trust, of your customers and clients. Trust them with the source code, and verify they aren't misusing it.

      • by ChatHuant (801522)

        Problems happen when information is just not verifiable, such as in closed source products, secret negotiations, undisclosed business interests, or whenever information is withheld or misrepresented.

        You're mixing things up, either intentionally or because zealotry trumps reason in your thought processes. Not getting source code is not the same as being lied to, either by omission or by commission. You'll never have all the information about the making of a product available. You don't have the secret Coca Cola recipe, but that doesn't stop you from drinking coke. You don't know the composition of the various alloys your car is built of, but you do drive. You don't know the maintenance history of the pl

        • > You don't have the secret Coca Cola recipe, but that doesn't stop you from drinking coke

          I have the ingredients, which opens up the product considerably. And actually does stop me from drinking it.

          > Not getting source code is not the same as being lied to
          I didn't say it was, so why misrepresent my words? It is an example of unverifiability, not dishonesty.

          > You don't know the composition of the various alloys your car is built of, but you do drive
          I may not know that off the top of my head, but i

    • by idontgno (624372)
      The "verify" part implies a degree of transparency and insight that's rare nowadays. The fact that governments have to write laws compelling breached firms to notify their affected constituents in a reasonably timely and understandable manner is proof that when their reputation and sales are on the line, you'll only learn the absolute minimum, and only under the weight of unavoidable severe consequences. And that is a miserable basis for "verify", which makes "trust" a fool's game.
      • by hedwards (940851)

        I don't think this is anything new. Corporations have been behaving like that for many decades now. What's changed is that you have fewer options and the corporations have much broader reach than they used to have. The places where you didn't have choices, the corporations were pretty transparent about ripping the customers off, and since there were no other options, there was little choice but to buy from them.

        But, at least for folks living in cities, there was pretty much always a small business which one

    • by LordNimon (85072)

      I'm sorry, but I've never understood what "trust but verify" is supposed to mean. If you trust someone, then by definition, you think you don't need to verify it. The only time I verify anything is if I don't trust it!

      • Re: (Score:2, Funny)

        by Duradin (1261418)

        I'll take your word that you've never understood what "trust but verify" means for the moment but I may look into your post history later to see if it's true.

      • by Dishevel (1105119)

        I use it all the time.
        I download software from people I have some trust with.
        But I always run at least a cursory virus scan and always use custom install options if available.
        I then try out the software looking for problems, making sure it behaves as I was told it would.
        If it starts communicating with the outside world where I think it should not, or installing services and drivers where they need not be, my "trust" gets revoked.
        Just because I trust you with the keys to my house does not mean that will not chan

      • I have a vendor of choice. Most of the time I just order stuff and assume that he is giving me a good price. Occasionally I price check him. If I ever find he has abused my trust, I get a new vendor. The alternative is to trust all the time (stupid) or trust none of the time (a lot of work). The problem is the "verify" part. How do you do it with some companies?
        • by LordNimon (85072)

          So every time you verify your vendor, you are suspending the trust you have in it. You are alternating between trusting and verifying. You are never doing both at the same time.

    • I think "Trust but verify" is the best.

      I think it's actually a bad platitude, because "verify" is always implemented as a nested trust, and that trust often turns out to be serial but the platitude glosses over that.

      It goes like this: Is this person authorized to enter the building? Yes, probably, or else why would he be at the door? Well, let's verify: does he have a keycard? Yes, he has a keycard, and we trust the keycard. Why do we trust the keycard? Because only party X has the secret number hidden

      • "Require an amazing conspiracy" is closer to what trust means in terms of security than "trust but verify". But it is still too weak for a security context. And in some ways, it is the polar opposite of what "trust" means in context.

        In security (of the mathematical, physical, or professional kind), a "trusted source" is a source that you are compelled to believe, because without their input, the security model would be impossible. Indeed, you want to have as few trusted sources as possible. For example,

        • about RNGs: "because it is impossible (in general) that it is not biased in some way"

          Impossible to prove it's not biased.

  • by MetricT (128876) on Friday June 24, 2011 @03:09PM (#36558830) Homepage

    It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.

    RSA was hacked, ultimately, because of short-term MBA thinking (I have one, so I know the type). If there's only a 10% chance of a serious security breach, then 90% of the time you can scrimp on security, and you won't merely get away with it, you'll be rewarded for "doing more with less". This same dynamic is often seen in both Wall Street and Washington.

    I really wish we were required to read Nassim Nicholas Taleb's "Fooled by Randomness" and "Black Swan" in school, instead of Thomas Friedman's dreck. At least they couldn't say they weren't forewarned.

    • The bad news is that reputation nowadays is something you buy (from the mass media). That saying doesn't work anymore.

      • by yuhong (1378501)

        Yep, "legacy" PR based on controlling the message is fundamentally flawed and causes many problems too.

    • They make people read Friedman to get MBAs!? Well that goes a long way to explaining why the world's so fucked up.

    • It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.

      The real problem is that the idiots that caused the hurt to RSA's reputation are not hurt themselves. They will be with Verisign next year, or somewhere else. If we don't watch the corporate level Merry-Go-Round, it will never stop.

    • I mostly agree with that, but not to the full extent. I lost faith in this company quite some time ago, once I saw their Authentication Management product (the software required to authenticate against tokens). It is clearly a crap-quality product made by MBAs for MBAs. It looks like it's been severely crippled by some cheap outsourced programmers (typical corporate attitude: "cutting costs"). This particular breach mainly confirmed my earlier opinion about RSA.

      Losing reputation also takes quite a long tim

    • by yuhong (1378501)

      You mean Milton Friedman?

  • History (Score:5, Interesting)

    by chill (34294) on Friday June 24, 2011 @03:12PM (#36558852) Journal

    Of the people who I've talked to with RSA tokens, most have said they're now actively planning a migration off of RSA tokens.

    It isn't that they were hacked. Shit happens, even to the best of them. It was the lack of information and lack of transparency by RSA (EMC) on the whole event. Trust has been lost.

    I'm not talking about public statements or mea culpas. I'm talking about why they weren't 100% open and upfront with existing customers right away. It gives the impression that EMC's execs were hoping no one would get hacked and it would all fade away over time. That they could just ride this out and weren't going to have to fork over a boatload of cash to replace everyone's tokens, thus not taking a hit on their stock or bonuses.

    They were wrong, and now the price they are going to pay is not only replacing everyone's tokens, but a loss of trust and hence future business.

    • Of the people who I've talked to with RSA tokens, most have said they're now actively planning a migration off of RSA tokens.

      It isn't that they were hacked. Shit happens, even to the best of them. It was the lack of information and lack of transparency by RSA (EMC) on the whole event. Trust has been lost.

      I'm not talking about public statements or mea culpas. I'm talking about why they weren't 100% open and upfront with existing customers right away. It gives the impression that EMC's execs were hoping no one would get hacked and it would all fade away over time. That they could just ride this out and weren't going to have to fork over a boatload of cash to replace everyone's tokens, thus not taking a hit on their stock or bonuses.

      They were wrong, and now the price they are going to pay is not only replacing everyone's tokens, but a loss of trust and hence future business.

      I just got my quote for replacement tokens. They're giving me a 3 to 6 month estimate on when I'll actually have the new tokens. I can quote the whole chain from "Nothing was stolen" to "Nothing was stolen that could replicate a token" to "Yea, our bad."

      • by h4rr4r (612664)

        Why are you even replacing them?
        Would you not be better off moving to a competitors service?

    • by yuhong (1378501)

      Yea, likely a cover-up culture, another common problem.

    • by gweihir (88907)

      Exactly my thoughts. They hoped the attackers would be competent enough not to get caught, and that the attacks would not be traced back to broken SecurID. Seems the RSA hack was pretty simple, as the attackers subsequently got detected when they tried to use the data.

      IMO, RSA has lost any and all credibility as a security solutions provider. Not only the completely unacceptable delay tactics, but also that this information could be hacked in the first place. Only terminally stupid or terminally greedy people leave

    • by godel_56 (1287256)

      Of the people who I've talked to with RSA tokens, most have said they're now actively planning a migration off of RSA tokens.

      It isn't that they were hacked. Shit happens, even to the best of them. It was the lack of information and lack of transparency by RSA (EMC) on the whole event. They were wrong, and now the price they are going to pay is not only replacing everyone's tokens, but a loss of trust and hence future business.

      I don't think the worst thing is that they were hacked, I think the real incompetence is having the seeds stored on a public facing system, ready to be stolen if someone did get in.

      A company of their stature should have known to air-gap this kind of information. I think this is equivalent to those web sites that have their customers' passwords stored in plain text.
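The plaintext-password analogy above can be made concrete. Here is a minimal sketch in generic Python (this is illustrative, not anything RSA or any vendor actually ships): a site that stores only a random salt and a slow hash can still verify logins, but a stolen database no longer hands the attacker every credential — which is exactly the property a vendor-held seed database lacks.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    # Store only a random salt and a slow PBKDF2 hash, never the password.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Re-derive the hash and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

An attacker who steals (salt, digest) pairs still has to brute-force each password; an attacker who steals a plaintext table, or an OTP seed table, is simply done.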

  • by fuzzyfuzzyfungus (1223518) on Friday June 24, 2011 @03:27PM (#36559048) Journal
    Speaking of trust issues, quoting a Gartner analyst?

    Anyway, back to the matter at hand: This article seems like a particularly bad situation for the two sharply different definitions of "trusted" to come into collision without very, very careful elucidation.

    On the one hand, you have the usual social usage of "trust": more or less "the belief that a person or device will do what it says/act in good faith/do what it says on the tin/etc."

    On the other, you have the paranoid security wonk definition of "trusted": "the state of being a component of the security system whose overall integrity depends on your integrity as a component."

    The two could really hardly be more different while still occupying the same word. The former is socially valuable, and societies become dystopian hellholes without it; but it is a very poor ingredient upon which to build technologically secure systems. The second is an unfortunate necessity; but it is one of the marks of a good security system that it knows exactly what parts of the system are 'trusted' and what parts need not be.(a second, and important, mark of a good security system is that the set of 'trusted' systems has been culled as much as possible, and that no 'trusted' systems remain that you do not have good reason to 'trust' in the usual social sense.)

    In the case of RSA, you really had a massive failure on both counts: In the social sense of "trust", RSA arguably oversold the security of their solution, was intensely cagey about the break-in until breaches at major defense contractors forced their hands, and generally fucked around as though they were trying to burn social trust. In the infosec sense, the fuckup was that, by retaining all token seed keys, RSA made themselves a 'trusted' component of every customer's security infrastructure. It is an architectural limitation of the RSA system that there must be a trusted system, with access to the seeds and an RTC, in order to perform authentication attempt validations. However, it is not a requirement that there be other online seed stores out of the customers' control. By making themselves an extraneous, excess, trusted system, RSA weakened all their customers' security. Now that they are a 'trusted' component that no sensible people have social trust in, they are finding themselves written out of a fair few security architectures...

    That is the real crux of the matter. From what I've heard (both publicly and informally from friends working in IT at largish RSA customers) the hack was some seriously sophisticated work, rather than somebody walking in through an unlocked door. However, it barely matters how tough their security is, because they never should have set themselves up as part of their customers' systems in the first place. Had the customers done the keyfill for the tokens, it wouldn't have mattered whether they had been hacked or not.
    • The former is socially valuable, and societies become dystopian hellholes without it

      why so?

        I don't know exactly; it's still an ongoing area of research. But the research suggests that societies with high levels of mutual and institutional trust score very well in prosperity and perceived wellbeing, while low levels act as a drag on both prosperity and happiness.

        My assumption is that there are two basic flavors of factors at work: One would be 'transaction costs' in the broad sense. Every dollar spent on extra lawyer hours to draw up ironclad contracts, loss-prevention guys watching for shoplif
  • by Anonymous Coward

    A one page article. Ahhh relief.

  • Trust what? (Score:4, Interesting)

    by lazlo (15906) on Friday June 24, 2011 @03:34PM (#36559128) Homepage

    From my understanding, the RSA breach basically broke into the database that ties serial numbers to the internal "secret" that's used to generate OTPs. So go back to before the breach, and assume you're an RSA customer. To be their customer, you have to trust them. You can trust them to:

    1) securely wipe their copy of the database once they've delivered your tokens to you
    2) keep their database secure against attackers
    3) provide you with a copy of the database after you lose yours.

    Note that options 1 and 3 are mutually exclusive. Now, it would be nice to be able to choose your level of risk tolerance yourself and decide on #1 vs #2 + #3, but there are a reasonable number of customers who actively dislike being forced to make choices. And there would be a whole lot of customers who would be really mad if, after losing their database, they were told by RSA: "Sorry, all of your tokens are now useless keyrings. No choice but to replace them all."

    To me it's like the evolution of passwords. In the beginning, if you forgot your password, your admin could tell you what it was. Then passwords got hashed, and your admin couldn't tell you what it was, but could reset it for you, and security was enhanced. Then passwords were used as encryption keys, and now your admin couldn't tell you what it was or reset it. If you forgot it, your data was gone. Once again, a security enhancement, but now a greater danger of data loss through forgetfulness.

    • by h4rr4r (612664)

      1 and 3 are contradictory, but close approximations can be made that are not.

      The data could have been kept not connected to any computer networks and possibly even stored on tapes in some secure location so #3 could easily be done. Then you just need to make sure no one breaks into the location you store those tapes in. That is what they like to call a solved problem, with cost going up as you add security.

      • by lazlo (15906)

        Taking the data offline and securing it physically is just a prudent way to secure it. To me, that still falls under #2 (keeping it secure) while satisfying #3 (keeping it available). RSA did, I assume, a reasonable job at keeping the data available, but failed to keep it secure.

        But I would have to say you're exactly right on what security should be expected. There is some data that not only can, but really should be secured by taking it completely offline. Hopefully things like this will make pe

    • by profplump (309017)

      They could just let you change the secret. Then if you lost the DB you could make the tokens work again without recovering the data, just like using hashed passwords lets you reset lost passwords.

    • They could generate and store all keys on an offline server, under heavy lock-and-key. Then only employee misuse and social engineering are the attack vectors. It's a lot easier to protect against.

      Remember Mission: Impossible? The 'secure server' room? Do something like that (but probably on a lesser level). I like to hack into stuff, but I'm sure as hell not going to crawl through vents unless it will be a big enough score to pay off governments.

  • It's a decade and a half since I studied a security masters, but I seem to recall Someone Who Knew saying approximately this: in the vast sweep of history, it hasn't tended to be the technology that's failed (unless it's laughably weak in the first place), but the humans handling the technology. If we assume the worst case about the RSA hack, that a big file full of token serial numbers, shared secrets and end-customer details went missing, then this is a human failing. That is, some dumbass probably left
  • by blair1q (305137) on Friday June 24, 2011 @03:56PM (#36559420) Journal

    "(1) even the most trusted technologies fail;"

    Uh, dudes.

    THE INTERNET IS NOT SECURE

    If you hooked your database up to the Internet, then you are the fail.

  • I keep saying that "I don't get paid to trust people", here at work ~ most of my job is to find bugs and squash them, whether in the code or in the model files. Some days it's the model, some days it's the software, some days it's the user. Then I talked to my neighbor and learned about his soon-to-be-ex wife problems. That simply reinforced the point that I don't get paid to trust people. Then RSA, Sony, and everyone else got hacked. That really reinforced the point. So hey, don't trust people. Trus
  • Two Words: Yubikey (Score:4, Informative)

    by VortexCortex (1117377) <VortexCortex.project-retrograde@com> on Friday June 24, 2011 @04:44PM (#36560038)

    Yubikey [yubico.com] has secure tokens that you can "seed" yourself, for use with your own authentication servers. The scam is that RSA made some idiots think there was no way to do this without their auth servers, thereby fooling fools into using a less secure system with a mandatory recurring payment to RSA (to access the auth servers).

    Re-configuration of YubiKeys by customers

    For high security environments, customers may select not to share the
    AES key information for their YubiKeys outside of their organization.
    Customers may also for other reasons want to be in control of all AES
    keys programmed into the Yubikey devices. Yubico therefore supports the
    use of a personalization tool to reconfigure the YubiKeys with new AES
    keys and meta data.

    Additionally, I prefer the model that has RFID for physical access.

    Relying on an outside source to hold our cryptokeys is just adding another point of failure. EVERYONE relying on them is just creating THE BIGGEST point of failure possible... Every time I talked to security-minded folks that used RSA tokens, I asked them, "So. How secure are RSA's servers? You do any security audits on them lately?" The blank expressions were priceless.
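For the curious, a self-hosted verification server along the lines the parent describes is not much code. This is a hypothetical sketch (the class and method names are mine, not Yubico's API) of the keep-your-own-keys model: the customer generates every token secret locally, so there is no vendor-side copy to steal.

```python
import hashlib
import hmac
import os

class InHouseAuthServer:
    """Hypothetical challenge-response server: every token secret is
    generated and stored in-house; no outside party ever sees a key."""

    def __init__(self) -> None:
        self.secrets: dict[str, bytes] = {}

    def enroll(self, token_id: str) -> bytes:
        # Generate the secret locally; this is where you'd program it
        # into the physical token.
        key = os.urandom(20)
        self.secrets[token_id] = key
        return key

    def verify(self, token_id: str, challenge: bytes, response: bytes) -> bool:
        # The token answers a random challenge with HMAC-SHA1 over it,
        # in the spirit of YubiKey's documented challenge-response mode.
        key = self.secrets.get(token_id)
        if key is None:
            return False
        expected = hmac.new(key, challenge, hashlib.sha1).digest()
        return hmac.compare_digest(expected, response)
```

A breach at the token vendor then reveals nothing about your deployment, because the vendor never held your keys in the first place.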

  • by HTH NE1 (675604) on Friday June 24, 2011 @06:37PM (#36561626)

    Cally: My people have a saying: "A man who trusts can never be betrayed, only mistaken."
    Avon: Life expectancy must be fairly short among your people.

    Avon: Cally was murdered. So were most of her people.

  • They almost all lie. One of the jobs of a legal department in a large company is to ensure the marketing scum can promise you the moon and the stars and that when you find out what you actually got, you have no legal recourse.

    The only way to deal with this is to a) have enough competence yourself to get suspicious early and b) hire independent, competent outside experts that cannot easily be bought or intimidated to evaluate the product. The amount of lying going on in the security industry is staggering.

  • If the guy's third point wasn't blindingly obvious to him, it makes me question his qualifications as a whole.
  • Sure, we'll buy your security solution. We'll just need a contract, an SLA, and your first born son and heir. No, you can't have mine - he's currently living with our biggest customer.

    I think we'd see a bit more spending on the quality assurance department then, don't you?

  • Trust is not something you gain by marketing or fancy words - it is defined by what you do consistently. Trust takes a long time to be built, but can be lost in an instant.

  • JUNE 28th, 2011 SOLVANG, California—iMagic Software, Inc, developer of Trustable Passwords, has retained investment banking firm, Nations Media Partners, to coordinate a potential sale of the company. iMagic Software has developed a patented software technology and algorithm that authenticates a user uniquely by the way they type a password. iMagic holds the only fundamental patent for typing recognition authentication. This methodology offers a high accuracy, equal to hardware biometrics, of authe
