Security IT

Trust Is For Suckers: Lessons From the RSA Breach

wiredmikey writes "Andrew Jaquith has written a great analysis of lessons learned from the recent RSA Cyber Attack, from a customer's perspective. According to Jaquith, in the security industry, 'trust' is a somewhat slippery concept, defined in terms ranging from the cryptographic to the contractual. Bob Blakley, a Gartner analyst and former chief scientist of Tivoli, once infamously wrote that 'Trust is for Suckers.' What he meant is that trust is an emotional thing, a fragile bond whose value transcends prime number multiplication, tokens, drug tests or signatures — and that it is foolish to rely too much on it. Jaquith observed three things about the RSA incident: (1) even the most trusted technologies fail; (2) the incident illustrates what 'risk management' is all about; and (3) customers should always come first."
This discussion has been archived. No new comments can be posted.

  • Trust is required (Score:5, Insightful)

    by houstonbofh ( 602064 ) on Friday June 24, 2011 @03:08PM (#36558814)
The problem is that trust is also required to have a functioning society. The higher the trust, the better a society can function; the lower the overall trust (the more corruption), the less effective it is. I think "trust but verify" is the best approach.
  • by MetricT ( 128876 ) on Friday June 24, 2011 @03:09PM (#36558830)

    It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.

    RSA was hacked, ultimately, because of short-term MBA thinking (I have one, so I know the type). If there's only a 10% chance of a serious security breach, then 90% of the time you can scrimp on security, and you won't merely get away with it, you'll be rewarded for "doing more with less". This same dynamic is often seen in both Wall Street and Washington.

    I really wish we were required to read Nassim Nicholas Taleb's "Fooled by Randomness" and "Black Swan" in school, instead of Thomas Friedman's dreck. At least they couldn't say they weren't forewarned.
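The parent's short-term incentive argument can be made concrete with a toy expected-value calculation. All the numbers here are invented purely for illustration; nothing in the article supplies real figures:

```python
# Toy numbers, purely illustrative: why scrimping on security can look
# rational quarter-to-quarter even when it is ruinous in expectation.
breach_prob = 0.10          # assumed chance of a serious breach this year
security_savings = 5e6      # assumed budget "saved" by scrimping
breach_cost = 200e6         # assumed remediation and reputation cost

expected_loss = breach_prob * breach_cost

print(f"Expected annual loss from scrimping: ${expected_loss:,.0f}")
print(f"Visible annual 'savings':            ${security_savings:,.0f}")
# In 90% of years the scrimper books the savings and looks efficient;
# in expectation the firm loses far more than it saves.
```

This is exactly the asymmetry Taleb writes about: the visible savings accrue every quarter, while the expected loss only shows up when the rare event does.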

  • by h4rr4r ( 612664 ) on Friday June 24, 2011 @03:18PM (#36558920)

"Trust but verify" means don't trust; otherwise you would not have to verify. Non-thinking people like the phrase because their idol said it. Why middle-class folks idolize someone who sold out the middle class, I do not understand.

  • by flaming error ( 1041742 ) on Friday June 24, 2011 @03:27PM (#36559044) Journal

    > trust is also required to have a functioning society.
    Maybe, to a degree.

    >"Trust but verify" is the best.
    Indeed it is. Trust works when claims can be supported.

    Problems happen when information is just not verifiable, such as in closed source products, secret negotiations, undisclosed business interests, or whenever information is withheld or misrepresented.

    When "trust me" is all the verification a vendor offers, trust is for suckers.

  • by fuzzyfuzzyfungus ( 1223518 ) on Friday June 24, 2011 @03:27PM (#36559048) Journal
    Speaking of trust issues, quoting a Gartner analyst?

    Anyway, back to the matter at hand: This article seems like a particularly bad situation for the two sharply different definitions of "trusted" to come into collision without very, very careful elucidation.

On the one hand, you have the usual social usage of "trust": more or less "the belief that a person or device will act in good faith/do what it says on the tin/etc."

    On the other, you have the paranoid security wonk definition of "trusted": "the state of being a component of the security system whose overall integrity depends on your integrity as a component."

The two could hardly be more different while still occupying the same word. The former is socially valuable, and societies become dystopian hellholes without it; but it is a very poor ingredient upon which to build technologically secure systems. The latter is an unfortunate necessity; but one mark of a good security system is that it knows exactly which parts of the system are 'trusted' and which need not be. (A second, and important, mark of a good security system is that the set of 'trusted' systems has been culled as much as possible, and that no 'trusted' systems remain that you do not have good reason to 'trust' in the usual social sense.)

In the case of RSA, you really had a massive failure on both counts. In the social sense of "trust", RSA arguably oversold the security of their solution, was intensely cagey about the break-in until breaches at major defense contractors forced their hand, and generally fucked around as though they were trying to burn social trust. In the infosec sense, the fuckup was that, by retaining all token seed keys, RSA made themselves a 'trusted' component of every customer's security infrastructure. It is an architectural limitation of the RSA system that there must be a trusted system, with access to the seeds and an RTC, in order to validate authentication attempts. However, it is not a requirement that there be other online seed stores outside the customers' control. By maintaining an extraneous, excess, trusted system, RSA weakened all their customers' security. Now that they are a 'trusted' component that no sensible people have social trust in, they are finding themselves written out of a fair few security architectures...

That is the real crux of the matter. From what I've heard (both publicly and informally from friends working in IT at largish RSA customers), the hack was some seriously sophisticated work, rather than somebody walking in through an unlocked door. However, it barely matters how tough their security was, because they never should have set themselves up as part of their customers' systems in the first place. Had the customers done the keyfill for the tokens themselves, it wouldn't have mattered whether RSA had been hacked or not.
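The architectural point above, that seeds plus a real-time clock yield the code, and that whoever holds a copy of the seed can validate or forge codes, can be sketched with a generic HMAC time-code scheme. This is not RSA's proprietary SecurID algorithm; it is a TOTP-style stand-in (in the spirit of RFC 6238), and the function name and seed value are illustrative only:

```python
# A minimal sketch of why seed possession breaks time-based tokens.
# NOT RSA's proprietary SecurID algorithm: a generic HMAC-based
# time-code scheme used only to illustrate the architecture.
import hashlib
import hmac
import struct


def time_code(seed: bytes, t: float, step: int = 60, digits: int = 6) -> str:
    """Derive the current code from the shared seed and a clock reading."""
    counter = int(t // step)                       # coarse time window
    mac = hmac.new(seed, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# The validating server must hold the same seed to check a submitted code,
# so ANY party holding seed copies (token, server, or vendor) is "trusted"
# in the infosec sense: its compromise is equivalent to cloning the token.
seed = b"per-token-secret-seed"  # illustrative value
assert time_code(seed, 1000.0) == time_code(seed, 1000.0)
```

The design consequence the comment draws follows directly: if the customer generates and loads the seeds (keyfill), only the token and the customer's validation server are 'trusted'; a vendor-side seed archive adds a third trusted party that contributes nothing to authentication and everything to the attack surface.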
  • by houstonbofh ( 602064 ) on Friday June 24, 2011 @03:40PM (#36559196)
    A whole post of information and all you see is the quote at the end. You might want to read "The Last Centurion" by John Ringo for some good information on high vs low trust societies. Or not, since he might like people you hate.
  • by h4rr4r ( 612664 ) on Friday June 24, 2011 @03:42PM (#36559226)

    I don't hate anyone, I dislike people who work against folks I do like though. I don't like it when people idolize those who work against them either. "Trust, but verify" just makes no sense. "Never trust, always verify" at least makes good sense.
