Security

Inside The Twisted Mind of Bruce Schneier

I Don't Believe in Imaginary Property writes "Bruce Schneier has an essay on the mind of security professionals like himself, and why it's something that can't easily be taught. Many people simply don't see security threats or the potential ways in which things can be abused because they don't intend to abuse them. But security pros, even those who don't abuse what they find, have a different way of looking at things. They always try to figure out all the angles or how someone could beat the system. In one of his examples, Bruce talks about how, after buying one of Uncle Milton's Ant Farms, he was enamored with the idea that they would mail a tube of live ants to anyone you asked them to. Schneier's article was inspired by a University of Washington course in which the professor is attempting to teach the 'security mindset.' Students taking the course have been encouraged to post security reviews on a class blog."
This discussion has been archived. No new comments can be posted.

  • by wces423 ( 686579 ) on Friday March 21, 2008 @04:20AM (#22817166)
    This article just confirms my belief that a good security professional needs to have a destructive mindset. You need to feel the urge to abuse a system as soon as you've seen it. I was not good at it, so I quit security research to join development!
    • Why does it have to be destructive? It's not so much the urge to abuse the system; it's more the urge to see what it's capable of, even the things not intended by the creator.
    • Re: (Score:3, Funny)

      by SL Baur ( 19540 )

      Bruce talks about how, after buying one of Uncle Milton's Ant Farms, he was enamored with the idea that they would mail a tube of live ants to anyone you asked them to.
      I had the board game when I was very young. I also remember the spanking I got when I brought a container of ants into the house. Dad, they can't get out! Ouch!
    • by Registered Coward v2 ( 447531 ) on Friday March 21, 2008 @06:54AM (#22817636)
      This article just confirms my belief that a good security professional needs to have a destructive mindset. You need to feel the urge to abuse a system as soon as you've seen it. I was not good at it, so I quit security research to join development!

      I would not say a destructive mindset but rather an inquisitive one - that asks "What possibilities does this open up and how can I use this to other ends?"

      The challenge is to turn that mindset to productive, rather than destructive ends.

      Speaking as one who has done that work: a little paranoia is a good thing as well, because some people are out to get you (and even more are just plain stupid enough to do a dumb thing).
    • by cardpuncher ( 713057 ) on Friday March 21, 2008 @07:09AM (#22817680)

      I think it's got more to do with awareness and analysis than destructiveness.

      I remember, some years ago now, gently trying to persuade a colleague that it was inappropriate to have forwarded the infamous Craig Shergold [wikipedia.org] chain e-mail. Despite widespread publicity, the colleague absolutely refused to believe that there could be anything amiss, insisted I was being mean and cruel to deny the child (even by then cured and in his late teens) his "dying wish", and denounced my callousness to other co-workers.

      There's an advertisement for an animal welfare organisation on British TV at present, with pictures of pathetic-looking dogs who have been badly beaten ("it's the worst case I've ever seen", says the voice-over) or "used as an ashtray". Finally, at the end of the advertisement, comes the confession: "these are not real cases" - followed by a demand for money anyway, now that the viewers have been "softened up".

      Being a sucker for a sob-story isn't "constructive"; knowing that it can be exploited for social engineering isn't "destructive" - unless you regard human gullibility as a positive trait - though it sure can make you unpopular!

    • This article just confirms my belief that a good security professional needs to have a destructive mindset
      As in 'Set a thief to catch a poacher turned gamekeeper'.
    • I think "urge to abuse" is to strong of a phrase. You don't need to feel the need to do it wrong but you do realize ways around things, I see these things all the time that are security nightmares. But I don't feel any urge to try them myself. Because I realize yea it is a security problem but it also makes my life more convient. You need to get a fair balance between the two.
    • Re: (Score:3, Insightful)

      by analog_line ( 465182 )
      I would agree. I've got the "security mindset". I used to work in security on the consulting side, trying to fix up people's stuff. Thought about getting into research, but the culture of the security community at the time (right before 9/11) drove me away before I could. A kind of self-hating trifecta of ex-military intelligence grunts looking with disdain at anyone who didn't come out of the armed services, genius technical boffins with all the interpersonal skills of Rain Man, or wild-eyed "Informatio
    • by scruffy ( 29773 )
      This doesn't seem much different from the mindset needed for good programming or good test sets. You need to think of how different inputs or states can cause things to go wrong, and then program accordingly.
  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Friday March 21, 2008 @04:20AM (#22817170)
    Comment removed based on user account deletion
    • Re: (Score:2, Insightful)

      I think it's all about having written a fat book at the right time and place. I can't stand it, but all my friends have it - yet none of them have read it! It has become the book you have on your shelf to look cool. Another Bruce Schneier pronouncement about his superiority, ah yes... It would be nice if there were more articles here about current developments in cryptography. I've heard more than enough from Schneier. There are MANY other interesting security people out there to read, who aren't confused.
      • Well, I know what you mean about that. I've got Knuth on my bookshelf, and I can honestly say I don't look at it very often! Pure pose value for me ;) I assume you're talking about "Applied Cryptography" - but I do read Schneier's books - I've got most of them, and I like 'em. What other security people would you recommend?
    • Re:Disappointing (Score:5, Insightful)

      by call-me-kenneth ( 1249496 ) on Friday March 21, 2008 @05:33AM (#22817374)
      Tell you what, when you've written a book that gives a tenth of the useful advice, interesting information and insightful analysis of a single issue of CryptoGram [counterpane.com], come back and tell us about it. Until then, your words serve only to make you look bad.
      • If Crypto-gram weren't a mix of technical information and the self-aggrandizement the grandparent refers to, you'd have a point. The simple fact is, Bruce has been using his fame (for doing something unrelated[1]) to push his political agenda for some years now. It probably doesn't hurt his consulting business to get so many column inches and electrons, either.

        [1] Yes, cryptography is unrelated to security - one can be an expert on one without knowing anything about the other. (In the same
    • Re:Disappointing (Score:5, Insightful)

      by mattpalmer1086 ( 707360 ) on Friday March 21, 2008 @06:58AM (#22817650)
      I would say quite the opposite. I think it's well documented that Mr Schneier used to think that cryptography would solve all our security woes, and then he realised this was only a small part of the picture. You may have preferred him when he was all gung-ho on the deeply technical and fascinating aspects of crypto - I love that stuff too - but you are not his audience anymore.

      Things that you may think are obvious are just not to most people. He's trying to reach normal people, business leaders, politicians - people who don't get it, or still think security is just boring techy stuff that doesn't work very well. He's trying to show it's also a mindset, a way of seeing the world, that anyone can understand. I think he's doing pretty good, but again, we are not his primary audience.

      • Re: (Score:2, Insightful)

        You may have preferred him when he was all gung-ho on the deeply technical and fascinating aspects of crypto ... but you are not his audience anymore.

        I have nothing to add, other than preach on, brother.

        It is always unfair to criticize a work which was never intended for you in the first place. Schneier has long since lost his faith in strong crypto as security's holy grail. He now writes, often, that security problems will not be solved with technical tools because they aren't technical problems.
    • by dpilot ( 134227 )
      Even if it's as bad as you say, he's still more interesting to read/hear than Donald Trump, the *real* king of self-aggrandizement.
  • In security (Score:4, Interesting)

    by Z00L00K ( 682162 ) on Friday March 21, 2008 @04:38AM (#22817232) Homepage Journal
    It's not necessarily about having a destructive mindset, but about having a great deal of imagination and some paranoia.

    Such a personality may be disastrous in many other cases but works well when it comes to security work.

    And remember that most computer viruses in the beginning weren't really malicious - they were just there "because I can". Even those cases have to be taken into account by security people.

    • paranoia yes ..... (Score:3, Insightful)

      by taniwha ( 70410 )
      I do crypto for a living.... My bank really, really wants me to use their web banking service - but I have a dilemma: is it safe? If I try to break their security to test them, a couple of things might happen: if it's any good, they'll catch me and I might go to jail.... If it's crap, there's no point in me using their service - so I can't win, and I can't use their service.
      • The problem is even if you actively choose not to use their system, security problems in their account creation or account login can still render your account wide open to whoever out there might try to hack it. So you really might as well use the service, if only to be able to create a baseline and from thereon passively monitor it yourself to check for problems. Some banks are at least trying to get better in that regard, such as Bank of America, which has been offering two-factor authentication that uses a ce
      • by Lennie ( 16154 )
        I have the same problem. From what I've seen of my current bank, they aren't so good, so I'm switching. As a user of the system you can see so much more, so I'll reevaluate after that.
      • Re: (Score:3, Interesting)

        by TheRaven64 ( 641858 )
        Have you contacted the bank and asked if they would be interested in you performing a free evaluation of their network security? Send them your credentials as a security professional and say that you are willing to give them a documented appraisal for free, since you are a customer and the security of their system affects you. If they say no, then publish their refusal online somewhere, and approach another bank. If they say no, add them to the list. Start sending the list to consumer groups and mainstream media.
    • Re:In security (Score:5, Insightful)

      by v1 ( 525388 ) on Friday March 21, 2008 @07:16AM (#22817718) Homepage Journal
      I take the third view. I believe you need the ability to (forgive the overused phrase) "think different". 100% of what we do every day in life is based on a world of assumptions. To be a good security researcher requires distancing yourself from the assumptions, breaking out of the ruts in the road, and trying different things. The majority of security holes exist because the developers and defenders are making the same assumptions as everyone else. Buffer overflows are the classic example (see the sketch at the end of this comment), and we still see them constantly even though they've been recognized for years as a major security risk.

      I did in-house beta testing for a time, and used to really piss off the developers because I had a knack for knowing what they weren't planning for. I wasn't so much looking for security holes as for ways to crash the app (many of which were probably exploitable). A classic I heard was a developer receiving a bug report for "program crashes when it says Press Any Key and you press letter A". The developer called the tester back to his cubicle: why did you press "A"??? She said her name was Alice, and it said press ANY KEY, so she hit "A". "But you're not SUPPOSED to hit 'A', you're SUPPOSED to hit the space bar!" At which point the other developer stood up from his cubicle and said "oh? I thought it meant RETURN?" This perfectly illustrates how persistent assumptions are in coding. Not only are they all making assumptions, they aren't even making the same assumptions.

      That's the sort of testing I did: deleting the last element in a list, selecting all in an empty list, saving a form before completing it, entering a 200-character filename for save - taking advantage of the assumption that the user knew what they were doing and would never ask the program to do something certain to produce undesirable results.
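
      For anyone who hasn't seen one, this is roughly what that classic overflow assumption looks like in C - a minimal sketch with made-up function names, not code from any real product:

```c
/* The developer assumes the input fits the buffer; an attacker doesn't. */
#include <stdio.h>
#include <string.h>

void greet_unsafe(const char *name) {
    char buf[16];
    strcpy(buf, name);   /* no length check: anything longer than 15 chars
                            writes past the end of buf */
    printf("Hello, %s\n", buf);
}

void greet_safe(const char *name) {
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);  /* truncates instead of overflowing */
    printf("Hello, %s\n", buf);
}

int main(void) {
    greet_safe("Alice");
    /* greet_unsafe("some deliberately overlong attacker-controlled string"); */
    return 0;
}
```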

      • by TheRaven64 ( 641858 ) on Friday March 21, 2008 @08:51AM (#22818564) Journal
        I used to know a tester who would always hit control-alt-delete when told to press any key to continue. The company changed the messages to 'press almost any key to continue' after a while. Of course, that then confused real users who wondered which keys they weren't allowed to press...
        • by jvkjvk ( 102057 ) on Friday March 21, 2008 @10:52AM (#22820066)
          This is actually just really quite obnoxious and not very helpful.

          Do you really want a user program hooking in and trapping the ctrl-alt-delete sequence? I thought not.

          Being pedantic, since the tester appears to be so, "any key" does not imply "any combination of keys", either.

          I test by hitting the reset button; after all, it can be considered a key too, just not a 'key' on the keyboard...

          If I were the company, instead of changing the message, I would have modified the tester's behaviour, perhaps with a hammer if necessary...
  • by evanbd ( 210358 ) on Friday March 21, 2008 @04:52AM (#22817272)

    You can get a port-a-potty delivered without ever providing positive identification. You don't even have to pay for it until it shows up, and they'll happily deliver while you're at work. They're quite used to people preparing to have renovations done by contractors.

    Of course, I would never decide someone else needed a port-a-potty on their front lawn. But, much like the ants, it's something you can't help but notice if you have the right mindset.

    • by Nullav ( 1053766 )
      So you're the one flooding hapless neighborhoods and honest businesses with toilets!
    • You can get a port-a-potty delivered without ever providing positive identification. You don't even have to pay for it until it shows up, and they'll happily deliver while you're at work.

      Maybe where you live. Not here.
  • by badzilla ( 50355 ) <ultrak3wl@gmail. c o m> on Friday March 21, 2008 @04:59AM (#22817290)
    Anyone can do what Bruce implies only "special security people" can do. It's just that most people don't because there is no incentive to. You might as well announce that your special security mindset has noticed how easy it would be to go into restaurants and put poison in the salt shakers. Hell they are wide open! What were the salt shaker designers thinking of! But of course normal people are just not interested in doing that.
    • Exactly. I read the thing about the ants and I couldn't even imagine why you would bother sending live ants someplace. Hey, you know you can *gasp* take more than one newspaper from the machine when you only paid for one! Oh noes, security breach!

      And I read the AC reply to you, and all I have to say is this: I don't want to live in a society where everyone is treated like a criminal. Things are moving more and more in this direction in the US, and it's very sad. I think people should be treated like t
  • Good engineering (Score:4, Insightful)

    by TheLink ( 130905 ) on Friday March 21, 2008 @05:01AM (#22817298) Journal
    "This kind of thinking is not natural for most people. It's not natural for engineers. Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail"

    In my opinion, good engineering involves assuming that things _will_ eventually fail, working out how they can be made to fail _safely_ where possible, and figuring out what the acceptable risk is given the cost. Modern engineers don't normally design stuff to last for 1000 years (some of it might last that long - distribution curves and all that).
    • Re: (Score:3, Insightful)

      by Hatta ( 162192 )
      In my opinion, good engineering involves assuming that things _will_ eventually fail, working out how they can be made to fail _safely_ where possible, and figuring out what the acceptable risk is given the cost.

      Murphy [wikipedia.org] was an engineer after all.
  • Bruce Schneier Facts [geekz.co.uk]
    The last time someone tried to look into Bruce Schneier's twisted mind, the Big Bang happened
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Friday March 21, 2008 @05:50AM (#22817422)
    Comment removed based on user account deletion
    • by LaskoVortex ( 1153471 ) on Friday March 21, 2008 @06:02AM (#22817458)

      I'd have called in the buses, and shipped everyone off property to be safe right away.

      And then the snipers would shoot them as they were packed like sardines into the buses. Me, I would pull one of 50 cards with random "evacuation plans" out of a hat and do what it said on the card. I'd include an "ignore the bomb threat" card in there as well.

      • That's basically the answer it would have to come down to, as far as a secure response goes. The constant issue in the grandparent's scenarios is that the same thing will always happen. Call in a threat, watch them load the buses... bomb the buses the next time.

        Much like the pre-2001 response of "we'll sit and wait for the hijacking to end," bomb threats are dealt with as if the threat is honest. Once somebody has a case of a bomb under a bleacher to remember, we may act differently.

        Security tends to be reflexive.

    • by remahl ( 698283 ) on Friday March 21, 2008 @06:20AM (#22817528)
      No need to call in the buses. Just tell everyone that they may go home for the day. They will disperse randomly in every direction, quicker than any school administrator can administer their movements and in ways that no terrorist can predict.
      • They will disperse randomly in every direction ... in ways that no terrorist can predict.

        Generally, there are only two ways in and out of a school parking lot. Using the DC snipers as a template:

        1) Call in bomb threat to evacuate students
        2) Administrators let students go home immediately rather than putting them on buses
        3) As first car approaches exit, sniper shoots driver, disabling first vehicle and blocking exit
        4) Repeat with second sniper at second entrance
        5) Wait for students behind stu

        • Someone who should be involved with security issues. Note that this does not preclude terrorism.
      • Great. When I was in high school, my school was (...hits Google Earth...) five miles away as the crow flies - much further by road. (You couldn't go as the crow flies.) Almost all of it alongside highways or busy roads without sidewalks. Then and now only about 10% of the student body drives to and from school. How in the hell was I supposed to get home? (Not to mention that compared with the area my school covered I lived practically in the parking lot.) Not to mention the horrendous traffic jams cau
    • by Animats ( 122034 )

      Our school gets a bomb threat, and the teachers and administrators are freaked out. They move us all, I kid you not, to the football field where we are fenced in by chain link fence, about 1/3 of which is covered by barbed wire.

      That kind of dumb response happens at higher levels. A few years back, there were three incidents where the U.S. Capitol was evacuated because a light aircraft had entered the Washington area without authorization. I was amazed at that response. The official response to an ai

    • Re: (Score:3, Funny)

      by nine-times ( 778537 )

      Me? I'd have called in the buses, and shipped everyone off property to be safe right away.

      And then what happens when the buses drop below 55mph?

  • by MyNameIsFred ( 543994 ) on Friday March 21, 2008 @06:18AM (#22817522)
    While I agree with many points of the article - specifically that a security professional must have an unusual mindset - I am troubled that the examples leave out the cost-benefit analysis. As an example, the article correctly points out the vulnerability associated with picking up "your car" from a service department. All you need is a last name, no ID. This is an obvious vulnerability. On the other hand, the service department is motivated to make the process as streamlined as possible for its customers. Demanding IDs, etc., will slow down the process. The more cumbersome the process, the more likely customers are to use a competitor. Therefore, they need to trade the security of the cars against the cost of losing customers.

    I am reminded of the time that I test drove a new car. All the dealership wanted was a photocopy of my driver's license, and they let me drive the car off the lot for an extended test drive. Since driver's licenses are relatively easy to fake, I wondered how often cars are stolen. I asked, and was told they are stolen on occasion, but insurance covers it. My point: they did the cost-benefit analysis, and decided on an insecure method.
    • by remahl ( 698283 )
      The article recognizes that there is a cost-benefit tradeoff in the car dealership example. The point is that there will be no analysis unless someone sees that there may be a problem in the first place:

      The rest of the blog post speculates on how someone could steal a car by exploiting this security vulnerability, and whether it makes sense for the dealership to have this lax security. You can quibble with the analysis -- I'm curious about the liability that the dealership has, and whether their insurance would cover any losses -- but that's all domain expertise. The important point is to notice, and then question, the security in the first place.

    • by CBravo ( 35450 )
      It would be easy to photocopy your driver's licence and check whether the person collecting the car matches the photo on the licence. Right? Though that doesn't prove you haven't already picked up the car.

      All these 'problems' should be stated as 'engineering requirements'.
  • Turn his reasoning on his article: how can we subvert it? Was the message he gave a real one, or was he trying to make us believe something - and for whose benefit?

    Seriously: I agree with a lot of what he has to say. I am amazed at the number of programmers who do not follow Henry Spencer's 6th commandment for C programmers - check function return codes; they simply assume that every call will work correctly. (A minimal sketch follows below.)

    If something can go wrong - it will, and often at the most inconvenient time.
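
    To make the 6th commandment concrete, here's a minimal sketch of what checking every return code looks like in C (the file name is just an example):

```c
/* "Check function return codes" - every call here can fail, and each
   failure is handled rather than assumed away. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE *f = fopen("config.txt", "r");  /* can fail: missing file, permissions */
    if (f == NULL) {
        perror("fopen");
        return EXIT_FAILURE;
    }

    char *buf = malloc(4096);            /* can fail: out of memory */
    if (buf == NULL) {
        fclose(f);
        return EXIT_FAILURE;
    }

    size_t n = fread(buf, 1, 4095, f);   /* returns how much was actually read */
    buf[n] = '\0';

    if (fclose(f) != 0)                  /* even fclose can report an error */
        perror("fclose");

    printf("read %zu bytes\n", n);
    free(buf);
    return 0;
}
```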

  • I do this all the time... I actually am quite surprised at the number of everyday things that have such simple flaws.

    In the hospital waiting for my wife the other day, I watched a mailwoman with a big trolley full of mail, sorted into departments, insert several people's medical records into the trolley and then walk off out of sight through locked doors (which were opened by her tapping the glass and standing to one side), leaving the mail unattended. It wouldn't take much to a) gain access to the baby ward
    • The other, from working in schools, comes from the Tesco Computers For Schools voucher scheme

      The thing about this promotion is that giving away computers to schools is actually something Tesco could afford to do, for free, anytime they chose. The whole idea of making people collect worthless pieces of paper and go through the charade of giving them to schools, who then redeem them, is merely a marketing exercise to promote loyalty to the store and to make the donors feel good about themselves; it's certainly not

  • I have written a long long reply to his article at my blog [blogspot.com] (no ads, etc.)
    Short summary:
    In my opinion, security in real life is not about "what can go wrong". It is about "how often and how much can it go wrong and am I prepared to handle those cases". In short it is more about how to calculate risks accurately and knowing when to take them.
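
    That "how often and how much" view has a standard textbook form - annualized loss expectancy: expected loss = cost of one incident x incidents per year. A minimal sketch, with figures invented purely for illustration:

```c
/* Risk as "how often and how much": expected loss = cost of one
   incident x incidents per year. All figures below are invented. */
#include <stdio.h>

int main(void) {
    double single_loss    = 50000.0;  /* cost of one incident (hypothetical) */
    double annual_rate    = 0.1;      /* expected incidents per year (hypothetical) */
    double countermeasure = 3000.0;   /* yearly cost of a defense (hypothetical) */

    double expected_loss = single_loss * annual_rate;
    printf("expected yearly loss: $%.0f\n", expected_loss);
    printf("countermeasure worth it? %s\n",
           countermeasure < expected_loss ? "yes" : "no");
    return 0;
}
```
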
  • There's a fine line (Score:4, Interesting)

    by petes_PoV ( 912422 ) on Friday March 21, 2008 @07:02AM (#22817664)
    between being "security conscious" and being completely paranoid. When it boils down to it, there's risk involved in everything we do. Nothing is completely secure and there's always a chance that something will go wrong.

    Sadly, the world we live in today has massively overestimated the possibility of problems and hugely inflated the effects they will have (in the tiny percentage of occasions when they happen). I think this is a side-effect of improved communications: we all get to hear about the 1-in-a-million disaster stories, but never about all the other times, when everything goes right. This leads us to think that problems are more common than they actually are.

    The great thing about being a security professional is that you can never be proved wrong. If you claim a security hole and it is never exploited, no-one will say you're wrong - just that it hasn't been exploited yet. If we believed everything these guys say, no-one would ever do anything, as we'd all be too scared. Personally, I think we should avoid the obvious problems, get on with our lives, and accept that on a few, very few, occasions we might have to spend a little time sorting out a problem.

    • by dpilot ( 134227 )
      But this becomes a good lead-in to point out the findings of the 9/11 Commission.

      The "fault" was assigned as a "failure of imagination." Yet in the very center of the whole investigation was the NSA, the folks that are *supposed* to be, as you say, completely paranoid. These are the people who are supposed to see an array of dots and connect them all into a pattern - that's their job. They're supposed to think about the possibilities of a hijacked airplane loaded with fuel, and what you do to mitigate th
    • Security professionals can be proven wrong, all it takes is someone listening to them. Suppose a security professional stated that guns were dangerous, and outlawing guns would make people safer. Then someone outlaws guns, and voila, the reverse happens. The security professional has just been proven wrong.
      • Yes, you're right in that example. However, my experience (somewhat limited, I admit) is that security professionals and others tend to make statements such as "X is a security risk", rather than saying "if you do X, Y will happen".

        They seem to have learned the "weasel words": might, could, may, etc., and pepper their prognostications with them. As a consequence, you can't nail them down to a definitive, quantifiable statement.

        I'd like to ask the guy who wrote about being able to mail tubes of live ants (fro

    • Re: (Score:3, Informative)

      It seems that when many people consider risk, they don't consider the probability of something happening, only the possibility.

      Consider the National Safety Council's Odds of Dying [nsc.org] page. According to them, one has a 1 in 73,085 chance of dying in a motorcycle accident, while there's a 1 in 19,216 chance of dying in a motor vehicle accident as a car occupant.

      However, motorcycles are perceived (at least by people I know, obviously a small sample) as more risky because "people die riding those". Obviously that happens, but not
      • Or perhaps it's a riskier activity, in part, because fewer people associate the activity of driving with risk. People take far more care when packing glassware than they do driving despite the fact that the latter is far more dangerous.
      • by pongo000 ( 97357 )
        That may well be true, but once you are in a motorcycle accident, the chances of dying from injuries in that accident are much higher than in an auto accident (sorry, no stats on this, just lots of anecdotal evidence from folks who deal with this type of stuff).

        Aviation is the same way: while chances are slight that you'll be in an aircraft accident, I can say with some degree of certainty that uncontrolled descent into terrain is almost always fatal.

        So pick your odds...you want to fly or ride a bike
      • by Violet Null ( 452694 ) on Friday March 21, 2008 @11:18AM (#22820488)
        It doesn't matter how many people die of something. What matters is the percentage of people who do it that die.

        Saying "jumping off the top of a building with piano wire wrapped around your neck" is much, much safer than being a passenger in a care because, hey, your chances of dying that way are only 1 in 492,593,129. That number just tells you how often death happened while doing that; without the vital piece of information about how many times it was attempted without dying, you don't really know anything of interest.
  • by curmudgeon99 ( 1040054 ) on Friday March 21, 2008 @07:15AM (#22817708)
    Over the Christmas holidays, when work is always slow, I have a long habit of putting on my hacker hat and seeing what our vulnerabilities are. I think every developer owes it to their sanity to do this regularly. You will find so many opportunities for SQL Injection--no matter how careful your developers are--and Cross-Site Scripting and just a bunch of other holes. You do not want to be in a conference room some day explaining to your boss's boss why your program allowed a hacker to gain access to the company's systems through your app. This is a no-brainer.
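
    For anyone who wants to see the shape of the most common hole, here is a minimal sketch of SQL injection and the parameterized fix, using SQLite's C API (the users table and its columns are invented for illustration):

```c
/* SQL injection and the parameterized fix, sketched against SQLite. */
#include <stdio.h>
#include <sqlite3.h>

/* Vulnerable: user input is pasted into the SQL text itself.
   Input like  ' OR '1'='1  changes the meaning of the query. */
void find_user_unsafe(sqlite3 *db, const char *name) {
    char sql[256];
    snprintf(sql, sizeof sql,
             "SELECT id FROM users WHERE name = '%s'", name);
    sqlite3_exec(db, sql, NULL, NULL, NULL);
}

/* Safer: the input is bound as a value, never parsed as SQL. */
void find_user_safe(sqlite3 *db, const char *name) {
    sqlite3_stmt *stmt;
    if (sqlite3_prepare_v2(db,
            "SELECT id FROM users WHERE name = ?", -1, &stmt, NULL) != SQLITE_OK)
        return;
    sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("id: %d\n", sqlite3_column_int(stmt, 0));
    sqlite3_finalize(stmt);
}

int main(void) {
    sqlite3 *db;
    if (sqlite3_open(":memory:", &db) != SQLITE_OK)
        return 1;
    sqlite3_exec(db,
        "CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT);"
        "INSERT INTO users(name) VALUES('alice');",
        NULL, NULL, NULL);
    find_user_safe(db, "alice");
    find_user_safe(db, "' OR '1'='1");   /* harmless here: bound as a value */
    sqlite3_close(db);
    return 0;
}
```
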
  • by dpilot ( 134227 ) on Friday March 21, 2008 @07:25AM (#22817756) Homepage Journal
    One example used was getting the car from the repair shop, with just a last name.

    Where I get my car serviced, I know both guys who might be behind the desk, and they both know me, my wife, and son. They won't hand over the car keys on just a last name. Which brings it all back to a frequent point of Bruce's writings - all of the security razzle-dazzle in the world doesn't make a bit of difference compared to a knowledgeable person in the right spot.
    • Where I get my car serviced, I know both guys who might be behind the desk, and they both know me, my wife, and son.

      But not everyone drives a Ford!
  • Good engineers need to look for how things can fail, too. They need to look for small parts that children may swallow, weak latches that can allow lids to fall open, weak load-bearing structures... how the environment can make their products fail. They need to look for how things can be made to fail, as well, because the hostile human element is always part of the environment... the same factors that make someone a good engineer make them a good security expert.

    The problem isn't that good security professionals have a different mindset from good engineers; it's that both good security professionals and good engineers are rarer than people think, and that engineers are not as often held responsible for how their stuff fails when someone gains an advantage by deliberately making it fail.

    As in many other areas of life, I try to ask myself, WWFD? What Would Feynman Do?
  • Scripts (Score:3, Insightful)

    by Hatta ( 162192 ) on Friday March 21, 2008 @08:13AM (#22818098) Journal
    Speaking of security analysis, there are scripts from 9 different domains on that page, none of which are required to read the article. WTF. Thank god for noscript.
  • by dpbsmith ( 263124 ) on Friday March 21, 2008 @08:53AM (#22818576) Homepage
    Without disagreeing with anything at all in the article, I'd like to raise the point that an awful lot of things have no security, or very porous security.

    What saves society is three things:

    First, mischief and curiosity aren't powerful enough motivators to create a real problem. I don't know whether Schneier ever sent live ants to strangers... or how many Slashdot readers will try it... but most likely not very many.

    Second, for most security holes it is difficult to think of a way to make money from the exploits.

    Third, even if you can make money, it's even more difficult to find a way that will make significant amounts of money and to repeat the exploit often enough to make a living wage, without being caught.

    Case in point: newspaper vending boxes which allow you to pay for one newspaper and access a whole stack of them. If you have a "security mindset" (or even if you don't), it occurs to you that you could pay for one and take two... or ten... or the whole stack. And, indeed, you can. The problem is that it doesn't benefit you to get more than one newspaper. So, can you take two and sell the extra? Maybe. Net profit $0.50. Could you take the entire stack out of the machine and dress up as a street vendor and sell them on a street corner? Maybe. Net profit $25. Could you do it more than half-a-dozen times? Probably not.

    How about self-checkout lines in supermarkets? You can buy produce at them, and the produce isn't bar-coded. So, you can buy orange bell peppers at $3.99 a pound, put them on the scanner scale, and enter the code for green peppers at $1.69 a pound. Most supermarkets seem to rely on someone at a nearby counter keeping an eye on the self-checkout lanes while doing other things, and they don't usually come over unless a customer calls or the machine goes into an error state. Again, it's hard to see how you can make money, rather than saving a little on your grocery bill... and if you managed to do this to the extent where you were stealing hundreds of dollars, I think your chances of being detected get to be high. (I'm thinking of people who got caught recently pasting barcodes for two-dollar items over things like boom-boxes and DVD players...).

  • by BenEnglishAtHome ( 449670 ) on Friday March 21, 2008 @12:57PM (#22821850)
    This had me flashing back to elementary school arithmetic. It happened to me a hundred times. The textbook showed an equation and made a statement about it. The textbook showed another equation and made a statement about it. Then the textbook showed a third equation and asked "What can we say about this equation?"

    My answers always started the same way. "It's printed in ink on paper." I don't really think that the textbook author expected people to do anything other than to extend whatever line of reasoning had been presented in the previous examples (and I always got around to that) but the open-ended question "What can we say about this equation?" always struck me as license to comment on the clarity of the typesetting or anything else.

    My teachers thought I was weird.

    Later in life, I became involved in competitive pistol shooting. I loved the rule books. They were just collections of hidden loopholes begging to be found. And then came the problems. In some sports it was called the "engagement" rule. In others, it was the "spirit of the rules" rule. They were all the same sort of thing - a way to say you couldn't do anything unexpected. If you looked at a practical defensive scenario and found some completely whacky way to beat it by, say, running between cover in an odd sequence, you'd be found guilty by the officials of "failure to engage" the scenario. No points for you. A guy I knew had trouble seeing sights too close to his face, but the rules forbade changing the sight radius (the distance between the sights), making it impossible for him to move the rear sight further from his face. He responded by cantilevering both sights forward so that the sight radius stayed unchanged but both sights were now completely forward of the muzzle. It was perfectly legal under the rules as written, but his pistol was declared illegal because it violated the "spirit of the rules."

    What amazes me is the hostility this mindset engenders. I'm not shy about saying that I love to parse out the rules and find advantages. I'm not shy about saying that a "spirit of the rules" rule is really just saying "You're not allowed to be smarter than the people writing the rules and running the match." The reaction I get is flaming on message boards and accusations of poor sportsmanship. There are actually people out there who want to punish innovation; at least, that's the way I look at it.

    "Thinking different" makes people feel threatened and act nervous and hostile. I don't understand that. Am I weird, or are they?

"You can have my Unix system when you pry it from my cold, dead fingers." -- Cal Keegan

Working...