Security IT

Humans Not Evolved for IT Security

Stony Stevenson writes to tell us that at the recent RSA Conference security expert Bruce Schneier told delegates that human beings are not evolved for security in the modern world, especially when it comes to IT. "He told delegates at the 2007 RSA Conference that there is a gap between the reality of security and the emotional feel of security due to the way our brains have evolved. This leads to people making bad choices. 'As a species we got really good at estimating risk in an East African village 100,000 years ago. But in 2007 London? Modern times are harder.'"
  • So the modern equivalent is "What I can't see won't eat me" ... seems to be the same mistake. More likely, if 99.99% of your senses tell you that you are safe, then worrying about meteors or lightning strikes is a waste of energy. Plus you gotta think "selfish gene". If I *feel* "secur-i-ness", I can proceed with making babies... while you're so worried about lions, you fail to impress the ladies.
    • Re: (Score:3, Interesting)

      by Opportunist ( 166417 )
      So that's why my common sense tells me I don't need to hide under my bed from the bad, bad terrorists, it's just that I can't see them anywhere and not that it's overblown hype.

      I'm kinda scared now.
    • Re: (Score:3, Funny)

      by CompMD ( 522020 )
      > You are alone in a dark room and cannot see. You are likely to be eaten by a grue.

      Actually, sounds like what you can't see WILL in fact eat you.
  • really (Score:5, Funny)

    by snarkh ( 118018 ) on Wednesday October 24, 2007 @02:54PM (#21103455)
    As a species we got really good at estimating risk in an East African village 100,000 years ago.

    I wonder how many days that guy would last in an East African village 100,000 years ago.
  • by User 956 ( 568564 ) on Wednesday October 24, 2007 @02:55PM (#21103459) Homepage
    He told delegates at the 2007 RSA Conference that there is a gap between the reality of security and the emotional feel of security due to the way our brains have evolved.

    Which is why, a lot of times, you end up with security theatre [elliott.org], instead of real security.
    • by Kjella ( 173770 ) on Wednesday October 24, 2007 @03:43PM (#21104093) Homepage
      And don't forget CYA security - security rules that aren't being followed and aren't being enforced either - but that exist solely so that when shit hits the fan, the bosses can say it was against policy. These are usually extremely draconian, impossible to implement, or practically impossible to follow while getting work done. But hey, it looks good on paper...
  • Ms Abacha? (Score:5, Funny)

    by Mr_Icon ( 124425 ) on Wednesday October 24, 2007 @02:56PM (#21103469) Homepage
    Looking at the number of people falling for Nigerian scammers, I'd say that our ability to "estimate risk in an East African village" is not so hot either. :)
  • by Daimanta ( 1140543 ) on Wednesday October 24, 2007 @02:56PM (#21103489) Journal
    Thank God I was intelligently designed for this kind of thing ;)
  • Bad Analogies Abound (Score:5, Interesting)

    by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Wednesday October 24, 2007 @02:58PM (#21103515) Journal

    "The brain is still in beta mode, it's got all sorts of patches and workarounds. It's not perfectly created, it's clearly evolved up."
    Wow, just ... wow. I'm not even a biologist but I know that's a terrible analogy. You can't compare the brain to software. We can control software and decide when it 'goes live'; there are no prototypes in nature or evolution. Every attempt is an iteration of the process and the process is never ending. Furthermore, the existence of an absolute like 'perfectly created' is debatable on any level with regard to any process or system.

    Exaggerate uncommon risks -- for example, air travel is safer than cars but because car accidents are common they are seen as less risky
    Maybe because everyone involved in an airplane crash usually dies, while automobile crashes are far less often fatal. There's this idea of risk = probability * impact. In the case of automobiles, probability is high but the impact is low. It's the other way around in aircraft failures.
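
    To put that risk = probability * impact idea into a couple of lines (a quick sketch; the numbers below are invented purely to show the arithmetic, not real crash statistics):

    # Toy illustration of risk = probability * impact.
    # The probabilities and impacts are made up for the example.
    def risk(probability, impact):
        return probability * impact

    common_low_impact = risk(probability=1e-4, impact=0.01)  # frequent, minor consequences
    rare_high_impact = risk(probability=1e-6, impact=1.0)    # rare, catastrophic consequences

    # Both come out to 1e-06: identical expected loss, yet the two hazards feel nothing alike.
    print(common_low_impact, rare_high_impact)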

    Personified risk -- Osama Bin Laden is scarier than a faceless threat
    How in the hell does this relate to IT security? I think IT administrators are more afraid of people they don't know hacking their systems than of the people they actually employ doing the same. In the end, I'm sure more attacks come internally or from an ex-worker than from someone unknown. Maybe the face you know should be scarier than the face you don't at the office?

    Risks that could be controlled -- The DC sniper caused a few deaths but the response was way out of proportion.
    Please elaborate; I know of the Lee Boyd Malvo incident but I have no idea how this relates to IT security. Are you telling me that shutting down a system to protect a database from a possible threat or virus is overkill? I would say that varies on a case-by-case basis, but at my job, taking a database offline is worth it to maintain the integrity of the data inside it.

    I know I'm really coming off as a jerk when I say this, but I don't think this article helped me in any way. All I saw was someone oversimplifying a complex problem--thereby making themselves seem smarter to the people they were explaining it to.

    Don't read this article, it has nothing to offer you. If you don't know this subject, I believe this article will only add to your confusion and lack of understanding.
    • by SatanicPuppy ( 611928 ) * <Satanicpuppy.gmail@com> on Wednesday October 24, 2007 @03:12PM (#21103715) Journal
      This is actually a hot psychological topic right now: humanity's tendency to poorly conceptualize risk. We're far more worried about diseases we're unlikely to catch than ones we are. Plane crashes are scary because planes aren't familiar to most people; poor understanding of the risks magnifies fear. People always worry about the stereotypical malicious strangers, when most assaults come from people you already know.

      I think mostly he's just pointing all this out as background to the tendency to poorly appreciate risk. He's basically saying, "People apply more worry to splashy things that aren't likely to happen, and therefore we have these huge data breaches because who cares about SSNs when the terrorists could be blowing up a nuke plant?"

      The only place where I think he's totally off base is calling the brain "a patchwork". It's not, in fact. It's extremely finely tuned to do what we need it to do...It makes us ferociously competitive animals, and that is proven, rather than disproven, by all the security problems that we've been having. If we weren't competitive, we wouldn't have problems. The fact that not everyone works at the same level is irrelevant.
      • by eison ( 56778 )
        Plane crashes are scary because we feel out of control. We overestimate our own competence, so if it feels like we have some control over a situation, we assume we can handle it.
        • by SatanicPuppy ( 611928 ) * <Satanicpuppy.gmail@com> on Wednesday October 24, 2007 @03:54PM (#21104249) Journal
          That's part of it, but you're still more likely to die in a bus or taxi accident, and those aren't viewed with the same unreasoning fear even though you also lack control in them.

          We are all soothed by familiar routine. That's the purpose of disaster drills: if your building does catch fire, your mind moves into that pre-built track and acts effectively, without being paralyzed by the need to act colliding with the fact that you have no idea what to do. Planes are not only outside our control, they're outside most people's experience, so an event no more significant than a bus running through a pothole elicits a greater level of fear because it is an unknown, rather than a familiar, occurrence.

          • That's part of it, but you're still more likely to die in a bus or taxi accident,
            I don't know about this. Buses are safer by virtue of their size if nothing else. Unless they careen over an embankment or bridge, or get hit by a train, they are pretty safe. I have read about a lot of bus accidents over the years. Few have had fatalities.

            Taxis I can agree with you on. They are nothing but cars anyway.

      • by Lurker2288 ( 995635 ) on Wednesday October 24, 2007 @03:50PM (#21104189)
        In the sense that the brain started off in a much simpler state, with no need to handle many of the things it's currently capable of (binocular vision, manual dexterity, doing calculus), and got to where it is one incremental improvement at a time, then yes, it most certainly is a patchwork. You can see it in the gross structure: you've got the reptilian hindbrain that keeps your body functioning in a narrow homeostatic envelope all the way at the bottom, atop which sits a limbic system that allows for things like emotion (great for pair bonding and knowing to run away from big things with pointy teeth), and atop all of that you've got the cerebrum that enables most of your higher intellectual activity.

        The fact that this magnificent hodgepodge seems to be so perfectly attuned to our needs is almost definitional, as well as being a kind of survivor bias. That is, our brains are great at what we need them to do precisely because they evolved to do those things; brains that were evolved to do other things, or that did the same things, but not as well as ours, died off. Schneier's point is that the modern world has changed a lot faster than our brains are able to, and as a result, we're maladapted for some of the tasks facing us today, like assessing remote risks.
        • The problem with defining the brain as a patchwork is that there is no conception of what a brain would look like if it were designed. It's like calling an airplane a patchwork, simply because it's made of different parts that are all attached to each other, and all do different things.

          There are certainly a lot of ways in which our bodies are capable of adapting that would benefit us in the modern age. As for the perception of risk, I don't see it. Risk perception will never "evolve" to extend to the realm
          • Re: (Score:3, Interesting)

            by Vellmont ( 569020 )

            The evolution argument is disproven by Schneier himself; how could he be thinking about it if we hadn't already evolved to make it possible?

            Schneier isn't humanity, he's just Schneier. One guy can have the skills and ability to do something while the vast majority of others do not. Anyway, I think he's really trying to say that risk assessment in the modern world doesn't come naturally to people, the way assessing the risk of being eaten by a tiger did 100,000 years ago.

            I don't know if the evolutionary
          • Your airplane analogy doesn't really work because unlike the human brain, the airplane was designed all at once, and assembled in accordance with that design. In contrast, the brain has evolved in a stepwise fashion that incorporates new elements as they become necessary and discards the old as they become superfluous. If you look at the brains of reptiles, and lower mammals, and primates, and finally humans, you can plainly see the development of more complex structures piled on top of the old ones. It's t
      • It's pretty obvious that people estimate risk badly, and I agree with you.

        But don't try and actually tell anyone this. You will be labeled a bad parent (because you don't worry about stranger kidnappings as much as car accidents), un-American (because you don't worry about turr'ism as much as dying from heart disease), or a host of other things. Do not try to explain to anyone why. People will gravely tell you "I don't need proof, I know in my heart that the world is a more dangerous place today" despite

      • by Scrameustache ( 459504 ) on Wednesday October 24, 2007 @04:17PM (#21104571) Homepage Journal

        Plane crashes are scary because planes aren't familiar to most people;
        Actually, plane crashes are scary because once you're on the plane, there is nothing you can do about them.
        Car crashes are less scary because of familiarity, as you said, but also because you can grab the wheel, yell "look out!", or otherwise act upon your own destiny. And because of vertigo phobia. In a car, you're already on the ground: you aren't going to accelerate towards it inexorably, as planes will if they stall/run out of gas/break/hit another plane/etc.

        Familiarity and statistics are just part of it.
    • Exaggerate uncommon risks -- for example, air travel is safer than cars but because car accidents are common they are seen as less risky

      Maybe because everyone involved in an airplane crash usually dies, while automobile crashes are far less often fatal. There's this idea of risk = probability * impact. In the case of automobiles, probability is high but the impact is low. It's the other way around in aircraft failures.


      Not to mention the whole "I'm such a good driver I can get out of any jam" mentality. Whether true or
    • "The brain is still in beta mode, it's got all sorts of patches and workarounds. It's not perfectly created, it's clearly evolved up."

      Wow, just ... wow. I'm not even a biologist but I know that's a terrible analogy. You can't compare the brain to software. We can control software and decide when it 'goes live,' there are no prototypes in nature or evolution. Every attempt is an iteration of the process and the process is never ending.

      Not even a biologist? Are you not even a programmer either? Every attempt at a stable build is an iteration of the process, and the process is never ending!

      Sexual reproduction decides when the organism goes live, and marketing decides when the product goes live.

    • Why can't we compare software to cognitive processes? It's a common analogy, and I'm surprised you haven't run into it before. Also, "every attempt is an iteration of the process and the process is never ending," which you claim as an example of how evolution is not like software, is a perfect match for how security software (actually, a lot of software) is written these days.

      Also, while there are many non-fatal car crashes, more people do DIE in car crashes than in plane crashes, but "fear of dying in a


    • Wow, just ... wow. I'm not even a biologist but I know that's a terrible analogy. You can't compare the brain to software.

      You can't compare anything to anything else if you take it too far. The analogy was only to illustrate that the human brain isn't fully adapted to the modern world yet, just like beta software isn't quite ready yet. You're really trying to draw too much out of the analogy.

      Maybe because everyone involved in an airplane crash usually dies.

      I'd be willing to bet you have a much higher chanc
  • It's the money (Score:3, Interesting)

    by ZonkerWilliam ( 953437 ) * on Wednesday October 24, 2007 @02:59PM (#21103521) Journal
    As an INFOSEC person, I see this kind of mentality on a daily basis. Still, there is a growing realization of the costs of outages due to attacks, and I do see that. Slowly but surely it's changing. Compared to evolutionary change, though, it's a blink of an eye.
  • Stupid. (Score:5, Insightful)

    by SatanicPuppy ( 611928 ) * <Satanicpuppy.gmail@com> on Wednesday October 24, 2007 @03:01PM (#21103539) Journal
    We're not evolved for space flight either. You can't apply "evolution" as a blanket to tool use at the level we've taken it; we have evolved a capacity for abstract thought which allows us to create highly complex tools...Saying that we're not evolved to assess risk on a level as abstract as this is disingenuous...When was the last time a virus jumped out of your computer and ate you? There is no evolutionary pressure involved with such intellectual pursuits.

    It's perhaps more accurate to say that only a few people are capable of truly understanding this stuff at all, and for the rest it's just black magic. Of course they don't appreciate the risk. I guess B.S. was trying to find a rational reason why people just categorically don't understand security when applied to technology, but I think it's more just that they're doing well to be able to use the tech at all. We're going to need a much higher skill level among users before we can expect them to truly appreciate security.
    • I haven't read the article yet, but I have a feeling your comments would echo my own. I'd add, too, that it's not that your average user can't grasp the concepts; they just haven't been "conditioned" to. We fall back on what we know, and Windows, as the OS with the most penetration, has worked for over a decade without requiring gramps and Auntie Em to jump through hoops.

      Trying to change the mindset of millions of users is not something that will happen overnight.

    • We're not evolved for space flight either.
      Yet millions of people go to space every day? Or perhaps a space flight to the ISS requires months of preparation precisely because we truly aren't evolved for space flight.
      • And that means what, exactly? We're evolved for tool use, and our tool use has grown so sophisticated that we're creating tools that surpass the conception of most members of our species. Does that mean they need to evolve up to be able to create those things themselves? Not at all. It's a societal division of labor. Someone has to clean the telephones.

        A biological population will have many individuals who have differing levels of skill at different tasks. A species as diverse as ours has a great many roles,
    • When was the last time a virus jumped out of your computer and ate you? There is no evolutionary pressure involved with such intellectual pursuits.

      Wow, it sounds like you're in violent agreement with Schneier; he said evolution didn't prepare us for computer security, you agree, then you call him stupid for saying it.

      Anyway, mortal combat these days is primarily an intellectual pursuit, because technology dominates. Nowadays we usually wage war by economic sanctions, which can kill just as many pe [thenation.com]

  • In many ways, we need to go back to square one. We need to teach ethics to the younger generation. Hackers and phishers will always remain one step ahead of the security community in developing new methods to bypass security measures. The problem is, we shouldn't have to erect so many virtual walls. The real question we should be asking ourselves is: why is this behavior acceptable -- even lauded at times?
    • I would argue that there is no "evolution" that we can make as a species that will cause this problem to go away...It's a problem of software, not hardware.

      Teaching people ethics isn't going to help though...If we could just teach everyone to be nice, we'd have done it a long time ago. Millennia of evolution have taught us about competition for scarce resources, and that expresses itself in all kinds of anti-social behaviours, and it always has. Sure, the instinct to protect the herd is in there as well, but
    • by pla ( 258480 )
      We need to teach ethics to the younger generation.

      Which will accomplish what exactly?

      You can't make everyone into a paragon of virtue, no matter how hard you try. And it only takes a few to prey on the rest (reducing the number of scammers would just increase the profitability per scammer).



      why is this behavior acceptable -- even lauded at times?

      Because the same behavior in other contexts has largely beneficial effects (even though it offends the establishment - Though that in a way makes it mor
  • so what? (Score:5, Insightful)

    by AxemRed ( 755470 ) on Wednesday October 24, 2007 @03:03PM (#21103581)
    We aren't specifically evolved to do algebra either, and we (well, many of us) do a decent job at that. Humans are evolved to learn and adapt.
    • by apt142 ( 574425 )
      Well some of us can adapt. Some of us are just dumb.

      Go down to your local street corner and see how many people can solve the simplest of equations. I'm guessing you wouldn't get a high percentage of people who could. And we've been teaching algebra in schools for a long time. It's a requirement in my state to pass algebra to graduate from high school.
      • Re: (Score:3, Funny)

        by apparently ( 756613 )
        Go down to your local street corner and see how many people can solve the simplest of equations


        Well, for any equations where the solution is "go fuck yourself!", "I got somethin' you can solve, sugah!", or "no seriously, go fuck yourself" the subjects in my test study pass with flying colors.

    • Re:so what? (Score:5, Insightful)

      by kebes ( 861706 ) on Wednesday October 24, 2007 @03:28PM (#21103901) Journal

      We aren't specifically evolved to do algebra either, and we (well, many of us) do a decent job at that. Humans are evolved to learn and adapt.
      Absolutely. But Schneier's point is not that it is impossible for humans to think rationally about IT security, but that it does not 'come naturally' to the average person. The same is true of algebra and other branches of mathematics: humans in general have very advanced knowledge in these areas, but it is still quite easy to construct a mathematical problem that will trip up a layperson, because most people are not formally trained in mathematics, and will incorrectly invoke "common sense" when solving a problem.

      The fact is that humans have an in-built "threat and probability analysis" system that was optimized to deal with "real world" situations like searching for food, avoiding predators, finding mates, etc. It is for this reason that gambling "works." People are easily tricked into believing that they can "beat the system" or "find a pattern." They believe that having rolled many sixes recently, they are "due for a 1 or a 2" even though the probability of rolling a particular number on a die is independent of previous rolls. This is because most of our in-built probability estimators assume chains of events are causally linked (which is a reasonable assumption in the "real world"--i.e. if it's been a long time since it has rained, it is indeed "due to rain soon").
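
      A quick simulation makes the "due for a 1 or a 2" fallacy concrete (a minimal sketch; the million-roll sample size and the fixed seed are arbitrary choices):

      import random

      # Roll a fair die a million times, then compare the overall rate of sixes
      # with the rate of sixes that come immediately after a run of three sixes.
      # If rolls were ever "due", the two rates would differ; they don't.
      random.seed(0)
      rolls = [random.randint(1, 6) for _ in range(1000000)]

      after_three_sixes = [rolls[i] for i in range(3, len(rolls))
                           if rolls[i - 3] == rolls[i - 2] == rolls[i - 1] == 6]

      print("P(6) overall:          ", rolls.count(6) / len(rolls))                          # ~1/6
      print("P(6) after three sixes:", after_three_sixes.count(6) / len(after_three_sixes))  # also ~1/6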

      In the realm of security, Schneier identifies certain assumptions that our minds make, which are actually fallacies when it comes to modern security (e.g. that a commonly occurring risk is less important than a rare risk).

      We are not "built" to deal with modern security. As with advanced math, rather than rely on common sense (and its associated useless rhetoric) to set security policy, we need to have detailed arguments citing well-documented studies. We can indeed rise above our "programming," but far too many people don't bother trying--and continue to rely on common sense even when it is a demonstrably poor predictor.
      • But Schneier's point is not that it is impossible for humans to think rationally about IT security, but that it does not 'come naturally' to the average person.

        OTOH - Schneier has a vested interest in supporting that belief. Without generating fear, he can't get consulting gigs. Without generating controversy, his value as a pundit and speaker goes down.
        • by gclef ( 96311 )
          ...And some of us are apparently not terribly well-evolved to see the problem with ad-hominem attacks, either.
          • And some of us are apparently not terribly well-evolved to tell the difference between an ad-hominem attack and facts.
      • Re: (Score:3, Funny)

        by Chris Burke ( 6130 )
        They believe that having rolled many sixes recently, they are "due for a 1 or a 2" even though the probability of rolling a particular number on a die is independent of previous rolls.

        My goodness, this is simply untruth! While it may be so in the white halls of academia, where such things as "fair dice" and "independent events" are bandied about as though they actually exist in their perfect mathematical forms, it isn't so in the harsh reality of the craps table! Allow me to explain. You see, when you ro
  • Smith (Score:5, Funny)

    by pete-classic ( 75983 ) <hutnick@gmail.com> on Wednesday October 24, 2007 @03:05PM (#21103615) Homepage Journal
    "Only human."
    --Agent Smith on IT security
  • Phhhh ... (Score:3, Informative)

    by foobsr ( 693224 ) on Wednesday October 24, 2007 @03:06PM (#21103629) Homepage Journal
    ... if it really must be Schneier, read "Why the Human Brain Is a Poor Judge of Risk" (Wired [wired.com]), but better to turn immediately to Kahneman.

    CC.
  • I disagree with the use of the term 'evolution' to discuss the inadequacy of emotional responses to threats. People can be successfully trained to overcome these issues. As a security professional, I know my spidey-sense has altered considerably over the years due to training and experience, and I would think that others in fields where risk assessment is all in a day's work have largely had the same experience, and, to a certain extent, this is extensible to the population at large. (For example, I find
  • by hobo sapiens ( 893427 ) on Wednesday October 24, 2007 @03:10PM (#21103677) Journal
    People want the easy way. Security and "the easy way" are often at odds.

    Case in point...I was in a hospital ER the other day, waiting in the room (for a very long time), and I looked at the computer in the room. I noticed that someone had affixed a sticker to the keyboard tray with (presumably) the Windows domain login info. Had I wanted to, I could have logged in and probably gotten to all kinds of medical records. Someone from the hospital's CIS department would probably poop a brick if he saw that.

    People are lazy, and security folks constantly have to walk the line between making things hard enough to be secure and not so hard that it's just easier to find the loopholes.
    • I can one-up you on that. I recently saw a security system control panel with the four-digit PIN code written in permanent marker on the plastic housing near the LCD display, and clearly labeled as such: "Security Code: 1-1-1-1". To make it even worse, the panel directly faces the unreinforced glass doors used for the business's main entrance, and is clearly legible from outside the building.
    • Re: (Score:3, Funny)

      by blhack ( 921171 ) *
      And that is why it SUCKS to be the person in charge of security for a domain. Make the security too harsh and the users complain (with good reason) that they can't get anything done. Make things too lax, and you turn into an alcoholic schizophrenic who does nothing but sit at home in the dark murmuring about exploits and the unencrypted telnet sessions that your entire company runs on, and how even the software providers out in North Carolina won't implement SSL in their software because all of their progr
  • by Zombie Ryushu ( 803103 ) on Wednesday October 24, 2007 @03:12PM (#21103707)
    I don't think that's the case. I think it's just that culturally we fear what we don't understand and are being taught to be stupid and proud of it. Biology and evolution have nothing to do with it. We can learn these concepts; we just willingly refuse to for religious and ideological reasons.
    • I don't think that's the case. I think it's just that culturally we fear what we don't understand and are being taught to be stupid and proud of it. Biology and evolution have nothing to do with it. We can learn these concepts; we just willingly refuse to for religious and ideological reasons.

      Human culture has evolved right alongside human physiology. I'm not sure that there's any benefit to trying to distinguish between them at this level of discussion.

  • He's a security guy, not a biologist. His list (I must not be well today; I'm actually RTFAing) is correct; e.g., 3,000 deaths this century in the US from terrorism and 40,000 every single year on the highways, but OMG ITS TEH TERRAISTS!

    However, although he's well versed in security, his grasp of evolution is even slimmer than mine, and I'm no biologist, either. The only way evolution would come into play would be if computer security had the effect of killing us before we had children. Clearly, the security of
    • Linda, OTOH, had 14 kids, 13 of which are still alive. She trumps me in the evolution game 13 to 2.

      I'm almost certain that this can be shown to be a fallacy. Natural selection is an ongoing process. If you're a one-trick pony (in this case, lots of children), then you have many offspring, but they are all more likely to be "specialists", not "generalists", and will be less adaptable.

      Anyway, I note that
      a) Linda's large family is less likely to be down to genetic factors than it is to social or cultural factor

    • Re: (Score:3, Funny)

      by Jasin Natael ( 14968 )

      There is no possible way to "evolve" computer security.

      Then, it sounds like we need a lethal, compulsory video game with a computer security theme.

  • Just an excuse (Score:5, Insightful)

    by Kohath ( 38547 ) on Wednesday October 24, 2007 @03:16PM (#21103761)
    Security solutions have to be designed around usability. If usability isn't the #1 or #2 consideration, it will increase the failure rate of the humans involved and you'll end up with an insecure system in practice regardless of the technical merits of the security methods.
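
    One concrete way to see that usability and security don't have to fight each other (a rough sketch; the ~94-character printable-ASCII set and the 7776-word Diceware-style list are the usual ballpark assumptions, not measurements):

    import math

    # Compare a hard-to-remember random password with an easy-to-remember
    # multi-word passphrase, measured in bits of entropy, assuming each
    # character or word is picked uniformly at random.
    def entropy_bits(symbols, length):
        return length * math.log(symbols, 2)

    random_8_chars = entropy_bits(94, 8)      # e.g. "kT7%qz!b" -- the kind that ends up on a sticky note
    five_word_phrase = entropy_bits(7776, 5)  # e.g. "apple canyon rocket mango tulip"

    print("8 random characters: %.0f bits" % random_8_chars)    # ~52 bits
    print("5 random words:      %.0f bits" % five_word_phrase)  # ~65 bits

    The option people can actually remember is also the stronger one, which is the point: if the usable path is also the secure path, the failure rate goes way down.
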
  • The crude animal impulses present in the vast bulk of humanity are masked by the accumulation of accomplishments by extremely rare geniuses. Skim off the top 1% of creative freethinkers, and humanity wouldn't be all that different from any other species on this planet. Our feelings about what is or is not secure are easy to game with scary stories and special effects. Our desire to live peacefully in a democracy can quickly be overwhelmed by a relatively small threat, such as by a group of underfunded Isl
  • What a pile of carp (Score:4, Interesting)

    by Roadkills-R-Us ( 122219 ) on Wednesday October 24, 2007 @03:18PM (#21103799) Homepage
    The real problems are, in no particular order:

    1) A lot of people are either stupid or uneducated.
    2) A lot of people don't bother to think.
    3) A lot of people are sheep and believe what they're told by marketing.
    4) A lot of people are lazy.

    I guarantee you this covers the vast majority of the problems with IT security. It's not biological evolution, though you could make a good argument for societal devolution being the problem.
    • by Frozen Void ( 831218 ) on Wednesday October 24, 2007 @03:38PM (#21104041) Homepage
      You forgot:
      5) Building an insecure system from the ground up and expecting the users to fix it.
    • by grumbel ( 592662 )
      I completely disagree. Sure, people might be lazy, not bother to think, and so on, but that's not the problem; that's the way humans are (maybe not all, but a very large number of them), and there is *no* chance to 'fix' that on a global scale. If you just blame the user you will never reach a good state when it comes to IT security.

      Security talk is *way* too focused on rather irrelevant theoretical stuff. Sure, it might be interesting that algorithm X is now vulnerable to attacks Y and Z, both of them however ve
    • And that list describes both the IT security community and the systems' users.

      The real problem is with the IT 'pros.' They need to develop security solutions that apply to users with just those attributes. The users aren't changing any time soon.

      Lazy, stupid, unthinking IT sheep need to get their act together.

      IT security is not evolved for the people it is intended to serve.
    • by mstahl ( 701501 )

      You should read Kevin Mitnick's book [amazon.com] on the human element of security. There are a lot more reasons beyond laziness why security fails in a lot of circumstances. His book covers physical as well as abstract security.

    • Re: (Score:3, Insightful)

      by turing_m ( 1030530 )
      It also stems from upper management either not being smart enough or not dedicating enough time to do a bit of basic research on security, so they either ignore security issues entirely, or want security but completely underestimate the intelligence required to do a good job at it.

      I'm reminded of reading "Surely You're Joking, Mr Feynman!", where Feynman routinely bypassed the cargo cultish efforts at security by his ostensible military overseers. It's the same pattern - primitive people attempting to
  • by Otter ( 3800 ) on Wednesday October 24, 2007 @03:19PM (#21103809) Journal
    Is there anything on which Bruce Schneier is not an expert? Now he's an expert on evolution? I'm not sure why he thinks his knowledge of cryptography qualifies him to hold forth on every freaking subject on the planet.
  • I guess people are running around in some sort of Darwinian intellectual enlightenment these days. I've been seeing bad evolution and artificial intelligence references all over the place recently. It's only a matter of time until some jack-off writes about a Darwin 2.0 semantic web.

    Anyway...the issue with security isn't that people aren't "evolved" enough to use it, it's just that the solutions presented to the masses are garbage. You don't implement something in a way which makes it difficult to use, the
    • I think it's the opposite. I think most people are capable of advanced lines of thought; they just choose not to because to them it's work.

      Like, if they have to use a password that is hard to guess [er, remember] then they look at the service as "unfriendly." If they have to wrap their minds around trivial concepts like public and private keys, then the solution is too hard (honestly, if you can't figure out public/private keys you're probably operating on the mentality level of a severely retarded 8 yr o
    • Re:Stupid Crap (Score:5, Interesting)

      by Quiet_Desperation ( 858215 ) on Wednesday October 24, 2007 @03:33PM (#21103969)
      which makes it difficult to use, then say that people are just too dumb to use it.

      That always amazes me to this day.

      IT GUY: Your PC is insecure.
      AVERAGE JOE: I don't really know how to properly secure it.
      IT GUY: Dumbfuck.

      Yeah, great approach. Gosh, why don't we teach kids that way?

      TEACHER: What's 147 divided by 7?
      FIRST GRADER: You haven't taught us division yet.
      TEACHER: Dumbfuck.

  • Time to get rid of planes (not snakes, just the planes), frozen yoghurt and TV. I can't see how any of that is in our genetic makeup. If we were meant to fly, I'm sure we'd have evolved some wings by now.
  • This sounds like what's in his 2003 book, Beyond Fear.

    I suppose we need the repetition though.

  • On the other hand, why *should* we evolve for IT security? It's not like there's a Darwin Award waiting for the dumbest user or admin. There's no evolutionary advantage for comp sec aware folks... unless we start creating some, like opening up safety related systems to the wild. Mmmm, how about wireless interfaces to the internal networks of cars, or to household appliances like gas stoves? Or the charge circuitry of Li-Ion batteries? That'll teach the noobs.
  • People are very quick to confuse inbred and conditioned behavior, because it can be hard to distinguish.

    Calling a behavior inbred is usually a cop-out: if it's inbred, then we can't do anything about it, so we can stop thinking logically about it and just attribute it to bad human wiring. It's the lazy person's way to end an argument.

    I suggest to you, that someone who has been brought up in an environment where trust is treated like the complex subject that it is, will do better than someone brought up in
  • Our brains haven't evolved a single fixed way to solve problems; that's why we're as successful as we are as a species: our brains can adapt and solve new problems as they come up.

    This guy demonstrates a severe lack of understanding of the subject, which is odd given who it is.
  • " 'As a species we got really good at estimating risk in an East African village 100,000 years ago. But in 2007 London? Modern times are harder.'""

    So those kids in East Africa with their shiny new XOs should run rings around us westerners?

    Oh, wait...
  • But in 2007 London? Modern times are harder.

    Phew! I'm glad I'm in Seattle.
  • I spent years doing technical security, but that eventually turns to box shifting. Sure, there are very clever tools out there, but what good is that going to do my clients if they still leave a laptop ready to be stolen, and use passwords an 8 year old can guess?

    And that's again just the technical side. We have a setup which advises on all sorts of security, and doing the anti-kidnap coaching is a serious eye opener for someone who's been living on the command line. It puts it all in perspective (althou
