Security IT

Schneier: Security Awareness Training 'a Waste of Time' 284

An anonymous reader writes "Security guru Bruce Schneier contends that money spent on user awareness training could be better spent and that the real failings lie in security design. 'The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on,' Schneier writes in a blog post on Dark Reading. He says organizations should invest in security training for developers. He goes on, '... computer security is an abstract benefit that gets in the way of enjoying the Internet. Good practices might protect me from a theoretical attack at some time in the future, but they’re a bother right now, and I have more fun things to think about. This is the same trick Facebook uses to get people to give away their privacy. No one reads through new privacy policies; it's much easier to just click "OK" and start chatting with your friends. In short: Security is never salient.'"
This discussion has been archived. No new comments can be posted.

  • by qbast ( 1265706 ) on Wednesday March 20, 2013 @05:17AM (#43221781)
    It demonstrates that the car industry has failed. We should be designing systems that don't need seatbelts and don't care if the user decides to slam into a tree at 100 km/h. The whole concept of secure driving is just an abstract benefit that gets in the way of enjoying driving.
  • Well, duh.. (Score:2, Insightful)

    by Anonymous Coward on Wednesday March 20, 2013 @05:20AM (#43221793)

    Users can screw up because they are just as human as you. So live with it. Design around it. Make it safe regardless.

    I've only been saying that since, oh, 1999 or so.

    Policies are OK, but rules that assume perfect compliance in order to work are really only there to cloak the failure to engineer some fault tolerance into the system architecture and UI design. Glad someone finally caught on.

  • Obligatory quote (Score:5, Insightful)

    by Krneki ( 1192201 ) on Wednesday March 20, 2013 @05:27AM (#43221815)

    A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools.

  • Invalid comparison (Score:5, Insightful)

    by Aethedor ( 973725 ) on Wednesday March 20, 2013 @05:33AM (#43221835)

    He's comparing security with health and driving to 'prove his point'. Security is not the same as health or driving, so any conclusion drawn from such a comparison is unsound.

    Second, you don't have to choose between completely ignoring security awareness training and spending lots of money and time on it. There is a very good choice somewhere in between. I agree with him that information systems have to be secure and shouldn't offer dangerous actions, but no matter how secure you make your information system, it will all fail if the user has no clue about what he or she is doing. And giving employees a basic level of security awareness doesn't have to cost a lot of money but will still help you prevent a lot of trouble.

  • by Anonymous Coward on Wednesday March 20, 2013 @05:45AM (#43221869)

    Security Awareness training is a tick the box exercise most companies do to get auditors off their back.

    Apparently, users are supposed to be "trained to recognise phishing emails and other Internet frauds". IT has enough trouble these days recognising them, and somehow our ordinary users are supposed to recognise them too?

    Users have to be "trained to pick good passwords". The system should be designed to prevent users from picking bad passwords in the first place.

    Users should be advised to "pick strong passwords and change them regularly". These are two contradictory demands: no one can remember a new complex password that changes regularly unless they write it down. Oh, and users should be told "not to write down passwords".

    Awareness training is pushed because there are a number of so-called "security consultants" who have no real technical skills, yet have made a living pushing this snakeoil. They unfortunately are also good self-promoters and have the ear of regulators and auditors.

    If you are relying on security awareness to protect your infrastructure, you're screwed. Most users don't care, and even those who do care cannot possibly be expected to remain aware of the myriad of threats that exist. Often, their attempts to remain secure achieve the opposite purpose ("I heard you tell me email was insecure, so I use dropbox now to transmit files to customers").

    What galls me most is I have to spend part of my IT budget this year spending money on this stupid notion because it is expected by auditors. This means I have to cut back on the security projects that make a real difference.

  • by mwvdlee ( 775178 ) on Wednesday March 20, 2013 @05:46AM (#43221871) Homepage

    To stay closer to the original analogy...

    Would you drive a car randomly left by the side of the road with big stickers on it saying "You may be eligible to win $1mln if you drive this car!!!" (paid for by Soylent Green Corp.)?

  • by DMUTPeregrine ( 612791 ) on Wednesday March 20, 2013 @06:03AM (#43221915) Journal
    No, he's saying that we should be adding seat belts and anti-lock brakes and eventually self-driving cars to eliminate the need for the user to focus on safety while driving. He's arguing that safety should be built into the system, not left to the judgement of the user. That's the exact opposite of your example.
  • The worst thing (Score:5, Insightful)

    by drolli ( 522659 ) on Wednesday March 20, 2013 @06:12AM (#43221955) Journal

    is that many companies are too lazy to even get the most fundamental things right. Why on earth would you not distribute your own CA for your internal web services? Do you really want to train your employees that clicking the "accept certificate" button is an everyday thing to do? Why don't you manage the security settings so that "content from an unknown source" is not the same as "content from your own file server"? How should the office assistant know that this is dangerous and theoretically unusual if the everyday instructions say to accept it several times per day? Why would you enable macros in office documents for no reason and not sign the documents?

    All security training and hints like "be careful when opening attachments from unknown sources" are annihilated if you train your employees every day to do the exact opposite, by constructing workflows and selecting toolsets that require exactly that.

    My 2 cents on this:

    a) If there is a "do not use/do x" in your security education, then something is wrong. The right way is "use/do y".

    b) Construct your standard processes in a way that your users/employees can work securely *AND* efficiently.

    c) If there are new tools and your users demand them, keep an open ear! Note to management: reserve some budget for them. If users find Dropbox an efficient service, the right way is not to forbid it but to ask yourself why you can't provide a decent file-sharing service on your own servers.
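    The internal-CA point above takes only a handful of openssl commands. A minimal sketch; the file names and CN values are made up for illustration, and a production setup would add key protection and a revocation story:

```shell
# Create the internal CA: a 4096-bit key and a self-signed
# root certificate valid for ten years.
openssl genrsa -out ca.key 4096
openssl req -x509 -new -key ca.key -sha256 -days 3650 \
    -subj "/CN=Example Corp Internal CA" -out ca.crt

# Issue a certificate for an internal web service, signed by that CA.
openssl genrsa -out fileserver.key 2048
openssl req -new -key fileserver.key \
    -subj "/CN=fileserver.internal.example.com" -out fileserver.csr
openssl x509 -req -in fileserver.csr -CA ca.crt -CAkey ca.key \
    -CAcreateserial -sha256 -days 825 -out fileserver.crt
```

    Push ca.crt into every workstation's trust store via your normal deployment tooling, and the "accept certificate" dialog never has to become a daily habit.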

  • by Anonymous Coward on Wednesday March 20, 2013 @06:20AM (#43221993)

    Driving a car is a far more focused task, with more salient dangers. Even without safety training people understand that driving erratically, or at high speeds can be dangerous. Using a computer or the internet is more like watching TV or reading an article and determining if what you're watching is fact or fiction; it requires judgement and motivation.

    Given that many adults don't have these skills and, importantly, that the effects of failure extend beyond the individual involved, what Schneier is proposing makes sense.
     

  • by iapetus ( 24050 ) on Wednesday March 20, 2013 @06:22AM (#43221999) Homepage

    Sorry, but your approach is inefficient. Since the system now requires users to choose passwords that aren't memorable (and probably to change them regularly as well) a large number of them will have them written down on post-it notes stuck to their monitors. That reduces the search space even more. :D

  • by jewens ( 993139 ) on Wednesday March 20, 2013 @06:34AM (#43222047)
    The training itself may be inexpensive, but the lost time for "all" employees forced to take the course/class/lecture is not. Not to mention the burden on your staff of tracking attendance, compliance, etc.
  • by Anonymous Coward on Wednesday March 20, 2013 @06:52AM (#43222097)

    di$agr33WithY0uWh0leH3art3dly&&.

    You think this is memorable? Take a typical company where users are forced to change a password every 30 days.

    They have to remember a new passphrase.
    They have to remember that the start of each word is capitalised, except the first word.
    They have to remember to turn s into $, e into 3, etc. In case you're wondering, these are basic substitutions that password crackers find trivial to reverse, so you haven't really added two extra character sets; it's security theatre.
    They have to remember to add "&&." at the end.

    As the saying goes, you've created a password that is hard for a user to remember, and easy for a computer to guess.
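    The substitution trick above is mechanical, and cracking tools exploit that by enumerating every leet variant of a dictionary word. A minimal sketch; the substitution table here is illustrative, not any particular tool's rule set:

```python
from itertools import product

# A few common leet substitutions; real cracking rule sets
# (e.g. hashcat's) are larger but work the same way.
SUBS = {"s": ["s", "$", "5"], "e": ["e", "3"], "o": ["o", "0"],
        "i": ["i", "1", "!"], "a": ["a", "@", "4"]}

def mangle(word):
    """Yield every leet-substitution variant of a dictionary word."""
    choices = [SUBS.get(ch.lower(), [ch]) for ch in word]
    for combo in product(*choices):
        yield "".join(combo)

variants = set(mangle("disagree"))
print(len(variants))           # 108 variants per dictionary word
print("di$agr33" in variants)  # the "clever" spelling is among them
```

    A hundred-odd variants per dictionary word is a rounding error for a cracker, which is exactly why the substitutions add no real strength.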

  • by philip.paradis ( 2580427 ) on Wednesday March 20, 2013 @07:09AM (#43222171)

    Bruce is right. In many environments, information awareness training is an attempt to solve the problem at entirely the wrong end of the failure chain, and is frequently ineffective. It may be difficult for some to hear, but the fact is that such training simply doesn't have a great track record of producing significant overall gains in organizational security, largely owing to the difficulty of mitigating widespread stupidity on the part of human operators. Most companies are not wholly staffed by information security experts, and any perceived near-term security gains following training sessions quickly erode as employees revert to an attitude of "I just want to do X, Y, and Z, and I'm too busy to keep thinking about those scary stories portrayed in last week's training."

    Even military environments suffer from these training challenges. The difference in a military unit is the very real possibility of going to prison for merely mishandling cryptographic material by accident. On the "low" end of the punishment scale, there are more than a few senior enlisted military comms folks out of a job because of such process failures. I served with one such person.

    It's worth noting in closing that you might want to spend a bit of time looking into who Bruce Schneier [wikipedia.org] is before framing him in any additional snarky quote marks. To say this is a man who typically knows what he's talking about is an understatement.

  • Re:Not news (Score:5, Insightful)

    by serviscope_minor ( 664417 ) on Wednesday March 20, 2013 @07:52AM (#43222347) Journal

    Nice to hear it from someone with a big name. I'm an IT security specialist, giving talks every now and then, and I've basically been saying the same thing for years. It is one of the topics where I face the fiercest opposition, usually from (big surprise) consultants and other people who offer security awareness training.

    Of course, I am exaggerating a bit to make the point. I do think that training to make users familiar with specific security protocols is useful. I don't think general security awareness is. There is a plethora of reasons why it's a failure, from the context-specific nature of the human mind to the abstract level, but the main reason is that we have enough experience to show that it really is a waste of time and resources. Putting the same amount of money and effort into almost any other security program is going to give you a better ROI.

    I am honestly surprised by this. I really do not see how you can avoid security awareness training.

    Forcing the users to pick non-lousy passwords is simply not enough if the users will happily respond to an email from email.admin@scamsite.ru (Re: YO'RE ACCOUNT IS SOON EXPiRE!1) with their username, password, SSN, date of birth and random security questions.

    OK, that's a bit of an exaggeration, but users do happily respond to really poor phishing attacks and will tell their password to someone they assume is an email admin because the email comes from an account with admin in the name.

    Security is as much a social problem as a technical one, and you simply cannot ignore the social aspect. And for that, people have to have some understanding of basic security protocols: e.g. the admins will never ask for your password.

    In fact, I would go as far as to say that security is very much a social problem. Technology will only get you half way. If your system is not easily hackable from the outside, you have reached the minimum standard. The trouble is that "social engineering" is really easy.

    Even if you switch to two-factor authentication it won't help enough: if the user believes that an admin has contacted them, then they will do ANYTHING to help that admin and will even follow detailed instructions to bypass as much security as possible. For some reason, people being scammed are way better at following instructions than people who aren't.

    As someone else quoted earlier: never underestimate the ingenuity of complete fools.

  • by Loki_666 ( 824073 ) on Wednesday March 20, 2013 @08:07AM (#43222403)

    Damn my lack of mod points today. +1

    Force users to choose complex passwords and they write them down, or they learn what the minimum requirement is and create something stupidly simple anyway. Or they constantly forget their complex passwords and bug the admins to reset them every 5 minutes. The final variant is that they use the same complex password for all systems. It's fairly secure against brute force or random guessing, but once a hacker has one password, he has them all... one password to rule them all, etc.

    I've used systems with ridiculous requirements where I've not been able to remember an hour later what the hell I used. Something like requiring at least one capital, one number, one punctuation mark, no more than 2 consecutive characters, and no less than 12 characters. I ended up with something like this: Aabbaabbaabb1!
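    The rules described above are easy to state in code, and a password like Aabbaabbaabb1! sails straight through them. A sketch, assuming "no more than 2 consecutive characters" means no character repeated more than twice in a row (an interpretation, since the original wording is ambiguous):

```python
import re

def passes_policy(pw):
    """The policy described above: >= 12 chars, at least one capital,
    one digit, one punctuation mark, no run of 3+ identical chars."""
    return (len(pw) >= 12
            and re.search(r"[A-Z]", pw) is not None      # one capital
            and re.search(r"[0-9]", pw) is not None      # one number
            and re.search(r"[^\w\s]", pw) is not None    # one punctuation mark
            and re.search(r"(.)\1\1", pw) is None)       # no aaa-style runs

print(passes_policy("Aabbaabbaabb1!"))               # True: compliant, trivial pattern
print(passes_policy("correct horse battery staple")) # False: high entropy, rejected
```

    The checker accepts a keyboard-walk pattern and rejects an xkcd/936-style passphrase, which is the whole complaint in two lines of output.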

  • by PopeRatzo ( 965947 ) on Wednesday March 20, 2013 @08:32AM (#43222567) Journal

    Employee awareness training is inexpensive and I bet "Security guru Bruce Schneier" will provide training to your developers that is not inexpensive.

    Training only goes so far. Even the best-trained user will make a mistake.

    "Oh, I didn't mean to click that".

  • Re:Well, duh.. (Score:5, Insightful)

    by ATMAvatar ( 648864 ) on Wednesday March 20, 2013 @08:43AM (#43222657) Journal
    They don't intentionally do so, sure. However, most software designers are not trained to develop secure software, are not paid to develop secure software, and in fact, would probably get a heated talking-to by management if they spent the extra time to make their software secure without explicit instructions to do so.
  • by Annirak ( 181684 ) on Wednesday March 20, 2013 @08:55AM (#43222751)

    Yes, I do. The problem is that passwords are fundamentally broken. They are broken in several ways.

    1) The password must be hard to guess. This, generally, makes it hard to remember.
    2) Many implementations restrict the number of characters that I can use for a password. This is downright stupid, as it prevents xkcd/936 compliance.
    3) Every service which uses a password must have a different password to prevent password reuse attacks. This exacerbates 1).
    4) I need a way to recover the password if I lose it. This exposes a secondary attack vector on my password.
    5) There needs to be a guarantee that the password will never be transmitted or stored unencrypted.

    OAuth fixes 3) and mitigates 5) and 2).
    Two-factor authentication fixes 1): guessing my password can be easy, provided that attacks on my service provider are slow and that I can report my token lost/stolen in a time several orders of magnitude shorter than the time required to search the whole solution space.
    Biometrics can be used to mitigate 1) and 4), but they expose additional flaws, such as lack of revocation. If someone ever gets your fingerprint, they have access to all your fingerprint secured data/possessions, unless they are additionally secured by something else.

    Using most OAuth vendors, however, exposes an additional security hole: tracking by the OAuth vendor (see Google, Facebook privacy concerns).

    Ultimately, it seems to me that the solution is probably private OAuth vendors with support for smartphone-based secure keys. The problem is getting service providers, such as banks, to implement OAuth via a username + domain (OAuth vendor) + token approach.

    This should allow users to choose their OAuth vendor, thereby allowing flexibility in the market when a particular OAuth vendor does Bad Things with users' data. This makes the required password complexity minimal. If the engine which processes the token and password were rolled into a secure smartphone application and transmitted to the OAuth vendor via a back-channel, it would also prevent password scraping.
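    The smartphone-based token mentioned above is commonly implemented as TOTP (RFC 6238), which is small enough to sketch in full. This is a generic illustration of the standard, not any particular vendor's scheme:

```python
import base64, hashlib, hmac, struct, time

def hotp(secret_b32, counter, digits=6):
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian 64-bit counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32, at=None, step=30):
    """RFC 6238 TOTP: HOTP keyed by the current 30-second time step."""
    t = time.time() if at is None else at
    return hotp(secret_b32, int(t // step))
```

    Because the code is derived from a shared secret plus the clock, a phished code expires within a step or two. That narrows the attack window considerably, though as noted elsewhere in this thread it doesn't stop a victim who follows a scammer's live instructions.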

  • Re:Not news (Score:4, Insightful)

    by Tom ( 822 ) on Wednesday March 20, 2013 @09:22AM (#43222951) Homepage Journal

    I really do not see how you can avoid security awareness training.

    To use a metaphor from my most recent talk: If you need to write "push" and "pull" on your doors, then they are designed badly. Same for security awareness. Improving the security tools is better than telling people how to safely handle broken tools.

    but users do happily respond to really poor phishing attacks

    Yes, they do.

    And what sustained change, exactly, has all the security awareness training we've been doing for two decades produced? That is the point. Not that we don't have a security problem, but that security awareness training is not a good way to solve it.

    Security is as much as a social problem as a technical one, and you simply cannot ignore the social aspect.

    I don't. On the contrary, I believe the security awareness training advocates do. They think that just telling someone solves the problem, when overwhelming evidence to the contrary proves them wrong.

    I believe the solution lies in asking a) why and b) how the users break security protocols and then tackling those issues, instead of telling them "don't do it" and thinking you've solved the problem.

    As someone else quoted earlier: never underestimate the ingenuity of complete fools.

    I believe calling the users dumb and fools and "lusers" and such is a cop-out. It's an easy pseudo-solution to avoid the real problem, which is not so trivial. Redesigning your concepts, protocols, hardware and software to be fail-safe (or idiot-proof, if you want) is hard. Much harder than shoving everyone into a room to listen to a boring lecture, 90% of which they'll have forgotten as soon as they're out the door.

  • by Idarubicin ( 579475 ) on Wednesday March 20, 2013 @09:50AM (#43223195) Journal
    You really, really, really don't know who Bruce Schneier is, do you?

    Moreover, you really couldn't even be bothered to do a simple Google search before you shot your mouth off, could you?

    In a way, you're actually making Schneier's point. Posting a snarky Slashdot comment is easy and instantly gratifying; doing the least bit of research is a little bit harder and doesn't pay off immediately -- so you can see which happens more often.

  • by philip.paradis ( 2580427 ) on Wednesday March 20, 2013 @09:51AM (#43223205)

    This isn't merely a problem of specialization limiting perception. You're expecting average users to consistently conduct themselves in a manner most of them are demonstrably incapable of. Terminating the employment of those who fall victim to attacks through their own inaction or outright carelessness isn't a long-term solution either, as it merely results in churn and a significantly higher bar in terms of what sort of person may be employed at a company. Money is limited, and organizations have to make decisions on the most effective ways to spend that capital with an aim to improving overall organizational security. That money is best spent on incrementally improved and frequently reevaluated security infrastructure and processes that inhibit improper access or information disclosure without overt reliance on human operators to make correct choices in terms of security posture, because those operators will often fail.

    I've spent years dealing with problems in this area, and I strongly dislike the reality of the situation. Unfortunately, my disliking it doesn't make it less true.

  • by Idarubicin ( 579475 ) on Wednesday March 20, 2013 @09:53AM (#43223223) Journal

    It demonstrates that the car industry has failed.

    I would say that the car industry had failed if listening to the wrong radio station - tuning 92.3 instead of 92.5, say - allowed a malicious broadcaster to arbitrarily incinerate the contents of my trunk or assume remote control of my vehicle.

  • by delt0r ( 999393 ) on Wednesday March 20, 2013 @10:53AM (#43223903)
    And for many people this is more secure. Instead of any script kiddie with a laptop breaking into your email account from anywhere in the world, they have to break into your office first. For 99.99% of us this is not a credible threat.
  • by JDG1980 ( 2438906 ) on Wednesday March 20, 2013 @11:53AM (#43224481)

    And, compared to the cost of cleaning up an incident, it's STILL infinitesimally small.

    All it takes is one single employee ignoring or disregarding the training, and you'll still be paying that cleanup cost. That's Bruce Schneier's point: it's a structural problem, not one that can be fixed by placing more burdens on end users.

  • by Opportunist ( 166417 ) on Wednesday March 20, 2013 @02:57PM (#43226345)

    Lazy bums aside, employees are most concerned with getting their work done. Security is usually one of the things that gets in the way of this. I'm often appalled by the way quite a few companies handle security (I tend to see more than my fair share being a security consultant), it often seems they have some CISO who needs to build a monument for himself, showing off how much he works by making sure EVERYONE knows about it by the sheer number of hoops that they have to jump through. That's how you get amazingly stupid setups for passwords like "at least 12 characters, no 3 characters of the alphabet in consecutive order, at least 2 numbers not at start or end and not next to each other with at least 2 special characters ....yaddayadda".

    If you see anything like this, start flipping keyboards and count the ones that contain post-it notes with the passwords du jour (because of course they need to change every other nanosecond, too).

    This has nothing to do with security, people, this is what I dubbed "Monkey Island Security". You remember Monkey Island? Where Guybrush gets jailed by those cannibals and they start putting up more and more elaborate doors every time you escape through the wall? That's what some do in IT security, we get more and more elaborate and time consuming hoops our employees get to jump through while those that want to bypass security can easily ignore that because the problem lies elsewhere.

    NO, and I mean ZERO, security breaches that I have been aware of in the last two decades can be traced to password guessing. It is amazing, though, how many of the breaches blamed on personnel blunders can eventually be traced to people trying to cope (yes, cope) with security. Post-its containing passwords. Security measures unhinged or bypassed by employees because they actually kept them from doing their work. And so on, and so forth.

    Security does NOT mean annoying your employees. Perfect security would actually be nearly invisible to your employees. Because that would also include them not being part of the security system, hence, not being able to fuck it up!
