Are Usability & Security Opposites in Computing? 253

krozinov writes "Instinct tells us that computer security and computer usability are inversely proportional to each other. In other words, the tougher and stricter the security is, the less usability there is, and vice versa. However, there have been plenty of cases where computer security and computer usability went hand in hand and actually improved together. In the last few years security has been the biggest buzzword in computing and as such has become part of our computer systems. Before that, computer systems were all about getting the job done faster and easier, but now they must also do it securely. Can the two continue growing together? This paper argues that they can, as evidenced by the most recent Indian Assembly Election."
Comments Filter:
  • by reynaert ( 264437 ) on Monday November 15, 2004 @12:31PM (#10820777)
    Most applications manage being both unusable and insecure just fine.
    • by dnoyeb ( 547705 ) on Monday November 15, 2004 @03:04PM (#10822379) Homepage Journal
      Usability is what happens after security is cleared. Security's whole point is to give usability to those who are authorized to have it. If security is interfering with usability, then you will find that even people with authorization will start looking for ways to subvert it. Thus, any security that interferes with usability is bad security.

      It's kind of like welding car doors shut and calling it more secure. It is until people start entering through the windows on a daily basis.

      Just look at CD copy security measures that get cracked in minutes because they interfere with usability.
      • by henrycoderm ( 825377 ) on Monday November 15, 2004 @03:20PM (#10822564)
        Isn't people entering through the Windows the main problem?
      • The best example I can think of is the inability of CD writers to make UDF/DirectCD formats readable in ISO-9660 format unless the user has administrator permissions. I prefer the security of users not having admin permissions all the time, but the hassle of having to switch users or log out and back in again has led to many users demanding admin permissions as the default.
      • by eno2001 ( 527078 ) on Monday November 15, 2004 @05:30PM (#10823855) Homepage Journal
        The usability problem occurs during authorization. People don't like to remember complex passwords, so they pick something easy to remember (and figure out) or they write it down on a piece of paper. Or if you use a token authentication system like RSA tokens (with the random number for logging in), then you have added a level of complexity that most users are confounded by. Where I work we frequently get calls because the users can't deal with the tokens. With SSH and static keys, you have the option of using a passphrase. But many people opt for a blank passphrase so there is nothing to type. Here is the ideal:

        1. You touch a computer, it knows who you are by some mystical means and grants you access.
        2. You don't need to remember anything. No passwords, no voice print, no finger print, no retinal scan, nothing. It just knows who you are.
        3. Once it's determined who you are, then it knows what you are allowed to do.

        What is needed is an authentication mechanism that works in the same way that we "authenticate" our friends and family to interact with us. If you see your wife, husband, girlfriend, boyfriend or child, you have a predefined "access list" that allows them access to your resources. The authentication is that you know your relationship to them. A girlfriend or boyfriend may allow sexual contact with their partner that they wouldn't allow to their child or parent. Pretty basic, but that's what most people (deep down) want from their machines. (No. Not the sex, you idiot, the access to a resource.) Until machines can actually recognize us (which probably won't happen until they know themselves), I think we're going to have this usability/security problem.
    • It can be related to a Johari window [wikipedia.org].

      There are applications that are

      • both unusable and insecure
      • applications that are just usable
      • applications that are just secure
      • And then, of course, there's the "unknown" pane into which secure & usable applications fall.
    • Re:Of course not. (Score:2, Interesting)

      by edxwelch ( 600979 )
      Yes, a good example is that pop-up warning message box that Outlook gives you when you receive an email with an attachment - it provides no real security and at the same time serves to make the application that much more annoying to use.
  • My Soapbox (Score:5, Insightful)

    by rednip ( 186217 ) on Monday November 15, 2004 @12:32PM (#10820789) Journal
    My best example of where 'increased security' actually defeats its purpose is rapid password expiration. I've seen password policies which force a user to change their password every thirty days. The problem is that most users have trouble remembering passwords. This 'forces' users to do one of two things:
    1. create a series of passwords, which may be as simple as adding a number to the end.
    2. write down passwords
    System Admins and Managers can force unique passwords, keep a long password history, and check desks, but then the burden falls more heavily on their help desk system.

    No matter what the password policy is, eventually users will need to have a password reset, and each time is a cost on the tech support system. Proper security would have a security officer physically identify each user before a reset, but that would be costly, so they instead ask a couple of profile questions, which opens up social-engineering issues. So generally, the harder your password policies are, the easier your reset policies need to be (unless cost really isn't an issue).

    • Re:My Soapbox (Score:2, Interesting)

      by omghi2u ( 808195 )
      Is there a way to check for similar passwords in someone's history without 1) violating their privacy or 2) compromising their password?

      Just a thought.

      But you are totally correct in that conundrum!
      • Not unless the old password is stored in cleartext somewhere and the new password is transmitted in cleartext (well, not in hash form at least) over the wire... both of which are additional weaknesses... the first one being much worse than having similar passwords IMHO.
      • Yep. You can compare the hash value of the new password to the hash values of any old passwords - a completely secure way to make sure passwords are not repeated.

        • You responded to: Is there a way to check for similar passwords in someone's history

          by saying: You can check the hash value of the new password to the hash value of any old passwords

          Now, I've only taken a few CS classes, but my understanding is that a *good* password hash should yield very different results for similar passwords. It seems that if you can see the similarities between "Password" and "pASSWORD" in your password hashing algorithm, then it's not a very good hashing algorithm.
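
          To make that trade-off concrete, here is a minimal sketch of what hash comparison can and cannot do for a password history (salted PBKDF2 is an assumption here; the thread doesn't name a scheme). Because each stored entry has its own salt, the candidate must be re-hashed against every old salt, and only exact reuse is detectable; as the parent notes, "Password" and "pASSWORD" hash to unrelated values:

          ```python
          import hashlib, hmac, secrets

          def make_history_entry(password: str) -> tuple[bytes, bytes]:
              """Store a random salt plus a PBKDF2 hash, never the cleartext."""
              salt = secrets.token_bytes(16)
              digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
              return salt, digest

          def reused_exactly(candidate: str, history: list[tuple[bytes, bytes]]) -> bool:
              """True only if the candidate matches an old password verbatim.
              Similar-but-different passwords hash to unrelated values, so
              'similarity' cannot be detected this way -- only exact repeats."""
              for salt, old_digest in history:
                  new_digest = hashlib.pbkdf2_hmac("sha256", candidate.encode(), salt, 100_000)
                  if hmac.compare_digest(new_digest, old_digest):
                      return True
              return False

          history = [make_history_entry(p) for p in ("hunter2", "hunter3")]
          print(reused_exactly("hunter2", history))   # True  (exact repeat)
          print(reused_exactly("HUNTER2", history))   # False (similar, but undetectable)
          ```
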
    • by stecoop ( 759508 ) on Monday November 15, 2004 @03:00PM (#10822341) Journal
      I especially like the policies where your account is locked for something like 30 minutes after N bad password attempts. I like trying to guess the boss's password right before a high-level critical presentation. For some reason the administrator account never gets locked, though; that's too bad, huh?
    • Re:My Soapbox (Score:2, Insightful)

      by Anonymous Coward
      I find that sites with single sign-on have much less of a problem with users forgetting passwords, even when they must change them frequently.

      It's when we force them to remember several user/passes that they get into trouble. Especially if the systems all have different password policies and/or naming conventions.

      If only there were a true, inexpensive, and easy to setup/maintain single sign-on solution.
    • There are gadgets that store passwords "safely" and fit well on a keychain. If the security of passwords is so critical for your company, it can buy one of those things for everyone there, or just use magnetic cards instead of making people write passwords down (of course, both raise some physical security concerns, but they are safer than some other alternatives). It's not exactly the same, but one of the uses for my Palm is just storing hard-to-remember passwords there, protected by a master password, with strip [zetetic.net].

      Also,

      • Re:My Soapbox (Score:4, Interesting)

        by TykeClone ( 668449 ) * <TykeClone@gmail.com> on Monday November 15, 2004 @03:23PM (#10822598) Homepage Journal
        Sometimes the password edict is not from the company, but from the regulators. In the banking world, the IT examiners are not (necessarily, but who's kidding - they're really not) IT people. They've got a script that they follow in looking for "IT risks" and if you have weak password policies (not forcing changes every 30 days and lockouts and other stuff) you get knocked for it.

        Those gadgets are a nice idea, but I'm not sure that they would fly (yet) with the administrators.

    • Agreed there, but the other issue that I have at work is the sheer number of passwords that are required for my users. Then you have people use the same password over multiple platforms and applications.
    • Re:My Soapbox (Score:5, Interesting)

      by ajs ( 35943 ) <ajs.ajs@com> on Monday November 15, 2004 @03:31PM (#10822675) Homepage Journal
      I resolved this problem by writing a program that generates provably secure, memorable passwords for users.

      Of course, the security buffs in the audience just stood their chairs back upright, brushed off the cheetos dust from their pants and are preparing to roast me over a slow fire for public stupidity. Let me explain.

      I tried using a password generator called mkpasswd that comes with expect. I thought it generated great passwords because they looked impressively secure. Then I did the math... ulch.

      This was my introduction to a concept that I later read about in many places, including Applied Cryptography: the human's ability to judge secure from insecure is based on pattern-recognition. If you generate passwords or other tokens that don't match a pattern that the brain is used to, it looks "obscure", and that maps in most people's minds to "secure"... wrong.

      This program generated a 9-character password (sounds good) which had to contain at least one punctuation mark and 2 digits... Problem is, there are only 10 digits, and just a handful more valid punctuation marks, so searching all 9-character passwords that contain 2 digits and a punctuation mark is orders of magnitude less work than searching all possible 9-character passwords. The result was then limited further by the requirement of 2 upper-case letters and 2 lower-case letters. Well, there goes the farm! It turns out that the result is easier to crack than a random sequence of alphanumerics with no punctuation (and only slightly more secure than an 8-character sequence of random alphanumerics)!
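
      A rough sanity check of that arithmetic (an illustrative sketch only; it assumes 10 permitted punctuation characters, where the post just says "a handful") is to enumerate how many 9-character strings satisfy all of those class minimums and compare against plain alphanumeric strings:

      ```python
      import math
      from itertools import product

      DIGITS, UPPER, LOWER, PUNCT = 10, 26, 26, 10   # assumed: 10 allowed punctuation marks

      def constrained_space(length=9, min_punct=1, min_digits=2, min_upper=2, min_lower=2):
          """Count length-character strings meeting all of the class minimums."""
          total = 0
          for p, d, u in product(range(length + 1), repeat=3):
              l = length - p - d - u
              if l < 0 or p < min_punct or d < min_digits or u < min_upper or l < min_lower:
                  continue
              # ways to place each class among the positions, times the choices within each class
              ways = math.factorial(length) // (
                  math.factorial(p) * math.factorial(d) * math.factorial(u) * math.factorial(l))
              total += ways * PUNCT**p * DIGITS**d * UPPER**u * LOWER**l
          return total

      print(math.log2(constrained_space()))              # bits of the constrained 9-char scheme
      print(math.log2((DIGITS + UPPER + LOWER) ** 9))    # ~53.6 bits: 9 random alphanumerics
      print(math.log2((DIGITS + UPPER + LOWER) ** 8))    # ~47.6 bits: 8 random alphanumerics
      ```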

      So, I began doing some research on techniques for generating things that would look insecure (i.e. are memorable), but would actually be more secure than mkpasswd's approach. I found several approaches, and eventually came up with several of my own over the course of about 8 years. I now use a set of about 20 patterns which are permuted into slightly over 100 patterns including pseudo-word generation, permutation and combination of English words and so on. Each pattern maps to at least 1x10^13 possible passwords, and usually much more.

      I've also added various strictness settings where the top 1% or so of crackable passwords are eliminated from the result space (this is tricky, as removing too many possible results is just as bad as having a weak pattern).

      I now generate all of my passwords this way, and in reviewing what I used to have for passwords before, I have to say that my passwords are certainly more difficult to crack now (of course, part of that is that I use longer passwords now that MD5 passwords are fairly universally supported).
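
      The poster's own generator isn't published here, but a toy sketch of the same word-pattern idea looks roughly like this (the wordlist, separator and word counts are assumptions for illustration, not details from the post):

      ```python
      import math
      import secrets

      # Illustrative mini wordlist; a real (diceware-style) list has thousands of entries.
      WORDS = ["correct", "horse", "battery", "staple", "orbit", "velvet", "granite", "mosaic"]

      def memorable_password(n_words: int = 4, sep: str = "-") -> str:
          """Join randomly chosen words: memorable, yet drawn from a huge pattern
          space once the wordlist is large enough."""
          return sep.join(secrets.choice(WORDS) for _ in range(n_words))

      def pattern_bits(wordlist_size: int, n_words: int) -> float:
          """Entropy of the pattern in bits: n_words * log2(wordlist_size)."""
          return n_words * math.log2(wordlist_size)

      print(memorable_password())
      # With a 7776-word list: 4 words is ~51.7 bits (about 9 random alphanumeric
      # characters), 5 words is ~64.6 bits (about 11).
      print(pattern_bits(7776, 4), pattern_bits(7776, 5))
      ```

      The point is the same as the post's: the output looks "insecure" because it is readable, yet the pattern space stays large.
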
      • Re:My Soapbox (Score:2, Insightful)

        by Meostro ( 788797 )
        So you claim that you read Applied Cryptography, and yet you use a proprietary/secret method, not obviously subjected to peer review, to generate your "secure" passwords?

        You, sir, are probably an idiot.

        Your idea is interesting and overall it sounds sensible, but unless others poke and prod at the exact details, you'll never know if your passwords really are secure or not.
      • Re:My Soapbox (Score:2, Insightful)

        by SanGrail ( 472847 )
        Sounds interesting...
        I Am Not A Cryptography expert, so I was just wondering if you could explain further why *excluding* punctuation and numbers was harder to crack?
        If you didn't know *which* characters were letters, numbers, or punctuation, wouldn't that mean instead of just trying 26^10 combinations, you'd be doing (26+10+punctuation?)^10?

        On the other hand, I do definitely agree that having more memorable passwords (usually pronounceable) definitely pays off, as while there's a higher probability of vow
      • So, is this program available? Under the GPL or BSDL, hopefully?
      • I resolved this problem by writing a program that generates provably secure, memorable passwords for users.

        I, myself, developed a secret technology that would generate passwords that are mathematically proven to be very difficult to "crack" yet are exceptionally easy for the user to remember.

        I based the approach on the proven techniques of phrenology [wikipedia.org], which allows me to map out certain neural pathways for each user. Based on this, I use an obscure corollary to the Prime Number Theorem to generate a secure

    • Looks like someone needs a Pronounceable Password generator [movetoiceland.com].

      Feel free to use it wherever and whenever you want. I've also ported it to Javascript [movetoiceland.com]

    • System Admins and Managers can force unique passwords, keep a long password history, and check desks, but then the burden falls more heavily on their help desk system.

      My response to something like that would be to forget my password every few days. You see, I have about 15 passwords for various things, and I already use a spreadsheet to track them. Oh well.

  • No. (Score:4, Informative)

    by sporty ( 27564 ) on Monday November 15, 2004 @12:32PM (#10820794) Homepage
    I can make a horrible to use app that is insecure, and with a bit of effort, make a system that is secure, but easy to use.

    Take PGP and email. There are TONS of plugins for various email clients to support signing and encrypting email. Yes, encryption can be broken someday, it's true, but if someone made a plugin that bumped it to 16k keys, it's easy and fairly secure. If people are further educated and required not to share their password and private key, it's quite possible.

    If you make a system that requires dozens of passwords to do things, duh, people will reuse their passwords or make them simple, or worse yet, put them on their monitors.
    • Re:No. (Score:2, Insightful)

      IMO, usability is part of security, since both come from the same "trunk": design. You could never build a good, non-bloated, usable application without good design, which includes planning well for security as well...
      • Re:No. (Score:4, Insightful)

        by sporty ( 27564 ) on Monday November 15, 2004 @12:52PM (#10821012) Homepage
        Of course you can. Security only means that you are who you say you are and that you can do what you are allowed to do. The simplest secure app I can think of, actually, is ssh. Back it up with something that checks the difficulty of passwords, and you have something that allows access to a foreign system easily. The ease of use of the rest of the system on the other side is totally separated from the security.
    • Re:No. (Score:2, Insightful)

      by Nosf3ratu ( 702029 )
      Firefox is more inherently secure than IE. Firefox is easier to use than IE. Tabs are easier to manage than multiple windows. Not having a "SHOOT THE MONKEY LOL" flash ad pop up when I'm trying to read the news -- or a highly sexually suggestive ad for "HOT GIRLS ON UR DESKTOP", for that matter -- makes using the internet easier. Letting me know that a popup has been blocked is nice. Being able to just hit F3 to "Find next" instead of keeping a floating Find dialogue GUI covering up the text I'm actually looki
    • During my sophomore year, my prof asked us where most people kept their passwords. A couple people guessed things like "in their brain," trying to sound smart.

      I raised my hand and said "Under their keyboards."

      • So, they *tried* to sound smart, and you....? Did you feel anything hit you in the back of the head after the prof said you had the right answer?

    • Take PGP and email. There are TONS of plugins for various email clients to support signing and encrypting email.

      A great example. Let's take PGP. It's been available for over a decade. It has been fully integrated (directly or via plugins or scripts) for years. Yet how much do you see it? And of those you do see using it, who are they? In my experience, it is a small subset of technically knowledgeable individuals or small groups that require encryption and have been told to use PGP. PGP is not us

      • ...PGP is not used by the masses....

        The reason most people don't use encryption is because they feel it is not needed for most of their mundane communications.

        Whether in computing or in the physical world, added security is increased work which will only get done when there is a perceived need for it. I remember when I was a youth, we and most of our neighbors NEVER locked the doors of our houses where we lived in Palo Alto, CA. It was not needed and nothing was ever stolen. Today that is not possible, because pe
  • Feature Creep (Score:5, Interesting)

    by cgenman ( 325138 ) on Monday November 15, 2004 @12:41PM (#10820904) Homepage
    One of the things that has killed both usability and security of modern computers is feature creep. The ability to run Visual Basic scripts as part of your file browser. Javascript interpretations of file names.

    Most people forget that computers should only have one button. It should be marked "do exactly what the user wants me to do," and it should do exactly that. Unfortunately, many systems are not designed from the viewpoint of a new user, but rather the professional user who created the system. There are five or six areas where a command can be found in the Windows Explorer interface, and a given command can be in one, two, or all of them. Very occasionally, a command will only be available in the help file. sKill is far more usable than Kill -3.14159265, yet is no less secure. If end-users couldn't see what they couldn't access, they would have a much less cluttered interface and less obvious routes of attack.

    • Unfortunately, many systems are not designed from the viewpoint of a new user, but rather the professional user who created the system.

      You could say the same thing the other way round. Most systems these days are designed for the new user without giving much thought to people who use the system and know what they are doing. One extreme example is the whole Windows configuration infrastructure with the options dialogs and the registry. It is a relatively easy way to change settings the first time but after you

    • Yet another example of the advantages of Unix. By making tiny programs that do one thing well, you're presenting the user with fewer choices in each program and giving them more flexibility while maintaining security.
  • If you're going to say "This question has been around since the beginning of computing, and can even be said to date back to biblical times.", perhaps the footnote could link to the Bible text in question, or even an explanation? Konstantin, since you're reading -- what are you talking about? "Shibboleth"? The rock over Laban's well?

    As far as the main point, I'm not sure how newsworthy it is. But it's certainly news to the admins here, who are convinced that more, longer, more complex, more frequently rotat
  • "Instinct tells us that computer security and computer usability are inversely proportional to each other."

    I don't think this is particularly true. In all walks of life, if something is more usable, then it tends to be more secure, if only because if it is easier to lock something then people are more likely to lock it.

    If it is easy to use the security features on a computer, people will. A lot of home routers tend to be left in an insecure state simply because securing them is too complicated and it is the type of task that can only be done if you already know how to do it.

    I would be willing to bet that if you did a survey of the broadband routers installed by 'normal' home users, the ones with the most usable firmware would also tend to be the ones that have been secured the most.
    • I agree with you about the broadband firmware, but you would probably also find that the most "secure" routers are also the ones behind which it is the greatest hassle to play games, use p2p apps and various other direct-connect items; therefore their usability to the average user is less.
    • I think what we really perceive is that "security" and "ignorant/inattentive accessibility" are inversely proportional. Meaning, how secure your computer is is inversely proportional to how easily you can access it without having any idea of what you're doing. If things are secure, you need to know how to operate things, and have passwords memorized, and you generally need to pay attention to what you're doing. Plus, in general, inaccessible means more secure (all else being equal). But once you have acce
      • Oh, and another thing to throw into the mix: anonymity. The relation between freedom/usability/accessibility/security and anonymity is interesting. Like, we might be able to increase security if we all had RFID tags implanted under the skin, and that might be very usable and accessible without impinging directly on freedom, but it keeps you from being anonymous, which might indirectly impinge on certain sorts of freedom.

        I realize this wasn't meant to be about politics, but the topics are connected. When you

        • Paypal's CEO, Peter Thiel, once said "There's a trade-off between privacy, security, and convenience, you can have any two at 100 percent, but the third will be almost nonexistent." Convenience is closely related to usability.

          This is obviously a simplification, but there's a lot of truth to it. For example, any form of authentication is going to degrade privacy at some level.
    • It certainly doesn't (Score:5, Interesting)

      by Anonymous Brave Guy ( 457657 ) on Monday November 15, 2004 @03:12PM (#10822472)

      I couldn't agree more. In fact, I'd go as far as to say that usability is a necessary minimum requirement for security. After all, a very large proportion of attacks succeed because of a simple human failure, not an electronic one.

      For example, if banks would stop constantly requiring me to remember seventeen different ID numbers, "memorable" words and phrases, I might notice the e-mail they send out reminding me not to give out my PIN number to anyone else.

      On a more techie level, languages where it's easy to code properly make careless errors like allowing buffer over-runs or SQL injection less likely.

      At the heart of good usability are principles like KISS and not giving the user unnecessary chances to go wrong. These don't exclude giving the user power, but what better partner for keeping a user safe than not giving them silly chances to do dangerous things?

      • For example, if banks would stop constantly requiring me to remember seventeen different ID numbers,...
        That is one of the worst examples concerning real-life usability I know. Why can't they keep their databases' primary keys to themselves and just identify me by my name, address and birth date? This goes not only for banks but also for the government and most companies, which instantly assign you a customer ID.
    • His "instinct" seems to have been trained by the security blackholes that Microsoft produced, joined by being them the only applications that he runs because the others are "not intuitive". I would change "instinct" with "according with my experience" so let other people think in its own "intuitive" way. My experience with OS/2, BeOS, Mac OS X, and Linux is that they are pretty secure, and have environment/applications pretty intuitive, so if i never had used any microsoft OS my instinct would say me that s

    • I don't think this is particularly true. In all walks of life, if something is more usable, then it tends to be more secure, if only because if it is easier to lock something then people are more likely to lock it.

      Ever fumble with keys while trying to get groceries in through the front door? How about trying to get in your car during a rainstorm? Ever realize that you've forgotten or lost those keys? These are examples where increased security decreases usability.

      That doesn't mean a secured syst

      • ...ultimately a far different beast than information security...

        The inverse relationship between security and convenience applies to all areas of life, including computers. The goal is to make the security sufficient for the need with minimal increase of work, but there will always be SOME extra work for more security. There is no such thing as a free lunch in security. Someone has to do extra work and it usually is the person needing access to the computing resources. The more valuable the thing being pro
  • by xutopia ( 469129 ) on Monday November 15, 2004 @01:06PM (#10821155) Homepage
    People's idea of usability is usually that programs work the way they are meant to without asking for too much help to do their job. For example, a usability feature of Internet Explorer was to automatically execute .doc file viewers when you downloaded them. The action of executing automatically is wonderful and for many is seen as a great usability enhancement. But what happens when the .doc file can be programmed to cause all kinds of problems on your computer? What if that automatically executed script within causes havoc with other seemingly unrelated things? Then what is the overall usability benefit there? Negative, if you ask many people.

    The hassle of viruses, worms and other crap that appears on people's machines causes many usability problems in my book. The more maintenance you need to do on a machine, the less usable it is. A Windows machine needs plenty of work to keep up with updates, spyware, adware and viruses. On the other hand, an OS which doesn't execute things automatically when you visit a web site doesn't require as much maintenance.

    I always use the analogy of cars. Cars have locks on their doors, then you have to use your key to turn the motor on. Now imagine cars without locks on their doors. One less hassle in the way of doing what you want, right? How about no keys to turn on the car? It automatically turns on when you put your seat belt on. Wow! What an amazing car!! Guess what, though? That type of car wouldn't stay in the driveway for very long. Well, a Windows computer is that type of usable car that doesn't stay in your driveway for very long. Linux might ask you to put a key in the door and turn the engine on with that same key, but at least it's still in the driveway when you need it.

    • That's why I take public transit. Press one button (put a coin into the slot) and let somebody else do all the work for me!
    • The car metaphor should be banned!
      It's like you are lying on the ground in front of my car comparing something in computing to cars and I drive over you.
    • People's idea of usability is usually that programs work the way they are meant to without asking for too much help to do their job. For example, a usability feature of Internet Explorer was to automatically execute .doc file viewers when you downloaded them. The action of executing automatically is wonderful and for many is seen as a great usability enhancement. But what happens when the .doc file can be programmed to cause all kinds of problems on your computer? What if that automatically executed script within c

    • ### The action of executing automatically is wonderful and for many is seen as a great usability enhancement. But what happens when the .doc file can be programmed to do all kinds of problems on your computer?

      Ahem, so what? When you remove the automatic loading, people will click the link, press the "Open" button and then get the possibly evil .doc loaded; usability is reduced, and the win in security is extremely close to zero. Fixing the doc viewer would be the right thing to do, not making viewing docs more comp
    • Unless I have items of value in my car, I leave it unlocked with a small (not especially visible) key in the ignition. It hasn't been stolen yet, but that doesn't prevent anyone from taking advantage of the situation.

      Folks, this is security by obscurity at its finest. If cars came by default with this behaviour (analogous to Windows), I'd take more care in going through the hassle of securing my car. That is to say, if automotive hooligans could rely on a significant population of cars to be that easy to acce
  • by digitect ( 217483 ) <digitectNO@SPAMdancingpaper.com> on Monday November 15, 2004 @01:26PM (#10821377)

    Architecturally, it is generally accepted that the security of a building is opposed to its accessibility. Take for example a grocery store. The ease with which customers can get in and out is directly related to how easy it is for the place to be robbed. Movie theater design is similar.

    However, usability overcomes some of these problems by making entrances obvious, door opening automatic, lighting bright, etc. I believe a computer interface should be the same. Just because I have to remember a password doesn't mean that entering it needs to be difficult. Perhaps having many passwords presents a different problem, but one of the supposed ideals behind biometric data is that it can be greatly complex and yet still readily available. But does that mean it's less secure?

    • Perhaps having many passwords presents a different problem, but one of the supposed ideals behind biometric data is that it can be greatly complex and yet still readily available. But does that mean it's less secure?


      Definitively: yes.

      Don't base your security on something you cannot change easily.

      If your password is compromised, it's a no-brainer to change it. Your biometric data may be harder to compromise, but if it is, how do you change it? Surgery?
      • If your password is compromised, it's a no-brainer to change it. Your biometric data may be harder to compromise, but if it is, how do you change it? Surgery?

        This comes back to the canonical "something you know, something you have, something you are" model. Good security should involve at least two and preferably three of the above:

        Something you know: a PIN or password;

        Something you have: a card, a key, an RFID tag, etc.; and

        Something you are: a biometric--iris scan, facial recognition, fingerprint, a
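
        As a purely illustrative pairing of two of those factors, here is a minimal RFC 6238-style TOTP check: the shared secret stands in for "something you have" (a token or phone) and the password result for "something you know". The enrolment step and the secret value below are hypothetical.

        ```python
        import hashlib, hmac, struct, time

        def totp(shared_secret: bytes, digits: int = 6, period: int = 30) -> str:
            """RFC 6238 time-based one-time password; the device holding
            shared_secret is the 'something you have'."""
            counter = int(time.time() // period)
            digest = hmac.new(shared_secret, struct.pack(">Q", counter), hashlib.sha1).digest()
            offset = digest[-1] & 0x0F
            code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
            return str(code % 10 ** digits).zfill(digits)

        def two_factor_ok(password_ok: bool, submitted_code: str, secret: bytes) -> bool:
            """Both factors must pass: the password ('something you know') and a
            fresh one-time code from the token ('something you have')."""
            return password_ok and hmac.compare_digest(submitted_code, totp(secret))

        # Hypothetical enrolment value; in practice the secret comes from a provisioning step.
        secret = b"12345678901234567890"
        print(totp(secret))   # what the user's token or phone app would display right now
        ```
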

    • Architecturally, it is generally accepted that the security of a building is opposed to its accessibility. [...] However, usability overcomes some of these problems by making entrances obvious, door opening automatic, lighting bright, etc.

      Even accessibility is not always opposed to security. If you want to rob a store, you have a few requirements that are different from those of the "shopper" user: ideally, you want to enter, take what you want, and leave quickly with as little resistance and recogniti
  • Article summary (Score:5, Informative)

    by daveschroeder ( 516195 ) * on Monday November 15, 2004 @02:55PM (#10822289)
    Q. Are Usability & Security Opposites in Computer Systems?

    A. Yes, for instances where security measures do decrease usability. No, for instances where they don't.

    A2. Yes, for instances when software makers don't care about security, nor about integrating it properly. No, for instances where they show they care about security and want to do it properly.

    Come on, seriously. Sometimes, various measures for security make things "harder" to use. But there are so many things which define "security". Authentication, authorization, encryption, access, and each at several different levels.

    The ultimate answer is, yes, security and usability are opposites when the responsibility for the security measures rests entirely upon the end user. Simple example: Make a user have a password, and they'll make it their dog's name (not secure). Force it to be too complex, and they'll forget it (not usable). Mandate that it be changed every week AND be too complex, and they'll write it down (not secure or usable).

    When the security measures are administered by a skilled external entity (such as a knowledgeable and sensible IT staff) or integrated seamlessly into applications and operating systems (by knowledgeable and sensible software makers), they can be "usable". In fact, "usable" is the wrong word: it should be "transparent".

    There are ways to make good security - whether it's for an entire organization or a single workstation - usable, and non-intrusive. It just takes someone with the skill, knowledge, and foresight to do it.
  • Hmm (Score:4, Insightful)

    by Anonymous Coward on Monday November 15, 2004 @02:56PM (#10822301)
    Usability, security and cheapness. You can have any two
  • by grub ( 11606 )

    Can the two continue growing together?

    I've used OpenBSD on my desktop for ages. Pick a nice WM and you're set.

    Security does not preclude usability.
  • No. (Score:2, Insightful)

    by zerguy ( 831069 )
    Your computer isn't very usable if it gets polluted by viruses :)

    Seriously though, there is an inconvenience, but that's all. I have to configure my router to let BitTorrent through, but the fact that I have to do this gives me an immense boost to my computer's security, by virtue of the fact that nothing is sent to my comp's ports unless I tell the router to let it through.
    • by Zarf ( 5735 )
      Your computer isn't very usable if it gets polluted by viruses :)

      The viruses are users too. Meaning that a perfectly insecure system is very easy to use. Easy for the attacker to use. We didn't specify who was supposed to use the system did we?

      The point of security is to make things hard to use. Hard to use for specific users. In this case we want a system that is easy for humans to use but hard for viruses to use. So we want to make things easy for "Good Users" and hard for "Bad Users" ...

      In you
  • by Zarf ( 5735 ) on Monday November 15, 2004 @03:06PM (#10822405) Journal
    I called it the Security to Convenience scale, where 10 is perfectly secure and 1 is perfectly easy to use. However, in this notion security features can be seen as usability bugs.

    I've already discussed this humorously here [slashdot.org]. The point being that if you really want to you can see things like BSODs as security features. Difficulty in configuration can be seen as a usability feature because it prevents security.

    If you squint hard enough, all bugs are features and all features are bugs. This viewpoint is utterly useless in the real world, however strangely orthogonal it may be. It still bears thought for the system designer to consider that his perfectly secure system may render the system so close to useless as to make it practically so... and thus cost him his job either directly or indirectly.
  • by Weaselmancer ( 533834 ) on Monday November 15, 2004 @03:08PM (#10822432)

    Are Usability & Security Opposites in Computing?

    I propose the following experiment. Yes, yes I know there are service packs and patches available, that's why I'm calling this an experiment.

    Take a Windows XP CD and load it onto a system you're not using for anything important at the moment. Do not connect it to a network in any way, shape, or form. Load the PC up with applications. Roughly judge load times and mouse and keyboard response times... mess around with it a while and see how responsive it is. Not too bad, right? Fairly usable.

    Now, plug your netcard directly into your net. No firewall. I suggest plugging the box directly into a cablemodem. Wait 24 hours.

    Notice any difference? This is exactly why Usability and Security are NOT opposites. Any box that's running 99% cpu with malware and viruses is damn near unusable.

  • by jacksonps4 ( 746646 ) on Monday November 15, 2004 @03:13PM (#10822481)
    There is often a trade-off between security and convenience rather than usability. It is necessary to strike the right balance between the two. There is little point in adding layer upon layer of security for something which is not worth protecting. Equally, a little inconvenience can be justified for the protection of something valuable.
  • The phrase in development, 'cheaper, faster, better: pick two,' can be modified for the topic at hand: 'secure, usable, cheaper, faster: pick two'.
  • by OmniVector ( 569062 ) <see my homepage> on Monday November 15, 2004 @03:16PM (#10822517) Homepage
    you're confusing usability in this case with convenience. there's a distinct, yet important difference. usability means something is easy to do and use -- for example it's easy to install an app in mac os x. you just drag a .app file to the applications folder. this is far more usable than a windows wizard installer (less complex, fewer steps, less reading, less chance for error, etc). however, let's say i set my account up to be a "Standard" user in mac os x. now when i perform this operation, i get an authentication dialog that asks me for an administrator username and password. this is an inconvenience. the usability has not suffered, but a dialog to keep security intact has been added. they do not conflict directly.

    increased security only has the effect of reducing convenience. i could make myself an administrator and never get a password dialog. this wouldn't have any effect on the original usability of the system. likewise, i could encrypt all my ram and swap space. this would increase security, but have no real effect on usability. security is implemented with policies, and as long as those policies are reasonable (i.e. require a methodology that isn't directly in conflict with a program trying to do its job) then the only thing it will do, when designed properly, is require the user to enter additional passwords. a poorly designed system (windows) doesn't implement this policy well. doing operations like copying files to admin-writable-only folders in windows is an example of a poorly implemented policy. in mac os x, i'd get an authentication dialog. in windows, i simply would get an error, with no added dialog to request a username and password.
  • Like burglar alarms in a building.

    The use of alarms has a definite impact on ease of use. In our building, the number of people coming and going at different times makes it impossible to secure the entire building. There are alarms but they don't get used.

    So here, usability and security are in conflict, and usability wins.

    However, we've created a secure zone which has our real offices (as compared to the large insecure garage space which is basically a place for parties and such), and here we have very
  • There are brilliant designs that are both simple to use and secure (and usually simple to build into the bargain). The problem is that there are not so many
    brilliant designers out there. Coming up with these designs often involves novel functional decompositions, new UI metaphors, unusually structured interfaces or something else that is hard to get to by "normal" design processes.
  • No. (Score:3, Funny)

    by Raven42rac ( 448205 ) on Monday November 15, 2004 @03:20PM (#10822561)
    We mac users have the best of both worlds.
  • Synergy (Score:4, Funny)

    by Junior J. Junior III ( 192702 ) on Monday November 15, 2004 @03:21PM (#10822567) Homepage
    Good security fosters good usability, and good usability fosters good security. When either is considered a-holistically, it results in a detrimental relationship to the other. We all need to learn how not to think like a-holes.
  • That really depends if you are talking about industrial computers, or your grandma's computer.

    In the case of your grandma, the computer should be secure enough to not be infected or hacked, and that's about it. There's no national security information on her hard drive, and no one will be particularly interested in stealing her grandson's birthday pictures. Too much security at the user level will get in her way. Security below the user level is just what she needs.

    On the other end of the spectrum you'
  • Symantec says "Yes!" (Score:3, Interesting)

    by asdfasdfasdfasdf ( 211581 ) on Monday November 15, 2004 @03:23PM (#10822601)
    I installed Norton Internet Security a few weeks back, and by default it kills all connections to shared resources... I've got a Linux computer that's basically just a Samba drone, and for whatever reason, Norton keeps blocking access... Eventually, I had to turn all share blocking off to keep it from happening intermittently. There's no user-friendly way of telling it during install or configuration, "hey idiot, I'm connected to several drives/printers for sharing, open up those ports." It doesn't even bother to ask, it just shuts 'em down... and did it again after a LiveUpdate.

    On my XP box, I'm paranoid enough about trojans and ActiveX lunacy that I like to monitor in realtime what is asking for net access and block it accordingly, but at the price of these annoyances, I almost uninstalled it.
    • by gcaseye6677 ( 694805 ) on Monday November 15, 2004 @03:37PM (#10822731)
      Norton products are perfect examples of security made so cumbersome as to be useless. Every machine I've ever used with Norton Internet Security has some major function, such as network connectivity, disabled until Norton is shut down. After enough tinkering, you can get Norton to work and still allow yourself to use the internet, or print, or whatever. As soon as you change anything, it's time to reconfigure Norton. Then there are the incessant nagging popup reminders and alerts. I'll take viruses and spyware over Norton any day. I just wonder how much longer this company will be able to continue living off its reputation, since that is the only way it can get people to buy its overpriced bloatware.
  • It replaced telnet/rsh and wrapped your X session so you no longer had to deal with xhost this and xhost that.

    It also let X pierce firewalls.
  • direct relation... (Score:2, Interesting)

    by ambienceman ( 721763 )
    I think the Macintosh OS demonstrates the direct relationship of the two pretty well, even though other companies may not.
  • I'm glad the OP thinks his papers deserve a world-wide audience. However, I would argue that it is generally considered bad form to tout one's intellectual accomplishments so nakedly. That's two in one day for this guy.
  • by Jerf ( 17166 ) on Monday November 15, 2004 @03:26PM (#10822632) Journal
    Usability and security are opposing forces, if and only if the program has optimal usability and security. To make such a program more usable, by definition it requires removing a feature, or compromising security to make it easier. To make such a program more secure, it requires either removing a feature or adversely affecting usability by adding another hoop to jump through.

    Note they aren't strictly speaking opposing forces, since "remove features" can both enhance security and usability. It's just that if your program is already optimal and you need to push it harder, something else has to give.

    You don't have to be a cynic to observe few programs are optimal, and therefore most software engineers don't have to think in this way. Thus, as a practical matter in the current environment, no, they are not opposed. But they should be.

    (As a PS, I'd define security as "Ensuring the computer does what the owner wants, no more, and no less, with the computer owner having all relevant information about and control over what the computer does." But that definition has yet another ideological focus, no?)
  • From my experience it is a lot easier to make a good interface on top of a good security model than to make a good security model on top of a good interface.

    If you build the interface around the security, you usually end up with a far more usable program. For example, say someone doesn't have access to a field in the program. Knowing this, I can make the interface hide or not even load the button and make all the other objects fit without looking like the feature is there for people w/o access. Conversely, if
  • The two opposites (Score:3, Insightful)

    by erroneus ( 253617 ) on Monday November 15, 2004 @03:30PM (#10822666) Homepage
    The two opposites are "Complexity vs. Security." Those two exist as opposites only through casual analysis and not as a hard rule. The root of the problem being bad programming. (No finger-pointing needed... the culprit might be a lazy programmer or a demanding boss who cares more about the deadline than quality tested code.)

    The fact seems to be that the more complex something becomes, the easier it is to break. So in reality, we should expect to see security improvements with decreased complexity in the U.I. As for other methods of hacking software (such as non-UI doors like APIs and network related exploits) the same rules might apply where keeping the complexity to a minimum might easily lead into less opportunity for exploits and thereby improving security.

    Frankly, from where I sit (a non-developer with a basic understanding of programming concepts) I think security issues arise from really bad programming habits and it's a damned shame that it's just not taught in school... for example, getting graded on your code by avoiding exploitable coding practices and such. As it is, security-minded coding is something that is gained through experience... usually a bad experience.
  • Simplicity (Score:4, Insightful)

    by uid100 ( 540265 ) on Monday November 15, 2004 @03:31PM (#10822683)
    I've said it before and I'll say it again.

    "Simplicity is the key to security and usability"

    Problems arise in both areas when you try cramming in features at the last minute. Scope/feature creep is what makes systems (almost anything) insecure/unreliable and ultimately unusable.
  • by Illissius ( 694708 ) on Monday November 15, 2004 @03:36PM (#10822725)
    I think I'll quote a post I made at dot.kde.org just a few hours ago, as it seems relevant:

    In my opinion, the default level of security should be (the goal of) immunity to remote attacks. Whatever sacrifices are necessary to achieve that should be made, and if additional security can be obtained at no cost, then there's no reason not to have it, but additional sacrifices shouldn't be made beyond that point. If someone gains physical access to the machine, well... if someone gains physical access to their TV, they can just walk off with it, and you don't see people chaining TVs to their walls to avoid that scenario. So really, it's a nonissue for 99%+ of the userbase -- and the rest can take further measures themselves (encrypting the entire drive, whatnot). So what I'd like to see is that users have to enter a password when logging in, and never again after that, unless they specifically choose to. Autogenerating highly secure passwords seems like a good idea -- perhaps Konqueror could try and detect registration forms, and fill in the password field(s) in advance? Either way, the method to do so should be in plain sight and require minimal effort. Another idea, in order to get rid of the login password hassle entirely and increase security in the process: autogenerate a hugely secure password, and then let the user put it on a USB thumb/pen/whatever drive, flash card, floppy disk, heck, CD, or whatever media they have, and then use it in the same way as a car key. Press the power switch to turn on the computer, and when they insert the 'key', automatically log in the user whose key it is. And when they remove the 'key', automatically log them out. That would be rather nice, don't you think? (There should also be a way to recover if the key is lost -- probably just forcing or strongly suggesting that the user make backups, but that's getting into details.) (And again, if someone manages to steal it, well, credit cards and car keys can be stolen as well. There's no need to be paranoid to such a degree.)
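
    Nothing beyond the comment above specifies how the "car key on a USB stick" would work, but stripped to its essentials it is roughly the sketch below. The paths are made-up placeholders, and a real implementation would live in a PAM module or the display manager rather than a script.

    ```python
    import hashlib, hmac, secrets
    from pathlib import Path

    KEYFILE = Path("/media/usbkey/.login-token")      # hypothetical mount point of the 'key'
    ENROLLED = Path("/etc/login-token.sha256")        # only the hash stays on the machine

    def enroll() -> None:
        """Autogenerate a long random secret, put it on the removable 'key',
        and keep only its hash on the computer."""
        token = secrets.token_bytes(32)
        KEYFILE.write_bytes(token)
        ENROLLED.write_text(hashlib.sha256(token).hexdigest())

    def key_present_and_valid() -> bool:
        """Log the user in when the inserted key carries the right secret."""
        if not KEYFILE.exists():
            return False
        digest = hashlib.sha256(KEYFILE.read_bytes()).hexdigest()
        return hmac.compare_digest(digest, ENROLLED.read_text().strip())
    ```
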
  • by myowntrueself ( 607117 ) on Monday November 15, 2004 @03:40PM (#10822762)
    Something I read somewhere;

    'Some people are of the mistaken impression that being secure is synonymous with being a big pain in the ass'

    It's so true...
  • While I agree with the author's premise, I do not think he used very good examples. The examples he uses do not do his argument justice.

    * In the Indian Assembly Election example he uses, the usability of the voting interface has little to do with the security of the machines. The voting interface can be very user friendly, but that says nothing about whether or not it is possible to hack into the machine through a network and change the results.

    * In his examples of firewalls and adware detectio
  • by gweihir ( 88907 ) on Monday November 15, 2004 @03:46PM (#10822814)
    With SSH I can have secure remote login without password. In addition I get nice things like port-forwarding and compressed connections.

    With Telnet I had less functionality, little security and had to either use my password each time or have even less security (rhosts).
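
    A small illustration of that convenience, using the third-party paramiko library (the comment itself names no tooling beyond SSH and Telnet; the host name and key path are placeholders): key-based login plus a compressed connection, with no password typed.

    ```python
    import paramiko

    client = paramiko.SSHClient()
    client.load_system_host_keys()                        # reuse ~/.ssh/known_hosts
    client.set_missing_host_key_policy(paramiko.RejectPolicy())
    client.connect("shell.example.org", username="alice",
                   key_filename="/home/alice/.ssh/id_rsa",  # key auth: no password prompt
                   compress=True)                           # compressed connection
    stdin, stdout, stderr = client.exec_command("uptime")
    print(stdout.read().decode())
    client.close()
    ```
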
  • I'd say not (Score:3, Informative)

    by RAMMS+EIN ( 578166 ) on Monday November 15, 2004 @03:49PM (#10822846) Homepage Journal
    I don't think they are exactly opposites. There are situations where they conflict; e.g. having to enter a password before you can use a service.

    Actually, security and usability often go hand in hand. I don't think email would be very usable if people constantly messed with your account. Another example is Windows vs. GNU or BSD: I think Windows has very low usability, due to the knowledge and action required to keep the system healthy. Part of this stems from the bad security of Windows. (puts on asbestos underwear)
  • by mark-t ( 151149 )
    The real trick to it is finding the often delicate balance between security and usability that actually meets the demands of the people the system is working for, as well as delivering the system on time and under budget. That's what makes security such a hard job. Usability and security are at polar opposites of the same scale, and everyone wants both. Fortunately, different environments place different demands on a system, so the balancing act is usually at least _possible_.
  • unusable software is inherently insecure.
  • Instinct tells us that computer security and computer usability are inversely proportional to each other.
    I think you have poor instincts. You are probably too used to Windows. Try a Mac.

  • Which might initially sound like it is totally unconnected, but please bear with me.

    I was actually talking with a friend about his new (to him) BMW K100 motorcycle compared to older British or Japanese bikes... with the older stuff, making them go very fast wasn't really a problem; the thing is, when you were going fast you always knew it... my mate's Beemer is different: he cruises along as smooth as anything, looks down at the speedo, and is shocked to discover he is doing 120 mph when it feels like 70 mph.
  • i.e. you can have usable and secure, if you pay out the wazoo for it. unfortunately, most people don't -- they opt for usable and cheap.

    draw a triangle. at one point write "usable", at the next point write "secure" and at the next point write "cheap". now pick one side of that triangle -- that's your system.

    can't have your cake and eat it too, apparently.

    I opt for usable and secure for corporate and government environments; secure and cheap for home and small business.
  • My short white paper: Some programs are both unusable and insecure, as noted elsewhere in this story discussion. And insecurity makes programs unusable, especially when it masquerades as false security. Insecure programs that appear easy to use are really unusable in the longer run. Usability is hard because it requires avoiding abusability.
  • It's a simple case of optimizing a few inequalities. The biggest problem is that you can't use SIMPLEX (Operational Research) because that requires linear equations, whereas here we have declining returns (non-linear equations).

    What you have are the following:

    • 0 <= Security <= maximumSecurity
    • 0 <= Usability <= maximumUsability
    • 0 <= Functionality <= maximumFunctionality
    • Security + Usability + Functionality <= maximumCombined
    • (Security * tolSecurity) >= minAcceptableSecurity
    • (Usab
  • by matman ( 71405 ) on Monday November 15, 2004 @06:13PM (#10824373)
    Security is about mitigating risks. Users cannot be asked to mitigate risks that they don't understand or believe in. Users must either a) choose to mitigate the risks or b) be forced to mitigate the risks.

    If a user places themselves at risk, they should have the option to have that risk mitigated. If mitigating the risk causes the user no pain (no extra user action) then automatically mitigating the risk is fine; otherwise, risk mitigation should be opt-in/out-able.

    If a system exposes some other entity which has control of the system to risk, that entity may require that if the system is used, the risks to that entity be mitigated. Thus users will be forced to accept the security measures. While some users will try to work around the measures, the measures are required. The measures should be made as easy as possible to accept, through education, reduction of overhead to the user, etc.

    This applies to all kinds of security, including law. Drug laws are a good example. "Society" feels at risk from drugs, imposes security measures against drugs, and some "users of society" work around those measures to do drugs anyway. Society tries to make the laws easier to obey through education (propaganda?), by limiting access to drugs, by making drug use riskier, etc. The people that have problems with these laws are those people which do not agree with the risk assessment by society (many) and those which do not care about society but do agree with the risk assessment (few).

    Computer security is the same. People have problems with measures when the measures pain them without convincing them of the worth of the cost. You can convince the user by:
    - Reducing the cost of the measure to the user (that's UI work).
    - Increasing the "return on investment" of the measure perceived by the user (that's education).

    So:
    - DON'T force security measures on users when the measures only protect the user and when the user doesn't want them.
    - DO make the purpose of measures clear.
    - DO make the measures as unobtrusive as possible.

    Now a lot of risks involving computers do impact more than just the user. Consider worms where local host security hurts your neighbors (as your machine attacks them). This complicates things.

    As a human being, you must decide whether you want to force measures on someone that they don't want, to protect only them. I don't like other people forcing decisions on me, so I would implore developers to make such measures optional (on by default if the cost is low and the benefit high). You must also decide whether you will force measures on users that don't want them, for the good of someone other than the user. As an application developer, you must consider that any measure that you force on a user, when they don't want the measure, will be seen by that user as a pain in the ass and will help support competing applications. Also, the implementation of measures will be criticized for usability just as any other part of your application is criticized. There's nothing special about security in terms of usability. UI components for features that users don't understand are distracting and confusing, and bad UI components for features that users do understand are just plain frustrating.
  • If you start with only usability in mind, and end up with a design that has inherent security flaws, it's easy to end up in a situation where the only way to improve security is to reduce usability. Internet Explorer is, of course, the poster boy for this problem.

    If you start with security in mind, and maintain both security and usability goals, you can end up with a much more secure design that, by the end of the day, is also more usable.

    For example, if you build a rendering component that doesn't contain a mechanism for breaking out of its sandbox, and then let specific applications add capabilities that objects they directly provide to the rendering engine can use, you can implement almost every piece of functionality that Microsoft designed ActiveX for without having an ever-tightening ring of increasingly annoying restrictions wrapped about the user.

    The only difference is that rather than having Internet Explorer at the core of the system, so that everything ends up looking like part of IE, you have a variety of applications with embedded HTML panes that provide the same functionality.

    What do you lose? The ability to have remote web pages embed trusted control inside their web pages... instead you need to explicitly install plugins or, for in-house tools, run an "intranet update" that downloads and updates the apps.

    This seems less convenient, until you realise the browser is more convenient in other ways because it's not trying to second-guess everything you do... and, once enough people are using it, the convenience of a more spam- and virus-free mailbox has to count for something.
