
Security Responsibility Without the Authority? 206

Slashdot reader jamie submits this story about security administration. If you have the responsibility for security without the authority to make changes, your only role is to be the fall guy when something goes wrong.
This discussion has been archived. No new comments can be posted.

  • On the other hand (Score:5, Insightful)

    by tverbeek ( 457094 ) on Sunday October 31, 2004 @02:22PM (#10680175) Homepage
    On the other hand, having the authority without the responsibility is a much larger disaster waiting to happen.
    • Disaster, yes. But only if you're the fall guy. If you have little responsibility, it's great. Does anyone like the new ad at the top? Go IIS!!!
    • Re:On the other hand (Score:5, Interesting)

      by aacool ( 700143 ) <moc.liamg2abmalnamaa> on Sunday October 31, 2004 @02:35PM (#10680245) Journal
      Good first post - for once

      From the article,

      Upper management often issues orders such as "Clean up the system at any cost!" Yet when these same managers get recommendations for pre-emptive security implementation, too often chief information security officers are told, "The budget for this quarter has been exceeded. Ask me again later in the year."

      Information security is a challenging and technologically rewarding profession. Unfortunately, those responsible for carrying out information security often are not given the authority and budget to get the work done.

      Here is the definition (PDF) of the Homeland Security Dept's responsibility charter, for want of a better word: http://www.gao.gov/new.items/d02627t.pdf [gao.gov]

      From another source, possibly not popular in these circles, comes a paper on "Security Considerations for Information Security": http://www.microsoft.com/technet/security/bestprac/bpent/sec2/seconaa.mspx [microsoft.com] An excerpt:

      Security is everybody's responsibility. The creation of a secure IT environment is not just the responsibility of your organization's IT staff. Everyone in the organization has the responsibility to respect and implement the corporate security policies.
      • Re:On the other hand (Score:5, Interesting)

        by Lumpy ( 12016 ) on Sunday October 31, 2004 @07:33PM (#10681852) Homepage
        I will share the last IT security administrator's tactics....

        He saw that he was being set up for the "fall guy" position... you know it when it happens: "you are responsible for all security", "oh, we have no money for your department, you cannot implement that security policy, no, not that either..."

        For his last year he recorded all conversations with superiors, printed out and kept (against company policy) all communications with superiors, and even kept recordings of voice mails on his company phone and personal cellphone.

        Well, it collapsed, we were rooted hard, and when they looked for the fall guy, he was ready and took 7 of the company's managers and executives with him, flaming, to the ground.

        BTW, his tactics earned him quite a bit in a court settlement with the company. Be sure to give all that information to your lawyers also... they love that kind of crap.

        basically, document everything, and under NO circumstances trust your bosses.

      • by tverbeek ( 457094 ) on Sunday October 31, 2004 @09:49PM (#10682405) Homepage
        "Security is everybody's responsibility."

        Never mind where this came from. Although it sounds good, it's the sort of platitude that can easily mean the opposite. That's because when you make everyone responsible for something, that means that no one is responsible for it. The buck doesn't stop anywhere, so when there's a lapse, the responsible party is arguably "everyone", and those who simply do not have the authority to take responsibility for security (which is most)... won't.

    • Re:On the other hand (Score:5, Interesting)

      by pbranes ( 565105 ) on Sunday October 31, 2004 @02:44PM (#10680283)
      I work in a higher-education environment doing server/desktop/network support. I am faced with the problem of working with systems that were set up improperly, without having direct authority over them, but with the responsibility of making sure the network doesn't collapse into a quivering heap.

      The way we have started facing this problem is by confronting the end user and the people who set up the misconfigured equipment, saying: "you must work with us in fixing this problem, or we will disconnect you from the network and you can find your own ISP". That pretty much gets their attention and allows us to set security policies, firewalls, system/application patches, and virus protection.

      Yeah, it's not the optimal solution. We really need a single person at the top who can enforce security policies across every section, but that is difficult in the open environment of higher ed.

      • by Spoing ( 152917 )
        I could work with you.

        Have you enforced network-level (router + firewall) segmentation yet? (Ex: Systems A & B and B & C can see each other, though not A & C.)
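
The pairwise scheme described above (A and B talk, B and C talk, A and C don't) amounts to a default-deny policy with an explicit allow-list. A minimal sketch in Python, with the segment names and allow-list purely illustrative rather than taken from any real configuration:

```python
# Default-deny segmentation policy: only explicitly allowed
# segment pairs may talk; everything else is dropped.
ALLOWED = {frozenset(("A", "B")), frozenset(("B", "C"))}

def may_talk(src: str, dst: str) -> bool:
    """Return True if traffic between two segments is permitted."""
    if src == dst:  # traffic within a segment is always allowed
        return True
    return frozenset((src, dst)) in ALLOWED

# A & B and B & C can see each other, though not A & C:
assert may_talk("A", "B") and may_talk("B", "C")
assert not may_talk("A", "C")
```

In practice the same allow-list would be expressed as firewall rules on the router joining the segments, with a default DROP between them.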

      • Re:On the other hand (Score:2, Informative)

        by zaffir ( 546764 )
        I work in the IT department of a small offshoot company of a larger corporation. For reasons that have never been explained to me, or anyone in our small company, all of our networking hardware is controlled by the IT department of our parent company. Due to some wonderful policy we aren't allowed access to any of our routers or switches. We're practically neutered when it comes to tracking down network issues.

        A while back we had a user bring in a sasser-infected machine from home and plug it into the netw
      • Re:On the other hand (Score:2, Interesting)

        by Kierthos ( 225954 )
        The university (right across the street from me) finally implemented a system last summer where, if students want to use the university's connection/bandwidth, they have to install certain software (AV stuff mostly) and adhere to the guidelines stated by the university. (Which mostly boils down to "No file sharing programs" and "No spam servers".) They also set the firewall settings on the students' computers and tell the students not to change them.

        They've had a bunch of students complain,
      • Dictatorship (Score:2, Insightful)

        by Anonymous Coward
        You try to place the blame on misconfigured systems. But when you demonstrably create an adversarial relationship with the users you're supposed to be supporting it proves you're part of the problem. Over and over, IT throws its weight around by not allowing anything useful. Anything IT doesn't understand is disallowed behind the "security" bogeyman and there's no effort to work with the users. When IT does get authority it's a power position, not a technical position. Automatic dictatorship.
        • Re:Dictatorship (Score:5, Insightful)

          by pbranes ( 565105 ) on Sunday October 31, 2004 @08:03PM (#10681966)
          Then, what do you propose we do? Go sweet talk the user and ask that they nicely reconfigure their system pretty please with a cherry on top? We aren't just cutting them off of the network - we are giving them a choice - either configure their system properly, or don't be on our network.

          In IT, more often than not, security has to come first and people's feelings come second - we are talking about personal information being passed around. How do you propose running a network where the emphasis is on sharing and being nice instead of enforcing strict security policies? Go to a warehouse - the physical security of that warehouse doesn't care if you are a nice person or not - they are going to enforce the security policies on you the same as everyone else. The same idea applies to data security.

        • Re:Dictatorship (Score:3, Interesting)

          by _Sprocket_ ( 42527 )
          The adversarial relationship is natural. IT tends to involve an inverse relationship between functionality and security. The easier something is to use, the less secure it is likely to be. And likewise, attempting to put in security restraints will tend to impact ease of use. This applies to people too.

          Users' primary interest is having widgets to do their work. Infosec's interest is about protecting existing widgets. The adversarial relationship tends to come in place when deploying new widgets, or m
    • well, duh! (Score:3, Interesting)

      by twitter ( 104583 )
      On the other hand, having the authority without the responsibility is a much larger disaster waiting to happen.

      That's what having a fall guy is all about. Someone has the authority to fix the problem, but no real clue or budget. Enter the fall guy. Upper management "concentrates on the company's core business" while the fall guy eats the blame.

      It's not something that can work forever. How many years can you go to the share holders with bloated IT budgets? Wall Street replaced their core infrastruct

    • by EmbeddedJanitor ( 597831 ) on Sunday October 31, 2004 @03:18PM (#10680446)
      The responsibility vs authority thing is exactly the same for IT as it is for just about any other activity involving many people.

      When I was in the army 20 years ago I had the "responsibility" to get a bunch of guys to move some furniture. Unfortunately I did not have authority over these troops since they belonged to another division.

    • Re:On the other hand (Score:4, Interesting)

      by yintercept ( 517362 ) on Sunday October 31, 2004 @03:23PM (#10680468) Homepage Journal

      Authority (who's the boss) is usually assigned for political reasons. Responsibility has more to do with ethics and capabilities.

      When the boss is incapable of doing a task, then clearly, some underling bears the responsibility when things go wrong.

      Conversely, people with a highly developed sense of ethics and professionalism step up to the plate, work to make the project succeed, and essentially take responsibility.

      Theoretically, it is possible to give authority to the people who take responsibility.

      On the other hand, having the authority without the responsibility is a much larger disaster waiting to happen.

      This might cause problems for a company...it usually doesn't tarnish the teflon coat of the people in charge. For that matter, when a company sees a manager with authority and no responsibility, they generally respond by expanding his authority.

  • This is by design (Score:5, Interesting)

    by Gothmolly ( 148874 ) on Sunday October 31, 2004 @02:24PM (#10680182)
    I work at a Large Bank, and more often than not, we'll implement an expensive, suboptimal product because a) Someone Else Did It or b) Gartner Said It Was Good. It's all about preconfiguring the blame so it is always someone else's fault - this way, if there's ever a problem and the Gubmint comes looking for tail, we can always point the finger. On a small scale, this reduces to individual admins being forced to do stupid things, because That's What The Project Requires.
    • This was the reason (Score:5, Interesting)

      by MacFury ( 659201 ) <me&johnkramlich,com> on Sunday October 31, 2004 @02:28PM (#10680210) Homepage
      This was the reason many of my clients opted not to go with Linux. One of the project managers told me, "it doesn't matter how long the system stays up, what matters is when it goes down, I can blame one entity."

      Doesn't matter that Red Hat and everyone else offer support.

      • by pmsr ( 560617 )
        Yeah, sure. The good ol' blame Bill Gates trick. I am sure it will help them a lot. What do they think EULAs are for? To improve their reading skills? Jeez, some people really do live in a bubble, eh.

        /Pedro

        • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday October 31, 2004 @02:39PM (#10680268)
          It isn't about getting anything out of Microsoft. It isn't about the EULA.

          It's about being able to say that it isn't YOUR fault. You did what EVERYONE ELSE was doing. Then you pull out the magazines and articles about how whatever just happened to you has been happening all over to other companies.

          In many companies, it is more important to not be blamed for a problem than it is to be the one who solved a problem.
          • In many companies, it is more important to not be blamed for a problem than it is to be the one who solved a problem.

            Fuck 'em. I want a company that's interested in getting the job done right, not playing stupid blame games when they screw up.

            • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday October 31, 2004 @02:54PM (#10680328)
              Fuck 'em. I want a company that's interested in getting the job done right, not playing stupid blame games when they screw up.
              In which case, you need a boss who understands the politics and is ACTIVELY working to counter them AND has the support of HIS boss.

              Politics happen in companies. Politics happen anytime you get 3 or more people working together.

              It all comes down to different people having different agendas working together in a company with limited resources.

              The sad thing is that once your technical skills are at the "minimally competent" level, you'd be better advised to learn corporate politics to further your career.

              A technical genius without political skills can be used and abused by a mediocre technologist with good political skills.
              • In which case, you need a boss who understands the politics and is ACTIVELY working to counter them AND has the support of HIS boss.

                I am familiar with the need for a champion (connected person pushing for your project), and the current place I'm at is so very bad at this stuff. I'm mostly venting.

                The sad thing is that once your technical skills are at the "minimally competent" level, you'd be better advised to learn corporate politics to further your career.

                Got any pointers? This technical genius w

                • Know your enemy. (Score:5, Interesting)

                  by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday October 31, 2004 @03:13PM (#10680415)
                  Got any pointers? This technical genius would like to further himself out of the cannon fodder box and into something more lucrative.
                  Start with "Death March". It's a good book on why projects fail and introduces the concepts of politics with agendas.

                  I'd also recommend "The Prince" by Machiavelli. Also, take a few MBA courses. It helps to know how they think and what their phrases actually mean.

                  But no book will ever be able to replace the insights gained from person-to-person interaction. You have to learn how to be "friends" with people who annoy you and how to manipulate them into supporting your agenda. That takes practice and you shouldn't practice it at work. They probably already know it better than you do and will be able to spot your amateur attempts. Instead, look at non-work groups. Your local church is a great place to start. They are usually packed with inter-personal relationships and petty politics. A friend once gave me this bit of insight: "The politics are so vicious because the stakes are so small".

                  Politics is about manipulating people to achieve your agenda. Before you become good at politics, you have to be comfortable with that.
                  • They are usually packed with inter-personal relationships and petty politics. A friend once gave me this bit of insight: "The politics are so vicious because the stakes are so small".

                    The first time I heard that, it was in reference to university machinations.

                • Another poster suggested some good books, but for a quick, easy-to-understand list of things you can do, try "The 48 Laws of Power". Not all of them will always apply, but it does list 48 good rules to follow, with examples from history of someone succeeding by following the rule, as well as someone failing by not following it (or using it at the wrong time/place).
                  Psychology in general is a pretty good field to study. Unfortunately, filtering the wheat from the chaff is difficult, so to be
              • Politics happen anytime you get 3 or more people working together.

                I've seen it happen with just two. If you have multiple personalities, or if you take on multiple roles you could manage with just one. In that case, politics are better known as headaches.
              • A technical genius without political skills can be used and abused by a mediocre technologist with good political skills.

                That's my career plan! :-) No, seriously... I see so many bright and capable people who can't play the politics game and get ground down because of it. I'm good at the technical aspects, but some of these people are so much better than I am. So I figure that in return for protecting them and getting them what they want and need, I'll get them to do great things for me.

                --Cam
            • by killjoe ( 766577 ) on Sunday October 31, 2004 @05:20PM (#10681077)
              You know, I started thinking about what you said and something occurred to me.

              If reading Slashdot is any indication, there are an awful lot of companies in the US making decisions based on really stupid and irrational criteria. I have heard many times "we didn't go with X because there was nobody to blame" and "we didn't go with Y because SCO might sue us" type of totally idiotic reasons. Why is that? Is Harvard Business School or a joe blow MBA mill really producing management that is unable to assess risk and intelligently apply reason to their decision making process? Think about it.

              I wonder if this is some sort of an American thing. Are people in Europe and Asia making decisions like this? If not, we are about to get our asses kicked awfully hard.
          • by ScrewMaster ( 602015 ) on Sunday October 31, 2004 @02:46PM (#10680292)
            Going back even further, remember the phrase "you can't get fired for buying IBM?" That pretty much epitomizes the pack-mentality approach to IT ... do whatever everyone else is doing and you, personally, have your ass covered. Doesn't matter if you've left your company wide-open for a security breach, or simply wasted the company's resources on an inadequate solution. Nowadays, of course, it's "you can't get fired for buying Microsoft" although there are an awful lot of people, from CEOs on down, that ought to have their asses in a sling for that reason alone. From my perspective, if a corporation deliberately stores my personal information using a server OS that is known to have more security holes than the Moon has craters, when that info is stolen the people that made that decision should be up on charges of negligence or worse.
            • civil legal matters (Score:2, Informative)

              by zogger ( 617870 )
              "From my perspective, if a corporation deliberately stores my personal information using a server OS that is known to have more security holes than the Moon has craters, when that info is stolen the people that made that decision should be up on charges of negligence or worse"

              If it is a company you do business with, send them a letter-snail mail, registered, notarized whatever, in advance to that effect. Not a threat, just a reminder that they have alternatives, and it's in their best interest business-
      • by ergo98 ( 9391 )
        Interesting.

        I'd say more often the exact opposite is true. People choose Linux because of the general perception that it is the more stable, more secure choice. After a rooting the security admin can proclaim "All the press and the community said it was the greatest thing since sliced bread...I don't know what went wrong!"

        Given all the bad publicity Microsoft has (deservedly) received, it is a huge risk for architects and security admins to choose Windows -- when things go wrong everyone can immediately c
      • "it doesn't matter how long the system stays up, what matters is when it goes down, I can blame one entity."

        But what exactly does that get you? If it goes down, do you plan on suing the vendor for damages despite the gibberish in the license? If the vendor is microsoft, do you expect to be successful in suing one of the world's richest companies? I don't think any software company has ever been successfully sued for damages before.

        I just don't get how being able to blame Microsoft is any different from

        • It's not about suing the vendor. It's about being able to say to your boss/the board/the shareholders that "hey, it's not MY fault - MS made a lousy product. Whaddya gonna do?" Then your boss/the board/the shareholders shrug their shoulders and say "yeah, they DID make a lousy product. I guess it's not your fault." Then they buy the next version of Exchange ("WHIZZBANG EDITION!!!") and the cycle continues.

          Since everyone knows that whole "do things the same and expect different results" def'n of insani


          • It's not about suing the vendor. It's about being able to say to your boss/the board/the shareholders that "hey, it's not MY fault - MS made a lousy product. Whaddya gonna do?"


            Ok, but how is that different from going to the board and saying "hey, it's not MY fault - Joe Blow wrote some lousy code."?

              • Twofold - first off, there's the whole change/admitting-a-mistake thing. They don't wanna do it. Being able to blame Joe Blow means they would have changed and thereby admitted a mistake in previously choosing Microsoft. Once a problem happens, they will likely have to admit they made another mistake when they trusted Joe Blow.

              Second, "everyone" uses Microsoft. That means when a problem happens, everyone gets to stand up and say with one voice "Microsoft screwed us over". If you read in trade mags

        • by mrchaotica ( 681592 ) on Sunday October 31, 2004 @08:11PM (#10682008)
          But what exactly does that get you?
          You misunderstood. "You" the company gets screwed, but "you" the manager or "you" the IT guy avoids getting fired. It's called putting your own best interests before those of the company.
          • It's about playing by the company's rules. They set the rules. If they wanted to succeed they would operate as a meritocracy and give the power to the most capable people. Instead, they just want to make some money and move on to the next corporation, which takes the fall instead of them, so they set it up so that the people who will support them are in positions of power, so they can do whatever they want and get away with it :P
      • Interesting how the whole thing revolves around placing blame instead of being blameless. Speaks volumes to me, anyway.
    • terminology (Score:4, Funny)

      by sanctimonius hypocrt ( 235536 ) on Sunday October 31, 2004 @02:36PM (#10680252) Homepage Journal

      It's all about preconfiguring the blame

      In the field of environmental compliance, the person 'in charge' is known as the 'designated inmate.'


  • by arcanumas ( 646807 ) on Sunday October 31, 2004 @02:26PM (#10680191) Homepage
    On the other hand this can be very good if you are *not* the guy with the responsibility. This means that when you fuck up there is a 'blame him' guy near by. :)
    • Especially if he doesn't speak English.

      Ahh, Tibor, how many times you've saved my butt.
    • by vwjeff ( 709903 ) on Sunday October 31, 2004 @04:16PM (#10680756)
      Sadly, I am the blame guy at my job, AKA the bitch.

      It goes like this at my job. I am "in charge" of network security and maintaining our Microsoft and Linux servers. You would think that my office would be located at the central office where all the servers are. This is not the case. Instead, my boss, the IT manager, is located at the central office. Whenever he thinks something is not working right, he makes changes to our production servers during business hours. My boss has no training in IT security. He's an MBA with limited knowledge of security who thinks he knows more than he does.

      Here's how most situations go. One person calls and complains that the finance database is slow or our inventory database is not working correctly. My boss then logs into the server and makes changes without documenting anything or telling me. You can imagine what happens next. Yeah, I get blamed for problems that occurred after he changed something. I then have to go back and try to trace what he did. I know I can't ask what changes he made, since that might seem like I am blaming him for the problem he created.

      After going through this scenario four times, I decided to remove his login from our production servers. Big mistake.

      I got a call from my boss two days later asking why he couldn't log in to our production servers. I had prepared ahead of time and had a story ready. I told him that I had noticed someone was logging in to our production servers and making changes during business hours, which is against our IT policy. I went on to say that the changes made during these logins were responsible for the problems. I then told him that for better security I should keep his account off the production servers so that the person who was making changes could no longer do so. He then said, "In the future could you please let me know when you make changes so we can be on the same page." I told him that I always documented the changes I made in the server logbook. I told him that I would reactivate his account with a different password. Since then he has not made any changes to the system.
      • It may also be worth noting that your boss going in and making undocumented changes may very well be illegal now, under Sarbanes-Oxley (assuming you're in the U.S.).
      • That's beautiful.

        I use a multi-pronged approach to keep the other admins under control:

        • sudo logs their actions
        • tripwire tells me what files they change
        • firewall prevents them from starting new services

        Overall, it works pretty well. (I think) I know about every change that happens to my systems. At least, strange stuff doesn't happen without an audit trail to figure out who was responsible.

        Disclaimer: if you're one of my cow-orkers, please assume this was written in regard to one of my other system
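
The tripwire item in the list above boils down to hashing monitored files against a stored baseline and flagging anything that differs. A toy sketch of that idea (the file name and contents are invented for the demo; this is nothing like a replacement for the real tool):

```python
import hashlib
import os
import tempfile
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 baseline hash for each monitored file."""
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def changed_files(baseline):
    """Re-hash every file in the baseline and return those that differ."""
    current = snapshot(baseline)
    return [p for p in baseline if current[p] != baseline[p]]

# Demo: baseline a file, tamper with it, and catch the change.
with tempfile.TemporaryDirectory() as d:
    conf = os.path.join(d, "sshd_config")
    Path(conf).write_text("PermitRootLogin no\n")
    baseline = snapshot([conf])
    Path(conf).write_text("PermitRootLogin yes\n")  # the undocumented change
    assert changed_files(baseline) == [conf]
```

The real tool also protects the baseline itself (signed, stored off-box), since an admin who can edit the files can usually edit a local baseline too.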

      • Your manager shouldn't have access to the servers in the first place. It is not his job to logon to systems and change stuff, he is a manager not a tech.
  • I think that would be time to start looking for another job... FAST!
    Absolutely no good can come out of this situation except as a blurb on your resume, i.e. "Was responsible for network security at a firm with more than 500 computers for the last 6 months."
    • I know so many folks who worked at big network companies like Lucent, Nortel, and Cisco. And from what they admit, there are an infinite number of security holes every day for every customer they provide service for. To get fired over a security hole, something catastrophic would have to take place!

  • False priorities (Score:5, Insightful)

    by FiReaNGeL ( 312636 ) <fireang3l.hotmail@com> on Sunday October 31, 2004 @02:26PM (#10680202) Homepage
    The phenomenon isn't specific to IT security admins; it's the (sad) consequence of corporations with 'false priorities' (the 'one hand doesn't know what the other is doing' thing). Management asks you to do something they don't have a clue about (in this case, improving security on a network). Then you ask for resources to do the job, and the finance guys refuse for budget (priority) reasons.

    Basically, you're stuck in a bad position: management yells at you if anything goes wrong, and finance is annoyed by your constant demands, which they see no 'use' for.

    Of course, not every business works this way. But it tends to when the company gets too large...
    • The phenomenon isn't specific to IT security admins

      That was my first thought when I read the post. It's a very old idea in politics (not "politics" like government, but as in the subject of study relating to social power): never separate power and responsibility.

      Whenever making someone responsible for some duty/task, always make sure you're also giving them the power to fulfill that responsibility. Otherwise, you're just setting them up to fail. Power without responsibility, on the other hand, is guara

  • Double-edged sword (Score:5, Insightful)

    by fembots ( 753724 ) on Sunday October 31, 2004 @02:29PM (#10680212) Homepage
    But what happens when one person can set rules and enforce them at the same time? That would be too much power.

    Usually in a company, the IT department takes care of the administration of IT-related stuff, and HR takes care of the rules/policies.

    If these two departments don't complement each other, that's the problem to be fixed, instead of mixing two different roles together.

    That's my personal experience anyway; I find it easier to tell users to take it to HR (or vice versa) than to have to deal with (punish) users or explain certain policies to them.
    • Explain to me how you could be the Security admin if you did not have authority to enforce the security policy? How would you get users to use 8 character alpha-numeric passwords if you didn't have the power to click that checkbox? How would you limit Internet traffic if you couldn't alter the firewall/proxy rules? How would you scan for viruses if you don't have the authority to scan incoming email? To actually do the job of Security Admin, you must have the authority to enforce all security related po
      • by trashcanman ( 30020 ) on Sunday October 31, 2004 @03:10PM (#10680400)
        I think perhaps you are missing the point that fembots was trying to make. Putting the authority to both make and enforce policy into one department invites corruption and uninformed policy making. I agree with fembots that the policy making group should be independent of the policy enforcement group in any large organization. That being said, I think it is imperative that the policy making group understand the implications of its policy. Thus, having some kind of IT expertise in the HR department (or at least in the IT policy making process) is required to make a policy that is informed and enforceable.

        So all of the actions you alluded to in your comment (password length, firewall rules, etc.) would be the job of IT (or IT Security) to enforce, whereas the writing of the IT policies would be the responsibility of the HR department (with participation of IT technical resources from within or outside the HR department). This is usually the way it works for physical security in most large organizations.
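
The "8 character alpha-numeric" rule mentioned upthread is the kind of policy HR would write and IT would enforce, usually via a checkbox in the directory service; the check itself is trivial. A sketch, where the exact rule is an assumption for illustration rather than anyone's actual policy:

```python
def meets_policy(password: str) -> bool:
    """Hypothetical rule: at least 8 characters, with both letters and digits."""
    return (len(password) >= 8
            and any(c.isalpha() for c in password)
            and any(c.isdigit() for c in password))

assert meets_policy("octob3r2004")      # long enough, letters and digits
assert not meets_policy("short1")       # too short
assert not meets_policy("lettersonly")  # no digit
```

The point of the surrounding discussion stands either way: writing the rule and having the authority to flip the enforcement switch are two different jobs.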

        • So all of the actions you alluded to in your comment (password length, firewall rules, etc.) would be the job of IT (or IT Security) to enforce, whereas the the writing of the IT policies would be the responsibility of the HR department (with participation of IT technical resources from within or outside the HR department). This is usually the way it works for physical security in most large organizations.

          As long as the enforcement department is only responsible for enforcing the policies as written, no p

    • I haven't been in the corporate world for a really long time, so forgive me if this is a stupid question.

      Why would HR be setting computer security policies? Is this common? Has HR become so powerful?
      • HR has a lot of security requirements that need to be covered. They need to make sure that the organization is HIPAA compliant, that the employee lists don't leak, etc. It would be surprising if they weren't involved in the group that sets security policies.
    • That does sound like a bad idea, but in some ways, it looks like IT security is such a non-concern that a security position isn't taken seriously. Why ask someone's opinion if you know you will reject it?

      It looked to me that it is like asking a janitor to sweep up before hours but not allowing him/her a way into the building.
  • by digitalsushi ( 137809 ) * <slashdot@digitalsushi.com> on Sunday October 31, 2004 @02:30PM (#10680220) Journal
    Anyone else want to share some of their favorite overused phrases with IT security?

    My favorite phrase is "... working hard to ensure this never happens again". We usually hear that within 4 hours of a customer calling and using the phrase "you people". "You people lost my database again!" "We can assure you we are working hard to ensure this never happens again". We've had a zero-dollar buildout and maintenance budget for 4 years. They actually get MORE surprised each time something breaks, because we're supposed to be getting better at using the tools we have.

    Ok here's a different question -- anyone ever had to use their own property to band-aid something within the company about ready to explode?
    • by Fulcrum of Evil ( 560260 ) on Sunday October 31, 2004 @02:48PM (#10680301)

      anyone ever had to use their own property to band-aid something within the company about ready to explode?

      Don't ever do that. If you do, then they think their current budget is fine, so they won't pony up the next time, and, should you ever leave, how are you ever going to retrieve your property?

    • Ok here's a different question -- anyone ever had to use their own property to band-aid something within the company about ready to explode?

      Yes, I've used stuff from my own junk box to keep stuff running at work. I've also made the occasional run to Radio Shack or the local electronics store for a part. That's what happens when you have a severely dysfunctional purchasing process.

      These days, I'd just say "fuck it". The organization treats you like a disposable part, why do them any favors?

    • If you think the only way to fix something is to use your own kit, you have a big problem.

      That's like working for free...and probably about as legal. You need to suck it up and tell the boss "we need this piece, and if we don't get it, Bad Things(tm & C ) will happen."

      And document it to within an inch of its life.

      That way, when the witch hunt starts, you can whip out those docs from your own personal Pearl Harbor file and show that you knew what you needed, and were told to sod off.

      Holloway's laws
    • In addition to the problems mentioned in the other reply, involving your own hardware can make you far more liable for a situation than if you simply 'followed procedure'.

      Whenever something goes wrong in a business environment, there is a fight over who gets the blame. Whenever something goes right, there is a fight over who gets the credit. The person actually responsible is rarely the victor, in either case.
  • CSO Magazine (Score:4, Interesting)

    by Anonymous Coward on Sunday October 31, 2004 @02:31PM (#10680227)
    CSO had an article about this a few months back, and talked about how many corporations have taken the teeth out of the CSO position.

    I've seen this first hand in our midwest US city, where the requirements for most security positions are an MCSE and a CISSP, with little to no interest in management and policy-level expertise. IT security has very quickly become a janitorial position. Senior management has punished IT for excessive spending by gutting it of senior-level representation (to the benefit of other empire-building projects, typically).

    Curiously enough, these companies are sitting ducks for your run-of-the-mill script kiddie. From putting unencrypted backup tapes on the top of file cabinets in highly trafficked hallways (at one database company that I've worked with) to believing a firewall and antivirus is perfect security (at several of the larger banks I've met with on security projects), they're complacent and believe IT security is just another IT "dot-com money wasting project." Better to spend the money in the profit centers and ignore defensive protections, as the lack of a serious attack means they'll never experience one. Little do they realize, the only reason they haven't been attacked is that there aren't enough hackers to take all the easy pickings.
  • by Anonymous Coward on Sunday October 31, 2004 @02:37PM (#10680259)
    As with any job where you might be in a delicate position or 'the target' should things go wrong that are beyond your control (whether due to lack of authority or lack of omniscience): Document, Document, Document. Do your due diligence; report any possible vulnerabilities, suspicions of attack, and recommended changes to your immediate boss, your IT/CIS team, and their managers. Be public, but don't be patronizing. This 'paper trail' will help you immensely should you be terminated over some security breach, if you can prove that, were your suggestions implemented, the breach could have been prevented. Security work is ridden with chance: if a previously undocumented flaw in the hardware or software is at the root of a breach, report that this is a new issue with that particular system and that a patch is available and has been (or should be, if you lack even the authority to patch) applied immediately, or that a patch is not yet available. I'm not a litigious person by nature, but I wouldn't hesitate to sue on the grounds of wrongful termination if I could present evidence that I had made those in power aware of the problem and had not received authorization to make the changes that would have prevented the breach.

    If you're the security guy, you Are the fall guy by default, but if you don't leave a document trail behind to show due diligence you will have no cushion for your fall.

    Follow the same basic guidelines that the medical profession uses: document anomalies, perform frequent monitoring, document changes. All of this will help greatly should you be in the unfortunate position of having to take legal action against a former employer.

    That this is necessary is sad, but it Is necessary.
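    The "paper trail" advice above is easy to automate. Here is a minimal sketch of an append-only audit log; the file name, record fields, and email addresses are made-up illustrations, not anything from the thread:

```python
import datetime
import json


def log_recommendation(path, summary, recipients, outcome):
    """Append one timestamped record to an append-only audit trail.

    Append mode ("a") never rewrites earlier records, which is what
    makes the file credible as a due-diligence trail later on.
    """
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "summary": summary,
        "recipients": recipients,
        "outcome": outcome,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record


# Example: record a patch recommendation that management declined.
entry = log_recommendation(
    "security-audit.jsonl",
    "Recommended applying vendor patch for mail gateway flaw",
    ["cio@example.com", "it-manager@example.com"],
    "rejected: budget exceeded this quarter",
)
```

    One line per JSON record keeps the trail trivially greppable when the witch hunt starts.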
  • This is somehow news? Companies do this all the time. For example, many go with closed source software instead of open source software so they have someone to blame/sue when something goes wrong.

    In this case the company is paying someone to take the fall when they have a security problem. If this person doesn't realize it, then they are clueless.

    Quite a few people in this position are probably content with it because they get paid to do nothing. The trade off for that is crappy job security.

    Those tha
    • Companies ... go with closed source software instead of open source software so they have someone to blame/sue when something goes wrong. ... the company is paying someone to take the fall when they have a security problem. If this person doesn't realize it, then they are clueless.

      No one takes blame when their software does not work or loses your data. The clues are:

      • Microsoft's massive cash pile.
      • Everyone else's bloated IT budget.
      • Articles about a complete lack of "cyber security" in a place that runs m
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Sunday October 31, 2004 @02:42PM (#10680278)
    Comment removed based on user account deletion
    • by Anonymous Coward
      In many US government organizations this has only resulted in a lot of paperwork rather than an increase in security. Having recently been involved in the DITSCAP certification of a new system, I think the biggest problem is personnel:

      1. The biggest problem is that the people doing the work don't know what they are doing. At my company, less than 10% of the people doing certification analysis have a technical background. On the project I was on, only myself and one other person (the rep from the SW developer)
    • Keep in mind that the DHS has repeatedly gotten "D" report card marks for security. The US government as a whole averages at "C" I think, and IIRC, only the NSF got a solid "A". There are rules, but if they aren't enforced or enforcement is hindered, the rules are worthless.

      This guy has a political problem and that's why he resigned. Everyone wants to make a big splash when they don't get along with their cohorts. Only the classy ones keep their mouths shut. This guy isn't one of those, apparently.

      I a
    • And your system is crap. I work in a mid-to-large-size government program that implements policies such as the ones you outline, and this is what happens:

      1. The process of getting applications approved is so slow and onerous that people just install the apps on local machines w/o the knowledge of IT. If they didn't, work would never get done.

      2. Their network is 'accredited'. So it's like everyone else's. Big whoop. They block outgoing ports, like ssh 22. That's just a pain in the ass. So I have to ru
  • by Anonymous Coward on Sunday October 31, 2004 @02:51PM (#10680316)
    ...your only role is to be the fall guy when something goes wrong.

    Any time security goes amok... look to management as the culprit. If anyone points fingers at anyone but management, they really don't know too much.

    Management has the political power, the money, and the fiduciary responsibility.

    And if they don't know the assessed level of their security and their security requirements, then they aren't doing their job.
  • From the article:
    I, for one, sincerely hope that the cyber-security position will be upgraded to assistant secretary.
    I, for one, welcome our new assistant secretary overlord.

  • by ShatteredDream ( 636520 ) on Sunday October 31, 2004 @02:59PM (#10680346) Homepage
    Keep track of all of the times that you couldn't do something important, especially things legally necessary, because the powers that be didn't want to let you take the risk or rock the boat. Then when the police come in to investigate, if the higher ups decide to make you take the fall, take them with you by dropping all of your documentation about their ordering you to not do your job, onto the cops' lap.

    There is nothing that police at all levels love more than taking down big rich guys.
  • by paranerd ( 672669 ) on Sunday October 31, 2004 @02:59PM (#10680347)
    I'm sorry. Where I work it's the other way around. Our security department has all of the authority and none of the responsibility.

    What the result is, anyone can guess: password rules so byzantine that no one can log onto production systems when sev1 issues occur, sysops waiting three days for product tapes to be logged in and mounted, security changes being made willy-nilly with no change-control management instituted, gateways which serve no data being loaded with full-blown virus-scanning software, bleeding-edge maintenance being forced onto hardware and users not ready for it because it included some security fix of doubtful worth, managers not knowing the IP addys of their own *&#@ servers.

    What else is the result: passwords being taped to the bottom of keyboards, users being covertly supplied administrator rights to databases and servers, sushi programs installed by everyone, hacks programmed into apps to slip data through firewalls, and entire job streams running under one userid.

    Pity the poor security admin.
  • by Skapare ( 16644 ) on Sunday October 31, 2004 @03:00PM (#10680351) Homepage

    I used to work at a major financial services company. This was just as commercialism was discovering the existence of the internet, so I was hired to design and deploy their high-speed redundant connectivity. One thing this company did right, I think, is that all of their security was focused through the VP of Auditing, who reported to the CFO. And the guy who had this position was smart enough to know he knew very little about security and had to learn. I actually got to teach him more about it. We formed a group of people (at my suggestion), including another network engineer, two accountants, and one of the staff lawyers, as the security committee. His original mandate was network security. But in our first group meeting I gave a presentation on one of my long-ago hacking efforts (back in the mainframe days) that successfully broke into a major insurance company's three mainframes. I explained to them how I did it using entirely social engineering. Of course I had knowledge of the system, but I didn't utilize any bugs in the system to get in. With this I was able to get the group to change the focus of security from one strictly focusing on computer technology, to one that would be applied to everything the company did. Software bugs and misconfigured servers are, of course, important, but people are the weakest link in security, and this is even more so the larger a corporation is. Every operation of a company must consider security across the board.

  • It all depends..... (Score:2, Informative)

    by Fantasio ( 800086 )
    ...on how much one can ask for being a scapegoat. Make me an offer I can't refuse and I'm your man ! (paid in advance, please...)
    • by mikewas ( 119762 ) <wascher.gmail@com> on Sunday October 31, 2004 @03:43PM (#10680578) Homepage
      A place I worked had a VP that, as far as we could tell, did nothing. Every project & department had to pay a part of his salary, though.

      We finally found out what his job was when government auditors showed up. He was the company scapegoat. He got 9 months off work -- with pay. Within a month of coming back they announced he was retiring -- golden parachute, full pension.

      I wanted that job!

  • by Anonymous Coward on Sunday October 31, 2004 @03:14PM (#10680419)
    If you're responsible, then you make the recommendations. If they aren't followed, you warn of the consequences. If the consequences result, your ass is covered. This is BASIC employee CYA.

    If you do your CYA bit well, your boss will follow with his CYA bit, and eventually someone will sign a check or the memos will stop with someone stupid enough to take the fall. Otherwise you don't want to be working there. Work's no fun if you can't do your job.

    If you don't like the CYA game, spend the time and effort you would put into implementing your recommendations into finding another job.

    Life's not that difficult!
  • Stupid IT Policies (Score:4, Insightful)

    by Stupid White Man ( 750118 ) on Sunday October 31, 2004 @03:38PM (#10680543) Homepage
    I have a client, however, whose IT security policy is so strict (14 characters: alpha, numeric, plus special) that each and every employee has taken to writing down their user/password on a Post-it note and taping it to their monitor or under their keyboard. Just walking through the office you can pick up at least 6 user/passwords. I've tried to argue with the head dick in charge, and all I get is BS. Why put together a security policy so strict that it keeps employees from doing their jobs, or forces them to write down their passwords out of ignorance? Nothing worse than that.
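    The back-of-envelope entropy math behind this complaint can be sketched as follows. The figures assume truly random selection, which is exactly what a policy this strict discourages; a password on a Post-it has zero effective entropy regardless:

```python
import math


def entropy_bits(alphabet_size, length):
    """Bits of entropy in a password chosen uniformly at random."""
    return length * math.log2(alphabet_size)


# 14 random characters over ~94 printable ASCII symbols:
print(entropy_bits(94, 14))    # ~91.8 bits -- but only if truly random

# Six words from a 7,776-word diceware-style list (memorable, no Post-it):
print(entropy_bits(7776, 6))   # ~77.5 bits
```

    The point of the comparison: a memorable passphrase gives up some theoretical bits but keeps the secret off the monitor, which is where the real entropy in this office went to die.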
  • by kafka47 ( 801886 ) on Sunday October 31, 2004 @03:39PM (#10680546) Homepage
    I've seen many definitions on the vendor and user sides of security. A statement like "responsibility without authority" is highly negative and a little fatalistic, don't you think? One of the key defining elements for me is that a good security administrator has the ability "to influence without power". That means being Mr. SecAdmin is as much an exercise in politics as it is in technical wherewithal.

    Relate this back to the industry. You're either at the top level or you're in the trenches. A good security admin will bridge the two as best he/she can. Security fundamentally affects (and is affected by) almost every facet of an organization. I've seen through personal experience a "silo-like" mentality to security policy execution. The secadmins were in their own private bubble that attempted to be dictatorial and impervious to external influence. This is wrong, wrong, wrong!

    Unfortunately, the needs of the job amount to being a little political. The decisions must be participatory, or at least give the appearance of being participatory. That is what gives you buy-in from your users. You might say, "Why should I?" Well, if you're saying that, then you might want to find another job. It's a necessary evil if you care about keeping your org secure. If not, you might be the one complaining after the fact, "They never listened to me". Even if you're merely sitting there explaining why you are doing what you're doing, at least people are involved. You might even be giving them bad news, but at least you're telling them that you're giving them bad news before you change their lives. The real challenge here is finding the right people to involve. :-)

    Good security depends as much on the "how" of security as on the "what" of security. If your methodology is technically correct, cheap, and does the job, but you've dumped it on the organization, then guess what: it ain't gonna fly!

    The article, in its efforts to be concise, has not really justified its claims. Trying to sway the course of one of the largest governments in the world indeed sounds like a recipe for frustration, but does not necessarily map back to the industry in general. Those seem like radically different things. I remember Richard Clarke seeming positively perky during the days of his assumption of the cyber-security czar role. Look at him now.

  • illusions (Score:3, Insightful)

    by killua ( 604159 ) <nimakuNO@SPAMgmail.com> on Sunday October 31, 2004 @03:56PM (#10680659)
    Circumstances like these often accomplish something very important in politics: they give the illusion of doing something to solve the problem, when in reality they have done nothing.
  • by geoff lane ( 93738 ) on Sunday October 31, 2004 @04:14PM (#10680750)
    as it addresses the wrong problem.

    The US thinks that taking nail clippers from passengers makes air travel more secure. It doesn't but it looks as though it might.

    Most computer security looks outwards to the internet, forgetting that the biggest threat is sitting inside the firewall.

    We are all surrounded by pretend security that is in position just because it looks good. Real security is a pain in the backside. It is disruptive to the people who have to work with it and it's very expensive. It's also complex and difficult to implement.

    If the security officer in a company cannot overrule EVERY single person in the company on a matter of security, the job is a joke and exists merely as a butt-covering operation.
    • by BobaFett ( 93158 ) on Sunday October 31, 2004 @09:22PM (#10682304) Homepage

      If the security officer in a company cannot overrule EVERY single person in the company on a matter of security, the job is a joke and exists merely as a butt-covering operation.


      This would be true if security were the overriding concern, the ultimate goal. It isn't. It would be true if the cost of a security breach were infinite, but that is not so either. So it is an entirely legitimate question to ask: should we accept the risks at our current level of security, or spend more on tightening it (in the form of direct expenses or lost productivity)? There are other ways to mitigate risks (redundancy, insurance, etc.). If at the end of the day you can come out ahead by accepting the risk, then that is the correct thing to do. The security officer is not qualified to make this judgement.
  • been there.. (Score:3, Interesting)

    by TheHawke ( 237817 ) <{rchapin} {at} {stx.rr.com}> on Sunday October 31, 2004 @05:47PM (#10681255)
    Done that. Pissed off more than a few clients with the security policies, blown a couple of budgets by a little bit. But it's still secure by overbuilding, securing the systems with personal passwords set to expire at 30-day intervals. Education, education, education... The current headache with the 'wares was simply resolved by implementing a HOSTS file on each terminal via administrative batching. This was done within an hour and the infected machines were then reimaged with clean OSes. No slouch, this nut. As I said, I've pissed off a few folks, but they learned the hard way not to break the NSA's rules, or you wind up with a blank computer, or worse, a letter in your docket for the security violation. I'm not in the business to make friends, personally or politically, which irks some of the suits. They pay me the big bucks to keep their business as secure as Fort Knox, and they get what they pay for.
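    For the curious, a HOSTS-file push like the one described might look something like this sketch. The target path, the blocked hostnames, and the merge logic are all illustrative assumptions, not the poster's actual batch job:

```python
# Assumed target on a Windows terminal; the blocklist is hypothetical.
HOSTS_PATH = r"C:\Windows\System32\drivers\etc\hosts"
BLOCKED = ["ads.example.com", "malware.example.net"]


def merged_hosts(existing_text, blocked):
    """Return hosts-file text with each blocked name pinned to 0.0.0.0,
    skipping names already present so repeated pushes don't duplicate."""
    present = set()
    for line in existing_text.splitlines():
        fields = line.split("#", 1)[0].split()  # strip comments, tokenize
        present.update(fields[1:])              # hostnames after the address
    additions = ["0.0.0.0 " + h for h in blocked if h not in present]
    return existing_text.rstrip("\n") + "\n" + "\n".join(additions) + "\n"
```

    A login batch job would read HOSTS_PATH, pass its contents through merged_hosts, and write the result back with admin rights; making the merge idempotent is what lets you rerun it on every login without bloating the file.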
  • What about a company that has 300 users, 450 computers, and only one full-time IT guy who primarily does sysadmin-related tasks (email, viruses, backup, troubleshooting, etc.)? By association, he's the "security" guy as well. He does all of the computer inventory & tracking, ordering & provisioning, repairing, programming... does he have to be the "Security Officer" too?
  • by kabz ( 770151 ) on Sunday October 31, 2004 @06:50PM (#10681645) Homepage Journal
    One way to decrease users' tendencies to download crap might be to publish a web page harvested from the firewall logs (you do have a firewall, right?) and allow general access to see what users have been downloading.

    This would favorably impact the following:

    o Porn searching
    o Cosmetic surgery searching
    o Perv searching
    o Joke searching
    o Browsing slashdot at -1 ;-)

    The Slashdot model of moderating/censoring web page accesses would also be driven by the curiosity to see what your dodgy co-workers have been downloading.

    One thing that one of my previous companies also emphasized was ensuring that machines have a password-protected screensaver whenever a user is away from his/her desk. Another co-worker being able to hit porn from an open desktop would be a great motivation to lock up your desktop on restroom trips, coffee runs, etc.

    Most companies have policies on non-business use of machines, though these are seldom enforced with any vigor. Enforcing them through a peer mechanism like that described above might help to keep users and company networks safe from themselves.
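    A log-to-web-page harvester like the one proposed could be sketched as below. The log format, usernames, and URLs are hypothetical; a real proxy log (e.g., Squid's) would need its own parser:

```python
from collections import Counter
from urllib.parse import urlsplit

# Hypothetical proxy-log excerpt: one "user url" pair per line.
LOG = """\
alice http://news.example.com/story1
bob http://dodgy.example.net/download.exe
bob http://dodgy.example.net/more.exe
alice http://slashdot.org/
"""


def per_user_sites(log_text):
    """Count requests per (user, site) pair."""
    counts = Counter()
    for line in log_text.splitlines():
        user, url = line.split(None, 1)
        counts[(user, urlsplit(url).hostname)] += 1
    return counts


def html_report(counts):
    """Render the counts as a plain HTML table anyone in the office can browse."""
    rows = "".join(
        "<tr><td>%s</td><td>%s</td><td>%d</td></tr>" % (user, site, n)
        for (user, site), n in sorted(counts.items())
    )
    return ("<table><tr><th>user</th><th>site</th><th>hits</th></tr>"
            + rows + "</table>")


print(html_report(per_user_sites(LOG)))
```

    Aggregating to hostnames rather than full URLs is a deliberate choice here: it names the dodgy sites without republishing the dodgy content.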

    --
  • by supabeast! ( 84658 ) on Sunday October 31, 2004 @10:38PM (#10682780)
    This doesn't just apply to security, it applies to IT in general. The sysadmin is always the guy who has to implement all of the stupid shit managers promise to people, and rarely has any input on how it will be done. I finally knew that my IT career was about to end when, on a Friday morning, I was asked to work at least 12 hours on both Saturday and Sunday because the director of a federal agency I was working for (as a contractor) had promised that we would have a certain system working by a certain date, which just happened to be Monday morning. This was the first time that ANYONE on the team responsible for the implementation had heard about it.

    I refused -- not that it mattered, because the coders needed time to adapt beta code from a different project to this one -- and dropped by for a few hours on Sunday just to check on the status of things. Two weeks later we had a semi-functional prototype. Three months later it was still a lame cycle of the same crap.

    Now I'm going to art school and painting full-time. The money sucks, but I never have to come in at three AM to clean up after someone else's dumbshit idea.
