What Should We Do About Security Ethics?

An anonymous reader writes "I am a senior security xxx in a Fortune 300 company and I am very frustrated at what I see. I see our customers turn a blind eye to blatant security issues, in the name of the application or business requirements. I see our own senior officers reduce the risk ratings of internal findings, and even strong-arm 3rd party auditors/testers to reduce their risk ratings on the threat of losing our business. It's truly sad that the fear of losing our jobs and the necessity of supporting our families comes first before the security of highly confidential information. All so executives can look good and make their bonuses? How should people start blowing the whistle on companies like this?"
  • by doti ( 966971 ) on Tuesday April 15, 2008 @08:05PM (#23084660) Homepage
    Ignore it?
    • If you're a slacker, yes. Masterminds violate the heck out of security ethics before blaming the slacker.
    • by EmbeddedJanitor ( 597831 ) on Tuesday April 15, 2008 @08:48PM (#23085030)
      Most are only limited by what the law allows. Although a company might speak of ethics, don't expect them to actually practice it.

      And why bother about security ethics when there are much more important ethical considerations like how they treat staff? Again, most companies screw most of their staff to the limit of the law.

      In short: If you're looking for ethics you got off on the wrong planet.

      • by TheLinuxSRC ( 683475 ) * <slashdot&pagewash,com> on Tuesday April 15, 2008 @09:12PM (#23085258) Homepage
        Most are only limited by what the law allows. Although a company might speak of ethics, don't expect them to actually practice it.

        I agree with these two statements 100%, however...

        And why bother about security ethics when there are much more important ethical considerations like how they treat staff? Again, most companies screw most of their staff to the limit of the law.

        Treatment of staff is a strawman. It has no bearing on whether security is an issue. I was employed at a medical software company that did not treat its staff terribly, yet managed to deploy products that were genuinely unsafe. This was in the imaging dept. of a medical records company - imaging handled diagnostic images as well as records for archival. This needed to be fully HIPAA [hhs.gov] compliant and was nowhere close. While treatment of staff was decent, security with regard to medical records/images was nowhere near adequate. I believe this is an area where security is a far higher priority than how the staff is treated.
        • by Anonymous Coward on Tuesday April 15, 2008 @09:39PM (#23085458)

          Don't even get me started. I work at a company which makes document imaging software, and our customers send us all kinds of crap that, honestly, scares the shit out of me. Not to mention information specifically protected by law. Most of the time, I get the sense that the sender didn't even remotely think about it. All they know is "this is not viewing/printing how it should" and so off they send it, as an attachment on unencrypted email.

          So now I am put in the position of -- do I actually work on the client's problem? Or do I immediately destroy the information and tell them they are a dumbass? You know what the reality is? The highly sensitive document gets printed out, sometimes hundreds of times (as I tweak things during the debugging process), and I try to shred everything but when there's hundreds of copies, I'm sure I've missed one. If I was unscrupulous I could have made several million dollars off the information I see on a daily basis and I'm not exaggerating. Millions. Honestly it pisses me off.

          • by Anonymous Coward on Tuesday April 15, 2008 @11:28PM (#23086112)

            I remember in my consulting days, I got sent a DB to look at. This DB held all the personal information for everyone who was worth over $X. The DB contained SSNs, spouses' names, spouses' SSNs, etc. As soon as I saw this DB, I asked where the NDA for it was. When I was told there was no NDA sent over, I felt sorry for everyone whose information was in there.

    • Re: (Score:3, Interesting)

      by kylehase ( 982334 )
      The subprime lending agents ignored ethics and look how that turned out.
    • by Anonymous Coward on Tuesday April 15, 2008 @11:46PM (#23086236)
      Speaking as an IT auditor doing internal control audits, this thought occurs to me:

      When my company audits you and attests to the controls being in place and operating effectively, they essentially take legal responsibility for your internal controls. If we get strong-armed or bought off and decide to cover it up (which has never happened in my experience), we are on the legal hook for the results. We can be sued. The CPA that signs off on the audit can lose his license and get in all kinds of other trouble.

      If one wanted to keep one's job, but wanted to whistleblow on this situation, it might be prudent to blow the whistle on the auditors (to the AICPA) for materially misstating the operating effectiveness of your company's controls. The auditors take the fall, and your company gets a pass by saying "Hey, we didn't know, they signed off on it!", and subsequently tightens up controls to ensure that no eyebrows are raised in the future.

      Food for thought.

    • by MadMidnightBomber ( 894759 ) on Wednesday April 16, 2008 @01:58AM (#23086896)
      1. Sell the company stock short
      2. Leak
      3. Profit !!
      (May involve forfeiture of your immortal soul, prison time and other side effects.)
  • Three Words: (Score:5, Insightful)

    by canUbeleiveIT ( 787307 ) on Tuesday April 15, 2008 @08:05PM (#23084662)
    Cover your ass.
    • Re:Three Words: (Score:5, Insightful)

      by NeverVotedBush ( 1041088 ) on Tuesday April 15, 2008 @08:23PM (#23084814)
      Actually this is probably better advice than most realize. I don't know if it was tongue in cheek or not, but it is damned good advice.

      Where I work, security is a really big issue, and I have to deal all the time with people who don't realize that security is something they should consider with every decision they make during the day. Needless to say, many don't feel the same way. They are about to get raked over the coals by management.

      Unfortunately for some, they are in the crosshairs for their lax stance on security. I don't know what management is going to do with them, but management knows who they are and they stand a good chance of at least reprimands and loss of pay increases, and at the worst for them, pink slips.

      Anyone in IT who thinks data security isn't their job is fooling themselves and setting themselves up for a new career. If you read the SANS Newsbites, you see breach after breach and people getting sacked or worse.

      People need to tighten up their systems, audit their systems, run configuration management, and even penetration test their systems. If you can show you are at least trying to cover your ass, you stand a better chance of being seen as proactive and trying to protect the company even if it does get breached.

      But if something happens and it comes time to pick up the pieces, and all you can say is well, we shoulda done that but we didn't, you might want to have a plan B in terms of a career because you will probably need it.
      • Re:Three Words: (Score:5, Insightful)

        by zappepcs ( 820751 ) on Tuesday April 15, 2008 @08:42PM (#23084974) Journal
        This is about as good as I know to do. Document everything. Where I work, I politely make my senior (not plural) aware of something I see as a security risk and ask for direction after giving what I think are the two or three possible methods to cure the issue. If that direction is 'do nothing' or worse, I have at least documented it. I always do this with a follow-up email, or as part of my bi-weekly report.

        When I am running a tech project at work, I simply schedule resources in the project plan for security assessment and risk abatement. If these are cut from the resource budget of the project, it is documented on whose authority such was removed from the project.

        Basically stated: COVER your ass, and those below you. When those internal emails get leaked onto the internets or Wikileaks, it will be you shown as having 'concerns' about the security practices, and others who are guilty of the massive security problems being allowed to propagate. That makes finding the next job much easier.

        Additionally, all managers can find a few hours here and there within their department resources to do some security auditing and testing. Showing these results on your status reports documents proactive use of company resources. Additionally, if you can show that customer xyz just survived an attack because of something you did, you may end up being given more slack to accomplish your true and altruistic goals (which is a sad state of affairs) of providing secure products and services. Each time the company suffers a loss through security problems and documents the cost of recovery, you can show what earlier security auditing would have saved them, and hand them a nice plan to peruse that would stop such attacks in the future.
        • by jhol13 ( 1087781 )

          it will be you shown as having 'concerns' about the security practices
          But doing nothing? (your text can be read that way, sorry if you did not imply it)

          others who are guilty of the massive security problems being allowed to propagate.
          Or "unaware" of the fact (according to the the laws of the court).

          I would certainly recommend documenting everything, but first and foremost stay legal. "Criminally negligent" is not fun, I'd imagine.
        • Re: (Score:3, Insightful)

          by mwlewis ( 794711 )

          ...to accomplish your true and altruistic goals (which is a sad state of affairs) of providing secure products and services.

          Why is this altruistic? It makes your product better, and should make you more competitive in the marketplace. Granted, this is more of a long-term effect than the short-term effect of cutting corners to cut costs. But businesses make investments every day. I think it's just as important to stress the benefits of the risk reduction as it is to stress the costs.

      • Pink slip eh....that doesn't sound so bad.

        What can i redeem it for?

        oh PLEASE say action figures and concert tickets!
        • Re: (Score:3, Insightful)

          The job market isn't all that good out in the real world right now -- especially if you have been fired for cause.

          Why add another hurdle to finding a job?

          And that kind of attitude is what I see in some of my coworkers. Smartass people who think they know it all and just don't care about consequences. And coincidentally, those are the same ones in management's crosshairs. Pretty much without exception.
      • Re: (Score:3, Interesting)

        by Heembo ( 916647 )

        If you read the SANS Newsbites, you see breach after breach and people getting sacked or worse.
        Ouch, you are implying SANS has integrity. Newsbites is an advertising vehicle for one of the lowest-integrity organizations in the security industry. For real information, Bugtraq is where it's at.
    • by Nick Driver ( 238034 ) on Tuesday April 15, 2008 @09:04PM (#23085184)
      ...he who dares tell the Emperor that he's wearing no clothes gets his head chopped off.
    • by Heembo ( 916647 )

      Cover your ass.
      This is the only way to roll. Email the Security Officer about your disagreement over the issue at hand, and include factual evidence. CC the CEO. Print out a copy for your personal records and use registered mail to mail it back to yourself. When the PCI/SOX/HIPAA/etc shit hits the fan, bust out the sealed envelope.
    • by dbIII ( 701233 )

      Cover your ass.

      Especially if a company has a senior anything XXX. The security one probably means whips and chains.

    • I agree. And if you both want to do this, and maintain some ethical standard, I would suggest getting a lawyer. Granted this is going to take some money, but find yourself a lawyer, and see what options you have, realistically. This is the safest way to go about it. There are whistleblower laws carved out all over the place, but they're often narrow, and complex. It's not the kind of thing you want to take lightly or without some extreme care.
  • Gee, I dunno (Score:4, Insightful)

    by Gewalt ( 1200451 ) on Tuesday April 15, 2008 @08:06PM (#23084674)
    how about you gather some evidence and publish it?

    Of course, you'll lose your job over it. So decide now. Do you want to sleep at night? Or do you want to feed your family?
    • Re: (Score:3, Funny)

      by Anonymous Coward

      how about you gather some evidence and publish it?

      Of course, you'll lose your job over it. So decide now. Do you want to sleep at night? Or do you want to feed your family?
      That is one end of the spectrum. Another is to gather some evidence in order to ensure job security and hefty pay raises!
      • Indeed, when they fire all the higher ups, you can move in and take their former positions. That is if the company is still standing, and none of this ever got out.
    • Lose your job over this? Probably not necessary. But I would recommend documenting everything you've noticed and told your boss in a detailed set of memos so that you're safe if an ethics committee ever investigates. If that's what you're worried about, of course.

      Going the "get fired" route is probably a really bad idea under normal circumstances as you're likely to be passed up for jobs in the future for "lack of loyalty" or whatever the hell they're calling it now. Publishing anonymously, like on Wikilea

      • But I would recommend documenting everything you've noticed and told your boss in a detailed set of memos so that you're safe if an ethics committee ever investigates. If that's what you're worried about, of course.
        You probably want to document it in a format that does not alert your boss to the real danger, but later, when people are looking for a scapegoat, will show that your boss was negligent in not following up on your report.
    • I dunno, you could also just make it publicly known how incompetent your security practices are, without being "that guy".
    • You can sleep at night and feed your family, you just might have to go through hell with lawyers to get there. If you can document that the choices being made really are coverups of violations of the law (and not just weak interpretations of the law) then go ahead and gather the evidence, and then make it clear to your boss or his boss or whoever needs to hear it: this is a problem and if it's not fixed here, you have no choice but to go public with it.

      Suddenly, the only way to "cover up" is to fix the problem.
      • "Suddenly, the only way to "cover up" is to fix the problem. If they fire you, you go public anyway, and not only is their coverup work worthwhile, but they're liable under whistleblower protection laws."

        Or...you get discovered in your car with your brains decorating the interior and an unsigned and typed suicide note bemoaning your guilt for all of the problems.

        Ahhhhh! Tinfoil hat restricting blood flow!
      • Re: (Score:3, Informative)

        by rah1420 ( 234198 )
        Whistleblower laws are a freaking joke.

        I have an acquaintance who was a financial underling at a publicly traded company. The CFO discovered some irregularities with the books and blew the whistle on the shenanigans. Within 6 months he was history, along with anyone else who TPTB determined was in the 'penumbra of blame.' Came damn close to my acquaintance but didn't affect them.

        Look at it this way; are you gonna want to keep around the guy who spoiled the ride for the rest of the clowns? If you are on
    • Re:Gee, I dunno (Score:5, Insightful)

      by plover ( 150551 ) * on Tuesday April 15, 2008 @10:59PM (#23085972) Homepage Journal
      Yes, gather evidence, but DO NOT publish it. Be very careful who you tell. If you do publish it they will hunt for whoever leaked it; if they find you at the end of the trail, you will be fired and likely blackballed in your city. (That's the thing about pissing off security people; they know exactly how the system works and will skirt the labor laws to put someone in a world of hurt.) It won't matter if it made their security better, or if someone gets an award for fixing it, or if your stock doubles because of your shiny new security model, if you hurt their image they'll put you down like a dog.

      Check around, maybe your company already has a CISSP on staff you could talk to. If not, as a large company you likely have an Info Security officer or manager, or perhaps a Loss Prevention or Asset Management department. Or perhaps you have someone in the networking area responsible for security (firewall installers, Active Directory admins, etc.) Corner the person in charge, and start asking him pointed questions, like "Did you see the news about company Y, who got hacked by exploiting this same vulnerability we've got?" "Have you done a risk analysis?" "What would you do if X happened?" "Do we have an incident response plan?"

      Or maybe you take credit cards, and have a PCI auditor running around. It's their job to care about security holes. Get your findings to them.

      Just saying "OMG, we're using WEP!" or "look, someone keeps pulling these XSS attacks on us, I told you so!" isn't likely to be earth shatteringly bad news; trust me, it's pretty much just irritating to those who politely listen to you whine. But offering constructive organizational advice might let these people know that you're not stupid, and that you really could help them improve their security.

      If you're considering a career change into the security field, a positive attitude towards fixing the systemic problems (big picture, not just the one set of things you're looking at) might get you somewhere.

  • Wikileaks (Score:5, Informative)

    by Mondo1287 ( 622491 ) on Tuesday April 15, 2008 @08:07PM (#23084680)
    • Re:Wikileaks (Score:5, Insightful)

      by couchslug ( 175151 ) on Tuesday April 15, 2008 @08:17PM (#23084758)
      If you leak it, not only do it on the sly in a manner that can't be traced to you (or you'll probably never be hired in a position of trust again!) but have an authentication method that can PROVE it's you in case the Feds come looking and you need to roll over.
      • Re:Wikileaks (Score:5, Interesting)

        by Anonymous Coward on Tuesday April 15, 2008 @08:55PM (#23085088)
        I work for a very large US government department. Our agency oversees all of the child agencies. If we leaked information about how we fast-talk the twenty-something college-graduate security auditors who know jack about computers, we would surely lose our contract. Our contract pays big, on the order of a few million per year. We have a total staff of a little over 20; do the math. If the federal IT director says to do it one way, we do it that one way to ensure nice paychecks for our employees.

        Now, I am one of these employees, and I'm not going to watch my job burn because the government is hiding blatant security problems. The next person who comes in will comply the same way, and I'll be left searching for a new job. No. What I do is purposely delay audit results. Miss a deadline here and there. Specifically mention other areas of concern while satisfying the customer by fast-talking through another area. The result? It turned your government's security findings report from a B to a D. This past year sucked, work-wise, but we're far more secure now than we were a year ago.

        Just to scare you some more, we were sending backup tapes offsite without using encryption. We also didn't encrypt our laptops until the day before the government-stipulated deadline. The best one? One of our budget management systems runs a public X server as root. Guess what else? We hold tons of medical, legal, and personal information for a very large number of you Americans. Yeah.

        You're damn right we need to change how we address security concerns. I have no ideas on how to change this, so I will continue to be very cautious in my personal life. I will also continue to take contracts like this to ensure I can feed my family for the next couple of decades.
      • If you leak it, not only do it on the sly in a manner that can't be traced to you (or you'll probably never be hired in a position of trust again!) but have an authentication method that can PROVE it's you in case the Feds come looking and you need to roll over.

        What if it's not the feds but some other less recognizable but similarly irresistible force? If you leave an authentication tag of some sort then they will take the slipper and shove it on everyone's feet until it fits. Better not to leave it there in the first place.

    • Re: (Score:2, Interesting)

      by NoobixCube ( 1133473 )
      This is pretty much what Wikileaks is for. Though if you're in Australia, that avenue will soon be sealed off from you if that new law gets approved. All in the name of our safety, of course. Can't have terrorists bringing down the economy by trying to improve it.
    • full disclosure
      full-disclosure@lists.grok.org.uk
  • by awyeah ( 70462 ) * on Tuesday April 15, 2008 @08:11PM (#23084704)
    It's more common than you think. Some of it is due to laziness, some due to a lack of knowledge, and some due to time constraints. Fortunately, for the really sensitive information, management at my company finally put into place very strict policies on how we handle the data: How we store it, erase it, encrypt it, and display it. Granted, most of these policies are actually put in place by vendors that require it, but we've taken those standards and extended them across all sensitive information.

    If you're failing SOX/SAS-70/404 audits (or whatever types of audits apply to you)... that's bad, although you've already identified that.

    We formed a data security team - it's just one dedicated person right now, but since he's really only involved with the policy stuff, that's enough for us - however, he does hold frequent and regular meetings with management across all departments. The DS team recently published our "best practices" which every developer now has posted at his/her desk.

    Because management took this very seriously, we became one of the first companies in our industry to have all of the current versions of our software fully compliant with industry security standards.

    If there are no standards set forth for you, I suggest you make your own. It takes time and they must be well thought out, and no compromises can be made (that's a bad pun, sorry). Use your audit results (the actual audit results, not the strong-armed ones) as a baseline for improvement. Dedicate a resource to data security. Whatever you have to do. Since you're a senior-level person, you should be able to convince people to allow you to do it.

    If you have security issues and a breach occurs, well... I think you know what could happen.
    • ...but one thing that would improve matters is if sensitive information automatically kicked in compulsory external audits by some independent watchdog. That would require some creative legislation, not only to make it acceptable to courts, corporations, etc, but also to keep sufficiently current that poor practices or malpractice aren't actually required. That, I fear, is beyond any Government currently out there, and given the track record of Governments on IT issues, I suspect skepticism and wholesale rejec
    • Re: (Score:3, Insightful)

      Standards are often slow to form, and then just as slow to be bought into. Everyone knows that they're needed, but they're too often set aside "just for this one thing."

      I think one of the problems is the idea that has become prevalent that "business drives IT." This is taken by many to mean that business decides what IT does, and that IT's rules have to bend to the desires of business whenever they clash. Personally, I think this is asinine, especially because it leads to a completely unnecessary adversa
    • Re: (Score:3, Informative)

      by pclminion ( 145572 )

      If you're failing SOX/SAS-70/404 audits (or whatever types of audits apply to you)... that's bad, although you've already identified that.

      Now how the FUCK can you fail a SAS-70 audit? You get to set your own damn criteria for passing!

    • "we formed a data security team - it's just one dedicated person right now, but since he's really only involved with the policy stuff, that's enough for us"

      Not only do you have a paper-tiger security team, but you under-staff it, at that! Epic fail.
  • by Anonymous Coward on Tuesday April 15, 2008 @08:11PM (#23084708)
    I work for many clients, most of whom are lobbyists and lawyers. Ethics are different for everyone.

    We have laws to restrict what people do and police to enforce those laws.

    I know of one client that, in an attempt to get a Federal contract, created a multi-million-dollar program just to meet the "green" requirements that the Federal government is placing on new contracts.

    Turns out - nothing much is being done except the bare minimum.

    What is ethical is very different from that which is legal.

    Because of my personal beliefs which stem from an often insulted and bashed faith, constantly mocked here on Slashdot, I do not sell the information I am privy to.

    • Re: (Score:3, Funny)

      by eln ( 21727 )

      Because of my personal beliefs which stem from an often insulted and bashed faith, constantly mocked here on Slashdot, I do not sell the information I am privy to.
      I have a very strong sense of ethics too, and don't sell the information I'm privy to either. Since you say these beliefs stem from your faith, then we must be of the same faith. Always nice to meet a fellow atheist.
  • Ethics in Business (Score:3, Insightful)

    by TheRecklessWanderer ( 929556 ) on Tuesday April 15, 2008 @08:13PM (#23084720) Journal
    It's interesting that you talk about ethics in one branch of business, when clearly, there is a lack of ethics in most branches of business.

    Unfair labor practices, shady reporting practices, Enron, the entire legal profession, the entire political class (is it truly a profession?).

    The point is, why single out one area of unethical behavior? Does it surprise you that the executives in our own field (Techies Rule) should be any different?

    Most executives make their way to the top by lying, cheating and stealing better than the next guy.

    What can you expect?

    • Re: (Score:3, Insightful)

      by compro01 ( 777531 )

      The point is, why single out one area of unethical behavior?
      because it's the area most of us are in and the area most of us are most likely to be able to have an effect on.
    • by overshoot ( 39700 ) on Tuesday April 15, 2008 @08:27PM (#23084854)
      It's interesting that you talk about ethics in one branch of business, when clearly, there is a lack of ethics in most branches of business.

      No, not really. After all, there are children dying of AIDS in Africa, of hunger all over the world. Old people are being neglected, education is a mess, etc. Apparently your strategy is to give up on doing anything because we can't do everything. The advantage of this approach is to make the problem so far beyond our powers to solve that we can justify not even trying.

      In response, I call your attention to the words of a sage from when things were a hell of a lot worse: "It is not for you to finish the task - nor are you free to desist from it."

      It may be trite, but doing something to improve one corner of the world beats whining on /. about how bad it all is.

    • Most executives make their way to the top by lying, cheating and stealing better than the next guy.

      Wow! Do you have numbers to back up the above assertion?

  • by jay2003 ( 668095 ) on Tuesday April 15, 2008 @08:14PM (#23084726)
    Ask yourself whether your "internal findings" are really representative or just attempts to CYA in case there is a problem. Coming at this problem from the side of someone whose job it is to get things done rather than create objections, I frequently see security people asking for extremely expensive security "enhancements" that provide marginal if any value.

    All business decisions should be made on the basis of cost-benefit analysis. Most staff positions, including security, usually do a poor job of assessing either side and instead focus on potential risks without quantifying them. Just because security would be better by doing X does not mean X is a good idea. If X is really expensive and your competitors do not do it, your firm is now at a cost disadvantage, which depending on the industry can be catastrophic.

    I really have no way of knowing whether the actions you are talking about are really negative-expected-value actions or not, in the sense that over a long period the risks involved will be realized and the damage will be far greater than the cost of taking preventative action. However, changing ratings is troublesome. A much better process is a well-defined override or exception procedure. The business should understand what they are doing. A rigid system that says we cannot do anything rated 'Y', even if there is $100M at stake, will only result in the rating being changed.
    • by Fnord ( 1756 ) <joe@sadusk.com> on Tuesday April 15, 2008 @09:44PM (#23085490) Homepage
      This is the problem with modern business methodology. Engineers do cost-benefit analysis also, but not with monetary cost. Every design decision in a piece of software is a balance of how much cpu time does this save me vs. how much memory does this eat up vs. how much complexity does it add to my system, etc.

      But before cost-benefit analysis even begins, problems to be solved are classified by their risk. There is a class of problems that absolutely must be solved regardless of the cost. If you're writing a filesystem, anything that has the remotest chance of data loss is unacceptable, regardless of how slow it is. If one of these crucial elements costs too much for the system to handle, take out something else.

      A large number of businesses don't seem to see anything as unacceptable risk. By contrast, medical companies, car manufacturers, baby toy manufacturers, etc. consider anything that could possibly cause loss of human life an unacceptable risk. Banks and retailers should treat anything with the remotest possibility of leaking customer data as a must-fix problem, and this means IT security should get done, regardless of cost.
      • by jay2003 ( 668095 )

        Banks and retailers should treat anything with the remotest possibility of leaking customer data as a must-fix problem, and this means IT security should get done, regardless of cost.

        That statement shows a fundamental misunderstanding of probability. There is no piece of information where the probability of it leaking is zero, even the US military's most closely guarded secrets. Security measures can push the probability close to zero, but it's not possible to get there. There's always some compromise one

  • by ThinkComp ( 514335 ) on Tuesday April 15, 2008 @08:14PM (#23084730)
    I wrote an essay about this very issue a while back.

    http://www.aarongreenspan.com/essays/index.html?id=9 [aarongreenspan.com]

    The sad fact is that I don't report flaws anymore because I've been threatened too many times.
  • Is it ethical to place the interests of your employer above the needs of yourself or your family?
  • Not much (Score:5, Interesting)

    by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Tuesday April 15, 2008 @08:16PM (#23084750) Homepage

    I don't see how there is much you can do. There was an article here a few months ago about a group that started sending out bad XML because too many people were using the DTD they were hosting, to the tune of 10,000s of hits a day that were completely unnecessary.

    The company I work for (not Fortune 500, smaller) sees some stuff that continues to floor me. Our dealings are mostly transactions of information (containing important things like bank accounts) between our computers and those of other companies. We have had to, quite a few times, flat-out turn people down because they refuse to run securely. Not because they lacked massive DB encryption. Not because they weren't hashing everything. Just because they weren't using SSL, an easy-to-implement addition on top of HTTP (which carries our conversations with people).

    Every two months or so, we are put in the position of telling people that the SSL certificate on their production system expired last night. This usually entails a discussion as to why we can't just let them slide, or give them a day, etc. We've had people switch from good SSL certificates issued by valid authorities to self-signed certificates.

    In fact, the expiration problem happened often enough that someone seriously suggested we make a little program to check people's certificates and warn us before they expired, so we could warn them (a rough sketch of such a checker appears at the end of this comment). Things got better and we never built it. Many people just don't care.

    I'm not sure how this happens either. We recently let a certificate lapse on a domain we stopped using and gave up on. For the 6 months before it expired I got emails from the certifying company up to one every 2 weeks or so at the end. Then they called our office to make sure we knew it was about to expire and to find out if we really wanted that to happen. Then today, a few weeks after it expired, I got an email reminding me that it expired and they'd be glad to renew it. I don't know how many companies are this proactive about renewing SSL certs, but I'd have had to have my head buried pretty far in the sand to not have noticed all that.

    We've seen plenty of poor security designs. I don't expect other operations to be perfectly secure. But the number of these companies who seem either ignorant or dismissive of SSL continues to surprise me from time to time.

    Best advice? If you can at all, shut them down. Very few of the companies we have worked with have been very nice about turning on SSL. Some have said "just add S to the URL" (it was secure, they just didn't give us that URL). Some have said "sorry, we'll get that right up". More than a few have not been that easy. Turning people off is the best power we have. If your contracts are big enough (as a Fortune 300 company, they might be) you could try to put security provisions in them with penalties for shenanigans. But we've found that when discussions aren't working, just disconnecting people usually gets their attention.
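
    For what it's worth, the little expiry-checker suggested above is only a handful of lines. Here is a minimal sketch, assuming Python and a couple of made-up partner hostnames; it only reports how many days each certificate has left, and it isn't anyone's actual monitoring setup.

        # Minimal certificate-expiry checker (sketch). The hostnames and the
        # 30-day warning window are invented for illustration.
        import socket
        import ssl
        from datetime import datetime, timezone

        HOSTS = ["partner1.example.com", "partner2.example.com"]  # hypothetical partner endpoints
        WARN_DAYS = 30

        def days_until_expiry(host, port=443):
            """Connect to host:port over TLS and return days until the cert expires."""
            ctx = ssl.create_default_context()
            with socket.create_connection((host, port), timeout=10) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    cert = tls.getpeercert()
            # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
            expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
            return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

        for host in HOSTS:
            try:
                days = days_until_expiry(host)
            except (ssl.SSLError, OSError) as err:
                print(f"{host}: could not check certificate ({err})")
                continue
            status = "WARN, expires soon" if days < WARN_DAYS else "OK"
            print(f"{host}: {days} days remaining ({status})")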

    • Re: (Score:3, Funny)

      by Qzukk ( 229616 )

      I'm not sure how this happens either. We recently let a certificate lapse on a domain we stopped using and gave up on. For the 6 months before it expired I got emails from the certifying company up to one every 2 weeks or so at the end.

      Actually, it's pretty easy. See, Jim punched in his email address back when we first got the certificate, so we'd been getting the notices at jim@example.com. Things were fine for a while, but then Jim moved on to another company. Fortunately, we had another Jim, so we just gave the email account to him when the first Jim left, and things were fine.

      Last month Jim turned in his two weeks' notice.

      By the way, we've got an entry level opening some of you might be interested in, just need a PhD, 10 years exp

  • I've been in enough places at this point to know that security does not matter.

    As much as it pains me to say it, there just isn't a good enough reason to do it. I think that's why it's the OpenBSD guys who end up providing OpenSSL and SSH and the like... Corporate pressure just kills any desire to get security right.

    Of course, the languages and libraries do not help the issue. It's just too easy to make stupid mistakes that result in code with security problems. People always argue that security will alw

    • Security happens when you think things through.

      Thinking things through all the time is hard.

      Security makes things harder.

      More developer time can, at best, optimise how much we have to think before we act. But as long as users can't act without thought, they will think it's "hard" and will try not to do it.

      Battle between developers and human nature, human nature wins.

      That's to use, not to write though; more secure code should be easier to understand and debug, and actually be easier to write (provided you take
  • by Anonymous Coward on Tuesday April 15, 2008 @08:16PM (#23084754)
    Security won't be taken seriously until the powers-that-be worry that they will be directly impacted. A giant security breach that compromises tens of thousands of other people doesn't worry them. Once someone brings a successful (maybe class action) lawsuit and wins a lot of money, the powers-that-be will start paying attention.

    It is strange. We can't let a piece of equipment that isn't UL approved within a mile of our building. We have a guy whose whole job is to audit all the equipment and make sure it conforms. Security, on the other hand, isn't audited. The bosses sure don't fear us the way they fear the outside people who do all the other audits.

    Clearly it would be a good thing if someone were setting standards for security the way UL does for electrical equipment. It would be good to have outside auditors. Only then will the in-house security people get any respect.
    • Check out NIST: http://csrc.nist.gov/ [nist.gov]

      They not only have standards to follow but also scripts that can check security configurations to tell you if you meet standards or not.

      I know DHS gets mocked a lot but they are working with NIST to help harden computer systems. It's worth checking out.
  • by overshoot ( 39700 ) on Tuesday April 15, 2008 @08:18PM (#23084762)
    Step one: gut check.

    Step two: Find another job. If you take a cut, see step one.

    Step three: Pull no punches when you resign. Leave a resignation letter stating that you cannot in good conscience continue to sweep serious liabilities under the rug, and that under the circumstances you have no choice but to leave. Copy the BOD. If you want to really play hardball, copy the company's liability underwriters.

    Make no mistake, this is a major bridge-burning exercise. It may turn out to be the best thing that ever happened to your career, but don't count on it. See step one.

  • "How should people start blowing the whistle on companies like this?"

    Unh, perhaps by having the guts to name the company and maybe even the data at risk, rather than just saying "a Fortune 300 company." Oh, I guess you don't want to risk your bonus either, or maybe your job is more important than the safety and security of the citizens of your country. So why the hypocrisy of acting like it's only your bosses who are vile, evil bloodsuckers hiding the truth for their own enrichment?

    • It's easy to criticize when you aren't the one in the hotseat. Sometimes, working from the inside to make things better, in spite of what management wants, can be the better approach. If the poster is being confronted with big security issues, and management that thinks they can skate (or are betting they can skate), and really confidential data is at risk that would harm people if it were compromised, working from the inside to change attitudes is sometimes the best way.

      Maybe signing up for SANS Newsbit

  • "How should people start blowing the whistle on companies like this?"

    If it's as bad as you're indicating, everyone learns eventually, even if it's the hard way. What you need to consider is, is it worth it?

    The questions I'd ask are:

    Are people's lives at risk from these vulnerabilities?
    Are people's lives going to be ruined because of these vulnerabilities?
    Is the company at serious risk of going under because of these vulnerabilities?

    If you can answer yes to one or more of these questions, you might consider
  • Kay Sara Sara (Score:3, Informative)

    by WwWonka ( 545303 ) on Tuesday April 15, 2008 @08:23PM (#23084816)
    Just let them be.

    I too worked for a company that catered to the people that made money for it. $40 billion+ in assets at the time. No matter how hard I tried, security ALWAYS took a back seat to profit, ease of use, and not rocking the boat. I was the head of network security; there was not even a CSO. The hierarchy wasn't even in place. One day I even saw a live network hack in progress as one of our network engineers was using a VNC server not protected by our corporate firewall! Someone on the outside had found it and started using his desktop! I couldn't believe my eyes! In the end it came down to me just accepting that this company, and a vast majority of corporations, will always and forever be run this way...until, of course, the proverbial $#It hits the fan, at which point I didn't want to be there.

    So I left and never looked back. I suggest that this also be your course of action before the one left holding the bag is you.
  • There are lots of other jobs out there where you won't be confronted with this quandary. You're never going to get any credit for pointing out the security problems of your current employer. You run considerable legal risks (and might, in practice, render yourself unemployable in the future) if you try to blow the whistle.

    Find another job. Your family will be fed. You will also sleep somewhat better, except when you realize your ex-employer is still out there.
  • How should people start blowing the whistle on companies like this?

    Um...anonymously! DUH! Post some internal e-mails or outgoing vendor e-mails proving this bullshit to Wikileaks using a proxy or something. Or anonymously e-mail the business owners or other high level people about what's going on. Unless they're the ones doing it, then sneak an e-mail to their bosses: THE CUSTOMERS! Lol send out a fake newsletter e-mail to everyone in the database saying you'd just like to let them know about the ne

  • You are looking at the problem from the wrong direction:

    "the fear of losing our jobs and the necessity of supporting our families comes first before the security of highly confidential information"

    And so it should.

    However, you should put up a case to your higher ups about the business reasons why they need the security measures and that they need to be followed. The higher ups recognise this (in theory) and the practise of lowering security threats is classed as a "punishable offence". If a person's
  • Your best bet is to find someone higher up who understands the problem or to whom you can explain the problem.

    You eventually need to get to a C-level officer, something like CTO or COO who can actually mandate change. Somehow, in the places that I've worked I've been lucky enough to have CTOs that understand the concept of (and need for) security. They made a lot of changes that made sense to me (passwords must be changed more than once every 3 years, user data must not be stored on local machines, princi
  • by Nefarious Wheel ( 628136 ) on Tuesday April 15, 2008 @08:33PM (#23084898) Journal
    In the spirit of "The Unwritten Laws of Business" (W.J.King, Profile Books) you need to choose your boss carefully. If the company you're with is not transparent enough for that, check their culture against the culture you'd like to associate yourself with. To do that, I'd suggest large amounts of common sense or read "Good to Great" (Jim Collins, check Amazon).

    Don't be a whistleblower, be an activist for change. See if you have a risk compliance manager and talk to them, ask for their advice. At worst, you'll get your name known in the higher echelons, at best you'll get your own way. Most people will shy away from a confrontation, but love giving advice in a tricky situation.

    Your mileage may vary, and I may be full of compost. Think and do.

  • Make an appointment with the CEO/MD with a draft of your findings. If he doesn't care, you shouldn't care.
  • by Anonymous Coward on Tuesday April 15, 2008 @08:50PM (#23085046)
    I have had to make a similar choice twice now and both times, I had to leave the company to feel good about the situation. In one case, I also insisted that my name be removed from all company communications and government vendor documents. I do not regret my decision, although it has cost me.

    You say you are an uber security drone with a Fortune 300 company and that you *know* of fraudulent business practices to help the company earn better ratings on its security policies. I'm guessing that some of these impact SOX/404, SAS-70, and probably ALL would be of concern to the company's shareholders and business trading partners. Like it or not, you are now either complicit or you are obligated to inform oversight authorities. Your first duty should be to your own profession's standard of behavior, your second to the company shareholders, your third to the public's interest, and last to your management chain.

    You seem to be entertaining the idea of moving management's priorities to the head of the list, and that would be to make yourself complicit. The fact that it would be difficult to prosecute you does not make the behavior you are considering any less criminal. You will have to live with that knowledge for a long time. I have friends who worked at Enron who to this day have valid concerns about the resume stain they have earned from their time there. Are you willing to bear that also?

    How you go about protecting yourself from reprisals is up to you and the reporting authority, but surely anonymous 'tip' reporting is possible. Given senior management is the problem, that is a strong candidate for your response. I would also recommend you document your allegations as best you may and make them to the SEC and your local branch of the FBI. Either agency might request you remain with the company while they investigate your allegations. Otherwise, it may be time to vote with your feet and find employment elsewhere.

    You more than anyone should know what will be the eventual outcome of improperly securing vital systems. Do you want it to happen on your watch, or to have to answer difficult questions later about why you did not strongly resist or report events which will lead to that security breach? Do you want the stigma to attach itself to your resume? Do you want to sleep on the knowledge that you passively participated in criminal conspiracy by voluntarily remaining silent?

    You cannot fault the ethics of your superiors if you fail to execute upon your own. What are you made of? Decide, and then live with the decision. It only appears to be a difficult decision if you have an off-switch on your professional ethics.
    • by duffbeer703 ( 177751 ) * on Tuesday April 15, 2008 @11:14PM (#23086034)
      I don't think that things are as cut and dried as the people posting here, and security people in general, often make them out to be. A case in point was an audit that I was involved in about two years ago. One of the risks that the auditors threw a fit about (and that management successfully lowered the risk rating of) was a six-character password limit on a legacy system which contains sensitive data. The security people threatened, cajoled and generally made an ass of themselves about this issue without looking at the circumstances.

      In that case, management was correct to lower the risk of this flaw, because they mitigated it. Access controls to that particular system were moved to a web-based terminal emulator, which is secured by complex passwords and a two-factor authentication system. Those six character passwords were randomized daily and linked to a specific user in the emulation system.

      All I am saying is that there is a difference between fraud, negligence and compromise. Just because management is twisting the arm of a zealous auditor, or the infosec crew is pissed off because their latest policy or acquisition got shot down, doesn't mean your organization is run by Gordon Gekko or Ken Lay. Money and resources are not in unlimited supply, and sometimes standards need to be compromised or worked around so that business can continue.

      If your ethical standards can't handle that, you'd better move to academia or write security books, because there isn't a non-trivial environment anywhere that achieves perfect adherence to security standards.

  • Security ethics is a two-way street. I've seen reasonable risks downplayed when they shouldn't be but I've also had to argue with an auditor about "failed" checklist items whose security implications were clearly understood and very obviously addressed elsewhere in the system's overall architecture.
  • perspective (Score:5, Insightful)

    by J.J. ( 27067 ) on Tuesday April 15, 2008 @09:06PM (#23085204)
    Take a few steps back and consider your perspective. Try reading about engineers vs. managers: http://www.fourmilab.ch/hackdiet/e4/eatwatch.html [fourmilab.ch] (scroll halfway down)

    Many computer guys tend to be alarmist and see the world in black and white. Many security firms rate problems only based on potential damage without consideration for existing mitigations elsewhere in the system or the reality of targeting from attackers. Consider your company's situation carefully.

    If, after much deliberation, you are certain legitimate problems exist that must be fixed (versus managed) then talk to the managers in their language: build a business case. You work for a company, the company's job is to make money. Security costs money. You must clearly articulate how the security improvements will make money or stop the company from losing money. It's all engineering, in the end. It's just engineering with words and numbers.

    Cheers.
    - jj
    • Thanks for the link, that was interesting to read.

      Another thing to realize is that security companies and consultants have an inherent desire to inflate the size and probability of any security risk. The more fear they can build in the client, the bigger and more expensive the service they can sell.
  • I have observed for over a decade now that index-finger pointing is passé at Fortune 50+... pecking-order old-boy corporate welfare companies and the USA government, Congress, DoD ... use of the middle finger for FU is the management rage for CYA. I suspect the White House, Congress members, some mayors/governors, and many CEOs, CIOs, CFOs ... have a staff of blame-stormers. Blame-stormers are used when the best-framed truth is (as determined by the lawyers on staff) not believable to a jury, idiots and/or dogma
  • by rindeee ( 530084 ) on Tuesday April 15, 2008 @09:19PM (#23085330)
    Sorry my friend, but the biggest reason people 'fear losing their job' and not being able to support their family is personal irresponsibility. I promised myself a looooong time ago that I would do my best not to get into a situation where my job could bend my ethics because of my need for the check every two weeks. Show me a person with little to no debt and a stout (not huge, mind you) savings who knows how to live within or below their means, and I'll show you someone who won't hesitate to 'blow the whistle', call a spade a spade, insert cliche here. Sadly, employers know as well as retailers and lenders do that debt equals power over the indebted. This is not 100% of the problem, but in my opinion it is a very big part of it.
  • "I am a senior security xxx in a Fortune 300 company and I am very frustrated at what I see. I see our customers turn a blind eye to blatant security issues, in the name of the application or business requirements. I see our own senior officers reduce the risk ratings of internal findings, and even strong-arm 3rd party auditors/testers to reduce their risk ratings on the threat of losing our business. It's truly sad that the fear of losing our jobs and the necessity of supporting our families comes first b

  • Sarbanes-Oxley (Score:3, Interesting)

    by PPH ( 736903 ) on Tuesday April 15, 2008 @09:27PM (#23085378)

    This law makes the company CEO personally responsible for any material misstatements. If the security issue in question involves financial information, or if it would affect the financial standing of the company in the eyes of investors, it cannot be covered up.

    There may also be other regulatory agencies involved, such as the FDA, FAA, etc.

    If this is the case, tell the people pushing for the cover-up that you will gladly comply. But, after the sh*t hits the fan, you will visit the CEO in prison and tell him/her exactly who was responsible for generating the mis-statements.

    IANAL, so you should check with one first.

  • Fulfill your requirements and document your protests. When your manager comes to collect, point out your protests and mention that they've been documented from the start. Do your due diligence, my friend.
  • by Animats ( 122034 ) on Tuesday April 15, 2008 @09:31PM (#23085416) Homepage

    Public embarrassment can be useful. We publish a list of major domains being exploited by active phishing scams [sitetruth.com]. These are major domains where an attacker has found a security hole allowing them to exploit the site for phishing purposes. There are 65 sites on the list. There used to be about 140, but by nagging and publicity, we've been able to get most big-name sites to tighten up. Now and then some big site makes the list, but it often disappears within hours as the hole is plugged.

    So it actually is possible to get big companies to tighten up security, if you do it right.

  • You may be missing the bigger picture. I would assume that your systems have multiple layers of security, so things are never simple.
  • The issue here is not ethics, it's integrity.

    How long are YOU (yes, you personally) prepared to continue in your current security role knowing that when it really matters, the powers that be are ignoring you in ways that seriously put your customers at risk?

    If your answer is along the lines of "but I'd lose a good job and probably take a paycut" how is that attitude any worse/different than "the problems at upper management" which you're currently whining about?
  • by Torodung ( 31985 ) on Tuesday April 15, 2008 @10:59PM (#23085970) Journal
    The simple answer is we need laws, and public servants who understand the laws and the issues, for our new situation of having an "imaginary economy," where the only proof is often the voltage level of a circuit.

    Today: We are in the phase of judges trying to claim that putting a program into RAM might be an illegal copy process, and demanding a core dump as evidence.

    The Future: We need mandatory hard records of specified sensitive transactions (e.g.: e-voting, health, finance), we need whistleblower laws that protect what would otherwise be considered improper employee investigation and documentation of ephemeral computer records (it looks a lot like espionage), and we need legislators that understand the technology economy, and know where new laws are needed, and where the old ones will suffice.

    Then we need to fund enforcement, which has taken a dive in recent years.

    The newly qualified legislators are scheduled to arrive in Congress in about 20-40 years, if the older tech-savvy generation can teach the new aspirants to value their own privacy, and get them to understand that the fifth amendment doesn't apply if you put it all up on MySpace. I have confidence that these qualified people will eventually come to Congress.

    Until then, enjoy the wait. In the short term, enforcement money, and will, has been gutted. In the long term, the Congress is not yet savvy to these issues, so the law is inadequate, and new law is written by lobbyists who want less accountability, not more.

    Unfortunately, you don't have a leg to stand on while we amend the unintended consequences of our move to the "paperless society." I'm sorry. :^/

    --
    Toro
  • Rule Number 1 (Score:4, Insightful)

    by codepunk ( 167897 ) on Tuesday April 15, 2008 @11:30PM (#23086128)
    The bottom line is this: it does not matter one lick how many security measures you put in place. Short of completely disconnecting the network from every point of entry and encrypting the entire network, your security measures are not going to survive a determined attack from someone with at most average hacking skills. The best you can do is to point out the risks and figure out how to respond when your network gets owned, because someday it is going to.

    Security is always a trade-off and a continuous game of cat and mouse. It is all about being open enough to get the job done while doing your best to inform and mitigate the risk.

  • I work for a small IT company doing work mostly for law offices in our city. I fully and completely agree that security is of prime importance and that we spend far too little time on it. The problem is, guys, how do I get my CLIENTS to buy it? Most of them are fairly small and the attitude of "It can't happen to me" is all pervasive.
  • by Alexander ( 8916 ) on Wednesday April 16, 2008 @06:01AM (#23087896) Homepage
    When I've seen Fortune XXX companies deal with a similar issue, it's rarely been that Company XXX "doesn't care about security" - almost always it's been that the Information Security Department doesn't understand the fundamental question "are we secure enough" within the context of the risk tolerance of the organization. When security is ignored, it's usually because we don't express "risk" in a way that is useful to the rest of the business.

    So I'd first get a proper definition of risk. I'd start with:

    risk = (probable frequency of loss events) x (probable magnitude of loss)

    Risk must be a probability issue, and it needs to be expressed as a derived value: how frequently something bad will happen, and how much it will most likely hurt (a toy calculation along these lines appears at the end of this comment). I recommend using FAIR (see the Open Group website) as a means to derive risk. FAIR was developed by a Fortune 100 CISO who had a similar problem.

    It is a Bayesian network for risk expression, which results in the best probability estimate that your prior information will allow, but more importantly it will help you work with auditors and the data owners to identify any dispute about the amount of risk the organization has by working through the composite factors involved. FAIR also provides KPIs for discrete risk issues.

    Next, you need to expend whatever political capital is involved and get some flavor of Risk Tolerance/Appetite from the C-Suite. A 15-minute meeting with the CFO, with the right questions prepared ahead of time, should suffice. Join ISACA and find someone who is all hyped up on COSO. The COSO evangelist will likely help you develop the right questions for the price of a nice lunch. There are good things and things that suck about COSO, but you can use the "Internal Environment" and "Objective Setting" functions of COSO to develop a risk tolerance.

    Finally, you need to stop thinking about security in terms of IP addresses, and think in terms of the business processes they support. Businesses, outside of Information Security Departments, usually couldn't give a rats@ss about what a scanner says about an IP address. They want to know the risk (FAIR, above) around the business process that makes them money.

    Let me also suggest that if you're already feeling commoditized there, the business isn't going to care about "compliance" either. Hitting them over the head constantly with a large GLBA/HIPAA/PCI/SOX/Whatever hammer might get you some budget, but it's not going to get you credibility.

    I'd also work with your CISO to get the company to change the name of your group to Information Risk Management to better reflect your value to the company. You may also want to join the SecurityCatalyst.com website (smart people there) and subscribe to the RSS feed of the Security Bloggers Network on Feedburner.
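
    As a footnote to the FAIR formula above, here is a toy calculation of the frequency-times-magnitude idea, sketched in Python. The numbers are invented purely to show the arithmetic; they are not drawn from FAIR's actual calibration ranges.

        # Toy annualized-risk calculation (sketch): risk = probable frequency x probable magnitude.
        # All figures below are invented for illustration.
        events_per_year = 0.25        # probable loss-event frequency: one event every four years
        loss_per_event = 400_000.00   # probable magnitude: most-likely cost of one event

        annualized_risk = events_per_year * loss_per_event
        print(f"Annualized risk exposure: ${annualized_risk:,.0f}")          # $100,000

        # Comparing a control that halves the event frequency against its cost:
        mitigated_risk = (events_per_year / 2) * loss_per_event              # $50,000
        control_cost = 30_000.00
        reduction = annualized_risk - mitigated_risk
        print(f"Risk reduction ${reduction:,.0f} vs. control cost ${control_cost:,.0f}")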
