What Should We Do About Security Ethics?
An anonymous reader writes "I am a senior security xxx in a Fortune 300 company and I am very frustrated at what I see. I see our customers turn a blind eye to blatant security issues in the name of application or business requirements. I see our own senior officers reduce the risk ratings of internal findings, and even strong-arm 3rd-party auditors/testers into reducing their risk ratings under the threat of losing our business. It's truly sad that the fear of losing our jobs and the need to support our families come before the security of highly confidential information. All so executives can look good and make their bonuses? How should people start blowing the whistle on companies like this?"
Ethics? Where? On Slashdot? (Score:3, Interesting)
We have laws to restrict what people do and police to enforce those laws.
I know of one client that, in an attempt to win a Federal contract, created a multi-million-dollar program just to meet the "green" requirements that the Federal government is placing on new contracts.
Turns out, nothing much is being done except the bare minimum.
What is ethical is very different from that which is legal.
Because of my personal beliefs which stem from an often insulted and bashed faith, constantly mocked here on Slashdot, I do not sell the information I am privy to.
Essay: Catch 222-22-2222 (Score:4, Interesting)
http://www.aarongreenspan.com/essays/index.html?id=9 [aarongreenspan.com]
The sad fact is that I don't report flaws anymore because I've been threatened too many times.
Not much (Score:5, Interesting)
I don't see how there is much you can do. There was an article here a few months ago about a group that started sending out bad XML because too many people were using the DTD they were hosting, to the tune of tens of thousands of completely unnecessary hits a day.
The company I work for (not Fortune 500, smaller) sees some stuff that continues to floor me. Our dealings are mostly transactions of information (containing important things like bank accounts) between our computers and those of other companies. We have had to, quite a few times, flat out turn people down because they refuse to run securely. Not for lacking massive DB encryption. Not for failing to hash everything. Just for not using SSL, an easy-to-implement addition on top of HTTP (which carries our conversations with people).
Every two months or so, we are put in the position of telling people that the SSL certificate on their production system expired last night. This usually entails a discussion as to why we can't just let them slide, or give them a day, etc. We've had people switch from good SSL certificates issued by perfectly valid authorities to self-signed certificates.
In fact the expiration problem happened enough that someone seriously suggested we consider making a little program to check people's certificates and warn us when they were going to expire so we could warn them. Things got better and it didn't happen. Many people just don't care.
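That suggested little program is only a few lines of Python. Here's a hedged sketch; the partner hostnames and the 30-day threshold are made up, and note that an already-expired certificate fails the TLS handshake itself rather than reporting a date:

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse the 'notAfter' string format returned by ssl's getpeercert(),
    e.g. 'Jun  1 12:00:00 2025 GMT'."""
    dt = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return dt.replace(tzinfo=timezone.utc)

def days_until_expiry(host: str, port: int = 443) -> int:
    """Connect to host:port over TLS and return days until its cert expires.
    An already-expired cert raises SSLCertVerificationError during the
    handshake, which is itself a warning sign worth catching."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = parse_not_after(cert["notAfter"])
    return (expires - datetime.now(timezone.utc)).days

# Usage (hypothetical partner hosts, hypothetical 30-day warning window):
#   for host in ("partner1.example.com", "partner2.example.com"):
#       if days_until_expiry(host) < 30:
#           print(f"WARNING: {host} cert expires soon")
```

Run from cron once a day against the partners you exchange data with and you'd never be surprised by an expiry again.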
I'm not sure how this happens either. We recently let a certificate lapse on a domain we stopped using and gave up on. For the 6 months before it expired I got emails from the certifying company up to one every 2 weeks or so at the end. Then they called our office to make sure we knew it was about to expire and to find out if we really wanted that to happen. Then today, a few weeks after it expired, I got an email reminding me that it expired and they'd be glad to renew it. I don't know how many companies are this proactive about renewing SSL certs, but I'd have had to have my head buried pretty far in the sand to not have noticed all that.
We've seen plenty of poor security designs. I don't expect other operations to be perfectly secure. But the number of these companies who seem either ignorant or dismissive of SSL continues to surprise me from time to time.
Best advice? If you can at all, shut them down. Very few of the companies we have worked with have been very nice about turning on SSL. Some have said "just add an S to the URL" (it was secure; they just hadn't given us that URL). Some have said "sorry, we'll get that right up". More than a few have not been that easy. Turning people off is the best power we have. If your contracts are big enough (as a Fortune 300 company, they might be) you could try to put security provisions in them with penalties for shenanigans. But we've found that when discussions aren't working, just disconnecting people usually gets their attention.
Bosses don't fear security breaches (Score:4, Interesting)
It is strange. We can't let a piece of equipment that isn't UL approved within a mile of our building. We have a guy whose whole job is to audit all the equipment and make sure it conforms. Security, on the other hand, isn't audited. The bosses sure don't fear us the way they fear the outside people who do all the other audits.
Clearly it would be a good thing if someone were setting standards for security the way UL does for electrical equipment. It would be good to have outside auditors. Only then will the in-house security people get any respect.
Re:Wikileaks (Score:2, Interesting)
Re:Wikileaks (Score:5, Interesting)
Now, I am one of these employees and I'm not going to watch my job burn because the government is hiding blatant security problems. The next person who comes in will comply the same way, and I'd be left searching for a new job. No. What I do is purposely delay audit results. Miss a deadline here and there. Specifically mention other areas of concern while satisfying the customer by fast-talking through another area. The result? It turned your government's security-findings report from a B to a D. This past year sucked, work-wise, but we're far more secure now than we were a year ago.
Just to scare you some more: we were sending backup tapes offsite without using encryption. We also didn't encrypt our laptops until the day before the government-stipulated deadline. The best one? One of our budget management systems runs a public X server as root. Guess what else? We hold tons of medical, legal, and personal information for a very large number of you Americans. Yeah.
You're damn right we need to change how we address security concerns. I have no ideas on how to change this, so I will continue to be very cautious in my personal life. I will also continue to take contracts like this to ensure I can feed my family for the next couple of decades.
I'm at a fortune 150 company (Score:1, Interesting)
Number one, be a successful example of your policies.
Number two, understand: you are expendable, and security is not a tangible deliverable to many. Strong-arming people is the worst move that can be made; it will alienate your team. Security is extremely important, but getting a product across the finish line is even more important. If you stand in the way of delivery, the barrier will be removed. If your security offerings help deliver a product faster and cheaper, then you'll be a hero.
Here is why I say these two things.
In my environment the security group is the worst example of security as a process, so nobody takes them seriously. People across the board are actually writing code to work around their systems because we need to deliver a product. It is ironic that in the latest audit, they failed worse than the other groups, more or less because they didn't follow the enterprise security standards.
I can assure you that the barriers at some point will be removed one way or another.
Pick your battles, and be strategic.
Re:There are very few ethical companies. (Score:4, Interesting)
I agree with these two statements 100%, however...
And why bother about security ethics when there are much more important ethical considerations like how they treat staff? Again, most companies screw most of their staff to the limit of the law.
Treatment of staff is a straw man; it has no bearing on whether security is an issue. I was employed at a medical software company that did not treat its staff terribly yet managed to deploy products that were genuinely unsafe. This was in the imaging dept. of a medical records company; imaging handled diagnostic images as well as records for archival. It needed to be fully HIPAA [hhs.gov] compliant and was nowhere close. While treatment of staff was decent, security around medical records/images was not. In that domain, security is a far bigger priority than how the staff is treated.
Re:Three Words: (Score:3, Interesting)
Change starts with you. (Score:3, Interesting)
Sarbanes-Oxley (Score:3, Interesting)
This law makes the company's CEO personally responsible for any material misstatements. If the security issue in question involves financial information, or would affect the company's financial standing in the eyes of investors, it cannot legally be covered up.
There may also be other regulatory agencies involved, such as the FDA, FAA, etc.
If this is the case, tell the people pushing for the cover-up that you will gladly comply. But after the sh*t hits the fan, you will visit the CEO in prison and tell him/her exactly who was responsible for generating the misstatements.
IANAL, so you should check with one first.
Re:There are very few ethical companies. (Score:5, Interesting)
Don't even get me started. I work at a company which makes document imaging software, and our customers send us all kinds of crap that, honestly, scares the shit out of me. Not to mention information specifically protected by law. Most of the time, I get the sense that the sender didn't even remotely think about it. All they know is "this is not viewing/printing how it should," and so off they send it, as an attachment to unencrypted email.
So now I am put in the position of: do I actually work on the client's problem? Or do I immediately destroy the information and tell them they are a dumbass? You know what the reality is? The highly sensitive document gets printed out, sometimes hundreds of times (as I tweak things during the debugging process), and I try to shred everything, but when there are hundreds of copies, I'm sure I've missed one. If I were unscrupulous I could have made several million dollars off the information I see on a daily basis, and I'm not exaggerating. Millions. Honestly, it pisses me off.
Re:What Should We Do About Security Ethics? (Score:3, Interesting)
Laws written by legislators that understand tech (Score:4, Interesting)
Today: We are in the phase of judges trying to claim that putting a program into RAM might be an illegal copy process, and demanding a core dump as evidence.
The Future: We need mandatory hard records of specified sensitive transactions (e.g.: e-voting, health, finance), we need whistleblower laws that protect what would otherwise be considered improper employee investigation and documentation of ephemeral computer records (it looks a lot like espionage), and we need legislators that understand the technology economy, and know where new laws are needed, and where the old ones will suffice.
Then we need to fund enforcement, which has taken a dive in recent years.
The newly qualified legislators are scheduled to arrive in Congress in about 20-40 years, if the older tech-savvy generation can teach the new aspirants to value their own privacy, and get them to understand that the Fifth Amendment doesn't apply if you put it all up on MySpace. I have confidence that these qualified people will eventually come to Congress.
Until then, enjoy the wait. In the short term, enforcement money, and will, has been gutted. In the long term, the Congress is not yet savvy to these issues, so the law is inadequate, and new law is written by lobbyists who want less accountability, not more.
Unfortunately, you don't have a leg to stand on while we amend the unintended consequences of our move to the "paperless society." I'm sorry.
--
Toro
Re:There are very few ethical companies. (Score:5, Interesting)
I remember in my days consulting, I got sent a DB to look at. This DB held all the personal information for everyone who was worth over $X. The DB contained SSNs, spouses' names, spouses' SSNs, etc. As soon as I saw this DB, I asked where the NDA for it was. When I was told there was no NDA sent over, I felt sorry for everyone whose information was in there.
Re:Essay: Catch 222-22-2222 (Score:1, Interesting)
http://www.aarongreenspan.com/essays/index.html?id=9 [aarongreenspan.com]
The sad fact is that I don't report flaws anymore because I've been threatened too many times.
While companies do get defensive, once I mention having spoken to the feds already, I never hear from lawyers. I think in a way, I've covered my ass in advance. And problems do seem to get fixed. And it never hits the papers, making everyone more or less happy.
Sure, nobody knows who I am, and my name isn't on anyone's "clever security guy" list. While I sure could have used the publicity as a consultant, that isn't really the point. The only thing is that I never seem to be able to get a gig out of any of these findings, even though it would have made perfect sense to hire me to help. THAT one I haven't figured out yet. Oh well.
Here's an interesting thought: (Score:5, Interesting)
When my company audits you and attests to the controls being in place and operating effectively, they essentially take legal responsibility for your internal controls. If we get strong-armed or bought off and decided to cover it up (which has never happened in my experience), we are on the legal hook for the results. We can be sued. The CPA that signs off on the audit can lose his license and get in all kinds of other trouble.
If one wanted to keep one's job, but wanted to whistleblow on this situation, one might be prudent to blow the whistle on the auditors (to the AICPA) for materially misstating the operating effectiveness of your company's controls. The auditors take the fall, and your company gets a pass by saying "Hey, we didn't know, they signed off on it!", and subsequently tightening up controls to ensure that no eyebrows are raised in the future.
Food for thought.
Stop Being A Security Professional and use Risk (Score:4, Interesting)
So I'd first get a proper definition of risk. I'd start with:
risk = (probable frequency of loss) x (probable magnitude of loss)
Risk must be a probability issue, and it needs to be expressed as a derived value (how frequently something bad will happen, and how much it will most likely hurt). I recommend using FAIR (see the Open Group website) as a means to derive risk. FAIR was developed by a Fortune 100 CISO who had a similar problem.
It is a Bayesian network for risk expression, which results in the best probability outcome that your prior information will allow. More importantly, it will help you work with auditors and the data owners to resolve any dispute about the amount of risk the organization carries by working through the composite factors involved. FAIR also provides KPIs for discrete risk issues.
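As a toy illustration of that frequency-times-magnitude definition (the scenarios and dollar figures below are invented; a real FAIR analysis decomposes both factors much further and uses calibrated ranges, not point guesses):

```python
# Annualized-loss sketch of risk = probable frequency x probable magnitude.
def annualized_loss(frequency_per_year: float, magnitude_per_event: float) -> float:
    """Expected loss per year for one risk scenario."""
    return frequency_per_year * magnitude_per_event

# Hypothetical scenarios: (loss events per year, dollars lost per event)
scenarios = {
    "lost backup tape": (0.5, 200_000),    # once every two years, $200k each
    "expired SSL cert": (6.0, 5_000),      # six outages a year, $5k each
    "database breach":  (0.1, 2_000_000),  # once a decade, $2M each
}

# Rank scenarios by expected annual loss, biggest first.
for name, (freq, magnitude) in sorted(
        scenarios.items(),
        key=lambda item: annualized_loss(*item[1]),
        reverse=True):
    print(f"{name}: ${annualized_loss(freq, magnitude):,.0f}/year")
```

Even this crude ranking gives the business something it can argue about in dollars per year, which is a conversation a scanner report never starts.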
Next, you need to expend whatever political capital is involved and get some flavor of Risk Tolerance/Appetite from the C-Suite. A 15-minute meeting with the CFO, with the right questions prepared ahead of time, should suffice. Join ISACA and find someone who is all hyped up on COSO. The COSO evangelist will likely help you develop the right questions for the price of a nice lunch. There are good things and things that suck about COSO, but you can use the "Internal Environment" and "Objective Setting" functions of COSO to develop a risk tolerance.
Finally, you need to stop thinking about security in terms of IP addresses, and think in terms of the business processes they support. Businesses, outside of Information Security Departments, usually couldn't give a rats@ss about what a scanner says about an IP address. They want to know the risk (FAIR, above) around the business process that makes them money.
Let me also suggest that if you're already feeling commoditized there, the business isn't going to care about "compliance" either. Hitting them over the head constantly with a large GLBA/HIPAA/PCI/SOX/Whatever hammer might get you some budget, but it's not going to get you credibility.
I'd also work with your CISO to get the company to change the name of your group to Information Risk Management to better reflect your value to the company. You may also want to join the SecurityCatalyst.com website (smart people there) and subscribe to the RSS feed of the Security Bloggers Network on Feedburner.
Re:What Should We Do About Security Ethics? (Score:4, Interesting)
Rule Number 2 (Score:1, Interesting)
Rule #2 - The business makes the decisions, not security or audit or legal. Everyone serves the business, not security, audit, or legal (or your dept).
As a CISSP and security director, my job is NOT to lock down the business and fix every security hole. (Hang on!) My job is to discover the risk, document it, determine how to mitigate it, determine whether the benefit of mitigation outweighs the $ cost and the user "cost" in productivity, and relay all that information to management (keep some proof you did) so THEY can decide whether they want to assume, mitigate, transfer, or ignore the risk.
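The mitigate-or-accept comparison in that job description can be sketched in a few lines (all dollar figures here are hypothetical):

```python
# Sketch of the accept-vs-mitigate decision described above: mitigation is
# worth it when the annual loss it removes exceeds what it costs per year.
def worth_mitigating(expected_annual_loss: float,
                     residual_annual_loss: float,
                     mitigation_cost_per_year: float) -> bool:
    """True when the loss reduction pays for the control."""
    return (expected_annual_loss - residual_annual_loss) > mitigation_cost_per_year

# e.g. encrypting backup tapes: cuts a $100k/yr exposure to $10k/yr for $25k/yr
print(worth_mitigating(100_000, 10_000, 25_000))  # True: saves $90k for $25k
```

The point isn't the arithmetic; it's that you hand management the numbers and the decision, documented, stays theirs.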
If they make a poor decision in your opinion, you challenge them gently with facts that SPEAK TO THE BUSINESS. If you can't do that, shut up, as you're wasting everyone's time and the shareholders' money.
If you challenge MGMT appropriately and it ignores you, you've done your job. Keep harping about it and you'll get fired. Since you raised the issue, you can maintain your ethics and sleep somewhat at night.
If you warn them, mgmt ignores it, and then it happens, you look good. Of course, you may get fired anyway, but at least you can stand tall and tell the next interviewer that you raised the issue proactively, it was rejected, and you were fired as a scapegoat. Not a bad reference in the security world.
Just stop trying to do management's job. Focus on doing your job well and that should keep you busy enough. If you don't like how mgmt handles things, go somewhere else. But beware: there are few "somewhere elses" that are any different.
It's now common practice to copy company files (Score:1, Interesting)
People leverage these copied files to get a job at another company, and then use them as templates or models for the documents/processes/forms they have to produce in their new job. Some even share these kinds of documents with friends at the local bar, in a scratch-my-back, I'll-scratch-yours kind of mutual-aid society.
There is little to no corporate security to protect against this kind of wholesale data copying. If you glue up the USB ports, then people will set up a web server on their laptop so that they can copy onto their home PC. Or they boot one of the CD versions of Linux to copy files off the Windows hard drive.
Fact is, most corporate info is not proprietary enough or secret enough to warrant high security. It is only the work of a few high-ranking executives, and of people doing M&A work, that needs it. Most companies benefit from this copying as well, because new employees bring already-designed processes, documents, and forms from their previous company. Instead of rebuilding from scratch, they just edit a new draft.