Thinking of Security Vulnerabilities As Defects
SecureThroughObscure writes "ZDNet Zero-Day blogger Nate McFeters has asked the question, 'Should vulnerabilities be treated as defects?' McFeters claims that if vulnerabilities were treated as product defects, companies would have an effective way of forcing developers and business units to focus on security issues. McFeters suggests providing bonuses for good developers, and taking away from the bonuses of those who can't keep up. It's an interesting approach that, if used, might force companies to take a stronger stance on security-related issues."
Of course vulnerabilities are defects (Score:5, Insightful)
Thread over. (Score:5, Funny)
Re:Of course vulnerabilities are defects (Score:5, Informative)
but then what do you call design features like windows networking telling you if you got the first letter of your password right, even without the rest of the password, and then letting you do that for the next letter, and so on and so on.
It was a feature of early Windows networking to do just that! Like people might 'forget' their password, so they would 'need' a feature that would tell them, letter by letter, if they were getting warmer on remembering the password! Hackers had a FIELD day with various 'features' of Microsoft products.
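A minimal sketch (in C, and emphatically not the real SMB protocol) of why a letter-by-letter confirmation is so exploitable: the hypothetical prefix_ok() oracle below stands in for the old per-character check, and the attacker recovers the password in at most length x charset-size guesses instead of charset-size^length.

    /* Hypothetical prefix oracle standing in for the old per-letter check;
     * the secret, charset, and known length are assumptions for brevity. */
    #include <stdio.h>
    #include <string.h>

    static const char *SECRET = "hunter2";
    static const char CHARSET[] = "abcdefghijklmnopqrstuvwxyz0123456789";

    /* Returns 1 if `guess` matches the first strlen(guess) characters of the secret. */
    static int prefix_ok(const char *guess)
    {
        return strncmp(SECRET, guess, strlen(guess)) == 0;
    }

    int main(void)
    {
        char recovered[64] = "";
        size_t pos = 0;

        while (pos < strlen(SECRET)) {        /* length assumed known to keep the sketch short */
            for (size_t i = 0; CHARSET[i] != '\0'; i++) {
                recovered[pos] = CHARSET[i];
                recovered[pos + 1] = '\0';
                if (prefix_ok(recovered)) {   /* one more letter confirmed */
                    pos++;
                    break;
                }
            }
        }
        printf("recovered: %s\n", recovered); /* prints "hunter2" after roughly 36*7 oracle calls */
        return 0;
    }

With an all-or-nothing check, the same password would take on the order of 36^7 guesses, which is the whole point of not confirming partial matches.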
Re: (Score:3, Insightful)
Re: (Score:2)
... or legal telling you it's a good idea to include flaws in the design, so people can't sue.
No jokes about legal department being a mental defect, ok?
Re: (Score:3, Insightful)
but then what do you call design features like windows networking telling you if you got the first letter of your password right, even without the rest of the password, and then letting you do that for the next letter, and so on and so on.
Seem to me that is still a defect. Not in the software itself, but in the software design.
Re: (Score:3, Informative)
Keep in mind the original RFC for SMB file sharing had nothing about encrypting the password. Networks were new, the internet nonexistent; we're talking 1979 or earlier here. The original Windows SMB file sharing was over NetBIOS, not even over TCP/IP (because Windows had no TCP/IP stack), so having SMB tell you each letter of the password was individually correct was more along the lines of routers coming with the default password of 'admin' before they're configured or if they're manually reset...
it se
Re: (Score:2)
Yeah, it was quite astonishing the first time I saw an SMB password cracking tool in action. It looked just like how cracking tends to be portrayed in hollywood movies, a flurry of letters scrolling by, freezing one place at a time as the correct letters built the whole password, the whole operation finished in mere seconds. I had always sneered at that portrayal because it should be impossible to crack something that way, since who would be retarded enough to design a system where it would be possible? I g
Re: (Score:3, Insightful)
This example is certainly not ideal as it does not involve software design, but it is analogous and one that I have personally seen happen. Consid
Re: (Score:3, Insightful)
"Was the lack of security and the potential vulnerabilities a defect or a design flaw for the small company?"
How can somebody twist a simple concept into such a contorted one?
A defect is nothing more and nothing less than something not working as expected. If something is there by a conscious decision, it's a feature; if something is misbehaving, it's a defect. It's as simple as that. Really.
Now, on defects: if something works as designed, but the designers didn't think in advance of a given (misbehaving) situatio
Re: (Score:3, Interesting)
If something is working as designed and the designer doesn't see some behaviour as misbehaviour, then it's a systemic defect (either an unethical seller or an idiot/uninformed buyer).
Or the designer is an idiot. Been there, had to fight long and hard: first I struggled with coders who said the code was according to spec. Second, I had to struggle with the designer, who clearly didn't understand what the system was supposed to do, because logically and business-wise it made no sense, but somehow he thought the design was right. It was close, but he had mixed up two columns of data with similar content and used the wrong one. I finally got through and got it categorized as a defect and
Re: (Score:2)
"The point I was trying to make, and perhaps I did not make it as clearly as I could, is that you can have security vulnerabilities that are not defects."
I saw your point, but I think "the point" now is about the definition of vulnerability. You can go as theoretical as you want but, at the end of the day, more or less the same operational definition I gave for "defect" is just as valid for "vulnerability": if it works as expected, then it's not a vulnerability as it is not considered a vulnerability that
Re: (Score:3, Insightful)
As the company grows, this will become unacceptable. But once security is laid on, now you have to make sure that everyone has the right permissions to read the documents they need. It adds layers of overhead and costs usability. It increases security, but that security comes at the price of tremendous man-hours for a select few domain admins, and often forces users to wait for a designated admin when they need basic things like software installed.
Two things... The first is that therein lies the rub. Permissions are a flaked out aspect. Why? See the second point. The unfortunate reality is that when there are permissions there are also methods to enforce them. Second? You're describing, to the letter, DRM and we can't have a discussion about DRM as a viable tool. The reality is that CHMOD is a basic form of DRM.
So security, as you state, is a matter of usability vs. security. No online computer is secure, not any. To deny that is just blind. If it
Re: (Score:3, Informative)
The reality is that CHMOD is a basic form of DRM.
I must say, my keyboard would be absolutely soaked if I was drinking something right now. My maildirs and private keys? Not yours and I'm not setting public read. httpd.conf? Sure, I'll set public read and owner write so you can look at it and even copy it for non-disruptive editing for whatever reasons. It's also worth noting that a lot of 'users' don't even have people behind them on a typical home system.
It would be DRM if RO access meant you couldn't edit the copy. Moreover, all current incarnations of
Re: (Score:3, Insightful)
The current restrictions (and methods) of the widely used copy protection are barbaric and disturb me greatly. The way that the term DRM has been used has pretty much tarnished its reputation forever and some people have come to the conclusion th
NO! Otherway Round! Defects _are_ Vulnerabilities! (Score:3, Insightful)
The vast majority of security vulnerabilities are merely exploits of defects!
How do you hack a system? Find a bug, that's usually pretty easy....
Then you have the system operating already, "not as the designer intended" and you're more than halfway there...just add a bit of creativity and malice aforethought.
Re: (Score:3, Insightful)
Of course vulnerabilities are defects
If they were defects in the eyes of USA law, they'd be considered a material defect or design defect under existing contract or product liability law respectively.
There are a few possible outcomes from such a scenario
A) Nobody writes software anymore because they'd be sued into oblivion
B) Prices go up because coders & companies have to buy software defect insurance
C) Prices go up because companies spend more in labor to produce defect free code
D) The EULA lists every possible failure scenario (plausible
Re: (Score:3, Insightful)
If problems on cars were defects in the eyes of USA law, they'd be considered a material defect or design defect under existing contract or product liability law respectively.
There are a few possible outcomes from such a scenario
A) Nobody builds cars anymore because they'd be sued into oblivion
B) Car prices go up because builders & sellers have to buy car defect insurance
C) Prices go up because companies spend more in labor to produce defect free cars
D) The EULA lists every possible failure scenario (pl
Re: (Score:3, Insightful)
Except, when I bought a new car, there was a small defect in the paint job -- a nearly unnoticeable paint bubble. I'm sure that every car that comes off the lot has a blemish somewhere. Doesn't cause the car to crash, and life goes on. Same thing ain't true with software -- that same "blemish" could easily be turned around and allow someone to break into the software.
The only way we could get comparable results with software v.s. physical objects is if computer systems develop the ability to withstand a c
Re: (Score:2)
D) The EULA lists every possible failure scenario (plausible or not) in the interests of full disclosure and business continues as usual
Actually, IBM discovered all this decades ago. One of the ongoing jokes in IBM systems, dating at least from the 1960s, is about a customer documenting and reporting a bug, only to be pointed at page 485 in volume 17 of the documentation, where exactly that behavior is documented. And since it's documented in the manual, "It's not a bug; it's a feature" and won't be fixed
Re: (Score:2)
If you design a bridge for free and there's a defect so it collapses and kills somebody, would you be held liable?
I don't see how things will become cheaper. Things might become more reliable, but cheaper?
People might stop designing crappy bridges for free (bridges that "sorta work" most of the time). Take PHP and MySQL as examples: they're pretty crap, but lots of people seem to think they work well enough.
We don't sue suse/redhat if there's a problem, b
Re: (Score:2)
No.
Absolutely, but that has nothing to do with the UCC [wikipedia.org] which covers commerce.
See, when you buy something, you have certain rights; it has to work as advertised, or the seller has to pay damages.
If someone gives you a car that doesn't work, versus someone selling you a car that doesn't work, would be closer to the analogy.
Re: (Score:2)
Sadly, most managers only recognize a defect if it is something likely to be noticed by their customers. Unless your customers are hackers, they'll never discover the problem, so there's no need to test for them.
The biggest problem, in my opinion, is that management as a whole is generally "success based", which means failures are swept under the carpet or otherwise ignored. This means the company as a whole never learns from its mistakes or avoids them in the future. Everyone runs around talking about how great e
Vulnerabilities are not always defects (Score:2)
If they weren't, they would be in the program design.
"It's not a bug; it's a feature."
That old joke isn't always a joke. Some vulnerabilities are built in, because they were put there intentionally by the designers and/or developers.
This is one of the primary arguments behind Open Source: If you can't get at the code and study it, you don't have any idea what "special features" might be hidden in there. The people who built it could have provided all sorts of back doors for exploitation by themselves or b
No (Score:3, Funny)
Re: (Score:2)
Since we already call "bugs" defects ... umm ... on second thought ...
Seriously, security faults, "bugs", "features" that aren't, etc., are defects. They're mistakes. Errors. They didn't just crawl in there accidentally, no matter how much we strive to give them independent life by calling them "bugs" or "random behaviour". Not on deterministic systems like computers.
No - they go beyond application level defects (Score:4, Interesting)
The elephant in the room is that primitive, unsafe tools endlessly perpetuate these problems. Buffer over/underflows are not difficult problems to solve at the language-design level, but the common tools we currently use to create applications make diagnosing and fixing them rocket science. C and C++ (and other lesser-used languages) are notorious for being hostile to catching these problems at compile time or debugging them when they happen later. In most cases, the problem goes "unnoticed", affecting unrelated functions in the application downstream, and incorrect behavior or crashes happen at a later time when they can no longer be traced back to the original cause.
For kicks check http://en.wikipedia.org/wiki/Buffer_overflow#Choice_of_programming_language [wikipedia.org]
Google search on http://www.google.com/search?hl=en&q=%2Bbuffer+%2B%22overflow%7Coverrun%7Cunderrun%22&btnG=Search [google.com]
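A minimal sketch of the defect class the parent describes; the function names are hypothetical and nothing here comes from any real product. The unchecked strcpy() compiles without complaint, while the bounded snprintf() merely truncates; a sanitizer such as gcc/clang's -fsanitize=address will flag the unsafe copy at run time, which is exactly the kind of tooling support the parent is asking for.

    #include <stdio.h>
    #include <string.h>

    static void parse_unsafe(const char *input)
    {
        char buf[16];
        strcpy(buf, input);             /* defect: no bounds check at all */
        printf("parsed: %s\n", buf);
    }

    static void parse_safe(const char *input)
    {
        char buf[16];
        /* snprintf never writes past sizeof(buf) and always NUL-terminates */
        snprintf(buf, sizeof(buf), "%s", input);
        printf("parsed: %s\n", buf);
    }

    int main(void)
    {
        const char *hostile = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"; /* 36 bytes */
        parse_safe(hostile);            /* truncates and keeps running */
        parse_unsafe(hostile);          /* writes past buf: undefined behaviour */
        return 0;
    }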
Re: (Score:2)
What a load of crap. FORTH is as primitive and unsafe as they come, and FORTH programmers don't have to deal with over/underflows the way incompetent C++ programmers have to.
If users knew that there was a correlation between competence and bug-free, problem-free code, they'd stop accepting crap. Instead, there are a lot of programmers, some good and some second-rate, defending bugs and security problems as mere accidents at worst, the kind everyone makes.
Instead, we have this culture that has convinced the user to a
Re: (Score:2)
One reason is because people hardly ever bother to exploit FORTH programs or if they do they don't usually make so much noise about it. I crashed a forth webserver on my first try. But guess what, who cares. If you could crash Apache or IIS now at your first try, it's worth $$$.
But yeah, C++ is overused. Crappy programmers like me should stay away
No. They'd get sued (Score:2)
They'd get sued out of existence for shipping defective products. I can't see any company agreeing to label its products in such a way.
Re:No. They'd get sued (Score:5, Informative)
The article (at least in my reading) isn't saying that they should be held legally accountable as selling a defective product. Instead it's about how companies should approach a bug report of a vulnerability. He's saying, when someone reports a vulnerability, consider it something that you're obligated to fix, not as a feature request.
But then, I think most people do. It seems like he hit a bad support person.
I ran into a similar problem once with Citrix, actually. Their software was relying on some library that it assumed was installed, even though recent Linux releases (at the time) had stopped using that library. The result was that the software didn't work until you tracked down that library, dropped it in the right place, and then it worked fine.
So I went to their website to give feedback, just to let them know. I mean, I'm sure they would have figured it out, but I thought, "may as well give them a heads up" because it was happening on major linux distros almost a year after their release. Citrix had released several updates to their software, and never fixed this problem. I couldn't find anyplace on their website to provide feedback, except for a form to give feedback about the website itself.
So I wrote up a little feedback, trying to explain the situation briefly (i.e. "I wanted to drop some feedback to your development team letting them know there's a problem and how to fix it, but I can't find any contact information on your website. Is there any way to submit this sort of feedback?"). The response came back quickly: "If you want support, you'll have to pay for a support contract."
I wrote back again, trying to explain, "No, see, I'm not looking for help, I'm trying to be helpful. I'm letting you know that there's a problem I already know how to fix. I was just wondering if there was a place to submit this sort of feedback."
Again, the response came in, "I'm sorry sir, but if you want us to help you with this problem, you'll need to buy our support contract."
At that point, I gave up.
Re: (Score:2)
"when someone reports a vulnerability, consider it something that you're obligated to fix, not as a feature request."
Why would any company in the world do something like that!!!???
Oh, yes, only if they are legally or financially forced, that's how.
Or do you think any company in the world would raise their production costs for no benefit?
Re: (Score:2)
Re: (Score:2)
"What about customer satisfaction"
bollocks
"and the financial detriment of losing your customers"
Which part of "legally or financially forced" didn't you understand?
Re: (Score:2)
Re: (Score:2)
Wow. You'd never expect a significant software company to do anything like that.
Wait... That process is *exactly* the Microsoft process.
Re: (Score:2)
Very true. Also, the bonus system won't work for two reasons:
1: Security bugs tend to be discovered years down the road.
2: A lot of companies would have to start paying bonuses to programmers, and not just to S&M. That'd never float.
Re: (Score:2)
It's not so much a legal question as a business problem. Vendors are obligated to fix their bugs and security holes because they are problems for customers.
I would just leave the courts out of it and let the free market decide. In this particular case, a good response would be: (A) I have reported this to management and we will seek other vendors to suit our needs. (B) I am obligated to report this security hole to the usual vulnerability lists, because since I found it obviously somebody else could too.
I
wait...what? (Score:2)
There are companies that DON'T treat security vulnerabilities as defects??
Re: (Score:2)
Re:wait...what? (Score:4, Funny)
Re: (Score:2)
Why is this even a question? (Score:5, Interesting)
To the best of our knowledge we've never had a remote exploit vulnerability, but even so we've gone so far as to scrap thousands of freshly pressed CDs a day before releasing them because I spotted a way to get root access through a tricky bit of business with shared libraries. (And that was for something spotted internally - no customer ever reported it.)
The real question isn't whether to treat security vulnerabilities as a defect - of course you do - but - somewhat paradoxically - whether or not to treat them as security vulnerabilities. We were acquired some time ago and have now adopted (and adapted to) various more complex procedures typical of a large company. There's this little box you're supposed to check in our current bug reporting system that says "this is a security vulnerability". The problem is that checking that box fires up a whole lot of extra process that rarely helps and can actually hinder prompt resolution of the problem and getting the fix into customers' hands.
Re: (Score:2)
Because it leads to the line of thought that one might hold a company liable for these defects, the repercussions of which would imply large changes in the approach to software development, in the support of existing software, and in regard to the current state of widely used software.
The real question isn't whether to treat security vulnerabilities as a defect - of course you do - but - somewhat paradoxically - whether or not to treat them as security vulnerabilities.
Re:Why is this even a question? (Score:5, Interesting)
We've treated potential vulnerabilities in our products, even extremely minor ones, as defects for over two decades now. And we have always given them very high priority.
Go ask your corporate legal counsel what would happen if the law treated software vulnerabilities as design defects.
Re: (Score:2)
Go ask your corporate legal counsel what would happen if the law treated software vulnerabilities as design defects.
Umm... nothing? Last I checked pretty much every EULA disclaimed any liability for defective design along with everything else.
Vote with your money (Score:2)
Here's a summary of the article:
Vendors should make their developers work more on security (via money)
Meh. How often are the developers free to choose which parts and aspects of their companies' software they want to work on? As long as the companies tell their developers what to work on, here's the easy way: tell the devs to work on security testing and fixing. Letting the developers manage themselves is not going to sit well with management types, so you almost always have developers who are told what to work on.
Also, if anything external to the way you work (i.e. the pr
Generic unrelated subject (Score:3, Insightful)
Everybody, please laugh at the subject of my post which has no relation to its contents ;)
What I meant to write when I wrote the subject is that, from the point of view external to the organization developing insecure software, you are, according to the wisdom of the /. masses, supposed to vote with your wallet.
Yet, how's that expected to take place? To apply some of Schneier's observations, you have multiple parties, each with their own security agenda; the sysadmin might want the most secure option becau
Re: (Score:3, Insightful)
Also, if anything external to the way you work (i.e. the promise of more money) can make you work better, you're slacking off in your daily work: why don't you deliver peak performance without the extra money?
There's two ways to look at performance vs. compensation. Employees, ideally (at least from the employer's viewpoint) will look at it the way you do: you're being paid to do your best, so you should need no extra incentive to do so. Project management, on the other hand, should be pragmatic about it. Sure, employees SHOULD do their best no matter what, but maybe cash incentives can add motivation. If that is found to be the case, a good manager will choose results over principles.
Re: (Score:2)
If that is found to be the case
I'm waiting for you to google up someone to cite on a verdict ;)
Seriously, that would be interesting to know: how well does cash make people work better, and how does it depend on the "price structure"...
Re: (Score:2)
That's certainly a nice anecdote, which points out my unstated premise (which I thought was obvious): that I'm talking about differing levels of quality of work done in the same amount of time.
Yes (Score:2)
Yes, they are defects.
That said, there's one caveat to it: WHOSE defect is it? If the defect comes from a licensed library, an OS issue, or another hidden cause, then that defect belongs to the source author/vendor.
Sometimes you end up having to work around someone else's crap.
Re: (Score:2)
But that's no different to any other product. Buy defective capacitors from a manufacturer for your product, and your product blows up because of them: it's your responsibility to recall the product and replace the capacitors.
Recall, fix, and sue... (Score:2)
...the source of the crapacitors for fraud.
Re: (Score:2)
That's an afterthought. First you are responsible for making your product safe, even though you didn't manufacture the capacitors.
Intentional misuse (Score:3, Insightful)
If a user was intentionally mis-using software I had written, I wouldn't consider it a bug. Although a vulnerability is generally mis-use by someone other than the owner of that piece of software, I'd still have to conclude it's not a bug. If I'd built a car, I would be more than a little annoyed if I got the blame because someone had broken into it and run someone else over with it.
I think it needs to be left to the market to decide what is acceptably secure software. Many Ford cars from the early 90s had locks that were far too easy to break - just stick a screwdriver in and it opens - even did it myself when I locked the keys in the car once. They got a bad reputation, and Ford improved the security to a level the market was happier with.
The market in software doesn't work quite as well as for cars unfortunately, but that's another issue.
Re: (Score:3, Insightful)
Should they? (Score:2)
These legal concepts apply to cars and other tangible goods but not to software. They should.
Would that not destroy hobby software, and much of OSS and Free Software along with it?
Re: (Score:2)
Re: (Score:2)
They do apply to software. EULAs in most places at least are pretty much unenforceable nonsense. If software doesn't do its job, you can return it for a refund.
Most software vulnerabilities are caused by inputting random or intentionally malicious data. Continuing the car analogy, I can go and pour sugar (or worse) into a car's fuel tank and it will break the car (or explode and kill someone depending on what I used). That isn't a bug. If I do this to my own car I'm completely to blame for my actions, a
Re: (Score:2)
> They do apply to software. EULAs in most places at least are pretty much unenforceable
> nonsense. If software doesn't do its job, you can return it for a refund.
I'm not talking about EULAs. I agree that they are largely unenforceable. As far as I know, in the US what is warranted is the tangible goods: the physical copy. If the CD is scratched you can insist that they make you whole by replacing it with a good one or refunding your money, but the software can be buggy as hell as long as the copy a
Re: (Score:2)
> Although a vulnerability is generally mis-use by someone other than the owner of that piece of software, I'd still have to conclude it's not a bug
So if you wrote, e.g., an FTP server, and a user set it up so that anonymous users have no write permission (read only), and then someone, as an anonymous user, sent an invalid command which caused a buffer overflow and corrupted the whole file system... you would not consider that a bug, because someone had just misused it by using a non-standard command?
Re: (Score:2)
I suggest leaving it to the market to decide precisely because of this sort of issue - an FTP server designed to be openly accessible shouldn't have this kind of vulnerability. Users wouldn't accept this sort of problem, so yes it should be treated as a bug.
Even so, sending a few Kb of malicious data down to the server is hacking and the legal 'blame' for this should be with the hacker not the developer.
But not all software is subject to this requirement. Probably the majority of applications developed ru
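A minimal sketch, in C, of the kind of handling being argued for in the FTP case above; the command names and size limit are only illustrative. Over-long or unknown input from an anonymous client is rejected up front instead of being copied into a fixed buffer.

    #include <stdio.h>
    #include <string.h>

    #define MAX_CMD 512                 /* hypothetical protocol limit */

    static int handle_line(const char *line)
    {
        if (strchr(line, '\n') == NULL) {
            /* The line didn't fit in the buffer: refuse it rather than parse
             * a partial copy (a real server would also drain the rest of the
             * over-long line before reading the next command). */
            fprintf(stderr, "500 Command line too long\n");
            return -1;
        }
        if (strncmp(line, "USER ", 5) == 0 || strncmp(line, "QUIT", 4) == 0) {
            printf("200 OK\n");
            return 0;
        }
        fprintf(stderr, "500 Unknown command\n"); /* non-standard input is not trusted */
        return -1;
    }

    int main(void)
    {
        char line[MAX_CMD];
        while (fgets(line, sizeof(line), stdin) != NULL)
            handle_line(line);
        return 0;
    }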
Re: (Score:2)
Justify that.
This issue is presently a public-relations problem with a public-relations solution. So far, the strategy is to convince the user that they did something wrong and need more software. This works because the user tends to be under-educated, and because both competent and incompetent programmers are lock-step on message: programming is hard, you can't do it, and everyone makes mistakes.
I reject this premise, so I re
Re: (Score:2)
I think this is a good point. I can't really think of any other goods of the same level of complexity as software, that are not regulated and produced according to strict standards. However the cost of failure in software is relatively low - at least in the sense that it can be measured in dollars rather than lives.
Bonuses for good developers? (Score:1)
- difficult or simple code
- amount of code written
- where in the application does the developer code? Some code is more likely to result in security issues.
- how well that person knows the code
Re: (Score:2)
Absolutely (Score:4, Interesting)
As a software developer I spend about a quarter of my time rewriting code that one of our other developers writes. His code looks like a rhesus monkey came in and started flinging shit all around. He 'keeps up' with the other developers because he does the absolute minimum, never ever rewrites code to fix problems, cuts and pastes, etc. One time he cut and pasted a second copy of a 200-line function so he could change one loop constant.
There's lots of developers like him, and they and/or their company should get sued over that code. At least when it is from negligence. Or there should be a licensing requirement.... something so that the people who are irresponsible or incompetent are held responsible for it.
Pretty much the only thing that makes programming not worthwhile is that people can hack out 80%-working code, get credit for it, then move on and leave all the crap for competent developers to fix. I would gladly pay a malpractice insurance fee if it meant having to deal with less bullshit code.
Re: (Score:2)
Tell me you are not so desperate for a job that you would spend 25% of your time fixing a coworker's mistakes? Bring it up with your manager, have his faults explained to him, document it over a month or so, and fire him if he doesn't get better. If the company doesn't agree to hold your coworkers accountable for their work, leave. There are plenty of other companies that do.
In fact, if you haven't already tried the above suggestions, the problem is almost as much your fault as anyone else's.
Re: (Score:2)
Tell me you are not so desperate for a job that you would spend 25% of your time fixing a coworker's mistakes? Bring it up with your manager, have his faults explained to him, document it over a month or so, and fire him if he doesn't get better.
Because the other 75% of the time I get to code awesome stuff. He's friends with the CEO and went to college with a good portion of the company, so licensing or liability would help -- but your suggestions would not, Mr. Know-It-All.
Customers (Score:2)
No matter what tricks you try to use to get your developers and others to focus on security issues, it is going to cost money. Denying bonuses won't help because your developers can always leave and work for a competitor who doesn't play that game. And you'll still have to fix those vulnerabilities.
The solution is to ask your customers. Given the choice between a more secure, more expensive product and a less secure, less expensive product, which will your customers choose? Once you have the answer to that,
Black robes and Black Hats (Score:1)
To make software bullet proof requires the developers to have the skills of the hackers and malware writers, and the resources, the secret handshakes, the underground culture.
Yer talking one of two scenarios here:
Re:Black robes and Black Hats (Score:4, Insightful)
Actually, you really need just one person in the company with "haxor" skills to test the security of the products that others make. A single person can very quickly find a lot of common holes. That person doesn't need to be a developer. He/She can be there just for testing, or even just for supervising others who do the testing, to make sure that they test for security vulnerabilities also.
Re: (Score:2)
There are also tools, both OSS and commercial to analyze source code. All of these things help, and everyone should be using at least some of the measures available.
TFA Author is Inexperienced (Score:5, Insightful)
"The problem of course is I'm saying how the companies should handle them, and I have no authority at any of these places, save people actually valuing my ideas. Personally, I've done some development in the past, and there was the concept of defects. Your bonus would depend on how many defects were in your application at delivery time. These were feature-based defects, but shouldn't vulnerabilities be considered defects as well?"
So, the author freely admits he is neither a developer nor a manager. If he was a developer he'd know that these are defects and everyone treats them as such.
If he was a manager, he'd know that one of the surest ways to wreck a good shop is to start doing comp based on defects. Here is what invariably (in my experience) happens when a shop includes defect counts in their comp plans.
1. Relationships between Dev, QA, Product Management and Operations get worse because the terms 'defect' and 'bug' become toxic. In reality these things always exist in software. The last thing you want to do is create barriers to dealing with them. Making the acknowledgment of a defect cost someone money means you will have arguments over every one of them unless they cause an outright crash.
2. Culture becomes overly risk-averse - No one wants to take on difficult problems or blaze new territory. The smartest people will naturally pick the easiest work to minimize the risk of defects.
3. Over-dependence on consultants - More CYA behavior. If it's too complex people will outsource to keep the defects away. This is a very bad thing if the nasty problems are because of business and not technical challenges. Now the people who know enough about the problem domain to understand the risk are hiring proxies who know nothing to avoid responsibility for 'defects'.
Re: (Score:2)
So far I've been fortunate enough not to have worked in such an environment. If I did, though, I could imagine that there would be a lot of motivation to fix the minor, easy-to-address defects such as "text is a slightly incorrect shade of gray", and ignoring the larger, more complicated defects that take longer to solve and might be orders of magnitude more serious for the end product. I guess this i
Treating security issues as defects depends on... (Score:4, Insightful)
...the nature of the security issue.
A defect, by definition, is an unintended behavior of a program. Something was designed to work, but for whatever reason, doesn't. Compare this to a lack of a feature, which means that something doesn't work because there was never the intention for it to work in the first place.
A buffer overflow or SQL injection related issue is almost definitely a defect, since there is a dedicated, designed parsing mechanism to process input, and if some types of inputs are not processed as intended, it is a defect of the software.
On the other hand, for example, a security issue arising from plaintext transmission of sensitive data over the net is not necessarily a defect. If the site in question was never designed to use SSL or another encryption mechanism, then it's a lack of a feature. If the site in question is an online banking site, then it is a blatantly poor and inexcusable design shortcoming, but nonetheless not a defect. (Of course, if the site DID intend SSL to work properly, but for whatever reason there is a hole allowing someone to crack or circumvent the encryption, then it IS a defect.)
Besides, assigning a "defect" status to a security issue is not necessarily useful for its own sake. The understanding is that a responsible company should treat a security issue with much higher priority than a non-security-related one, defect or not (compare "we released an emergency hotfix to download" to "we'll ship the patch in the next release cycle"). Saying a security issue is a defect is like saying that a cardiac arrest is "organ numbness" - true, but not very useful.
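To make the SQL injection case above concrete, here is a minimal sketch assuming a sqlite3-backed lookup; the library choice, table, and column names are only illustrative. Building the query by string concatenation is the defect; binding the value as a parameter is the intended design.

    /* Compile with -lsqlite3.  The `users` table exists only for this demo. */
    #include <stdio.h>
    #include <sqlite3.h>

    /* Defect: attacker-controlled `name` becomes part of the SQL text, so
     * input like  x' OR '1'='1  turns the WHERE clause into a tautology. */
    static void lookup_unsafe(sqlite3 *db, const char *name)
    {
        char sql[256];
        snprintf(sql, sizeof(sql),
                 "SELECT id FROM users WHERE name = '%s';", name);
        sqlite3_exec(db, sql, NULL, NULL, NULL);
    }

    /* Intended design: the input is bound as data and never parsed as SQL. */
    static void lookup_safe(sqlite3 *db, const char *name)
    {
        sqlite3_stmt *stmt;
        if (sqlite3_prepare_v2(db,
                "SELECT id FROM users WHERE name = ?;", -1, &stmt, NULL) != SQLITE_OK)
            return;
        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
        while (sqlite3_step(stmt) == SQLITE_ROW)
            printf("id = %d\n", sqlite3_column_int(stmt, 0));
        sqlite3_finalize(stmt);
    }

    int main(void)
    {
        sqlite3 *db;
        if (sqlite3_open(":memory:", &db) != SQLITE_OK)
            return 1;
        sqlite3_exec(db, "CREATE TABLE users(id INTEGER, name TEXT);"
                         "INSERT INTO users VALUES (1, 'alice');", NULL, NULL, NULL);
        lookup_unsafe(db, "x' OR '1'='1"); /* matches every row: the injection works */
        lookup_safe(db, "x' OR '1'='1");   /* matches nothing: treated as a literal name */
        sqlite3_close(db);
        return 0;
    }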
Re: (Score:1)
Wouldn't matter what you call them. (Score:2)
The EULAs exempt the software maker from liability for defects in their products anyway, be they security holes or total meltdowns.
Re: (Score:2)
EULAs are unenforceable in California, which is why VMware, among other companies, has in section 8 of their EULA: "This EULA will be governed by California law."
So us Californians are safe. No EULA can go against a law.
Re: (Score:2)
In that case, I'd forbid sales of my software in that state without a written pre-sale contract.
Just to protect myself.
Re: (Score:2)
Too bad most major software companies have HQ here in California. They'd be SOL.
Edwards Deming would disagree (Score:5, Informative)
When I think of defects and total quality management, I think of W. Edwards Deming [wikipedia.org].
Deming saw the problem of defects as a systems issue, not an individual performance issue. And his theory was that paying someone based on performance would have the unintended consequence of increasing the number of defects, not decreasing them (here is the list of Deming's 14 principles with my emphasis added in bold).
Re: (Score:2)
As would I, not that it matters. Take, point by point, Deming's list and compare it to any software company's practices! What do you get? A total mismatch, because it would break the management structure (and lower the next quarter's profits - maybe?)
Part of the blame goes to the consumer! There was a time when I was able to get an answer, and often a fix, within 12 hours from the vendor. And they'd better - otherwise our ($1 million in '70s dollars) monthly license payments somehow created the same problems, got lost, wha
tracking vulnerabilities (Score:2)
Management is usually to blame (Score:1)
companies would have an effective way of forcing developers and business units to focus on security issues
I don't think developers need to be 'forced' because generally they understand the importance of making their software secure and if they don't do it then it's usually due to external pressures such as unreasonable deadlines and management not wanting them to spend time on something that does not have tangible results.
In short, the problem with a lot of companies is that management doesn't value secu
The market is to blame (Score:2)
Management are trying to maximise profit, and typically don't care anywhere near as much as developers whether the job is done 'right'.
The problem is most buyers of software are way more interested in shiny bits and pieces than the security. If (more) people weren't willing to put up with insecure software, managers would be asking the developers to work more on the security aspects of the application.
Already are (Score:1)
They already are considered defects. The only thing that might change is the prioritization of defects. This topic is ridiculous.
Bonuses? Are you suggesting bonuses for finding defects? In 20 years I've worked at only one place that awarded bonuses for resolving defect
Great Idea On Paper...BUT... (Score:4, Insightful)
In the RW, I'd suggest that we consider the following:
You are Programmer Sian (notice the trendily androgynous name), you work for a gigantic software company, or conglomerate, or industrial concern that does all its own major development in-house; you are potentially confronted with:
1. Antiquated Developer Tools -- in general, the larger the development environment, unless you're Digesting Your Own Pet's Nutrition, you are very likely to be using multi-year and/or multi-generation old development platforms and tools.
The question here is then, how can you effectively hold poor Sian accountable for vulnerabilities that are intrinsic to many older tools?
Who's more accountable here? Sian or the managers who make the procurement decisions?
2. "Science Fiction" Application Programming Interfaces - depending on whether you are programming on a well-established product or not; if you are, Poor Sian is probably stuck with APIs that were developed many years before and have been the victim of Design Creep and its Lunatic Cousin, Design Implosion.
In many instances the APIs may once have had a large degree of Paradigmatic and Philosophic Design Integrity, but their initial Designers and Implementers have moved on to other products, companies or, Worst Case, Inpatient Mental Health Facilities. Many New Designers have come in to add "Their Own Programming Uniqueness" to the APIs, frequently rendering the APIs a jumble of radically different approaches to similar algorithms.
Should Sian be subjected to having their pay docked because 9/10 Functions implement a Library Call one way, and some "Johnny-Come-Lately" API function implements a similar-looking function with substantially different output?
Shouldn't the API Designers/Architects be held more responsible for this one?
3. PHB Stupidity - As QC forwards endless application/OS defect notices to the Development/Maintenance Team, these defects are reviewed by the Team Managers and Supervisors. It's understandable, given the 11 hours per day of Absolutely Vital Meetings that most PHBs love to, I mean are forced to, attend, that Defect Prioritization will suffer.
Sian can't choose what defects to repair, and in what order to repair them.
This is a management function, and one, in my experience, that Mgt usually jealously and zealously guards.
SOOOO, it's been the case in every Development project that I've worked on and know about, that PHBs have a well-understood tendency to prioritize Defect repair according to external pressures, especially from Sales and Marketing.
Sales and Marketing organizations are usually likely to prioritize according to their immediate impact on quarterly projections.
Vulnerabilities are only likely to affect quarterly results when they are CATASTROPHIC defects, i.e. App or OS Killers. Otherwise, the majority of vulnerabilities, which are usually well submerged in the Defect numbers, tend to get shoved aside for the higher-priority defects that S&M believe impact immediate sales.
There are numerous other considerations here, including Contract Programmers, Legacy Compatibility (ask the Vista Team about that one), Vendor Driver Teams that don't even know what to do with a new code base, etc., etc.
But it seems to me that, while financial incentives CAN BE useful as a Mgt tool for improving product quality, they should, to be even-handed, be applied across the entire product team, with specific ***POSITIVE*** incentives used to take care of limited, high-priority problems across the product line.
There's already a tendency to "blame the programmer", and my Best Guess is that any attempt to lay the responsibility for vulnerabilities, THAT AREN'T CLEARLY THE RESULT OF SLOPPY/POOR/INCOMPETENT CODE PRODUCTION, at the feet of the programmer will merely increase the employee turnover in the Development Team. Something that already is a problem most places.
from my experience: "The Fault, Horatio, Usually Lies Not In Our Code, But In Our Process"
Semantics (Score:2)
This reminded me of a funny/ typical/ stupid/ aggravating thing at work a few weeks ago. I pointed out a security vulnerability in one of our intranet apps during a meeting to discuss the next release. Despite exasperating efforts to educate --and a heated argument over the correct term-- a project manager insisted on spreading the word to upper management that we had a "security breach." But in the war with management (and those who THINK they're above us on the org. chart), I guess it's all about the p
Security vulnerabilities are not functionality bugs (Score:3, Insightful)
However, security bugs are not easy to test for or discover. In fact, it's very expensive to do testing to uncover even some easy classes of security vulnerabilities. Normal users do not stumble onto security problems like they do with functionality issues.
Also, none of your developers were ever taught anything about application security in college. The professors are clueless. Even Michael Howard from MS, who is hiring out of the best universities in the world, cannot find a new grad who has any clue how to build secure software.
Functionality bugs and security bugs are apples and oranges and deserve very different consideration (like measurement of risk, etc.).
Last, you can make a piece of software work. But you can never make a piece of software secure, only reduce risk to an acceptable level.
Re: (Score:2)
If you constantly find flaws by hiring pentest firms, you are in the wrong stage. You need to get Secure SDLC built into your development and actually try to catch these flaws in the design phase.
That is great in theory, and might be true in the future, but you are missing the reality of the software development industry as it stands today.
1) Universities are not teaching software engineers about application security.
2) Most development organizations do not have leadership that understand the complexities and processes needed for secure software engineering.
3) Network Security training organizations like SANS teach courses around application security that barely teaches developers the ski
McFeters is nuts! (Score:3, Interesting)
I work with the vulnerability management team and product security team at a large software company, and trust me vulnerabilities are treated as product defects. The cost of addressing vulnerabilities in the field is huge, and not addressing them is simply not feasible - customers would never tolerate it.
Should software piracy be treated as theft? (Score:2)
Redefining something rarely changes anyone's behavior.
Vulnerabilities aren't defects unless the behavior of the application is inconsistent with its requirements.
In any case, the solution isn't likely to be found in rewarding or punishing developers, but rather in making security part of the requirements and providing enough development and testing time to ensure that the software is secure.
Generally the market drives the process rather than software quality.
Too broad of a question (Score:2)
The same thing applies in software - no piece
Well Yah... (Score:2)
Because that's idiotic (Score:2)
Security vulnerabilities are a backwards way of describing the issue. There are "security issues" and there are "vulnerabilities". If you throw user passwords into the cloud, it's a security issue. If there's a way for someone to hack into your database of passwords, it's a vulnerability.
Vulnerabilities don't get solved -- virtually ever. Vulnerabilities get "handled". By that I mean that every "solution" costs something -- resources, performance, interactivity, flexibility, time, something.
For example
Ponder the implications (Score:2)
There is no bug free code. You are dependent on SO many variables, most of them not under your control (your program runs on an OS that you didn't write, was compiled with a compiler you didn't write, runs on hardware you didn't design which uses a BIOS that's not under your control...), that you can never credibly claim your program runs flawlessly. Even any given Hello World has the potential to be buggy, not even due to you. Worse yet, any (but the compiler, I give you that) can change on the target mach
Criminal negligence (Score:1)
Unless you make security defects criminally punishable you will get no traction at all.
I'd imagine that this is already the case for banks, payment processors, medical facilities, and the like.
Re: (Score:2)
Apparently not. My dialysis center runs all Microsoft OSes, and I've even heard the techs complain about how they were sure they entered info, but it wasn't in the system. Sounds like a familiar problem. [cdi.org]
Their system also seems to miscalculate the amount of fluid to be taken off. At first I thought maybe it was just trying to challenge the presumed dry weight which was in the system, but if this were the case, then why have the techs started calculating it by hand? I have observed this in two dialysis cente