Will Security Task Force Affect OSS Acceptance? 224
An anonymous reader writes "An interesting article published by SD Times, "Application Security Goes National," discusses some of the talking points generated by a federal task force that will make recommendations to the Department of Homeland Security. One of these talking points is to license software developers and make them accountable for security breaches. Licensed developers would get paid more as well. The article also mentions that "Executives" might not wish to work with smaller, undisciplined partners, and a little further down that "Hobbyists create Web services [and] professionals create them" and that "companies relying on critical infrastructure Web services need confidence". Would OSS have to be written entirely by licensed developers to be considered secure? Yahoo Finance has another article on the subject." The SD Times article is current, despite the incorrect date on it.
OSS Acceptance (Score:2, Interesting)
The anti-FOSS lobbying is merely an example of the artificial barriers that prop up the prices and keep all those people employed. (Though I doubt that there are actually that many people earning their living by programming operating systems, Web browsers, and word processors these days. In the future the way to make money as a programmer will be to implement special-purpose applications that only scratch the itch of some company's shareholders.)
Licensing again huh? (Score:5, Interesting)
One problem (of many) is of course that if you make programmers legally responsible for security failures you also need to give them the authority to say "No! You can't do it that way! I don't care WHAT Marketeering says!"
Texas has had licensing for a few years. Anyone know how it's worked out?
Good concept, illegal in practice (Score:2, Interesting)
That said, the idea itself is good -- but let ACM *and* IEEE *and* Sun *and* whatever other institution do certifications... That avoids the government regulation, and allows potential employers to select "qualified" individuals.
Trends are fun (Score:5, Interesting)
The US military brass decided at one point that it would be great if all of their software was written in one language. They formed a committee to design what they wanted. Ada was created and various military agencies started insisting on its use.
The problem was that what they designed wasn't flexible enough and over time Ada became less and less important.
Licensing will go a similar route. The government will spend millions on a committee to come up with requirements for a standard software-engineer license. Then they'll find out that their licensed folks STILL screw up, and eventually it'll become less of a big deal.
That being said, if software engineering licenses come into existence at the federal level, you can bet I'm going to get one.
Will this help with our outsourcing problem? (Score:2, Interesting)
Why the license idea doesn't fit. (Score:5, Interesting)
As the past owner of two different businesses and the present manager of a mid-size company, I can confidently say that the answer is no.
This is very simple. Over the years, I have hired a wide range of people to work as programmers, everyone from master's-degree holders with 20 years of experience to kids out of school who do it as a hobby. In all cases, what determined the success or failure of the project was not the qualifications of the programmer. I had master's-degree programmers write such gibberish that multi-hundred-thousand-dollar projects were cancelled. I had master's-degree programmers who did a marvelous job. I had some kids code up a product that worked so beautifully it did nothing but make the company money. I also had kids who did a crappy job, and the project failed. In other words, success or failure is determined by results, and nothing else.
Returning to the above question, software is considered secure if it is tested for vulnerabilities and is found to be strong against attempts to break in. If the programmer has a Ph.D., that's all nice and pretty, but it means exactly Jack Schitt. The results are the only thing that matter.
Therefore, I think this committee should not waste its time with issues like licensing, because that will only create more bureaucracy, more fees, and entire administrative efforts... and it provides no guarantees of success. They should figure out a way to measure the reliability of a piece of software (reliability is the parent category of security, because an insecurity reduces reliability). They should make up some guidelines for how mission critical systems should be judged and tested. Perhaps they should recommend that the government should hire its own crackers to constantly look for and help fix vulnerabilities. Because security isn't a one-time thing. "Let's license programmers and the problems will go away." It doesn't work like that. Like everything else related to management, in security, the only constant is change.
This is nothing but extending Palladium to people (Score:2, Interesting)
I am assuming the compiler will digitally brand your code with your signature, in order to find out who wrote the "unsafe" code that was breached.
States' rights? (Score:3, Interesting)
Re:Do they not get it? (Score:3, Interesting)
Would you consider banking software to be fairly important? I have seen banking software, intended for the national banks of fragile developing economies, being worked on by high-school students with no engineering techniques being used at all. This software was being sold by a very large computer company with over 175,000 employees in over 100 countries, not a "fly by night" basement operation.
As to organizations being sued because their critical software failed, that is rarer than disbarments. Even then, the company suffers very little. A programmer or two might be fired. A fine might be paid. At the end of the day yet another profitable quarter is recorded which is all that really matters.
I have worked in software development for over 20 years now and, while most people advocate the careful processes you describe, nowhere I have worked actually does it (including three major companies whose names are three letter acronyms). None of my many friends and acquaintances in the business have worked at such a company either. One of the companies I worked for was ISO9000 certified to boot.
Immature discipline (Score:5, Interesting)
Consider the many centuries we were constructing buildings before we had anything beyond a few guesstimated best practices to assure that they wouldn't fall down. Eventually, the field matured and we figured out how to calculate the strength of a building in advance. Even then, it is only relatively recently that we could do dynamic simulations. In spite of that, we still have mishaps.
Furthermore, we STILL are not at the point where we can guarantee that a building will hold up under attack. In fact, we are certain that ANY building can be destroyed using explosives. Indeed, any device we invent can be destroyed, and can in turn cause destruction, when deliberately used contrary to its design.
At the same time, there are levels of vulnerability that are clearly substandard. Buildings must not simply fall down in a light breeze and cars must not explode when you start them.
On the basis of that, licensing and liability will need to be restricted to a very small subset of applications, and those will be very expensive. For the same reason that most of us don't have bomb-proof cars, most software will not be built to that standard.
The other case would be grievously stupid design decisions, such as having email from anonymous strangers be executable, or using gets() for a publicly accessible interface.
I'll take it, on one condition (Score:5, Interesting)
That condition comes from the licensing of civil engineers, too. You have to be licensed to be a civil engineer, pass some fairly effective exams and all that. You can be held personally and professionally liable for screw-ups in your designs. But there's another aspect: you have control. If you're the civil engineer on a project and you specify that it needs X grade of concrete, that's it. If management tries to say "That's too expensive, build it using a cheaper concrete.", you get to say "No can do." and they can't argue. If they do, you make a phone call and the next day some gentlemen with badges show up to discuss the fines and penalties management is going to pay. If management fires you and uses the cheaper concrete anyway, the discussion will be about criminal charges on top of their liability, not yours, for any damages done because of their illegal substitution.
If licensing of software engineers includes everything that licensing of civil engineers does, including the "those who don't have the license do not get to overrule you on how the job gets done" provisions, it's IMHO a good deal. We ought to press for exactly that in licensing, because while companies would be highly allergic to it, it'll play very well with the public. Think about the public reaction when a structural failure turns out to have been caused by someone substituting shoddy materials for what was originally specified, or otherwise not doing things the way the engineers said to do them.
Audits, certifications can't stop security breaches (Score:2, Interesting)
It seems that to really prevent all possible security breaches, you need to prove that the program is correct [uottawa.ca] first - and I don't know of many entities that even try to prove their programs. I have heard of a few telecom infrastructure programs, but remember the big SS7 outage caused by one tech some years ago? The SS7 code is probably better "audited" than most code, but would that outage have been construed as a "security breach"? - Yes, after the lawyers were done with it.
What about how quickly the world changes after a program is released? Suppose you use the best encryption technology of the day, and you prove your programs correct, not just audit the code or use "good" software engineering/management methodologies. But you used DES (back in the day), or MD5 more recently; then MD5crack comes along, or quantum computing, and suddenly you are responsible for a "security breach" because of some exploit that didn't exist when you created the program.
That is nuts, who would want to sign up for that?
Besides DJB [cr.yp.to], does anyone even have the balls to reward people for finding security problems? Or even advertise security as a feature? OpenBSD (yeah, I know it's dead, blah, blah, blah), pureftpd [pureftpd.org], NSA Linux [nsa.gov]
I expect not many others, because people expect code to have security issues.
Since security is such a big concern now (and in the past), I would think that people who wanted to show off their programming prowess would be bragging about how secure their code is. But no one does, that I know of - why? Because it's just damn hard to be sure that the code is perfect - which is what is required to prevent all possible security problems. So where are all these people with the big security cojones [geocities.com] going to come from?
Can a program be proven correct for all inputs?
If it isn't stateless, then can each permutation of state and input be proven?
Are all the protocols used by the program verified?
The impossibility of preventing security breaches seems to make this kind of government action more likely. Burn the witches!! They hexed our computers, and were seen in the woods cavorting with unaudited code fragments!
NT was declared secure (Score:2, Interesting)