
 



Security Software Linux

Will Security Task Force Affect OSS Acceptance?

An anonymous reader writes "An interesting article published by SD Times, "Application Security Goes National," discusses some of the talking points generated by a federal task force that will make recommendations to the Department of Homeland Security. One of these talking points is to license software developers and make them accountable for security breaches; licensed developers would also get paid more. The article further mentions that "Executives" might not wish to work with smaller, undisciplined partners, and a little further down that "Hobbyists create Web services [and] professionals create them" and that "companies relying on critical infrastructure Web services need confidence". Would OSS have to be written entirely by licensed developers to be considered secure? Yahoo Finance has another article on the subject." The SD Times article is current, despite the incorrect date on it.
This discussion has been archived. No new comments can be posted.

  • OSS Acceptance (Score:2, Interesting)

    by Anonymous Coward on Wednesday December 31, 2003 @08:49PM (#7850044)
    For commonly used software, the provision of programming jobs increasingly depends on artificial barriers to the acceptance of free alternatives. Now that millions of people are programmers with supercomputers on their desks and an itch to scratch, and now that the cost of software distribution is approximately zero, the unconstrained market value of a line of code for a commonly used application is rapidly converging to zero.

    The anti-FOSS lobbying is merely an example of the artificial barriers that prop up the prices and keep all those people employed. (Though I doubt that there are actually that many people earning their living by programming operating systems, Web browsers, and word processors these days. In the future, the way to make money as a programmer will be to implement special-purpose applications that only scratch the itch of some company's shareholders.)
  • Licensing again huh? (Score:5, Interesting)

    by DroidBiker ( 715045 ) on Wednesday December 31, 2003 @08:49PM (#7850048)
    I suspect we'll have some sort of meaningful licensing scheme someday. It'll probably take a while, though. There will be a lot of pain and probably more than a few witch hunts before it happens.

    One problem (of many) is of course that if you make programmers legally responsible for security failures you also need to give them the authority to say "No! You can't do it that way! I don't care WHAT Marketeering says!"

    Texas has had licensing for a few years. Anyone know how it's worked out?

  • by Aviancer ( 645528 ) on Wednesday December 31, 2003 @08:53PM (#7850078) Homepage Journal
    This is the grant of government license to do a specific type of work. That's akin to the government granting the title of Lord, and is technically illegal.

    That said, the idea itself is good -- but let ACM *and* IEEE *and* Sun *and* whatever other institution do certifications... That avoids the government regulation, and allows potential employers to select "qualified" individuals.
  • Trends are fun (Score:5, Interesting)

    by DroidBiker ( 715045 ) on Wednesday December 31, 2003 @09:03PM (#7850144)
    In the near term if they adopt a licensing scheme the first iteration at least will be something like the programming language Ada.

    The US military brass decided at one point that it would be great if all of their software were written in one language. They formed a committee to design what they wanted. Ada was created, and various military agencies started insisting on its use.

    The problem was that what they designed wasn't flexible enough and over time Ada became less and less important.

    Licensing will go a similar route. The government will spend millions on a committee to come up with requirements for a standard software engineer license. Then they'll find out that their licensed folks STILL screw up, and eventually it'll become less of a big deal.

    That being said, if software engineering licenses come into existence at the federal level, you can bet I'm going to get one.

  • by samdaone ( 736750 ) <samdaone@hotmail.com> on Wednesday December 31, 2003 @09:11PM (#7850206) Journal
    If this is for homeland security, does that mean the only people who can be licensed are US citizens native to this country? If so, that may help with our outsourcing epidemic.
  • by rice_burners_suck ( 243660 ) on Wednesday December 31, 2003 @09:12PM (#7850212)
    Would OSS have to be [written] entirely by licensed developers to be considered secure?

    As the past owner of two different businesses and the present manager of a mid-size company, I can confidently say that the answer is no.

    This is very simple. Over the years, I have hired a wide range of different people to work as programmers. I had everything from master's-degree programmers with 20 years of experience to kids out of school who do it as a hobby. In all cases, what determined the success or failure of the project was not the qualifications of the programmer. I had master's-degree programmers write such gibberish that multi-hundred-thousand-dollar projects were cancelled. I had master's-degree programmers who did a marvelous job. I had some kids code up another product that worked so beautifully that it did nothing but make the company money. I also had kids who did a crappy job, and the project failed. In other words, success or failure is determined by results, and nothing else.

    Returning to the above question, software is considered secure if it is tested for vulnerabilities and is found to be strong against attempts to break in. If the programmer has a Ph.D., that's all nice and pretty, but it means exactly Jack Schitt. The results are the only thing that matter.

    Therefore, I think this committee should not waste its time with issues like licensing, because that will only create more bureaucracy, more fees, and entire administrative efforts... and it provides no guarantee of success. They should figure out a way to measure the reliability of a piece of software (reliability is the parent category of security, because an insecurity reduces reliability). They should draw up guidelines for how mission-critical systems should be judged and tested. Perhaps they should recommend that the government hire its own crackers to constantly look for and help fix vulnerabilities, because security isn't a one-time thing. "Let's license programmers and the problems will go away." It doesn't work like that. Like everything else related to management, in security the only constant is change.
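    The parent's "judge by results, not credentials" point is essentially what automated adversarial testing puts into practice: throw large volumes of random or hostile input at an interface and check that its contract holds. A minimal sketch in C; parse_u16 and fuzz_parse are illustrative names (a stand-in for whatever input handler is under test), using the C library's strtoul as a reference oracle:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Toy input handler under test: parse up to 5 ASCII digits into a
 * uint16_t, rejecting anything else. Returns 1 on success, 0 on failure. */
static int parse_u16(const char *s, size_t len, uint16_t *out) {
    uint32_t v = 0;
    if (len == 0 || len > 5)
        return 0;
    for (size_t i = 0; i < len; i++) {
        if (s[i] < '0' || s[i] > '9')
            return 0;
        v = v * 10 + (uint32_t)(s[i] - '0');
    }
    if (v > UINT16_MAX)
        return 0;
    *out = (uint16_t)v;
    return 1;
}

/* Crude fuzz loop: feed random buffers to the parser, and whenever it
 * accepts one, cross-check the result against strtoul as an oracle. */
static int fuzz_parse(unsigned iterations, unsigned seed) {
    char buf[8];
    srand(seed);
    for (unsigned i = 0; i < iterations; i++) {
        size_t len = (size_t)(rand() % 7);         /* 0..6 bytes */
        for (size_t j = 0; j < len; j++)
            buf[j] = (char)(rand() % 256);         /* arbitrary bytes */
        uint16_t v;
        if (parse_u16(buf, len, &v)) {
            char tmp[8];
            memcpy(tmp, buf, len);
            tmp[len] = '\0';                       /* accepted input is all digits */
            char *end;
            unsigned long ref = strtoul(tmp, &end, 10);
            if (end != tmp + len || ref != (unsigned long)v)
                return 0;                          /* disagreement: a bug somewhere */
        }
    }
    return 1;
}
```

    The point of the sketch is the shape of the loop, not the toy parser: results are judged by surviving hostile input, exactly as the parent argues, regardless of who wrote the code.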

  • by Pepebuho ( 167300 ) on Wednesday December 31, 2003 @09:58PM (#7850425)
    Licensing Software professionals and holding them accountable for software security is the Palladium concept applied to people. Once you have to license "software engineers" in general, you will have them digitally signing their code and then only software duly signed will run on your Palladium Computer. Otherwise, your computer might run (gasp!) pirate code!
    I am assuming the compiler will digitally brand your code with your signature, in order to find out who wrote the "unsafe" code that was breached.
  • States' rights? (Score:3, Interesting)

    by Burnon ( 19653 ) on Wednesday December 31, 2003 @10:37PM (#7850575)
    So, leaving aside issues of whether or not this is a good idea, are states' rights being encroached upon here? States currently license engineers as they feel it's necessary - why would software require federal licensing? Engineering is engineering, whether you're twiddling bytes or blocks.

  • by peter hoffman ( 2017 ) on Wednesday December 31, 2003 @11:41PM (#7850859) Homepage

    Would you consider banking software to be fairly important? I have seen banking software, to be used by the national banks of brittle developing economies, being worked on by high school students with no engineering techniques being used at all. This software was being sold by a very large computer company with over 175,000 employees in over 100 countries, not a "fly by night" basement operation.

    As to organizations being sued because their critical software failed, that is rarer than disbarments. Even then, the company suffers very little. A programmer or two might be fired. A fine might be paid. At the end of the day yet another profitable quarter is recorded which is all that really matters.

    I have worked in software development for over 20 years now and, while most people advocate the careful processes you describe, nowhere I have worked actually does it (including three major companies whose names are three letter acronyms). None of my many friends and acquaintances in the business have worked at such a company either. One of the companies I worked for was ISO9000 certified to boot.

  • Immature discipline (Score:5, Interesting)

    by sjames ( 1099 ) on Wednesday December 31, 2003 @11:53PM (#7850909) Homepage Journal

    Consider the many centuries we were constructing buildings before we had anything beyond a few guesstimated best practices to assure that they wouldn't fall down. Eventually, the field matured and we figured out how to calculate the strength of a building in advance. Even then, it is only relatively recently that we could do dynamic simulations. In spite of that, we still have mishaps.

    Furthermore, we STILL are not at the point where we can guarantee that a building will hold up under attack. In fact, we are certain that ANY building can be destroyed using explosives. Indeed, any device we invent can be destroyed, and can in turn cause destruction when deliberately used contrary to its design.

    At the same time, there are levels of vulnerability that are clearly substandard. Buildings must not simply fall down in a light breeze and cars must not explode when you start them.

    On the basis of that, licensing and liability will need to be restricted to a very small subset of applications, and they will be very expensive. For the same reason that most of us don't have bomb proof cars, most software will not be built to that standard.

    The other case would be grievously stupid design decisions, such as having email from anonymous strangers be executable, or using gets() for a publicly accessible interface.
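    The gets() example is worth making concrete: gets() gives the caller no way to bound how many bytes it writes, so any long enough attacker-supplied line overflows the buffer (which is why C11 eventually removed it from the standard). A minimal sketch of the bounded idiom; read_line is an illustrative name, not a standard function:

```c
#include <stdio.h>
#include <string.h>

/* Read one line into buf (capacity n), dropping the trailing newline.
 * Unlike gets(), fgets() never writes more than n-1 bytes plus the
 * terminating NUL, so an overlong hostile line is truncated instead
 * of overflowing the buffer. */
static int read_line(char *buf, size_t n, FILE *in) {
    if (fgets(buf, (int)n, in) == NULL)
        return 0;                      /* EOF or read error */
    buf[strcspn(buf, "\n")] = '\0';    /* strip newline if present */
    return 1;
}
```

    Callers pass sizeof(buf) explicitly, so the bound travels with the buffer rather than being an unstated assumption.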

  • by Todd Knarr ( 15451 ) on Thursday January 01, 2004 @02:22AM (#7851352) Homepage

    That condition comes from the licensing of civil engineers, too. You have to be licensed to be a civil engineer, pass some fairly effective exams and all that. You can be held personally and professionally liable for screw-ups in your designs. But there's another aspect: you have control. If you're the civil engineer on a project and you specify that it needs X grade of concrete, that's it. If management tries to say "That's too expensive, build it using a cheaper concrete.", you get to say "No can do." and they can't argue. If they do, you make a phone call and the next day some gentlemen with badges show up to discuss the fines and penalties management is going to pay. If management fires you and uses the cheaper concrete anyway, the discussion will be about criminal charges on top of their liability, not yours, for any damages done because of their illegal substitution.

    If licensing of software engineers includes everything that licensing of civil engineers does, including the "those who don't have the license do not get to overrule you on how the job gets done" provisions, it's IMHO a good deal. We ought to press for exactly that in licensing, because while companies would be highly allergic to it it'll play very well with the public. Think about public reaction when a structural failure turns out to have been caused by someone substituting shoddy materials for what was originally specified or otherwise not doing things the way the engineers said to do them.

  • by Iamnoone ( 661656 ) * on Thursday January 01, 2004 @06:11AM (#7851853)
    One of these talking points is to license software developers and make them accountable for security breaches.

    It seems that, to really prevent all possible security breaches, you need to prove that the program is correct [uottawa.ca] first - and I don't know of many entities that even try to prove their programs. I have heard of a few telecom infrastructure programs, but remember the big SS7 outage caused by one tech some years ago? The SS7 code is probably better "audited" than most code, but would that outage have been construed as a "security breach"? - Yes, after the lawyers were done with it.

    What about how quickly the world changes after a program is released? Suppose you use the best encryption technology of the day and prove your programs correct, not just audit the code or follow "good" software engineering/management methodologies. But you used DES (back in the day), or MD5 more recently; then MD5crack or quantum computing comes along, and suddenly you are responsible for a "security breach" because of an exploit that didn't exist when you created the program.

    That is nuts, who would want to sign up for that?

    Besides DJB [cr.yp.to], does anyone even have the balls to reward people for finding security problems? Or even advertise security as a feature? OpenBSD (yeah, I know it's dead, blah, blah, blah), pureftpd [pureftpd.org], NSA Linux [nsa.gov]
    I expect not many others, because people expect code to have security issues.

    Since security is such a big concern now (and in the past), I would think that people who wanted to show off their programming prowess would be bragging about how secure their code is. But no one does, that I know of - why? Because it's just damn hard to be sure that the code is perfect - which is what is required to prevent all possible security problems. So where are all these people with the big security cahones [geocities.com] going to come from?

    Can a program be proven correct for all inputs?
    If it isn't stateless then can each permutation of state and input be proven?
    Are all the protocols used by the program verified?
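    For small, stateless functions, the first question actually has a brute-force answer: enumerate every input. A sketch, using a hypothetical branchless 16-bit absolute-value routine as the program under "proof" (the function names are illustrative; the arithmetic right shift of a negative value is implementation-defined in C, though universally arithmetic in practice):

```c
#include <stdint.h>

/* Branchless abs for 16-bit ints: mask is all-ones iff x is negative.
 * Assumes right shift of a negative int is arithmetic (true on all
 * mainstream compilers, but implementation-defined per the standard). */
static int16_t abs_branchless(int16_t x) {
    int16_t mask = (int16_t)(x >> 15);     /* 0 if x >= 0, -1 if x < 0 */
    return (int16_t)((x ^ mask) - mask);
}

/* Exhaustively verify against the naive definition for every input.
 * INT16_MIN is excluded: -INT16_MIN is not representable in int16_t,
 * so neither version has a defined "correct" answer there. */
static int verify_abs(void) {
    for (int x = INT16_MIN + 1; x <= INT16_MAX; x++) {
        int16_t want = (int16_t)(x < 0 ? -x : x);
        if (abs_branchless((int16_t)x) != want)
            return 0;                      /* counterexample found */
    }
    return 1;                              /* holds for all checked inputs */
}
```

    This only scales to tiny input spaces, which is the point of the parent's questions: once state, larger inputs, or external protocols enter the picture, exhaustive checking is off the table and real proof techniques are needed.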

    The impossibility of preventing security breaches seems to make this kind of government action more likely. Burn the witches!! They hexed our computers, and were seen in the woods cavorting with unaudited code fragments!
  • by wap911 ( 637820 ) on Thursday January 01, 2004 @01:12PM (#7853103)
    NT was given something like a "C2" security rating, BUT it was not attached to any network whatsoever. Lot of good that certification did, as Win2k and XPee were then built from this so-called "secure base". The "system" is the problem, not the programmers. Testing and certifying software for its designed and useful purpose, which EULAs negate, would be a starting point. Also, I am sitting here on a RoadRunner connection using a Motorola SurfBoard modem. Why did I have to spend $45 for an SMC 7004VBR to secure it, when 1-2 chips in the modem would have taken care of a lot of issues? [log is showing 4-6 denied attacks every 5-10 minutes]
