What Happens When Software Companies Are Liable For Security Vulnerabilities? (techbeacon.com) 221

mikeatTB shares an article from TechBeacon: Software engineers have largely failed at security. Even with the move toward more agile development and DevOps, vulnerabilities continue to take off... Things have been this way for decades, but the status quo might soon be rocked as software takes an increasingly starring role in an expanding range of products whose failure could result in bodily harm and even death. Anything less than such a threat might not be able to budge software engineers into taking greater security precautions. While agile and DevOps are belatedly taking on the problems of creating secure software, the original Agile Manifesto did not acknowledge the threat of vulnerabilities as a problem, but focused on "working software [as] the primary measure of progress..."

"People are doing exactly what they are being incentivized to do," says Joshua Corman, director of the Cyber Statecraft Initiative for the Atlantic Council and a founder of the Rugged Manifesto, a riff on the original Agile Manifesto with a skew toward security. "There is no software liability and there is no standard of care or 'building code' for software, so as a result, there are security holes in your [products] that are allowing attackers to compromise you over and over." Instead, almost every software program comes with a disclaimer to dodge liability for issues caused by the software. End-User License Agreements (EULAs) have been the primary way that software makers have escaped liability for vulnerabilities for the past three decades. Experts see that changing, however.

The article suggests incentives for security should be built into the development process -- with one security professional warning that in the future, "legal precedent will likely result in companies absorbing the risk of open source code."
  • Nada software: the only software without bugs or security holes. http://www.bernardbelanger.com... [bernardbelanger.com] Just a friendly reminder: you get what you pay for.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Because of the move toward more agile development and DevOps, vulnerabilities continue to take off...

      You cannot build a secure application without planning the whole thing out first. This ADHD / MBA / lazy-fuck / quick-profits / fuck-the-customer approach to development ("agile") is cult Kool-Aid, and all the young ones drank it.

      It will take computer science decades to recover from this, if it ever does. I think we may have already peaked.

  • by known_coward_69 ( 4151743 ) on Saturday June 17, 2017 @07:24PM (#54640365)

    Just look at medical devices. They don't cost that much to make, but they have to go through a long certification process whose cost has to be recouped.

    Same with software. Something like SOX, PCI, or HIPAA will pop up to certify "secure software" that is patched on a regular basis, and people will end up paying for it. On top of that, every piece of software will be "certified" on some platform, similar to a game console. If you run it outside of the certified hardware, you lose the ability to sue.

    You're all idiots if you think you will be able to run software on some of the ridiculous configurations I've seen in my time and expect vendors to pay for it when it breaks because of your stupidity.

    • by Chris Mattern ( 191822 ) on Saturday June 17, 2017 @07:52PM (#54640441)

      Just look at medical devices. They don't cost that much to make, but they have to go through a long certification process whose cost has to be recouped.

      And yet, ironically, that certification process does not cover security. The software on medical devices is well known for being almost ludicrously insecure.

      • This price hike won't either.

        And just like the price hike with medical devices, it will be mostly to cover the additional cost for legal battles and settlements. Life has a price, ya know...

      • Doesn't matter. After the lawsuits of the '80s and '90s, there are now "best practices," "standards of care," and standards for almost everything, because you can't just sue; you have to prove someone did something wrong.

        Same here. Industry will make up some best practices; it will be a certification or some other process that costs a lot of money; it will mean hiring people to push the paper and make sure the paperwork is right; and everyone will pay.

    • The whole medical ecosystem is seriously screwed up, starting with the reimbursement models. They scream about high costs, but one particular med-device company I worked for spent $600 per device, all in, for production and FDA overhead. The devices sold for $15K, and the company was barely breaking even. Where did the other $14K+ go? Mostly sales and marketing, plus lobbying for increased reimbursements.

  • If not already, there will be a clause added to the Terms and Conditions saying, "(a) We're not liable and (b) all disputes will be settled via arbitration."

    • That only works if the laws of your country allow it.

      • by tepples ( 727027 )

        "(c) Residents of jurisdictions that do not recognize disclaimers of implied warranty or mandatory arbitration are not eligible to license this software or purchase this device."

  • by El Cubano ( 631386 ) on Saturday June 17, 2017 @07:26PM (#54640377)

    ... software takes an increasingly starring role in an expanding range of products whose failure could result in bodily harm and even death. Anything less than such a threat might not be able to budge software engineers into taking greater security precautions.

    What you are seeing is the maturing of software engineering as a profession. A few hundred years ago, if you needed surgery you would go to your barber [wikipedia.org], mostly because barbers were usually in possession of the right tools. The medical profession eventually matured into what we have today, where a surgeon is a specialized physician. But that didn't happen overnight, and lots of people died in the process. In fact, we didn't even have a germ theory of disease until the mid-1800s.

    The point is that right now, hardware, including its firmware components, is often made without the involvement of a software engineer. It wasn't that long ago that software engineers didn't even exist; as the profession matures, we will reach the point where developing a piece of hardware without the participation of a software engineer is unthinkable. But we are not there yet.

    An important side note: there is a difference between a coder, a developer, a programmer, a software engineer, and several other specialized disciplines in the software arena. I think a precondition to solving the problem identified by the article has less to do with development methodology (that is not central to the problem at hand) and more to do with establishing minimum standards for someone who claims to be a software engineer. For instance, a surgeon in 2017 has to meet vastly different minimum qualifications than a surgeon did in 1917. We didn't even have software engineers a hundred years ago, so who knows what the title will actually look like by the time the field really starts to mature.

  • by AncalagonTotof ( 1025748 ) on Saturday June 17, 2017 @07:35PM (#54640389)

    Sorry, I stopped right there, at "Even with the move toward more agile development and DevOps". What is the supposedly positive link between the two?
    Neither "old" nor "new" methods will ever produce software better than the people using them.

    Bad engineers using old methods (the V-model? Tons of documents?) or new ones (you said agile, as in "get as many things done as possible, as quickly as possible, using a shiny web app like Trello or Kanban-something"?) won't make secure software.
    Maybe with good engineers you can achieve good results, whatever the method.

    More or less related: ISO 9001 doesn't mean that a certified company makes good products; it means that it always produces the same quality, good or bad.

    This may sound a bit like a troll, but I'd add that since young engineers favor agile methods, and considering their lack of experience combined with the messiness I sense in agile methods, I tend to think that agile methods would produce less secure software ...

    • by vtcodger ( 957785 ) on Sunday June 18, 2017 @02:48AM (#54641501)

      I tend to think that agile methods would produce less secure software ...

      Couldn't agree more. The notion that the road to computing security runs through agile and DevOps seems to me as unlikely as the notion that the way to get you and your bicycle from New York to Bermuda is to head off on said bike for the Bering Strait (in winter) so you can get to Singapore, then think out your next step.

      FWIW, I think the road to computing security probably is ill-paved, difficult, and unpleasant, and involves shrinking attack surfaces by eliminating unneeded capabilities (e.g., #@$%^ JavaScript). It probably also requires shrinking the toolset to a bare minimum of proven libraries and protocols. That's not much fun, so it probably won't happen until we've exhausted the long list of entertaining but ineffective alternatives.

  • by chispito ( 1870390 ) on Saturday June 17, 2017 @07:42PM (#54640411)

    Software engineers have largely failed at security. Even with the move toward more agile development and DevOps, vulnerabilities continue to take off. More than 10,000 issues will be reported to the Common Vulnerabilities and Exposures project this year.

    How about you get what you pay for? Many management teams have decided that adding security costs money and it's more cost effective not to spend many cycles on it, but rather to just deal with problems as they pop up.

    I don't think you can spin that as software engineers "failing." If the management wants security, they can pay for training, consultants, audits, bug bounties, etc. There are lots of ways to address this issue. Besides, perhaps the number of bugs is skyrocketing as a natural consequence of all of the new software projects and products.

    • And since companies are not really known for absorbing costs, expect software (and hardware) prices to skyrocket.

    • by SvnLyrBrto ( 62138 ) on Saturday June 17, 2017 @08:22PM (#54640551)

      So very much this. Given the notion that out of fast, cheap, and $x you can have any two, it's practically a truism that PHB/MBA types will always choose fast and cheap, no matter the value of $x. The only exceptions are when you're contractually or legally obligated to have $x, such as in PCI or HIPAA environments. And even then, fast or cheap is given up for $x only begrudgingly, and sometimes only on paper rather than in reality.

    • How about you get what you pay for? Many management teams have decided that adding security costs money and it's more cost effective not to spend many cycles on it, but rather to just deal with problems as they pop up.

      Software hasn't had its "Pinto" moment yet, where a jury decides that a company needs to be punished for that type of calculus.

  • Automated tests should include known attack vectors: if the build is vulnerable, the build fails. (See the sketch at the end of this thread.)

    • If it's not secure, it's not working.

      So you're saying no one has ever written a working product? If that is your definition of working, then there will never be any working products; security can never be absolute.

    • The problem with that is that most security issues are a result of attack vectors that were not known yet when the software was under development.

      Software patches address that: software at release is tested against the attack vectors known at the time, and security patches are tested against the same set, plus the few more that have become known since.
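
To make the "vulnerable build fails" idea from the start of this thread concrete: a minimal sketch in Python/pytest, where sanitize_comment() and the payload list are hypothetical stand-ins rather than any real project's API. Each known attack vector becomes a permanent regression test, and newly discovered vectors are appended over time.

```python
# Security regression sketch (hypothetical sanitize_comment shown inline).
# Each payload is a known attack vector; if any survives sanitization,
# the test fails and the CI build fails with it.
import html

import pytest


def sanitize_comment(raw: str) -> str:
    """Escape user input before it is rendered into an HTML page."""
    return html.escape(raw, quote=True)


KNOWN_ATTACKS = [
    "<script>alert(1)</script>",        # classic stored XSS
    '"><img src=x onerror=alert(1)>',   # attribute-breakout XSS
    "' OR '1'='1",                      # quote-based injection probe
]


@pytest.mark.parametrize("payload", KNOWN_ATTACKS)
def test_known_attack_vectors_are_neutralized(payload):
    cleaned = sanitize_comment(payload)
    # No raw angle brackets or quotes may survive sanitization.
    assert "<" not in cleaned and ">" not in cleaned
    assert '"' not in cleaned and "'" not in cleaned
```

Run under pytest in CI; any payload that slips through turns the build red, which is exactly the "if the build is vulnerable, the build fails" behavior described above.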

  • Simple (Score:4, Funny)

    by Opportunist ( 166417 ) on Saturday June 17, 2017 @07:53PM (#54640447)

    We'll essentially get shrink-wrap contracts that basically say "This software can't do shit, if you use it regardless, sucks to be you."

    In other words, what we already have.

    • Isn't that in most EULAs already?

      You can always find something along the lines of "no warranty, express or implied, that this software is fit for any purpose."

  • by StevenMaurer ( 115071 ) on Saturday June 17, 2017 @08:27PM (#54640565) Homepage

    Reading the article, it's all people with an interest in peddling solutions to the problem, naturally. This is a marketing paper.

    Claiming that software engineers have "failed" at security is akin to claiming that police have "failed" at stopping crime. And the courts aren't going to suddenly start blaming companies for the actions of threat actors unless there is some representation that the products they're creating are unhackable.

    • by gweihir ( 88907 )

      Same here. In actual reality, it is mostly managers who have failed: failed to hire competent people, and then failed to give them the time to create secure code.

  • If it were literally only companies that were on the hook, you'd see a bloom (not a renaissance, because there has not been a dark age yet) of OSS. If companies are liable but J. Random Coder on the street is not, then you're going to see FOSS take center stage simply due to lack of liability.

    • by bws111 ( 1216812 )

      Uh, no. Given a choice between "use product from A and if there is a problem they are liable" and "use FOSS product and if there is a problem I am liable", who do you think is going to go with the second option?

      • Uh, no. Given a choice between "use product from A and if there is a problem they are liable" and "use FOSS product and if there is a problem I am liable", who do you think is going to go with the second option?

        You haven't read a EULA lately, have you? That's how it is now.

        • by bws111 ( 1216812 )

          So you're saying that there are EULAs today where the developers ACCEPT liability? No, today EULAs deny liability, just like FOSS. As far as liability is concerned, proprietary and FOSS are equivalent today.

          But you are talking about a situation where they are unequal - proprietary software can be held legally responsible while FOSS cannot. In that case, one would have to be nuts to choose FOSS.

  • ...software takes an increasingly starring role in an expanding range of products whose failure could result in bodily harm and even death.

    Not a security vulnerability - as in, no malicious actor exploiting a security flaw - but killing people? BT,DT. https://en.wikipedia.org/wiki/... [wikipedia.org]

    • The tiniest bit of actual research reveals that the issue with Therac-25 was actually the lack of physical safety interlocks which the software, written for an earlier model in the Therac line, assumed were present. Software developers were left out of the development of Therac-25 as the hardware and product guys assumed they could just use the existing software as-is and didn't bother to ask anyone who knew better.

      The problem, then, was poor product (hardware) engineering and a series of lapses in judgment.
      • Nope, that's not an excuse.

        The earlier models had two redundant safety mechanisms in place to prevent killing patients, one in software and one in hardware. Yes, it was an unforgivable management decision to deliberately compromise that redundancy by removing the hardware safety mechanism in the Therac-25, but that does not excuse the bug in the software safety mechanism.

        The software was responsible (not solely responsible, before Therac-25, but still responsible) for preventing fatal radiation doses, and it failed to do so.

        • Consider that it wasn't a bug, as that particular failure mode was not something the software was responsible for handling in the models for which it was written. When writing software for very specific and well-defined hardware, how long do you spend on use cases specific to undefined hardware? How do you even develop for undefined hardware?
  • If (and that's a big "if") companies become liable for software failures, then most likely there will be a guideline of standard programming practices. It would probably restrict companies to programming languages that have already been heavily analyzed and whose security weaknesses are identified. CMU has composed guidelines for multiple languages and platforms [cert.org], violations of which could easily be detected programmatically. Such regulation would be a deathblow to companies using script kiddies to scrape together code.

    • by gweihir ( 88907 )

      That is BS. The CMU guides are pretty reasonable, but they cover maybe 10% of the problem. And people who have what it takes to write secure code do not actually need them.

      This is not a problem that will be fixed by "best practices" anytime soon.

  • You built a $19.99 wireless camera with no security that sells in 2 AM infomercials? Sue your ass out of existence. Sue your CxOs into the food-stamp range. Fuck you, assholes.

    Oh shit, now wireless cameras cost $49.99. But they're secure? Works for me.
  • Safety and security are independent requirements. An expensive insurance bill or loss of operational trust is not a measure of safety.

    In the real world of buildings and machinery, security is only occasionally a factor and often clashes with safety. Safety is always number one, at the expense of security.

    • So who are you going to sue? The retailer? The distributor? The importer? Or will you try going after the producer: overseas, in a different jurisdiction, and possibly out of business already (or the owner has simply shut that company down and moved on to the next)?

  • by PJ6 ( 1151747 ) on Saturday June 17, 2017 @10:04PM (#54640827)

    Even with the move toward more agile development and DevOps, vulnerabilities continue to take off

    Last time I checked, both of those fads tend to harm security, not help it.

  • Whether or not the software company accepts liability, the crown goes to the one that can take on that liability at the same price as the others.
  • Look what happened to general aviation when Cessna, Piper, et al. got the pants sued off them. A small four-place plane used to cost about as much as a mid-range Cadillac; after the lawyers got through with them, they cost $200K.

  • by Anonymous Coward on Saturday June 17, 2017 @10:31PM (#54640905)

    "Even with the move toward more agile development and DevOps, vulnerabilities continue to take off..."

    Should read as follows:

    "Because of the move toward more agile, less detail-oriented, lower quality development and DevOps, vulnerabilities continue to take off..."

  • ... in response to litigation.

    I predict the waiver of liability included in EULAs is going to be removed.

    It's similar to signs in the parking lots of Walmart saying, "Not responsible for damage from shopping carts."

    While that sign may discourage some shoppers from filing damage claims, it certainly does not protect Walmart from liability in all cart-related matters.

    As TFS suggests, computing devices are becoming more critical and damages more damaging.

  • Anyone care to explain how agile is supposed to improve security?

  • It will be the hardware companies who buy the cheapest IoT hardware they can find and slap the manufacturer's sample code on top of it, display their logo wherever they can, then ship.

    To which I say, good fucking riddance.
  • ... to define a "state of the art" regarding security. It should contain things like never mixing user input into SQL queries unless the input goes through a whitelist of characters or is escaped by a function proven to work.

    Essentially, that "state of the art" should always be a bit above what idiots do, in order to weed out idiots. Ideally it's defined in a way that compilers can prove it is followed (in the above case, user-input strings and SQL queries could have different types; see the sketch below).

    Slowly but surely, you'd raise the bar.
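
A minimal sketch of that "different types" idea in Python; Query and run() are hypothetical names, not a real framework's API. User input stays a plain str and can reach the database only as a bound parameter, never spliced into the SQL text.

```python
# Sketch: keep SQL text and user input in separate slots of a Query
# type, so input is always bound as data rather than executed as code.
import sqlite3
from typing import NamedTuple


class Query(NamedTuple):
    sql: str             # written by developers, never from users
    params: tuple = ()   # user input travels only in this slot


def run(conn: sqlite3.Connection, q: Query) -> list:
    # The driver binds params separately from the SQL text.
    return conn.execute(q.sql, q.params).fetchall()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"  # classic injection attempt
rows = run(conn, Query("SELECT * FROM users WHERE name = ?", (user_input,)))
print(rows)  # [] -- the payload is matched literally, never executed
```

Under this discipline, the only path for user data into a query is through params, so the classic quote-breaking payload is compared literally against the column instead of rewriting the query.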

  • Large corporations have armies of attorneys to cover their asses. Liability for software faults would benefit them, because they have the resources to kill almost any lawsuit against them. The open-source world, however, would wither and die, because no weekend coder is going to risk everything over a mistake. Expect large corporations to fully endorse software liability laws, since doing so will remove the one kind of competition they can't beat on cost or functionality.

  • If the brakes go out on your car and you crash into a tree, is the person who made the brakes liable?
    If the software "goes out" on your car and you crash into a tree, is the person who made the software liable?

    If you build a car with an easily hackable lock, and someone breaks in, are you liable for theft? (tennis ball trick)
    if you build a car with an easily hackable electronic lock, and someone breaks in, are you liable for theft?

    Do you see the parallels here? Just because someone can do something bad to you doesn't automatically make the maker liable.

  • Agile and DevOps won't do anything on their own to improve your security. I'd have a really hard time taking seriously anyone who thought they did. Also, the current state of the industry is not likely to change as long as there are intelligence agencies that feel it's beneficial for software not to be secure. If your OS were truly secure, you can bet that there'd be a constant push by those guys to introduce backdoors they could exploit.
  • Even with the move toward more agile development and DevOps, vulnerabilities continue to take off...

    *Needs citation.* Seriously, I'm a software developer, I often have to be involved in a variety of security-related aspects of development, and I've been doing it for twenty years. My anecdotal evidence is that security exploits are way *way* down in terms of risk and severity compared to when I entered the industry... I could be wrong (the plural of anecdote is not data), but it feels like the opposite to me.
