Who Is Liable For Software With Security Holes? 441
securitas writes "Interesting article over at eWEEK that asks who is and should be legally responsible for insecure software. Some say it's the manufacturer. Currently software is exempt from product liability as we've come to know it in the physical world. Others say the software licenses should make users responsible if they don't install patches and updates. Infosecurity czar Richard Clarke said in his speech at RSA that Nimda cost US companies an estimated $2 billion. Imagine if Microsoft was legally liable and a $2 billion suit was filed. Now extend that to the other jurisdictions outside the US. What does this mean to open source software, which is being used to a greater extent in corporate environments? Food for thought."
Just like a car.. (Score:2)
I suppose that's only a dream for us OSS kids
Just my US$0.02
Hargun
Re:Just like a car.. (Score:5, Insightful)
Re:Just like a car.. (Score:2)
That's the situation NOW. Wait a couple of years and you'll see the net used for lots of 'critical' missions (like remote surgery, diagnostics, control). THEN a simple DoS (even Nimda) will kill people.
I think this should be sorted out before it becomes a problem.
And of course having legislation doesn't mean it's enforced.
Re:Just like a car.. (Score:2)
Consider: a company designed program X to run a piece of medical equipment. The program fails. A patient dies. Who is responsible? (The company was sued out of existence, it turns out.)
Consider: a company designed mainframe system X. The system fails because of a date bug (like the Y2K bug, but for some reason it hit in 1987). Hospital computers crash. Nobody dies, but it was a distinct possibility.
It isn't that hard to extrapolate situations where computer programs can and do cause actual physical harm to people (would YOU want Win95 running the air traffic control system? Didn't think so).
Holding software makers unaccountable for their errors is ridiculous. No other industry in America is allowed to do this. You can say it's impossible for software to be completely fail-safe. OK, so are cars, VCRs, DVD players, airplanes, etc., but their manufacturers are still held liable. The simple fact is the software industry has been able to produce bug-ridden, crappy software under the banner of 'good enough' for far too long. Accountability is desperately needed.
Re:Just like a car.. (Score:2)
"At first I wondered why I needed to register my toaster with Windows XP, but the computer wouldn't let me on the Internet until I brought it the toaster! Things were fine for a while, until someone hacked into my computer and took control of my toaster [geocities.com]! I tried to sue Microsoft, but the courts ruled that Windows didn't kill my boy, the TOASTER did!"
Re:Just like a car.. (Score:2, Interesting)
Software will always have bugs. But no producer is punished for making insecure programs. Only bad PR. I think it's suboptimal that bad PR is the ONLY incentive to write secure apps.
Company A wants to sell products to e-tailers? Then they'd better issue some kind of warranty (not that it's 100% bug free, but at least a level indicating how hard it is to break, or how quickly they will issue a patch).
Define "faulty" (Score:2)
That gets into a gray area where you really have to define faulty. When it comes to system faults, vendors should be required to offer a guaranteed uptime (they can set the value at whatever they want, so you could sell your software with a guarantee of no more than 20 critical faults a minute, though that might hurt your sales somewhat). As it is, organizations make very few commitments about their systems, allowing Microsoft, for example, to simply push each new OS as "way more stable than that last piece of software, which we sold you under the pretense that it was super duper stable." Is a bicycle faulty if the rider rides irresponsibly and gets hit in traffic? Is a bicycle faulty if it gets stolen or is otherwise maliciously used?
Security robustness is a marketing function (it's a feature, if you will, just like a Volvo withstands impacts better than most other cars), and insofar as vendors don't outright lie about the security of their systems, they should not be held responsible: the responsible parties are the hackers/DoS attackers/etc., and no one should fool themselves otherwise. For all of the talk comparing software to the "real" world, the reality is that the window maker isn't responsible if someone throws a brick through it, and the lock company isn't legally responsible if someone drives a tow truck through the door: as long as it withstood at least the marketed capabilities, there is no vendor fault.
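To make the "guaranteed uptime" idea from the parent concrete, here's a quick back-of-envelope sketch. The percentages are just example figures, not anyone's actual SLA:

```python
# What a "guaranteed uptime" percentage actually allows in downtime.
# All figures here are illustrative, not from any real vendor contract.

def allowed_downtime_minutes(uptime_pct: float) -> float:
    """Minutes of downtime per year permitted by an uptime percentage."""
    minutes_per_year = 365 * 24 * 60  # ignoring leap years
    return minutes_per_year * (1 - uptime_pct / 100)

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% uptime -> {allowed_downtime_minutes(pct):.0f} min/year")
```

The point being: a vendor promising "99% uptime" is still reserving the right to be down for several whole days a year, which is why the number in the warranty matters far more than the word "guaranteed".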
Re:Just like a car.. (Score:3, Insightful)
I didn't charge anybody anything... I didn't even give permission for it to happen. So if this is a crime, then surely if I knowingly give somebody a faulty car (even if I don't charge him), shouldn't I also be guilty?
Just because I don't profit from a transaction doesn't give me the right to put somebody at risk, financially or physically, unless perhaps I am completely forthright, and even then often not; and simply saying "Well, at your own risk" is not completely forthright, not even close.
The problem with your argument is that you offer two different arguments and claim that one applies to paid software and the other to free. Yet your arguments have no dependency on this variable, so it is unclear why they vary. What you appear to be saying is that if you give away software, you are a nice person, and nice people shouldn't be held to the same laws as mean people. Well, a system based on niceness is in a different ballpark than a justice system.
The other way your argument makes sense is if the seller is only liable up to the price he charged and is not liable for damages. Otherwise you're buying the right not to be put in a dangerous situation without your knowledge... which you can't buy.
Re:Just like a car.. (Score:2)
Isn't it interesting that cars and bikes underwent continual improvement throughout the last century, which is still ongoing?
These improvements have made cars and bikes much safer than even what our parents had. Today every major operating system, even Linux, is riddled with bugs of all sorts. Software is still a young field. When you use software, you take a calculated risk.
Re:Just like a car.. (Score:3, Insightful)
My car's design has a flaw and the manufacturer issues a public recall for a free repair, I have this mentioned when I next go for a service, but choose not to have the work done because it's too inconvenient. The part fails and I am involved in an incident that causes harm to a third party - I think I should have my ass sued clean off, don't you?
My software has a bug, the vendor issues a freely downloadable patch, and even emails me about it, which I choose to ignore and don't install it. My server is compromised and used to DoS a third party - I think I should have my ass sued clean off, don't you?
In the case of software this is clearly related to the debate about disclosure of vulnerabilities. You have to acknowledge that software is going to have flaws, that it takes time from the discovery of a flaw to produce, test, and release the fix, and that during this time liability is the grey area this topic is discussing. But once the fix is out and announced, responsibility *has* to be transferred onto the people using the software rather than those who produced it.
I don't think you can blame a vendor for having a bug in their code, because it's not a perfect world and it happens (albeit more with some vendors than with others), and doing so sets a precedent that would affect other industries as well. You can, however, apportion a great deal of blame after the flaw becomes public knowledge, and reapportion that blame once the fix is available, or if the fix is sufficiently tardy in arriving to cause problems. Which explains a great deal about some people's attitudes towards the issue of full disclosure, doesn't it?
nobody is legally at fault (Score:4, Insightful)
Still, they aren't legally responsible for the bugs. If you read most licenses, they say "this software is provided as is." Everybody makes mistakes, and even though software creators should make more effort to stamp out bugs, no code beyond a certain level of complexity is perfect.
The important thing here that needs to happen is that businesses and consumers say "features are nice, but fix the bugs first." At the moment though, they say "features first! bugs aren't displayed on the box." They speak with their wallets by buying buggy software. I don't mean to be one of those typical anti-MS people (even though I dislike their software), but the fact is, they produced extremely buggy software and most people still bought it. That says something.
Re:nobody is legally at fault (Score:2)
it's not so simple (Score:2)
And this raises the question of whether it's even possible to make bug-free software in the first place. Given the complexity of software, 100% bug-free software might not be a realistic goal, and that seems to make it unfair to punish software companies for every bug. Making software companies liable could severely hinder software development due to the high risk involved.
It's very hard to assess liability when software fails. I haven't the solution and I imagine it'll be a while before anything concrete is determined.
Re:nobody is legally at fault (Score:2)
The software industry heavily lobbied for legislation (and got it, of course) that basically makes its products legally without warranty.
In my opinion, buggy software is a result of "time-to-market" hype driven by managerial/marketing pressure, and of insufficient, undermanned, undertrained people coding away and reinventing the wheel at every chance, making YAWOD (Yet Another Wrapper Or Driver) because they don't understand something (typical of micro$oft coding). What is ActiveX called now? Wasn't it DCOM... wait... COM... wait
Features last, working first. I'd prefer features in a patch and something that works OOTB.
"Interface is everything."
huh? (Score:2)
The software industry heavily lobbied for legislation (and got it, of course) that basically makes its products legally without warranty.
Which legislation are you talking about? The only law I know of that would accomplish this for them is UCITA, and that's only been adopted by 2 states.
Re:huh? (Score:2)
What about ridiculous software patents? Those are being "legally" enforced left and right; whole companies are based on IP-squatting.
Wake up and smell the fucking coffee. [eff.org]
Re:Changes in the education system (Score:3, Interesting)
Unless they have actual knowledge of the laws in question.
This is similar to those signs that say not responsible for blah blah blah. Bullshit. If they are responsible, then they are responsible. Period.
The more subtle one you tend to find in software licensing is "we disclaim anything the law will allow us to disclaim," relying on the usually correct assumption that most people won't actually know what can and can't be disclaimed that way...
Defined in the EULA/License (Score:2)
The makers of the software? (Score:2)
If someone is mowing the lawn and a stick flies up and takes out an eye the lawn mower company isn't liable if there is a warning somewhere saying "must wear eye protection while operating". Maybe a "must back up all data" in the software agreement would cover the software companies somewhat.. but then again, who reads the agreements in the first place?
Re:The makers of the software? (Score:2, Insightful)
-you must not put a cat in the microwave
-if you vandalize the vending machine it might tip over and kill you.
-playing on the nintendo 8hrs a day 6 days a week might not be wise if you have seizures.
i don't think it's reasonable that the manufacturer is responsible for all the really stupid things the customer can do with its product. there is such a thing as common sense. people should not sue 'because it did not say on the package that i should be careful when using a chainsaw'.
btw in all EULAs there is a phrase that says
which is also common sense. if a software company creates software that contains 40 million lines of code, it cannot be bug-free. no matter if your name is msft, redhat, oracle or apple.
demanding that it should be is unrealistic.
though i agree that better design would solve a lot of problems.
btw.
there is no spoon.......
when you realize that, you will see that it is not the software that contains bugs, but that your mind interprets undocumented features that way
maybe we (SE's) should become more liable (Score:2)
Re:maybe we (SE's) should become more liable (Score:2, Insightful)
Most product failures are management decisions (tradeoffs). And managers are basically never held liable; even the companies usually have enough lawyers to avoid real consequences.
Make the manufacturer responsible? (Score:2, Funny)
buh bye sendmail!
-Bill
Depends who you talk to. (Score:3, Insightful)
Now I also remember when the commercial version of SSH released v3.0, there was a HUGE security hole (passwords of length 2 or less would always work...), and the SSH developers took the heat; rightfully so. They 'fessed up, and they fixed it. As far as I know, there were no incidents because of it, because the problem was fixed before exploitation became widespread. But if it did create an issue (like Nimda, Code Red 1/2, etc.) before a fix was made (proactive vs. reactive), they should be held liable, not the users. If a fix exists, and a user says "oh, I don't have *that* problem," well, I think we all know who should get the blame. Just my $0.02 worth though...
Re:Depends who you talk to. (Score:5, Interesting)
Does it seem to anyone else that the whole software industry is starting to look like a house of cards?
All these products are being marketed as easy to use, easy to take care of, easy to everything. It's not. It's hard, very hard sometimes. I run into the strangest interdependencies, completely unexpected behavior, just plain weird shit all the time.
It's dumb stuff mostly. How many of you knew that Photoshop 6.0 will randomly cut off network access on a Windows box? (6.0.1 fixes it.) When presented with this problem, Photoshop was not my first thought; I was looking at the switch, changing cables, etc. It took me an hour to realize this only happened when Photoshop was running. Would the user have been able to figure this out herself? Not very quickly.
People are starting to clue into this. I've had two people ask me if they should buy Windows XP. Both of them asked if it would mess up any of their programs first, before they asked if XP had any new features they would find useful. It seems to me the marketing messages are failing; the upgrade treadmill is starting to look more and more like a sham. Seriously, what is the compelling value that will make me upgrade my company from Office 2k to XP? Somebody tell me, cause I have no idea at all. I don't want to whoosh around the desert on my desk, I want to not restore Outlook .pst files 3 times a week.
I think soon the software industry is going to have to seriously consider making a more stable product; the flashy whiz-bang product doesn't have the draw it used to. Security is really only a part of this, but given the Summer of the Worms (tm) we just went through, it is the most visible part right now. People are terrified of their email, those little home firewalls are flying off the shelves; we're almost to the point of widespread clue. I just hope we make it.
Gupta reads Slashdot (Score:4, Funny)
Re:Gupta reads Slashdot (Score:2, Funny)
Who is liable for defective software? This is a question that has plagued many in its time. I intend on answering it. What we must do is write perfect software. Then there won't be defective software. But then, what if there is buggy software? Huh? Whatcha gonna do about it? Then you gotta sue. But it shouldn't involve legal action. It should be solved out of court but they should be legally liable. This question has plagued many people in it's time but I have solved it.
Nobody is responsible. (Score:2)
How can users know about holes when a company charges for tech support calls? And then if there is a hole, the user must pay for the upgrade.
Prosecute people for being in the wrong place? (Score:5, Insightful)
The supposed $2B in "damages" are a liability on those who wrote and launched the worms, directly.
By connecting to the net, just like stepping outside your door, you are assuming risk.
That said, Microsoft should be liable if they represent their product as "safe" and it isn't. I believe their representation of XP as the "Most Secure Windows Ever" does open the company to prosecution for misleading advertising, but who has the resources to prosecute it?
There is a great deal of difficulty with trying to assign liability to those who are in the wrong place at the wrong time. Someone who gets wet because they weren't wearing a long coat when a truck splashed them doesn't expect to sue the truck driver, do they?
The systems owners who were "damaged" by the worms are indeed guilty of not securing their systems. Who will prosecute them? And for what?
Liability is based on two things: intent and negligence. False advertising and misrepresentation are the former; the success of virii is the latter.
Personally, I think a few false-advertising claims against Microsoft would be great, and from a theoretical standpoint they certainly are misrepresenting their products when they call them "secure" or "safe". Who's got a million or two for the legal fees when we lose?
Bob-
It depends (Score:2)
Bad PR from security hole after security hole is enough of a liability for companies to wise up, one should hope (how many times have you heard from respected experts, and at times Microsoft itself, to disable IIS on Win2k?).
If you contract a company to design specific software to suit your specific needs, and that software does not perform adequately (security holes, or what have you), then I believe it is acceptable to blame the software manufacturer.
Face it, security holes exist. No one likes them, everyone wants to blame someone else for them, but you just have to accept that they do exist.
Weigh your options and choose the option that has proven itself. Be it number of security problems, speed in which they were fixed, or severity (proven and potential)of these vulnerabilities.
Oftentimes this points in the direction away from Microsoft, but that's in the eye of the beholder.
-kwishot
Re:It depends (Score:2)
Unless of course, it's Micro$oft... NSA key anyone? AutoUpdate? There are more...
Defective software (Score:5, Informative)
Many Commonwealth jurisdictions have similar regulatory regimes.
It is arguable that software which doesn't work very well fails all of the above requirements. A former law school acquaintance of mine even sued a car distributor over a fleet of Lada Samaras, claiming that they didn't fit the description of a "motor vehicle" (i.e., a moving machine!) because they spent all their time in the shop!
What needs to be remembered is that all software producers can be liable under such a regime, Linux or Winduhs.
Re:Defective software (Score:2)
Does the Australian law (either in the statute or an appropriate court ruling) define "software" as "goods"? The usual issue here is that abstract licences aren't either goods or services....
Software makers shouldn't be liable... (Score:2)
Think about how many companies form as little one or two man shops that have great ideas.
Sure they have bugs and security holes, and hopefully they're fixed before any damage is done, but suing a small shop for a million dollars because you didn't test something before installing it on production servers is a joke.
Instead, you could pay another company to test your security all the way around including all software installed on a server.
Also, if there were something that says the software maker is liable, open source should be exempt, as everyone has the opportunity to review exactly what the code does or doesn't do.
License Agreements..... (Score:2, Interesting)
here's my view (Score:4, Insightful)
Unfortunately there's no way to produce software which is bug free; it's just not possible today. Well, perhaps with the exception of hello world.
I don't think software producers should be responsible unless it's shown they were grossly negligent, and even then they are not necessarily responsible. Otherwise amer^H^H^H^H people are probably just going to start suing each other stupid, leading to massive rises in software prices. OTOH, when I use Windows it pisses me off when it crashes. I upgraded from 95 to XP a few months ago. MS says XP is rock stable, hardly ever crashes. Bullshit. The lies in advertising piss me off more than the crashes themselves; false advertising is something I'd like to see them punished for.
Re:here's my view (Score:2, Insightful)
The problem is that there's no regulation at all. When something goes wrong we all blame it on "sCriPt KiDz or CiberTerrorists".
It's like opening a bank in a bazaar... or an ice-cream shop in a highly secured building. Software is the same: there should be different warranties regarding security, so that each kind of company could pick the one that fits.
Re:here's my view (Score:2)
Re:here's my view (Score:3, Insightful)
Actually, you are wrong in your examples, though you may be correct in your assessment.
Let's take the space shuttle example, shall we? Bug-freedom is achieved by:
A> Highly rigid quality assurance. Unfeasible for any non-life-critical situation, due to extremely high cost.
B> Two independent, different systems that check each other constantly. Those systems have both different software and hardware (and possibly a different design philosophy), so a bug in the same place is highly unlikely.
Face it: bug-free software is possible, but once you get beyond notepad level, the cost of fixing all the problems is greater (often *much* greater) than what you will get for it, not to mention the *time* it takes to get such checks made.
The Linux liability case (Score:2)
This is a standard legal theory. Manufacturers get third-party liability claims all the time, and carry insurance to deal with them. Except in the Y2K area, though, this doesn't seem to have been litigated yet.
The choices are obvious... (Score:5, Funny)
A. The Author/Publisher
B. The User
C. CowboyNeal
Novel Idea... (Score:2)
A note about software licenses... (Score:5, Interesting)
Re:A note about software licenses... (Score:2)
Re:A note about software licenses... (Score:3, Interesting)
Perhaps... (Score:2)
Free software is a separate case, IMO. If, for example, I download a Linux ISO, then there has been no sale. Accordingly, no contract has been entered into either by myself or the creator of the software. I may have obtained the product legally, but since no contract of sale is present, I am SOL if anything bad happens.
It's not the bugs; it's negligence. (Score:2, Interesting)
In the case of Microsoft, you can demonstrate a pattern of negligence in the way they test and release their products. The company also publicly denies that there are problems until it is too late for users to do much of anything to protect themselves and their networks. The last thing MS wants is administrators migrating their operations off MS products in favor of more controllable risk (like open source, or a different and better-tested proprietary product). I say controllable risk because no software is bug-free, and it is the job of the administrator to manage the technical arena and minimize risks to their networks.
With the Redmond mis- and disinformation machine, you can never be sure what the truth is in terms of real support from the vendor. After all, this latest round with UPnP pretty much proved that the company puts profits over security. I mean, only Microsoft would try to tell the FBI that a security disaster waiting to happen wasn't one. It IS how they maintain their 'edge'.
Death by a 1000 cuts.
free vs. commercial (Score:5, Insightful)
This totally changes the nature of the beast. As a specific, non-tech example, I can give a friend a ride. I can even graciously accept gas money, or a free lunch for my troubles. I could even be a good Samaritan and offer a lift to total strangers.
But the instant I actively charge people for this, even if it's a token amount, I become a "for hire" limousine service and am required to obey a large number of laws. Some are "on point"; others seem to exist solely to eliminate competition.
There are other, more subtle differences. I can refuse to give a friend a lift without explanation. Once I become "for hire" I can't (legally) refuse to accept a passenger without a good reason. E.g., someone showing a weapon can be refused, but someone who stinks because they haven't bathed in weeks can't be refused.
An even more extreme example is the difference between my friend asking me if I've ever experienced certain medical symptoms and a stranger paying me for advice. The former is a casual conversation between friends (or not so casual, if it involves a possible STD).
In the software realm, I would expect to see a similar difference in the treatment of amateur efforts (where people develop software for the love of the craft) and commercial efforts. If someone is grossly negligent, it won't matter whether they're compensated or not. But for routine oversights, I would expect to see far more severe penalties for commercial vendors than for OSS providers.
The second difference is that when you get software from Microsoft, you can't change it. Any errors *have* to be due to Microsoft's (in)action. In contrast, free software is released in source form, and patches are routinely accepted. It's not morally acceptable to hold people accountable for the (mis)actions of others, so it's much harder to justify penalties against parties that provide source code.
Re:free vs. commercial (Score:2)
Re:free vs. commercial (Score:2)
It also makes sense to consider the difference between closed source and open source. With closed source, even if you don't pay for it, you effectively get something which is "take it or leave it". With open source (even if you pay for it) you get something which you can modify yourself...
Re:free vs. commercial (Score:3, Insightful)
Now, one has to consider: does mere notification to the developer constitute due diligence? What happens if the developer doesn't acknowledge that there is a problem (Microsoft)? What happens if a product has such complex management that fixes are routinely overlooked (Linux)? What happens if a project is abandoned (half of SourceForge)? What happens if the sole developer dies (no example given)?
What may be necessary is a form of limited tort liability, similar to what law enforcement in my home state has. There is a limit on the damages that can be collected from any lawsuit against law enforcement, regardless of actual damage caused.
Which of course leads to the situation where someone sustains a billion dollars of economic hardship, but is limited to only a million in lawsuit damages. It isn't justice, and the money won't come near recovery for the damages, so
This is one ugly situation.
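The limited-tort arithmetic in the post above can be sketched in a couple of lines (the cap value here is hypothetical, not the actual figure from any state's statute):

```python
# Sketch of a statutory damages cap, as in the limited-tort idea above.
# The cap amount is purely illustrative.

CAP = 1_000_000  # hypothetical statutory limit on recoverable damages

def recoverable(actual_damages: float, cap: float = CAP) -> float:
    """Amount a plaintiff can actually collect under a damages cap."""
    return min(actual_damages, cap)

# A billion dollars of economic hardship still recovers only the cap:
print(recoverable(1_000_000_000))
```

Which is exactly the injustice the post describes: past the cap, the size of the actual harm stops mattering at all.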
No one is at fault, or liable. Sorry, MS bashers. (Score:2)
As for open source, "As is" is very much implied before you even start using it. It's impossible for anyone to be at fault in either case, from a legal standpoint. Therefore, this story is completely bogus.
Me. (Score:2, Funny)
Thanks,
Al Gore
The software industry is a great business (Score:4, Insightful)
Why does anyone even try to sell anything else?
Re:The software industry is a great business (Score:3, Informative)
Difficult programmers? (That's a problem?) Please. I am a programmer, so I take offense at both your generalizations.
You haven't refuted my point that selling software is better than selling airplanes. If an airplane comes apart in flight, and the flaw was even theoretically foreseeable, you expose yourself to incredible liability. I wouldn't want to be in the airplane business, or any "real" industry. It looks like a good way to get an ulcer. People in the software world like to fancy themselves as being in a real manufacturing business as opposed to a service-based one, until the topic of legal liability comes up. Then we suddenly view our position much more clearly.
Now... should software companies be liable for damages from bugs? I think it depends on the intended use of the product and the seriousness of the bug. Medical, military, and government software should at least be well-tested and well-written. But a bug that wipes out a user's save files for Bobo the Monkey III should not even be legally actionable.
Well that's reasonable, but those are two extremes. Nuclear, aerospace, medical, and military software is generally integrated into and viewed as a part of a larger physical system. If a microcontroller in an airplane has a software problem and feeds wrong information to an actuator on the plane causing a crash, you expose yourself to liability as a seller of a faulty airplane, not a faulty software program. Software that isn't sold as part of a larger machine with real physical parts doesn't have this problem. The shrinkwrap around a software box (and the EULA wrapper around the disk) is like an armor against lawsuits.
Microsoft products have various back doors like the buffer overflow that Code Red exploited, but they also have front doors and that's just incredible and inexcusable! Outlook has an intentional feature where it automatically executes VBA code contained in an attachment when you open it. This allows worms to flood the Internet on a regular basis, without even having to do hackish back-door stuff like overflowing a buffer. But it's not really a bug, it's a feature that wasn't well thought out. Someone wasn't using their head. All of Office suffers from feature creep and they don't think things through as they shovel thousands of questionable features into their software. (Maybe I lead a sheltered life, but I have yet to hear of anyone sending a legitimate VBA script via an Outlook attachment. Have you?) Incredibly, for all the monetary damage those worms have caused, Microsoft has suffered only a little humiliation. It has exposed itself to no product liability at all. If Microsoft sold airplanes, or medical equipment, or solid rocket boosters, they'd be out of business by now. Their workmanship is just too mediocre for anything except software.
Rod Serling Would Say... (Score:5, Funny)
Source and liability (Score:2, Interesting)
If you don't publish the source, you're liable. Hiding the inside of a program is perfectly OK, assuming that you take full responsibility for the manner in which it works.
If you publish the source, you can be exempted. By exposing the inner workings, anyone can verify the suitability of the software for a given purpose.
MS plays it safe by not being responsible (suable) for their bugs. If they were required to either FIX the holes before release or publish the source, they'd concentrate on security before feature count, which would be doubly good.
Only problem is, this way of cutting things would hardly feed the lawyers :)
No software liability please (Score:2, Insightful)
As others have pointed out, if someone breaks into your car, then you cannot sue the car manufacturer (at least it is difficult to do so successfully!) for the theft of your vehicle. Similarly if someone steals your hi-fi from your house, you do not sue the manufacturer of your locks and windows, or even the hi-fi maker.
I do believe that software should be reliable, and perhaps there is a case for liability if the operation of the software causes a major disaster without malicious outside interference. The problem with that, however, is that we're all too aware of what the result would be: software prices will skyrocket to cover the immense legal costs of defending and settling these claims.
The only people who would benefit from this would not be the software developers, regardless of whether it's Microsoft or open source developers; it would be the legal profession, aiming to take 10-50% of your damages award when you did settle.
Locksmiths? (Score:2)
I would really like to know what some lawyers have to say.
A bad way of thinking. (Score:2)
If I buy a car, I'm paying for transportation. It would seem silly to sue the manufacturer because somebody stole my car and I found out the locks on it were easy to pick.
I use Outlook as my mail program at work. I paid for it, and I expect it to be able to send and receive mail. If somebody illegally exploits that program to do malicious things, I don't blame Microsoft, I blame the person who wrote the virus.
On the other hand, I also own a virus scan program. This is a security measure I pay for. If my computer is attacked by a virus, I expect my virus scan program to detect it and remove it. After all, that's what I'm paying for.
Yet the mentality is that if somebody illegally exploits my mail program, Microsoft is at fault, while the virus scanner, which I also pay for and keep updated, and which failed to do its task, remains blameless.
It's nuts.
come on, it's not that hard (Score:3, Insightful)
In addition, there should perhaps be restrictions on what can be sold: for the sale to be legal, consumer software should perhaps have to conform to some basic safety standards, analogous to UL standards for electrical devices. (Since this is a restriction on sales, it would obviously not apply to free software.)
Large commercial customers are presumed to be competent, and they should be responsible for this themselves; they don't need regulations or legislation to protect them. For example, if a company exposes 10000 people to identity theft through an insecure computer system, the company should be legally liable for that. The company will then insure against that risk (possibly directly through the software vendor). The insurer will assess the risk and compute the cost of the insurance. The company can then take the cost of the insurance into account when selecting software. I.e., it comes down to the question: is Apache plus insurance more or less expensive than IIS plus insurance?
lack of legal accountability is just the beginning (Score:2, Interesting)
* Human Nature
People in general don't like to admit that they are wrong, and companies small and large are not much different. Even when they distribute a patch, there is rarely accurate or complete information about the problem or its severity. We think apologizing is a sign of weakness.
* Corporate Image
By admitting fault, a company loses credibility. A company is always willing to live with a few unhappy customers to protect its overall image. That's one of the reasons why software defects, security-related or otherwise, get hushed up and buried. You all know that the euphemism for this policy, when applied to security, is "security through obscurity". You also know how well that works. Admitting fault is the last thing a company will do. Even when they do admit it publicly, they will always play down the severity of the problem.
* Monopoly
When a company is a monopoly, there is almost no incentive to admit to a problem and fix it. If you knew you couldn't get fired and you'd get paid the same whether you worked one hour a day or eight, which would you choose? Lack of incentive is the very reason why communism is bad for progress. The only reason Microsoft is pretending to care about security recently is that they are having trouble penetrating (from behind) the enterprise market with their tarnished image.
* Money
When I say money, I don't mean the cost to create or distribute bug fixes. Putting a patch on a website for users to download isn't such an expensive proposition; it's a lot different from a car manufacturer doing a recall. When I say money, I mean greed. Companies are using bug fixes as a ploy to get users to upgrade. Marketing departments have figured out that consumers are willing to pay for bug fixes. An example of this is Windows 98 and ME: essentially they are selling you a big pile of bug fixes as a full product and charging you for it. Sneaky, isn't it? MS is not the only party guilty of this devious practice. Many companies, such as Vignette and BEA Systems, have done this sort of thing. It's becoming very common, and we have all been brainwashed to accept it as the norm.
Since Free Software/Open Source has only one of these four problems to contend with, I think it has a somewhat better chance of producing superior software than a commercial environment does.
Inedible food for thought (Score:2)
If you write it, you do your best to make it secure and keep it that way. If you write insecurities into it, that's your problem.
If you install it, it's up to you to make sure it stays up to date with patches.
I've got no sympathy for people with cracked boxes when there's a patch that should've been applied (i.e., in 99.9% of linux and 99.99% of windoze cases).
I don't see what casting it in law is going to achieve; I'd far rather use the common-sense principle that people are responsible for their own doings, with a few precedent cases to back it up. (That'd be a first.)
Freedom of the Market (Score:2)
Who's responsible for network security? (Score:2, Interesting)
In a normal heterogeneous environment (as 99% of networks are), you're going to be dealing with software and hardware from many different vendors.
It is possible (if not probable) that the interaction of these components will create security holes for an attacker to exploit. Which vendor do you blame? They may all be working as designed. Do you blame your low-paid network guys? Do you spend hundreds of thousands to hire external consultants? Can you blame (and sue) them if your network is breached?
What about default configurations of software? What if the default configuration is insecure, but the documentation describes how to secure it?
I have my own thoughts on these issues, I'd like to see what the general consensus is here.
Btw, if you're looking for a secure OS, try XTS 300 STOP [ncsc.mil].
The EPL makes interesting reading. [ncsc.mil]
Damage estimates (Score:2)
Did I read that right? (Score:2)
Oh yes, I forgot: features!
These are not security products. (Score:5, Insightful)
Being able to stand up against novel forms of human attack is not basic product quality. Worms, trojans, and viruses are not mere environmental hazards, they are the results of intensive effort to find and exploit any system weaknesses.
Disappointed customers and annoyed partners are punishment enough. Market forces will correct the problem; people will eventually learn not to buy stuff that doesn't work. They will also learn to do their part, since security doesn't come in a shrink-wrapped box.
In a way, these petty vandals are doing us all a favor by forcing us to harden our systems. If nobody exploited the security holes, you couldn't convince people to spend extra money or effort on security. Then, when somebody made a truly serious attack, as an act of war, we would be utterly defenseless. I believe humans evolved an instinct for mischief for just this reason, and so we shouldn't be too hard on the script kiddies.
In regards to proprietary software (Score:2)
First, it makes the software company more diligent about getting all the bugs out of its software, and makes it worry more about security concerns (which are, shockingly, rarely "bugs" in the software).
Second, it makes the software company work harder at producing a patch that fixes the problem.
Third (and most importantly in my book) it forces system admins to work faster at patching software.
Double your Money Back (Score:2)
"If product fails to perform in a secure manner, buyer of product will be entitled to a refund in the amount of two times the purchase price."
Free software covered! :-)
Re: (Score:3, Insightful)
Careful what you wish for... (Score:3, Interesting)
The prevailing quality of commercial software is set by the market, and reflects the balance of features, updates, price and quality that users want. That's why your word processor crashes sometimes and your defibrillator doesn't. Attempting to set a new and better balance by turning hordes of plaintiffs' lawyers loose on the software industry is going to improve the situation of users about as well as turning lawyers loose on the tobacco industry has helped smokers.
Oh, and if you think that open source software is going to be unaffected by this, either because it has no bugs or because it's so cuddly it will be exempted from liability -- good luck. Bye-bye, Red Hat!
Closed Source = Liability (Score:2)
Open source software provides a method by which users can confirm functionality (checking the source to see it really does what it's meant to), report faults to the vendor and even make fixes themselves, if required. These factors should result in a vastly reduced liability, since this kind of software gives users the tools to take responsibility for their systems. Even if users don't have the skills or inclination to use the source, they can hire someone who does.
While this may sound like pandering to the open source crowd and Microsoft-bashing, it just seems to make good sense... keeping the source to yourself means that you have to take responsibility.
Re: (Score:2)
Software liability vs book liability... (Score:2)
Can I sue Apache? (Score:2)
Depends on the situation (Score:3, Insightful)
If the vendor is informed and fixes the bug in a reasonable amount of time then they shouldn't be liable. (Reasonable being a flexible span of time. If a bug is particularly vexing but they keep their users informed of the progress, then they should get extra time. But if they just say "yeah, yeah, we'll work on it" and then nothing happens for a month, they don't get extra time.) Of course, if the vendor is informed about the bug and does nothing about it, they should be made liable.
Finally, if they release a patch but the user doesn't install it and has their security compromised (e.g. what happened with CodeRed), the user is the one at fault. In this case, it would be like an automobile manufacturer issuing a recall, a consumer ignoring the recall, and then getting into an accident because of the very defect that prompted the recall. Software companies shouldn't be liable for the stupidity/ignorance of their users.
Except... (Score:2)
Truth in Advertising (Score:3, Informative)
Automatically applying patches is NOT a solution! There are countless stories where applying a patch caused formerly working software to crash.(*)
One major advantage of OSS over commercial software is the availability of the source code. Another major benefit, though less well recognized, is the visibility of REPORTED DEFECTS. Prior to obtaining an OSS application, say on sourceforge, I can peruse the bug list and get a complete list of reported bugs. What are the chances I can see the complete list of reported defects in, say, Microsoft Office?
Okay, why not just pass a law that requires commercial software developers to make all reported bugs publicly visible? Ain't gonna happen; political contributions and lobbying efforts would squash that in a heartbeat.
BUT, there's another approach. Don't use LEGAL requirements -- make it a MARKET requirement.
In other words, consider these two scenarios when making a recommendation for two different software packages:
In short, software will always have bugs -- just as OSS makes the code available, we can use market forces to trumpet the same visibility of the known (and future) bugs.
(*) Footnote: Feature vs Bug... many years ago I worked for 2+ years in testing a COBOL compiler that was being upgraded to support the latest standard. The version that was already out in the field was rife with bugs. Several customers were worried that we were going to fix some bugs they depended on! Though non-standard code, they had developed workarounds and used them extensively -- fixing the bugs in the compiler would break their programs!
No question about it... (Score:3, Insightful)
This is especially true of their enterprise products, like, say, Outlook/Exchange. It should not be a full-time job patching and reconfiguring the damn stuff to keep the misfit script kiddies with Outlook Worm Kits from bringing down an entire organization's e-mail system. Microsoft should damn well have been able to be held liable for something like ILOVEYOU, that knocked some very large companies' mailservers off the Net for days.
Imagine if, after all the car commercials boasting airbags, crumple zones, etc., those safety features turned out not to work, and then, while paging through the Owner's Manual from your hospital bed, you found a EULA in the back disclaiming Ford/GM/whoever of all liability for the failure?
The biggest bullshit, though, is the notion that people will eventually get pissed off about software not living up to the hype and take their business elsewhere. If that theory held water, Microsoft would already be a memory amongst sysadmins these days. Companies are practically locked into using Microsoft products. And what people use at work, they will buy and use at home because by and large, they are sheep who fear change. Which is exactly the kind of environment in which companies like Microsoft can shovel sub-par shit out the door, not be liable for its flaws, and still thrive.
~Philly
Viruses are a bad example (Score:2)
The fault for Nimda, however lies squarely on the shoulders of the virus author. Claiming that an operating system, no matter how insecure, is at fault, is like claiming that non-bulletproof t-shirts are responsible for murder by gunshot. Murderers are responsible for murder. Virus authors are responsible for viruses. Software writers are responsible for software problems-- but not for criminal acts by other people.
One difference (Score:2, Interesting)
There is a related story from a couple of years ago (I don't remember exactly when). Tim Hortons runs a Roll Up the Rim to Win promotion every year: when you buy a coffee, you can roll up the rim of the cup to see if you won a prize (all I ever got was donuts and more coffee; go figure!). Well, it came out that some of the people who worked at the company manufacturing those cups were cheating by unwrapping the rims and stealing prizes. I know that company lost the contract; I do not remember if they were sued for damages as well. I think they were; they failed to provide the reasonable service they were contracted for.
OSS is a bit different. It's public domain. Everyone owns it - therefore if you choose to use it, and if it breaks you yourself are responsible for damages.
That's what I think. I don't know how accurate this is, but I do realize that it's not such a great thing. If a company has to choose between OSS and a proprietary solution, it will choose the proprietary one, simply because IF something goes wrong, it has a chance of getting some compensation.
It's a simple choice - do you buy a reliable car, or one less reliable with insurance?
Wouldn't it be nice if life came with guarantees! (Score:2)
It's a cruel fact of life.
No security analyst worth his weight in sand will ever tell you that a system is 100% secure. That is why people like Bruce Schneier ("Secrets and Lies: Digital Security in a Networked World") [amazon.com] talk about security landscapes and about weighing the cost versus the benefit of implementing a given security feature.
It is up to the System Administrator(s) to determine which solutions make sense. Most know that they are buying a system that is hackable by definition when they use M$, but find the risks acceptable.
What M$ should be held liable for is their blatant lies when they say that security is a priority for them, because it isn't.
If Msft was liable for $2 billion (Score:2)
Perfection and moderation (Score:4, Insightful)
But at this point in time, it would be disastrous to start allowing liability. Why? Because liability is determined by the court system, and, with no offense intended, the court system is currently incompetent to make those sorts of decisions.
I have no faith in the ability of the court system to distinguish between an obscure flaw that allows a man-in-the-middle attack on a so-called "secure" connection, and a glaringly obvious security problem like "By default, everyone in the world has full access to your desktop." (reference: Symantec's PCAnywhere for a *very* long time.) In fact, I don't trust me to make those decisions.
At this point in time, and at our current technology level, as we've all heard and said many times, one wrong character in the wrong location, out of billions, can cause a difficult-to-detect error that, when exploited, can give an attacker root access. It's difficult to come up with any sort of definition of proportional responsibility.
If a bridge collapses because all of the tons upon tons of concrete used were of an inferior grade, that's one thing. But if the bridge collapses because one screw was made of aluminum instead of steel, is that worth suing over? My real point can be seen in how this metaphor is not applicable: a bridge would never collapse over something so trivial unless it had other fundamental problems! Software is fundamentally more fragile. (So far, all attempts to negate this have essentially failed, and I'm not willing to count on some miraculous development in the future. Though I suppose if formal methods were legally mandated, people like me who can understand them would suddenly no longer be competing with hacks who think they're leet 'cause they can sorta use Perl... >:-) )
Even a professional like me might be hard pressed, after the fact, to determine which sort of problem is before the court, to determine liability. Do you want to leave it in the hands of lawyers?
Re:Two sides to every coin (Score:2)
Re:Two sides to every coin (Score:2)
-- Rich
Re:Two sides to every coin (Score:5, Insightful)
With power comes responsibility (usually) (Score:2)
Yes, now imagine if Linus Torvalds or the FreeBSD Foundation were liable for that same $2 billion. They would be SOL. Microsoft would just be annoyed.
People often tend to forget a very important factor when talking about Microsoft. Microsoft is a *monopoly*, it's official now. [slashdot.org]
With that monopoly power they have killed off a lot of the competition by creating proprietary standards.
And here is the important fact: People/companies no longer have any *choice* but to use Microsoft's products if they want to share information with someone else. And what companies don't share information today ? None !
So please, don't compare the Microsoft user license/responsibilities/whatever, which you have no choice but to accept or go out of business, to the open-source ones that people can very easily walk away from if they dislike them.
Re:Two sides to every coin (Score:2, Insightful)
Did we have a similar incident that caused such damage on Linux or FreeBSD platforms? I know that Open Source software also appears in security announcements, but I don't remember any of those issues causing so much trouble.
Yes, you can argue that Microsoft products are more widespread than Open Source, but then you should also consider that Open Source usually comes more or less secure out of the box, while Microsoft products are insecure out of the box. And of course Microsoft is trying to put the responsibility for security issues on the shoulders of the user, but if a system is insecure by default, then it's not the fault of the user.
Compare it with cars. If I buy a car without brakes and the salespeople tell me "that's the safest car in the world" and I have an accident... who is responsible? If I drive a car known to be safe, don't fasten my seat belt, drive faster than allowed, and get hurt in an accident, then it's my own fault.
The only problem is that a car usually has to pass a lot of safety tests before firms are allowed to sell it and you are allowed to drive it in daily traffic. With software, nobody checks whether you are able to use it or whether it fulfills minimum security requirements. So we all meet on the "information highway" and some of us suffer because others have insecure "cars".
Re:Wrong issue -- +1 Insightful (Score:2)
Re:Wrong issue (Score:2)
Re:Wrong issue (Score:2)
MS is extremely stable for some people, extremely unstable for others, and a large part of that variance is due to 3rd-party software, DLLs and drivers.
MS cannot be held accountable for every possible configuration or installation base out there.
To do so would be the equivalent of holding a brick manufacturer liable when a building constructed with their bricks falls down. You have to show that it is a defect in the brick, and not someone making an unbalanced building.
Re:Wrong issue (Score:3, Interesting)
This is one good reason for open source software. If there is a bug, people will fix it. There isn't a financial incentive to ignore the bug until it causes real problems.
Blame the victim? (Score:3, Interesting)
> The question should not be who is responsible for insecure code but rather what can be done to discourage people from vandalism and how to track down and punish those who choose to break the law.
I agree, in principle. A similar concept applies to copy protection; we should concentrate on punishing theft rather than on limiting the fair-use capabilities of our electronics.
But in this case, I've been wondering whether society's best interest lies in a different strategy, more pragmatic if less idealistic.
I'm normally adamantly against blaming the victim for crimes, but consider this. What if we legalized hacking? Within a few weeks, incompetent sysadmins/secadmins would be out on the street. Within a few months, software that was not patched promptly would be replaced by software that was. Within a few years, software that was not essentially secure would be off the market.
Punishing the criminal is certainly just, but it doesn't do a heck of a lot of good to spank someone after the damage has been done. Society is going to be more dependent on computers in the future, and more at risk from insecure software. We need to take radical action to fix the problem before it grows from inconvenient to devastating.
Admittedly this would cause a great deal of short-term disruption, but at least the problem would get fixed.
It's possible to build secure software; developers and vendors just have to care enough.
Re:Wrong issue (Score:2)
If you leave your door unlocked and a thief steals your TiVo, is that any less of a crime than if your door was locked? Don't blame the locksmith, blame the thief!
The real question is what standards will we use to prosecute people who break the law, and will they be at all equal?
Consider these two stories, from The Reg [theregister.co.uk] and The Rochester, NY Newspaper [democratandchronicle.com]. In both cases, web sites were broken into by guys in their twenties who said that the security on those sites was woefully inadequate, and who claimed that they were practically "invited" in. The library even mentioned that they were in the middle of revamping their security, so they knew they had problems.
Anyway, the guy who had access to Rush Limbaugh's social security number and made himself a NY Times employee in their database gets off scot-free, while the guy who did not access any sensitive information at a county library and "merely" changed their web page is facing up to seven years in prison.
Granted, the guy who broke into the Times was Adrian Lamo, who is apparently considered a "white-hat" hacker and has a track record of playing nice with the corporations he hacks into. (He may even read /., for all I know). But why is he going to get off the hook for his vandalism, while the other guy is facing a long sentence? Didn't they essentially do the same thing? Maybe I need more coffee this morning, but something doesn't sound right...
Re:huh? (Score:2)
Re:IMO (Score:2)
I think a lot of us are talking specifically about MS here. MS has supported all of their products (or at least a good number of them) well past 5 years. In fact, it was only in November of 2001 that they decided they were not going to "officially" support Windows 95 anymore. Secondly, like you said, it should be the responsibility of the users/administrators to keep up with patches. That is the reason Nimda and Code Red were able to propagate so much. Microsoft had published a bug fix three months prior to Code Red, but only a small fraction of administrators actually applied it. So whose fault is that? Why should MS be liable for the laziness/inability of admins to fix a problem they were notified about? Those admins were as much responsible for the spread as the creators of the worm.
Re:Well surely.. (Score:2)
Look at it this way. A company is not willing to put a product that has no guarantees of operation on a mission-critical application (mission-critical can mean an office suite; if a CEO can't read his email, that is mission critical). You must admit that EVERY piece of software produced will have a problem, whether it is an inherent problem or just a dumb end user who thinks the Garbage Can is just another folder. Now, when a company comes across a problem, they are not going to spend time reading pages and pages of documentation for a solution. They want to be able to talk to someone and make sure the problem gets resolved.
Now look at it this way. If there is a major security flaw in an application that the publisher knows about and does not resolve, the publisher is liable for any damages incurred. The company using the product has at least one way of trying to recoup its losses. Now, (according to you) if they were to use an Open Source product, they cannot sue anyone for damages incurred. Knowing this, when the CIO, CTO, C-etc. are budgeting for software purchases, are they willing to gamble on something that does not come with a warranty? Or will they pony up the extra cash for something that may or may not have a problem, but whose vendor will be liable and available to resolve it? This will lead the people up top to choose the MS or big-company option.
Want another way to look at it? You're going out to buy a computer. One store is offering a 30-day unconditional return policy and 3 years of service. Another store is offering a 7-day return policy and 6 months of service. Which one would you buy from? Would you be willing to pay a little extra for the comfort of knowing you will not be stuck with a problem with your computer?
One thing that we all must remember is this. Open Source is not the panacea for software problems. I have seen a lot of good Open Source programs and I have seen a lot of bad open source programs. What is important is how comfortable your customer is with the solution that you are providing. Can you guarantee to him that you will be able to support it? Are you accountable for the problems that might occur?
Re: (Score:2)