Is It Illegal To Disclose a Web Vulnerability? 198
Scott writes "I'm submitting my own story on an important topic: Is it illegal to discover a vulnerability on a Web site? No one knows yet, but Eric McCarty's pleading guilty to hacking USC's web site was 'terrible and detrimental,' according to tech lawyer Jennifer Granick. She believes the law needs at least to be clarified, and preferably changed to protect those who find flaws in production Web sites — as opposed to those who 'exploit' such flaws. Of course, the owners of sites often don't see the distinction between the two. Regardless of whether or not it's illegal to disclose Web vulnerabilities, it's certainly problematic, and perhaps a fool's errand. After all, have you seen how easy it is to find XSS flaws in Web sites? In fact, the Web is challenging the very definition of 'vulnerability,' and some researchers are scared. As one researcher in the story says: 'I'm intimidated by the possible consequences to my career, bank account, and sanity. I agree with [noted security researcher] H.D. Moore, as far as production websites are concerned: "There is no way to report a vulnerability safely."'"
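For the curious, here's roughly why finding XSS is so easy: you just check whether a site echoes your input back unescaped. A minimal, self-contained sketch (the page strings below are simulated, not a real site; never probe a live site without written permission):

```python
import html

# A classic reflected-XSS check: does the page echo our marker
# string back without HTML-escaping it?
PROBE = '<script>alert("xss-probe")</script>'

def looks_vulnerable(response_body: str) -> bool:
    """True if the probe string comes back unescaped."""
    return PROBE in response_body

# Simulated responses: a naive page echoes input verbatim,
# a safe page escapes it first.
naive_page = f"<p>You searched for: {PROBE}</p>"
safe_page = f"<p>You searched for: {html.escape(PROBE)}</p>"

print(looks_vulnerable(naive_page))  # the naive page reflects the probe
print(looks_vulnerable(safe_page))   # the escaped page does not
```

That's the whole trick, which is why researchers can turn up dozens of these in an afternoon.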
So is it illegal too... (Score:2, Insightful)
paste up a poster in the town square, announcing that the lock is broken on the back of the hardware store?
How is this different?
Re:It ought to be (Score:4, Insightful)
Expecting privacy on a publicly advertised service is different to people using zoom lenses to peer through the fence of your gated community.
Anonymizers? (Score:5, Insightful)
Anonymizer tools have improved since then, especially for combating censorship. Would you be able to use TOR or something similar to report vulnerabilities without exposing your identity?
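For what it's worth, pointing an HTTP client at Tor's local SOCKS proxy is straightforward. A sketch, assuming a Tor daemon running on its default port 9050 and a SOCKS-capable `requests` install (both are assumptions, and the report URL is hypothetical):

```python
# Route traffic through Tor's local SOCKS proxy. "socks5h" (note the
# trailing h) makes DNS resolution happen inside Tor as well, so the
# lookup doesn't leak your identity.
TOR_PROXY = "socks5h://127.0.0.1:9050"

def tor_proxies() -> dict:
    """Proxy mapping suitable for requests' proxies= argument."""
    return {"http": TOR_PROXY, "https": TOR_PROXY}

# Usage (not executed here -- needs a running Tor daemon and
# requests[socks] installed; the URL is a made-up example):
# import requests
# requests.post("https://example.org/security-contact",
#               data={"report": "..."}, proxies=tor_proxies())
```

Whether that's enough to protect you legally is a separate question, but technically it hides the origin of the report.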
Comment removed (Score:3, Insightful)
Re:Test my house for security vulnerabilities (Score:2, Insightful)
Re:Test my house for security vulnerabilities (Score:5, Insightful)
It's more like checking the locks on the backside of a Walmart. Suspicious, but not illegal, and not nearly as unethical.
Heck, you may actually have a legitimate reason to be back there - such as offloading goods from a truck.
The same can be said for security vulnerabilities in websites. You can easily stumble across them when you're not even looking in places that you're supposed to be.
Re:Discover, or try to discover? (Score:2, Insightful)
It is a little different (Score:3, Insightful)
Re:Test my house for security vulnerabilities (Score:5, Insightful)
Re:It ought to be (Score:4, Insightful)
She has drapes for this.
Re:What's the problem? (Score:5, Insightful)
There's two types of people in the world.... (Score:2, Insightful)
and those that proudly proclaim "I am doing this and no-one can stop me. If you think you can arrest me for this, YOU ARE WRONG."
The first kind of people contribute nothing to our freedoms. They are crippled by uncertainty and their annoying whining makes people think that, hey, maybe there is something to fear. The second kind of people challenge the norms and make that which was uncertain clearly not illegal. Hey, if they can get away with it, maybe I can too!
So my advice: stop whining and grow a backbone.
Re:Discover, or try to discover? (Score:4, Insightful)
A real world example would be: if you get caught outside a door trying to pick the lock, and then claim you were only checking that their locks were safe, you might get charged with attempted B&E. You don't get to do a security audit on people's front doors.
I don't buy that analogy. Breaking and entering is a crime. Theft is a crime. Exploiting computer vulnerabilities is a crime. I'm not sure finding computer vulnerabilities is or should be a crime. I could just as easily use the analogy, "looking at the windows of houses to see if they are open or unlocked is not a crime, but climbing through a window is."
I think laws that rely upon somehow knowing the intent of the person performing an act are pretty poor laws. If I go tell you your locks are really old and can be opened with a plastic fork because I noticed it while walking by, and you happen to run a store I do business with and hence have my CC# on file, that sure shouldn't be a crime. If I write a letter to the editor of the newspaper saying the same, it should not be a crime. If I notice on your Web site the same level of e-security, I don't see how it is qualitatively different.
Re:So don't. (Score:2, Insightful)
Making mistakes != being stupid. If someone found a vulnerability in your site wouldn't you want them to let you know about it? On the other hand, if you had already been warned about this vulnerability and done nothing about it then yes, that would be very stupid.
Re:So is it illegal too... (Score:5, Insightful)
I doubt that you'd get in trouble -- and how could you? -- if you submitted the vulnerability, or even publicized it, anonymously. There are lots of ways to do this; Mixmaster comes to mind, and is practically invulnerable to tracing, particularly when your potential adversary isn't expecting an anonymous communication to come in.
If you found a problem, realize that no good is ever going to come to you because of it, and don't expect to ever be rewarded or thanked. Once you've acknowledged those things, there's no reason to attach your name to it, when you let them know.
It's when you try to have your cake and eat it too -- point out someone else's problem while getting rewarded for it -- that the problems really begin.
Look who will argue, write and advocate the law. (Score:3, Insightful)
Each time an exploit comes out, the pattern is the same. The company doesn't announce it; anti-virus makers are either paid off (as with 'approved' spyware and/or rootkits) or not kept informed; and once the story breaks, the public relations machine starts. The researcher is vilified as a hacker, the problem is denied or minimized, and the prospect of a patch is left moot, because issuing one would require admitting that a huge problem exists. Most of us scream that this is ridiculous: companies should tell everyone when an exploit shows up and patch it as soon as possible. More to the point, they should expose their source code to scrutiny in order to better serve their customers.
Are you sitting down? Good. They won't, and they don't care. The first rule in the PR handbook is to deny and delay. If the big front is that there isn't a problem, or that a crack of a voting machine can only be done in a lab, and months down the road the company quietly sues the researcher or releases a patch, they win. People have a limited attention span and fatigue quickly in the face of fear and hysteria. As long as your company's admission of guilt comes well after the original problem, or not at all, people are happy.
With this in mind, let's look at the law. Thankfully, whistleblowers have some protection, and some internal voices about code might not be silenced, especially if the review takes place within the judicial system and not through a new law. Of course, corporate secrecy, as in the case of Apple and HP, is pretty extreme, and most employees wouldn't risk the civil consequences of voicing a problem that doesn't rise to the level of a public safety hazard.
Outside researchers are in more and more trouble, and this really only leads to problems for the customer base as a whole. We rely on sites like MOAB [info-pull.com] to shame companies into action. We also rely on OSS competition in order to make products like IE better--Firefox gives an economic incentive to Microsoft to improve their product, otherwise, security development would have languished.
Very few analogues exist in the places where this is critically important: commercial and banking software. CITIbank [boingboing.net] suffers a class break and doesn't bother informing their customers. Security-conscious customers can voice their discontent and move to another bank, but we have to trust that the new bank is as averse to security breaches as we are. For the rest of the millions of customers, security will not improve. Since identity theft costs are largely borne by the customers, the banks don't care. Because the banks don't care, it is much easier, and better in their eyes, to make publishing vulnerabilities like this one [eweek.com] illegal and trust that their customers will never be the wiser.
Check out this article:
[PDF] Why information security is hard [google.com]
It may not be illegal... (Score:3, Insightful)
But then, it's not your business, either.
Should you discover a security vulnerability, the correct response is to forget it. Here's why:
Naturally, we might feel a sense of duty to help someone out: if they have an exposed security flaw, we naturally want to help them. But first consider how it will be received. Most companies would rather produce software with publicly unknown flaws than produce perfect software, websites, etc. at a much higher cost.
And, if you feel that the website owner would appreciate knowing, you might at least disclose it from an anonymous email address.
Re:Discover, or try to discover? (Score:3, Insightful)
I'm gonna divide that into two halves
If you truly 'walked by' and noticed the windows, and told me about it, that's like notifying the site owner -- it's a nice thing to do, the site/business owner may not immediately act upon it, but they know; and they presumably rely on the fact that it's not widespread information. If you were going house to house trying to open windows, I bet you'd be in a different legal position. If you then went to a known burglar with the information, well, you're no longer just doing something nice and innocent now, are you??
For the second half
Hmmmm
That's very naive -- "I can tell everyone how to break into your house, and I have no consequences" -- just doesn't sit well with me. I would say if you are going around telling people exactly what they need to do to break into my house, you have the happy fun of being an accessory, or a party to a conspiracy to commit a crime. You haven't done some public service.
I realize people figure that white hats should scream really loud so everyone knows the vulnerability, because the black hats wouldn't. But if you tell the black hats how to do it, you no longer get to say you're better than they are. In fact, you're probably worse, because you were the one casing the joint, as it were.
Telling about exploits, especially in open forums where people with less honourable intentions might be, isn't necessarily a noble thing. You don't have an obligation to ensure that everyone in the world knows how to open every unsecured lock.
Cheers
Re:It ought to be (Score:5, Insightful)
Pay the price (Score:1, Insightful)
I did vulnerability research on servers at my university when I was starting out. I went out and got authorization to do so. In most instances they had a test/dev server they permitted me to test on. I published these vulnerabilities publicly in the form of an advisory after contacting the vendors. You do not have the right to decide to do whatever else you want on someone else's website.
Should you be allowed to try to steal stuff from a store just to see if they're vulnerable to being robbed? Can you break into that same store to see if your sledgehammer breaks their glass? What if you were doing all this just to show them it could be done and not to rob/harm them? So what? Your ass is getting arrested. I think this is the same point the posts above have made, and it is 100% valid.
Re:Discover, or try to discover? (Score:4, Insightful)
If you then went to a known burglar with the information, well, you're no longer just doing something nice and innocent now, are you??
Yes, but no one is claiming you should be able to find vulnerabilities and give or sell them to blackhats, merely make them public or inform the site operator without worrying about being sued.
For the second half ... WTF does having, or not having, your credit card # on file have to do with this?? It seems a bit spurious to the conversation at hand, and I'll treat it as such.
No it isn't. If they have your credit card on file (as many e-businesses might) then you have a business relationship with them and a vested interest in their security. It is perfectly legal and sometimes industry practice to hire private investigators to look into the security of current or proposed business partners.
I don't think you've idly done nothing.
You've done something, but nothing illegal.
You've made available to people the means to commit an illegal act. The fact that it was just there for anyone to see (or that you spent three hours trying to find it) doesn't mean you had nothing to do with them getting robbed.
So what if the local bank, where the whole town keeps their money, tends to leave the back door propped open and the safe unlocked? Should it be illegal for me to tell the paper or the paper to write an article letting everyone know they should take their money out? Should you have to be concerned about being sued if you write the bank manager and let him know what is going on?
I realize people figure that white hats should scream really loud so everyone knows the vulnerability, because the black hats wouldn't. But if you tell the black hats how to do it, you no longer get to say you're better than they are. In fact, you're probably worse, because you were the one casing the joint, as it were.
Not at all. Whitehats do not profit from illegal actions and are aiming to improve overall security. Full disclosure is not always the best way to go about improving security, but sometimes it is. Why you think only in terms of full disclosure, however, is a mystery to me. Even the summary specifically mentions people being sued for just telling the Web service provider that the service has vulnerabilities in it.
You don't have an obligation to ensure that everyone in the world knows how to open every unsecured lock.
No, but sometimes telling the public how to open a particular lock is the best way to improve security. If Diebold starts selling a new combination bike lock, and I discover 1.2.3.4 always opens it, and I know at least one gang of thieves is already looking for these locks and stealing bikes via this method... I should 100% have no fear that I will suffer legal repercussions if I tell the support guys at Diebold. If Diebold refuses to acknowledge the problem I should likewise have no fear that my exercising my freedom of expression and telling the local newspaper will result in my being prosecuted for some crime. The same goes for software and services on computers.
Re:So is it illegal too... (Score:3, Insightful)
Re:So tonight... (Score:2, Insightful)
smashing the window means you've actually made the system more vulnerable than it was, which is not the case in this argument.
Re:It may not be illegal... (Score:3, Insightful)
As a website owner, and admin of several sites, yes I do want to know and while no one likes bad news, I would rather hear it from a "good samaritan" than find out after my site was hacked.
Because I would truly appreciate it if others would do the same kind service.
Not everyone can afford an audit firm. Also, there are things that security auditors miss as well. Any security "expert" who tries to tell you they will find every possible edge-case scenario is a liar and not to be trusted any more than the programmer that claims his or her software is 100% bug-free.
Yes, getting hacked is a valuable learning tool, but also an incredibly expensive one.
Do you really think that anonymous tips could ever shut down the digital security industry? This is a straw-man argument and not worth any more time.
Okay, so doing nothing means that you won't get into trouble. And yes, if a site has vulnerabilities that are not remedied you are probably right to take your business elsewhere. But I see this as akin to driving past a burning building and not calling the fire department. "Let it burn, it's not my problem." Did you stop to think about all the users of the site who don't know about the security issues? Perhaps your dear aunt Ethel whose entire stock portfolio is about to be stolen by the hackers who come after you and discover the same flaw.
Re:So is it illegal too... (Score:5, Insightful)
If the hardware store gets broken into, it mainly affects the owner(s) of the store and the people who work there, and not many other people. If a site like Yahoo (the mail aspect of it), a banking site, or PayPal is broken into and exploited, then it affects every single person who uses the site in a very negative way.
I don't think publicly announcing a vulnerability in a specific public service or facility is very responsible. At the same time, many businesses don't do anything to fix the problem if only one person tells them about it. The public releases we commonly see are sometimes necessary, because without the pressure of the public eye the business won't correct the problems in its service.
I've done things similar to this on a few occasions. I found a vulnerability in Surgemail, an all-in-one mail server for Linux, which allowed any remote user to read any mail sent to the root account, and to send mail as root. I emailed them about it several times and received no reply for over six months. I finally released the info on it, and they fixed it two weeks later. I did something similar with an online service schools in my area offer, which allowed anyone to see the grades and personal info (SS#, home address, etc.) of students in the school through a SQL injection. I contacted several schools about the issue, as well as the company they had contracted to write the software for them. It's been 2 years and they still haven't fixed it.
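For anyone unfamiliar with why a grade lookup like that is exploitable, here's a self-contained sketch of the classic injection and its parameterized fix. The table and names are a toy SQLite example I made up, not the school's actual system:

```python
import sqlite3

# Toy database standing in for the student-records system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, grade TEXT)")
conn.execute("INSERT INTO students VALUES ('alice', 'A'), ('bob', 'C')")

def lookup_unsafe(name):
    # String concatenation: a payload like  x' OR '1'='1
    # turns the WHERE clause into a tautology and dumps every row.
    sql = f"SELECT name, grade FROM students WHERE name = '{name}'"
    return conn.execute(sql).fetchall()

def lookup_safe(name):
    # Parameterized query: the payload is treated as data, not SQL.
    return conn.execute(
        "SELECT name, grade FROM students WHERE name = ?", (name,)
    ).fetchall()

payload = "x' OR '1'='1"
print(lookup_unsafe(payload))  # every student's record leaks
print(lookup_safe(payload))    # no student is literally named that
```

The fix is a one-line change per query, which is what makes a two-year-old unpatched injection so inexcusable.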
Re:Discover, or try to discover? (Score:3, Insightful)
Clicking the link took me to a page that had links to pdf reports, etc. Clicking on one of those took me to a standard apache index page with a list of the contents of the directory.
After clicking around in there, I discovered the source files for a multi-thousand-dollar (close to $10,000) ColdFusion enterprise CMS. Opening one of those files revealed usernames and passwords (in plain text, mind you) for many thousands of NASA employees, scientists, politicians, etc. that had accounts on the CMS.
Another file contained the software license and key to run said CMS software in its most expensive form, the Enterprise Ultra edition with unlimited domains and users.
I sent an email to the server administrator that was listed as being the registered user of the CMS stating that their code, license, and database were out in the open and only *one click* away from a google search. The query I used was basic, simply something like "weather Data" although I can't remember the exact term now. No "Google hacking" involved, and google only returned 4 results. Theirs being #1.
I never received a reply from NASA, and after about 6 months, the page was not fixed, but the CMS and database backups were finally removed.
Sometimes, even disclosing a problem to a very public website doesn't generate a response.
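Incidentally, plain-text password files like the one described are never necessary. A minimal salted-hash sketch using only Python's standard library (real systems should prefer a dedicated scheme like bcrypt/scrypt/argon2; this just shows the idea):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest); a fresh random salt defeats rainbow tables."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# Store only (salt, digest); the password itself is never written down.
salt, stored = hash_password("hunter2")
print(verify("hunter2", salt, stored))  # correct password verifies
print(verify("wrong", salt, stored))    # wrong password fails
```

With a scheme like this, even a fully leaked database doesn't hand attackers working credentials.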
Law - 1, Greater Good - 0 (Score:2, Insightful)
I think we should establish stricter minimum guidelines for information security and hold those we choose to share our personal information with to them. Anyone in IT in the medical industry knows about HIPAA... usually with a groan. HIPAA can levy fines, shut down operations, etc... if you're not taking "reasonable and appropriate measures" in safeguarding sensitive data. Why should it be any different with other, equally personal data?
I understand the argument that "I wouldn't want someone picking my lock and then telling me that my lock was susceptible to being picked," though I think the metaphor is stretched a little thin. The reality is that flawed code will be exploited eventually, especially on higher-profile sites. I think the goal should be to foster an environment where responsible disclosure procedures are available, and to allow increased legal pressure on those who do not demonstrate adherence to established guidelines for information storage (see above).
Entities which store your data (companies, schools, etc.) will not become more responsible. There's no incentive for them to. It's more financially sound for them to respond under the current laws (mostly they're only required to send notifications; rarely will you be compensated anywhere near what you will lose) than to fix the underlying security problems.
Another problem is that McCarty was prosecuted under new provisions in the Patriot Act which change how computer crimes can be prosecuted. It used to be that the government had to prove both unauthorized access and malicious intent. The malicious-intent clause was dropped from the requirements. As such, if you come forward, provide information about how the breach occurred, and work with the site owners to resolve the issue before serious data loss can happen, you are criminally liable. This would be the perfect law if we could ensure it would be applied equally and fairly. Unfortunately, many crimes cannot be prosecuted in this manner, either because of geographic differences or lack of evidence (real hackers alter logs). As such it really only stands to prosecute those who aren't legitimate threats, and it gives the government some big news stories to try and lend credibility to the Patriot Act and the erosion of civil rights.
leaving keys in the car is illegal too (Score:2, Insightful)
I think some of these legal actions are driven by the fact that the webmaster is an idiot and is embarrassed, not to mention that all that crap he fed his boss about the website being bulletproof is just a bunch of BS.