Is It Illegal To Disclose a Web Vulnerability?

Scott writes "I'm submitting my own story on an important topic: Is it illegal to discover a vulnerability on a Web site? No one knows yet, but Eric McCarty's pleading guilty to hacking USC's web site was 'terrible and detrimental,' according to tech lawyer Jennifer Granick. She believes the law needs at least to be clarified, and preferably changed to protect those who find flaws in production Web sites — as opposed to those who 'exploit' such flaws. Of course, the owners of sites often don't see the distinction between the two. Regardless of whether or not it's illegal to disclose Web vulnerabilities, it's certainly problematic, and perhaps a fool's errand. After all, have you seen how easy it is to find XSS flaws in Web sites? In fact, the Web is challenging the very definition of 'vulnerability,' and some researchers are scared. As one researcher in the story says: 'I'm intimidated by the possible consequences to my career, bank account, and sanity. I agree with [noted security researcher] H.D. Moore, as far as production websites are concerned: "There is no way to report a vulnerability safely."'"
This discussion has been archived. No new comments can be posted.

  • by DanQuixote ( 945427 ) on Tuesday January 16, 2007 @05:03PM (#17635400)

    Would you paste up a poster in the town square, announcing that the lock is broken on the back of the hardware store?

    How is this different?

  • Re:It ought to be (Score:4, Insightful)

    by LiquidCoooled ( 634315 ) on Tuesday January 16, 2007 @05:05PM (#17635438) Homepage Journal
    It depends on whether your daughter's bedroom is in a shopfront on Rodeo Drive (or wherever).

    Expecting privacy on a publicly advertised service is different to people using zoom lenses to peer through the fence of your gated community.
  • Anonymizers? (Score:5, Insightful)

    by tfinniga ( 555989 ) on Tuesday January 16, 2007 @05:07PM (#17635480)
    So, this might not be relevant, but I once reported a cross-site scripting flaw to a website by using a web anonymizer to create a Hotmail account, sending exactly one message, and then never using the email account again.

    Anonymizer tools have improved since then, especially for combating censorship. Would you be able to use Tor or something similar to report vulnerabilities without exposing your identity?
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Tuesday January 16, 2007 @05:11PM (#17635566)
    Comment removed based on user account deletion
  • by Anonymous Coward on Tuesday January 16, 2007 @05:13PM (#17635584)
    Colorful analogy, but most vulnerabilities are not specific to one person's machine. Would you go "kick someone's ass" for finding a flaw in their own house's security that just happened to affect you too?
  • Not really a good comparison since your house is private and websites are essentially open to all comers.

    It's more like checking the locks on the backside of a Walmart. Suspicious, but not illegal, and not nearly as unethical.

    Heck, you may actually have a legitimate reason to be back there - such as offloading goods from a truck.

    The same can be said for security vulnerabilities in websites. You can easily stumble across them when you're not even looking in places that you're supposed to be.
  • by haddieman ( 1033476 ) on Tuesday January 16, 2007 @05:18PM (#17635702)
    I would have to agree with you on this. The problem is that, with the internet, it is a lot easier for people to do this without "feeling" like they are doing anything wrong. Sure, most people aren't going to risk being caught picking the lock on someone's back door, but when you are sitting in your room at your computer it is much easier to feel that you either won't get caught or that people will appreciate your "helpfulness." In real life, though, people will still feel their privacy has been violated, regardless of whether your intentions were good.
  • by gelfling ( 6534 ) on Tuesday January 16, 2007 @05:19PM (#17635726) Homepage Journal
    It's more like advertising that a given brand and implementation of a lock is faulty. It may or may not impinge on you, but in either case it's general enough to benefit people besides you. Would you like to know that every car of the model you own accidentally uses the same key? I would.
  • by russ1337 ( 938915 ) on Tuesday January 16, 2007 @05:20PM (#17635752)
    Would you say anything if you were in an airport and noticed a door unlocked and ajar leading from the public area to the tarmac around the aircraft?

  • Re:It ought to be (Score:4, Insightful)

    by jimlintott ( 317783 ) on Tuesday January 16, 2007 @05:27PM (#17635892) Homepage
    It would be perfectly legal to stand on the street and stare at my naked daughter through her bedroom window.

    She has drapes for this.
  • by fractalus ( 322043 ) on Tuesday January 16, 2007 @05:35PM (#17636044) Homepage
    Simple: sometimes such information gets lost, or doesn't get acted on, and the bug persists. That bug could be exposing thousands (or hundreds of thousands) of users of that site to risks they're not aware of. If one person found it, another surely can, so it's reasonable to assume that someone other than the site owner could know about the bug and be exploiting it for personal gain. At that point, being aware of the bug but not informing the users is allowing them to be exposed to unnecessary risk. Businesses are often reluctant or slow to fix problems because they assume nobody knows about them, or because they're costly to fix (just as auto companies hate having to recall cars to fix problems). Sometimes, the only way to get the problem fixed is to announce it publicly and give the company a bit of a black eye.
  • by QuantumG ( 50515 ) * <qg@biodome.org> on Tuesday January 16, 2007 @05:38PM (#17636094) Homepage Journal
    There are two kinds of people: those that ask *best whiny voice* "Is it ok if I do this? Will I get arrested? Is it illegal to do this?"
    and those that proudly proclaim "I am doing this and no-one can stop me. If you think you can arrest me for this, YOU ARE WRONG."

    The first kind of people contribute nothing to our freedoms. They are crippled by uncertainty and their annoying whining makes people think that, hey, maybe there is something to fear. The second kind of people challenge the norms and make that which was uncertain clearly not illegal. Hey, if they can get away with it, maybe I can too!

    So my advice: stop whining and grow a backbone.
  • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday January 16, 2007 @05:39PM (#17636112)

    A real-world example would be: if you get caught outside a door trying to pick the lock, and then claim you were trying to ensure their locks were safe, you might get charged with attempted B&E. You don't get to do a security audit on people's front doors.

    I don't buy that analogy. Breaking and entering is a crime. Theft is a crime. Exploiting computer vulnerabilities is a crime. I'm not sure finding computer vulnerabilities is or should be a crime. I could just as easily use the analogy, "looking at the windows of houses to see if they are open or unlocked is not a crime, but climbing through a window is."

    I think laws that rely upon somehow knowing the intent of the person performing an act are pretty poor laws. If I go tell you your locks are really old and can be opened with a plastic fork because I noticed it while walking by, and you happen to run a store I do business with and hence have my CC# on file, that sure shouldn't be a crime. If I write a letter to the editor of the newspaper saying the same, it should not be a crime. If I notice on your Web site the same level of e-security, I don't see how it is qualitatively different.

  • Re:So don't. (Score:2, Insightful)

    by haddieman ( 1033476 ) on Tuesday January 16, 2007 @05:41PM (#17636134)

    Why are we supposed to help the stupid? Let them continue doing stupid things until they get pwnt and it costs them their business.

    Making mistakes != being stupid. If someone found a vulnerability in your site wouldn't you want them to let you know about it? On the other hand, if you had already been warned about this vulnerability and done nothing about it then yes, that would be very stupid.

  • It's not; what gets people in trouble is when they try to take credit for a vulnerability they've found in a production website.

    I doubt that you'd get in trouble -- and how could you? -- if you submitted the vulnerability, or even publicized it, anonymously. There are lots of ways to do this; Mixmaster comes to mind, and is practically invulnerable to tracing, particularly when your potential adversary isn't expecting an anonymous communication to come in.

    If you've found a problem, realize that no good is ever going to come to you from it, and don't expect to ever be rewarded or thanked. Once you've accepted those things, there's no reason to attach your name to it when you let them know.

    It's when you try to have your cake and eat it too -- point out someone else's problem while getting rewarded for it -- that the problems really begin.
  • by Protonk ( 599901 ) on Tuesday January 16, 2007 @05:43PM (#17636176) Homepage
    this is an issue that simply must not be decided by the people to whom it has been entrusted. In this case, the vested interests that will lobby Congress, pay for legal teams, and write friend-of-the-court briefs are not the whistleblowers and the security researchers. There are HUGE industries where the economic incentive is to ignore problems, rely on obscurity for security, and prosecute those who would expose vulnerabilities.

    Each time an exploit comes out, the pattern is the same: the company doesn't announce it; anti-virus makers are either paid off (as with "approved" spyware and/or rootkits) or not kept informed; and once the story breaks, the public relations machine starts. The researcher is vilified as a hacker, the problem is denied or minimized, and the prospect of a patch is left moot because a patch would require admitting that a huge problem exists. Most of us scream that this is ridiculous: companies should tell everyone when an exploit shows up, and patch it as soon as possible. More to the point, they should expose their source code to scrutiny in order to better serve their customers.

    Are you sitting down? Good. They won't, and they don't care. The first rule in the PR handbook is to deny and put off realization. If the big front is that there isn't a problem, or that a crack of a voting machine can only be done in a lab, and months down the road the company quietly sues the researcher or releases a patch, they win. People have a limited attention span and fatigue quickly in the face of fear and hysteria. As long as your company's admission of guilt comes well after the original problem, or not at all, people are happy.

    With this in mind, let's look at the law. Thankfully, whistleblowers have some protection, and some internal voices about code might not be silenced, especially if the review takes place within the judicial system and not through a new law. Of course, corporate secrecy, as in the cases of Apple and HP, is pretty extreme, and most employees wouldn't risk the civil consequences of voicing a problem that doesn't rise to the level of a public safety hazard.

    Outside researchers are in more and more trouble, and this really only leads to problems for the customer base as a whole. We rely on sites like MOAB [info-pull.com] to shame companies into action. We also rely on OSS competition to make products like IE better--Firefox gives Microsoft an economic incentive to improve its product; otherwise, security development would have languished.

    Very few analogues exist in the places where this is critically important: commercial and banking software. CITIbank [boingboing.net] suffers a class break and doesn't bother informing its customers. Security-conscious customers can voice their discontent and move to another bank, but we have to trust that the new bank is as averse to security breaches as we are. For the rest of the millions of customers, security will not improve. Since identity theft costs are largely borne by the customers, the banks don't care. Because the banks don't care, it is much easier, and better in their eyes, to make publishing vulnerabilities like this one [eweek.com] illegal and trust that their customers will never be the wiser.

    Check out this article:
    [PDF] Why information security is hard [google.com]

  • by gillbates ( 106458 ) on Tuesday January 16, 2007 @05:54PM (#17636400) Homepage Journal

    But then, it's not your business, either.

    Should you discover a security vulnerability, the correct response is to forget it. Here's why:

    • No one likes the bearer of bad news - not the website owner, not the vendor who sold the software, not the consultant who coded the website. They have lawyers; their interest is in making money, not necessarily in creating secure software. Keep this in mind. If they can find a cause for libel, they will. If they can deflect blame (stupid hackers are at it again!), they will.
    • Why would you expose yourself to potential legal problems, especially considering that you aren't getting paid for your efforts?
    • If they were truly concerned about security, they would have hired an audit firm.
    • Getting hacked is perhaps the best teaching experience regarding security. Let another hacker expose their vulnerability in a way they can't deny. Then they will take security seriously.
    • Do the security industry a favor: why would anyone hire a security specialist when good samaritans on the internet (aka whitehats) will audit their website for free? Don't undermine your fellow workers.
    • No one has ever been brought to trial or sued for failure to disclose a security vulnerability. You have nothing to lose by quietly taking your business elsewhere; let the company figure out that the public wants secure web sites.

    Naturally, we might feel a sense of duty to help someone with an exposed security flaw. But first consider how it will be received. Most companies would rather ship software with publicly unknown flaws than produce perfect software, websites, etc. at a much higher cost.

    And, if you feel that the website owner would appreciate knowing, you might at least disclose it from an anonymous email address.

  • by gstoddart ( 321705 ) on Tuesday January 16, 2007 @05:55PM (#17636410) Homepage
    I think laws that rely upon somehow knowing the intent of the person performing an act are pretty poor laws. If I go tell you your locks are really old and can be opened with a plastic fork because I noticed it while walking by, and you happen to run a store I do business with and hence have my CC# on file, that sure shouldn't be a crime.

    I'm gonna divide that into two halves ... the one that makes sense, and the other.

    If you truly 'walked by' and noticed the windows, and told me about it, that's like notifying the site owner -- it's a nice thing to do; the site/business owner may not immediately act upon it, but they know, and they presumably rely on the fact that it's not widespread information. If you were going house to house trying to open windows, I bet you'd be in a different legal position. And if you then went to a known burglar with the information, well, you're no longer just doing something nice and innocent, are you?

    For the second half ... WTF does having, or not having, your credit card # on file have to do with this? It seems a bit spurious to the conversation at hand, and I'll treat it as such. :-P

    If I write a letter to the editor of the newspaper saying the same, it should not be a crime. If I notice on your Web site the same level of e-security, I don't see how it is qualitatively different.

    Hmmmm .... you 'discover' (either by playing around or by quickly deducing) a vulnerability. You write a letter to the editor saying that someone's windows are faulty, or that they hide their spare key under the plant on the porch, or that the combination to their security system is 1234 .... I don't think you've idly done nothing. You've made available to people the means to commit an illegal act. The fact that it was just there for anyone to see (or that you spent three hours finding it) doesn't mean you had nothing to do with them getting robbed.

    That's very naive -- "I can tell everyone how to break into your house, and I face no consequences" just doesn't sit well with me. I would say if you are going around telling people exactly what they need to do to break into my house, you get the happy fun of being an accessory, or a party to a conspiracy to commit a crime. You haven't done some public service.

    I realize people figure that white hats should scream really loud so everyone knows the vulnerability, because the black hats wouldn't. But if you tell the black hats how to do it, you no longer get to say you're better than they are. In fact, you're probably worse, because you were the one casing the joint, as it were.

    Telling about exploits, especially in open forums where people with less honourable intentions might be, isn't necessarily a noble thing. You don't have an obligation to ensure that everyone in the world knows how to open every unsecured lock.

    Cheers
  • Re:It ought to be (Score:5, Insightful)

    by rootofevil ( 188401 ) on Tuesday January 16, 2007 @06:05PM (#17636602) Homepage Journal
    In most states it would be illegal for her to stand in view of someone on the street naked. What does that say about website vulnerabilities?
  • Pay the price (Score:1, Insightful)

    by madsheep ( 984404 ) on Tuesday January 16, 2007 @06:40PM (#17637248) Homepage
    As someone who researches vulnerabilities and does IT security for a living, I do not find this too hard an issue to deal with. If you are poking around someone else's website looking for a vulnerability, flaw, or bug, then you should be prepared to deal with the consequences. It is your choice whether or not to start testing for the various things that could lead to a SQL injection, XSS issue, directory traversal, authentication bypass, file inclusion, or whatever the vulnerability or issue might be. If the site happens to be running some free or commercially available software, guess what you can do? Get a copy of it yourself and test it. Alternatively, guess what else you can do? GET PERMISSION. If you aren't authorized to start snooping, then you deserve to be punished, embarrassed, prosecuted, and smacked down.

    I did vulnerability research on servers at my university when I was starting out. I went out and got authorization to do so. In most instances they had a test/dev server they permitted me to test on. I published these vulnerabilities publicly in the form of an advisory after contacting the vendors. You do not have the right to do whatever else you want on someone else's website.

    Should you be allowed to try to steal stuff from a store just to see if they're vulnerable to being robbed? Can you break into that same store to see if your sledgehammer breaks their glass? What if you were doing all this just to show them it could be done, not to rob or harm them? So what? Your ass is getting arrested. I think this is the same point posts above have made, and it is 100% valid.
  • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday January 16, 2007 @06:43PM (#17637302)

    If you then went to a known burglar with the information, well, you're no longer just doing something nice and innocent, are you?

    Yes, but no one is claiming you should be able to find vulnerabilities and give or sell them to blackhats, merely make them public or inform the site operator without worrying about being sued.

    For the second half ... WTF does having, or not having, your credit card # on file have to do with this? It seems a bit spurious to the conversation at hand, and I'll treat it as such.

    No it isn't. If they have your credit card on file (as many e-businesses might) then you have a business relationship with them and a vested interest in their security. It is perfectly legal and sometimes industry practice to hire private investigators to look into the security of current or proposed business partners.

    I don't think you've idly done nothing.

    You've done something, but nothing illegal.

    You've made available to people the means to commit an illegal act. The fact that it was just there for anyone to see (or that you spent three hours finding it) doesn't mean you had nothing to do with them getting robbed.

    So what if the local bank, where the whole town keeps their money, tends to leave the back door propped open and the safe unlocked? Should it be illegal for me to tell the paper or the paper to write an article letting everyone know they should take their money out? Should you have to be concerned about being sued if you write the bank manager and let him know what is going on?

    I realize people figure that white hats should scream really loud so everyone knows the vulnerability, because the black hats wouldn't. But if you tell the black hats how to do it, you no longer get to say you're better than they are. In fact, you're probably worse, because you were the one casing the joint, as it were.

    Not at all. Whitehats do not profit from illegal actions and are aiming to improve overall security. Full disclosure is not always the best way to go about improving security, but sometimes it is. Why you think only in terms of full disclosure, however, is a mystery to me. Even the summary specifically mentions people being sued for just telling the Web service provider that the service has vulnerabilities in it.

    You don't have an obligation to ensure that everyone in the world knows how to open every unsecured lock.

    No, but sometimes telling the public how to open a particular lock is the best way to improve security. If Diebold starts selling a new combination bike lock, and I discover 1.2.3.4 always opens it, and I know at least one gang of thieves is already looking for these locks and stealing bikes via this method... I should 100% have no fear that I will suffer legal repercussions if I tell the support guys at Diebold. If Diebold refuses to acknowledge the problem I should likewise have no fear that my exercising my freedom of expression and telling the local newspaper will result in my being prosecuted for some crime. The same goes for software and services on computers.

  • by kalirion ( 728907 ) on Tuesday January 16, 2007 @06:49PM (#17637414)
    What if you want to let the store owners know that the lock is broken? When they ask "how do you know?" you reply "Well, I touched the lock, and it fell apart." So they turn you in for vandalism and breaking and entering.
  • Re:So tonight... (Score:2, Insightful)

    by sameeer ( 946332 ) on Tuesday January 16, 2007 @07:06PM (#17637700) Homepage
    There is a difference between smashing the window and being smart enough to observe that he's left his window open, then leaving a post-it (not visible to the public) saying the window is open and should be closed.

    Smashing the window means you've actually made the system more vulnerable than it was, which is not the case in this argument.
  • by Evardsson ( 959228 ) on Tuesday January 16, 2007 @07:07PM (#17637720) Homepage
    Hmmm, to answer point by point:
    • No one likes the bearer of bad news - not the website owner, not the vendor who sold the software, not the consultant who coded the website. They have lawyers; their interest is in making money, not necessarily in creating secure software. Keep this in mind. If they can find a cause for libel, they will. If they can deflect blame (stupid hackers are at it again!), they will.
      As a website owner, and admin of several sites, yes, I do want to know; and while no one likes bad news, I would rather hear it from a "good samaritan" than find out after my site was hacked.
    • Why would you expose yourself to potential legal problems, especially considering that you aren't getting paid for your efforts?
      Because I would truly appreciate it if others would do the same kind service.
    • If they were truly concerned about security, they would have hired an audit firm.
      Not everyone can afford an audit firm. Also, there are things that security auditors miss as well. Any security "expert" who tries to tell you they will find every possible edge-case scenario is a liar and not to be trusted any more than the programmer that claims his or her software is 100% bug-free.
    • Getting hacked is perhaps the best teaching experience regarding security. Let another hacker expose their vulnerability in a way they can't deny. Then they will take security seriously.
      Yes, getting hacked is a valuable learning tool, but also an incredibly expensive one.
    • Do the security industry a favor: why would anyone hire a security specialist when good samaritans on the internet (aka whitehats) will audit their website for free? Don't undermine your fellow workers.
      Do you really think that anonymous tips could ever shut down the digital security industry? This is a straw-man argument and not worth any more time.
    • No one has ever been brought to trial or sued for failure to disclose a security vulnerability. You have nothing to lose by quietly taking your business elsewhere; let the company figure out that the public wants secure web sites.
      Okay, so doing nothing means that you won't get into trouble. And yes, if a site has vulnerabilities that are not remedied you are probably right to take your business elsewhere. But I see this as akin to driving past a burning building and not calling the fire department. "Let it burn, it's not my problem." Did you stop to think about all the users of the site who don't know about the security issues? Perhaps your dear aunt Ethel whose entire stock portfolio is about to be stolen by the hackers who come after you and discover the same flaw.
    In the end it comes down to "What is the right thing to do?" If you really don't care then it's a non-issue, but if you do care about trying to make the net a better place, an anonymous tip is at least the decent thing to do, until someone figures out how to produce perfect software and websites.
  • by Lesrahpem ( 687242 ) <jason DOT thistl ... AT gmail DOT com> on Tuesday January 16, 2007 @07:52PM (#17638418)
    I see a big difference.

    If the hardware store gets broken into, it mainly affects the owner(s) of the store and the people who work there, and not many other people. If a site like Yahoo (the mail aspect of it), a banking site, or PayPal is broken into and exploited, then it affects every single person who uses the site in a very negative way.

    I don't think publicly announcing a vulnerability in a specific public service or facility is very responsible. At the same time, many businesses don't do anything to fix the problem if only one person tells them about it. The public releases we commonly see are sometimes necessary, because without the pressure of the public eye the business won't correct the problems in its service.

    I've done things similar to this on a few occasions. I found a vulnerability in Surgemail, an all-in-one mail server software for Linux, which allowed any remote user to read any mail to the root account, and to send mail as root. I emailed them about it several times and received no reply for over six months. I finally released the info on it, and they fixed it two weeks later. I did something similar with an online service schools in my area offer which allows anyone to see the grades and personal info (SS#, home address, etc) of students in the school through a SQL injection. I contacted several schools about the issue as well as the company they had contracted to write the software for them. It's been 2 years and they still haven't fixed it.
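    For anyone wondering what a SQL injection like the one in that grades system actually looks like, here's a tiny self-contained sketch. The table, columns, and data are all invented for illustration; nothing here is from the actual school software:

    ```python
    import sqlite3

    # Toy stand-in for a student-records backend (names/columns invented).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE students (name TEXT, ssn TEXT)")
    conn.executemany("INSERT INTO students VALUES (?, ?)",
                     [("alice", "123-45-6789"), ("bob", "987-65-4321")])

    def lookup_vulnerable(name):
        # Injectable: user input is pasted straight into the SQL string.
        query = f"SELECT ssn FROM students WHERE name = '{name}'"
        return conn.execute(query).fetchall()

    def lookup_safe(name):
        # Parameterized: the driver treats the input as data, not SQL.
        return conn.execute(
            "SELECT ssn FROM students WHERE name = ?", (name,)).fetchall()

    # A crafted "name" turns the WHERE clause into a tautology and
    # dumps every row -- exactly the kind of flaw that exposes SS#s.
    payload = "' OR '1'='1"
    print(lookup_vulnerable(payload))  # every student's SSN leaks
    print(lookup_safe(payload))        # [] -- treated as a literal name
    ```

    The fix is the second form: no amount of input "cleaning" is as reliable as letting the database driver do the quoting for you.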
  • by Pikoro ( 844299 ) <init.init@sh> on Tuesday January 16, 2007 @08:32PM (#17638924) Homepage Journal
    I ran across something like this once. I was doing a Google search for some data and caught a link to a NASA website.

    Clicking the link took me to a page that had links to pdf reports, etc. Clicking on one of those took me to a standard apache index page with a list of the contents of the directory.

    After clicking around in there, I discovered the source files for a multi-thousand-dollar (close to $10,000) ColdFusion enterprise CMS. Clicking on one of the .cfm files revealed the source; the code was not being executed. The web server was very obviously misconfigured. After looking around some more, I found a database backup directory with db dumps for the CMS dating back a couple of years.

    Opening one of those files revealed usernames and passwords (in plain text, mind you) for many thousands of NASA employees, scientists, politicians, etc. who had accounts on the CMS.

    Another file contained the software license and key to run said CMS software in its most expensive form, the Enterprise Ultra edition with unlimited domains and users.

    I sent an email to the server administrator listed as the registered user of the CMS, stating that their code, license, and database were out in the open and only *one click* away from a Google search. The query I used was basic, simply something like "weather Data", although I can't remember the exact term now. No "Google hacking" involved, and Google returned only 4 results, theirs being #1.

    I never received a reply from NASA, and after about 6 months, the page was not fixed, but the CMS and database backups were finally removed.

    Sometimes, even disclosing a problem to a very public website doesn't generate a response.
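    For what it's worth, listings like the one I stumbled into are just Apache's default behavior when a directory has no index file; turning them off is a one-line change. This is only a sketch -- the path is hypothetical, and your distro's config layout may differ:

    ```apache
    # httpd.conf (or a vhost / .htaccess): stop Apache from auto-generating
    # directory listings when no index file is present.
    <Directory "/var/www/html">
        Options -Indexes
    </Directory>
    ```

    That wouldn't have protected the .cfm source (that needs the ColdFusion handler configured correctly), but it would have kept the db dumps from being one click away.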
  • by KhaymanUCSD ( 801306 ) on Tuesday January 16, 2007 @09:42PM (#17639834)
    Knowing Eric McCarty personally I have some level of insight into this case other than what's put out in the news media. For what it's worth here is my $.02.

    I think we should establish stricter minimum guidelines for information security and hold those we choose to share our personal information with to them. Anyone in IT in the medical industry knows about HIPAA... usually with a groan. HIPAA can levy fines, shut down operations, etc... if you're not taking "reasonable and appropriate measures" in safeguarding sensitive data. Why should it be any different with other, equally personal data?

    I understand the argument that "I wouldn't want someone picking my lock and then telling me that my lock was susceptible to being picked," though I think the metaphor is stretched a little thin. The reality is that flawed code will be exploited eventually, especially on higher-profile sites. I think the goal should be to foster an environment where responsible disclosure procedures are available, and to allow increased legal pressure on those who do not demonstrate adherence to established guidelines for information storage (see above).

    Entities which store your data (companies, schools, etc.) will not become more responsible on their own. There's no incentive for them to. It's more financially sound for them to respond under the current laws (mostly they're only required to do notifications; rarely will you be compensated in any amount near what you will lose) than to fix the underlying security problems.

    Another problem is that McCarty was prosecuted under new provisions in the Patriot Act which change how computer crimes can be prosecuted. It used to be that the government had to prove both unauthorized access and malicious intent. The malicious-intent clause was dropped from the requirements. As such, if you come forward, provide information about how the breach occurred, and work with the site owners to resolve the issue before serious data loss can happen, you are criminally liable. This would be the perfect law if we could ensure it would be applied equally and fairly. Unfortunately many crimes cannot be prosecuted in this manner, either because of geographic differences or lack of evidence (real hackers alter logs). As such it really only stands to prosecute those who aren't legitimate threats, and gives the government some big news stories to try to lend credibility to the Patriot Act and the erosion of civil rights.
  • by fred133 ( 449698 ) on Wednesday January 17, 2007 @02:32AM (#17642614) Homepage Journal
    Prosecution of people reporting vulnerabilities on sites should be predicated on the fact that the webmaster knows what he/she is doing.
    I think some of these legal actions are driven by the fact that the webmaster is an idiot and is embarrassed, not to mention that all that crap he fed his boss about the website being bulletproof is just a bunch of BS.
