Ethical Lines of the Gray Hat
Facter writes "There is a great article on CNET about the ethical debate between white/gray/black-hat hackers - interesting to note is that it reports the "fading away" of the "gray" definition between white and black, due to the DMCA hindering anything in between."
Do we really need a hat? (Score:3, Insightful)
Re:Do we really need a hat? (Score:3, Insightful)
Re:Do we really need a hat? (Score:2, Funny)
I prefer my red hat... I don't have to shave, all the white hats respect and revere me, and I can get it on with Smurfette.
Re:Do we really need a hat? (Score:5, Insightful)
The 'bull' is that there is no longer a 'gray hat' hacker. The elimination of the 'gray areas' is a legality, and a stupid one at that. It is not a reality. Hackers will still walk the line, and the things they do will still be thought of as "good", "bad", or "fuzzy line down the middle". The only difference is that the DMCA has moved the line of acceptable actions so far over that people can be White Hat hackers and still end up being persecuted under the DMCA for doing something that even the majority of the population would consider "GOOD" as opposed to bad.
This doesn't mean that the hackers are "black hat", and it's stupid to imply so.
-Sara
Re:Do we really need a hat? (Score:2, Insightful)
The DMCA is criminalizing the White-ish hat, meaning that if you are not 100% pure cotton white hat then you must be, by law, a rotten, credit-card thieving, hard-drive reformatting, website-defacing, hardcore-porn-trading, no-good, evil, and overall bad person.
Of course, it's equivalent to saying that people that drive over the speed limit are killers.
Just because you bend a little stupid and useless law does not make you a hard-core, purse-snatching, Nigerian-money-laundering uberhaxor whose handle rhymes with Phuckiaul.
I say: Hacking is good: It's called creativity, perseverance, and curiosity. Take these things away from society and people become sullen, unimaginative, short-attention-spanned. Which, come to think of it, is exactly what the entertainment industry wants people to be like.
[voice of irate teacher in Pink Floyd's The Wall]
"You will sit on the couch and watch our programming! Any demonstration of self-awareness will be punished! How can you become a couch potato if you don't eat your meat?" Da-dum-dum da-dummmm.
Re:Do we really need a hat? (Score:2, Insightful)
A more appropriate analogy would be "It is illegal to research into and document the progress of a disease", or "It is illegal to test the security of the locks that the locksmith installs on your door."
Even 100% cotton white hats check the security of things, and attempt to make sure that they work on their systems--under the DMCA this could be considered attempts at hacking, and thus illegal.
If the DMCA just made it possible to crack down on "law benders" or "law breakers", I'd be unhappy about the law-bending category, but hey, they're laws. However, the DMCA outlaws things that it should not touch. Things that are beneficial for society, things that keep technology moving forward, and that keep the country's data safe. Gray hat hackers are *NECESSARY*, if only because black hat hackers exist, and at least gray hats are less malignant.
In a lot of ways, the DMCA is equivalent to the US Gov't outlawing a cure for AIDS because it caused people to have a cold for a week.
It's over-reaching, and goes beyond being restrictive--straight into the field of being suffocating and damaging.
-Sara
Re:Blacklist the DMCA (Score:2)
And then there's the ultimate "black-hat" attack - the DDOS. Requires little or no skill, just the ability to use some scripts off the web. Doesn't teach you anything. Just fucks up everything for the ppl attacked and for anyone trying to use their systems, without any gains for anyone except the immature little wanker sat giggling in his bedroom.
I'm 100% anti-DMCA for its restrictions on reverse-engineering. But I'm 100% *for* fucking over the script kiddies.
The "gray hat" thing is harder. RFPolicy is a good start towards this - get a standard code of conduct and everyone knows where they are. If you're genuinely not interested in hurting the affected people, give them a chance to respond and fix it, and then take the kudos. Hell, anyone who's ever worked in software knows that you never find all the bugs - even NASA can't manage that, for all its budget and procedures! - so this in-depth testing helps everyone. And this also provides a stick with the carrot - the software company *does* have to respond in a timely manner to alerts, bcos otherwise their product will get cracked.
"Then screw them" is one argument, but it assumes you're not affected. Suppose you happen to have one of those servers? Remember, it's not really the software companies affected, it's anyone who uses that company's products. So if someone finds a vulnerability in the Apache software and then cracks your server wide open, wiping all your data in the process, it's *you* that's suffered, not the Apache team who were slow in responding to the alert.
Grab.
Re:Do we really need a hat? (Score:5, Funny)
Yes, apparently you really need a hat.
Re:Do we really need a hat? (Score:3, Insightful)
Your post indicates that you think to earn the title "hacker" you have to break into other people's computer systems. Well, that's one definition I suppose (one I hate, and I'm not the only one [tuxedo.org]), but it is by no means the only definition [tuxedo.org].
Anyway, in order to answer the overall theme of this thread - "why the coloured hats" - it is helpful to understand both the history of the term "hacker" and to appreciate the prevalence of moral relativism. So, if you're sitting comfortably, then I'll begin...
The origins of the term "hacker" being used in relation to computers are described in the very detailed and entertaining book Hackers: Heroes of the Computer Revolution [amazon.com] by Steven Levy. From the Amazon editorial review:
So how did the meaning of the word change?
Well, this is where moral relativism comes in. It's human nature to justify yourself, and that's what people did. When mischievous computer users began entering computer systems without authorisation, they justified themselves (in their own minds) by claiming that they weren't doing any damage - just satisfying their curiosity.
"I'm not a criminal, I'm a hacker", they'd say.
Hence you have an entire culture of people that rate each other according to technical ability and/or morals, spawning such terminology as "lamer", "elite", "black hat", "grey hat", "white hat", and "script kiddie"; but funnily enough, it all seems to come down to the fact that people don't want to admit that they are doing something wrong - there is always someone worse than them.
Re:Do we really need a hat? (Score:2)
There are folks who are hired to test physical security systems (airport security audits have gotten a lot of press lately) who make use of the same approach as criminals would. These could well be considered "white hat" professionals.
Or were you explicitly taking exception to the entire Merlin vs the Evil Sorcerer aspects of the colored hats?
Re:Do we really need a hat? (Score:4, Insightful)
Not to sound like I'm getting up on my soapbox (I'm not), but it's one of the reasons I like Linux software. I know that if someone finds a problem with bind/apache/ftp that a fix is going to be published somewhere I'll read it (fyi, I don't go surfing the Microsoft website for patches) and I can fix the hole. It's comforting, and that's the defense I give people when they ask why they should use OSS for secure systems.
--trb
Re:Do we really need a hat? (Score:5, Insightful)
Ok. So you realize that merely reporting a security hole in a protocol to a company, with working source code, is a violation of the DMCA?
So, as a "security professional" you have now broken the law and should go to jail.
If we want to be sane about the situation, then people trying to hold themselves up as better than black hats need to get off their high horse. Realize that if you've found a security hole in a product, then you're probably not the only one. And yes, you should dutifully report it to the company with enough data/code for them to verify your claim, and give them time to address it (which is a key issue - how long is long enough?).
But what happens when they don't fix it? Do you just decide that you've done your duty and ignore the fact that someone else out there either has or will discover the hole and exploit it? Or do you report it to a public independent organization like BugTraq? To whom do you owe loyalty? The company producing the product, or the customers who are being left hanging in the breeze by the company?
I'll admit that I'm no hacker or security professional, but as a programmer I'd damn well want you to do the latter. It's called whistleblowing, and it's accepted as a viable method to right wrongs when other attempts to solve a problem have failed. This isn't a new concept, nor is it limited to the computer world. The only real difference is the speed at which companies are expected (and needed) to act.
Re:Do we really need a hat? No- just truth. (Score:3, Informative)
Security means nothing with the term hacker unless you are an uneducated manager. What you are referring to is a cracker and a completely different individual....
Please, get a clue as to which term is which. I don't care what the illiterate media calls them or how they use the term... a HACKER is not a criminal but a software and hardware genius...
A CRACKER tries to break into systems or bypass security. Why is this so hard for people to understand? The drivel that spews forth from the anchorwoman/man's mouth does NOT make it truth.
Hacker != Lawbreaker (Score:5, Insightful)
I fully support the use of the alternate term "cracker" to refer to people who use hacker-like skills (or often no skill, just downloaded cracker kits) to vandalise whatever system they can manage to crack. Yes, some hackers get sucked into these activities at some point in their development, but that doesn't mean it is condoned by the hacker ethic.
How about some analogies. When you check the door of the business down the street and find it unlocked, is it legal to wander around inside and see what you find? No, but if you didn't do any damage, it shouldn't be more than a legal slap on the wrist. If when you tried the door you triggered the alarm, or some damage was done just by trying it, you can expect someone to be pissed off, and maybe prosecute you when you try it again on another business.
If a responsible third party closely inspects and tests the security perimeter around your nuclear, chemical or biological plant, and finds vulnerabilities, what should be done? Right, first they tell you and the relevant government authorities, and if there is no real response for a reasonable period of time, tell someone else (press, other trusted third party, etc.).
What is going on now is a typical corporate response, and it is exactly the same as using SLAPP lawsuits to silence critics. It is evil and anyone getting hit by such tactics should get help from advocacy groups. Of course, staying away from controversy is one approach, but it doesn't give you good hacker-karma.
Re:Do we really need a hat? (Score:5, Insightful)
The term "hacker" has a lot of confusion tied to it. Where I come from it's a term of respect for someone's raw technical abilities. A hacker is someone who is so good at taking things apart and understanding them that they can make gadgets and software do things the original designers never dreamed of. If you think everyone fitting that description without "proper approval" belongs in jail you've got another think coming.
Maybe when you say hacker you mean someone who breaks into systems belonging to someone else without permission. Yes, that is a minor criminal act, much like trespassing. And there is no excuse for responsible adults doing such things without very good reason, but kids will be kids. (Sometimes a system is so insecure this can happen by accident [zdnet.com.au].)
The term hacker in general usage today usually covers both the system hacker who gains access to systems not belonging to them and the software hacker who takes apart software they have rightfully purchased on their own system. Classically, system hacking has been seen as wrong or illegal, but software hacking has always been accepted, and only disclosure has ever been at issue. The DMCA attempts to deal with both in one fell swoop and does so very badly. I take your comment to mean we should just enforce the law to its fullest even while it is changing in subtle and terrible ways.
White hats hide information. It seems they *never* disclose exploit code. Black hats hide information. They only use vulnerabilities for themselves. It would seem to be only Grey hats who hold the advancement of security important by sharing their code and knowledge fully. In fact, I'd say it is highly unethical for a White hat to get a vulnerability fixed without ever disclosing it. Perhaps we need criminal penalties for that as well? It also seems a tragedy that white hats will never be inclined to disclose their exploit code even after a fix has been made. They just don't seem to realize that information sharing really is a powerful positive good. (Wasn't that the hacker ethic? [tuxedo.org])
Actually there are a whole host of other things White hats can do, and do, that are wrong. Like implanting spyware in a product or being negligent in protecting customer information. I don't see criminal penalties for those...
Re:Do we really need a hat? (Score:2)
Funny that you mention that. Most actual misuses of sensitive information and computer networks come from current or past employees of the company compromised.
Funny thing is, most companies don't have anything to offer in the way of financial 'secrets' or documents of any worth that are on a network.
This isn't nearly as true as it used to be, even for the government.
One of the things companies will never understand is advice for free.
IMHO, companies like that deserve to go out of business.
There's a reason 50% [sba.gov] of all employees work for a small business in the US. Some large companies do an ok job of learning from their mistakes and not punishing those wishing to help them (cough*IBM*cough). But if a company thinks they can stay in business just by leveraging their position at the top (cough*Microsoft*cough), they've got another think coming.
If you can't know your own products well enough to know when something important comes up, and if you aren't willing to learn from your mistakes, I don't think you've got much business in software. (Yes, there is some hope for Microsoft left, but I don't see them properly chasing it just now [theregister.co.uk])
Re:Do we really need a hat? (Score:2)
Well... while obscurity helps a bit, disorganization is a poor substitute for security. I should know; it was the predominant method my company used to use!
Also, keep in mind that computers are getting more reliable, and most mid- to large-sized companies I've seen lately (okay, a lot more mid-sized, i.e. 100-1000 employees) do have very reliable networks. Once a network works for a year or two without hiccups, people start to trust it... whether it is secure or not.
Re:Do we really need a hat? (Score:2)
Maybe too good if you enjoy your freedom...
Re:Do we really need a hat? (Score:2)
Oh, what's that? The BIOS got cracked. Oh no, you benefit from the fruits of a hacker; shame on you! You should go to jail.
Some people...
Re:Cracker (Score:2)
These writers really need a geek consultant to get their terminology correct.
Re:Cracker (Score:3, Funny)
Well, there's The Jargon File [tuxedo.org], a.k.a. The New Hacker's Dictionary, which the writers could presumably consult whenever they write anything with a geek factor greater than 0. However, even The File's contents can be tough to grok by non-geeks, so I've decided to condense it into a form more easily digested by non-geeks:
Hopefully they'll get that one.
(I make no warranty of accuracy of my statements.)
Re:Cracker (Score:2)
There's also the problem of the use of cracker as a racial slur in the south.
Crackers are hackers (Score:2, Interesting)
The tools, tricks, and procedures used by one are used by the other. The original hackers were the original crackers. It was fun to break into things (be it your radio, your telephone, your telephone network, or someone's computer system). Well, what's the fun in just being there if no one knows you were there? This is where data stealing or defacing came in. This goes all the way back to when the hack/crack was as simple as making a scoreboard say MIT, when they didn't have a sports team, let alone being involved in the specific contest.
To you and me, it is obvious where a prank ends and malicious intent begins. To the person that has to clean up the prank, it is all malicious. So to you and me, there is a distinction between hacker and cracker, but to the layman, they are the same. Not because they don't know any better, but because to them the outcome is the same. And now with the DMCA and the like, the line is clearer.
And before someone says kernel hacker, the prankster hacker is where the term originated. So if anyone is using the term incorrectly, they are probably the ones that should get the chastising. Kernel hacking is such a small and specific subset of the word, it isn't what the term was created for, nor does it truly represent standard usage.
Re:Cracker (Score:2)
The "real term" is hacker, not cracker. Why? Because that's what the majority of the English-speaking population says it is. Get used to it, because unless you can convince Joe Sixpack and his favorite news anchor otherwise, that's the way it's going to stay.
You'd need to find another term anyway; cracker already has a commonly accepted meaning when it's applied to a person, and it has nothing to do with computers.
Re:Cracker (Score:2)
I understand what you're trying to say, but you use a very poor example. 'Pedophile' and 'homosexual' are both made up of Greek/Latin roots whose definitions have been set for centuries, even millennia, and are widely known. Additionally, the distinction between them is already set in the public consciousness.
That puts them in a whole different category than 'hacker', which has only really been used in the context we are discussing for perhaps 20 or 30 years. In that time the definition has been set in the public consciousness as "someone who knows a lot about computers", generally with a negative connotation. Just because you disagree with that definition, that doesn't make the definition wrong, it makes you wrong.
It comes down to this: if you don't want to be associated in the public eye with people who break into computer systems with malicious intent, don't call yourself a hacker. Trying to get everyone else to call computer vandals something other than 'hacker' is pointless because that definition is already set, and 'cracker', as I've already pointed out, is already used elsewhere. There are plenty of other words you can use to describe yourself and what you do that don't carry that negative connotation, such as 'coder', 'techie', etc. Use one of them. 'Hacker' is a lost cause.
Re:Cracker (Score:2, Informative)
This may be true, but I refuse to use the term incorrectly when I know better. Please read the following. I did not write it; it is from someone on a mailing list, after someone misused the term "hacker", then argued that it was the accepted use of the word. The author puts it better than I ever could. (You can view the original post to the list here [anti-dmca.org].)
-------
If you haven't already, read Orwell's "1984".
The use of words is absolutely critical, and using language for social engineering by governments, churches, and corporations is not the stuff of science fiction ... just ask anyone who works in marketing. It happens every day, and the deleterious effects on our society and our world which result are trivial to see. (Ponder the definition of the word 'terrorist' and how fluid it has become, and the real, physical consequences which are apparent and resulting in no small part from the misuse and mutation of that word.)
Now think to yourself: Who owns the rights to every dictionary in circulation (Merriam-Webster, Oxford, what have you)?
That's right, the publishers. Organizations that have been members of the copyright cartel since the sixteenth century, a cartel which in its history had at least one person drawn and quartered for possessing a printing press and not being a member of the cartel.
With respect to the word 'hacker' it is highly debatable whether the misuse of the term was deliberately and knowingly inserted into the dictionary as a form of semantic engineering, or whether the publishers simply picked up on the misuse of the term being promoted and propagated by another copyright cartel: the entertainment cartel.
The same applies to the word 'piracy,' though poking through some very early dictionaries certainly suggests its definition was changed as part of a conscious effort at semantic engineering (the incorrect, propaganda definition of the word equating copyright violators with rapists, pillagers, and murderers on the high seas was in at least one dictionary long before misuse of the word had become widespread).
What is known for certain is that, for other words of political significance, dictionaries have been known to publish definitions adhering to one political agenda or another PRIOR to their widespread use in language. The "authority" of the dictionary has been used, more than once, to deliberately modify and change the use of language to promote a political agenda.
If you're really interested in such things, look up the history of the usage of the words 'he' and 'his' as gender-neutral or gender-indeterminate pronouns. In the United States, singular 'they' and 'their' were in widespread use around the turn of the 20th century. Grammarians displaced that, deliberately, with 'he' and 'his'. One of the comments made by one of these early 'semantic engineers' was something to the effect of "as in nature, when there is a choice, the male pronoun shall dominate." It is only in recent years that the use of singular 'they' and 'their' as a gender-neutral pronoun has come back into use, despite the linguistic orthodoxy to the contrary.
There are other examples, indeed a plethora of them from the cold war and even the war on drugs.
In other words, blind faith in the dictionary is as misplaced as blind faith in anything else (e.g. religion, government, or McDonald's). The publishers have as many ulterior motives, and as unreliable ethics in pursuing those motives, as every other industry has come to have.
You misused the word 'hacker' on a mailing list of people who know better. You were corrected, you have been educated, and your response is to call everyone a hypocrite.
A community of hackers, in the old and august meaning of the word, is not at all hypocritical for being annoyed with you for misusing the term and equating them to a bunch of petty criminals, any more than a person of a particular ethnicity, who stands for freedom, is a hypocrite for being angry when another group deliberately denigrates them. Or, put another way, fighting speech with speech is not the same as advocating censorship, and you should recognize the difference.
Frankly, you should drop the attitude, admit you made a mistake, and move on. Everyone makes mistakes ... that is part of life. Clinging to them out of stubbornness, however, is just silly.
--------
Re:Cracker (Score:2)
And the use of the word hacker was in nationwide usage in the 1980's? I really, really doubt that. If it was in use, it was most certainly underground and not in mainstream media in any great capacity.
People who break into computer systems often identify themselves as "hackers". The politically correct thing to do is to allow them to self-identify.
Really? Hmm, I don't know anyone who breaks into computer systems who writes for the NY Times, or the Chicago Tribune, or any other major media outlet. Or are you telling me that all of the stories about "hackers" over the past decade have sources in the illegal hacker community? Interesting. Or maybe just BS. Do you honestly think that people reporting on people who break into computers ask them what they consider themselves to be?
Gee, so you consider yourself to be more educated about this issue than me, whooptie-do. You seem to be quite pleased to point it out, as an Anonymous Coward. I would think that such an educated person wouldn't stoop to the level of petty insults, no matter how cleverly and intellectually they try to phrase them. There is always going to be someone who is more educated in this or that. But it seems that no matter how smart some people claim to be, they can't resist the urge to act like an asshole.
Hope you feel educated after reading this.
Oh yes, great master, your vague references to "factual" events have greatly educated me. May I go now?
Forget the DMCA... (Score:2, Insightful)
"See? Look what he did! He 'hacked' into someone's computer, and now he's someone's bitch for life."
"But he didn't do anything damaging."
"He was HACKING. That's BAD. He's gone for LIFE. Let that be a lesson."
The lesson is that curiosity is now punishable by life in prison. Great. Don't get me wrong, traipsing into someone's computer isn't exactly ethically RIGHT (I don't care HOW wide open they leave it), but it's certainly not criminally WRONG.
Re:Forget the DMCA... (Score:4, Insightful)
I was under the impression that right and wrong were mutually exclusive. If it's not right then it has to be wrong. If you "traipse" into my computer you will go to jail. Pretty simple. Should I be able to pop the hood on your car in the Wal-Mart parking lot because I'm curious as to how your car is different from mine? What about your house? I'm interested in the architectural differences between our houses, so I break into your house because of my "curiosity." Please try to refrain from ridiculousness in the future.
Re:Forget the DMCA... (Score:2, Insightful)
An example more salient to this discussion: if your hood was open, and your windows were down, and your doors were open, etc., would you seriously expect your car to be untouched after you got out of Wal-Mart?
Re:Forget the DMCA... (Score:2)
What if I am unknowingly in your computer because someone else is routing through a hole in your system? Or is storing images on your system that are linked to a different site? Is requesting something from your computer wrong?
It's not wrong for me to go to your door and request to borrow a cup of sugar from you, nor is it wrong for me to request a ride from you.
This is why there is such confusion with computers; so many different analogies can be made to prove any side of any argument. Computers really need more concrete examples that belong to them.
Today, computers are designed to share information; the internet is designed to share information.
Really, we need to accept that, and focus on good security methodologies and technology implementation in all products.
Gone are the days when computers were isolated machines. It seems obvious, but people can't seem to get that through their heads.
Re:Forget the DMCA... (Score:2)
Yes, and if it's not in light then it must be in darkness, right?
I won't even go into the myriad of ideas or situations that exist in the grey area between right and wrong.
If you "traipse" into my computer you will go to jail. Pretty simple.
Ok, so what if I find a backdoor into my own computer? Should I report it to the company? If I do, and they do nothing to fix it, what then?
This shouldn't be hard for you to answer. After all, by your own statement there's one right answer and everything else is wrong.
Re:Forget the DMCA... (Score:2)
It shouldn't be considered ok to invade someone else's computer as long as there's "no harm done."
Most of the computer profession has been starting to agree with that statement. Unfortunately, gaining closure requires some compromise from both sides. You see, it also isn't RIGHT to create a shoddy piece of software and bill it as "secure and easy to use". Just as it isn't RIGHT to manage a "critical" server so poorly an average 12 year old can break in. Further, it's outright WRONG to misuse consumer information or to create and sell spyware to unsuspecting folks.
Perhaps "grey hats" are merely "white hats" willing to get a bit dirty in order to ensure that others don't stray into even worse colors. I personally applaud the work of bugtraq, @stake and others like them.
Re:Forget the DMCA... (Score:5, Insightful)
Should you be able to pop the hood on my car in the Wal-Mart parking lot to see how my car is different than yours? No.
Should you be able to pop the hood on my car to extinguish a fire in the engine compartment and keep it from destroying the vehicle, anything in it, and probably the vehicles on either side? Yes, please do!
But... you still "broke into" my car. Do you want to go to prison and enjoy the tender thrusts of Bubba for your good deed?
If you have an ftp server running on your machine, and I happen to notice it, I feel perfectly justified in connecting to that server. If it allows anonymous logins, I feel fine looking around. If not, I won't sit there and try to guess passwords, as that *would* be wrong.
Yet, if after logging in as an anonymous user, I manage to get access to your filesystem, I would feel obliged to leave you a note, telling you that maybe / isn't the best anonymous ftp root. Would you send me to prison for that? If so, I'd suggest you seek counseling, since you obviously have some personal insecurities and ego problems beyond your server.
The DMCA is an abomination. It creates a situation where one can be punished without actually doing anything beyond research. How many people who just happen to own Sharpies bought them with the criminal intent of listening to protected music CDs? Most of my Sharpies pre-date the DMCA, yet I am technically a criminal because they COULD be used to circumvent copy-protection??? All of you out there who have screwdrivers -- you can use those to unscrew poorly secured locks. There, now I'm in trouble for disseminating information about circumvention, and you're all screwed for having the tools. Go Law!
Re:Forget the DMCA... (Score:2)
netphilter is right that open doors don't make B&E legal. If you leave your door hanging open, and a robber comes in in the middle of the night, "the door was open" does not work as a defense strategy.
That being said, the important problem with the new federal hacking bill(s) is the harshness of the punishment. You can spend more time in jail for cracking someone's podunk little website than for rape.
Re:Forget the DMCA... (Score:2, Insightful)
No. Should you get life in the big house if you do that?
If you did that, but did not take or break anything, do you think you would get life in prison?
This Axis-of-Evil crap, which you are parroting here, is one of the worst abuses that two useless Bush administrations have come up with. Before, it was the War on Drugs; now it's the War on Terrorism. Hey, future presidents! Got some societal ill that's obviously far too complex and pervasive for you to begin to address? Declare war on it!
The rhetoric has not changed: You are either for us or against us! God bless the USA! (insert patriotic theme a la Animal House.)
The methodology has not changed: Caught with a couple grams of an herb considered harmful by some? Lose your house, lose your car, do prison time comparable to assault or manslaughter. Caught using or (God forbid) writing a sequence of computer code that an American media corporation finds inconvenient? Lose your house, lose your equipment, and off to the cooler where you can only hope that someone like EFF or the ACLU takes up your case.
Indeed! As in, simplistic, oversimplified, and simple-minded. Who did more damage to life, liberty and the American Way--Kevin Mitnick or Kenneth Lay?
Re:Forget the DMCA... (Score:2)
So is posting to slashdot on company time "RIGHT"?
gray/grey hats (Score:4, Insightful)
It seems to me that giving companies time to fix their holes is always a Good Thing (tm), but that a lack of public disclosure by a 3rd party will only help obscure legitimate problems. People with attitudes similar to that of Peter Lindstrom* demonstrate, to me at least, a lack of care towards users and their potentially open/vulnerable systems. One of the easiest ways to get a slow company to fix something seems to be to talk about it in the press.
* quote: ("If you are gray, you are black," Lindstrom said. "It's not that I don't understand what they are trying to do, but it comes down to what you are actually doing.")
Re:gray/grey hats (Score:2)
Actually, they consider them white hats (as do I). In the sidebar for white hats, it reads:
Information handling: Works with software companies to resolve vulnerabilities; won't announce vulnerabilities until company is ready or found to be unresponsive.
Typos are mine. The source is a GIF.
Never underestimate a suit's fear. (Score:3, Insightful)
Unfortunately, this fear overwhelms the suit's intelligence, which would tell the suit that in the long term, a climate where disclosing holes is discouraged merely limits access to the information to the so-called "black hats".
Obviously, an environment where most of the flaws and holes are only known by the less scrupulous, because you'd lawsuit-threatened the scrupulous out of finding the holes and telling you about them, just makes it that much easier for your programs to be hacked and your customers' data to be stolen - and then they definitely won't trust your product.
For me to poop on. (Score:3, Interesting)
The argument that you need to publish to the whole world instantly is absurd. Sure, a couple vendors may not be responsive, but most are. Even in the cases where the vendor's response is not entirely adequate, the "harm" posed by waiting is negligible because it's rather unlikely that some unknown hacker will discover the same bug and start exploiting it before then. Few would argue that the developers of Linux and a couple other leading open source packages are slow to respond, yet we see this same instant disclosure of code, often without a patch (even in the cases where a patch is provided, it's not necessarily one that is suitable).
The reason for this publication in the majority of cases is pretty simple. The publisher wants some recognition for his discovery. While this is understandable, there are other ways to gain recognition. For instance, he could disclose the fundamental details of the exploit to the public and/or a trusted 3rd party on discovery and attach a checksum or PGP signature of his official advisory that he sent to the vendor (in case someone else tries to take credit for the particulars, the corresponding document could be revealed and proven to have been known by the discoverer at least as early as when the first advisory was sent out). It may not bring him quite the same fame, but it would be something.
Even if the so-called "white" or "grey" hats cease to disclose these vulnerabilities to anyone, it would be virtually impossible for a large number of black hats to keep an exploit to themselves without it getting back to the security community. It's human nature to brag and to leak. What's more, I would argue that very few blackhats have the sophistication to come up with original exploits themselves. They pretty much depend upon the more knowledgeable people that disclose the vulnerabilities to the public. In other words, the community of people holding exploits usable against vulnerable machines would be far smaller.
Exploits 'held by the dark side' for _years_. (Score:2)
There are some very intelligent people coding for black hats. Many of the brightest people on the legitimate side of network security honed their skills as a black hat, then had a change of heart in the past few years as the threat of criminal charges grew larger, or after suddenly realizing that having a house, a wife, and kids changes your priorities. However, the pool of exploitable machines would be much much larger.
Restricting public exposure of holes has been tried, and found wanting. Limited distribution of the details of holes was the unwritten law in the 1980's and early 1990s (anybody remember the 'core' list?). This is why the creation of Bugtraq in 1993 was such a big deal. Prior to that, vulnerability information was carefully controlled, distributed to a limited pool of "trusted" admins... including the "daytime personas" of a number of black hats.
This approach did little to keep the black hats from learning about new vulnerabilities and writing exploits, and put little pressure on vendors to patch their software or pro-actively work to limit security holes.
Full-disclosure may not be ideal, but it is better than the alternatives.
Re:Exploits 'held by the dark side' for _years_. (Score:2)
Sure, and it only takes one person to leak. You can hardly have a group of 30 people or more and not have a leak after a week or two. So the question is something like this: Would you rather have 30 hackers attacking the same number of vulnerable targets for a slightly longer period of time, or 20,000 script kiddies (plus assorted people that have more skills) for slightly less? You do the math. I'd certainly take the 30, and that's assuming that the vendors are significantly less responsive (a premise with which I disagree)... by the mere fact that you give them, say, a 2 day lead time.
Well, this can quickly unravel into a semantic argument, but I disagree. Very few people that are not disclosing to the public or to the vendors have the ability to write their own exploits. Whatever hat you wish to put on them is an entirely different argument that I'm not interested in. I won't debate that many sophisticated people had their start in hacking, but the more sophisticated people quickly outgrow hacking into other people's servers for the sake of it as their skills develop. What fun is it to hack a bunch of servers with already known exploits (even if you created them) when you can do something that is actually intellectually challenging (e.g., discovering your own) and do it mostly above board while you're at it, not to mention profit from your legitimate fame? (Sure, someone on the fringes may engage in the occasional hack, but not en masse.) Yes, there is some undeniably blackhat code out there, but it's generally lacking in originality.
In much the same way (as you and others argue this point), "democracy" was tried by, and subsequently failed for, the Greeks (and others), so it could have been (and was) argued that it was the wrong path and should have been avoided in favor of monarchy, dictatorship, or the other extremes. Of course, we all know the United States and other democracies have since succeeded magnificently. The reason? Subtle and important differences in the governance and a different situation (class, geography, economics, etc). You can't neglect these important differences:
Firstly, what I'm asking for is not the same as the policy with CERT and other bodies. Those people pretty much gave CERT the information and then walked away from it. Instead, I'm giving the vendor a reasonable period of time to respond. If they fail to respond in that allotted time, then the hacker always has the option of making the same disclosure that they do today, only a day or two later. The vendor has every incentive to respond before the hacker does this. Secondly, you can hardly compare the situation today, with the growth of the internet (and lists devoted to distributing this sort of information to the public) and the increased interest in security, with that of 10+ years ago. It's an apples and oranges comparison. Thirdly, I've yet to see any objective evidence that full disclosure has been any more effective in practice (and yes, I was around and quite aware then). Maybe you can argue that the sysadmins and/or users are a little better armed today with knowledge, but the script kiddies are also armed in that same stroke... The difference is that the script kiddies are armed first, with real weapons (well, code at least), when the users only have knowledge that's of questionable value (even with this full disclosure, and even if the vendor tries as hard as it can, it may take more than a day to come out with a patch or an acceptable workaround).
This article starts with a poor example (Score:5, Insightful)
Facter writes "There is a great article at CNet..." but I wasn't so impressed. This example of Kevin Finisterre isn't really that amazing. Finisterre's employee publicly disclosed the vulnerability. You gotta expect to piss off HP when you do something like that. Look, I'm a fan of open-source software and I understand that publicly disclosing software bugs is one way of motivating a lazy company to plug those holes, but I'm not sure you can really defend this ethically. If you find a bug in Company A's software, then let A know about it. If A decides not to do anything about it (or if they are taking longer to plug the hole than you thought), I don't see how you are morally justified in leaking that info to the world.
Finisterre, who was not hired by HP, now says he'll think twice before voluntarily informing another company of any security holes he finds.
This is just silly. If he had just informed HP, there wouldn't have been a problem. However, his employee decided to inform the entire world, and that's what triggered HP's retaliation. If Finisterre and his employees restrict themselves to informing the company, they should be okay.
The rest of the CNET article is okay. But starting off with such a stupid example really weakens the story. They could have started off this story with the Sklyarov example. That would make a stronger case for the idiocy of the DMCA.
GMD
Re:This article starts with a poor example (Score:4, Insightful)
If I find a hole, I should be able to tell anybody I want about it, because it is speech.
If I found a hole in a major software product that could be damaging, would I tell the company first? Yes, because I believe that would be the moral thing to do, but freedom of speech is not about morals; it's about being able to say/write what I want to, even if it is not what society, or an individual, or a corporation, thinks is moral or right.
Re:This article starts with a poor example (Score:2)
So companies have the right to prevent my freedom of speech?
No, they have the right to fire your ass if you exercise your free speech in a way they don't like, or even take legal action against you (such as disclosure of trade secrets).
Freedom of speech != freedom from consequences.
Re:This article starts with a poor example (Score:2)
GMD
morally justified (Score:2)
Just because you found a hole, it doesn't mean that you are the ONLY one to find the hole. It's possible that any hole you find is an actively exploited hole.
While I'm not familiar with Kevin's case, I've been in a similar situation before. Bank A would not patch the holes in their banking websites. I notified them again and again. After months of waiting, I went public. Problem was solved the NEXT DAY! It was simply a matter of getting the right people to make it a priority. I feel that this is completely morally justified, and I don't think that the bug was exploited, and I don't think that USERS were harmed just because it was public. It may, however, have hurt Bank A's reputation.
You take the credit -- would you take the blame? (Score:3, Insightful)
While I'm not familiar with Kevin's case, I've been in a similar situation before. Bank A would not patch the holes in their banking websites. I notified them again and again. After months of waiting, I went public. Problem was solved the NEXT DAY! It was simply a matter of getting the right people to make it a priority. I feel that this is completely morally justified, and I don't think that the bug was exploited, and I don't think that USERS were harmed just because it was public.
Congrats on getting the bank to do something. And your sentence makes it clear that you feel that you deserve the credit for getting the bank to fix this.
Now I am wondering: what if the bank did not fix this problem the next day? And what if some cracker/con-artist used your publicly-disclosed exploit to cause significant damage to the accounts of one or more of the bank's customers? Would you be willing to take the blame for this? Yes, the bank should have fixed the problem, and you gave them ample opportunity to solve the problem themselves. But I would argue that, yes, you do bear some responsibility in this case. But that's just my opinion. I am curious what yours is.
You are very eager to take the credit for a case when a public exploit resulted in something beneficial. Would you also be willing to take the blame if your actions had had disastrous consequences? If so, then I salute you as a fair man/woman/slashkitty. If not, I wish I could smack you upside the head.
GMD
Re:You take the credit -- would you take the blame (Score:2)
Now I am wondering: what if the bank did not fix this problem the next day? And what if some cracker/con-artist used your publicly-disclosed exploit to cause significant damage to the accounts of one or more of the bank's customers? Would you be willing to take the blame for this?
The fact that an attack is performed shortly after the weakness is disclosed does not mean that (a) the attack would not have been performed had the weakness not been disclosed or (b) that the disclosure had any relationship whatsoever with the attack.
What's very clear, however, is that the correction of the defect has a direct, causal relationship with the public disclosure.
Certainly, public disclosure increases the odds of an attack, but it does not increase them from zero, and disclosure which results in the correction of the defect reduces them from the previously-unknown value to zero.
In most cases, the bank's customers are better served by public disclosure. For one thing, it lets them know that their bank behaves irresponsibly with their money, and gives them a good hint that they should take their business elsewhere.
I would agree that it's irresponsible to publish software that automates an exploit, and that doing so might place the author at fault, to some degree. Publishing the vulnerability on a secret crackers-only forum would be thoroughly reprehensible. And it's both polite and good for the bank's customers to give the bank a chance to fix the problem themselves before going public. But if the bank isn't willing to protect its customers unless its nose is publicly rubbed in the problem, then the responsible thing to do is to go public.
You are very eager to take the credit for a case when a public exploit resulted in something beneficial. Would you also be willing to take the blame if your actions had had disasterous consequences?
You have it backwards. The poster would be at fault if he had continued to keep it quiet until the customers' accounts had been emptied. The only difference is that there would be no one trying to apportion blame to him, so that is an /easier/ approach. But a much less moral one.
Re:You take the credit -- would you take the blame (Score:2)
It's unfortunate that the legal system tends to look more at actions instead of inactions. Did you ever see the final episode of "Seinfeld"?
I feel that there is less RISK to users if they know which company / product / website is more risky to use, and know which companies keep up to date on fixing things.
In the end, in my case, the type of bug in the bank's site had been listed in CERT for 2 years, along with how to fix it. I think that it's clearly the company's fault for not building a safe website.
The exploiter is to blame, not the revealer (Score:2, Insightful)
If I went to my bank and noticed the door to the vault was open, I would tell the manager about it.
If I came back the next day and it was still open, I would close my account. I would also feel ethically obliged to tell all the other customers at that bank that their money isn't secure.
A: Do you agree with that, in the terms of the analogy? (physical bank; physical door)
B: Does the analogy become any different when a computer is involved?
One person, and one person only, is responsible for a malicious exploit: the person who performed the exploit.
Networking protocols were designed for sharing information. There are (relatively) easy ways to ensure that only authorized recipients get information through these protocols. If a security system allows me access to parts of an internetwork, I have no reason to think I'm an unauthorized recipient of the information on that network.
Re:This article starts with a poor example (Score:2)
Wrong!! Read the above statement again. Still wrong.
Bugs and exploits make us (as users of the software) vulnerable - and because the software in question (HP Unix) is closed source, we are dependent on the software maker to fix these exploits. If they choose not to do so, or take their time, then we are obligated to ourselves and other users of the software to push the issue.
Any EULA or law that prevents this is flawed and needs to die (die! die!).
Now, the Finisterre story may still not have been the best argument - the article does say that HP was creating a patch - but there's no mention of how long it took them.
just my opinionated 2 cents...
Re:This article starts with a poor example (Score:2, Insightful)
What if HP made the car you drive your family around in?
Of course we should TRUST the corporations to fix all the problems with their products. Why wouldn't they? And of course don't let the public know that the new SuperFastExpensive SUV can explode if hit at the right spot; why should they know about that???
DMCA was a bluff... (Score:2)
I once sat down with a Talmudic scholar (I'm Jewish by choice, and find their ethical constructs best for tackling ethical questions) to discuss the ethics of hacking. The farther we got into it, the more I realized that "hacking," as used to define the uninvited attack of another person's system or systems, is fundamentally unethical. You want to make the world a better place? Stand up your own system and practice on it to find problems to fix.
I also think that public release should be delayed until the vendor addresses the issue. But if the vendor is unresponsive...I think releasing to the public is critical. I've seen situations where I've found a vulnerability and was prohibited from disclosing by an NDA with the client. Every time, the vendor failed to address it within 12 months. The times when it could be released, they were all over it like white on rice.
We need the grays (Score:4, Interesting)
We need independent reports on the software and systems we run.
In-house security research is typically poor, and a lot of times (cough Microsoft cough) companies refuse to make any information about flaws in their software public. Which means that without the greys, the blackhats will be exploiting flaws, and we poor sysadmins will have no idea how they're doing it, or how to keep them out.
Without the greys, where would CERT advisories come from?
Secondly, in the case of open source software, the public and the developers/owners of the code are the same group of people (in theory, if not in practice), so it's impossible to make any distinction between grey and white hats in this case.
Re:We need the grays (Score:2)
I suppose I shouldn't have killed so many of them in Deus Ex then...
Differences: (Score:4, Funny)
Whitehat: Finds a hole on your box, breaks in, writes a nice note to the admin about patching it.
Blackhat: Finds a hole on your box, defaces your homepage.
Script Kiddie: Hears about a hole on your box via AOL Instant Messenger, becomes utterly perplexed why his IIS rootkit won't work.
WH: Sees your
BH: Sees the same, breaks in, rm -rf
SK: Sees the same, pings your box, brags about it on IRC.
WH: Sees a probe coming from your machine, finds out its hacked, drops you a note.
BH: Sees the same, roots your box, roots the original attacker's box, kills him, kills his family.
SK: Gets rooted.
DISCLAIMER: This is humor, thank you
Re:Differences: (Score:2)
How To Hack The Hell Out Of A WWIV BBS [phonelosers.org]
security VS fame (Score:3, Insightful)
If you have somebody who's informed a company of their problem, waited for them to do something, and then finally anonymously or semi-anonymously posted the problem, then we have the "security" types that are looking out for all of us. Somebody who posts it as "hey look at me, I hacked XXX/YYY and somebody should fix it" is just looking for fame or possibly profit.
I think that if you can hack a system and then offer a viable fix/solution, without the implied threat of telling everyone in the world what the problem is, then you shouldn't be blacklisted as a "black hacker".
However, if you go off and tell everyone that so-and-so's software/network is insecure because they didn't pay you, then you're no better than an extortionist or a crook.
If you've bypassed security on a product that was hindering legitimate users, we have another really hard area to define. Anything that gets done to a company's product generally should be done with the grace of the producing company.
Perhaps one of the biggest problems is those who just jump out and post something on the internet without thinking of the ramifications to the owner/users of the product. If you post a security vulnerability and fix, you may be allowing a certain number of people to fix the problem, but you're also letting all the hackers out there know where there's easy prey in those that don't see the fix soon enough.
By the same token, if companies legally lambaste anyone who hacks and then offers a solution to their woes, it only makes things worse.
Corporations with insecure products/networks need to recognise that running for the lawyers isn't always the best solution, while those doing the hacking need to recognise that extortionist/fame mongering/otherwise damaging tactics aren't helping either.
If more companies can work with legitimate hackers in a productive way (as stated in the article, many have internal hackers), without inviting dozens of script-kiddies to poke at their servers, then perhaps one day the important people (we, the end-users) will find a day when we can legitimately use the products we pay for, in a meaningful manner, and without security woes.
It's not what you can do, it's how you do it that counts - phorm
If servers were Fords (Score:5, Insightful)
Now let's say you notice that my HP server is likely to be compromised. But there's a law in place that says HP can sue you if you tell me, because that violates their cracker security, which consists in not letting people who might be malicious know that the rear door of an HP could be a tempting target.
Exactly why should HP deserve a legal protection that no sane person would give to Ford, when in both cases the customers are far better off with the knowledge?
Re:The greater harm. (Score:3, Interesting)
The "spotless white" hat notifies Ford, but the company ignores the warning and goes on making the Pinto without any changes. The CIA, Mafia, and Mossad learn of the weakness (through leaks or by discovering the issue independently) and build selective exploits, using them against their enemies for several years before the weakness becomes widely known. (This scenario has played out in both physical security and remote software exploits more than once.)
The "light gray" hat tells Ford and his circle of 'leet buddies, and when Ford does not respond, some or all of his research notes are published to a "Full-Disclosure" list. Ford rushes out a fix in record time.
The "pitch black" hat builds selective exploit tools and sells them to the highest bidder.
Yes, it can be "the lesser harm" to publish. I've learned the hard way on more than one occasion that if you don't publish, most vendors will almost certainly not respond in a timely manner. They may create a fix and quietly distribute it in their next scheduled release, or they may just ignore the warning.
Meanwhile, other researchers (including some truly morally bankrupt black hats) are almost certainly looking at the same areas you are, and will eventually discover the same vulnerability independently, and begin to exploit it.
In case after case it has been demonstrated that for most vendors, nothing short of full disclosure is sufficient for them to take the problem seriously.
I thought there were only black-hats left? (Score:2)
Kjella
No traditional whitehats anymore (Score:2)
Of course, this doesn't have much to do with security anymore; it's all about making a profit and a feeling of security. After all, when you learn about a new, critical defect in Windows or some component of the GNU/Linux system, there's already a patch (at least in most cases, and the others are so obscure that you don't understand what's going on, so you really can't be bothered by them). So it's not that bad if you run software which is poorly designed and sluggishly implemented, is it? The whitehats will keep everything under control, and thanks to the new DMCA law, we can safely tell them from the blackhats!
sigh
(And BTW, the "responsible disclosure" document is referenced quite a lot for a withdrawn Internet Draft.)
DMCA isn't the big gun against hackers. (Score:4, Informative)
However popular it is to join the bandwagon railing against the DMCA anti-circumvention provisions (people seem to forget that the DMCA is itself an omnibus of technical and non-technical issues, good, bad and indifferent, and ranging from boat-hull designs to ISP immunities), the article's focus on DMCA is misplaced -- almost irresponsibly so.
The big guns against cracking conduct have been in place for years, and well before DMCA: The Computer Fraud and Abuse Act, the ECPA and countless state computer crime and regular theft statutes. All of these tend to be much broader in scope and reach, and far easier to prove and enforce. After the enhancements (from a prosecutor's point of view) made in the USA-PATRIOT Act, CFAA has become an even more powerful tool. The FBI didn't need a DMCA to get Kevin.
At the end of the day, the HP nonsense was just that: nonsense. The reason the HP DMCA threat was never pressed was simple -- it was a no-play claim, and everybody knew it. However, there are and have for years been a kazillion laws to beat up on anybody who engages in unauthorized access or exceeds authorized access of any kind, regardless of whether the conduct amounts to any circumvention of an effective copyright protection scheme.
I'm not arguing cracker ethics, or defending DMCA. I'm simply saying that the focus of the article is wildly misplaced. DMCA is just barely an interesting curiosity in the enforcement quiver -- so far as real cracking goes, it isn't even a fourth-string defense except in the oddest cases.
Re:DMCA isn't the big gun against hackers. (Score:3)
True, the DMCA is narrower than some of the other laws you cite because it is specific to security systems designed to protect copyright, and not security systems in general.
The article unfortunately confuses two gray hat actions: breaking into a system to report to the owner about its vulnerabilities without permission (which should be illegal in my opinion), and releasing exploit scripts to the public when vulnerabilities are found in commonly used operating systems or servers. I think the latter should definitely NOT be illegal, for First Amendment reasons if no other.
The DMCA stands apart from the other laws you cite, in that it criminalizes the latter activity (if the security system is primarily used to protect copyright.) The other laws only criminalize the former activity.
Re:DMCA isn't the big gun against hackers. (Score:2)
You will search in vain to find "hacking tools" among the proscribed devices set forth in DMCA. Only particularized devices are involved there, and very few of them have ANYTHING to do with cracking.
I don't disagree that the DMCA is pernicious, only that the conflation of it with these practices is bad karma for those who would like to criticize DMCA -- it's technically weak as an argument, and generally associates violators of DMCA with an image not favorably taken in the public at large. If you want to beat down the DMCA, don't blame everything on it, like some technological "El Niño."
There is simply no reason to think that releasing an exploit script directed to a technical vulnerability would be a DMCA violation -- and the HP backtracking that immediately followed their ludicrous overreaching is more evidence that DMCA is not implicated than that it is.
Re:DMCA isn't the big gun against hackers. (Score:2)
Explain that to Dmitry Sklyarov, who spent more than a month in jail for releasing a "hacking tool" that unlocks Adobe e-books.
Re:DMCA isn't the big gun against hackers. (Score:2)
That's just silly. This is some new use of the word "hacking tools." Certainly, Elcomsoft [elcomsoft.com] doesn't think so -- the words "hacking tools" do not appear on their web site.
Sure, you can try to define yourself out of this argument by treating the word "hacking" to mean whatever you like. But that's the same logical error -- you are still conflating the same concepts. If you define "hacking" to include the activity of trafficking in software for "unlocking Adobe e-books," congratulations! You won the argument. But so what? My point is that DMCA is not directed toward the conduct traditionally known as hacking by most of us (clever machination of technical systems) nor the conduct currently known as hacking (cracking). The DMCA anti-circumvention proscriptions may overlap with some cracking conduct, just as any number of other laws do -- that doesn't make it anti-cracking legislation, for the reasons stated earlier.
Re:DMCA isn't the big gun against hackers. (Score:2)
The DMCA criminalizes free speech and thereby nullifies the First Amendment.
And there were already laws aplenty against nefarious hacking; for what did we need another one??
Ethics (Score:2, Insightful)
Re:Ethics (Score:2)
Ever since you subscribed to a utilitarian view of ethics and there was a better option, I should think.
"If I find a problem with the tires that causes the car to flip I anm going to tell people about it."
Before or after it's flipped? And who exactly are you going to tell?
Nearly everyone's a grey-hat (Score:5, Interesting)
Not too long ago, I sent a note to several of my friends about a conflict [theregister.co.uk] I saw between the DMCA-esque proposed Microsoft security certification -- requiring software bug hiding and notification of the software vendor before notification of the affected client -- and the codes of ethics binding those with CISA and CISSP certifications -- both of which require protection or notification of the potential target/victim. (My personal favorite part of the ISC2/CISSP code is "Tell the truth", which is anathema to the DMCA/bug-hiding camp.)
Of course, since DMCA enforcement tends towards the corporate view of things (property, ownership, patents, royalties) rather than the societal view (ethics, trust, truth, community), if I follow the vendor-independent (societal) path, I get labelled as a grey-hat or a black-hat right out of the starting gate. Have I personally cracked and distributed software? No. But do I swear to uphold the right of the consumer to know of flaws in their software or implementation? Of course I do -- it's the core of my job as a consultant. But doing so may label me as a criminal, and not doing so is unethical and unprofessional. As the article points out, all you can do is try to do the right thing. Currently that may be illegal.
Maybe some of us will go to jail for it, but that's what it'll take to change or repeal ill-formed laws such as the DMCA. Nothing induces judicial scrutiny like a situation where a judge is embarrassed to enforce a bad law against a just person. But for anyone contemplating the notion of a "test case", keep in mind that the ACLU only picks up your legal fees if you keep your nose clean while you're doing the (illegal) right thing.
J
what's that quote from the Chronicles of Amber? (Score:2)
If I bought a truck.... (Score:2, Interesting)
If I took apart someone else's truck without asking for permission, I suspect I'd just get my ass kicked. But, charges could of course be filed by the owner of the truck as well.
Why is it different with computers? Why are there people here saying that someone who looks at something they've legally purchased and finds flaws with it is ethically in the wrong? And why should they not be able to speak up about it? The article is about a guy who reverse-engineered something on his own system. He didn't hack anyone else's system. What is wrong with that? I'm seeing tons of posts saying that all gray hats are black hats, or that ethically gray hat hacking is wrong although they do it anyway, and lots of garbage like that. What is gray at all about experimenting on your own machine when you've purchased the software?!? The whole gray/black/white hat stuff to me only applies (in any way, even if it is all b.s.) when you're poking into *other* people's computers.
Yes, if you find a hole, it's polite to everyone to give the company a chance to fix it before going public. But - that's a polite social thing to do. I see nothing wrong with telling an emperor or anyone else that he is butt naked. And if I feel like it, I should be able to tell everyone that the emperor is butt naked without asking his permission. That's called freedom of speech.
White / Gray / Black defined (Score:2)
WHITE
Hacks systems at the request of the system owner to find vulnerabilities. Helps system administrator eliminate obvious holes first. Gets a paycheck and free lunches from the IT manager.
GRAY
Inconsiderately hacks systems without the knowledge of the system owner, blinded by his good intentions. Notifies system administrator about holes in the system. Receives suspicion and a subpoena, gets free representation.
BLACK
Cracks systems in search of personal booty and root exploits. His back-door scripts leave no traces. Notifies the world by rerouting all requests for the public site to goatse.cx. Never gets caught, gets all the chicks.
A.C. (Score:2)
My black hat has a big `EFF' on the front... (Score:2)
Keep All Hacks Secret. (Score:2, Interesting)
Gray is Black.. I AGREE (Score:2, Insightful)
I agree that if you're gray then you're black. You might be black with good intentions, but you're still black.
It's like breaking into a store simply to warn the store owner that you could break into the store; no different. Or, to use a popular theme in other postings, the house with an "Open" sign on it: NO! It's more like going up to a house and trying all the doors and windows till you find one that is open.
Unless you are specifically asked by a company owner or software maker to exploit security holes, you shouldn't be doing it. If you're concerned about the security of the source, then choose an open-source alternative or write your own. If you're using COTS software, then ask the publisher for permission to test it for security holes; most will allow you as long as you're a paying customer. If they don't, you probably don't want to be using that vendor's application anyway.
It's all about property, people, and respecting people's privacy. Yes, it would be a utopian society if everybody could be online without fear of their network being compromised, but that's obviously not reality. We don't need vigilantes running around exploiting everybody's software or networks just because they can. It's not research, it's criminal; you've breached somebody's privacy even if you didn't do damage. If you want to practice, set up your own private network with software that allows you to do so. And no, I don't agree at all with the penalties associated with violations of the DMCA. They are outrageous and should be removed, and educated individuals should re-establish new ones.
How white is "white"? (Score:2)
My job (and my hobbies) involves legally acquiring software and hardware and testing it, tearing it apart, looking for weak spots.
That includes purchasing items like a Cisco PIX or a software firewall, testing for security holes, and often extends to writing and executing working exploits for these holes, on legally acquired copies running in my test lab.
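To make that concrete, the very first step of such a test is as mundane as the minimal sketch below: a TCP connect scan of a box on my own bench. The address and port list are placeholders for my own lab gear, not anything from the article; point it only at hardware you own.

```python
# Sketch: TCP connect scan of a firewall appliance on my own test
# bench. LAB_HOST and PORTS are placeholders for my own equipment.
import socket

LAB_HOST = "192.168.1.50"            # assumed address of the lab box
PORTS = [21, 22, 23, 80, 443, 8080]  # common service ports to probe

for port in PORTS:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(1.0)
    try:
        s.connect((LAB_HOST, port))
        print(f"{LAB_HOST}:{port} open")
    except OSError:
        # closed, filtered, or timed out -- all uninteresting here
        pass
    finally:
        s.close()
```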
These actions may violate the vendor's EULA. But they do not ever involve penetration of the network, host, or data belonging to an innocent third party. Do these acts make me a black hat?
Neither I personally nor my employers trust the publisher to do their own testing and report honestly on the results. If my customer agrees, I report issues to the vendor. If they do not respond, and if my customer agrees, I will post some or all information to a full-disclosure list. What color is my hat now?
While it may be in violation of the law or a civil transgression to "test" software after purchasing a legally licensed copy, I do not agree that such testing turns a grey hat to black.
I've breached whose privacy? That of the vendor who wrote the software or designed the hardware?
If I legally acquire software and hardware, install it on my private testbed, then exploit the software (locally, in my "sandbox"), it most certainly is research. It may also be criminal. If I take the results of my tests and publish them, that too is research, and under the DMCA or certain EULAs, may be unlawful.
Regardless of how the laws are contorted to depict my actions, I will not accept the label of "black hat" on this basis.
So you're saying Consumer Reports is illegitimate? (Score:2)
The fact that I did not obtain the publisher's permission does not magically redefine my activity to be "not research".
I bought a sports car. I don't think it goes fast enough. I swap out the intake system, have a machine shop rebore the engine, and I extract the manufacturer's ROM, edit the ROM image to tune the pre-computed fuel curve table, and burn a new ROM for myself.
All of this activity I define as "research". The car manufacturer might not agree, and will void my warranty. But the fact that I do not have permission from them to "hack" my car does not change the definition of my research to something else, it only changes my relationship with the vendor, and precludes me from obtaining future "tech support" from the vendor.
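For the curious, the ROM edit amounts to a few lines of script. Here's a minimal sketch; the offsets, table size, and checksum scheme are invented for illustration, since every real ECU image differs:

```python
# Sketch: patch a fuel-curve table inside a dumped ECU ROM image.
# The offsets, table size, and checksum scheme are all invented
# for illustration; a real ECU image would differ.

TABLE_OFFSET = 0x4F00     # where the fuel table lives (assumed)
TABLE_SIZE = 64           # 64 one-byte cells (assumed)
CHECKSUM_OFFSET = 0x7FFE  # 16-bit additive checksum at end (assumed)

def enrich(cell: int, percent: int = 5) -> int:
    """Scale one fuel-table cell up by `percent`, clamped to a byte."""
    return min(255, cell * (100 + percent) // 100)

with open("stock.rom", "rb") as f:
    rom = bytearray(f.read())

# Enrich every cell in the (assumed) fuel table.
for i in range(TABLE_OFFSET, TABLE_OFFSET + TABLE_SIZE):
    rom[i] = enrich(rom[i])

# Recompute the checksum so the ECU doesn't reject the image.
checksum = sum(rom[:CHECKSUM_OFFSET]) & 0xFFFF
rom[CHECKSUM_OFFSET:CHECKSUM_OFFSET + 2] = checksum.to_bytes(2, "big")

with open("tuned.rom", "wb") as f:
    f.write(rom)
```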
My clients choose to use non-open-source products. They choose to pay me to perform "research" on these products and supply my results either exclusively to my client, or to Bugtraq. I accept my client's conditions, and perform research for them. The fact that the company that sold them the hardware or software did not agree to this "research" does not change the definition of my activity.
If my client was "Consumer Reports", would you still have a problem with my research?
Consumer Reports buys all the items they test from retail outlets, and does not ask the manufacturer for permission to perform their "research": http://www.consumerreports.org/static/popup/didyouknow.html [consumerreports.org]
I don't mind wearing a black hat (Score:4, Interesting)
I'm already a criminal. I imagine most people on here are. Who the hell hasn't broken a law today? We're in a drought here in Maryland. Water a plant today, did ya? Broke the law. Have you let a teenager bum a cigarette? Criminal.
Why should anyone care what color hat they supposedly wear. It's an arbitrary label. I call myself a hacker. I don't break things. I don't steal things. I try not to hurt people I like. In my opinion, that makes me an OK guy. Of course, opinions vary.
Oh, and you... yeah you. Stop looking over your shoulder. I'm running crack against your password file right now. Might want to go change a few of 'em. Especially root. You know, the one that's your girlfriend's name. (And we both know she's not really your girlfriend. All you really have to do is ask her out, but you're scared. Pussy.) I'm only telling you all this because I like you. Now go ask her out, wimp.
A house is not a computer (Score:2, Insightful)
"Ok, say you someone breaks into your house/car/business but doesn't steal anything" to mirror the actions of "hacking"?
Yes, it really sounds like it might be a good analogy, but computers are absolutely none of the above.
There is no such thing as a nice citizen who comes around to your house and checks to make sure your door is locked and your jewelry is secured in your house. There never has been, there never will be, and there never will need to be, because the Internet is a way different medium than the real world.
Analogies are great for helping geeks explain computer terms to non-computer people, but no matter how you slice it an apple will never be an orange.
A prime example of how it doesn't work is in software "hacking". If a major gaping security hole in someone's software exists, it is something that desperately needs to be fixed immediately and brought to people's attention.
Imagine something simple like an IIS bug (no way!) that allows people to download the source code for some script on your server that includes things like database and system passwords. Some well meaning (gray) hacker tells Microsoft about this, and gets tossed in jail. Meanwhile the same exploit is found at the same time by a malicious (black) cracker, who tells all his l337 script kiddie friends and before you know it some poor startup companies have just given out credit card numbers and secure corporate information to exactly the wrong kind of people.
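(And that class of bug is not hypothetical: old IIS releases would serve raw ASP source if you appended ::$DATA to a URL. Below is a minimal sketch of checking your *own* test server for it; the hostname and page are placeholders, and the "<%" heuristic is just a rough tell for leaked script source.)

```python
# Sketch: probe your OWN test server for the old IIS "::$DATA"
# source-disclosure bug. The host and page are placeholders.
import urllib.error
import urllib.request

PAGE = "http://my-test-server/login.asp"  # your own test box (assumed)

try:
    with urllib.request.urlopen(PAGE + "::$DATA", timeout=5) as resp:
        body = resp.read()
except urllib.error.URLError:
    body = b""  # an error page means the trick didn't work

# Raw ASP source contains "<%" script blocks; rendered output doesn't.
if b"<%" in body:
    print("server is leaking ASP source (and any passwords inside it)")
else:
    print("no obvious ::$DATA source disclosure")
```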
Where is the white hat in all this?
Oh, he thought about the exploit, but didn't look into it because that sort of thing is naughty and he might get his pretty little white hat dirty.
Testing security measures and breaking software is absolutely necessary if we want to keep robust efficient systems across the country.
Do you really think other countries prosecute their L337 cR4X0rs when they break into our untested unsecured networks?
There have been hackers ever since there have been computers, and it needs to stay that way or we will all find ourselves up that silicon creek without a paddle.
Whitehats can break the law too. (Score:2)
The page says a black hat will not disclose their hacks and will use them for their own gain. That sounds like me. I run unix boxes and I think Windows in most cases is trash. When a client says their Windows boxes are just as secure, I've been known to show them why they aren't. I've had one client get all upset because I wouldn't explain to MS how I took down their secure box. MS isn't paying me, and they have done enough boneheaded things to make my life hell at times. I'm not going to do anything else that helps Gates and his evil minions make my job harder.
Re:DMCA (Score:5, Interesting)
One could take that to mean that early "white hat" hackers served their purpose successfully. By roaming through corporate systems, they managed to call attention to a lot of gaping security flaws that ended up getting fixed.
Also, roaming through corporate systems was a necessity for hard-core geeks in the days when Internet connectivity was prohibitively expensive. Much of what recreational hackers were "borrowing" other people's network resources for can now be done on a common consumer connection.
Re:DMCA (Score:5, Insightful)
One of the largest holes that I currently see is the lack of any security on all of the wireless networks! You can load a machine up and use a card with a MAC address that you use for nothing but hacking and NEVER be caught. The good ole days aren't gone; the good ole days are here right now. UNTRACEABLE baby, with COTS equipment at that. From my house with a 24dB antenna I can see ten networks that are not encrypted. I was threatened with a lawsuit recently when I informed a company of an unencrypted network that I found while driving to my house. I will never do that again; now I will keep them to myself just in case I want to do some "gray" actions. Don't get me wrong, I don't go around destroying networks, but with wireless in the state that it is in today, I could definitely do that.
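For anyone who doubts the "untraceable" part: giving the card a throwaway hardware address takes seconds. A minimal sketch, assuming a Linux box with the iproute2 "ip" tool and an interface named wlan0 (both assumptions; it needs root):

```python
# Sketch: set a random locally-administered MAC on a wireless card
# before a session. Assumes Linux with iproute2; "wlan0" is a
# placeholder interface name. Must be run as root.
import random
import subprocess

IFACE = "wlan0"

# First octet 0x02 = locally administered, unicast.
mac = [0x02] + [random.randint(0x00, 0xFF) for _ in range(5)]
mac_str = ":".join(f"{b:02x}" for b in mac)

subprocess.run(["ip", "link", "set", "dev", IFACE, "down"], check=True)
subprocess.run(["ip", "link", "set", "dev", IFACE, "address", mac_str], check=True)
subprocess.run(["ip", "link", "set", "dev", IFACE, "up"], check=True)
print(f"{IFACE} now using {mac_str}")
```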
Cheers
Re:DMCA (Score:2)
"Get your network out of my airspace or I will sue you for trespass."
Re:Dragonlance saga (Score:2, Insightful)
Reminds me of some primitive societies on our own planet, where they burn witches, medicine-men, doctors, anyone-with-specialized-knowledge-who-challenges-
Smart people, regardless of their intentions, have always been feared...
Re:There's no such thing as a whitehat (Score:2)
That being said, I do think you give the company a little notice (at most 5 days) before you release it.
Re:There's no such thing as a whitehat (Score:2)
Re:From the so-stupid-it-may-just-be-legal dept (Score:2)