Is Paying Hackers Good for Business?
Jenny writes "In the light of the recent QuickTime vulnerability, revealed for $10,000 spot cash, the UK IT Security Journalist of the Year asks why business treats security research like a big money TV game show. 'There can be no doubt that any kind of public vulnerability research effort will have the opportunity to turn sour, both for the company promoting it and the users of whatever software or service finds itself exposed to attack without any chance to defend itself. Throw a financial reward into the mix and the lure of the hunt, the scent of blood, is going to be too much for all but the most responsible of hackers. There really is no incentive to report their findings to the vulnerable company, and plenty not to. Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
Too late (Score:5, Insightful)
Re: (Score:3, Insightful)
Re:Too late (Score:4, Insightful)
If it were just the case that companies were ignoring the security issues in development because it was cheaper, well, that's business for you, but the reverse is commonly true. I'm simply amazed by the frequency with which people write their own products from scratch in areas where products that have already had all the "low hanging fruit" patched are freely available for commercial use!
Here's a hint: you're not going to save any money by writing your own user authentication mechanism, or your own RPC infrastructure, or your own file encryption software, or your own network traffic encryption software. You're spending money to re-invent the wheel, and you're getting a trapezoid with an off-center axle!
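To put the parent's point in concrete terms (a minimal Python sketch of my own, not something from the thread): even "simple" password handling, the low end of a home-grown authentication mechanism, is easy to get wrong, while the vetted standard-library primitives do it in a few lines:

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000) -> tuple:
    """Derive a salted PBKDF2 hash; store (salt, digest), never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 100_000) -> bool:
    """Recompute the hash and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse")
assert verify_password("correct horse", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Even this tiny sketch quietly depends on three non-obvious decisions (a random per-user salt, a deliberately slow iteration count, a constant-time comparison) that home-grown code routinely botches. That's the off-center-axle problem in miniature.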
Re: (Score:1)
Complete and utter bullshit.
The anti-virus/security industry has bent over backwards for over a decade to avoid even the appearance of impropriety. Recollect the public and nasty castigating of the University of Calgary over virus-writing courses to "train" antivirus researchers. After this and other efforts there are still large numbers of people who think antivirus companies write viruses. Offering bounties on vulnerabilities is no different from employing malware authors. This does nothing
Re: (Score:2)
Re: (Score:2)
Yes, selling exploits on the black market is illegal, but that's why it's called the black market, it's a place people go to sell illegal things.
Selling exploits is illegal? Or is it only illegal because it's on the black market? (and therefore illegal, because anything sold on the black market is illegal?)
I don't get it. Why would selling knowledge of security vulnerabilities be illegal? In the US, the DMCA makes selling copyright circumvention technologies illegal, but I'm not really sure that would apply to general security vulnerabilities. As I see it, cracking into somebody else's box is certainly illegal in most cases, but selling i
Re: (Score:2)
Comment removed (Score:5, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2)
No. What you said is not an analogy. Normal bounty hunters would look for exploiters on the lamb.
Re: (Score:1)
Re: (Score:3, Informative)
Re: (Score:1)
You're going to feel sheepish when you realize that should be "on the lam".
Re: (Score:1)
Re:Bounty Hunters (Score:5, Interesting)
At least on the black market, you know, honor among thieves.
Re: (Score:2)
No, that would be illegal. If a cop does it to you, it's entrapment, but in this case it would be... hell, I don't know what it would be. But by throwing the contest they're inviting people to attack their software, and unless your lawyer is utterly incompetent, the DMCA would not apply because you had express permission.
Re: (Score:2)
the DMCA would not apply because you had express permission.
Well, that really depends on how exactly the contest is stated. If you discover an exploit and then make an announcement about it at the same time you try to claim the prize, the company might turn around and sue you, saying that you didn't have the right to announce the exploit to the general public without their express permission. If on the other hand you discover it and only tell them and they try to sue you, yes, then you could pretty much laugh them out of court.
Re: (Score:1)
Well, that really depends on how exactly the contest is stated. If you discover an exploit and then make an announcement about it at the same time you try to claim the prize, the company might turn around and sue you, saying that you didn't have the right to announce the exploit to the general public without their express permission. If on the other hand you discover it and only tell them and they try to sue you, yes, then you could pretty much laugh them out of court.
At least, we'd like to believe so. Remember, justice != logic.
Re: (Score:1, Funny)
Nothing. Both Cops and Dog the bounty hunter get cool TV shows. Clearly that is the solution.
What i fail to understand (Score:4, Insightful)
Why couldn't one just announce "Joe Bob McHobo was the winner!" without publicizing the vulnerability itself before the software's author gets a crack at it?
Humanity is weird.
Re: (Score:2)
I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public. From what I have seen of most pe
Re: (Score:2)
I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public.
Would you be willing to do it assuming there was a reasonable lag time between the announcement of the discovery and the announcement of the details of the exploit? Something reasonable like say a week or two (agreed to before the contest is started), to give some time for the developer(s) to fix the problem and release a patch. Assuming that the requisite time has passed then either party could release the details, and also have a legally binding contract giving them that right (maybe with a clause that
Re: (Score:2)
You seem to have a reading comprehension problem. I suggest you look up the meaning of the
Re: (Score:2)
Re: (Score:2)
Pardon the grandparent for assuming that you meant something other than what you said. You've cleared that up, though.
There, fixed that for you.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Disclosure is key (Score:3, Interesting)
The most damaging holes are the ones that only the bad guys know about. This doesn't tend to advance security in software, it just allows people to take over your machine without your permission.
Security research or incentivization schemes that don't include a built-in mechanism to promote disclosure of the discovered problems won't help much.
Money laundering (Score:1)
The most damaging holes are the ones that only the bad guys know about
I think this is where we see why this is a money laundering scheme--with big money. The "bad guys" know of hundreds, maybe thousands, of holes. There is no shortage of security vulnerabilities in nearly any code base in modern software. There are people who have entire libraries of text files describing vulnerabilities for whatever they want.
Remember the semi-cynical description of job descriptions? From a random job seeker's point of view all job descriptions are things that they're seeking to fit themsel
Re: (Score:2)
Re: (Score:3, Informative)
Careful? Yes. Deliberate? Maybe. Diligent? Usually not, which is why we end up with ads requiring a decade of .NET experience or similar.
Usually the HR department knows jack diddly shit about the job they're writing requirements for. And if you hand them requirements that actually fit the position, they'll rewrite them anyway.
Re: (Score:2, Interesting)
And:
The second most damaging holes are the ones that both the bad guys and the developers know about, but no one else does
The third most damaging holes are the ones that everyone knows about
The fourth most damaging holes are the ones that only the developers know about
If you reveal an exploit, you know that you are in the third state. If you do not reveal an exploit to the public, but only tell the developers, you might have made things
Re: (Score:1)
Responsible disclosure (Score:4, Insightful)
Re:Responsible disclosure (Score:5, Insightful)
Wrong. Responsible Disclosure is an attempt to curb the greater-than-linear complexity associated with testing patches.
If a bug is found in product X, then all applications that reside upon product X need to be validated as functional. In an enterprise, that could include applications plus interfaces that are unique to that organization. Most studies on code complexity find that complexity increases at a greater than linear clip. Responsible Disclosure is the opportunity to level the playing field between the "good guys" and the "bad guys" (deliberately avoiding hat color references).
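As a back-of-the-envelope illustration of that superlinear growth (my sketch, not the parent's): if a patched platform hosts n applications and, in the worst case, every pair of applications shares an interface that must be revalidated, the retest burden grows as n(n-1)/2 -- quadratic, not linear:

```python
def worst_case_retests(n_apps: int) -> int:
    """Pairwise interfaces that may need revalidation after a platform patch."""
    return n_apps * (n_apps - 1) // 2

# Doubling the application count roughly quadruples the retest burden.
for n in (10, 20, 40):
    print(n, worst_case_retests(n))  # 45, 190, 780
```

Real enterprises won't hit the full pairwise worst case, but the direction is the point: disclosure timelines that look generous for one application are tight for an organization validating hundreds of them.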
Anyone who claims Full Disclosure is the best for his company is:
A) Not a sysadmin at all
B) A lazy sysadmin who refuses to fully test patches
-OR-
C) A vulnerability pimp (e.g. IDS, AV, Vuln Assessment, etc.)
Re: (Score:2)
I am under the impression that a bunch of fortune 500s want security bugs disclosed to software vendors and a select group of companies including themselves, and to nobody else. The problem is that EVERYBODY wants to be one of those select companies, which means the bug gets out anyway. So now the bugs leak out to those who would exp
Re: (Score:2)
Oh, please. Responsible disclosure isn't about who it embarrasses; this isn't high school. It's about lost data and compromised systems of real people and real companies.
What you're preaching is a form of Econ 101 [joelonsoftware.com] -- if we incentivize security patching via reputation, you'll have more people fixing their holes. Maybe, but regardless, I think you'll just have more people changing the definition of what constitutes a vulnerability.
Re:Responsible disclosure (Score:4, Interesting)
Actually, security through obscurity is very functional and useful as part of a security scheme. Your password is security through obscurity. Why don't you just give it to everyone if it makes no difference?
And if you disclose the wrong vulnerability to the general public you have a live, exploitable hole that everyone knows about sitting there ripe for attack. Which is better?
Responsible disclosure is simply evaluating what is best for the security of users and disclosing in that manner. In some cases, the best thing for overall security is immediate, public disclosure to pressure the vendor into fixing the hole more quickly and to give users a chance to work around the vulnerability. In other cases, where the vendor is responsive and there is no easy way to mitigate the vulnerability for the end user, immediate disclosure increases the risk to users with no real benefit.
Who is embarrassed is immaterial. Ignoring the likely consequences of your disclosure method, however, is irresponsible, which is why the alternative is called "responsible disclosure."
In many cases the vendor is very motivated and goes to work with all their resources immediately. Take a look at the OmniWeb vulnerability published by the MOAB project. Omnigroup implemented a fix within a day and had it available for download, but they do the same thing for bugs disclosed to them privately. All the immediate disclosure did was give hackers more time to exploit the problem before the fix reached users. Disclosing a vulnerability to the public before sending it to a responsible and security-minded development team is good for no one but blackhats. Also, rushing vendors to write code faster can result in more bugs in said code, including other vulnerabilities.
Please. Most users will not switch platforms because of security issues and many are locked into MS's desktop monopoly by some software they absolutely need and price constraints. The vast majority of users never even hear about security vulnerability disclosure in the first place.
Here's a tip for you from someone who does work in the security industry. If you're looking for a job in the field, don't expose your irresponsible ideas about disclosure if you want a chance at being hired somewhere respectable.
Re: (Score:2)
You're also wrong about security issues not having an impact on platform choice. No one sane runs their web server on Windows. User
Password not "obscurity" (Score:2)
Re: (Score:2)
Re: (Score:2)
Fixing bugs costs money.
Saying 'Company X has all these exploitable bugs' will cost Company X money, in stock price dropping, fewer consumers, etc.
Thus, exposing exploits can slow down the bug correction process by moving resources away from doing bug corrections. Bonus points if they lose enough money that they have to fire one of the code guys, and he takes a list of unpatched bugs with him when he goes.
Now, you're pushing u
hmm (Score:3, Insightful)
Maybe because the bugs they find are "showstoppers"?
NO WHAMMIES! (Score:2)
It's business (Score:1)
You quit charging me for your information, I'll quit charging you for mine.
Make no mistake, there's plenty of people out there perfectly willing to pay me for my information.
I wish... (Score:4, Funny)
Why not pay? (Score:3, Interesting)
Anything that is the most thorough way of eventually getting the programme secure is the best way to go about it. Period.
Stunning (Score:3, Insightful)
Re:Stunning (Score:4, Funny)
Wait, you mean there are stories/authors who don't get lambasted on slashdot?
I thought we pretty much did our best to rip every story to shreds?
Out of context. (Score:3, Insightful)
Damn the consequences (Score:5, Insightful)
Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies. Not unless the companies start making it (public) policy that they WILL NOT sue you as long as you disclose a vulnerability to them first, and give them a reasonable time to fix it before going public.
I think that'll never happen though, and the only way to safeguard a hacker is to make legislation against those type of lawsuits.
I also think that'll never happen either, considering how firmly planted the lips of those companies are to the politician's ass... So *#@& 'em, we just need a good way to disclose anonymously.
Re: (Score:1)
You've got the choreography reversed
Re:Damn the consequences (Score:4, Interesting)
So Apple sued the guy who disclosed this Quicktime vulnerability? If that happened, I never heard about it. In fact, I work in the security industry and very, very rarely hear about any lawsuits, which is why they are news when they do happen.
Why? Would such a statement stop them from later doing it? In general companies don't sue over vulnerability disclosures, no matter whether they are immediate, or if the vendor is given time. The reason security researchers tend to give companies time to fix things is because that is what they think is best for security, overall.
That doesn't really work. Basically you can sue anyone for anything in the US (with very few exceptions). I don't see the need for one here since I very rarely, if ever, hear about anyone being sued for disclosing bugs.
Re: (Score:1)
So many of us are alluding to this, but so few are actually calling it out
“No incentive” !? Is $10,000 so lacking as
Re: (Score:2)
But this debate is a bit silly since there are any number of legal firms that pay bounties for exploits in popular software, then extort huge "security consulting" fees out of the vendors to reveal these exploits. When the company offers the bounty directly it just cuts out the middle-man.
Re: (Score:2)
Re: (Score:1)
I'm sure anti-Americans regret any collateral damage they cause.
Responsibility (Score:2, Insightful)
If a construction company builds a bridge with defects that causes it to fall on someone, that someone can sue them.
If a software company makes an insecure product, and someone gets pwned because of it, that person should be allowed to sue for damages.
Yes, security holes aren't easy to find in big products, but that should never be an excuse for a company (especially one that makes billions, wink wink) to release
Re:Responsibility (Score:2, Insightful)
You forgot the EULA (Score:2)
Re: (Score:2)
In the case of Vista, what happens if you turn UAC off?
Re: (Score:3, Insightful)
You are somewhat correct. Sloppy coding techniques do lead to security vulnerabilities, which lead to exploit code, which eventually leads to websites burning, etc. However, that is only one category of security flaws. If you look at, say, the GDI flaws Microsoft had last year (for example), you'll notice that vulnerability is actually a design flaw-- allowing ex
Hackers? (Score:1, Informative)
MOD PARENT UP (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Slashdot editors on the other hand, should know better. There are enough of us here who actually are non-cracking hackers, after all.
"Hackers" is just one of several examples (Score:1)
"Linux" instead of "GNU/Linux" (when not referring specifically to the Linux kernel)
"piracy" instead of "copyright infringement"
I am sure we could dig up more.
Re: (Score:2)
(Don't believe the lies, hacker kids. Become successful and get your life together and everything else will fall into place. Except, fucktards who should know better think that hackers are crackers.)
bug testing? (Score:4, Interesting)
If a third party demands money to keep QUIET about a vulnerability, that's extortion.
Much of the animosity here is that many security researchers specialize in breaking things--they haven't ever worked on engineering a large, complex system. They just don't understand how much time is required to test code before it is released. Also, the legal teams for many companies just don't understand that alienating security researchers by filing lawsuits is only going to make their situation worse.
Re: (Score:2)
The problem is that bug testers want to be paid for their efforts. The companies will do anything, fair or unfair, to avoid payment.
To any bug testers, I offer these:
Hint: Never answer this question "What will you do if we don't pay for this information?".
Answering that question, with nearly any answer, can lead to extortion charges.
Next hint: Never demonstrate a vulnerability, to anyone, just document it. Written words are rarely illegal. Actions are more frequently illegal.
Not paying is -definitely- not good for business.. (Score:2)
who-the-which-what? (Score:2)
I'm a UK cit, I work in infosec, and I've a friend who's an IT hack (er, that is, journalist :) ). I have no idea who the UKITSJotY might be. My non-UK SIJOTY is Bruce Schneier, same as last year and the year before that, with Peter Neumann a close second.
I have a better idea! (Score:2)
Why not hire a professional team of assessment professionals to look at your stuff?
I'm not talking a lame corporate-compliance team, but a highly experienced team of world-class hackers, who are employed by an extremely reputable company and managed by an experienced staff capable of communicating problems quickly and completely.
Try this one: www.accuvant.com
Then you don't have any of these issues.
Of course, that wouldn't necessarily be as cheap. I think $10,000 would definitely be on
History (Score:1, Insightful)
Evolve or die? (Score:2)
As a customer, then, who should you buy from? The companies wit
Incredibly stupid (Score:2)
And the new crop of victims will never know who to thank.
Re: (Score:1)