Is Paying Hackers Good for Business?
Jenny writes "In the light of the recent QuickTime vulnerability, revealed for $10,000 spot cash, the UK IT Security Journalist of the Year asks why business treats security research like a big money TV game show. 'There can be no doubt that any kind of public vulnerability research effort will have the opportunity to turn sour, both for the company promoting it and the users of whatever software or service finds itself exposed to attack without any chance to defend itself. Throw a financial reward into the mix and the lure of the hunt, the scent of blood, is going to be too much for all but the most responsible of hackers. There really is no incentive to report their findings to the vulnerable company, and plenty not to. Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
Too late (Score:5, Insightful)
Re:Too late (Score:3, Insightful)
What I fail to understand (Score:4, Insightful)
Why couldn't one just announce "Joe Bob McHobo was the winner!" without publicizing the vulnerability itself before the software's author gets a crack at it?
Humanity is weird.
Responsible disclosure (Score:4, Insightful)
hmm (Score:3, Insightful)
Maybe because the bugs they find are "showstoppers"?
Re:Bounty Hunters (Score:3, Insightful)
Stunning (Score:3, Insightful)
Out of context. (Score:3, Insightful)
Damn the consequences (Score:5, Insightful)
Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies. Not unless the companies start making it (public) policy that they WILL NOT sue you, as long as you disclose a vulnerability to them first and give them a reasonable time to fix it before going public.
I think that'll never happen, though, and the only way to safeguard a hacker is to pass legislation against that type of lawsuit.
I think that'll never happen either, considering how firmly planted the lips of those companies are to the politicians' asses... So *#@& 'em, we just need a good way to disclose anonymously.
Re:Responsible disclosure (Score:5, Insightful)
Wrong. Responsible Disclosure is an attempt to curb the greater-than-linear complexity associated with testing patches.
If a bug is found in product X, then every application that sits on top of product X has to be revalidated as functional. In an enterprise, that could mean dozens of applications plus interfaces that are unique to that organization. Most studies on code complexity find that complexity grows at a greater-than-linear clip (a rough sketch of that growth appears after this comment). Responsible Disclosure is the opportunity to level the playing field between the "good guys" and the "bad guys" (deliberately avoiding hat color references) while all that revalidation happens.
Anyone who claims Full Disclosure is the best for his company is:
A) Not a sysadmin at all
B) A lazy sysadmin who refuses to fully test patches
-OR-
C) A vulnerability pimp (e.g. IDS, AV, Vuln Assessment, etc.)
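To make the parent's greater-than-linear point concrete, here is a back-of-the-envelope sketch in Python. It is not from the comment above; the pairwise-interface assumption and the numbers are purely illustrative.

# Illustrative only: assume every application on the patched product must be
# retested, plus every pairwise interface between those applications.
def revalidation_items(apps: int) -> int:
    """Rough count of things to retest after product X is patched."""
    pairwise_interfaces = apps * (apps - 1) // 2
    return apps + pairwise_interfaces

for apps in (5, 10, 20, 40):
    print(f"{apps:>2} apps -> {revalidation_items(apps):>4} items to retest")
# 5 -> 15, 10 -> 55, 20 -> 210, 40 -> 820: doubling the applications roughly
# quadruples the interface retests, which is exactly the window a coordinated
# disclosure period is meant to buy the defenders.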
Responsibility (Score:2, Insightful)
If a construction company builds a bridge with defects that cause it to fall on someone, that someone can sue them.
If a software company makes an insecure product and someone gets pwned because of it, that person should be allowed to sue for damages.
Yes, security holes aren't easy to find in big products, but that should never be an excuse for a company (especially one that makes billions, wink wink) to release unsafe products.
Re:Too late (Score:4, Insightful)
If it were just the case that companies were ignoring the security issues in development because it was cheaper, well, that's business for you, but the reverse is commonly true. I'm simply amazed by the frequency with which people write their own products from scratch in areas where products that have already had all the "low hanging fruit" patched are freely available for commercial use!
Here's a hint: you're not going to save any money by writing your own user authentication mechanism, or your own RPC infrastructure, or your own file encryption software, or your own network traffic encryption software. You're spending money to re-invent the wheel, and you're getting a trapezoid with an off-center axle!
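As a hedged illustration of the parent's "don't re-invent the wheel" point (not from the comment itself, just a sketch using Python's standard library; the iteration count is an illustrative placeholder, not a recommendation): salted password hashing with hashlib/secrets/hmac instead of a home-grown scheme.

import hashlib
import hmac
import secrets

ITERATIONS = 600_000  # illustrative work factor, tune for your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    # Fresh random salt per password; PBKDF2 does the key stretching.
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

Rolling the equivalent by hand (unsalted hashing, comparison with ==) is exactly the off-center axle the parent describes.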
Re:It's the WRONG APPROACH (Score:3, Insightful)
You are somewhat correct. Sloppy coding techniques do lead to security vulnerabilities, which lead to exploit code, which eventually leads to websites burning, etc. However, that is only one category of security flaw. If you look at, say, the GDI flaws Microsoft had last year, you'll notice that the vulnerability is actually a design flaw: allowing executable code to live embedded in file objects was the problem [the embedded code's trustworthiness had no mechanism to be measured, so any user double-clicking on a malicious code-within-an-image file would have their system compromised]. Design flaws are much trickier to prevent, and most experts attempting to solve this problem suggest that development houses leave the design aspects of their code to people with a background in security principles, or at least have some sort of design-time security review. This is mostly what formalized threat modeling attempts to do.
But you are right
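For what it's worth, here is a minimal sketch of what a design-time threat-model entry might record, loosely in the spirit of the formalized threat modeling mentioned above. The field names and the single example entry are illustrative assumptions, not Microsoft's process or the parent's words.

from dataclasses import dataclass

@dataclass
class Threat:
    component: str    # where untrusted data enters the design
    category: str     # e.g. a STRIDE category
    scenario: str     # what an attacker could do
    mitigation: str   # the design decision that closes it

model = [
    Threat(
        component="image renderer",
        category="Elevation of privilege",
        scenario="a crafted file carries executable content that the "
                 "renderer hands to an execution path",
        mitigation="treat parsed bytes strictly as data; no code path is "
                   "reachable from file contents",
    ),
]

for t in model:
    print(f"[{t.category}] {t.component}: {t.scenario} -> {t.mitigation}")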
Re:Responsibility (Score:2, Insightful)
History (Score:1, Insightful)