Security

Is Paying Hackers Good for Business? 94

Jenny writes "In the light of the recent QuickTime vulnerability, revealed for $10,000 spot cash, the UK IT Security Journalist of the Year asks why business treats security research like a big money TV game show. 'There can be no doubt that any kind of public vulnerability research effort will have the opportunity to turn sour, both for the company promoting it and the users of whatever software or service finds itself exposed to attack without any chance to defend itself. Throw a financial reward into the mix and the lure of the hunt, the scent of blood, is going to be too much for all but the most responsible of hackers. There really is no incentive to report their findings to the vulnerable company, and plenty not to. Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
This discussion has been archived. No new comments can be posted.

  • Too late (Score:5, Insightful)

    by orclevegam ( 940336 ) on Thursday May 10, 2007 @04:27PM (#19073935) Journal
    0-day exploits are already big business on the black market. It's better for companies to pay for disclosure and end up with a more secure product than for the exploits to be sold off on the black market and only discovered after a significant portion of the user base has been compromised.
  • Re:Too late (Score:3, Insightful)

    by !ramirez ( 106823 ) on Thursday May 10, 2007 @04:30PM (#19073979)
    There's a simple solution to this. Stop writing sloppy, insecure, poorly-managed code, and actually MAKE a product that works as advertised and is fairly secure. Hackers go after the low-hanging fruit. This is nothing more than a product of the 'get it out the door as quick as possible, damn the consequences' software industry mentality.
  • by Adambomb ( 118938 ) * on Thursday May 10, 2007 @04:30PM (#19073995) Journal
    What I don't get is why such contests HAVE to report what vulnerability successfully got through. Shouldn't the results be known only to the company holding the contest, the successful hacker, and the companies whose software was involved in the vulnerabilities?

    Why couldn't one just announce "Joe Bob McHobo was the winner!" without publicizing the vulnerability itself before the software's author gets a crack at it?

    Humanity is weird.
  • by morgan_greywolf ( 835522 ) * on Thursday May 10, 2007 @04:33PM (#19074027) Homepage Journal
    'Responsible disclosure' is a euphemism for 'we can't fix bugs fast enough, so if you keep the vulnerabilities a secret, it'll help us to save face.' And more time often means months, not days. Responsible disclosure is nothing more than security through obscurity. And security through obscurity is as good as no security at all. In the intervening months, you have a live, exploitable hole sitting there ripe for attack! And not just on that one system -- every like-configured system is vulnerable. I say, damn the consequences. Report as soon as possible no matter who it embarrasses. It'll either put more pressure on them to fix the bugs faster, or push users to more secure platforms, where security fixes don't take months and flaws are usually found before they're ever exploited in the wild.
  • hmm (Score:3, Insightful)

    by eclectro ( 227083 ) on Thursday May 10, 2007 @04:33PM (#19074029)
    why business treats security research like a big money TV game show

    Maybe because the bugs they find are "showstoppers"?
  • Re:Bounty Hunters (Score:3, Insightful)

    by malcomvetter ( 851474 ) on Thursday May 10, 2007 @04:38PM (#19074107)
    The problem with your analogy is that "bounty hunters" in the infosec debate would actually be searching for the exploiters, not the exploits.
  • Stunning (Score:3, Insightful)

    by Pheersome ( 116234 ) on Thursday May 10, 2007 @04:45PM (#19074239)
    Wow. How is it that an "ex-hacker" who now "specialises in security from the white hat side of the fence" (from the author's bio) can have so little clue about the responsible disclosure debate and the economics of modern vulnerability research? Maybe getting lambasted on Slashdot will be a wake-up call for him to actually do his homework before he spouts off.
  • Out of context. (Score:3, Insightful)

    by Kintar1900 ( 901219 ) on Thursday May 10, 2007 @04:45PM (#19074255) Homepage
    Nice way to take the situation out of context with the snippet here on /. I think the important question isn't whether public, for-pay security hunting is a good idea, but rather whether it's ethical for an outside firm to pay for it. Would anyone have batted an eye if Apple had been the one advertising for a hack for the Mac? I don't think so; they'd probably have been lauded for having the wherewithal to offer good money to people to help them find exploits in their software.
  • by minion ( 162631 ) on Thursday May 10, 2007 @04:45PM (#19074257)
    Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
     
    Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies. Not unless the companies start making it (public) policy that they WILL NOT sue you as long as you disclose a vulnerability to them first and give them a reasonable time to fix it before going public.
     
    I think that'll never happen, though, and the only way to safeguard a hacker is to pass legislation against that type of lawsuit.
     
    I think that'll never happen either, considering how firmly planted the lips of those companies are to the politician's ass... So *#@& 'em, we just need a good way to disclose anonymously.
  • by malcomvetter ( 851474 ) on Thursday May 10, 2007 @04:51PM (#19074379)

    'Responsible disclosure' is a euphemism for 'we can't fix bugs fast enough, so if you keep the vulnerabilities a secret, it'll help us to save face.'

    Wrong. Responsible Disclosure is an attempt to curb the greater-than-linear complexity associated with testing patches.

    If a bug is found in product X, then all applications that reside upon product X need to be validated as functional. In an enterprise, that could include applications plus interfaces that are unique to that organization. Most studies on code complexity find that complexity increases at a greater-than-linear clip. Responsible Disclosure is an opportunity to level the playing field between the "good guys" and the "bad guys" (deliberately avoiding hat color references).

    Anyone who claims Full Disclosure is the best for his company is:
    A) Not a sysadmin at all
    B) A lazy sysadmin who refuses to fully test patches
    -OR-
    C) A vulnerability pimp (e.g. IDS, AV, Vuln Assessment, etc.)

  • Responsibility (Score:2, Insightful)

    by Ariastis ( 797888 ) on Thursday May 10, 2007 @04:58PM (#19074505)
    They released a product with security holes in it; they should pay to have them found.

    If a construction company builds a bridge with defects that cause it to fall on someone, that someone can sue them.

    If a software company makes an insecure product and someone gets pwned because of it, that someone should be allowed to sue for damages.
    Yes, security holes aren't easy to find in big products, but that should never be an excuse for a company (especially one that makes billions, wink wink) to release unsafe products.
  • Re:Too late (Score:4, Insightful)

    by lgw ( 121541 ) on Thursday May 10, 2007 @05:19PM (#19074817) Journal

    There's a simple solution to this. Stop writing sloppy, insecure, poorly-managed code, and actually MAKE a product that works as advertised and is fairly secure. Hackers go after the low-hanging fruit. This is nothing more than a product of the 'get it out the door as quick as possible, damn the consequences' software industry mentality.
    While this comment is more flaming than is perhaps strictly necessary, this is certainly the heart of the problem. Security best practices are no longer a dark art. In my experience, people often do extra work to create security holes in their products.

    If it were just the case that companies were ignoring the security issues in development because it was cheaper, well, that's business for you, but the reverse is commonly true. I'm simply amazed by the frequency with which people write their own products from scratch in areas where products that have already had all the "low hanging fruit" patched are freely available for commercial use!

    Here's a hint: you're not going to save any money by writing your own user authentication mechanism, or your own RPC infrastructure, or your own file encryption software, or your own network traffic encryption software. You're spending money to re-invent the wheel, and you're getting a trapezoid with an off-center axle!
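
    To make that concrete, here's a minimal, purely hypothetical sketch (assuming a vetted library such as libsodium is available) of what "don't roll your own" looks like for password storage; the library handles salting and an adaptive hash so you don't have to:

        #include <stdio.h>
        #include <string.h>
        #include <sodium.h>

        int main(void) {
            if (sodium_init() < 0)
                return 1;                       /* library failed to initialise */

            const char *password = "correct horse battery staple";
            char hash[crypto_pwhash_STRBYTES];  /* holds salt, parameters, and hash */

            /* At registration time: derive a storable hash string. */
            if (crypto_pwhash_str(hash, password, strlen(password),
                                  crypto_pwhash_OPSLIMIT_INTERACTIVE,
                                  crypto_pwhash_MEMLIMIT_INTERACTIVE) != 0)
                return 1;                       /* ran out of memory */

            /* At login time: verify a candidate password against the stored hash. */
            if (crypto_pwhash_str_verify(hash, password, strlen(password)) == 0)
                printf("password accepted\n");
            else
                printf("password rejected\n");
            return 0;
        }

    The point isn't this particular library; it's that the already-solved parts come from code that has had its low-hanging fruit picked over for years.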
  • by malcomvetter ( 851474 ) on Thursday May 10, 2007 @05:23PM (#19074877)
    I am a security analyst by profession and education [not that it matters, but it's a distinction from the previous poster's non-security background].

    You are somewhat correct. Sloppy coding techniques do lead to security vulnerabilities, which lead to exploit code, which eventually leads to websites burning, etc. However, that is only one category of security flaws. If you look at, say, the GDI flaws Microsoft had last year, you'll notice that the vulnerability is actually a design flaw: allowing executable code to live embedded in file objects was the problem [the embedded code's trustworthiness had no mechanism to be measured, and therefore any user double-clicking on a malicious code-within-an-image file would have their system compromised]. Design flaws are much trickier to prevent, and most experts attempting to solve this problem suggest that development houses should leave the design aspects of their code to people with a background in security principles, or at least have some sort of design-time security review. This is mostly what formalized threat modeling attempts to do.

    But you are right ... there are vast categories of vulnerabilities that end up compiled into code unnecessarily. A great place to start for anyone looking to weed these unforgivable buffer-overrun types of issues out of their code is a static analyzer. Essentially, static analysis tools attempt to catch these obvious (or sometimes not-so-obvious) bugs before the code is shipped to customers. Fortify Software [fortifysoftware.com] is a great place to look for such a tool.
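
    For a concrete (and deliberately contrived) illustration, here is the kind of textbook bug such a tool flags immediately, alongside a bounded version; the function and names are hypothetical, not taken from any real product:

        #include <stdio.h>
        #include <string.h>

        /* Unsafe: strcpy() copies with no bounds check, so any name of 16
           characters or more overruns buf. This is the classic overflow an
           analyzer flags. */
        void greet_unsafe(const char *name) {
            char buf[16];
            strcpy(buf, name);
            printf("Hello, %s\n", buf);
        }

        /* Safer: snprintf() is told the buffer size, so a long name is
           truncated instead of stomping the stack. */
        void greet_safe(const char *name) {
            char buf[16];
            snprintf(buf, sizeof buf, "%s", name);
            printf("Hello, %s\n", buf);
        }

        int main(void) {
            greet_unsafe("Bob");   /* only safe because the input happens to be short */
            greet_safe("a deliberately very long, attacker-supplied name");
            return 0;
        }

    A static analyzer sees the unbounded copy into a fixed-size buffer without ever running the program; that's exactly the class of mistake that shouldn't survive to a shipped product.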
  • Re:Responsibility (Score:2, Insightful)

    by Strilanc ( 1077197 ) on Thursday May 10, 2007 @05:25PM (#19074909)
    The problem with your argument is it's much harder to create a secure software product than it is to create a secure bridge. This is especially true because delaying construction of a bridge for a month can be done without competitors swooping in and taking the market.
  • History (Score:1, Insightful)

    by Koby77 ( 992785 ) on Thursday May 10, 2007 @07:07PM (#19076337)
    "Responsible disclosure" would have been great, except that history has shown us that it usually doesn't work. When "responsible disclosure" has been tried the vulnerability has lingered (especially with the larger corporations). When the vulnerability has been openly disclosed, then suddenly the software gets a patch. If history had been different then perhaps we would give the idea consideration. But it wasn't, and it was a problem created by the software companies themselves, so here we are today reaping the seeds that were sown.
