Security

Is Paying Hackers Good for Business?

Jenny writes "In the light of the recent QuickTime vulnerability, revealed for $10,000 spot cash, the UK IT Security Journalist of the Year asks why business treats security research like a big money TV game show. 'There can be no doubt that any kind of public vulnerability research effort will have the opportunity to turn sour, both for the company promoting it and the users of whatever software or service finds itself exposed to attack without any chance to defend itself. Throw a financial reward into the mix and the lure of the hunt, the scent of blood, is going to be too much for all but the most responsible of hackers. There really is no incentive to report their findings to the vulnerable company, and plenty not to. Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
This discussion has been archived. No new comments can be posted.

  • Too late (Score:5, Insightful)

    by orclevegam ( 940336 ) on Thursday May 10, 2007 @04:27PM (#19073935) Journal
    0-day exploits are already big business on the black market. It's better for companies to pay for disclosure and get a more secure product than for exploits to be sold off on the black market and only discovered after a significant portion of the user base has been compromised.
    • Re: (Score:3, Insightful)

      by !ramirez ( 106823 )
      There's a simple solution to this. Stop writing sloppy, insecure, poorly-managed code, and actually MAKE a product that works as advertised and is fairly secure. Hackers go after the low-hanging fruit. This is nothing more than a product of the 'get it out the door as quick as possible, damn the consequences' software industry mentality.
      • Re:Too late (Score:4, Insightful)

        by lgw ( 121541 ) on Thursday May 10, 2007 @05:19PM (#19074817) Journal

        There's a simple solution to this. Stop writing sloppy, insecure, poorly-managed code, and actually MAKE a product that works as advertised and is fairly secure. Hackers go after the low-hanging fruit. This is nothing more than a product of the 'get it out the door as quick as possible, damn the consequences' software industry mentality.
        While this comment is more flaming than is perhaps strictly necessary, this is certainly the heart of the problem. Security best practices are no longer a dark art. In my experience, people often do extra work to create security holes in their products.

        If it were just the case that companies were ignoring the security issues in development because it was cheaper, well, that's business for you, but the reverse is commonly true. I'm simply amazed by the frequency with which people write their own products from scratch in areas where products that have already had all the "low hanging fruit" patched are freely available for commercial use!

        Here's a hint: you're not going to save any money by writing your own user authentication mechanism, or your own RPC infrastructure, or your own file encryption software, or your own network traffic encryption software. You're spending money to re-invent the wheel, and you're getting a trapezoid with an off-center axle!
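
        For instance, a perfectly serviceable salted password check is a few lines on top of vetted primitives. A minimal sketch using only Python's standard library (PBKDF2-HMAC-SHA256); the function names and the iteration count are illustrative, not a recommendation:

            import hashlib
            import hmac
            import secrets

            ITERATIONS = 200_000  # illustrative work factor, not a recommendation

            def hash_password(password):
                # Derive a salted hash with a vetted primitive instead of a homegrown scheme.
                salt = secrets.token_bytes(16)
                digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
                return salt, digest

            def verify_password(password, salt, digest):
                candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
                return hmac.compare_digest(candidate, digest)  # constant-time comparison

            salt, digest = hash_password("correct horse battery staple")
            assert verify_password("correct horse battery staple", salt, digest)
            assert not verify_password("wrong guess", salt, digest)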
    • Bullshit.

      Complete and utter bullshit.

      The anti-virus/security industry has bent over backwards for over a decade to avoid even the appearance of impropriety. Recall the public and nasty castigation of the University of Calgary over virus-writing courses to "train" antivirus researchers. Even after this and other efforts, large numbers of people still think antivirus companies write viruses. Offering bounties on vulnerabilities is no different from employing malware authors. This does nothing
      • You must live in a very different world than I do. Yes, selling exploits on the black market is illegal, but that's why it's called the black market: it's where people go to sell illegal things. Because it's illegal and risky, the price is driven up, and the higher the price, the more people are willing to risk getting caught to make money. If on the other hand there's a legitimate legal way to make money, even if it's a fraction of that possible on the black market, more people will be willing to pass up the
        • by dougmc ( 70836 )

          Yes, selling exploits on the black market is illegal, but that's why it's called the black market, it's a place people go to sell illegal things.

          Selling exploits is illegal? Or is it only illegal because it's on the black market? (and therefore illegal, because anything sold on the black market is illegal?)

          I don't get it. Why would selling knowledge of security vulnerabilities be illegal? In the US, the DMCA makes selling copyright circumvention technologies illegal, but I'm not really sure that would apply to general security vulnerabilities. As I see it, cracking into somebody else's box is certainly illegal in most cases, but selling i

          • That's a fairly good point: selling the actual flaw is not (usually) illegal. What is illegal, however, and what is primarily trafficked in on the black market, are utilities (viruses, worms, and root kits, to name a few) that take advantage of those exploits to break into systems. Often the cracked systems themselves are sold off as well. Of course, with some of the most recent legislation, and creative lawyers and politicians putting a fair amount of spin on things, it may not be long before selling ANY exp
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Thursday May 10, 2007 @04:29PM (#19073973)
    Comment removed based on user account deletion
    • Re: (Score:3, Insightful)

      The problem with your analogy is that "bounty hunters" in the infosec debate would actually be searching for the exploiters, not the exploits.
      • by l4m3z0r ( 799504 )

        No. What you said is not an analogy. Normal bounty hunters would look for exploiters on the lamb.

        • Not even that. Normal bounty hunters would look for accused exploiters on the lam. Or did we decide that if you are on bail then you are guilty? If so, why are we letting the guilty go free for a short time?
          • Re: (Score:3, Informative)

            by Torvaun ( 1040898 )
            Generally, the accused but innocent don't take off. They stay in the state like they're supposed to, they show up to their trial, and then they most often get acquitted. Violating bail is, in fact, a crime, so a bail jumper is a criminal, regardless of whether or not he's guilty of the crime he put up bail for.
        • "Normal bounty hunters would look for exploiters on the lamb"

          You're going to feel sheepish when you realize that should be "on the lam". ;-)
        • This is slashdot, so where's the car analogy?
    • Re:Bounty Hunters (Score:5, Interesting)

      by Applekid ( 993327 ) on Thursday May 10, 2007 @04:45PM (#19074241)
      In the US, bounty hunters have legal protection to do what they do. If a company puts up a juicy reward for finding a security hole, the person coming forward could easily get the shaft and then be prosecuted under DMCA.

      At least on the black market, you know, honor among thieves.
      • If a company puts up a juicy reward for finding a security hole, the person coming forward could easily get the shaft and then be prosecuted under DMCA.

        No, that would be illegal. If a cop does it to you, it's entrapment, but in this case it would be... hell, I don't know what it would be. But by throwing the contest they're inviting people to attack their software, and unless your lawyer is utterly incompetent, the DMCA would not apply because you had express permission.

        • the DMCA would not apply because you had express permission.

          Well, that really depends on how exactly the contest is stated. If you discover an exploit and then make an announcement about it at the same time you try to claim the prize, the company might turn around and sue you, saying that you didn't have the right to announce the exploit to the general public without their express permission. If on the other hand you discover it and only tell them and they try to sue you, yes, then you could pretty much laugh them out of court.

          • Well, that really depends on how exactly the contest is stated. If you discover an exploit and then make an announcement about it at the same time you try to claim the prize, the company might turn around and sue you, saying that you didn't have the right to announce the exploit to the general public without their express permission. If on the other hand you discover it and only tell them and they try to sue you, yes, then you could pretty much laugh them out of court.

            At least, we'd like to believe so. Remember, justice != logic.

    • Re: (Score:1, Funny)

      by Anonymous Coward
      What's wrong with both?

      Nothing. Both Cops and Dog the Bounty Hunter get cool TV shows. Clearly that is the solution.
  • by Adambomb ( 118938 ) * on Thursday May 10, 2007 @04:30PM (#19073995) Journal
    What I don't get is why such contests HAVE to report what vulnerability successfully got through. Shouldn't the only ones who know be the company holding the contest, the successful hacker, and the companies whose software was involved in the vulnerabilities?

    Why couldn't one just announce "Joe Bob McHobo was the winner!" without publicizing the vulnerability itself before the software's author gets a crack at it?

    Humanity is weird.
    • What I don't get is why such contests HAVE to report what vulnerability successfully got through. Shouldn't the only ones who know be the company holding the contest, the successful hacker, and the companies whose software was involved in the vulnerabilities?

      I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public. From what I have seen of most pe

      • I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public.

        Would you be willing to do it assuming there was a reasonable lag time between the announcement of the discovery and the announcement of the details of the exploit? Something reasonable like say a week or two (agreed to before the contest is started), to give some time for the developer(s) to fix the problem and release a patch. Assuming that the requisite time has passed then either party could release the details, and also have a legally binding contract giving them that right (maybe with a clause that

        • I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public.

          Would you be willing to do it assuming there was a reasonable lag time between the announcement of the discovery and the announcement of the details of the exploit? Something reasonable like say a week or two

          You seem to have a reading comprehension problem. I suggest you look up the meaning of the

          • by Manchot ( 847225 )
            Pardon the grandparent for assuming that you weren't a zealot. You've cleared that up, though.
            • Pardon the grandparent for assuming that you weren't a zealot. You've cleared that up, though.

              Pardon the grandparent for assuming that you meant something other than what you said. You've cleared that up, though.

              There, fixed that for you.

              • by lgw ( 121541 )
                Well, yeah, the ancestor assumed "only an idiot would say that, so I'm going to give this guy the benefit of the doubt and assume he meant something slightly different". Don't worry, you have in fact removed all doubt.
        • That is precisely what I meant in terms of having the companies involved sort out their security details PRIOR to it becoming public. Reporting it publicly everywhere just gives people a how-to until the vulnerability is fixed.
          • I think the biggest concern is more over the time spans involved. It's important beforehand to agree on how quickly the details of any vulnerabilities should be disclosed; otherwise either the company isn't happy because it has to scramble to patch something overnight, or the researcher isn't happy because he can't release any of the details about what he discovered. Having a fixed time agreed to also serves as motivation for the company to actually do something, instead of just sitting on it and waiting un
  • Disclosure is key (Score:3, Interesting)

    by Uruk ( 4907 ) on Thursday May 10, 2007 @04:32PM (#19074017)
    The value of finding security holes is in disclosing them to everyone, particularly the affected vendor.

    The most damaging holes are the ones that only the bad guys know about. This doesn't tend to advance security in software, it just allows people to take over your machine without your permission.

    Security research or incentivization schemes that don't include a built-in mechanism to promote disclosure of the discovered problems won't help much.
    • The most damaging holes are the ones that only the bad guys know about

      I think this is where we see why this is a money laundering scheme--with big money. The "bad guys" know hundreds, maybe thousands of holes. There is no shortage of security vulnerabilities in nearly any code base in modern software. There are people who have entire libraries of text files describing vulnerabilities for whatever they want.

      Remember the semi-cynical description of job descriptions? From a random job seeker's point of view all job descriptions are things that they're seeking to fit themsel

      • About job descriptions: this is necessary because even if somebody is already doing the job, they still have to go through the same hiring process; at least that's the way it works in government and many other organizations. I knew a guy who was a contractor for a couple years, and they decided to turn his position into a full-time job. They had to hold interviews and everything, even though they were perfectly happy with his work, and he would require no training or time to learn about how things worked,
      • Re: (Score:3, Informative)

        by drinkypoo ( 153816 )

        In reality, though, job descriptions are the result of careful, diligent, and deliberate definition by HR departments who already have a candidate in mind.

        Careful? Yes. Deliberate? Maybe. Diligent? Usually not, which is why we end up with ads requiring a decade of .NET experience or similar.

        Usually the HR department knows jack diddly shit about the job they're writing requirements for. And if you hand them requirements that actually fit the position, they'll rewrite them anyway.

    • Re: (Score:2, Interesting)

      by EvanED ( 569694 )
      The most damaging holes are the ones that only the bad guys know about.

      And:
      The second most damaging holes are the ones that both the bad guys and the developers know about, but no one else does
      The third most damaging holes are the ones that everyone knows about
      The fourth most damaging holes are the ones that only the developers know about

      If you reveal an exploit, you know that you are in the third state. If you do not reveal an exploit to the public, but only tell the developers, you might have made things
      • What happens when you have enough exploits that you can't take them on a case-by-case basis? I know the usual answer is "Scrap the code and start over! It was shite to begin with." What about if the code is already in production? What about if it's vital system software? What if you just fired all the contract coders because you figured the job was done, realized the code was garbage, and don't have enough cash to hire coders to fix it? I think your system has a very well thought out and quite frankly lo
  • by morgan_greywolf ( 835522 ) * on Thursday May 10, 2007 @04:33PM (#19074027) Homepage Journal
    'Responsible disclosure' is a euphemism for 'we can't fix bugs fast enough, so if you keep the vulnerabilities a secret, it'll help us to save face.' And more time often means months, not days. Responsible disclosure is nothing more than security through obscurity. And security through obscurity is as good as no security at all. In the intervening months, you have a live, exploitable hole sitting there ripe for attack! And not just on that one system -- every like-configured system is vulnerable. I say, damn the consequences. Report as soon as possible no matter who it embarrasses. It'll either put more pressure on them to fix the bugs faster, or push users to more secure platforms, where security fixes don't take months and are usually found before they're ever exploited in the wild.
    • by malcomvetter ( 851474 ) on Thursday May 10, 2007 @04:51PM (#19074379)

      'Responsible disclosure' is a euphemism for 'we can't fix bugs fast enough, so if you keep the vulnerabilities a secret, it'll help us to save face.'

      Wrong. Responsible Disclosure is an attempt to curb the greater than linear complexity associated with testing patches.

      If a bug is found in product X, then all applications that reside upon product X need to be validated as functional. In an enterprise, that could include applications plus interfaces that are unique to that organization. Most studies on code complexity find that complexity increases at a greater than linear clip. Responsible Disclosure is the opportunity to level the playing field between the "good guys" and the "bad guys" (deliberately avoiding hat color references).

      Anyone who claims Full Disclosure is the best for his company is:
      A) Not a sysadmin at all
      B) A lazy sysadmin who refuses to fully test patches
      -OR-
      C) A vulnerability pimp (e.g. IDS, AV, Vuln Assessment, etc.)

      • by Rich0 ( 548339 )
        And how are all these organizations supposed to test their code if they don't know there is a vulnerability, or what might have changed in a product they just got a patch for?

        I am under the impression that a bunch of fortune 500s want security bugs disclosed to software vendors and a select group of companies including themselves, and to nobody else. The problem is that EVERYBODY wants to be one of those select companies, which means the bug gets out anyway. So now the bugs leak out to those who would exp
    • Report as soon as possible no matter who it embarrasses.

      Oh, please. Responsible disclosure isn't about who it embarrasses; this isn't high school. It's about lost data and compromised systems of real people and real companies.

      What you're preaching is a form of Econ 101 [joelonsoftware.com] -- if we incentivize security patching via reputation, you'll have more people fixing their holes. Maybe, but regardless, I think you'll just have more people changing the definition of what constitutes a vulnerability.

    • by 99BottlesOfBeerInMyF ( 813746 ) on Thursday May 10, 2007 @05:03PM (#19074595)

      Responsible disclosure is nothing more than security through obscurity. And security through obscurity is as good as no security at all.

      Actually, security through obscurity is very functional and useful as part of a security scheme. Your password is security through obscurity. Why don't you just give it to everyone if it makes no difference?

      In the intervening months, you have a live, exploitable hole sitting there ripe for attack!

      And if you disclose the wrong vulnerability to the general public you have a live, exploitable hole that everyone knows about sitting there ripe for attack. Which is better?

      Responsible disclosure is simply evaluating what is best for the security of users and disclosing in that manner. In some cases, the best thing for overall security is immediate, public disclosure to pressure the vendor into fixing the hole more quickly and to give users a chance to work around the vulnerability. In other cases, where the vendor is responsive and there is no easy way to mitigate the vulnerability for the end user, immediate disclosure increases the risk to users with no real benefit.

      I say, damn the consequences. Report as soon as possible no matter who it embarrasses.

      Who is embarrassed is immaterial. Ignoring the likely consequences of your disclosure method, however, is irresponsible, which is why the alternative is called "responsible disclosure."

      It'll either put more pressure on them to fix the bugs faster...

      In many cases the vendor is very motivated and goes to work with all their resources immediately. Take a look at the OmniWeb vulnerability published by the MOAB project. Omnigroup implemented a fix within a day and had it available for download, but they do the same thing for bugs disclosed to them privately. All the immediate disclosure did was give hackers more time to exploit the problem before the fix reached users. Disclosing a vulnerability to the public before sending it to a responsible and security-minded development team is good for no one but blackhats. Also, rushing vendors to write code faster can result in more bugs in said code, including other vulnerabilities.

      ...or push users to more secure platforms where security fixes don't take months and are usually found before they're ever exploited in the wild.

      Please. Most users will not switch platforms because of security issues and many are locked into MS's desktop monopoly by some software they absolutely need and price constraints. The vast majority of users never even hear about security vulnerability disclosure in the first place.

      Here's a tip for you from someone who does work in the security industry. If you're looking for a job in the field, don't expose your irresponsible ideas about disclosure if you want a chance at being hired somewhere respectable.

      • by Weezul ( 52464 )
        Yes, but he's not an insider. He's a guy who only once used a nice canned exploit to play a prank on a friend. All we outsiders see is news about some stupid/evil company who prosecutes some nice kid for "responsible disclosure". Kids are well liked by most. Adults who beat up kids are liked by none. So a vigorous assault on those adults' ability to beat people up seems best.

        You're also wrong about security issues not having an impact on platform choice. No one sane runs their web server on Windows. User
      • Your password is security through obscurity.
        A password is secret, not merely obscure. It's the key that fits the lock.
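
        The distinction is Kerckhoffs's principle: the mechanism can be completely public while all of the security rests in the key. A minimal sketch; the key, messages, and function names are made up for illustration:

            import hashlib
            import hmac
            import secrets

            # The algorithm (HMAC-SHA256) is public knowledge -- no obscurity here.
            # All of the security lives in this secret key: the key that fits the lock.
            SECRET_KEY = secrets.token_bytes(32)

            def sign(message):
                return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

            def verify(message, tag):
                # Constant-time comparison, so timing doesn't leak the secret either.
                return hmac.compare_digest(sign(message), tag)

            tag = sign(b"pay the bounty")
            assert verify(b"pay the bounty", tag)
            assert not verify(b"pay someone else", tag)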
      • Comment removed based on user account deletion
    • If 'they can't fix bugs fast enough,' more pressure will not help. It will hurt.

      Fixing bugs costs money.
      Saying 'Company X has all these exploitable bugs' will cost Company X money, in stock price dropping, fewer consumers, etc.
      Thus, exposing exploits can slow down the bug correction process by moving resources away from doing bug corrections. Bonus points if they lose enough money that they have to fire one of the code guys, and he takes a list of unpatched bugs with him when he goes.

      Now, you're pushing u
  • hmm (Score:3, Insightful)

    by eclectro ( 227083 ) on Thursday May 10, 2007 @04:33PM (#19074029)
    why business treats security research like a big money TV game show

    Maybe because the bugs they find are "showstoppers"?
  • What's the difference between you charging me for information, and me charging you for information?

    You quit charging me for your information, I'll quit charging you for mine.

    Make no mistake, there's plenty of people out there perfectly willing to pay me for my information.
  • I wish... (Score:4, Funny)

    by firpecmox ( 943183 ) on Thursday May 10, 2007 @04:43PM (#19074217) Journal
    My school would do this for me so I would stop getting suspended.
  • Why not pay? (Score:3, Interesting)

    by superbus1929 ( 1069292 ) on Thursday May 10, 2007 @04:44PM (#19074235) Homepage
    Here's my view: the one and only point of trying to find a vulnerability is to find the vulnerability. You don't care how it's done; you want that vulnerability found while you still have SOME control over it, instead of after it's been out in the wild and you have to patch around it. What's the best way to find your vulnerabilities? Have outsiders working towards a prize. Not only is it good publicity and a great line on the winner's resume, but you find just about everything wrong with your product. It's truly win-win.

    Anything that is the most thorough way of eventually getting the programme secure is the best way to go about it. Period.
  • Stunning (Score:3, Insightful)

    by Pheersome ( 116234 ) on Thursday May 10, 2007 @04:45PM (#19074239)
    Wow. How is it that an "ex-hacker" who now "specialises in security from the white hat side of the fence" (from the author's bio) can have so little clue about the responsible disclosure debate and the economics of modern vulnerability research? Maybe getting lambasted on Slashdot will be a wake-up call for him to actually do his homework before he spouts off.
    • Re:Stunning (Score:4, Funny)

      by merreborn ( 853723 ) on Thursday May 10, 2007 @05:19PM (#19074827) Journal

      Maybe getting lambasted on Slashdot will be a wake-up call for him to actually do his homework before he spouts off.


      Wait, you mean there are stories/authors who don't get lambasted on slashdot?
      I thought we pretty much did our best to rip every story to shreds?

  • Out of context. (Score:3, Insightful)

    by Kintar1900 ( 901219 ) on Thursday May 10, 2007 @04:45PM (#19074255) Homepage
    Nice way to take the situation out of context with the snippet here on /. I think the important question isn't whether public, for-pay security hunting is a good idea, but rather if it's ethical for an outside firm to pay for it. Would anyone have batted an eye if Apple had been the one advertising for a hack for the Mac? I don't think so, they'd probably have been lauded for having the wherewithal to offer good money to people to help them find exploits of their software.
  • by minion ( 162631 ) on Thursday May 10, 2007 @04:45PM (#19074257)
    Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
     
    Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies. Not unless the companies start making it (public) policy that they WILL NOT sue you as long as you disclose a vulnerability to them first, and give them a reasonable time to fix it before going public.
     
    I think that'll never happen though, and the only way to safeguard a hacker is to make legislation against those types of lawsuits.
     
    I also think that'll never happen either, considering how firmly planted the lips of those companies are to the politician's ass... So *#@& 'em, we just need a good way to disclose anonymously.
    • I also think that'll never happen either, considering how firmly planted the lips of those companies are to the politician's ass

      You've got the choreography reversed

    • by 99BottlesOfBeerInMyF ( 813746 ) on Thursday May 10, 2007 @05:09PM (#19074677)

      Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies.

      So Apple sued the guy who disclosed this QuickTime vulnerability? If that happened, I never heard about it. In fact, I work in the security industry and very, very rarely hear about any lawsuits, which is why they are news when they do happen.

      Not unless the companies start making it (public) policy that they WILL NOT sue you as long as you disclose a vulnerability to them first, and give them a reasonable time to fix it before going public.

      Why? Would such a statement stop them from later doing it? In general companies don't sue over vulnerability disclosures, no matter whether they are immediate, or if the vendor is given time. The reason security researchers tend to give companies time to fix things is because that is what they think is best for security, overall.

      I think that'll never happen though, and the only way to safeguard a hacker is to make legislation against those type of lawsuits.

      That doesn't really work. Basically you can sue anyone for anything in the US (with very few exceptions). I don't see the need for one here since I very rarely, if ever, hear about anyone being sued for disclosing bugs.

  • Responsibility (Score:2, Insightful)

    by Ariastis ( 797888 )
    They released a product with security holes in it; they should pay to have them found.

    If a construction company builds a bridge with defects that cause it to fall on someone, that someone can sue them.

    If a software company makes an insecure product, and someone gets pwned because of it, that someone should be allowed to sue for damages.
    Yes, security holes aren't easy to find in big products, but that should never be an excuse for a company (especially those that make billions, wink wink) to release
    • Re:Responsibility (Score:2, Insightful)

      by Strilanc ( 1077197 )
      The problem with your argument is it's much harder to create a secure software product than it is to create a secure bridge. This is especially true because delaying construction of a bridge for a month can be done without competitors swooping in and taking the market.
    • Basically, most EULAs will leave you hanging out to dry in this regard. They'll make sure you acknowledge that the company isn't responsible for security breaches, or at the very least you waive your right to sue for damages in such an instance.
    • by dodobh ( 65811 )
      So what happens when the hole is due to the interaction between multiple components, not all of which may be provided by the same vendor? What happens if you are not running in the recommended safe mode?

      In the case of Vista, what happens if you turn UAC off?
  • Hackers? (Score:1, Informative)

    by tm2b ( 42473 )
    Of course paying hackers is a good idea, if you want to generate any interesting code... Oh, wait a minute. Slashdot has bought into the lowest common denominator usage of "hacker" to mean a cracker. And here I thought my opinion of the Slashdot moderators couldn't get any lower, after I had moderation privs revoked for daring to criticize them on other matters...
    • Please mod up the "hacker-truth, moderator-bashing" post!
    • by mjeffers ( 61490 )
      Language changes over time and the meaning of the word "hacker" is now commonly understood to mean what geeks would term "cracker". Similarly, "gay" doesn't just mean happy and when I call someone a "bastard" I don't mean that they're the product of unmarried parents. You're fighting a battle that you lost at least 10 years ago.
      • Be careful... he might CrAx0r your B0xen... hmm... It doesn't have the same kind of ring to it, does it...
      • by tm2b ( 42473 )
        Not at all. I don't expect the mehums in the mass media to pay any attention, that battle is truly lost. It was more of a rout than a battle, truthfully.

        Slashdot editors on the other hand, should know better. There are enough of us here who actually are non-cracking hackers, after all.
    • More terminological abuse from Slashdot editors:

      "Linux" instead of "GNU/Linux" (when not referring specifically to the Linux kernel)
      "piracy" instead of "copyright infringement"

      I am sure we could dig up more.
  • bug testing? (Score:4, Interesting)

    by Lord Ender ( 156273 ) on Thursday May 10, 2007 @05:19PM (#19074825) Homepage
    Buying vulnerability info from a third party is just outsourcing your QA. It's just buying testing + bug reporting.

    If a third party demands money to keep QUIET about a vulnerability, that's extortion.

    Much of the animosity here is that many security researchers specialize in breaking things--they haven't ever worked on engineering a large, complex system. They just don't understand how much time is required to test code before it is released. Also, the legal teams for many companies just don't understand that alienating security researchers by filing law suits is only going to make their situation worse.
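
    If you want a feel for how mechanical that "testing + bug reporting" work can be, here is a minimal random fuzz harness; parse_record is a made-up stand-in for whatever code is actually under test:

        import random
        import string

        def parse_record(data):
            # Toy parser standing in for the real code under test (hypothetical).
            key, _, value = data.partition("=")
            if not key:
                raise ValueError("empty key")
            return {key: value}

        def fuzz(iterations=10000):
            # Throw random printable strings at the parser; anything other than
            # the expected rejection is a candidate bug report.
            for _ in range(iterations):
                blob = "".join(random.choices(string.printable, k=random.randint(0, 64)))
                try:
                    parse_record(blob)
                except ValueError:
                    pass  # expected rejection of malformed input
                except Exception as exc:
                    print("input %r raised %r" % (blob, exc))

        fuzz()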
    • by vinn01 ( 178295 )

      The problem is that bug testers want to be paid for their efforts. The companies will do anything, fair or unfair, to avoid payment.

      To any bug testers, I offer these:

      Hint: Never answer the question "What will you do if we don't pay for this information?"

      Answering that question, with nearly any answer, can lead to extortion charges.

      Next hint: Never demonstrate a vulnerability to anyone; just document it. Written words are rarely illegal. Actions are more frequently illegal.
  • OK, let's suppose you were to have a standard "date" and didn't pay. You might think this is just dandy, perfectly fine business but in fact the hooker probably has some associates who would be willing to break your kneecaps for that money. So from that perspective, paying hookers is definitely good for business.
  • UK IT Security Journalist of the Year...

    I'm a UK citizen, I work in infosec, and I've a friend who's an IT hack (er, that is, journalist :) ). I have no idea who the UKITSJotY might be. My non-UK SIJOTY is Bruce Schneier, same as last year and the year before that, with Peter Neumann a close second.

  • I have a better idea.

    Why not hire a team of professional assessors to look at your stuff?

    I'm not talking a lame corporate-compliance team, but a highly experienced team of world-class hackers, who are employed by an extremely reputable company and managed by an experienced staff capable of communicating problems quickly and completely.

    Try this one: www.accuvant.com

    Then you don't have any of these issues.

    Of course, that wouldn't necessarily be as cheap. I think $10,000 would definitely be on
  • History (Score:1, Insightful)

    by Koby77 ( 992785 )
    "Responsible disclosure" would have been great, except that history has shown us that it usually doesn't work. When "responsible disclosure" has been tried the vulnerability has lingered (especially with the larger corporations). When the vulnerability has been openly disclosed, then suddenly the software gets a patch. If history had been different then perhaps we would give the idea consideration. But it wasn't, and it was a problem created by the software companies themselves, so here we are today rea
  • If we can create a situation where bug disclosures are maximized, the products with the most serious security problems will die, and likely take their companies with them. So if you're a company that reasonably believes your products have few if any such bugs, your smartest bet is to encourage all companies to offer rewards to hackers - if you're right about the quality of your products, it will take your competition down and leave you standing.

    As a customer, then, who should you buy from? The companies wit
  • With humans in general, you get the behaviors that are rewarded or reinforced. Reward hacking, cracking, and exploits and you will get more of it, mostly focused in directions you didn't even dream of originally.

    And the new crop of victims will never know who to thank.
    • This is precisely why it is a good idea to pay hackers. There are many rewards already available to the exploiters, yet none of these reward disclosure to the affected parties. If we reward disclosure we can expect to see much more of it. The longer that bugs go undisclosed, the longer malicious hackers will have their zero day exploits. With readily available bounties, these exploits will be much harder to pass around the underground without someone leaking the info. This will close down communications bet
