Are Bug Bounties the Right Solution For Improving Security?

saccade.com writes: Coding Horror's Jeff Atwood is questioning whether the current practice of paying researchers bounties for the software vulnerabilities they find is really improving overall security. He notes how the Heartbleed bug serves as a counterexample to "Linus's Law" that "given enough eyeballs, all bugs are shallow." "...If you want to find bugs in your code, in your website, in your app, you do it the old fashioned way: by paying for them. You buy the eyeballs. While I applaud any effort to make things more secure, and I completely agree that security is a battle we should be fighting on multiple fronts, both commercial and non-commercial, I am uneasy about some aspects of paying for bugs becoming the new normal. What are we incentivizing, exactly?"
  • by vidarlo ( 134906 ) <vidarlo@bitsex.net> on Saturday April 04, 2015 @10:58AM (#49404407) Homepage

    The NSA is buying security holes [techdirt.com] to use against us. This is part of what Snowden revealed with his leaks.

    Offering a bounty, even though it is not as much as the security problem could fetch on the grey market, creates a certain loyalty towards the vendor and makes it easier to go to them and ensure the hole gets patched. It also attracts more eyeballs to your software, since finding a problem means money. Google has gone even further by offering grants [google.no] for research into specific products, where you get money for checking the security of the software, not just for finding security problems.

    So I believe it is a good thing; it probably means more holes will be reported directly to the vendor instead of being sold as exploits. It probably attracts eyeballs as well...

  • by Rosco P. Coltrane ( 209368 ) on Saturday April 04, 2015 @10:59AM (#49404409)

    It pays better to exploit the bugs...

  • by QuietLagoon ( 813062 ) on Saturday April 04, 2015 @11:00AM (#49404413)

    ... He notes how the Heartbleed bug serves as a counter example to "Linus's Law" that "Given enough eyeballs, all bugs are shallow."...

    I think the big issue with the Heartbleed bug was that the OpenSSL code base was so egregiously poorly written and maintained that eyeballs started bleeding whenever they looked at it. imo, the OpenSSL code base never had enough eyeballs looking at it to make its bugs shallow. It was painful to look at, so eyeballs avoided looking at it.

    I still think that Linus' Law holds true, or at least is a very good guideline. I think exceptions like the OpenSSL code base are needed to hone the point that Linus' Law makes.

    I also take issue with the headline, as I do not think there is any one right solution for improving security. The improvement of security is a multi-faceted endeavour and an ongoing process.

    • by Rosco P. Coltrane ( 209368 ) on Saturday April 04, 2015 @11:08AM (#49404465)

      I think the only thing the OpenSSL bug shows is how flimsy the underlying framework of the internet is. Most of the shit we all use, trust and take for granted was coded in someone's basement over the weekend a long time ago. All it takes is one clever guy to take a good look at the code to exploit it, and it's probably fair to say he'll be the only one to ever review the code...

      • by Anonymous Coward

        I couldn't agree more. Just last month we learned that NTP is maintained by a single person who does it on his own time and dime (and who could stop at any time to look for a paying job). I opened the changelog and there is a lot of ongoing activity, so it's not as if he's just answering the occasional email.

        While I am very much into open source, and I don't blame it for this, I do believe that there is a tendency associated with it to take certain things for granted. That makes some of the biggest

        • ... Just last month we learned that NTP...

          NTP.org is its own problem. Even when there was more than a single person maintaining it, the developers looked less than favorably upon code improvement suggestions from the community.

          ... I do believe that there is a tendency associated with [open source] to take certain things for granted....

          You've hit on the main problem. It is not open source, per se, as you imply in your message. It is community involvement.

          Where you have a community that is involved and stays involved, bugs are shallow. When you have a community, such as NTP.org, where suggestions are pushed away, bugs become very deep.

      • ...Most of the shit we all use, trust and take for granted was coded in someone's basement over the weekend a long time ago. ...

        ... and the code written in nice air-conditioned offices in Redmond, Washington has shown itself to be so much more secure over the years ....

    • by OzPeter ( 195038 ) on Saturday April 04, 2015 @11:15AM (#49404497)

      I think the big issue with the Heartbleed bug was that the OpenSSL code base was so egregiously poorly written and maintained that eyeballs started bleeding whenever they looked at it. imo, the OpenSSL code base never had enough eyeballs looking at it to make its bugs shallow. It was painful to look at, so eyeballs avoided looking at it.

      I agree. Heartbleed is not a counterexample; it is simply evidence that the original "Linus's Law" was not complete. A better version of it would be

      Given enough eyeball hours, all bugs are shallow

      With the definition of "enough" being dependent on the complexity of the code in question.

      • A better version of Linus' Law would be the original one.

        So, if rapid releases and leveraging the Internet medium to the hilt were not accidents but integral parts of Linus's engineering-genius insight into the minimum-effort path, what was he maximizing? What was he cranking out of the machinery?

        Put that way, the question answers itself. Linus was keeping his hacker/users constantly stimulated and rewarded -- stimulated by the prospect of having an ego-satisfying piece of the action, rewarded by the sig

        • 8. Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.

          Seems to work for Firefox. Every time there's an N.0.0 release, there's an N.0.1 release in less than a week - every... freaking... time. I wish they'd focus on getting things done correctly rather than quickly, instead of churning out new major version numbers.

    • I think the big issue with the Heartbleed bug was that the OpenSSL code base was so egregiously poorly written and maintained that eyeballs started bleeding whenever they looked at it. imo, the OpenSSL code base never had enough eyeballs looking at it to make its bugs shallow. It was painful to look at, so eyeballs avoided looking at it.

      That's really just speculation.

      So let's everyone ask ourselves this question: how many times do we personally browse open source code, looking for vulnerabilities or other bugs?

      Let me guess that the answer is: I mostly run precompiled binaries, and might rarely take a look at a particular small piece of code to solve a specific problem (which I came across by running the binary).

      I suggest that it is likely that most OSS projects are like OpenSSL: only the core developers ever look at the codebase.

      My so

      • ...So let's everyone ask ourselves this question: how many times do we personally browse open source code...

        "Everyone" does not need to do it. You've set up a premise that fails on face value.

        • OK, good counterargument, but I still suspect that the number of such readers is extremely low outside the main developer team.

          The romantic vision of hackers around the world sitting comfortably next to a fireplace with a ThinkPad and browsing source code in the evening is just a fantasy...

    • A shallow bug is one that can be fixed, or at least understood and described, quickly, easily, or simply.

      That doesn't mean the bugs will be found, it characterizes what happens after they are found.

      I don't believe Linus' Law has anything to do with the number of bugs *found*, rather bugs *fixed*.

      It is the open source community that says more bugs will be found because anyone can read the source - but then no one reads the source. And then people (mis)understand that Linus' Law somehow means that all bugs

    • "He notes how the Heartbleed bug serves as a counter example to "Linus's Law" that "Given enough eyeballs, all bugs are shallow."

      He falsely notes that, since Linus' Law has absolutely nothing to do with it. He seems to think Linus said: 'If enough people work on a project, there won't be any bugs.' Linus' Law refers to the ability to track down, understand, and fix a bug once it has been discovered.

      And my first thought when I read the title was in line with yours. What a stupid question. Just as with secu

  • No, but it is one pathway out of several.
    • -1 redundant

      So we should view bug bounty programs as an additional angle of attack, another aspect of "defense in depth", perhaps optimized a bit more for commercial projects where there is ample money. And that's OK.

      That's from the fucking article. You're not helping.

  • Why aren't bounties considered a good idea? Yes, we've always argued that it gives developers an incentive to intentionally insert bugs so that they may profit off of them later. But are there any documented cases of that actually happening? For one, the authors of the code should automatically be prohibited from profiting by their mistakes. For another, there is a huge chance someone else would find and submit the bug first, and only the first finder should get rewarded. In human behavior, as a general ru
    • The problem is, Jeff is uncomfortable with the idea. That's the whole foundation of the linked article. But there is this point:

      The incentives feel really wrong to me. As much as I know security is incredibly important, I view these interactions with an increasing sense of dread because they generate work for me and the returns are low.

      Not all reports of security issues will be real issues, and if you offer bounties some people will be looking for an easy payout.

      Most of the article is useless junk:

    • It may be an effective component of your total bug strategy, but it should be last on the list. The primary effort should be oriented toward not releasing bugs in the first place.

      Let's say I create an adversarial system in my company. I pay developers a base salary plus an at-risk bonus for delivering software to QA by the deadline. If they deliver before the deadline, the at-risk bonus increases.
      QA has a base salary and can earn that at-risk bonus by finding the bugs between when the software is delivered to them for
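
      As a minimal sketch of how that adversarial pay scheme might be computed (all salary, bonus, and rate figures below are hypothetical, not from the comment above; Python):

          # Hypothetical adversarial dev/QA pay scheme. All figures invented.

          def dev_pay(base, at_risk_bonus, days_early):
              # Dev forfeits the at-risk bonus on a missed deadline and earns
              # an assumed 2% bump on the bonus per day of early delivery.
              if days_early < 0:
                  return base
              return base + at_risk_bonus * (1 + 0.02 * days_early)

          def qa_pay(base, bounty_per_bug, bugs_found):
              # QA earns its at-risk money as a bounty per bug caught
              # before release.
              return base + bounty_per_bug * bugs_found

          print(dev_pay(5000, 1000, 3))   # 6060.0 - delivered 3 days early
          print(qa_pay(4000, 50, 12))     # 4600.0 - caught 12 bugs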

  • ... is the best solution. Nothing gets them to fix the bug besides liability: the fear of lawsuits, embarrassment, etc. That's what gets them to take it seriously. Nothing else.

    MS had some bugs that they knew about for a decade that they didn't patch.

    You jump right to setting their nuts on fire.

  • by anwyn ( 266338 ) on Saturday April 04, 2015 @11:16AM (#49404509)
    Free software, yes. Bug bounties, maybe.

    But recent developments have made clear that securable hardware is a sine qua non. All firmware must be in socketed memory, so that you can take it out and check it externally. You can't trust an untrusted system to check itself. All firmware must be protectable with a hardware read-only jumper or switch.

    I know that this is inconvenient and a revolution in how hardware is currently made. But if people started demanding it en masse, it would not cost very much. And I mean the firmware in disk drives and optical media players and especially routers.

    There may be other requirements.

    This is a sine qua non. Without it we have nothing.

    • What I personally think is really scary is that a lot of devices in our PCs are ready to accept new firmware at any moment. There usually are no safeguards that I can enable to prevent malicious code from being injected into core components like the BIOS, CPU microcode, HDD, DVD...

      Now, in general, hardware security is a tricky concept, because currently the hardware layer is simply fully trusted.

      • by spauldo ( 118058 )

        Back in the old days, you'd generally have to either replace the physical chip or at least move a jumper in order to write to firmware.

        The current status quo exists simply to lower support costs: you don't have to send out a tech to update someone's firmware; you can just have them download a file and run it. Companies can also save money on QA because mistakes can be fixed later. To make matters worse, the vast majority of consumers don't understand the problem.

        I agree with you 100%, but this is money we're

        • Good point. Practicality trumps security here.
        • Adding a write-protect jumper only costs a few cents.
          And if you want to keep the convenience of downloadable upgrades, don't install the jumper.
          Sure, only a few percent of us would buy a motherboard because the BIOS had a write-protect jumper option, but that's still a few percent more sales.
          Plus it's a marketable difference - if you've got it and your competitors don't, then you can use scare tactics:
          "Unlike our competitors, we care about your computer's health, that's why all our motherboards h

          • Adding a write-protect jumper only costs a few cents.
            1 cent times 1 billion devices is $10 million in "lost profit".

            • And 1 percent less sales * 1 billion devices * $100 a board is $1 billion in "lost profit".

              If the jumper costs 0.1% of the profit on the device, then it only needs to improve sales by 0.1%.
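
              As a back-of-the-envelope check of the numbers in this thread (the 1-cent jumper, 1 billion devices, and $100 board price come from the comments above; the $10 per-board profit is an assumption; Python):

                  # Break-even arithmetic for the write-protect jumper.
                  # profit_per_board is an assumed margin, not from the thread.

                  jumper_cost = 0.01           # $ per device
                  devices = 1_000_000_000      # 1 billion devices
                  profit_per_board = 10.0      # assumed margin on a $100 board

                  total_cost = jumper_cost * devices              # $10,000,000
                  cost_fraction = jumper_cost / profit_per_board  # 0.001 = 0.1%

                  # Extra sales whose profit exactly covers the jumper cost:
                  extra_sales = total_cost / profit_per_board     # 1,000,000 boards
                  sales_lift = extra_sales / devices              # 0.001 = 0.1%

                  # The required sales lift equals the jumper's fraction of
                  # per-board profit, which is the parent comment's point.
                  print(total_cost, cost_fraction, extra_sales, sales_lift)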

    • Back when you needed to burn EPROMs, code was better at release. There were still updates, but the base code was solid. Even older DOS/Windows games were mostly finished at release, with some updates. Nowadays things are buggier, and software ships with features that will be added later through updates that are easy to install.

  • It's either someone who will fix the bug or someone who plans to exploit it.

  • by slashmydots ( 2189826 ) on Saturday April 04, 2015 @12:08PM (#49404749)
    Any expense that scales with the job it does or the problems it solves is a good thing. Why pay someone $XX per hour to sort books in a book store if you can pay them $0.05 per book and ensure static, scaled costs that tie directly to inventory cycling (aka sales)? Then again, you don't know how many bugs are in your program.
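
    As a toy illustration of a cost that scales with output (the wage, rate, and volume numbers are made up; Python):

        # Fixed hourly pay vs. per-unit ("per book" / "per bug") pay.
        # All numbers are invented for illustration.

        hourly_wage = 15.0      # $ per hour
        hours = 40              # hours paid regardless of workload
        per_book_rate = 0.05    # $ per book sorted

        for books in (2_000, 12_000, 40_000):
            hourly_cost = hourly_wage * hours      # fixed: $600
            per_unit_cost = per_book_rate * books  # scales with volume
            print(books, hourly_cost, per_unit_cost)

        # Per-unit cost tracks the work actually done; hourly cost does not.
        # The catch, as noted above: with bugs you don't know the volume.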
  • What about better QA? And no, don't have the devs do the testing.

    Also, QA needs full access to what they are testing. They need to be able to do the things end users can do, as well as manually set up the system/data in different ways, not only to make it easier/faster to test out some modes but also to set up some unusual modes/settings.

    QA needs to be able to think outside the box, and it should not be a side job tacked onto someone's other job.

  • by xlsior ( 524145 ) on Saturday April 04, 2015 @03:16PM (#49405727) Homepage
    ...was covered almost 20 years ago by Scott Adams: http://dilbert.com/strip/1995-... [dilbert.com]

"Engineering without management is art." -- Jeff Johnson

Working...