Security

Full-Disclosure Wins Again

twistedmoney99 writes "The full-disclosure debate is a polarizing one, but no one can deny that disclosing a vulnerability publicly often results in a patch, and InformIT just proved it again. In March, Seth Fogie found numerous bugs in EZPhotoSales and reported them to the vendor, but nothing was done. In August the problem was posted to Bugtraq, with a pointer to a descriptive article outlining numerous bugs in the software. Guess what happened? Several days later a patch appeared. Coincidence? Probably not, considering the vendor had stated "..I'm not sure we could fix it all anyway without a rewrite." Looks like they could fix it after all; they just needed a little full-disclosure motivation."
  • Re:The difference (Score:3, Interesting)

    by toQDuj ( 806112 ) on Wednesday August 15, 2007 @11:53AM (#20237573) Homepage Journal
    Sure, seatbelts are a prime example, but I've also seen recalls for much more mundane stuff, such as Ikea furniture and kiddie toys. A bug in software could really cause problems, albeit probably indirectly.

    B.
  • Two basic problems (Score:3, Interesting)

    by cdrguru ( 88047 ) on Wednesday August 15, 2007 @12:02PM (#20237673) Homepage
    Full disclosure results in announcing a bug not to the world, but only to the people who are paying attention. Does this include all the users of that software? No, not even most of them. So who gets informed? People looking for ways to do bad things. The users do not hear about the defect, the potential exploit, or the fix that corrects it.

    They are just left in their ignorance with the potential for being exploited.

    The "I want to do bad things" community has the information and is paying attention. Their community gets advance information before there even is a fix and they get to evaluate if it is worth their efforts to exploit it.

    The other group that gets to benefit from full disclosure is the media. Starved for news of any sort, bad news is certainly good news for them.

    All in all, full disclosure is simply blackmail. Unfortunately, no matter what the result is, the user of the affected product bears all of the negative consequences. Their instance of the product isn't fixed, because unless they are paying attention they don't know a fix exists. They lose support if the company decides to pull the product rather than kneel to the blackmail. And if the bug is exploited, the end user gets to suffer the consequences.

    You might think this would justify eliminating the liability exclusions that software products carry. There isn't any way this would fly in the US, because while we like to think we're as consumer-friendly as the next country, the truth is it would expose everyone to unlimited liability for user errors. Certainly it would invite unlimited litigation, even when the problem is finally shown to be a user error, which is by no means certain. And do not believe for a moment that you could somehow exclude software given away for free from damages. If you had an exclusion for that, you would find all software becoming free, except that it would be costly to connect to the required server for a subscription or something like that. Excluding free software would be a loophole you could drive a truck through.
  • by Anonymous Coward on Wednesday August 15, 2007 @12:28PM (#20238013)
    Here is a PERFECT example where
    a) change was needed
    b) public was unaware
    c) individual wanted change
    d) individual alerted a portion of the public
    e) change was made.

    No lawyers, no State, no violation of freedoms, no taxes, no fines.

  • I saw the vulnerability page. They don't have any access restrictions on subdirectories.

    Here's how I've solved this problem:

    1) Modify the .htaccess files (or even better, httpd.conf) so that ANY access to the subdirectories of the main app is forbidden. The only exceptions are: a) submodule directories, whose PHP files do a login check, or b) directories for common images (e.g. logos), CSS, XSLT, and JavaScript.

    2) The only way to view your files is through the web application's PHP file lister and downloader. This should be child's play for anyone with PHP knowledge: PHP has the fpassthru() function, or if you're memory-savvy, use standard fopen(). Make sure the lister refuses directories above the ones you want to list, and use the basename() function to strip any directory components from requested file names.

    3) Every file in the PHP application MUST include() your security file, which checks whether the user has logged in and redirects them to the login page otherwise. For publicly-available pages, add an anonymous user by default.

    4) For login (if not for the whole app), require HTTPS.

    4a) If you can't implement HTTPS, use a salt-based login, with SHA-256 (or at least MD5) for the password hashing.

    5) Put the client's IP in the session variables, so that any access to the session from a different IP gets redirected to the login page (with a different session id, of course).

    6) After login, regenerate the session ID.

    7) Put ALL the session variables in the $_SESSION array; don't use cookies for ANYTHING ELSE.

    I consider these measures to be the minimum standard for web applications (a rough sketch of several of them follows below). It shocks me that commonly used apps still fail to implement them properly.
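
    To make the list concrete, here is a minimal sketch of steps 2, 3, 5, 6 and 7 in PHP. It is an illustration, not code from EZPhotoSales or from the parent poster: the file names (download.php, login.php), the gallery path, and the $row user-table fields are all hypothetical. Step 1 is plain Apache configuration (a "Deny from all" in each protected subdirectory's .htaccess, with the exceptions described above), and step 7 is implicit here: all state lives in $_SESSION, never in extra cookies.

    <?php
    // download.php -- steps 2, 3 and 5: files are served only through
    // PHP, only to authenticated users, only from one whitelisted directory.
    session_start();

    // Step 5: bind the session to the client's IP. A request from a
    // different IP gets an emptied session, a fresh ID, and the login page.
    if (isset($_SESSION['ip']) && $_SESSION['ip'] !== $_SERVER['REMOTE_ADDR']) {
        $_SESSION = array();          // drop the possibly hijacked state
        session_regenerate_id(true);  // issue a different session ID
        header('Location: login.php');
        exit;
    }

    // Step 3: every script runs this check, in practice via a shared
    // include()'d security file.
    if (empty($_SESSION['user'])) {
        header('Location: login.php');
        exit;
    }

    // Step 2: basename() strips directory components, so a request for
    // "../../etc/passwd" degrades to the harmless file name "passwd".
    $dir  = '/var/www/galleries/';    // hypothetical storage path
    $file = $dir . basename($_GET['file']);
    if (!is_file($file)) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }

    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($file));
    $fp = fopen($file, 'rb');
    fpassthru($fp);                   // streams the file without slurping it into memory
    fclose($fp);
    ?>

    And the matching fragment of login.php for steps 4a and 6, where $row is assumed to be the user's database record:

    <?php
    // Step 4a: store hash(salt + password), never the bare password, and
    // compare hashes at login. (SHA-256 via hash() needs PHP >= 5.1.2.)
    if (hash('sha256', $row['salt'] . $_POST['password']) === $row['passhash']) {
        session_regenerate_id(true);  // step 6: new session ID after login
        $_SESSION['user'] = $row['username'];
        $_SESSION['ip']   = $_SERVER['REMOTE_ADDR'];
    }
    ?>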
  • How many? (Score:3, Interesting)

    by benhocking ( 724439 ) <benjaminhocking@nOsPAm.yahoo.com> on Wednesday August 15, 2007 @12:58PM (#20238365) Homepage Journal

    There *are* two genuine conflicting sides here and you can't just wave one of them away.
    I can count at least three, and I wouldn't be surprised if there are a lot more. Between telling only the company about a discovered security flaw and immediately announcing it to the entire world lies a whole range of possibilities. To name a few:
    • Initially tell only the company. If they do nothing, then release it to everyone.
    • Initially tell only the company, but tell them that you will release it to everyone in X days.
    • Initially tell the company and CC a few other white hats that you trust.
    • Initially tell the company and CC the Better Business Bureau, etc.
    (By "CC" I'm implying that you're letting the company know that you're telling other people.)
  • Re:The difference (Score:1, Interesting)

    by xmarkd400x ( 1120317 ) on Wednesday August 15, 2007 @12:58PM (#20238371)
    The difference might not be as big as you think. Most hardware that can harm people has a very specific application; if you modify it or use it outside of its intended use, the manufacturer has no liability. For instance: your seatbelt won't fit around your belly, so you cut it and sew some cloth in to make it longer. You get in an accident and die because the seatbelt broke. That is by no means the fault of the manufacturer.

    Now, here is how this applies to software: if vendors became liable for any bug whatsoever, they would start making similar claims. Their software could be used only on certified operating systems, and only with certified software. Any attempt to modify the source, the binaries, or the program's memory during execution would shift all liability to the user. I don't think the two scenarios are all that different.
  • False assumptions? (Score:3, Interesting)

    by mmeister ( 862972 ) on Wednesday August 15, 2007 @01:10PM (#20238543)
    There seem to be some false assumptions here. It is assumed that the company did not look at the bug and potential fixes until after it was "fully disclosed". If they released a fix a couple of days later, the more likely scenario is that they had already been looking at the problem and assessing their options for addressing it.

    Ironically, the full disclosure probably forced them to put out the fix before it was ready, with the attendant risk of new bugs. IMHO, forcing a company to rush a fix is not the answer. If you work for a real software company, you know that today's commercial software often has thousands of bugs lurking, although many are edge cases and are often more dangerous to fix than to leave alone (especially if there is a workaround).

    A company should be given enough time to address the issue. One can argue whether or not five months is enough time, but that's a different argument. I think forcing companies to constantly drop everything under threat of full disclosure will end up doing more harm than good.
  • by MostAwesomeDude ( 980382 ) on Wednesday August 15, 2007 @01:33PM (#20238825) Homepage
    I went back and looked at some statistics from my Subversion logs and bug tracker. I find that roughly 11% of bugs were "discovered", that is, filed first, by me. That means a whopping 89% of programming errors went unnoticed by me and were found by the community. Now, I may be a lone maintainer of code, but even in a team, bugs will still slip past. The assumption that the public, or at a minimum the black-hat community, knows more about your bugs than you do is not unreasonable. It is just as valid in the context of SQL injections in PHP scripts as it is in the context of buffer overflows in hardware DVD players.

    For example, read up on the ongoing attacks on AACS. The black hats (and yes, they are black hats) working on breaking AACS have exploited all kinds of software and hardware bugs and shortcomings in order to gather more information and cryptographic secrets. They have the upper hand precisely because they are not fully disclosing their work. If they were to fully disclose the bugs in the various tabletop HD-DVD players and software tools they use to garner keys, you can bet the problems would be fixed. As it is, though, they are still ahead of the AACSLA.
  • True economics (Score:1, Interesting)

    by Anonymous Coward on Wednesday August 15, 2007 @02:12PM (#20239331)
    There seems to be this strange notion that blackhats benefit from full disclosure.

    The thinking seems to be something like this: when a bug is disclosed, blackhats that were unaware of the bug become informed and have a window of exploitation until the bug is patched.

    This seems absurd to me. As soon as the bug is disclosed, users become aware and can immediately block the vulnerability. If there is no other solution, they could at least stop using the vulnerable software. So the window of exploitation is the amount of time from the disclosure to widespread awareness and shutdown of buggy software.

    Some would say it is over-simplistic to think that you can just shut down vulnerable software; some might claim it just isn't practical. I think what this argument really means is that it could be very costly for some enterprises to shut down vulnerable systems. The system administrators would have to weigh the costs of shutting down against the costs of being exploited.

    Full disclosure is really just an economic issue: it highlights the costs of using buggy software. Distributors of buggier software may not appreciate that reflection on the total costs of using their products. Some businesses and people may not appreciate being forced to recognize the total costs of the software they use.

    Some people may tweak their bathroom scales to make them feel better about the total costs of their dietary habits. But they shouldn't rant about standard, untweaked scales being unethical in their methods of disclosure.

    If the truth about the software you develop or use is uncomfortable, don't try to cover it up. Hiding your eating disorder doesn't solve the problem.

    You can't make the best economic decisions unless you recognize the true economics of your software choices.
  • Call me sceptical (Score:3, Interesting)

    by RingDev ( 879105 ) on Wednesday August 15, 2007 @02:15PM (#20239385) Homepage Journal
    I'm not familiar with the software in question, but are they saying that the company did nothing for months, then the vulnerabilities were posted publicly, and in less than seven days the company became aware of the post, tested the vulnerabilities, designed a solution, corrected the code, and had a software update tested and ready for deployment?

    If so, that is some AMAZING response time. But I would venture a guess that they had already been working on the corrections. The public posting may have made a couple of coders work overtime, and cut the testing phase out of the cycle, but for them to do the whole thing in less than seven days is highly unlikely.

    Not only that, but since they would have had to cut short, or cut out entirely, the testing phase of the release, it is MORE likely that security issues remain, or that new security issues have been created and not yet found.

    I'm not sure I'd call this one a "win" just yet.

    -Rick
  • by mmeister ( 862972 ) on Wednesday August 15, 2007 @03:15PM (#20240169)
    Sorry, I was trying to make a more generic argument, and clearly flubbed it. My original point is that we will likely do more long-term damage if all we do is bully companies. Believe it or not, there is more going on than just folks sitting around waiting to fix bug reports that come in from some random guy. And smaller companies don't have a team standing by to attack reported vulnerabilities.

    I didn't see the original email he sent to the company. Nor did I see mention of follow-ups to try and push them. That makes a difference as well, because I've seen plenty of "your stuff doesn't work" bug reports from folks.

    What fully disclosing probably did was put the company into firefighting mode. They had to stop everything else to attend to this. That can really hurt smaller companies in the long term. Most can't afford teams that sit around waiting to attack these flaws.

    I do think full disclosure can be an important tool when you've tried again and again to get an important security issue addressed. But it should never come as a surprise to the company. There should be communication with the company throughout the process, from the first report to the alert that you will be making the issue public in a month's time.

    I think it does more harm than good to play "gotcha" with companies.

    Now mind you, I'm not sure their one-time-fee model will last all that long, but that's a separate issue.
