Full-Disclosure Wins Again
twistedmoney99 writes "The full-disclosure debate is a polarizing one. However, no one can deny that disclosing a vulnerability publicly often results in a patch, and InformIT just proved it again. In March, Seth Fogie found numerous bugs in EZPhotoSales and reported them to the vendor, but nothing was done. In August the problem was posted to Bugtraq, pointing to a descriptive article outlining numerous bugs in the software, and guess what happened? Several days later a patch appeared. Coincidence? Probably not, considering the vendor had stated "..I'm not sure we could fix it all anyway without a rewrite." Looks like they could fix it after all; they just needed a little full-disclosure motivation."
Two basic problems (Score:3, Interesting)
Ordinary users don't read security mailing lists, so disclosure doesn't reach them. They are just left in their ignorance, with the potential for being exploited.
The "I want to do bad things" community has the information and is paying attention. Their community gets advance information before there even is a fix and they get to evaluate if it is worth their efforts to exploit it.
The other group that gets to benefit from full disclosure is the media. Starved for news of any sort, bad news is certainly good news for them.
All in all, full disclosure is simply blackmail. Unfortunately, no matter what the outcome, the user of the affected product gets all of the negative consequences. Their instance of the product isn't fixed, because unless they are paying attention they don't even know about the problem. They lose support if the company decides to pull the product rather than kneel to the blackmail. And if the bug is exploited, the end user gets to suffer the consequences.
You might think this would justify eliminating liability exclusions for software products. There isn't any way this would fly in the US, because while we like to think we're as consumer-friendly as the next country, the truth is it would expose everyone to unlimited liability for user errors. It would certainly mean unlimited litigation, even if the fault was finally shown to be a user error, which is by no means certain. And do not believe for a moment that you could somehow exclude software given away for free from damages. If you had an exclusion for that, you would find all software being free, except that it would be costly to connect to the required server for a subscription or something like that. Excluding free software would be a loophole you could drive a truck through.
why drag lawyers and the government into this? (Score:1, Interesting)
a) change was needed
b) public was unaware
c) individual wanted change
d) individual alerted a portion of the public
e) change was made
No lawyers, no State, no violation of freedoms, no taxes, no fines.
Require login, forbid any subdirectory access. (Score:5, Interesting)
Here's how I've solved this problem:
1) Modify the .htaccess (or, even better, the httpd.conf) file so that ANY access to the subdirectories of the main app is forbidden. The only exceptions are: a) submodule directories whose PHP files do a login check, or b) common images (e.g. logos).
2) The only way to view your files is through the web application's PHP file lister and downloader. This should be child's play for anyone with PHP knowledge: PHP has the fpassthru() function, or if you're memory-savvy, use standard fopen(). Make sure the lister rejects paths outside the directories you want to list, and run requested file names through basename() to strip any directory components.
3) Every file in the PHP application MUST include() your security file, which checks whether the user has logged in and redirects them to the login page otherwise. For publicly available pages, fall back to an anonymous user by default.
4) For login (if not for the whole app), require HTTPS.
4a) If you can't implement HTTPS, use a salted login scheme, with SHA-256 (or at least MD5) for the password hashing.
5) Put the client's IP address in the session variables, so that any access to the session from a different IP gets redirected to the login page (with a fresh session ID, of course).
6) After login, regenerate the session ID.
7) Keep ALL session state in the $_SESSION array; don't use cookies for ANYTHING other than the session ID.
I consider these measures to be the minimum standard for web applications. It shocks me that commonly used apps still fail to implement them properly.
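A minimal sketch of what steps 2, 3, 5, and 6 above might look like in PHP. Everything here is illustrative under assumptions: the file names (security.php, login.php), the PHOTO_ROOT path, and the session key names are hypothetical, not taken from EZPhotoSales or any real product.

```php
<?php
// download.php -- illustrative sketch only. PHOTO_ROOT, security.php, and
// login.php are hypothetical names, not from any real application.

session_start();

// Step 5: bind the session to the client's IP. A mismatch gets a fresh
// session ID and a redirect to the (hypothetical) login page.
if (isset($_SESSION['ip']) && $_SESSION['ip'] !== $_SERVER['REMOTE_ADDR']) {
    session_unset();
    session_regenerate_id(true);   // step 6 also runs right after a successful login
    header('Location: login.php');
    exit;
}

// Step 3: every script includes the security file, which checks the login
// state and redirects to the login page otherwise.
require_once 'security.php';

// Step 2: files are served only through PHP, never by direct URL.
// The storage directory should ideally sit outside the web root entirely.
define('PHOTO_ROOT', '/var/app/photos'); // assumed path, for illustration

$requested = isset($_GET['file']) ? $_GET['file'] : '';
$name = basename($requested);      // strip any directory components
$path = PHOTO_ROOT . '/' . $name;

if ($name === '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
$fp = fopen($path, 'rb');
fpassthru($fp);                    // stream the file without exposing its location
fclose($fp);

// Step 4a (at login time, not shown): store hash('sha256', $salt . $password)
// with a per-user random salt, and compare against that stored value on login.
```

Because basename() reduces "../../etc/passwd" to just "passwd", a traversal attempt can only ever name a file inside PHOTO_ROOT, and fpassthru() lets the app stream it without revealing the real directory layout.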
False assumptions? (Score:3, Interesting)
Ironically, full disclosure probably forced them to put out the fix before it was ready, with the attendant risk of new bugs. IMHO, forcing a company to rush a fix is not the answer. If you work for a real software company, you know that today's commercial software often has thousands of bugs lurking, although many are edge cases and are often more dangerous to fix than to leave alone (especially if there is a workaround).
There should be enough time given to a company to address the issue. One can argue whether or not 5 months is enough time, but that's a different argument. I think forcing companies to constantly drop everything under threat of full disclosure will end up doing more harm than good.
Re:Adopt the cryptographer threat model (Score:3, Interesting)
For example, read up on the ongoing attacks on AACS. The black hats (and yes, they are black hats) working on breaking AACS have exploited all kinds of software and hardware bugs and shortcomings in order to gather more information and cryptographic secrets. They have the upper hand precisely because they are not fully disclosing their work. If they were to fully disclose the bugs in the various tabletop HD-DVD players and software tools they use to garner keys, you can bet the problems would be fixed. As it is, though, they are still ahead of the AACS LA.
True economics (Score:1, Interesting)
The thinking seems to be something like this: when a bug is disclosed, blackhats that were unaware of the bug become informed and have a window of exploitation until the bug is patched.
This seems absurd to me. As soon as the bug is disclosed, users become aware and can immediately block the vulnerability. If there is no other mitigation, they can at least stop using the vulnerable software. So the window of exploitation is the time from the disclosure to widespread awareness and shutdown of the buggy software.
Some would say it is over-simplistic to think you can just shut down vulnerable software; some might claim it just isn't practical. I think what this argument really means is that it could be very costly for some enterprises to shut down vulnerable systems. Their administrators have to weigh the costs of shutting down against the costs of being exploited.
Full disclosure is really just an economic issue. Full disclosure highlights the costs of using buggy software. Distributors of more buggy software may not appreciate the reflection on the total costs of using their software. Some businesses and people may not appreciate the forced realization of the total costs of the software that they use.
Some people may tweak their bathroom scales to make them feel better about the total costs of their dietary habits. But they shouldn't rant about standard, untweaked scales being unethical in their methods of disclosure.
If the truth about the software you develop or use is uncomfortable, don't try to cover it up. Hiding your eating disorder doesn't solve the problem.
You can't make the best economic decisions unless you recognize the true economics of your software choices.
Call me sceptical (Score:3, Interesting)
If they really did fix everything in that window, that is some AMAZING response time. But I would venture a guess that they had already been working on the corrections. The public posting may have made a couple of coders work overtime and cut the testing phase out of the cycle, but for them to do the whole thing in less than 7 days is highly unlikely.
Not only that, but since they would have had to either cut short or cut out entirely the testing phase of the release, it is MORE likely that security issues remain, or that new security issues have been created and not found.
I'm not sure I'd call this one a "win" just yet.
-Rick
Re:False assumptions? (Score:3, Interesting)
I didn't see the original email he sent to the company, nor any mention of follow-ups to push them. That makes a difference as well, because I've seen plenty of vague "your stuff doesn't work" bug reports from folks.
What fully disclosing probably did was put the company in firefighting mode: they had to stop everything else to attend to this. That can really hurt smaller companies long term. Most can't afford teams that sit around waiting to attack these flaws.
I do think full disclosure can be an important tool when you've tried again and again to get an important security issue addressed. But it should never come as a surprise to the company. There should be communication with the company throughout the process, from the first report to the alert that you will be making the issue public in a month's time.
I think it's more harmful than not to play "gotcha" with companies.
Now mind you, I'm not sure their one time fee model will last all that long -- but that's a separate issue.