Heartbleed Sparks 'Responsible' Disclosure Debate
bennyboy64 writes: "IT security industry experts are beginning to turn on Google and OpenSSL, questioning whether the Heartbleed bug was disclosed 'responsibly.' A number of selective leaks to Facebook, Akamai, and CloudFlare occurred prior to disclosure on April 7. A separate, informal pre-notification program run by Red Hat on behalf of OpenSSL for Linux and Unix operating system distributions also occurred. But router manufacturers and VPN appliance makers Cisco and Juniper had no heads up. Nor did large web entities such as Amazon Web Services, Twitter, Yahoo, Tumblr and GoDaddy, just to name a few. The Sydney Morning Herald has spoken to many people who think Google should've told OpenSSL as soon as it uncovered the critical bug in March, rather than waiting until April 1. The National Cyber Security Centre Finland (NCSC-FI), whose April 7 report of the bug to OpenSSL (passed on from security testing firm Codenomicon) spurred the rushed public disclosure, also thinks it was handled incorrectly. Jussi Eronen of NCSC-FI said Heartbleed should have remained a secret, shared only in security circles, even after OpenSSL received that second bug report. 'This would have minimized the exposure to the vulnerability for end users,' Mr. Eronen said, adding that 'many websites would already have patched' by the time it was made public if this procedure had been followed."
Re:No Good Solution. (Score:4, Interesting)
Standards mean jack. As long as there is no good reason to disclose (like, say, avoiding a back-breaking fine or jail time), bugs like that are not being told, they're being sold.
wtf ? (Score:4, Interesting)
IT security industry experts are beginning to turn on Google and OpenSSL, questioning whether the Heartbleed bug was disclosed 'responsibly.'
Are you fucking kidding me? What kind of so-called "experts" are these morons?
Newsflash: The vast majority of 0-days are known in the underground long before they are disclosed publicly. In fact, quite a few exploits are found because - drumroll - they are actively being exploited in the wild and someone's honeypot is hit or a forensic analysis turns them up.
Unless you have really, really good reasons to assume that this bug is unknown even to people whose day-to-day business is to find these kinds of bugs, there is nothing "responsible" in delaying disclosure. So what if a few script-kiddies can now rush a script and do some shit? Every day you wait is one day less for the script kiddies, but one day more for the real criminals.
Stop living in la-la-land or in 1985. The evil people on the Internet aren't curious teenagers anymore, but large-scale organized crime. If you think they need to read advisories to find exploits, you're living under a rock.
Re:WTF? (Score:5, Interesting)
The only possible way is to disclose to the responsible manufacturer (OpenSSL) and nobody else first, then, after a delay given to the manufacturer to fix the issue, disclose to everybody. Nothing else works. All disclosures to others have a high risk of leaking. (The one to the manufacturer also has a risk of leaking, but that cannot be avoided.)
It's not about leaking. The reason I'm not alone in the security community in raging against this "responsible disclosure" bullshit is not that we fear leaks, but that we know most exploits are already in the wild by the time someone on the whitehat side discovers them.
Every day you delay the public announcements is another day that servers are being broken into.
Re:No Good Solution. (Score:3, Interesting)
There is no right answer; things have already gone bad, so you've just got a lot of wrongs to choose from. So my opinions on disclosure are informed by risk minimization. Or to borrow a term, "harm reduction."
The order in which people were informed about Heartbleed smells more like a matter of "it's about who you know" than getting the problem fixed. If OpenSSL isn't at, or real close to, the top of the list of people you contact the first day, you're either actively working against an orderly fix, or you don't trust the OpenSSL folks with the knowledge to fix their own software and are beyond a healthy level of paranoia.
Re:Not that good (Score:4, Interesting)
Open source software is often made freely available at no cost to downloaders and embedders. There is little incentive for these users to pay anything for it, including for support, since the main reason to adopt this software is to not pay at all.
Well, one could hope that issues like this will prompt those selfish companies either to begin developing their own software and quit relying on the freely given work of others, or to gain an incentive to support those who are building the critical software components. My personal opinion is that if a company is going to utilize a FOSS project and self-support, it should provide some sort of resource back to the project.
Further aggravating the issue is the claim by activists that the software's code is reviewed by millions of people because it is freely available to anyone. The fallacy of this claim is that hardly anyone is actually interested in doing so. Indeed, who would review other people's code for free or for fun?
I happen to know several people who like reviewing and examining other people's code, especially complex code like what one would find in OpenSSL. These are the same type of people who just so happen to be the ones fixing a lot of the bugs you run into in OSS projects. It is people like that who make OSS projects succeed. I mean, Linus Torvalds wrote Linux as a hobby project, and continued to review people's additions as a part of that hobby (now he gets paid to do what he was doing for fun). I personally don't do it because my free-time interests lie elsewhere, but I enjoy software development enough that I would without those other distractions. So I'd say your argument is invalid.
Re:Not that good (Score:4, Interesting)
Several fundamental mistakes in there.
First, OpenSSL is not typical of Free Software. Cryptography is always hard, and unlike, say, an office suite, it will often break spectacularly if a small part is wrong. While the bug is serious and all, it's not typical. The vast majority of bugs in Free Software are orders of magnitude less serious.
Second, yes, it is true that the fact that anyone can review the source code doesn't mean anyone will actually do it. However, no matter how you look at it, the number of people who actually do will always be equal to or higher than for closed source software.
Third, the major flagships of Free Software are sometimes, but not always, picked for price. When you're a Fortune 500 company, you don't need to choose Apache to save some bucks. A site license for almost any software will be a negligible part of your operating budget.
And, 3b or so, contrary to what you claim, quite a few companies contribute considerable amounts of money to Free Software projects, especially in the form of paid-for support or membership in things like the Apache Foundation. That's because they realize this is much cheaper than having to maintain comparable software on their own.
Full disclosure, nothing else (Score:2, Interesting)
Look, Google knew about it. Google is part of PRISM. And you're still wondering whether the NSA may have used Heartbleed?
Actual Experience Against "Responsible Disclosure" (Score:5, Interesting)
Historically, so-called "responsible disclosure" has resulted in delayed fixes. As long as the flaw is not public and causing a drum-beat of demands for a fix and a possible loss of customers, the developer organization too often treats security vulnerabilities the same as any other bug.
Worse, those who report security vulnerabilities responsibly and later go public because the fixes are excessively delayed often find themselves branded as villains instead of heroes. Consider the case of Michael Lynn and Cisco in 2005. Lynn informed Cisco of a vulnerability in Cisco's routers. When Cisco failed to fully inform its customers of the significance of the security patch, Lynn decided to go public at the 2005 Black Hat conference in Las Vegas. Cisco pressured Lynn's employer to fire him and also filed a lawsuit against Lynn.
Then there was the 2011 case of Patrick Webster, who notified the Pillar Administration (a major administrator of retirement plans in Australia) of a security vulnerability in their server. When the Pillar Administration ignored Webster, he used the vulnerability to extract personal data from about 500 accounts in his own pension plan (a client of the Pillar Administration). Webster made no use of the extracted personal data, did not disseminate it, and did not go public. He merely sent the data to the Pillar Administration to prove the vulnerability existed. As a result, the Pillar Administration notified Webster's own pension plan, which in turn filed a criminal complaint against him. Further, his pension plan then demanded that Webster reimburse them for the cost of fixing the vulnerability and sent letters to other account holders implying that Webster had caused the security vulnerability.
For more details, see my "Shoot the Messenger or Why Internet Security Eludes Us" at http://www.rossde.com/editoria... [rossde.com].