NSA Allegedly Exploited Heartbleed
A user writes: "One question arose almost immediately upon the exposure of Heartbleed, the now-infamous OpenSSL exploit that can leak confidential information and even private keys to the Internet: Did the NSA know about it, and if so, did they exploit it? The answer, according to Bloomberg, is 'Yes.' 'The agency found the Heartbeat glitch shortly after its introduction, according to one of the people familiar with the matter, and it became a basic part of the agency's toolkit for stealing account passwords and other common tasks.'"
The NSA has denied this report. Nobody will believe them, but it's still a good idea to take it with a grain of salt until actual evidence is provided. CloudFlare did some testing and found it extremely difficult to extract private SSL keys. In fact, they weren't able to do it, though they stop short of claiming it's impossible. Dan Kaminsky has a post explaining the circumstances that led to Heartbleed, and today's xkcd has the "for dummies" depiction of how it works. Reader Goonie argues that the whole situation was a failure of risk analysis by the OpenSSL developers.
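For readers who want more detail than the comic-strip version, the core of the bug can be illustrated with a toy model (a hypothetical simulation, not actual OpenSSL code): the TLS heartbeat handler trusted the length field supplied by the peer, so a request that claimed a longer payload than it actually sent got back adjacent process memory in the echo.

```python
# Hypothetical simulation of the Heartbleed over-read. In the real bug the
# C code reads past the end of a heap buffer; here a single bytes object
# stands in for "the payload plus whatever happened to sit next to it".

def heartbeat_response(memory: bytes, payload_offset: int, claimed_len: int) -> bytes:
    # Buggy behavior: echo back `claimed_len` bytes, trusting the
    # attacker-supplied length instead of the actual payload size.
    return memory[payload_offset:payload_offset + claimed_len]

# Process memory: a 4-byte heartbeat payload "bird" followed by
# unrelated secrets that the requester was never supposed to see.
memory = b"bird" + b"SECRET_KEY=hunter2"

print(heartbeat_response(memory, 0, 4))   # honest request: b'bird'
print(heartbeat_response(memory, 0, 22))  # lying request: payload plus the secret
```

The fix in OpenSSL was essentially a bounds check: discard heartbeat requests whose claimed length exceeds the payload actually received.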
This seems plausible (Score:4, Insightful)
I can understand this happening. It would make sense for the NSA to have one or more people review every patch and check-in for a package as important as OpenSSL, just looking for exploitable mistakes.
I would not be surprised if they review a great deal of FOSS software they deem important to national security.
Re:It's not a bug (Score:5, Insightful)
it's a (NSA) feature...
Even if it's not an NSA feature...of course they knew about it! They would have to be even more incompetent than we think not to. They are HUGE, with something like 40,000 employees. At least a few of those employees must be dedicated to code review of OSS looking for vulnerabilities, and more generally to looking for vulnerabilities in any widely used software. And if that's the case, then you'd think OpenSSL would be one of the first things they'd look at. The fact that they didn't tell anyone, though, shows that the S in NSA is bullshit. They cared more about being able to exploit the vulnerability themselves than about making their country's computers more secure. If they cared one shit about their country's security, they'd have big teams dedicated to finding software vulnerabilities and working with vendors to fix them.
It's time we own up to this one (Score:5, Insightful)
OK guys. We've promoted Open Source for decades. We have to own up to our own problems.
This was a failure in the Open Source process. It is just as likely to happen to closed-source software, and more likely to go unrevealed there if it does, which is why we aren't already having our heads handed to us.
But we need to look at whether Open Source projects should be providing the world's security without any significant funding to do so.
Re:It's time we own up to this one (Score:5, Insightful)
The problem with open source when it comes to things like this is that so few people are even qualified to implement protocols like this, and even fewer of them are willing to work for nothing. The community needs to pony up some cash to have important projects audited, as is currently being attempted with TrueCrypt.
Re:This seems plausible (Score:1, Insightful)
This is the dark side of the "with enough eyeballs, all bugs are shallow" theory. The eyeballs don't have to tell anyone else.
The full source code is conveniently delivered to the NSA without them needing to bully any company. Then it is analyzed by genius hackers who are paid top dollar for the job. They probably already have a good stock of other OSS exploits, too, which are unknown to the rest of the world.
According to who? (Score:4, Insightful)
Bloomberg is the reporting organization, so they can't be the source. They name no sources, just "two people familiar with the matter", which could mean they asked me twice.
Re:It's time we own up to this one (Score:5, Insightful)
This was a failure in the Open Source process.
Indeed. People have been saying for years that the OpenSSL code leaves much to be desired but nobody dares fix it because it might break something (needed: comprehensive unit tests).
There's been a bug filed for years saying that the code won't build with the system malloc, which in turn prevents code analysis tools from finding use-after-free conditions. The gating factor here is less clear-cut: the project's leadership simply has not made such a thing a priority. It's not obvious that funding was the sole obstacle; commit by commit, the code stopped working with the system malloc, and nobody knew or cared.
Sure, a pile of money would help pick up the pieces, but lack of testing, continuous integration, blame culture, etc. might well have prevented it in the first place.
We still have sites like SourceForge that are solving 1997 problems, like offering download space and mailing lists, when what we need today is continuous integration systems, the ability to deploy a VM with a complex project already configured and running for somebody to hack on, and so on.
Re:It's time we own up to this one (Score:4, Insightful)
This might be more an issue of how it is being used. Not everything using SSL also uses certificate authorities. There's also no reason why software which does use them can't give a warning if the certificate were to unexpectedly change.
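The warn-on-change idea is essentially certificate pinning, or trust-on-first-use. A minimal sketch of the comparison step, assuming the application stores a SHA-256 fingerprint of the server's DER-encoded certificate on first contact (the certificate bytes below are placeholders):

```python
# Minimal trust-on-first-use pinning sketch: remember a hash of the
# server certificate and warn if a later connection presents a
# different one, regardless of what any CA says about it.
import hashlib

def check_pin(der_cert: bytes, pinned_fingerprint: str) -> bool:
    """Return True if the presented certificate matches the stored pin."""
    return hashlib.sha256(der_cert).hexdigest() == pinned_fingerprint

# First connection: record the fingerprint (placeholder cert bytes).
original_cert = b"...DER bytes of the server certificate..."
pin = hashlib.sha256(original_cert).hexdigest()

assert check_pin(original_cert, pin)          # same cert: proceed quietly
assert not check_pin(b"different cert", pin)  # unexpected change: warn the user
```

In a real client the DER bytes would come from something like `ssl.SSLSocket.getpeercert(binary_form=True)`, and a pin mismatch would prompt the user rather than silently failing.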
It's a completely broken idea, especially in this age when the worst enemy is one's own government.
Has there been a time, at least within modern history, where this has not really been the case?