
NSA Allegedly Exploited Heartbleed

A user writes: "One question arose almost immediately upon the exposure of Heartbleed, the now-infamous OpenSSL vulnerability that can leak confidential information and even private keys to the Internet: Did the NSA know about it, and if so, did they exploit it? The answer, according to Bloomberg, is 'Yes.' 'The agency found the Heartbeat glitch shortly after its introduction, according to one of the people familiar with the matter, and it became a basic part of the agency's toolkit for stealing account passwords and other common tasks.'" The NSA has denied this report. Nobody will believe them, but it's still a good idea to take it with a grain of salt until actual evidence is provided. CloudFlare did some testing and found it extremely difficult to extract private SSL keys; in fact, they weren't able to do it at all, though they stop short of claiming it's impossible. Dan Kaminsky has a post explaining the circumstances that led to Heartbleed, and today's xkcd has the "for dummies" depiction of how it works. Reader Goonie argues that the whole situation was a failure of risk analysis by the OpenSSL developers.
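For the curious, the bug class is simple enough to sketch in a few lines of C. The following is a hypothetical illustration (not OpenSSL's actual code) of the missing bounds check the xkcd strip depicts: the attacker claims a payload length larger than what was actually sent, and the buggy responder echoes back whatever sits in adjacent heap memory, up to 64KB per request.

    #include <string.h>

    /* Hypothetical heartbeat record: a claimed payload length plus the
     * bytes the peer actually sent (names are illustrative). */
    struct heartbeat {
        unsigned short claimed_len;    /* attacker-controlled, up to 65535 */
        unsigned char payload[16];     /* what actually arrived */
    };

    /* Buggy responder: trusts claimed_len rather than the real size, so
     * memcpy reads past the payload and leaks adjacent process memory. */
    void respond_buggy(const struct heartbeat *hb, unsigned char *reply)
    {
        memcpy(reply, hb->payload, hb->claimed_len);
    }

    /* Fixed responder: drop any heartbeat whose claimed length exceeds
     * the bytes actually received, as the real fix does. */
    int respond_fixed(const struct heartbeat *hb, size_t received,
                      unsigned char *reply)
    {
        if (hb->claimed_len > received)
            return -1;                 /* silently discard the malformed record */
        memcpy(reply, hb->payload, hb->claimed_len);
        return 0;
    }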

Comments Filter:
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Friday April 11, 2014 @05:22PM (#46729487)
    Comment removed based on user account deletion
  • by capedgirardeau ( 531367 ) on Friday April 11, 2014 @05:26PM (#46729531)

    I can understand this happening. It would make sense for the NSA to have one or more people review every patch and check-in for a package as important as OpenSSL, just looking for exploitable mistakes.

    I would not be surprised if they review a great deal of FOSS software they deem important to national security.

  • This sounds likely (Score:5, Insightful)

    by gurps_npc ( 621217 ) on Friday April 11, 2014 @05:29PM (#46729549) Homepage
    The basic fact is, if they did not exploit it, then someone working for them is thinking, "DAMN, I wish I'd thought of using that!"
  • Re:It's not a bug (Score:5, Insightful)

    by NoKaOi ( 1415755 ) on Friday April 11, 2014 @05:43PM (#46729637)

    it's a (NSA) feature...

    Even if it's not an NSA feature... of course they knew about it! They would have to be even more incompetent than we think not to. They are HUGE, with something like 40,000 employees. At least a few of those employees must be dedicated to code review of OSS looking for vulnerabilities, and more generally to looking for vulnerabilities in any widely used software. And if that's the case, then you'd think OpenSSL would be one of the first things they'd look at. The fact that they didn't tell anyone, though, shows that the S in NSA is bullshit. They cared more about being able to exploit the vulnerability themselves than about making their country's computers more secure. If they gave a shit about their country's security, they'd have big teams dedicated to finding software vulnerabilities and working with vendors to fix them.

  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Friday April 11, 2014 @05:46PM (#46729661) Homepage Journal

    OK guys. We've promoted Open Source for decades. We have to own up to our own problems.

    This was a failure in the Open Source process. It is just as likely to happen to closed source software, and more likely to go unrevealed if it does, which is why we aren't already having our heads handed to us.

    But we need to look at whether Open Source projects should be providing the world's security without any significant funding to do so.

  • by Anonymous Coward on Friday April 11, 2014 @05:54PM (#46729723)

    The problem with open source when it comes to things like this is that there are so few people who are even qualified to implement protocols like this, and even fewer of them who are willing to work for nothing. The community needs to pony up some cash to have important projects audited like what they are trying to do with TrueCrypt right now.

  • by Anonymous Coward on Friday April 11, 2014 @06:08PM (#46729823)

    This is the dark side of the "with enough eyeballs, all bugs are shallow" theory. The eyeballs don't have to tell anyone else.

    The full source code is conveniently available to the NSA without their needing to bully any company. Then it is analyzed by genius hackers who are paid top dollar for the job. They probably already have a good stock of other OSS exploits too, unknown to the rest of the world.

  • According to who? (Score:4, Insightful)

    by radarskiy ( 2874255 ) on Friday April 11, 2014 @06:15PM (#46729851)

    Bloomberg is the reporting organization, so they can't be the source. They name no sources, just "two people familiar with the matter," which could mean they asked me twice.

  • by bill_mcgonigle ( 4333 ) * on Friday April 11, 2014 @07:25PM (#46730301) Homepage Journal

    This was a failure in the Open Source process.

    Indeed. People have been saying for years that the OpenSSL code leaves much to be desired but nobody dares fix it because it might break something (needed: comprehensive unit tests).

    There's been a bug filed for years saying that the code won't build with the system malloc, which in turn prevents code analysis tools from finding use-after-free conditions. Here it's less clear that funding was the sole gating factor: the project's leadership never made this a priority, and commit by commit the code stopped working with the system malloc and nobody knew or cared. (A toy illustration of why this matters appears after the comments below.)

    Sure, a pile of money would help pick up the pieces, but testing, continuous integration, a no-blame culture, and the like might well have prevented it in the first place.

    We still have sites like SourceForge solving 1997 problems, like offering download space and mailing lists, when what we need today is continuous integration systems and the ability to deploy a VM with a complex project already configured and running for somebody to hack on.

  • by l0n3s0m3phr34k ( 2613107 ) on Friday April 11, 2014 @10:01PM (#46731205)
    Exactly! Everyone can get to the source; the whole point of OSS is that the companies themselves can (and, from a risk-analysis standpoint, should) be reviewing all the code before implementation. It's along the lines of "you get what you pay for," yet at least here everyone is given the chance to see exactly what's being run (as opposed to pre-compiled apps). IMHO, this isn't really an OpenSSL issue so much as a failure of due diligence by all the companies using it. The admin's excuse of "well, we don't actually know what the code says" fails here; anyone over the past two years could have reviewed it themselves and fixed this. Maybe this will spur corps to make code review of critical infrastructure part of corporate policy whenever the source is available. Perhaps the insurance companies who write "Errors and Omissions" policies will start forcing corps to do that; I'm kind of surprised it isn't already standard, since code review of OSS is one of its main strengths, and a company that doesn't do it is missing out on one of the biggest assets of using OSS.
  • by mpe ( 36238 ) on Saturday April 12, 2014 @03:09AM (#46732181)
    SSL is a much worse problem in itself. Relying on some "trustworthy" certificate authority sounds like a good idea, huh?

    This might be more an issue of how it is being used. Not everything using SSL also uses certificate authorities. There's also no reason why software which does can't give a warning if the CA were to unexpectedly change. (A rough sketch of such a check appears after the comments below.)

    It's a completely broken idea, especially in this age when the worst enemy is one's own government.

    Has there been a time, at least within modern history, when this has not really been the case?
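As promised above, here is a toy illustration (hypothetical code, not OpenSSL's) of bill_mcgonigle's point about the system malloc: when a library recycles buffers through a private freelist, freed memory never reaches free(), so tools like valgrind or AddressSanitizer never learn the buffer died and cannot flag a dangling read. Built against the system allocator instead, the same read would be reported immediately.

    #include <stdio.h>
    #include <stdlib.h>

    static void *freelist = NULL;      /* one-slot cache, for illustration */

    static void *buf_alloc(size_t n)
    {
        if (freelist) {                /* recycle without telling malloc */
            void *p = freelist;
            freelist = NULL;
            return p;
        }
        return malloc(n);
    }

    static void buf_free(void *p)
    {
        freelist = p;                  /* never calls free(), so analysis tools see nothing */
    }

    int main(void)
    {
        char *a = buf_alloc(32);
        snprintf(a, 32, "private key material");
        buf_free(a);                   /* logically dead from here on */
        printf("%s\n", a);             /* a use-after-free, but invisible: the memory
                                          was never returned to the system allocator */
        free(freelist);
        return 0;
    }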
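Likewise, a rough sketch of the warn-on-change idea from mpe's comment: trust-on-first-use pinning, where the client remembers the server certificate's fingerprint and warns if a later connection presents a different one. This assumes OpenSSL's libssl/libcrypto API; the function name and the caller-supplied pinned fingerprint are illustrative.

    #include <stdio.h>
    #include <string.h>
    #include <openssl/ssl.h>
    #include <openssl/x509.h>
    #include <openssl/evp.h>

    /* Compare the peer's certificate against a SHA-256 fingerprint pinned
     * on an earlier visit. Returns 0 on match, 1 if the certificate has
     * changed (warn the user), -1 on error. */
    int check_pin(SSL *ssl, const unsigned char *pinned, unsigned int pinned_len)
    {
        X509 *cert = SSL_get_peer_certificate(ssl);
        if (cert == NULL)
            return -1;                           /* no certificate presented */

        unsigned char seen[EVP_MAX_MD_SIZE];
        unsigned int seen_len = 0;
        int ok = X509_digest(cert, EVP_sha256(), seen, &seen_len);
        X509_free(cert);
        if (!ok)
            return -1;

        if (seen_len != pinned_len || memcmp(seen, pinned, seen_len) != 0) {
            fprintf(stderr, "WARNING: server certificate changed since last visit\n");
            return 1;                            /* key or CA changed unexpectedly */
        }
        return 0;
    }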
