NSA Allegedly Exploited Heartbleed
A user writes: "One question arose almost immediately upon the exposure of Heartbleed, the now-infamous OpenSSL exploit that can leak confidential information and even private keys to the Internet: Did the NSA know about it, and if so, did they exploit it? The answer, according to Bloomberg, is 'Yes.' 'The agency found the Heartbeat glitch shortly after its introduction, according to one of the people familiar with the matter, and it became a basic part of the agency's toolkit for stealing account passwords and other common tasks.'"
The NSA has denied this report. Nobody will believe them, but it's still a good idea to take it with a grain of salt until actual evidence is provided. CloudFlare did some testing and found it extremely difficult to extract private SSL keys. In fact, they weren't able to do it, though they stop short of claiming it's impossible. Dan Kaminsky has a post explaining the circumstances that led to Heartbleed, and today's xkcd has the "for dummies" depiction of how it works. Reader Goonie argues that the whole situation was a failure of risk analysis by the OpenSSL developers.
Comment removed (Score:5, Insightful)
Re: (Score:1)
That was their mandate in the first place. Nobody begged - It was an order.
Re: (Score:3)
Re: (Score:3)
Think back to past presidents' views on parts of the US intelligence community.
JFK had his views on the CIA after the Bay of Pigs.
Rockefeller Commission, Church Committee, Pike Committee, Murphy Commission, the Select Committee on Intelligence and the Directorate of Operations events in 1977. The domestic activities, human experimentation issues and need for a ban on
Re: (Score:2)
Part of defending a network is understanding how it can be attacked, so you can develop countermeasures to mitigate the attacks.
So it helps when defending a network if you are also good at attacking one.
Re: (Score:2)
Why have the same agency that is responsible for foreign electronic intelligence also in charge of "cyberdefence" (how I hate that term...)?
It's a massive conflict of interest. You're virtually begging them to find and then sit on dangerous exploits.
Their "cyberdefence" mission is to defend DoD systems, not the entire world's computers.
If you don't like it, gripe that NIST and DHS aren't doing their jobs (they are the agencies actually over commercial internet security and non-DoD government sites) or transfer/alter their authority. Everybody thinking the NSA is there to protect their banking and email all have the wrong idea of what they do.
Obligatory xk..... (Score:5, Funny)
This seems plausible (Score:4, Insightful)
I can understand this happening. It would make sense that the NSA would have someone or multiple people review every patch and check-in for a package as important as OpenSSL, just looking for exploitable mistakes.
I would not be surprised if they review a great deal of FOSS software they deem important to national security.
Re: (Score:1, Insightful)
This is the dark side of the "with enough eyeballs, all bugs are shallow" theory. The eyeballs don't have to tell anyone else.
The full source code is conveniently carried to NSA without them needing to bully any company. Then it is analyzed by genius hackers who are paid top dollar for the job. They probably already have a good stock of other OSS exploits too, which are unknown to the rest of the world.
Re:This seems plausible (Score:4, Interesting)
Then it is analyzed by genius hackers who are paid top dollar for the job.
"Top dollar"? This is a government agency. They pay based on the GS scale. Even if the NSA's security hackers were classified at GS-15 (the highest grade), that's about $120K a year to start – if they really are "geniuses" then they could do better in Silicon Valley, and probably feel better about their jobs as well.
In general, the GS scale pays somewhat more than typical private-sector rate for low-end jobs, but considerably less for high-end jobs.
Government contractors rake in the dough, but that money goes to politically-connected businessmen, not rank-and-file employees.
Re: (Score:1)
But this presents an interesting challenge, how do you evade someone that has control of the network? Is it possible?
Re:This seems plausible (Score:4, Interesting)
This patch was submitted at 7pm on Dec 31st, 2011, so the only people looking at it were the ones expecting it. I guess they were not disappointed.
http://git.openssl.org/gitweb/... [openssl.org]
Re: (Score:2)
I challenge anybody to review it and find (or notice) the bug.
My point, once again, is: C should not be used for security-sensitive programs; we should start using managed languages.
I know, it won't happen, because people are lazy and won't learn. Yet again we will think that this fix solves everything, that now OpenSSL is fixed. Which it most likely is not; I would be really surprised if there are no holes KNOWN (to some Russian, Chinese, Israeli, US, ... agency, or mafia).
Re: (Score:3)
I challenge anybody to review it and find (or notice) the bug.
Wasn't this a plain and simple case of using unsanitized data from a packet received from the adversary (for code-review purposes, all network data comes from an adversary)? Anybody doing serious code review should know to check for this and study the code until sure all such values are handled safely, or reject the change if the code is too obfuscated to be sure. Anybody missing this should not be given any code-review responsibility without more experienced supervision.
Obfuscated Variable Names (Score:3, Informative)
I challenge anybody to review it and find (or notice) the bug.
It's actually kind of easy to see. I just use the same trick I use when trying to read almost anyone's code: I assume that some jackass obfuscated all of his variable names and so I rename them as I figure out what they actually represent so that the new names actually describe the variable. Once that's complete, I'm left with "memcpy(pointer_to_the_response_packet_we_are_constructing, pointer_to_some_bytes_in_the_packet_we_received, some_number_we_read_out_of_the_packet_we_received)" and it immediately
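The pattern the parent describes can be sketched in C. This is a hypothetical, simplified reconstruction (the struct and function names are invented for illustration), not the actual OpenSSL code:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical, simplified heartbeat handler illustrating the pattern
 * described above -- NOT the actual OpenSSL code. The request carries
 * a payload plus a length field that the peer fully controls. */
typedef struct {
    uint16_t claimed_payload_len;  /* length field read out of the packet */
    const uint8_t *payload;        /* points into the received packet */
    size_t actual_bytes_received;  /* how much payload really arrived */
} heartbeat_request;

/* Buggy version: trusts the attacker-controlled length field, so it can
 * copy bytes that sit in memory *after* the real payload. */
size_t build_response_buggy(const heartbeat_request *req,
                            uint8_t *response, size_t response_cap) {
    size_t n = req->claimed_payload_len;
    if (n > response_cap)
        return 0;
    memcpy(response, req->payload, n);  /* over-read when n > actual */
    return n;
}

/* Fixed version: discard requests whose claimed length exceeds the
 * bytes actually present in the received record. */
size_t build_response_fixed(const heartbeat_request *req,
                            uint8_t *response, size_t response_cap) {
    size_t n = req->claimed_payload_len;
    if (n > response_cap || n > req->actual_bytes_received)
        return 0;  /* silently discard the malformed request */
    memcpy(response, req->payload, n);
    return n;
}
```

With a 4-byte payload and a claimed length of 16, the buggy version happily copies 12 extra bytes of whatever follows in memory; the fixed version returns nothing.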
Re: (Score:2)
I just use it because I hate everything else even more.
So true
...but more seriously, the code in that check-in is why I hate to let anyone work on any programming projects with me.
You can teach them how to do better.
Re: (Score:2)
Re: (Score:2)
You would think if the government was doing this, they would at least tell their fellow government agencies about the flaws, so that they would not be vulnerable to foreign hackers...
I am totally surprised by this (Score:2)
This sounds likely (Score:5, Insightful)
Re: (Score:2)
They definitely didn't exploit it. How do I know? Because they said so [twitter.com], so it must be true. Right? They wouldn't lie to us, would they?
I don't understand (Score:2)
Why can we not start a class action lawsuit against the Government, NSA and those that allow snooping around in personal data without probable cause?
Re: (Score:2)
One cannot simply sue a branch of the government without asking permission from the government to allow it to be sued - guess how often THAT happens? Plus the NSA has a built-in out: it's in the interests of national security. It's bullshit - we all know it - but it's a legal out; it's the reason they can deny your FOIA request for information about Area 51, the Roswell incident, as well as the intelligence records on Jimmy Hoffa or J. Edgar Hoover.
You don't understand, yep! (Score:5, Informative)
Glad you asked: it happens all the time, ever since the Federal Tort Claims Act of 1946 substantially waived the sovereign immunity doctrine. You can read more about it at Wikipedia [wikipedia.org].
People sue the government all the time. It's literally an everyday occurrence.
Re: (Score:2)
That doesn't provide an open-ended, unlimited right to sue. I very much doubt you'd have an allowable claim for injury on this.
Re:You don't understand, yep! (Score:4)
I'm not weighing in on that one. I'm only correcting the original poster, who said the U.S. rarely waives sovereign immunity. In fact, the opposite is true: it rarely invokes it. Tens of thousands of tort claims against the U.S. government are underway even as we speak, all of them with waived sovereign immunity.
Re: (Score:3)
...and I learn something new every day. Thank you for sharing that without calling me a moron. I knew it had been a few years since my last political science class, now I have something new to read up on.
Re: (Score:2)
You need to have enough evidence of *something* possibly happening to show that you have standing to bring the case.
Re: (Score:2)
That's a good idea, because the least accountable branch of government is surely on your side! /s
The judicial branch and the supreme court serve much the same purpose as the Tsar in old Russia. No matter how bad it gets, it's not the Tsar's fault. It's the noblemen's fault. The Tsar just has bad advisers. If only we could get past them and talk to him and make him understand, it'd all be OK.
Re: (Score:2)
To start with, because of sovereign immunity.
Do it enough times (Score:2)
If you know about it and have access to virtually unlimited resources, you can afford to attack your target as many times as you want in order to get what you want.
And, frankly, I don't believe the guy that claims responsibility for the bug.
As well, if something this simple could cause such an issue then clearly it is an issue for lots of other important security programs.
Re: (Score:3)
As well, if something this simple could cause such an issue then clearly it is an issue for lots of other important security programs.
Yes, it's one of the most common classes of memory-handling bugs, in this case a buffer over-read [wikipedia.org]. Such bugs are generally difficult to exploit, which can be seen in the fact that nobody has actually demonstrated extracting a key using this particular bug, just that it is "possible" to do so. Winning the lottery is also "possible".
There's all sorts of complete bullshit about this bug in the press, to paraphrase what I read today in the WSJ that "It turns out that just 4 European developers and some guy in th
Re: (Score:2)
Private key grabbed. Game over.
One successful attempt took >2.5M requests over a day. Second successful attempt was something like 100k requests.
http://blog.cloudflare.com/the... [cloudflare.com]
It's all in the luck of the draw. When you don't have any logging of this, you've got no idea how long people have been poking at this and literally no idea what anyone has made off with.
Not the last we'll hear about OpenSSL? (Score:2)
NSA put the bug there, of course they exploited it (Score:1)
We need to find out if the author of this bug is or was on the NSA payroll. It would not be surprising to find out he was paid to put it there.
Re:NSA put the bug there, of course they exploited (Score:5, Informative)
The author of this bug and the reviewer of the commit have both been very forthcoming about the mistake. There's little reason to suspect malicious intent in this particular instance.
That doesn't mean the NSA didn't know about it or exploit it, though.
Re: (Score:2)
Heinlein's but don't rule out malice still applies.
Look. I get that the NSA has these incredible resources (thousands of personnel, alone), but they're still all working for the government: the king of big-company bullshit with a side of no incentive to work hard. I'll kiss a pimple on your ass if there aren't many hundreds of others disenfranchised like Snowden who lack either the luxury of being able to leave or the courage to do so.... these folks' commitment is plau
Re: (Score:3)
Re: (Score:1)
The bug seems quite obvious. I would expect they could be a little more clever if they were to write the bug themselves. This is a failure of the OpenSSL project, period.
Re: (Score:2)
We need to find out if the author of this bug is or was on the NSA payroll. It would not be surprising to find out he was paid to put it there.
The author responsible for the bug has already admitted that it was a mistake (and it's not like buffer overflows are unheard of, so it really is plausible). Sure, it's possible that the NSA secretly paid him (or even coerced him by holding some incriminating evidence over his head), but it would likely take someone with the resources of the NSA to uncover such a secret NSA payout. Something of that nature probably wouldn't even be available in Snowden's document archive.
Re: (Score:2)
Re: (Score:2)
The author of the bug probably introduced it accidentally. It's easy to do. The author of the special wrapper code in OpenSSL that purposely prevents newer versions of malloc from doing the memory checking that would have revealed this bug is a little more suspicious.
It's time we own up to this one (Score:5, Insightful)
OK guys. We've promoted Open Source for decades. We have to own up to our own problems.
This was a failure in the Open Source process. It is just as likely to happen to closed source software, and more likely to go unrevealed if it does, which is why we aren't already having our heads handed to us.
But we need to look at whether Open Source projects should be providing the world's security without any significant funding to do so.
Re:It's time we own up to this one (Score:5, Insightful)
The problem with open source when it comes to things like this is that there are so few people who are even qualified to implement protocols like this, and even fewer of them who are willing to work for nothing. The community needs to pony up some cash to have important projects audited like what they are trying to do with TrueCrypt right now.
Re:It's time we own up to this one (Score:4, Informative)
Re:It's time we own up to this one (Score:5, Insightful)
Re:It's time we own up to this one (Score:4, Informative)
Once you start down the math path the classes get smaller and fewer stay for needed years vs lure of private sector telco or unrelated software work.
Most nations really do produce very few with the skills and keep them very happy.
Trips, low level staff to help, good funding, guidance, friendships all just seem to fall into place.
Bringing work home and helping open source could be seen as being an issue later, vs students or team members who did open source games or made apps.
Re: (Score:2)
OK guys. We've promoted Open Source for decades. We have to own up to our own problems.
This was a failure in the Open Source process. It is just as likely to happen to closed source software, and more likely to go unrevealed if it does, which is why we aren't already having our heads handed to us.
But we need to look at whether Open Source projects should be providing the world's security without any significant funding to do so.
If it's just as likely to happen to closed source software, then why is it a failure of the Open Source process? It was discovered and fixed so quickly *because* it's open source - there may be similar holes in closed source software that are being exploited today, yet no white hats have discovered them yet.
Re: (Score:3)
Re: (Score:1)
It was discovered and fixed so quickly *because* it's open source
For crikessakes, the heartbleed vulnerability existed for over 2 years before being discovered and fixed!
Re:It's time we own up to this one (Score:4, Interesting)
It was discovered and fixed so quickly *because* it's open source
For crikessakes, the heartbleed vulnerability existed for over 2 years before being discovered and fixed!
Sorry my bad, that sentence was confusing -- I meant the fix was fast, not finding the bug.
An exact timeline for Heartbleed is hard to find, but it looks like there was some responsible disclosure of the bug to some large parties about a week before public disclosure and release of the fixed SSL library.
In contrast, Apple learned of its SSL vulnerability [nist.gov] over a month [theguardian.com] before they released an iOS patch, and even after public disclosure of the bug, it was about a week before they released the OS X patch. And just like the OpenSSL bug, Apple's vulnerability was believed to have been in the wild for about 2 years before detection. (Of course, since the library code was open-sourced by Apple, several unofficial patches were released before Apple's official patch.)
Re: (Score:2)
Re: (Score:3)
I think we need to take a serious look at the "many eyes" theory because of this. Apparently, there were no eyes on the part of parties that did not wish to exploit the bug for close to two years. And wasn't there just a professional audit by Red Hat that caught another bug, but not this one?
I'm going to say this calls into question the value of professional audits.
My experience is that visual inspection of code does little to remove all the bugs. It's just really hard to muster the concentration needed to verify that the code is good with your eyes.
Re: (Score:2)
Three years is quick?
Re: (Score:3)
This does not have anything to do with open source and everything to do with the software development process (or lack thereof) used here. Something like this could've happened in a closed-source library just as easily; the only difference would be that rather than source analysis you'd have used other tools to find the vulnerability. If a new addition to a protocol comes in and you have bad intentions, of course the first thing you do is to see what happens if you feed it invalid data; if you did that here you'd have f
Re: (Score:2)
And btw, funding is good, but funding does not buy you a good software development process: for that you need to actually focus on finding a good process first, and use the funding to achieve what you are planning. Don't forget that if it's a critical piece of infrastructure nowadays, it will be attacked by adversaries with much larger pockets than yours no matter how large yours are, so the process has to take into account that any development is done in a completely hostile environment, where a-priori
Re: (Score:2)
Re: (Score:1)
Re:It's time we own up to this one (Score:5, Insightful)
This was a failure in the Open Source process.
Indeed. People have been saying for years that the OpenSSL code leaves much to be desired but nobody dares fix it because it might break something (needed: comprehensive unit tests).
There's been a bug filed for years saying that the code won't build with the system malloc, which in turn prevents code analysis tools from finding use-after-free conditions. The need here is less clear - leadership of the project has not made such a thing a priority. It's not clear that funding was the sole gating factor - commit by commit the code stopped working with the system malloc and nobody knew or cared.
Sure, a pile of money would help pick up the pieces, but lack of testing, continuous integration, blame culture, etc. might well have prevented it in the first place.
We still have sites like Sourceforge that are solving 1997 problems, like offering download space and mailing lists when what we need today is to be able to have continuous integration systems, the ability to deploy a vm with a complex project already configured and running for somebody to hack on, etc.
Re: (Score:1)
"less clear"?
Less clear my ass! I'd say there is no leadership in the project, unless "FUD" (fear of it breaking something) is called "leadership". But then as you say, "nobody cares".
If the code is as you describe, the whole shebang should be rewritten from scratch using a higher-level managed language. Any managed language would have prevented the information leak, although probably not the unchecked value.
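For what it's worth, the bounds check a managed language performs on every array access can be emulated in C with a "fat pointer" (pointer plus length). A minimal sketch with hypothetical names, not a proposal for OpenSSL itself:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* A "fat pointer": the data and its length travel together, the way a
 * managed language's array or byte-string does. Hypothetical API. */
typedef struct {
    const uint8_t *data;
    size_t len;
} slice;

/* Checked copy: fails instead of over-reading, which is what a managed
 * runtime would have done with the malicious heartbeat request. */
int slice_copy(uint8_t *dst, size_t dst_cap, slice src, size_t n) {
    if (n > src.len || n > dst_cap)
        return -1;  /* a managed runtime would throw an exception here */
    memcpy(dst, src.data, n);
    return 0;
}
```

Asking for 64 bytes out of a 4-byte slice fails cleanly instead of leaking adjacent memory.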
Re: (Score:2)
In the process of rewriting, it's inevitable that a ton of brand-new bugs will be introduced in the new codebase, and you'll have lost all the time and effort hardening the library and fixing all of the thousands of previously exploitable issues.
I think talk of scrapping or rewriting the library is a bit of an overreaction caused by the scale and scope of the issue, and is certainly not plausible in the short term anyhow. I'd say the proper thing to do is to halt development of new features for a time and
Re: (Score:3)
I hate to disagree with you, but this has nothing to do with Open Source, it has to do with software engineering.
This same bug could have been introduced in closed-source software just as easily. The problem is making sure that software is securely reviewed before it's disseminated, much like the OpenBSD people have been touting all these years, instead of just throwing things together however they work.
The only part F/OSS played in this is that we *found* the bug and can identify exactly when and how it oc
Re: (Score:3)
But we need to look at whether Open Source projects should be providing the world's security without any significant funding to do so.
I'm in favor of more funding for open source, but in this case I would still trust the security of the internet to open source long before I would trust it to closed source. I've seen what too much closed source looks like, and it scares me.
Re:It's time we own up to this one (Score:5, Interesting)
Re:It's time we own up to this one (Score:4, Insightful)
This might be more an issue of how it is being used. Not everything using SSL also uses "certificate authorities". There's also no reason why software which does can't give a warning if the CA were to unexpectedly change.
It's a completely broken idea, especially in this age when the worst enemy is one's own government.
Has there been a time, at least within modern history, where this has not really been the case?
Fork it. (Score:5, Funny)
Theo de Raadt should fork OpenSSL. He could call it OpenOpenSSL.
Re: (Score:2)
What OpenSSL needs is multiple independent line-by-line code audits of the paid variety, by teams of competent people. It may be an open source piece of software, but considering the countless billions of dollars at stake, there shouldn't be any fucking issue finding the money to make this shit happen.
What major corporations use SSL? Cisco? IBM? Anybody else like that? We could probably get them to foot most of the bill.
Does the "fix" include scrubbing? (Score:3)
When this was supposedly "fixed" in OpenSSL, did the fix just fix this one known bug? A real fix includes fixing the storage allocator to overwrite all released blocks, so no other old-data-in-buffer exploit would work.
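The scrubbing being asked about would look something like the sketch below: a free() wrapper that zeroes a block before releasing it, so stale secrets can't be re-exposed by a later over-read of recycled heap memory. These are hypothetical helpers with invented names, and as far as is known the actual OpenSSL fix did not add anything like this:

```c
#include <assert.h>
#include <stdlib.h>

/* Zero a buffer through a volatile pointer so the compiler cannot
 * optimize away what it would otherwise see as a "dead" store to
 * memory that is about to be freed. */
void scrub(void *p, size_t n) {
    volatile unsigned char *vp = p;
    for (size_t i = 0; i < n; i++)
        vp[i] = 0;
}

/* Scrubbing free(): the caller must pass the block size, since free()
 * itself has no portable way to learn it. */
void scrub_free(void *p, size_t n) {
    if (p == NULL)
        return;
    scrub(p, n);
    free(p);
}
```

An allocator built this way pays a write per freed byte, which is the performance trade-off argued about elsewhere in this thread.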
We fixed the glitch. (Score:2)
Bob Porter: We always like to avoid confrontation, whenever possible. Problem is solved from your end.
Re: (Score:3)
A real fix includes not rolling their own malloc, then fixing the bugs that were hidden by their badly written freelist which prevented people from reverting to a normal malloc.
Re: (Score:2)
Ignoring the performance hit from this (which many applications won't accept)
Clearing recently used memory is cheap, because it's in the cache. Clearing memory in general is cheap on modern CPUs, because the superscalar features do it really well. MOV is 35% of instructions, so CPUs are designed to do it efficiently.
It's security code. You have to scrub memory.
According to who? (Score:4, Insightful)
Bloomberg is the reporting organization, so they can't be the source. They name no sources, just "two people familiar with the matter", which could mean they asked me twice.
Highly likely that NSA knew early on (Score:3, Interesting)
...
I have not yet grasped the full scope of the implications of this bug, but if you take the stance that anything that could have been done has been done (imho the only safe assumption), is this a good characterization? Or are there any limiting factors that make this impossible? For example, given the amount of memory that could be leaked while the application is running (as servers aren't restarted often), is certain information that is stored statically in memory potentially not reachable?
During the last two years:
1. Any/all certificates used by servers running OpenSSL 1.0.1 might have been compromised and should be revoked (the big cert-reset of 2014?)
2. Because of 1, any/all data sent over a connection to such servers might now be known by a bad MITM (i.e. for large scale: the various security services/hostile ISPs; local scale/targeted attacks: depends on who else happened to know, and whether this person/organization happened to be your adversary. Looks unlikely, but who knows...)
3. Any/all data stored in SSL-based client applications might have been compromised.
From a user's perspective - change all passwords/keys that have been used on applications based on OpenSSL 1.0.1? How to know which services? To be safe, change them all? Consider private data potentially sent over SSL to be open and readable by the security services?
Thinking about the large-scale:
For how long has the NSA been picking up information leaked by Heartbleed (assuming they have at least since late evening on the 7th or early morning on the 8th seems a given)?
-Not in the Snowden documents that have been revealed so far (absence of proof != proof of absence, but the language might give a hint)
-No report of unusual heartbeat streams being spotted in the wild (was anyone looking?)
Let's assume for the sake of argument the NSA does not have people actually writing the OpenSSL code in the first place.
When did they know about its existence?
time_to_find_bug = (complexity_of_bug * size_of_sourcecode * complexity_of_sourcecode) / (budget * intention_to_find_bugs)
Where
budget = manpower * skillset
and
time_to_find_bug < inf.
when
skillset >= complexity_of_bug
Heartbeat bug:
complexity_of_bug = low
OpenSSL:
size_of_sourcecode = 376409 lines of code (1.0.1 beta1)
complexity_of_sourcecode = high
NSA:
intention_to_find_bugs = 1
budget = $20 * 10^9 ?
=> manpower = 30k ?
skillset = high
Guesstimate: one to a few months -> early 2012, to go through the changes made to 1.0.1 building on earlier work already done on the 0.9.8 branch...
...
Or to say it another way, I think it is safe to assume that, given the simplicity of the bug, the NSA knew about Heartbleed early on. The anonymous comments to Bloomberg give nice confirmation of this.
Re: (Score:1)
Re: (Score:2)
Heartbleed Challenge Over (Score:5, Interesting)
Re: (Score:2)
For some reason I can't get to that page (DDOS'd? Taken offline?)
Here's the results on their blog:
http://blog.cloudflare.com/the... [cloudflare.com]
Why has it not crashed the servers? (Score:2)
Failure of risk analysis by more than OpenSSL devs (Score:5, Informative)
Private key compromise is indeed possible (Score:4, Informative)
Snowden (Score:2)
Have there been any cases where the leaked information has been useful in pointing out flaws which led to patching security holes?
Finally! (Score:2)
"and today's xkcd has the "for dummies" depiction of how it works."
Thank you, thank you, thank you! At last I get it. So simple. So fiendishly simple.
Re: (Score:3)
Re: (Score:1)
Really man! [slashdot.org] And nothing confirms this story like an official denial. (I think that's how it goes)
Re: (Score:2)
I think that NSA has good coverage and analysis tools. They probably knew of Heartbleed, and they probably know of dozens more flaws like this.
I will be glad, when they cease to exist.
Re: (Score:2)
When the NSA ceases to exist, it will be because they are absorbed by something bigger and more powerful.
Look at the KGB. They were absorbed by the Russian Oligarchy.
Re: (Score:3)
Yeah, empires never fade, and always get replaced by bigger ones.
So don't look at dinosaurs, and the tiny mammals that survived them, and surely not at entropy, which is breaking everything down to energy and then smearing that around, slowly, patiently, irreversibly. Do not realize that the universe is a joke at the cost of anyone who likes (to keep) power, that having lots of materials and commanding people around or killing them does not constitute power more than a fart constitutes a solid object, and i
Lots of truth but also some wishful thinking (Score:2)
As I'm too often involved in myself sometimes: :-)
http://www.pdfernhout.net/on-d... [pdfernhout.net]
"This approximately 60 page document is a ramble about ways to ensure the CIA (as well as other big organizations) remains (or becomes) accountable to human needs and the needs of healthy, prosperous, joyful, secure, educated communities. The primarily suggestion is to encourage a paradigm shift away from scarcity thinking & competition thinking towards abundance thinking & cooperation thinking within the CIA and othe
Re: (Score:2)
Since the dawn of man the size of political units has always been increasing.
Family
Band
Tribe
City
State
Nation
World Government(?) (If we don't have nuclear war first)
Re: (Score:1)
"No, we didn't exploit the Heartbleed bug. You're accusing us of stealing a Ford Focus when we've got garages full of Ferraris and F-16s. Puh-lease! Why would we bother exploiting something that gives such rudimentary and fleeting access to data and could just as easily be used by the Chinese or Russians when we have the espionage skills and resources available to us to install our own secure backdoors that only we can use and
Re:It's not a bug (Score:5, Insightful)
it's a (NSA) feature...
Even if it's not an NSA feature... of course they knew about it! They would have to be even more incompetent than we think not to. They are HUGE, with something like 40,000 employees. At least a few of those employees must be dedicated to code review of OSS looking for vulnerabilities, and more in general looking for vulnerabilities in any widely used software. And if that's the case, then you'd think OpenSSL would be one of the first things they'd look at. The fact that they didn't tell anyone, though, shows that the S in NSA is bullshit. They cared more about being able to exploit the vulnerability themselves than making their country's computers more secure. If they cared one shit about their country's security then they'd have big teams dedicated to finding software vulnerabilities and working with vendors to fix them.
Re: (Score:2)
National Spying agency...
They do provide the right companies with expertise on securing important technology, and with certifying compliance to said recommendations for the sale of sensitive products. I don't know how many of the 40,000 employees do that, but that's one "Security" feature that they _do_ offer.
Re: (Score:1)
The fact that they didn't tell anyone though shows that the S in NSA is bullshit.
Wouldn't that be the NBA? Interesting mashup there.
LeBron James: athlete, sycophant, spy.
Re: (Score:2)
The fact that they didn't tell anyone though shows that the S in NSA is bullshit. They cared more about being able to exploit the vulnerability themselves than making their country's computers more secure.
It's a basic conflict of interest with police/defense/intelligence agencies. They gain power from the existence of threats, so it's in their self interest to favor policies that perpetuate them while pretending to do the opposite. The War on Drugs, Cuban Embargo, etc.
Re: (Score:1)
The fact that they didn't tell anyone though shows that the S in NSA is bullshit. They cared more about being able to exploit the vulnerability themselves than making their country's computers more secure. If they cared one shit about their country's security then they'd have big teams dedicated to finding software vulnerabilities and working with vendors to fix them.
You are confused as to what NSA's "defensive" mission is. They aren't there to be the defenders of the internet. They aren't there to be corporate America's QA department. They aren't there to review open source and provide fixes. They aren't there to "make the country's computers more secure".
They are there to protect DoD classified systems. That's the defensive mission, as an agency under the DoD umbrella. Protect DoD classified systems and anything that deals with military activities. All this extraneous
Re: (Score:1)
Re: (Score:3, Interesting)
You could probably t
Re: (Score:1)
Or here: https://f5.com/ [f5.com]
Re: (Score:3)
The same reason NATO and other US allies did not understand the NSA Martin and Mitchell defection http://en.wikipedia.org/wiki/M... [wikipedia.org] in 1960 with the press conference saying:
"As we know from our previous experience working at N.S.A., the United States successfully reads the secure communications of more than forty nations, including its own allies."
Embassies, govs and firms went on using the same junk standard crypto hardware over decades of revisions.
Re: (Score:1)