Neglecting the Lessons of Cypherpunk History

Nicola Hahn writes: Over the course of the Snowden revelations there have been a number of high-profile figures who've praised the merits of encryption as a remedy to the quandary of mass interception. Companies like Google and Apple have been quick to publicize their adoption of cryptographic countermeasures in an effort to maintain quarterly earnings. This marketing campaign has even convinced less credulous onlookers like Glenn Greenwald. For example, in a recent Intercept piece, Greenwald claimed:

"It is well-established that, prior to the Snowden reporting, Silicon Valley companies were secret, eager and vital participants in the growing Surveillance State. Once their role was revealed, and they perceived those disclosures threatening to their future profit-making, they instantly adopted a PR tactic of presenting themselves as Guardians of Privacy. Much of that is simply self-serving re-branding, but some of it, as I described last week, are genuine improvements in the technological means of protecting user privacy, such as the encryption products now being offered by Apple and Google, motivated by the belief that, post-Snowden, parading around as privacy protectors is necessary to stay competitive."

So, while he concedes the role of public relations in the ongoing cyber security push, Greenwald concurrently believes encryption is a "genuine" countermeasure. In other words, what we're seeing is mostly marketing hype... except for the part about strong encryption.

With regard to the promise of encryption as a privacy cure-all, history tells a markedly different story. Guarantees of security through encryption have often proven illusory, a magic act. Seeking refuge in a technical quick fix can be hazardous for a number of reasons.
  • Re:Yep (Score:4, Interesting)

    by Travis Mansbridge ( 830557 ) on Sunday December 07, 2014 @06:51AM (#48541731)
    Even ROT13 takes some effort to solve; it's better than laying everything out over the wire in plaintext. Will it stop the NSA from reading it if they really want to? No, but it prevents it from being searchable at the press of a button from a massive index of communications.
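    For readers unfamiliar with it, ROT13 is a trivial substitution cipher: each letter is shifted 13 places, so applying it twice restores the original. A minimal sketch (the function name is ours, for illustration only):

    ```python
    def rot13(text: str) -> str:
        # Shift each letter 13 places within its case; applying the
        # function twice yields the original text. Non-letters pass through.
        result = []
        for ch in text:
            if 'a' <= ch <= 'z':
                result.append(chr((ord(ch) - ord('a') + 13) % 26 + ord('a')))
            elif 'A' <= ch <= 'Z':
                result.append(chr((ord(ch) - ord('A') + 13) % 26 + ord('A')))
            else:
                result.append(ch)
        return ''.join(result)

    print(rot13("Attack at dawn"))  # -> Nggnpx ng qnja
    ```

    Python's standard library also ships this as `codecs.encode(s, "rot13")`. Either way, the point stands: this is obfuscation, not security, but even obfuscation defeats naive full-text indexing.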
  • by Anonymous Coward on Sunday December 07, 2014 @07:26AM (#48541771)

    Whilst the changes implemented by Apple, Google and others are a matter of record, the sad truth is that none of that matters.

    There is simply no amount of encryption that a US company can deploy which trumps an NSL - a "National Security Letter". The fact is, if a company receives an NSL from the US Government, it has *no choice* but to comply, and to do so without alerting the potential subject[s] to the fact that it has been subverted. So far I am aware of only one party - Lavabit - who stood up to demands for keying materials.

    So Glenn is misguided at best, outright wrong at worst. At the moment, there are very few countries in the world where it is possible for a private citizen [or company] to set up a cryptoscheme to which the government does not have the right to demand access. In most cases, withholding keys or pass phrases can result in instant censure, typically including jail time.

    I think the lesson is: you can't trust your PC, or your government...

  • Re:Yep (Score:5, Interesting)

    by jones_supa ( 887896 ) on Sunday December 07, 2014 @07:49AM (#48541797)

    A security solution does not have to be 100% perfect to still provide value.

    Let's take another example. A workstation requires pressing Ctrl-Alt-Del and typing a password to unlock the computer. You might say that it is useless protection because an attacker can just walk away with the hard drive of the computer.

    So why is the password still useful? Well, without a password, an attacker might just start locally using the computer and quickly take a look at various secret documents. If he were to grab the hard drive, it would take significantly more time, which would increase the chances of being captured by the security team.

    To get back to the topic, by using encryption you are not the lowest hanging fruit out there.

  • by Opportunist ( 166417 ) on Sunday December 07, 2014 @08:07AM (#48541815)

    That Damocletian sword of an NSL is the biggest threat to the competitiveness of US data storage providers. On the internet, it does not matter where you store your data. Never before has it mattered so little whether your datacenter is next door or in Abu Dhabi.

    As a company, would I want to have my data in the hands of a data center where I KNOW it could instantly be forced to hand it over to a government that has a record that borders on that of China when it comes to industrial espionage?

    I've been consulting with companies that wanted to outsource their data storage. They all had a list of countries where you may NOT store their data if you want to enter the bidding war, and without fail the US was on that list, along with other pinnacles of freedom like China. Iran was oddly absent from most lists, even.

  • by PvtVoid ( 1252388 ) on Sunday December 07, 2014 @09:05AM (#48541923)

    TFA is correct that it is a mistake to think that, because a zillion-bit crypto algorithm is thrown into the communication stream, everything is good and security is guaranteed. There are many, many attack channels that do not involve brute-forcing the crypto. Keyloggers, for example.

    But this is silly:

    Back in the 1980s and 1990s, a group of encryption mavens known as cypherpunks sought to protect individual privacy by making "strong" encryption available to everyone. To this end they successfully spread their tools far and wide such that there were those in the cypherpunk crowd who declared victory. Thanks to Edward Snowden, we know how this story actually turned out. The NSA embarked on a clandestine, industry-spanning, program of mass subversion that weakened protocols and inserted covert backdoors into a myriad of products.

    In actuality, the crypto implementations promoted by cypherpunks were exactly those that made it difficult or impossible for such a program of mass subversion to take place. Remember that the height of the cypherpunk movement was when the Clinton administration was pushing hard, really hard, for the NSA-sponsored Clipper Chip, which was, in a nutshell, crypto subverted by design and mandated by law. We now know that when the spooks found that was politically impossible, they went ahead and did it anyway, in secret. But the cypherpunk tools held up, most notably PGP (and later GPG, when PGP sold out and went corporate). Hell, even look at /dev/random: when it was revealed that the NSA had actually, and pretty amazingly, undermined hardware random number generators on widely available chips, /dev/random was still just fine, because it treats all sources of entropy as potentially untrustworthy, including the chip.

    The first lesson we should learn from the history of the cypherpunks is that trusting your crypto to a closed product is always, always a bad idea. That was the lesson then, and it is still the lesson now.

    The second lesson is that crypto, like any security, is all about the threat model. In that light, should we reject the widespread adoption of end-to-end crypto in commercial products? Of course not. If Apple and Google implement crypto by default, it will make efforts to dragnet information exponentially harder, even if the crypto is imperfect. This is why the spooks are beating the drum against it: it closes off that one particular threat model, which they have come to rely on. It doesn't close off other kinds of attack, but so what?

    The third lesson is that crypto, by itself, is not a panacea. Nobody ever said it was. The cypherpunk message was not that we can write PGP, declare victory, and walk away. The message was that privacy changes the relationship between the citizen and the state in beneficial ways, and that, in a technological society, we need to embrace technological means of increasing our privacy, in ways that cannot be controlled by the state.
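    The /dev/random design principle mentioned above - treating every entropy source as potentially untrustworthy and mixing them all - can be sketched in a few lines. This is a simplified illustration, not the kernel's actual pool construction; the function name and sources are ours. The key property: as long as at least one input contributes real entropy, a biased or even attacker-controlled input cannot make the output predictable.

    ```python
    import hashlib
    import os
    import time

    def mix_entropy(sources: list[bytes]) -> bytes:
        # Fold every source into a single SHA-256 state. Length-prefixing
        # each source prevents ambiguity (so [b"a", b"b"] != [b"ab"]).
        # A compromised source cannot cancel out the entropy of the others.
        h = hashlib.sha256()
        for s in sources:
            h.update(len(s).to_bytes(8, "big"))
            h.update(s)
        return h.digest()

    seed = mix_entropy([
        os.urandom(32),                     # OS entropy pool
        time.time_ns().to_bytes(8, "big"),  # timing jitter (weak alone)
        b"hwrng output would go here",      # a possibly backdoored HWRNG
    ])
    ```

    Even if the hardware RNG in the third slot were fully backdoored, an attacker who cannot predict the other inputs cannot predict `seed` - which is exactly why the subverted chips did not compromise /dev/random.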

  • by Anonymous Coward on Sunday December 07, 2014 @09:13AM (#48541939)

    Yep. Cyclists certainly have learned the lesson that there is no such thing as absolute security, only relative security. The best you can do is make it so the thief decides to go after someone else's bike instead. If your bike is an especially attractive target, the pros who know what they are looking for might still get it.

    And in this case, not only are the cops not doing much to stop the thieves, they're working on the same team. So really, we can't give up efforts to limit what the thieves can do.

  • by PvtVoid ( 1252388 ) on Sunday December 07, 2014 @10:08AM (#48542075)

    It's not that cryptography has failed to bring us security, it's that the people have failed to make use of the available cryptography in the first place.

    It's worse than that. As an artist friend of mine told me recently: "Ten years ago I used to wonder how people would respond to the massive loss of privacy represented by social media. Now we know: the only thing people actually worry about is that nobody is watching."
