Encryption Security Communications Open Source

CyanogenMod Integrates Text Message Encryption 118

Posted by Unknown Lamer
from the only-criminals-text-with-aes dept.
sfcrazy writes "People are now more concerned about their privacy after learning of efforts by governments to spy on their communications. The most practical way to keep messages, emails, and calls secure is to use cryptographic encryption. However, setting up encryption is too complex for most users. To solve this, CyanogenMod will come equipped with a built-in encryption system for text messages." Whisper Systems has integrated their TextSecure protocol into the SMS/MMS provider, so even third-party SMS apps benefit. Better yet, it's Free Software, licensed under the GPLv3+. Support will debut in CyanogenMod 11, but you can grab a 10.2 nightly build to try it out now.
  • Key exchange (Score:5, Interesting)

    by Anonymous Coward on Monday December 09, 2013 @08:35PM (#45645843)

    The most important part of any crypto communication system is key exchange. It looks like this protocol uses automated SMS key exchange, and implementations should store keys similarly to SSH. It's trivial to MITM, but it's a high-risk attack for the eavesdropper because people can simply meet in person and compare keys.
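The SSH-style storage described here is trust-on-first-use (TOFU): pin the first key you see for a contact, and flag it loudly if it ever changes. A minimal Python sketch of the idea — the class and method names are illustrative, not TextSecure's actual API:

```python
# Trust-on-first-use key pinning, in the style of SSH's known_hosts.
import hashlib

class KeyStore:
    def __init__(self):
        self._known = {}  # contact -> pinned key fingerprint

    @staticmethod
    def fingerprint(pubkey_bytes):
        return hashlib.sha256(pubkey_bytes).hexdigest()[:16]

    def check(self, contact, pubkey_bytes):
        """Return 'new', 'ok', or 'CHANGED' for a contact's key."""
        fp = self.fingerprint(pubkey_bytes)
        if contact not in self._known:
            self._known[contact] = fp   # trust on first use: pin it
            return "new"
        return "ok" if self._known[contact] == fp else "CHANGED"

store = KeyStore()
store.check("+15551234", b"alice-key")   # first sight: key gets pinned
store.check("+15551234", b"alice-key")   # same key: fine
store.check("+15551234", b"mitm-key")    # different key: possible MITM
```

The "CHANGED" case is exactly where comparing keys in person (or over a voice call) comes in.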

    • by DrYak (748999) on Monday December 09, 2013 @10:08PM (#45646589) Homepage

      It's trivial to MITM, but it's a high-risk attack for the eavesdropper because people can simply meet in person and compare keys.

      Avoiding MITM has been successfully solved using the Socialist Millionaire [wikipedia.org] protocol.
      At most, 2 contacts need to call (voice) each other and compare a bunch of keywords. From that point onward, their communication can be trusted.
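That voice-call comparison can work by having both parties derive a few human-readable words from their own copy of the session key and read them aloud; mismatching words reveal a substituted key. A toy Python sketch of the idea — note this is the ZRTP-style "short authentication string" flavor of the check (the Socialist Millionaire protocol proper does the comparison cryptographically, without revealing anything), and the word list is made up:

```python
# Derive a few comparable words from a session key, ZRTP-SAS style.
import hashlib

WORDS = ["apple", "banjo", "comet", "delta", "ember", "falcon",
         "glyph", "harbor", "ivory", "jungle", "karma", "lotus",
         "meadow", "nickel", "onyx", "pixel"]

def sas_words(session_key: bytes, n: int = 4):
    """Map the first n digest bytes of the key onto human-readable words."""
    digest = hashlib.sha256(session_key).digest()
    return [WORDS[b % len(WORDS)] for b in digest[:n]]

# Alice and Bob each run this on their own copy of the key; if a MITM
# substituted keys, the two word lists won't match.
sas_words(b"shared-session-key")
```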

      I see another problem:
      The best (and nearest-to-perfect) secure solution requires end-to-end encryption: the absolute first and last applications in the chain do the encryption and decryption. Encryption is done by the first piece of software that ever gets the message, and decryption by the last piece of software that draws the message on the screen.

      But CyanogenMod's implementation isn't end-to-end. They have instead integrated crypto into the OS's SMS messaging service.
      The intention is noble: you're not forced to use CyanogenMod's SMS app. You could use Skype or Facebook's chat app (as long as the app supports handling SMS in addition to its other communication channels)...
      The main problem is easy to spot: ...these 3rd-party apps could actually be spying.

      • by wvmarle (1070040)

        But CyanogenMod's implementation isn't end-to-end. They have instead integrated crypto into the OS's SMS messaging service.
        The intention is noble: you're not forced to use CyanogenMod's SMS app. You could use Skype or Facebook's chat app (as long as the app supports handling SMS in addition to its other communication channels)...
        The main problem is easy to spot: ...these 3rd-party apps could actually be spying.

        Same for CyanogenMod itself. Who says this addition hasn't been implemented by an NSA employee, backdoor and all?

        • by DrYak (748999) on Tuesday December 10, 2013 @04:12AM (#45648349) Homepage

          Same for CyanogenMod itself. Who says this addition hasn't been implemented by an NSA employee, backdoor and all?

          Except that CyanogenMod itself is open source.
          You can check the source yourself, and the source is seen by lots of other people. If there's a backdoor in there, someone is bound to see it.
          Even if some NSA employee managed to use social engineering to sneak in an exploitable bug while submitting a patch that otherwise improves the code, someone will end up noticing it. (e.g.: both Debian and Android have had, at some point in time, broken DSA generation that produced predictable keys. Nonetheless, in both cases the defect was noticed and corrected.)

          That's the whole point of RMS's rant about free and open source software being a necessity for security. If the source is open, you don't have to specifically trust the author of the source (who might be a mole, or clumsy and prone to bugs). You can instead trust the community (Debian, Android), check it yourself (I'm able to do *some* light code review for a few of my coding needs), or pay someone to do the checks for you (TrueCrypt is currently getting exactly this treatment, crowd-funding style).

          And even if you don't compile your binaries yourself and have doubts about the binaries offered as downloads by the CyanogenMod team (perhaps the binary you download contains a backdoor that isn't in the source), several tools are here to help too:

          - GPG signing of binaries (so you know the binary you got actually came from CyanogenMod and not from an NSA relay that ended up serving you a booby-trapped binary, exactly like their Slashdot clone)
          - Deterministic builds (a way for several independent people to check that the binaries you have were produced from the official source, and not by some NSA mole inside CyanogenMod who injects a backdoor before publishing them. It's used by Tor, Bitcoin, etc. It's being implemented for TrueCrypt too)
          - Diverse double-compiling (each time there's a discussion about trusting the source, someone always brings up Ken Thompson's old paper about a booby-trapped self-replicating compiler, and completely forgets that a way to detect such booby-trapped compilers has since been proposed. Not that this was ever seen in the wild. But in theory it's avoidable with these techniques.)
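A hypothetical sketch of what the first two checks look like in practice. The file names here are made up, and the gpg command is shown commented out since it assumes you have already fetched and verified the project's signing key:

```shell
# (a) GPG signature check: proves the zip came from whoever holds the
#     signing key (requires the project's public key in your keyring):
#   gpg --verify cm-11-signed.zip.asc cm-11-signed.zip

# (b) deterministic-build check: rebuild from source yourself (or get a
#     hash from an independent builder) and compare byte-for-byte.
printf 'pretend-release-bytes' > downloaded.zip
printf 'pretend-release-bytes' > rebuilt-from-source.zip
sha256sum downloaded.zip rebuilt-from-source.zip
cmp -s downloaded.zip rebuilt-from-source.zip && echo "MATCH" || echo "DIFFER"
```

If several independent builders all reproduce the same hash from the published source, a binary-only backdoor has nowhere to hide.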

          • by DrXym (126579)

            TrueCrypt is exactly getting this treatment, crowd funding style.

            After how many years? TrueCrypt has been out for a decade, meets the definition of open source, and despite its relatively modest size is only now receiving an audit to see whether the source can be trusted and whether the binaries everyone has been using were actually built from it.

            As such I wouldn't hold much faith that just because CyanogenMod is open it's suddenly more secure than a proprietary product. It might be, and open source is good for a raft of reasons, but I suspect anyone who wanted to throw an exploit could still bury it in plain sight if they wished.

            • As such I wouldn't hold much faith that just because Cyanogenmod is open that suddenly it's more secure than a proprietary product. It might be and open source is good for a raft of reasons, but I suspect anyone who wanted to throw an exploit could still bury it in plain sight if they wished.

              No, indeed. Being open source doesn't make CyanogenMod automagically secure. The GPL and BSD licenses aren't magic pixie dust, per se.
              BUT being open source at least makes it 100% possible to audit CyanogenMod (unlike, say, iOS. Even if you wanted to, you couldn't audit that one, because its source code is a closely guarded secret of Apple's).

              If you're not content with the approach of "let's wait; if there's something evil inside, someone is bound to discover it eventually, some day", YOU CAN DO something about it.

              1. Either have

              • There is still yet another reason to trust open source code: the risk of being exposed. If you're the NSA, and you're inserting illicit code into open source, then you're at a very high risk of being exposed as a mole. This risk of being a known mole is too high for a "real" spy. If I were a spy agency, I wouldn't risk any assets for such a short-term gain. Once exposed, a mole has no trustworthiness AND all associations would likely become suspect. Basically, you're risking the whole operation on the assump

                • Combined with the above, these two assumptions (risk of exposure, looking for compromises) are sufficient to take the approach that the code is not likely compromised on purpose. This is not to say that there are no risks, just that they aren't likely to be intentional.

                  Either that, or the backdoors are much more sophisticated and designed to look like genuine errors once discovered.
                  They'd probably look much more like something out of the "Underhanded C Code Contest [xcott.com]" than an explicit "if (NSA_flags == true) then send_to(NSA, data);"

                  In theory, someone with half a clue would notice that planting backdoors has two very strong disadvantages:
                  - a bug exploitable by your guys is a bug exploitable by the Russians/Chinese/etc.; a hidden backdoor in software used by US civilians also makes th

                • by dubbreak (623656)

                  This risk of being a known mole is too high for a "real" spy. If I were a spy agency, I wouldn't risk any assets for such a short-term gain. Once exposed, a mole has no trustworthiness AND all associations would likely become suspect.

                  And the solution to that problem is easy. Money. Well, money and indirection.

                  Most people can be bought for a price, and they don't have to know it's the NSA doing the buying; it could be a terrorist group or something more benign. All that matters is that there is no direct link between the code submitter and the NSA. Heck, the submitter can claim the NSA made him/her do it, as long as they come off as a crazy person (which they will with no direct proof.. "well this person paid me to submit this code, no they didn'

                • by DrXym (126579)
                  This is absurd. Risk of being exposed? I'm sure the NSA is more than capable of running a few fake personas.
    • by chihowa (366380) *

      Today, we are launching our version initially into the CM 10.2 nightly stream to test the server load and make sure things are working at scale. Once things are dialed in, we’ll also enable this for CM 11 builds moving forward.

      Depending on how it's implemented, the whole system [github.com] may depend on a central server that facilitates the initial key exchange (prekeys). That, in itself, seems like a massive compromise vector. Why should users need to trust a third party server for key exchange? It'd be much better to use a system like ZRTP where the users are expected to compare fingerprints out of band.

      There is a method of key exchange that doesn't use the server (KeyExchangeMessage), but it isn't clear whether the user gets to choose whi

      • by Fnord666 (889225)

        Depending on how it's implemented, the whole system may depend on a central server that facilitates the initial key exchange (prekeys).

        From the WhisperSystem posting:

          The Cyanogen team runs their own TextSecure server for WhisperPush clients, which federates with the Open WhisperSystems TextSecure server, so that both clients can exchange messages with each other seamlessly.

        • by Pi1grim (1956208)

          Wait, wait, wait. If they federate with some other service, why not federate with XMPP networks? AFAIK TextSecure uses OTR or some variation of it. And if you make it talk to other XMPP servers out there, it's not yet another messenger, it's a step toward the future.

  • by geminidomino (614729) on Monday December 09, 2013 @08:40PM (#45645881) Journal

    Even before the buyout, the CM team refused patches to basically integrate pdroid into the mod, for fear of "angering developers." So even if something like this works, all the bad guys have to do is hit up the app market for the data it's sucking up anyway.

    • by Rennt (582550) on Tuesday December 10, 2013 @07:12AM (#45648945)

      That's kind of a rewriting of history. CM simply didn't integrate pdroid because it was a support nightmare waiting to happen. At the time of the pdroid discussion, Steve said that they were already working on a bunch of privacy features that would meet the usability standards they were aspiring to... and here we are.

      Don't forget that this message encryption follows on from the App Privacy Mode that they have successfully deployed since then (which makes much of pdroid redundant). They are taking a measured and transparent approach to privacy. Just as a serious organisation should.

      • That's not what the mailing list said. And no, the current offerings don't come even close to pdroid, or "enough", for that matter. It's pitiful theater at best.

  • Spy vs Spy (Score:5, Interesting)

    by BringsApples (3418089) on Monday December 09, 2013 @08:47PM (#45645951)
    Seriously, why are The People trying to play Spy vs Spy with their own government? The government owns the internet. It's as silly to encrypt your license plate as it is your text messages. You have no way to do so. If you're able to send a text, then you're using a carrier of some kind. That carrier has no control over the government's ability to get the data if the government wants to. Remember, it's metadata that we're talking about. "Who talked to who - and what time(s)". Linking people together is what it's all about. They don't need to know what you're talking about, so long as they know who you're talking to.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      You show...good but not total understanding. In current systems this works.

      In past systems this has been handled.

      See mixmaster, remailers, etc.

      If whisper does this right, what people know is "35 messages go into mixmaster3 at time t" and "15 messages go out of mixmaster3 to mixmasters 1..n at time t+1" and "16 go out to realworld addresses A1...Ai"

      A good enough tumbler chain crushes most metadata analysis for short messages, provided you'll live with limited message loss, and a bit of latency that makes re
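The tumbler idea described above can be caricatured in a few lines: a mix node buffers a batch of messages, forgets who sent what, and flushes in shuffled order, so an observer only sees counts in and counts out. A toy Python sketch of the concept — not mixmaster's actual protocol; real mixes also pad, delay, and encrypt in onion layers:

```python
# A toy batching mix: breaks the link between arrival and departure order.
import random

class MixNode:
    def __init__(self, batch_size):
        self.batch_size = batch_size
        self.pool = []

    def accept(self, sender, dest, payload):
        """Buffer a message; return a flushed batch once the pool is full."""
        self.pool.append((dest, payload))   # sender is dropped at the mix
        if len(self.pool) >= self.batch_size:
            return self.flush()
        return []

    def flush(self):
        batch, self.pool = self.pool, []
        random.shuffle(batch)               # destroy arrival ordering
        return batch                        # forward to dests (or next mix)

mix = MixNode(batch_size=3)
mix.accept("alice", "carol", "hi")       # buffered, nothing leaves yet
mix.accept("bob", "dave", "yo")          # buffered
out = mix.accept("eve", "frank", "hey")  # pool full: 3 messages leave at once
```

A watcher sees "3 in, 3 out" but cannot tell from timing alone which input became which output.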

    • Re:Spy vs Spy (Score:5, Insightful)

      by hawguy (1600213) on Monday December 09, 2013 @09:40PM (#45646399)

      Seriously, why are The People trying to play Spy vs Spy with their own government? The government owns the internet. It's as silly to encrypt your license plate as it is your text messages. You have no way to do so. If you're able to send a text, then you're using a carrier of some kind. That carrier has no control over the government's ability to get the data if the government wants to.

      Isn't that the whole point of this project? It allows you to encrypt your data, so unless you think the government has a secret back door into every encryption algorithm, when you encrypt your data, the government can't see it. They may still be able to see who you're talking to (a TOR-like extension might help), but they won't know what you're saying unless they compromise your phone (or happened to compromise the key exchange).

      Remember, it's metadata that we're talking about. "Who talked to who - and what time(s)". Linking people together is what it's all about. They don't need to know what you're talking about, so long as they know who you're talking to.

      Despite what the NSA wants you to think, it's not just "Metadata" -- any analyst who believes that a conversation is with a foreign correspondent can retrieve the entire contents of the conversation -- text, email, etc with nothing more than a slightly better than 50% belief that one party in the conversation is foreign. No warrants or other oversight required.

      Do you think the government should be able to retrieve your private conversations on an analyst's "hunch"?

      • Do you think the government should be able to retrieve your private conversations on an analyst's "hunch"?

        Not at all. I think the whole thing is total bullshit beyond expression, and it pushes the bounds of humanity itself. However, that means nothing to anyone but me, and maybe some others. What our government needs is some small, no matter how small, reason to point a finger. Don't forget, the whole war in Iraq started because of a "hunch". And they found out that that hunch was wrong. Well... the wars (yes, wars) machine keeps on turning, brother.

        ...so unless you think the government has a secret back door into every encryption algorithm...

        Where have you been man? See here [techreport.com].

        • by hawguy (1600213)

          ...so unless you think the government has a secret back door into every encryption algorithm...

          Where have you been man? See here [techreport.com].

          There's a big difference between a backdoor in a published encryption algorithm and a backdoor in commercial encryption software/hardware. It's much harder to hide a backdoor in a well known algorithm that's been under international scrutiny. Though I do have my doubts about the ECC constants [stackexchange.com]

      • by crazyvas (853396)
        One other point to add to the parent: whatever you give up now, whether it's metadata alone or metadata + content, consider that it will be stored forever. It might become easier/legal to probe into it several years down the line. If the govt's storage systems are incompetent, it might all even become public at some point.

        It is /still/ worth protecting what you can.
      • by number17 (952777)

        Do you think the government should be able to retrieve your private conversations on an analyst's "hunch"?

        Does that analyst work for the government or a Company X located in Country Y? If they work for Company X located in Country Y then no warrants or other oversight required.

    • The Importance of Metadata:

      It is not Who you know, or What you know, or even What you know about Who you know.

      It is Where is Who, that provides the targeting for the missile.

    • by jovius (974690)

      Playing Spy vs Spy with the government is a form of democratic control of things related to you. That control has been lost to agencies and bureaucrats. There should be a trusting relationship, but the trust is broken. The trust itself is based on an empty facade, which was effectively proven by Snowden (and others..). The power is an illusion. In the end an individual can independently act regardless of the conditioning.

  • I don't see the point, since the encryption methods are probably also hacked, and our devices probably have OS libraries with compromised whisper routines built in that just get invoked.

    All the backdoors are wide open, and the front doors too. We live in Stasi Germany.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      I've seen little suggesting the NSA has been breaking much crypto at the algorithm level...most of what I've seen in the leaked materials has suggested more in the way of mitm attacks (sometimes through key forgery, as the NSA essentially has root certificate authority). The public key side of this of course still leaves that possibility entirely open if one doesn't take care to do a secure key exchange, but it's still a step in the right direction.

    • by ewieling (90662)
      Are you advocating people more or less say "I give up. You win." to governments?
      • You gotta be of ill will to read that in his comment. I don't think you are. The question then is why your comment. Answer: maybe you are more desperate than him. A shallow answer. Moral: let's be sure we're all on the same side and do something on the technical as well as the political level.
  • aFskf8as sdfjsdf a8Pg7d !!
  • While anything that prevents intrusion into people's private conversations is laudable, it's just treating a symptom of a greater problem. Just my opinion, but alongside these efforts that make it so governments can't spy on their people, I would also like to see efforts made so that governments and other authorities don't spy on their people in the first place.
  • I looked at this, and there are 2 things I can't understand:

    1. How does key distribution work? Even public-key crypto of this type doesn't necessarily work if there is a man in the middle.
    2. How is metadata protected? For an SMS, often the timestamp and sender/recipient pairing is as revealing as the message content.

    • From what I gather the encryption scheme is vulnerable to MITM attacks, and doesn't do anything about metadata.

      Every message is encrypted with a unique key so if they MITM a conversation they'll only get that conversation's data.

      MITM isn't hard for agencies like the NSA, but it takes a hell of a lot more effort than passive taps.
      The idea isn't to prevent a targeted attack, the idea is for users to prevent large scale data collection.

  • by dsoodak (3022079) on Monday December 09, 2013 @11:16PM (#45647075)
    There was an article posted on either slashdot or boingboing which linked to the following: http://events.ccc.de/congress/2011/Fahrplan/attachments/2022_11-ccc-qcombbdbg.pdf [events.ccc.de] Summary: the (usually) proprietary firmware on the chip that controls real-time functions such as wireless communication (which requires so many different standards to be adhered to that it ends up being a real mess and rarely rewritten) is surprisingly easy to hack. I believe there was a quote that you could get remote code execution after sending it a string of less than 100 bytes. It also mentioned that the chip with the main OS is often a slave to the one with the RTOS. Just curious if anyone knows if CyanogenMod accounts for this particular type of security vulnerability.
  • Text messages are useful because they're banal. "I'll be late" "pick up milk" "what section are you sitting in?" "when do we need to leave?" If you're committing any relevant content to SMS you have different problems that security isn't solving.

  • How keys are managed (Score:5, Informative)

    by Fnord666 (889225) on Tuesday December 10, 2013 @01:03AM (#45647711) Journal
    From the Open WhisperSystems Blog: [whispersystems.org]

    The TextSecure Protocol

    TextSecure's upcoming iOS client (and Android data channel client) uses a simple trick to provide asynchronous messaging while simultaneously providing forward secrecy.

    At registration time, the TextSecure client preemptively generates 100 signed key exchange messages and sends them to the server. We call these "prekeys". A client that wishes to send a secure message to a user for the first time can now:

    1. Connect to the server and request the destination's next "prekey."
    2. Generate its own key exchange message half.
    3. Calculate a shared secret with the prekey it received and its own key exchange half.
    4. Use the shared secret to encrypt the message.
    5. Package up the prekey id, the locally generated key exchange message, and the ciphertext.
    6. Send it all in one bundle to the destination client.

    The user experience for the sender is ideal: they type a message, hit send, and an encrypted message is immediately sent.

    The destination client receives all of this as a single push notification. When the user taps it, the client has everything it needs to calculate the key exchange on its end, immediately decrypt the ciphertext, and display the message.

    With the initial key exchange out of the way, both parties can then continue communicating with an OTR-style protocol as usual. Since the server never hands out the same prekey twice (and the client would never accept the same prekey twice), we are able to provide forward secrecy in a fully asynchronous environment.
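The six steps above can be sketched end to end with plain Diffie-Hellman standing in for the real curve math. Everything here is illustrative: the toy group, the XOR "cipher", and the variable names are mine, and the real protocol signs its prekeys and uses Curve25519 plus a proper cipher:

```python
# Toy prekey flow: asynchronous first message with no round trip needed.
import hashlib
import secrets

P = 2**521 - 1   # a Mersenne prime: fine for a toy DH group, not production
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared(priv, their_pub):
    """Derive a symmetric key from the DH shared secret."""
    return hashlib.sha256(str(pow(their_pub, priv, P)).encode()).digest()

def xor_crypt(key, data):   # stand-in for a real cipher; XOR is symmetric
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Registration: Bob generates 100 prekeys and uploads the public halves.
bob_prekeys = {i: keypair() for i in range(100)}
server = {i: pub for i, (_, pub) in bob_prekeys.items()}

# Steps 1-6: Alice fetches Bob's next prekey, makes her half, derives the
# shared secret, encrypts, and bundles everything up.
prekey_id = 0
bob_pub = server.pop(prekey_id)        # server never hands this out again
alice_priv, alice_pub = keypair()
secret = shared(alice_priv, bob_pub)
bundle = (prekey_id, alice_pub, xor_crypt(secret, b"hello, bob"))

# Bob receives the bundle later and can decrypt immediately: the key
# exchange completed without Alice ever being online at the same time.
pid, alice_pub_rx, ciphertext = bundle
plaintext = xor_crypt(shared(bob_prekeys[pid][0], alice_pub_rx), ciphertext)
```

Because each prekey is consumed exactly once, compromising today's key reveals nothing about sessions bootstrapped from the other prekeys, which is the forward-secrecy property the post describes.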

    • Since the server never hands out the same prekey twice (and the client would never accept the same prekey twice), we are able to provide forward secrecy in a fully asynchronous environment.

      PFS only if you trust the server. Haven't we seen the "trust the server" concept a few times too many lately?

    • Man, I can see a few holes in that, and I only read one of Bruce Schneier's books...
  • So now your 20-character "just on the way home" text message blows out to a couple of thousand bytes, and your telco has to send it as 10 SMSes, and charges you accordingly.

  • Encrypting your stuff is all good and nice, but you should use a piece of software that has been written using established secure coding standards. Just because it's open source doesn't mean it's also secure (cf. PHP, OpenSSL). Rather, being open source is a necessary, but not a sufficient criterion in the evaluation of security-critical applications.

    Given the track record [f-droid.org] of this particular application, I'm a bit skeptical whether one should really use it for anything serious.

    • by gnoshi (314933)

      Hmm. I feel OK about someone who understands encryption standards well enough to identify SSL implementation bugs in major browsers (and construct exploits for them) working on encryption software that I use.
      What I got out of the F-Droid conversation was that someone complained about a bug (which they overstated) that had already been fixed, and because Moxie himself wasn't publishing to F-Droid, the version on it didn't get updated.

  • CM11 nightlies starting with cm-11-20131210-NIGHTLY also include WhisperPush according to the changelog. I'm still on 20131208 on Nexus S so I can't check how it actually behaves.
  • "A few people are now more concerned regarding their privacy after discovering about efforts made by governments to spy on their communications."

    There .. that's more accurate....
