Encryption Security

Plausible Deniability From Rockstar Cryptographers 358

J. Karl Rove writes "Nikita Borisov and Ian Goldberg (of many, many other projects) have released Off the Record Messaging for Gaim. Encrypt an IM, prove (at the time) that it came from you, and deny it later. The authentication works only when the message is sent; anybody can forge all the messages he wants afterwards (toolkit included). Captured or archived messages prove nothing. And forward secrecy means Big Brother can't read your messages even if he wiretaps you AND grabs your computer later on. All the gooey goodness of crypto, with none of the consequences! They have a protocol spec, source code, and Debian and Fedora binaries."
  • by Anonymous Coward on Thursday December 16, 2004 @05:06PM (#11109152)
    Who needs any of this? Just try what I do: write your messages as GW Basic programs. This is so uncrackable that even I can't tell what is in it after I use it.
  • by raider_red ( 156642 ) on Thursday December 16, 2004 @05:09PM (#11109192) Journal
    A way to deny some of the stupider posts I've made on Slashdot.
  • by MikeCapone ( 693319 ) <skelterhell @ y a hoo.com> on Thursday December 16, 2004 @05:10PM (#11109195) Homepage Journal
    This thing sounds great, but before it is really useful it needs to be out there in sufficient numbers. I hope distros will start shipping it by default with their gaim packages.
    • It shouldn't even be a matter of lots of people using it. After all, if you write something and get busted for it, you can use Plausible Deniability in court.

      "Your honor, there is no way to prove that this message came from my client or was forged by the investigators who used to beat him up in gym class."

      I guess then it would just turn into a matter of your word vs. theirs.

      Any lawyers out there?
      • I guess then it would just turn into a matter of your word vs. theirs.
        That's the way it is with any other case.

        "Your honor, there is no way to prove that this kilo of cocaine came from my car. It's just the officers word vs. mine. Someone's framing me."

        • right but this turns it from "we can prove that it was him" to "we're pretty sure it was him because we trust the cops more". That seems like a big legal difference to me.

          To use the cocaine example, imagine that in one case, the cocaine had your prints all over it and had a picture of you holding it. In another case, there's a kilo of coke in your trunk that doesn't have any prints or any other indications that you ever saw it in your life. If you go to court for having a kilo of coke in your trunk, you'
          • To use the cocaine example, imagine that in one case, the cocaine had your prints all over it and had a picture of you holding it.
            Or imagine that the police dug through the garbage in your car and found a bag or a box with your prints on it, and said they found the cocaine in there. And then they photoshop your face onto a picture of someone else holding that bag. Farfetched? Sure, but who is the jury going to believe, the police -or- an accused drug dealer?
      • I guess then it would just turn into a matter of your word vs. theirs

        Right. Because the word of a defendant at trial is worth a lot. (That's sarcasm, for the record).

        The word of an officer almost always carries more weight than that of the accused. I've never seen anyone get out of a ticket for rolling through a stop sign just because the only evidence was the testimony of the cop.
  • I wonder (Score:4, Funny)

    by ab384 ( 810021 ) on Thursday December 16, 2004 @05:11PM (#11109212)
    How much later is "later"?

    "Did I just say that I'd walk the dog?"
    "Yes!"
    "Nobody can prove that I just said that."

    • Re:I wonder (Score:5, Informative)

      by Entrope ( 68843 ) on Thursday December 16, 2004 @05:16PM (#11109282) Homepage
      "Later" is after the speaker decides that conversation is over. You pick a signing key for your messages, sign it with your normal public key, send messages using the first key, and your correspondent can confirm you are who you claim. When you want to finish the conversation, you publish (at least to your correspondent) the temporary signing key, and anyone who has it can then forge messages that are as trustable as what you said.
      • Re:I wonder (Score:5, Interesting)

        by roystgnr ( 4015 ) <roy&stogners,org> on Thursday December 16, 2004 @05:35PM (#11109468) Homepage
        What stops your correspondent from sending your messages to something like Stamper [itconsult.co.uk] before you publish the temporary key? After the temporary key is published it will be possible to forge messages signed by that key, but it won't be possible without the collaboration of the timestamping service to forge messages signed by that key and dated before its publication.
        • Re:I wonder (Score:4, Interesting)

          by Anonymous Coward on Thursday December 16, 2004 @07:02PM (#11110287)
          With Stamper he can prove he received a message before a certain time. What he can't prove is that he didn't already have the signing key at that time (since nobody certifies when the key was published). So while he knows these messages were sent by you, he can't prove it to anyone else: he could have obtained the signing key first, generated the messages himself, and then sent the messages to Stamper before publishing the key.
      • There is only one gotcha: if you are corresponding with those you are ostensibly trying to cloak your communications from.

        They could then collect the plain-text and log the IP address from whence it came.
        • Plausible deniability doesn't mean you can convince someone you didn't say to them what you said to them. It just means they can't prove to someone else that you said it. That they logged your IP just means they know your IP, which proves at most that you talked with them about something, but not what you said.
  • by Chris Mattern ( 191822 ) on Thursday December 16, 2004 @05:15PM (#11109275)
    Does this mean it's going to feature in the next edition of GTA?

    Chris Mattern
  • OK, I've followed the link and read, but the bottom line is, how does this supposedly do what it claims to be able to do?
    • by chill ( 34294 ) on Thursday December 16, 2004 @05:21PM (#11109333) Journal
      It authenticates and creates a "conversation": a DH key exchange is performed, which lets you be certain the person on the other end is who you think it is.

      Then, messages sent during that conversation are encrypted using disposable session keys. (128-bit AES w/SHA-1 HMAC).

      Think of it as an authentication tunnel down which you send encrypted messages. The message encryption is in no way related to the authentication, and the disposable session keys mean they have no re-use value.

      -Charles
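To make the above concrete, here is a toy Python sketch of that pipeline (entirely my own illustration: the group, key sizes, and KDF are made up, and a hash-derived keystream stands in for the 128-bit AES counter mode the real plugin uses).

```python
import hashlib, hmac, secrets

# --- toy Diffie-Hellman handshake (OTR uses a much larger MODP group) -------
p = 2**127 - 1                      # toy modulus, far too small for real use
g = 3

a = secrets.randbelow(p - 2) + 2    # Alice's ephemeral secret
b = secrets.randbelow(p - 2) + 2    # Bob's ephemeral secret
A, B = pow(g, a, p), pow(g, b, p)   # public values exchanged in the clear

shared = pow(B, a, p)               # Alice's view; Bob computes pow(A, b, p)
assert shared == pow(A, b, p)
secret_bytes = shared.to_bytes(16, "big")

# --- disposable session keys derived from the shared secret -----------------
enc_key = hashlib.sha256(b"enc" + secret_bytes).digest()[:16]  # 128-bit key
mac_key = hashlib.sha1(b"mac" + secret_bytes).digest()         # HMAC-SHA1 key

def keystream(key, n):
    """Hash-counter keystream standing in for AES-CTR (illustration only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def protect(plaintext):
    ct = bytes(x ^ y for x, y in zip(plaintext, keystream(enc_key, len(plaintext))))
    return ct, hmac.new(mac_key, ct, hashlib.sha1).digest()

ciphertext, mac_tag = protect(b"this never happened")
# When the conversation ends, both sides forget a, b and the derived keys,
# so a later seizure of the long-term keys doesn't decrypt old traffic.
```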
      • Thanks. That helps some, and makes it a bit clearer than just reading the protocol document. But I'm not clear on how this accomplishes "Big Brother can't read your messages even if he wiretaps you AND grabs your computer later on."

        I presume this has something to do with that authentication tunnel, but I'm not really following it. Do you understand it?

        • The idea is the keys are disposed of when the tunnel is torn down.

          If Big Brother gets your MAIN key, he has no way of recreating the SESSION keys. Those are created using key material from the person you are chatting with as well. Without them, the only attack left on the messages is brute force.

          NOTHING is perfect. If your machine is compromised BEFORE you start the conversation, it would be possible to get everything and crack it nicely.

          Hmmm...I do wonder how hard it is, comparatively, to cryptanalyze ultra
          • I didn't see a provision for refreshing session keys, but I only glanced through the code and docs and didn't read them in depth.

            It's not explicitly mentioned, but "forward secrecy [atis.org]" implies that the session keys change at some point, though they may not change within a single communication. (If Key A and Key B always created the same SessionKey S, then compromising Key A or B would allow an attacker to reveal S (for all past sessions as well) when they talked to each other again.)
            • I see it in the protocol.txt file. It looks like they change session keys pretty often. It is near the bottom of the file, in the "When you receive an OTR Data message:" section.

              * * *

              If the MAC verifies, decrypt the message using the "receiving AES key".

              Finally, check if keys need rotation:
              - If the "recipient keyid" in the Data message equals our_keyid, then
              he's seen the public part of our most recent DH key pair, so we
              securely forget our_dh[our_keyid-1], increment our_keyid,
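In other words, each side keeps only its two most recent DH key pairs and discards the older one as soon as the peer has acknowledged the newer one. A hypothetical sketch of that bookkeeping (my own, not the plugin's code):

```python
import secrets

p, g = 2**127 - 1, 3          # toy DH parameters, not the real group

class RatchetSide:
    """Keeps only the two most recent DH key pairs, per protocol.txt."""
    def __init__(self):
        self.our_keyid = 1
        self.our_dh = {1: self._new_pair()}

    def _new_pair(self):
        x = secrets.randbelow(p - 2) + 2
        return (x, pow(g, x, p))          # (private, public)

    def on_peer_acknowledged(self, recipient_keyid):
        # Peer has seen our most recent public key: rotate.
        if recipient_keyid == self.our_keyid:
            self.our_dh.pop(self.our_keyid - 1, None)   # "securely forget"
            self.our_keyid += 1
            self.our_dh[self.our_keyid] = self._new_pair()

side = RatchetSide()
side.on_peer_acknowledged(1)      # peer saw key 1 -> generate key 2
side.on_peer_acknowledged(2)      # peer saw key 2 -> forget key 1, generate key 3
assert set(side.our_dh) == {2, 3}
# A real implementation would also zero the old private key in memory,
# which Python can't really do; this only shows the rotation logic.
```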
        • The key seems to be the "disposable key" part.

          With normal public-key crypto, you sign with your actual private key, and you encrypt with the recipient's actual public key. This means that if someone gets hold of the recipient's private key, they can decrypt the messages, and because your public key is, well, public, they can prove that you wrote the message.

          In this system, you generate throw-away keys, and exchange them securely when you start communicating. After you are done communicating, you can just

        • Exchange public keys, so you can do crypto. Then, using public crypto, send throw-away public keys to the other guy, every 20 seconds, and encrypt your conversation with them. Since these keys were only ever in ram, not on the HD, they can't use them to decrypt the messages. I think.

          I have trouble following it myself.
  • I think cross-client compatible encryption is more important at the moment. Jabber offers OpenPGP, but development of the gaim plugin that also does this stalled a while ago. Bummer. As long as only gaim can talk to gaim with a particular encryption scheme, it won't get used on a wide scale.
  • I wonder (Score:3, Funny)

    by WormholeFiend ( 674934 ) on Thursday December 16, 2004 @05:20PM (#11109321)
    Is there an Internet Cafe at Guantanamo?
  • by Anonymous Coward on Thursday December 16, 2004 @05:21PM (#11109328)
    Sometimes Big Brother can 'prove' anything by force. Why do you think he's called Big? Small people need stuff like evidence, proof, and proper legal process. There are many recent examples of Big Brother having his way, proof and fact be damned.
  • by G4from128k ( 686170 ) on Thursday December 16, 2004 @05:21PM (#11109329)
    If you create a message, chances are that fragments of the plain text will be in various caches and VM pages on your harddisk. It may not last for very long -- being overwritten by subsequent paging -- but if someone takes your computer soon after, they may find incriminating junk on the HD.
    • So don't use a swap partition. If it's a concern of yours, at least. What are you, a criminal? :P

      If you're using gaim, chances are high that you're also using linux. There's no rational reason to be using a swap partition on a linux desktop, what with the price of RAM these days.
      • How about using SELinux (plus extensions?) to set up access control lists and encrypt or securely wipe the swap partition?

        Mac OS X 10.4 (I believe) will support encrypting the swap file, and is going to use ACLs to boot. Linux is surely capable, I would assume.
      • WTF? (Score:3, Informative)

        by phorm ( 591458 )
        What with the price of RAM these days? Sorry, but even with a lot of RAM there's not any reason why one shouldn't have swap. What happens when you do overrun your RAM just that one time?

        Besides, swap in 'nix isn't used unless you need it. Most of the time my laptop (256MB RAM) doesn't touch swap at all, so chances are I don't have to worry about that.

        And as to the temp files, etc... if you do have the RAM to spare and you're really paranoid, mount a nice big 512MB ramdisk on loopback and a quick rebo
    • Aren't encrypted swap partitions possible?

    • by Mr.Ned ( 79679 ) on Thursday December 16, 2004 @05:31PM (#11109429)
      That's why you have encrypted swap. On OpenBSD it's as simple as setting the sysctl 'vm.swapencrypt.enable=1'; there are HOWTOs for other operating systems. Look for the device mapper on Linux, for example.
      • For those who want to know how to use encrypted swap partitions on Linux, here is how:
        PS: Your computer will not operate any slower than when using plain swap. I kid you not.
        PPS: This works in Mandrake and SuSE.

        make sure module cryptoloop is loaded:

        > modprobe cryptoloop

        assuming you want to use /dev/hdb as your swap partition (you can actually use any partition or even a flat file) then type:

        >losetup -e aes256 /dev/loop0 /dev/hdb

        if /dev/loop0 doesn't work, try loop1 or loop2 etc. (you are looking fo
    • fragments of the plain text will be in various caches on your harddisk...being overwritten by subsequent paging

      Worse than that, even overwritten HD data can still be recoverable if someone has enough cash, like, say, the FBI.

      How to really wipe a HD [slashdot.org]

      The only way to guarantee The Man can't get your data is to melt down your drive.
  • by man_ls ( 248470 ) on Thursday December 16, 2004 @05:28PM (#11109397)
    I really want a cryptosystem where I can enter, say, two different plaintexts (of similar length, I imagine) and then there are two keys: the private key, and the decoy key.

    If required to give up "your private key" then give up the decoy key. The decoy plaintexts decrypts, and you're done. The real plaintext is still hidden away.

    Does anything like this exist?
    • by myowntrueself ( 607117 ) on Thursday December 16, 2004 @05:35PM (#11109463)
      "Does anything like this exist?"

      It's called 'steganography'.

      What you do is you have a huge stash of embarrassing hardcore porn, say 'bukkake bloopers 2000'

      You use steganography to hide your real naughtiness inside those images and encrypt the image archive.

      When someone insists that you decrypt it, you naturally get really embarrassed but finally relent.

      They see what you are 'hiding' and maybe laugh in your face; but they don't detect the stegged content (which would, presumably, be *far* worse than 'bukkake bloopers 2000' but what *that* could be I cannot imagine).
    • by Speare ( 84249 ) on Thursday December 16, 2004 @05:44PM (#11109547) Homepage Journal
      I thought of the duress keyphrase, too. While we're randomly thinking, I once imagined that a good keyphrase (decoy or otherwise) would be the full text to the Fourth Amendment. Then recite the keyphrase only under oath before a Judge. Worth a shot, anyway.
    • by Qzukk ( 229616 ) on Thursday December 16, 2004 @05:47PM (#11109581) Journal
      Yes, it's called "Phonebook Encryption". Not sure why. It's written by familiar faces [freenet.org.nz], though.
      • As an aside, the steganography idea mentioned by another poster above is probably the one I'd use.

        With this thing on your computer, you could give them the fake key, and in a couple of days they'd figure out that you've got the phonebook userspace tools on there and realize they've been had.
    • Disclaimer : IANBS (I am not Bruce Schneier)

      1. Use the decoy plaintext D as a one-time pad (yes, OTPs are inconvenient and need to be transmitted secretly too) and encrypt your plaintext P with it. This gives ciphertext C, where C = f(P,D) = f(D,P).

      2. when "they" require you to give up your key, give them the message you wanted to hide from them. Cross your fingers they don't look at that OTP. When they decrypt the ciphertext with the key, they will get the decoy message. Just hope for them not to look at the key
    • I really want a cryptosystem where I can enter, say, two different plaintexts (of similar length, I imagine) and then there are two keys: the private key, and the decoy key.
      You're searching for a system that isn't vulnerable to 'rubberhose cryptanalysis'. See also this slashdot thread [slashdot.org].
    • I have given some thought towards creating such a dual-plaintext message. The main problem is that, in practice, generating a complete bogus plaintext for every "sensitive" message you send is going to be a pain and not many people will be willing to do it.

      But if you are, the simplest approach is to encrypt all messages in a double-length mode. When sending an innocent message, one not requiring the double-encryption feature, it gets encrypted as usual, and gets paired with a random stream of noise data, o
    • For further information, here is a link to a long posting I made on sci.crypt five years ago on the topic of dual-plaintext messages:

      http://groups-beta.google.com/group/sci.crypt/msg/7f73818727a16be5 [google.com]
    • Does anything like this exist?

      Yes. Sort of.

      http://truecrypt.sourceforge.net/ [sourceforge.net]
      http://www.security-forums.com/forum/viewtopic.php?t=24577 [security-forums.com]
    • by cutecub ( 136606 ) on Thursday December 16, 2004 @06:15PM (#11109838)
      The only conceptually similar system I know about is the now-defunct Rubberhose [rubberhose.org].

      Rubberhose was a plausibly-deniable disk encryption system which allowed you to create 2 distinct encrypted file systems which occupied the same disk space.

      One would be the decoy and have harmless boring info, the other would be the "real" file system.

      If you were compelled to give up the passphrase to the filesystem, you could give up the decoy passphrase.

      The implementation was tricky, because neither file system could "know" about the other, otherwise, an enemy would know you were hiding the "real" file system and could imprison or torture you into giving up the passphrase.

      Since the stakes were high, Rubberhose had features to thwart forensic disk-surface analysis. A percentage of disk blocks from both file systems would be randomly repositioned on the drive, to ensure that the more heavily used "real" file system didn't stand out in any statistical way.

      I'd love to see something similar revived.

      -Sean

    • There are several different ways of doing things like you describe.

      (Correct me if I'm wrong, but) one-time pads provide complete deniability, because any encrypted message could produce any decrypted message, depending on the pad. It would be impossible to prove what your message really was.

      One time pads are usually too inconvenient. There are also 'rubber-hose' proof encryption systems, where the encrypted message includes empty space. Each key provided reveals more of the decrypted message, but it i
    • Check out a program called DriveCrypt. It'll let you do that, or steganography. Unfortunately it's Windows-only at the moment, but it works great for me, stuck in a Windows world without a brick in sight. ::Digitac
    • It's called chaffing and winnowing.

      Pretty much, each cypher block is signed. If, during decryption, you see an invalid signature, then you ignore that block.

      Now, when you encrypt, you (randomly) multiplex the blocks of your super-secret message, several dummy messages, and many random blocks.

      It's obvious that your system does this. That's why you include several dummy messages (make 'em plausible!) and also many random blocks. They can demand another key all they want, but they have no way of telling
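A toy version of the scheme (my own sketch, not Rivest's reference code, which is linked in a sibling comment below): every packet carries a serial number, data, and a MAC; wheat packets get real MACs under the shared authentication key, chaff packets get random bytes, and the receiver winnows by throwing away anything whose MAC fails.

```python
import hashlib, hmac, os, random

AUTH_KEY = b"shared authentication key"   # known only to sender and receiver

def mac(serial, data):
    return hmac.new(AUTH_KEY, serial.to_bytes(4, "big") + data, hashlib.sha256).digest()

def add_chaff(wheat):
    packets = [(i, d, mac(i, d)) for i, d in enumerate(wheat)]                   # real packets
    packets += [(i, os.urandom(8), os.urandom(32)) for i in range(len(wheat))]   # chaff
    random.shuffle(packets)
    return packets

def winnow(packets):
    good = [(i, d) for i, d, t in packets if hmac.compare_digest(mac(i, d), t)]
    return [d for _, d in sorted(good)]

stream = add_chaff([b"attack", b" at", b" dawn"])
assert winnow(stream) == [b"attack", b" at", b" dawn"]
# Without AUTH_KEY an eavesdropper can't tell wheat from chaff, and nothing
# was ever "encrypted" in the usual sense.
```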

    • Shannon described this in his seminal paper Communication Theory of Secrecy Systems [edgenet.net] and called it equivocation (i.e. the property that multiple candidate keys will generate plausible but different plaintext messages from the same ciphertext). Cryptographers consider this a good thing. The related notion of unicity distance refers to how much ciphertext is needed to uniquely identify, with high probability, the key that generated it (assuming that the cryptographic algorithm is known). By keeping number
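For reference, the unicity distance mentioned above is usually written as follows (notation mine, in the standard textbook form):

```latex
U \;\approx\; \frac{H(K)}{D}, \qquad D \;=\; \log_2 |\Sigma| \;-\; r
```

where H(K) is the key entropy in bits, |Σ| the plaintext alphabet size, and r the true per-character entropy of the language; with fewer than roughly U characters of ciphertext, several keys still decrypt to plausible but different plaintexts.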
    • An interesting article about a cryptosystem along the lines of what you asked about:

      http://theory.lcs.mit.edu/~rivest/chaffing.txt

      An excerpt:

      I note that it is possible for a stream of packets to contain more than one subsequence of ``wheat'' packets, in addition to the chaff packets. Each wheat subsequence would be recognized separately using a different authentication key. One interesting consequence of this is that if law enforcement were to demand to see an authentication key so it could identif

  • Excellent! (Score:5, Interesting)

    by boodaman ( 791877 ) on Thursday December 16, 2004 @05:29PM (#11109401)
    Wonderful stuff if it does everything it is supposed to do. I can't wait to check it out.

    I've often wondered about this when it comes to forensics testimony. For example, even if you have my computer with some incriminating evidence on it, how can you prove beyond reasonable doubt that I put it there? I would think the only way to prove anything is to have a videotape of me typing the incriminating evidence on the keyboard, and to show that the tape was made at the time in question and is unaltered.

    Computers can be programmed to do anything at anytime, including carrying on a "conversation". You can also easily create an incriminating e-mail message that looks like it was sent, but it never was. Ditto log files, etc. For example, Apache log files are text: it would be trivial to create a script that spoofed a log file with your IP address as the incriminating info...but then how does the plaintiff prove that isn't how it was created?
    • Comment removed (Score:5, Interesting)

      by account_deleted ( 4530225 ) on Thursday December 16, 2004 @05:34PM (#11109449)
      Comment removed based on user account deletion
    • There are logs all over the place. Your isp/employer will have logs of when you connected, checked mail, etc. I've done forensics on hack attempts on web sites and had to compare our logs with those of the ISP for the attacker in order to have what was considered meaningful evidence. An IP address is meaningless without context, as you say. A preponderance of evidence from multiple unrelated sources gives sufficient context.
    • For example, even if you have my computer with some incriminating evidence on there, how can you prove beyond reasonable doubt that I put it there?

      It's called Constructive Possession [law.com]. It's the same as when the cops find drugs stashed underneath your mattress. Because you are in control of that area, the drugs might be deemed yours. It doesn't matter that you weren't caught with drugs actually in hand (Primary Possession).

      If many people have access to the computer, it may be difficult to apply construc

  • This is great... (Score:4, Interesting)

    by Duncan3 ( 10537 ) on Thursday December 16, 2004 @05:29PM (#11109406) Homepage
    Not sure for _who_, but it's great.

    I can see some people having huge use for this: drug dealers, chat room stalkers, and of course all communications between an executive and their broker ;) Any place you need to be able to say "I didn't say that" later - where would that be except a courtroom???

    I can't think of any good reason for _me_ to use it tho. Maybe I'm just not shady enough.
    • Some things are Right(tm) but not legal.

      If you assume a benevolent government, then you don't need it. There are plenty of people who don't.
    • Now assume you live in Red China and are trying to conduct what U.S.ians consider legitimate business with Taiwan.

      "I didn't say it. Someone else must have forged it," won't stop you from being dissapeared, but it'll go a lot further before a tribunal and get you more help from the U.S. embassy there than "Yup. It was me what done it."
  • by Bronster ( 13157 ) <slashdot@brong.net> on Thursday December 16, 2004 @05:33PM (#11109444) Homepage
    Let me get this straight - it can be proved that you

    a) created a plausible deniability capable link; and

    b) intentionally released the key to said link so that someone else could impersonate you later.

    Frequently all that's needed is the fact that you communicated with somebody for evidence - not the specifics of what you said. Sure maybe you just called them up and did some heavy breathing down the line - there's no proof you actually _spoke_, but any jury in the world would convict you.

    Of course you work around that by creating a new link every hour to the same person, and maybe or maybe not using it - but it still shows you're in communication with them. There's no way around that.

    Nice idea, but don't think your child pornography dealing down this link is going to somehow get you off the hook.
    • Frequently all that's needed is the fact that you communicated with somebody for evidence - not the specifics of what you said.

      Martha Stewart went to prison based on what she communicated with her stock broker, not that she communicated with him. I'm sure both parties there would've been happy to have a bit of plausible deniability.

      This is a tool with a very specific purpose, and its unsuitability for other purposes doesn't make it worthless. I can't drive a nail with a cold chisel, but that's not a f

  • by fuzzy12345 ( 745891 ) on Thursday December 16, 2004 @05:35PM (#11109458)
    Quick, someone, anyone. Combine this with yesterday's P2P In 15 Lines of Perl: http://developers.slashdot.org/article.pl?sid=04/12/15/1953227&tid=95&tid=156&tid=1
  • by Anonymous Coward on Thursday December 16, 2004 @05:37PM (#11109475)
    BillG: So, did the donation to the SCO fund to kill Linux go through?

    SBallmer: Yep, sure did. And we even explained the need for us to buy one of their licenses for unlimited computers. You know, for our in-house independent benchmarking company. You know, the whole "Get the Facts" campaign?

    BillG: I see... but this SCO thing doesn't look like it's going to work. We need to go after them in even more indirect ways to avoid more antitrust sanctions. With Ashcroft gone, we may get a harder wrist-slap than last time.

    SBallmer: We're already getting the puppet companies set up now. They have applied for tons of patents that could destroy Linux. We simply buy a perpetual license to all patents for a cool billion, and we're set.

    BillG: How can companies apply for patents that already exist in Linux? What about prior art?

    SBallmer: Don't worry, there's plenty of critical new or rewritten code since the patent applications that violates them. We've even guessed what Linux might add in the future, and patented that as well!

    BillG: But if those lawsuits fail.. then what?

    SBallmer: Well, we're working on getting the GPL ruled illegal. We're also going to deal a blow to all open source operating systems through our deals with BIOS manufacturers to only run operating systems that have paid their license to get the code signed. (Don't worry, they listen to our piles of money - if they obey us, the money keeps coming.)

    BillG: So, you want the computer to be like an xbox, then? We might want to start drafting legislation for mod chips to prevent people from using linux.. er.. pirated copies of windows longhorn without the subscription/expiration feature. After all, we don't want people to use windows without paying their subscriptions...

    SBallmer: Already in the works. Prebought PCs will include a 3 year subscription to Longhorn Home/Crippled Edition. After this 3 years is up, the people buy a new computer rather than renewing their license (for an old computer, mind you) for another 3 years. The money from Intel and Dell is already pouring in. We can't allow mod chips because people would just use that to load the Corporate Edition.
  • One Really Good Use (Score:4, Interesting)

    by Thunderstruck ( 210399 ) on Thursday December 16, 2004 @05:38PM (#11109486)
    Is for folks in law firms. An option like this can permit a lawyer to communicate over the internet with a client in a secure way (because getting my client to go through the process of encrypting stuff with GPG is unlikely at best) ... but where an intercepted conversation would be useless as evidence in court.

    I gotta have it.

    • You do realize that attorney-client conversations are privileged and can't normally be used in court, right? At least not in any Western country I'm aware of. The fact that it got "intercepted" does not change this in the slightest.

      Besides, at the end of the day, if an attorney has to "give up" his client's secrecy, the court isn't going to bother with logs and taps - they're going to ask the lawyer what he was told, and if he doesn't fess up, they'll throw his ass in jail for contempt.

      OTR GAIM is not goi
  • by Mantorp ( 142371 ) <mantorp 'funny A' gmail.com> on Thursday December 16, 2004 @05:40PM (#11109509) Homepage Journal
    a while back there was a story up here about a gaim plugin as a p2p app, couple it with this and you can say "It wasn't me" that downloaded that Shaggy album.
  • This is great! (Score:4, Interesting)

    by lawpoop ( 604919 ) on Thursday December 16, 2004 @05:44PM (#11109541) Homepage Journal
    What I would like to see is some kind of encrypted, p2p, email/IM replacement that doesn't rely on centralized servers. I realise what I've said is redundant -- P2P that doesn't rely on servers, but I'm trying to be clear. Messages would get routed through webs of trust, and if you lose your keys, you can have your new keys signed by people you know in real life. This would totally eliminate spam and ensure privacy and authentication for communications.
    • Re:This is great! (Score:4, Interesting)

      by legirons ( 809082 ) on Thursday December 16, 2004 @07:10PM (#11110362)
      "What I would like to see is some kind of encrypted, p2p, email/IM replacement that doesn't rely on centralized servers"

      Well why not go looking for them then, rather than writing it on slashdot. Many exist. Even something like InvisibleNet's IIP (invisible IRC proxy) would do lots of what you want, Konspire2B would do more, there are more encrypted P2P and chat tools than you can shake a stick at, plus protocols that offer what you want with many different clients. Or go all the way and try GNUNet [ovmj.org] (replacement for freenet) and such like.

      People are always posting "oh if only there was a distributed deniable torrented video blogging system with a pseudonymous web-of-trust [sourceforge.net]" or something, yet I never see you on my Konspire2B client. Just download the damn things and see what they do, some of the apps are really quite cool.

  • 1) Charge up a bunch of stuff on line on your CC.
    2) Immediately post your CC number to the net.
    3) In amongst other potential charges, deny that you made any of them.
    4) Profit!
  • Messages sent _before_ transmitting the temporary session key are presumed to be authentic, while messages sent _after_ the temporary session key is released could have been forged. Not insurmountable, but something to think about.
  • Killer! (Score:2, Funny)

    by go$$amer ( 218906 )
    Now I just need something interesting enough to talk about to merit the install :o

  • by earthforce_1 ( 454968 ) <earthforce_1@y[ ]o.com ['aho' in gap]> on Thursday December 16, 2004 @06:10PM (#11109801) Journal

    1. Receive message from your boss insisting you carry out some risky or unwise instructions.

    2. * Disaster *

    3. Boss disavows his earlier orders. Guess who is the fall guy?
    • Well, this isn't necessarily the case in your scenario because one could always argue that you wouldn't have done it had you not authenticated your boss in the first place.

      What it really protects you from is the case where, later, your boss forges messages and says he sent them TO you... or forges messages he says came FROM you; in either case you can claim that the conversation had already ended and it was indeed a forgery by someone else because the key was then openly available.

      It is a subtle, but impo
    • You would only use such systems in situations where both parties want total privacy. An office environment is not a place for this: your personal chats with friends are, as are research discussions with colleagues.
  • What possible beneficial effect does false repudiation offer? More ways for people to be assholes...
  • Suse?? (Score:3, Interesting)

    by pair-a-noyd ( 594371 ) on Thursday December 16, 2004 @08:32PM (#11111181)
    I tried to compile it on Suse 9.1 and it crapped all over itself.
    Anyone gotten it to compile/run on Suse 9.1?

  • GAIM Encryption (Score:3, Informative)

    by SKPhoton ( 683703 ) on Thursday December 16, 2004 @09:02PM (#11111455) Homepage
    GAIM already offers two encryption plugins. It's cool to see another implementation being created.

    gaim encryption [sourceforge.net] uses RSA. There's also gaim-e [sourceforge.net] which uses GPG.

    I've used gaim encryption and it works very well. It requires the plugin to be installed on both ends but once that's done, it autodetects that both ends support it and enables encryption.

    Oh, there's a binary available for Windows, and both source and packages for Linux.
    And, it's in portage!
    emerge gaim-encryption
  • by logicnazi ( 169418 ) <gerdesNO@SPAMinvariant.org> on Thursday December 16, 2004 @10:00PM (#11111958) Homepage
    Wow, that was an interesting and clever paper. At the very end, though, they consider the situation with email. In particular, they ask whether an encryption system is possible that works for an asynchronous medium like email but doesn't allow outsiders to prove authorship.

    The solution proposed is to use ring signatures, which only permit proof that one of the parties to the (secret) communication wrote the message. As the authors note, this solution still suffers from the defect that a third party who manages to obtain the plaintext of a message can prove that it was created by one of the participants. This can be partially protected against by encrypting the signature part of the message to the recipient (assuming the message itself was not already so encrypted), but if the recipient's private keys are ever compromised (a subpoena, confiscation of the computer by law enforcement), this protection vanishes.

    The authors contend that no system using a non-interactive protocol can both provide authentication to the parties involved and resist proof of authorship by at least one of the parties in the case of key compromise. I don't believe this is correct, and while I cannot provide a full system which demonstrates this property, I can provide a sketch of how one might work; it would be an intriguing problem to design a cryptographic system with these properties.

    Suppose at some time t0 Bob creates a public/private key pair together with a time stamp attesting to the time of creation. The time stamp, and the key itself, could be authenticated by Bob signing them with his conventional, non-repudiable, long-lived key. Let us call the key parts Public and Private. Suppose also that we can discover a one-way function S with an associated function P (not necessarily one-way) with the following property: if we apply the one-way function S to Private and the function P to Public, we create a new public/private key pair, i.e., S(Private) is the private key associated with public key P(Public). If we could find suitable functions we could design a cryptosystem with the requisite properties.

    Every time a fixed interval of time passes, say an hour, Bob applies the one-way function S to Private, storing the new result and forgetting the original key. Thus after one hour Bob has the key S(Private), after two hours S(S(Private)), and so forth. Now when Alice chooses to send Bob a message she chooses for what period of time Bob will be capable of authenticating that message. If she thinks he will read it immediately she might choose an hour; if he is out of town, perhaps a week. After composing the message Alice computes some sort of signature/authentication (ring signature, etc.). Now Alice computes the number of hours that will have passed between the creation time stamp of Bob's public key and the time her authentication period ends. She then applies the function P to Public once for every hour and uses the result to encrypt her signature. She appends the encrypted signature, and the unencrypted time it will expire, to the message and sends it to Bob. If the communication is to be secret she could then encrypt the entire message, authentication and all, with her favorite encryption scheme.

    So long as Bob receives the message from Alice before the authentication period has ended, he has no trouble decrypting the authenticating signature. Bob simply computes the number of hours from the current time until the authentication period ends, applies S to Private that many times (not forgetting the current value of Private in this case), and uses the result to decrypt Alice's authentication, since the properties of the functions guarantee this is the private key corresponding to the public key Alice used for encryption. Once decrypted, the signature authenticates Alice's message and is then discarded by Bob. (If a ring signature is used, Bob can recreate the same signature at any time if he has the message plaintext, so he has no incentive to keep the decrypted signature.)

    However, once the
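To illustrate just the algebraic shape the parent is asking of S and P, here is a toy Python sketch (entirely mine, and deliberately NOT one-way: doubling the exponent is trivially reversible, so it shows only the "update private and public keys in lock-step" mechanics, not the forward secrecy the parent actually needs).

```python
# Toy discrete-log key pair: private x, public Y = g^x mod p.
p = 2**127 - 1        # toy prime modulus
g = 3

def S(x):
    """Update the private key. (Invertible here, hence NOT one-way.)"""
    return (2 * x) % (p - 1)

def P(Y):
    """Matching update computable from the public key alone."""
    return pow(Y, 2, p)

x = 123456789
Y = pow(g, x, p)

# After n hourly updates, S^n(x) is still the private key for P^n(Y):
for _ in range(5):
    x, Y = S(x), P(Y)
    assert pow(g, x, p) == Y
```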
  • and watch the RIAA and MPAA literally EXPLODE!!!!
  • Shades of grey (Score:3, Interesting)

    by mcrbids ( 148650 ) on Friday December 17, 2004 @07:44AM (#11114591) Journal
    One of the things that's particularly endemic to the Slashdot community is the "black/white" point of view - the idea that something is secure, or not, it's white or it's black.

    But that's not how security is! It's all shades of grey, and the darker the shade of grey, the worse off things are.

    Nothing is ever bulletproof, and seldom is anything ever wide-ass open to the world. It's somewhere in between.

    I have a remote-desktop package integrated with one of my apps. It makes for very easy tech support, and I've got it built right into the menu system of my most popular application [effortlessis.com], so that customers using my software package have access to instantaneous, high-quality tech support.

    To prevent users from popping up on my development system anytime they have a question, I put a password in place. It requires a small, 4-digit numeric code, and it changes every day.

    By slashdot standards, this is terrible security. It's numeric. No letters, just numbers. The code changes every day, but only based on the day of year. It can easily be predicted, if one has any understanding of the underlying, otherwise very simple algorithm used to generate these numbers.

    Anybody with a packet sniffer could crack it with one support session.

    But, in this case, it really doesn't matter. The worst that will happen is that your computer's desktop will appear on my screen without my Windows VM.

    You could DOS me with 10,000 VM screens, but it would take a very short amount of time for me to block the port number for the VPN and kill that.

    So, what's the purpose for improving security? It's secure enough. And that's the point. Many people around here will have a cow if something is potentially crackable, while sitting behind physical locks that can be compromised with an expired credit card.

    Gosh! Somebody could pull out their credit card, slide it through the gap between the door and the jamb, and break into your home!

    In a black/white world, your home would only be considered safe if it had 1/4 inch steel plate exterior, and locks that the NSA would have serious trouble with.

    In the real (shades of grey) world, a deadbolt and a solid-core door is usually good enough, and people live with the odds. Heck, even in the worst ranked neighborhood, you have about a 3.5 to 4 percent chance of getting burgled in a given year. (http://www.ojp.usdoj.gov/bjs/glance/burg.htm) I almost never lock my back door, and I've never had a problem with it.

    That's good enough security for most, as evidenced by the fact that the most important issue was national security or "the war in Iraq" in the recent election. (http://www.rasmussenreports.com/Issue%20Clusters_Election%20Night.htm)

    Notice that individual household crime isn't even on the list (unless you include the 6% "domestic issues"), despite the relative insecurity of the average home.

    As Bruce Schneier's book "Secrets and Lies" brought home to me, this is not a black-and-white world. Relative risk must be evaluated, and the equation must be brought to something we can all live with.

    PS: Link to sites with A tags appears to be broken on slashdot. I tried numerous times to post links to the aforementioned sites and could not do so.
