Encryption Security

Modern History of Cryptography Techniques 204

Heather writes "The encryption scheme you rely on today might be full of holes just a few years down the road. Learn how far we've come in the last few decades, and why your apps need to be ready for change. This article builds on a previous article about Enigma, Germany's WWII-era encryption system."
  • What? (Score:5, Funny)

    by Anonymous Coward on Wednesday August 17, 2005 @11:10AM (#13339293)
    Why can I never understand articles about cryptography?

    They always seem to be written in a way that makes them incomprehensible.
    • Re:What? (Score:5, Insightful)

      by Anonymous Coward on Wednesday August 17, 2005 @11:14AM (#13339343)
      Cryptography is pretty heavily math-centric. To truly love cryptography over and above the obvious social factors and coolness level of being able to hide stuff, you really need to be somewhat of an academic math geek. Academia speaks a completely different language than real people. It's a hazard of living in dark hallways and not getting out much to meet the human race.
    • Re:What? (Score:3, Funny)

      by Anonymous Coward
      OK, let me be clear.
      Alice ("A") writes a story about Cryptography ("C") to the IBM Developerworks site ("D"). Bob ("B") is watching over her shoulder, essentially intercepting the photons ("-->P")
      Bob, then submits the story to the Slashdot ("S")
      Anonymous Coward Reader ("R") is confused because they are written in a way (encrypted "+*+") to make them incomprehensible.

      So essentially, you are saying you don't understand:
      R ?= A(D)--->P(B)+*+"--->S
      What could be simpler than that???
  • by Raistlin77 ( 754120 ) on Wednesday August 17, 2005 @11:11AM (#13339315)
    The encryption scheme you rely on today might be full of holes just a few years down the road.

    If it will be full of holes just a few years down the road, wouldn't it then be correct to say it's full of holes now?!
    • by rholliday ( 754515 ) on Wednesday August 17, 2005 @11:14AM (#13339349) Homepage Journal
      I suppose technically that's correct. But, "The encryption scheme you use today has holes in it, and the tools will get small enough to go through those holes just a few years down the road." just doesn't quite roll off the tongue. :)
    • by bentcd ( 690786 ) <bcd@pvv.org> on Wednesday August 17, 2005 @11:24AM (#13339449) Homepage
      Not if you only intended for the protection to last a couple of years.
      One of the key decisions to make when choosing an encryption scheme is for how long the information is to be protected. If the answer is "until release date", then you can often get away with a very low-end encryption scheme. If the answer is "forever", then go for a one-time pad and it'll be secure until doomsday. Of course, a one-time pad is considerably more expensive in terms of administration, but as is so often the case, you get what you pay for :-)
    • Yes, of course that's true.

      I suspect all current encryption schemes have weaknesses that will be exploitable in the future. It's just a question of when. Is the NSA still 10 years ahead of the state of the art?

          Civilian crypto has come a long way and that's a good thing.

        Rich...
    • No no. Encryption is like a pair of socks. Just like socks, encryption wears out over time and develops holes. The holes can be fixed by darning, er, patching, but they aren't ever as comfy as before.

  • by karvind ( 833059 ) <karvind.gmail@com> on Wednesday August 17, 2005 @11:12AM (#13339329) Journal
  • At some point, decryption techniques will evolve to translate it to something cool.

  • by william_w_bush ( 817571 ) on Wednesday August 17, 2005 @11:16AM (#13339373)
    so... great, but why aren't most tcp streams encrypted by default? the client side load is negligible, and there is a lot of acceleration available server-side. Even relatively simple encryption would make me feel better about those voip calls I'm essentially sending in the clear over a public network.

    The net is a very public network, and especially considering how many protocols are plaintext, cheap encryption (preferably in hardware) seems like it should be required. It's past the proof of concept stage; just having it work at all isn't enough anymore.
    • by Anonymous Coward
      so... great, but why aren't most tcp streams encrypted by default?

      Because IPsec is the One True Way of doing IP encryption, and IPsec is basically unusable for opportunistic encryption.

      There are lots of encryption options out there if you look for them. Protocols like email (OpenPGP and SMTP over SSL), IM (numerous IM encryption options, ranging from crap to decent), and obviously HTTP have encryption already standard and built into common implementations.
    • So what sort of encryption would you place on a multicast stream? It is the job of the applications involved to determine if encryption is required, not the networks.
    • by qwijibo ( 101731 ) on Wednesday August 17, 2005 @11:37AM (#13339577)
      As has been mentioned, it's the job of the application to determine whether or not encryption is necessary and what type. There is no one size fits all solution that could be implemented at the network layer without creating more problems than it solves. If you're sending financial transaction information, the additional time to encrypt and sign is worthwhile. It takes time to encrypt and decrypt data. For VOIP, that may be considered an unnecessary and unacceptable inconvenience. However, from an application development standpoint, not offering the user that choice is pretty lame.

      Another reason for not having a default level of encryption at the network layer is that it takes a long time to get everyone to upgrade. Poor encryption can be worse than none in the sense that non-security-geeks don't know the difference and may assume that their connections are secure. It's better to start with the assumption that they are insecure and if that is not acceptable, mitigate against that risk with an appropriate level of encryption in the application.
      • err, yes, but your argument assumes that people believe voip and similar apps are secure now? haven't heard vonage blast on its latest marketing campaign "oh yeah, btw don't do banking by phone on this, because you can get screwed."

        if nobody knows how is it more secure than bad encryption?

        otherwise i'd agree.
    • ``so... great, but why aren't most tcp streams encrypted by default?''

      Because there is really no need to. I don't need to have all the public webpages I request to be sent to me over an encrypted link. Nor the publicly accessible ISO images I download. Nor the files I access over NFS. Etc. Encryption is there when I need it, but I don't need to burden myself, my computer, and the whole network infrastructure with it when I don't need it.

      ``the client side load is negligable''

      I really don't agree with that. T
    • so... great, but why aren't most tcp streams encrypted by default?
      Isn't IPv6 supposed to have some kind of embedded encryption scheme? I remember reading about it somewhere.
  • Premise is nonsense (Score:5, Informative)

    by Paul Crowley ( 837 ) on Wednesday August 17, 2005 @11:19AM (#13339396) Homepage Journal
    DES was *not* considered "uncrackable" when it was launched. In fact, cryptographers such as Michael Wiener warned that the key was too short and described the dangers of a hardware-based key cracker practically as soon as it was announced.

    The history of cryptography is not simply one of algorithms thought uncrackable being cracked. It is one of consistent refinement of our understanding and technique, but to imagine that the history of DES means we'll be breaking open 256-bit AES-encrypted messages in a few years is delusion.
    • by JUSTONEMORELATTE ( 584508 ) on Wednesday August 17, 2005 @11:28AM (#13339502) Homepage
      FWIW, DES was effectively broken by Evi Nemeth (at CU Boulder) using a paired-primes database and an all-software solution. There was no hardware-based key cracker; there was an algorithm that took a ton of cycles to generate the db, then a simple bit of lookup code to decrypt the ciphertext.
      IIRC, when she demonstrated it, they decrypted something like 5,000 passwords from a nearby /etc/passwd file in less than a minute on a Sun3.
      She made a point of telling us that the NSA has a copy of her work and her database.
      • Any chance you can cite that? I've been looking (primarily because I wanted to know when). The closest thing I've found is this [securitydigest.org]

        The email I'm referring to is down a little ways.

        the break is in the diffie hellman key exchange for des based on 127 bits. it was done quite a while ago, solving the discrete log problem for the field 2 ** 127 -1. the work was with ron mullin at the university of waterloo. the actual implementation of the algorithms was done on the denelcor hep supercomputer (since defunct)

        • Sorry that I don't have a citation -- I wasn't at the conference -- she told the story to me and other students of her 'Special Topics in Computer Networking' class in 1991, maybe 1992.
          I googled a bit, and found several pages with the same short bio text about her:

          ... Nemeth is best known for originally identifying inadequacies in and breaking the "Diffie-Hellman trap," the mathematical basis for a large portion of modern network cryptography.

          Hopefully that helps you in your googling.

        • I'm trying to track down the paper, but not having much luck. It looks like you're going to be looking for a paper:

          Mullin, R., Nemeth, E. and Weidenhofer, N., "Will Public Key Crypto Systems Live up to Their Expectations? HEP Implementation of the Discrete Log Codebreaker", Proc. of the 1984 Intl Conf on Parallel Processing, Aug. 21-24, 1984, pp. 193-196.


          This information from Evi Nemeth's bio [caida.org] page. Evi appears to have an e-mail address (on the same page)--you could try contacting her directly.
      • What does DES have to do with paired-primes, btw? From my basic understanding of DES, there is nothing other than XOR and bit twiddling in the DES algorithm, and I don't see the connection with prime numbers.
      • by Anonymous Coward
        This does not make sense. Using paired primes to attack DES? DES isn't based on primes; it is based on shuffling bits around iteratively (a Feistel network). Decrypting password files from an /etc/passwd file? In that context, DES is used in hash mode, and passwords *don't* decrypt - because multiple passwords end up with the *same* hash.

        Also, searching reveals Evi Nemeth talking about implementing a break of a DES key exchange using Diffie-Hellman: Date: Fri, 30 Oct 87 19:32:32 MST From: evi@bould

      • The most effective attacks on DES are brute force, linear cryptanalysis, and the improved davies attack (a form of differential cryptanalysis). This talk of paired primes is confused nonsense, probably to do with some sort of dictionary-based attack on Unix passwords, which is a different but related problem. It sounds like she might be using Hellman's time/space tradeoff.
    • Not only did academic cryptographers criticize the key length, the US government forbade using DES to protect classified information.
      • DES was approved for use in encrypting sensitive, but unclassified, information like personnel records. NSA's policy at the time was that only algorithms designed by the NSA were approved for use in encrypting classified information, and only when implemented in an NSA approved device. Skipjack was the first public algorithm that was approved for protecting (lower level) classified information.
    • Yeah, I was thinking the same thing. I remember learning in my Cryptography class back in school that DES wasn't considered very secure by many cryptographers back in the 80's and how surprising it really is that it took so long to come up with another standard. In fact, the known insecurity of DES is the reason that people started doing DES multiple times: 3DES. Here is some more information on 3DES: http://en.wikipedia.org/wiki/3DES [wikipedia.org] and http://kingkong.me.berkeley.edu/~kenneth/courses/sims250/des.html [berkeley.edu]

    • I think we need to make the point that there's a difference between a flaw in the encryption algorithm and the length of a key. Any code is crackable if you have enough time to generate every single possible key. As time passes, machines get faster and doing a brute-force attack on a 56-bit DES key doesn't look like a massive problem any more. If the algorithm is broken, it's effectively a shortcut to finding the key without having to try every permutation.
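      The key-length arithmetic behind that point is easy to check. A back-of-the-envelope sketch (assuming a hypothetical cracker testing one billion keys per second; the numbers are illustrative, not a claim about any real machine):

      ```python
      # Rough worst-case brute-force cost for different key lengths,
      # assuming a hypothetical machine testing 1e9 keys per second.
      KEYS_PER_SECOND = 1e9
      SECONDS_PER_YEAR = 3600 * 24 * 365

      def years_to_search(bits):
          """Years to try every key of the given length."""
          return (2 ** bits) / KEYS_PER_SECOND / SECONDS_PER_YEAR

      for bits in (40, 56, 128, 256):
          print(f"{bits:3d}-bit key: {years_to_search(bits):.3g} years")
      ```

      Under that assumption a 56-bit DES key falls in a couple of years, while a 128-bit key still takes on the order of 10^22 years - which is why a broken algorithm (a shortcut past the search) matters so much more than raw speedups.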
      • "Any code is crackable if you have enough time to generate every single possible key."

        That's not quite true. In the case of a one-time pad (OTP), properly-encrypted plaintext will remain hidden forever as long as certain requirements are met (the pad is never used twice, the media containing the pad is not physically captured by the enemy, the pad was sufficiently random to begin with, etc.).

        Now, I know some lamer is going to come up and say, "That's not true! Given enough time, SOMEONE will eventuall
  • AES Far from Secure (Score:3, Interesting)

    by generationxyu ( 630468 ) on Wednesday August 17, 2005 @11:22AM (#13339427) Homepage
    TFA mentions using AES, TDES, or RSA as alternatives to DES. He also says, "...the final AES standard is estimated to require a current cryptanalysis system 149 trillion years to decrypt." That may be true for direct-channel cryptanalysis, but side-channel attacks such as cache timings against most implementations of AES can guess the key given known plaintext, known ciphertext, and at least estimated timings for encryption.

    Read more: http://cr.yp.to/antiforgery/cachetiming-20050414.pdf [cr.yp.to]
  • "why your apps need to be ready for change"

    Hmmm...I didn't see that part, just another Crypto 101 thing with a pitch for some hardware gizmo at the end. Is there another article that should be linked in?

  • by Paul Crowley ( 837 ) on Wednesday August 17, 2005 @11:24AM (#13339448) Homepage Journal
    Actually, reading on, it looks like the author really doesn't have a clue. At one point he suggests using RSA in place of DES. Even most Slashdot readers know that in practice, when you use RSA for encryption, you use it in conjunction with a symmetric encryption algorithm.

    IBM has considerable cryptographic expertise; it's a shame none of it was brought to bear on this article.
    • Holy factual errors, Batman!

      In addition to the errors you rightly point out, TFA repeatedly mischaracterizes the machine the EFF built to crack DES. In the sidebar, the author refers to an "accelerator card used in a standard PC." Later on in the article, he refers to the system as using FPGA's to crack DES in 3 hours. The EFF's machine, described in their very good book, was comprised of several racks of custom-built boards with ASIC's, not FPGA's, controlled by one PC. Though that PC was certainly "

    • by Conare ( 442798 ) on Wednesday August 17, 2005 @11:52AM (#13339709) Journal
      Agreed. In addition, I like this from TFA:
      New standards are emerging from NIST, including AES (Advanced Encryption Standard) and TDES (Triple DES).
      Once again, even most Slashdot readers know that TDES has finished emerging from NIST and is in the process of being obsoleted by AES, which also emerged from NIST long ago.

      It is also interesting to note the bias they give PGP here. Basically, there are two good asymmetric key distribution schemes in the world: PGP and PKI.

      PGP is very useful if you have a small group or feel you can rely on out of band mechanisms for key distribution. For example, if I have been talking to you on the phone, and say I am going to email you my public key, you can be pretty sure it came from me when it arrives a little later.

      In a large organization though, key distribution is more problematic, and this is where PKI excels. For example, if I receive a message that purports to be from the CIO telling me to install a patch, how can I be sure it is really him and not some random dude(ette)? Ah! Because the certificate that contains his public key is digitally signed by an entity that I trust (because they told me that I will trust it when I took the job). PGP is good for dealing with people you know personally or have met in some fashion. PKI is good for dealing with both people you have met personally, and people that you have not met, but need to be able to exchange secure communication with anyway.

      On the other hand PGP is free.
      • by pclminion ( 145572 ) on Wednesday August 17, 2005 @01:16PM (#13340607)
        It is also interesting to note the bias they give PGP here. Basically, there are two good asymmetric key distribution schemes in the world: PGP and PKI.

        PKI just means "public key infrastructure" and can refer to any method for managing and exchanging public keys. X.509 certificates and the entire framework of trusted authorities surrounding them are just one implementation of a PKI. PGP is another, more simplistic implementation.

        So you can't really compare PGP, which is a specific application, to PKI, which is just a broad term for key management infrastructures.

        And what about "PKI" (in the sense you seem to mean it) isn't free? OpenSSL can do everything with certificates that you'd ever want to do.

        • PKI in an effective trust architecture is much harder to do. Sure, you can create your own structure for free using OpenSSL, but why should I trust you? More importantly, why should I trust your brother's uncle's cousin's son's former roommate's key, which has a trust relationship all the way back through that chain with you?

          Compare this with the trust architecture built with assistance from a VeriSign or an Entrust. They have a known trust level based on established practices which filter down through t
          • PKI in an effective trust architecture is much harder to do. Sure, you can create your own structure for free using OpenSSL, but why should I trust you? More importantly, why should I trust your brother's uncle's cousin's son's former roommate's key, which has a trust relationship all the way back through that chain with you?

            That wasn't at all the sort of model I was suggesting. Imagine a group of family and friends who want to securely email each other. They select one individual, probably the most resp

      • ...NIST seems to be currently doing a lot of work on encryption modes (how the blocks are linked together). A lot of current modes (ECB, for example) are weak and make it easy to extract some information from encrypted files.


        (At least, they were in the middle of doing studies on modes the last time I looked, and I've not heard anything on new mode standards since.)
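        Why ECB is the weak mode is easy to demonstrate: a block cipher in ECB encrypts each block independently, so identical plaintext blocks produce identical ciphertext blocks, and repetition in the file shows through. A toy sketch (the 8-byte XOR "cipher" below is a stand-in for a real block cipher; the leak is a property of the mode, not of whatever cipher sits underneath):

        ```python
        # Toy illustration of ECB's weakness: equal plaintext blocks
        # always map to equal ciphertext blocks.
        KEY = bytes.fromhex("0f1e2d3c4b5a6978")  # arbitrary toy key

        def toy_encrypt_block(block):
            # Stand-in for a real 8-byte block cipher.
            return bytes(b ^ k for b, k in zip(block, KEY))

        def ecb_encrypt(data):
            assert len(data) % 8 == 0
            return b"".join(toy_encrypt_block(data[i:i + 8])
                            for i in range(0, len(data), 8))

        plaintext = b"ATTACK!!" * 3 + b"RETREAT!"  # 3 equal blocks + 1 different
        ct = ecb_encrypt(plaintext)
        blocks = [ct[i:i + 8] for i in range(0, len(ct), 8)]
        print(blocks[0] == blocks[1] == blocks[2])  # True: repetition leaks
        print(blocks[0] == blocks[3])               # False
        ```

        Chained modes (CBC, CTR, etc.) mix in per-block state precisely so this pattern disappears.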

    • Whitfield Diffie and Martin Hellman devised a scheme to safely transport a public key with a message, allowing this public key to be paired with a private key and then used to decrypt private data. This Diffie-Hellman key exchange protocol is used, for example, by RSA and PGP.
      That's close to being so bad that it's not even wrong.

      The linked article is of negative value and you're better off skipping it.

    • Actually, reading on, it looks like the author really doesn't have a clue. At one point he suggests using RSA in place of DES. Even most Slashdot readers know that in practice, when you use RSA for encryption, you use it in conjunction with a symmetric encryption algorithm.

      Exactly, and this because the asymmetric part (RSA) is very slow compared to a symmetric algorithm. So we use the asymmetric part only to perform a key agreement protocol, in other words to agree on a new key to be used in the followi
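      That hybrid pattern can be sketched with textbook numbers (totally insecure, illustration only - real RSA moduli are hundreds of digits and real systems use padding; the toy XOR cipher below just stands in for the fast symmetric algorithm):

      ```python
      # Hybrid encryption sketch: RSA transports a small symmetric
      # session key; the bulk data uses the fast symmetric cipher.
      # Classic toy RSA parameters: p=61, q=53 -> n=3233, e=17, d=2753.
      n, e, d = 3233, 17, 2753           # public (n, e), private d

      session_key = 65                    # symmetric key to transport
      wrapped = pow(session_key, e, n)    # sender: RSA-encrypt the key
      unwrapped = pow(wrapped, d, n)      # receiver: RSA-decrypt it
      assert unwrapped == session_key

      def toy_stream_cipher(data, key):
          # XOR stand-in for the real symmetric cipher (AES, 3DES, ...).
          return bytes(b ^ key for b in data)

      ct = toy_stream_cipher(b"the actual message", unwrapped)
      print(toy_stream_cipher(ct, session_key))  # round-trips to the plaintext
      ```

      Only the tiny session key ever goes through the slow asymmetric operation; everything else moves at symmetric-cipher speed.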

      • The thing that should really kill RSA is that it has no advantages whatsoever over Rabin besides being slightly simpler, and it has several important disadvantages, particularly the absence of a provable relationship with factoring.
    • His series of articles are an attempt to sell cryptographic co-processors.

      If one were interested in the history of cryptography, one would read The Codebreakers [amazon.com], by David Kahn (very thick book, yet very interesting). Or, if one were interested in how to integrate cryptography into business processes, one would probably have read Secrets and Lies [schneier.com], by Bruce Schneier.

      ...when you use RSA for encryption, you use it in conjunction with a symmetric encryption algorithm.

      One does that only because public-key algo

  • by RouterSlayer ( 229806 ) on Wednesday August 17, 2005 @11:26AM (#13339468)
    I see tons of articles, but no one talks about "IDEA" any more.

    from my research so far it hasn't been cracked. it was a European standard, so I guess it's not favorable in the US or North America.

    it's still my favorite. and maybe it enjoys a bit of "security through obscurity" these days. But I'd really like to know.

    and oh, if you're going to say it was cracked, please provide reliable references with links.

    Seriously, I'd really like to know.
    • IDEA is patented (Score:5, Informative)

      by crimethinker ( 721591 ) on Wednesday August 17, 2005 @11:38AM (#13339591)
      It is my understanding that IDEA is patented (how this is even possible to patent a sequence of mathematical operations is a topic for another flamewar^Wdiscussion) and the holders of that patent wanted royalties. PGP used IDEA originally, but GnuPG wouldn't touch it for the royalty issue, and it eventually fell out of favour as other ciphers with 128-bit and larger keys became more widely available, e.g. Blowfish, Twofish, Serpent, Rijndael (AES), etc.

      -paul

    • From Bruce Schneier's website, the paper about Blowfish [schneier.com]: "Many of the other unbroken algorithms in the literature--Khufu [11,12], REDOC II [2,23, 20], and IDEA [7,8,9]--are protected by patents." From the Wikipedia [wikipedia.org] article, Bruce Schneier is again quoted, "In my opinion, it is the best and most secure block algorithm available to the public at this time." (Applied Cryptography, 2nd ed.) However, by 1999 he was no longer recommending IDEA due to the availability of faster algorithms, some progress in its cry
  • by sixpaw ( 648825 ) on Wednesday August 17, 2005 @11:26AM (#13339471)
    The article has no discussion of truly modern encryption schemes (their description stops at RSA/PGP and they don't even go into any details); it has no discussion of why modern schemes are considered more secure than DES, no discussion of what might make them less secure (i.e., no mention of factoring/discrete logs as the root 'hard problems' behind current crypto) and no discussion of what's on the horizon in terms of things like quantum cryptography.

    On the other hand, it does go into cheerful detail on why IBM's Exciting New Coprocessor (r) is the right solution for your enterprise encryption needs!

    I know IBM are the 'Good Guys' and all, but that doesn't make advertising for them (especially in the form of a front-page slashdot article) any more palatable than advertising for anyone else...
  • HA! (Score:5, Funny)

    by MosesJones ( 55544 ) on Wednesday August 17, 2005 @11:28AM (#13339494) Homepage

    I just used MD5 as my encryption mechanism and the files will NEVER be recovered.

    This "joke" such as it is was based on a real world experience where the "smart" IT chap at a company I helped had in his words...

    "Tried a number of different compression and encryption approaches and MD5 consistently gave the smallest files"

    I asked if they had ever done a recovery, and strangely they had not... it was fun watching them try.
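    The recovery was doomed from the start, of course: MD5 is a one-way digest, not compression. A quick sketch of why - every input, whatever its size, maps to the same fixed 16 bytes, so the original data is unrecoverable and countless inputs share each digest:

    ```python
    import hashlib

    # MD5 always produces a 16-byte digest, regardless of input size.
    # "Great compression rates" -- and no way back.
    small = hashlib.md5(b"hi").digest()
    large = hashlib.md5(b"x" * 10_000_000).digest()
    print(len(small), len(large))  # 16 16
    ```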
    • Re:HA! (Score:5, Funny)

      by utopianfiat ( 774016 ) on Wednesday August 17, 2005 @11:32AM (#13339546) Journal
      "Oh my god, MD5 ate the files!"
      "WHAT?"
      "It just finished digesting!"

      Thank you, I'll be here all week.
    • He obviously didn't do too much testing. Otherwise he would not have missed the great compression rates of the following code:
      #! /bin/sh
      rm -f $1 $1.compressed
      touch $1.compressed
      Try it, the compressed files are all of size 0!
      And even better: If you replace -f by -rf, you even can compress whole directories in one go!
  • Lal!! (Score:2, Funny)

    by Datamonstar ( 845886 )
    Jrr! V whfg YBIR pelcgbtencul!
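    (The comment above is ROT13; for anyone without a decoder ring, a one-liner recovers it:)

    ```python
    import codecs

    # ROT13 is its own inverse -- decoding the parent comment:
    print(codecs.decode("Jrr! V whfg YBIR pelcgbtencul!", "rot13"))
    # -> Wee! I just LOVE cryptography!
    ```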
  • by CdBee ( 742846 ) on Wednesday August 17, 2005 @11:32AM (#13339543)
    Fiction, but still good:

    Neal Stephenson - Cryptonomicon [amazon.com]

    Then to explain how Enoch Root lives so long, you'll need to read

    Neal Stephenson - The Baroque Cycle Trilogy [wikipedia.org]
  • by blair1q ( 305137 ) on Wednesday August 17, 2005 @11:45AM (#13339650) Journal
    One-time pad (OTP [google.com]) is the only "unbreakable" encryption.

    The rest are algorithmic, and therefore susceptible to decryption by algorithmic attacks. Decryption of them is a matter of being clued to the nature of the algorithm, and perhaps in possession of the knowledge of a secret constant with which the decryption algorithm can be generated. And once the constant is guessed, all messages based on it are decrypted.

    The only ways to decipher OTP-encrypted messages are to physically access the encryption or decryption pads, or steal the cleartext before it's encrypted or after it's decrypted.

    (Note: since VENONA [google.com] was not used only once, it's not actually OTP.)
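    Both halves of that comment - the unbreakability and the VENONA caveat - fit in a few lines. A minimal sketch (toy message lengths; a real pad must be truly random and as long as all traffic it ever protects): XORing with a random pad is the whole cipher, and reusing the pad cancels it out entirely, leaving the XOR of the two plaintexts with no key material left.

    ```python
    import os

    # One-time pad: XOR the message with a truly random pad of equal length.
    def otp(data, pad):
        return bytes(a ^ b for a, b in zip(data, pad))

    msg1 = b"ATTACK AT DAWN"
    msg2 = b"RETREAT AT SIX"     # same length, for the demo
    pad = os.urandom(len(msg1))  # fresh random pad

    ct1 = otp(msg1, pad)
    ct2 = otp(msg2, pad)         # the VENONA mistake: pad reused

    assert otp(ct1, pad) == msg1  # decryption is the same XOR

    # Reuse leaks: XORing the two ciphertexts cancels the pad,
    # exposing msg1 XOR msg2 with no randomness protecting it.
    assert otp(ct1, ct2) == otp(msg1, msg2)
    ```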
    • OTP and (modern) algorithmic encryption are actually pretty close in quality: the weakest known point for both is transmission of the key (that is, in both cases a well-funded attacker will find it easier to steal your key than to break your encryption).

      That could change for algorithmic encryption, though.

      But on the other hand, OTP is typically so much more vulnerable to key stealing that you might really be better off using algorithmic.

      There's also quantum encryption for unbreakable secure transfer of
    • One-time pad (OTP) is the only "unbreakable" encryption.

      Properly used, that is true. The Russians used this system during the Cold War. Unfortunately, every now and then, a cryptographer would re-use a pad even though procedures dictated never to do so. It wasn't until Russian spies in the US government reported this to the KGB that they enforced the procedures. It goes to show that sometimes the weakest link involves the humans.

    • The rest are algorithmic, and therefore susceptible to decryption by algorithmic attacks. Decryption of them is a matter of being clued to the nature of the algorithm, and perhaps in possession of the knowledge of a secret constant with which the decryption algorithm can be generated. And once the constant is guessed, all messages based on it are decrypted.

      The problem with OTP, is that if you need to encrypt a gig of data, you need a gig of key bytes. A human cannot remember a gig of key bytes, so it mus

    • One-time pad (OTP) is the only "unbreakable" encryption.

      True, but incomplete -- under the right circumstances, even an OTP can be broken.

      To ensure that an OTP is unbreakable, you not only have to ensure that the key is used only once, but you also have to ensure that the key is completely unpredictable. This means starting with a truly random source, and ensuring against introducing bias in sampling that random source.

      The problem with this is that most random sources have relatively low bandwidth.

  • by ScaryMonkey ( 886119 ) on Wednesday August 17, 2005 @11:47AM (#13339678)
    The most fascinating thing to me in the history of WWII encryption is not Enigma (which was pretty cool) but what the Americans used in the Pacific war: the Navajo language. By sending messages in Navajo they utterly confounded the Japanese, who have never been slack in the figuring-things-out department. Goes to show how much stranger of a code our own language is, when we think about it.
    • by Trurl's Machine ( 651488 ) on Wednesday August 17, 2005 @12:28PM (#13340096) Journal
      The most fascinating thing to me in the history of WWII encryption is not Enigma (which was pretty cool) but what the Americans used in the Pacific war: the Navajo language.

      There's some interesting parallel here. Pre-WWII Polish cryptography (its less known success was breaking the Soviet codes during the war of 1920 [wikipedia.org] - Polish victory helped to save the entire Western world from communism) was so strong thanks to the polyethnic character of Polish culture. It was not really difficult to find bilingual Polish mathematicians, fluent in the language of the enemy, be it Russia or Germany. Pre-WWII Japan was - and to some extent, still is - a very closed society, with little interest in the world outside. It was difficult to find anyone with any interest in other cultures or languages, let alone anyone truly bilingual, among the Japanese mathematicians. In code breaking, victory belongs to the open - not just open algorithms, but also open-minded, open societies. This is also why I think that right now, the Western world needs MORE interest in Islamic cultures and MORE attempts to understand them - if not for any better reason, just for better decryption of intercepted messages.
      • by Anonymous Coward
        The success of using Navajo wasn't so much due to Japan being a closed society; it was because there were no Navajo speakers outside the US at all, and the language had no alphabet and had never been written down. On top of all that, they spoke in coded ways that didn't even make sense to untrained Navajo speakers.

        I can guarantee you that the Polish would have been just as stymied by the Navajo "Code-talkers" as the Japanese were.

        • by Trurl's Machine ( 651488 ) on Wednesday August 17, 2005 @02:19PM (#13341163) Journal
          The success of using Navajo wasn't so much due to Japan being a closed society; it was because there were no Navajo speakers outside the US at all, .

          But there were anthropologists, researchers, people who studied the Navajo language, etc. Japan's "closedness" resulted in comparatively low interest in anthropology in general - while in pre-WWII European countries, including Poland, there were people studying alien cultures just for the sake of interest in otherness as such. There are no native Nambikwara speakers outside Brazil, but in case of war between Brazil and France, French code breakers could break the "Nambikwara code" thanks to the work done on Nambikwara by Claude Levi-Strauss. The point is that there were no Levi-Strausses in Japan.
    • It's called security by obscurity and is generally considered not cool.
      • It's called security by obscurity and is generally considered not cool.

        Consider though; there's nothing obscure about a guard with a gun standing by a door, and most people would agree that that door is pretty secure. Imagine if behind that door was a key to unlock a vault someplace that contained a whole lot of money - the fact that the key shape is 'obscure' is secondary to the fact that to discover the key shape you're likely to risk being shot at.

        My understanding was that the Navajo code talkers were al

  • All this shows is (Score:4, Interesting)

    by el_womble ( 779715 ) on Wednesday August 17, 2005 @11:59AM (#13339765) Homepage
    how useless popular comms software is. Why should I have to register with Verisign to send an encrypted email to my girlfriend, co-workers, etc.? Why can't I just click a button and generate a random 128-bit key set and use PGP?

    Why isn't this standard? A better question is, why can I send a MIME-encoded attachment anywhere, but not a PGP-encoded plain text email? Imagine the spam you could filter if you had a list of the PGP keys of all your friends and family. Imagine if they moved to a new email address, but their PGP key stayed the same.

    If this is because Zimmermann wants his 2 cents (which I can't blame him for), can't it be included in the cost of Windows and Macs, and let the rest of us download it for free? We need authenticatable (if there is such a word) emails, IMs, etc. yesterday. We have the technology!
      how useless popular comms software is. Why should I have to register with Verisign to send an encrypted email to my girlfriend, co-workers, etc.? Why can't I just click a button and generate a random 128-bit key set and use PGP?

      What, pray tell, is wrong with the S/MIME standard? Nobody says your certificate has to be signed by Verisign. Jeez, set up your own Trusted Authority for your own group of friends and contacts, and self-sign your certificates.

      Who the hell would use PGP...
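The self-signed route the parent describes can be sketched with OpenSSL. This is a minimal illustration, not a hardened setup; the file names, subject, and email address are placeholders:

```shell
# Generate a private key and a self-signed certificate in one step --
# no Verisign involved; you act as your own trusted authority.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout alice-key.pem -out alice-cert.pem \
    -days 365 -subj "/CN=Alice/emailAddress=alice@example.org"

# Inspect the certificate to confirm the subject you just set.
openssl x509 -in alice-cert.pem -noout -subject
```

Friends who trust each other can exchange and import these certificates directly, skipping any commercial CA.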

    • PGP Freeware has been widely available for years. You can also use GnuPG, in many ways a de facto replacement. (There do exist GUI front-ends for it, even on Windows.)

      Verisign deals in x.509 certs, not PGP keys. You can generate your own if you like, but it's much more awkward. Personally, I dislike x.509 certs, not so much because of the technology (it's fundamentally not much different from PGP) but because every implementation I've seen is a pain to work with (e.g. doesn't let you change email address
    • Alright, don't take this as an insult, but your post is very uninformed. I don't know what's wrong with mods these days that they modded it up so high, whereas more deserving posts have been modded flamebait and offtopic.

      Anyway, you had one good point:

      ``how useless popular comms software is.''

      I fully agree with you there. PGP has existed for a loooong time. Spam filters have existed for a long time, too. Both can easily be used with mutt. I also believe the Mozilla mail client does a good job here. How long
      • I'm sorry to pick on your reply, but it seemed to be the one that aggravated me the most.

        I understand the point of a trusted third party. What I also understand is that, as you proved to yourself, you just don't need one. As long as you generated the keys yourself, for a large enough key space, the chances of anyone having, or calculating your key pair approaches 0. This is about the same as someone intercepting the transmission of my private key to a trusted third party, but much, much lower than the chance
        • Ok, maybe I misjudged your post and you understand very well, but that's not the impression I get.

          ``I understand the point of a trusted third party. What I also understand is that, as you proved to yourself, you just don't need one. As long as you generated the keys yourself, for a large enough key space, the chances of anyone having, or calculating your key pair approaches 0.''

          The issue is not whether someone can or can't generate a key pair identical to yours. The issue is that when you receive a message
    • We need authenticatable (if there is such a word) emails, IMs etc yesterday.

      Sure, fine, we all know that. But the practical question is how do you start up such a communication in the first place? In other words, how do two parties share a secret without communicating it, and thereby risking its exposure?

      The answer is not to share a secret, but instead to use an asymmetric key pair and only communicate the public key. But this only solves part of the problem. Now you can communicate in secret, but y
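The asymmetric scheme the parent describes can be sketched with OpenSSL. A minimal demonstration, assuming a standard openssl install; "Alice", "Bob", and the file names are placeholders:

```shell
# Bob generates a key pair and publishes only the public half.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out bob-private.pem
openssl pkey -in bob-private.pem -pubout -out bob-public.pem

# Alice encrypts a short message with Bob's PUBLIC key --
# no shared secret ever travels over the wire.
echo "meet at noon" > msg.txt
openssl pkeyutl -encrypt -pubin -inkey bob-public.pem -in msg.txt -out msg.enc

# Only Bob's PRIVATE key can recover the plaintext.
openssl pkeyutl -decrypt -inkey bob-private.pem -in msg.enc -out msg.dec
cat msg.dec
```

What this doesn't solve is the remaining half of the problem the parent raises: how Alice knows the public key she fetched really belongs to Bob, which is where signing and webs of trust come in.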

  • humans are better (Score:2, Interesting)

    bWbhy blbeave bibt btbo bab bcbomputer bwbhen bab bhbuman bcban bdbo bab bbbetter bjbob?
  • Simon Singh [wikipedia.org]'s Code Book [simonsingh.com] covers the history of encryption pretty extensively, starting from Caesar's time. Enigma and others are covered very well.
    The encryption methods are covered in layman's terms (I think!).
  • Is it time to increase the default keysize in GPG?

    Currently, the default key generation method in GPG is to create a 1024 bit DSA master key and Elgamal subkeys. The GNU Privacy Handbook admits [gnupg.org] that a key size of 1024 bits is "not especially good given today's factoring technology."

    If the authors of GPG know that 1024 bits is not a good key length for an asymmetric cipher, why not set the default length for the master key at 2048 bits? If that would require switching to RSA as the default signing al
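For what it's worth, you don't have to wait for the defaults to change: GnuPG's unattended ("batch") key generation accepts a parameter file that sets the sizes explicitly. A sketch along these lines (exact directives vary by GnuPG version; name and email are placeholders, and it would be fed to gpg --batch --gen-key):

```
Key-Type: RSA
Key-Length: 2048
Subkey-Type: RSA
Subkey-Length: 2048
Name-Real: Alice Example
Name-Email: alice@example.org
Expire-Date: 1y
%commit
```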
