
SSL and the Future of Authenticity

Posted by Soulskill
from the laying-off-alice-and-bob dept.
An anonymous reader writes "There has been a growing tide of support for replacing SSL's Certificate Authorities with an alternative authentication mechanism. Moxie Marlinspike, the security researcher who has repeatedly published attacks against SSL, has written an in-depth piece about the questions we should be asking as we move forward, and urges strong caution about adopting DNSSEC for this task."
  • The CAKE protocol (Score:4, Interesting)

    by Omnifarious (11933) * <eric-slashNO@SPAMomnifarious.org> on Monday April 11, 2011 @02:38PM (#35784282) Homepage Journal

    This, Diaspora, and personal interest by friends have gotten me interested in working on The CAKE Protocol [cakem.net] again. My goal is a Python reference implementation that can speak over TCP, email, and possibly IM.

    Last time I stalled out once I got a job. I also realized that the protocol design was flawed, and the API design for the internals was awkward. Also, I was all alone in a new city. I have friends who are interested now, which makes it easier. And maker spaces with people to talk to. When you have to work on something all by yourself it's hard to stay motivated.

    • Also, I see this as a chance to overhaul HTTP and a couple of other protocols so they have stronger authentication and easily implemented distributed caching.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      The "CAKE" protocol you say?

      Liar.

    • But if the protocol is still flawed, does the rest really help you?
      • I've revised the protocol spec, and I think version 2 is fairly good. It still has a problem with liveness, but this is due to the fact the protocol is very round-trip avoidant, and there are some mitigating strategies that can be adopted. Another problem is the lack of forward secrecy due to not using Diffie-Hellman for key negotiation and instead directly encrypting the keys with RSA. That can also be fixed in a later version without altering the protocol so significantly that users will have to modify th

        • by cjb658 (1235986)

          No matter how good SSL is, you can always just remove it with a tool like sslstrip (also developed by Moxie Marlinspike). I guess you could have banks and other web sites instruct the user to look for the lock icon, but like Moxie said in his Defcon talk, he tested it on hundreds of users by running a Tor exit node, and every one of them still logged in after being sent an unencrypted login page.
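The downgrade at the heart of an sslstrip-style attack is simple to sketch: a man in the middle rewrites the secure links in pages served to the victim, so the victim's browser never upgrades to HTTPS (the attacker keeps talking HTTPS upstream). A minimal illustration, not the actual tool:

```python
import re

def strip_https_links(html: str) -> str:
    """Rewrite https:// links to http:// so the victim stays on an
    unencrypted connection; a real MITM would remember which links it
    downgraded and proxy them over HTTPS to the origin server."""
    return re.sub(r'https://', 'http://', html)

page = '<a href="https://bank.example/login">Log in</a>'
print(strip_https_links(page))
# <a href="http://bank.example/login">Log in</a>
```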

          • No matter how good SSL is, you can always just remove it with a tool like sslstrip (also developed by Moxie Marlinspike). I guess you could have banks and other web sites instruct the user to look for the lock icon, but like Moxie said in his Defcon talk, he tested it on hundreds of users by running a Tor exit node, and every one of them still logged in after being sent an unencrypted login page.

            I don't understand how your comment is relevant to what I said. I'm not talking about SSL. I'm talking about the CAKE protocol [cakem.net].

          • Were they Apple users? The lock in Safari is ridiculously small and unobtrusive. It's nice that it's unobtrusive, but it would be better if it were a *little* more visible. It would also be nice if you could set it up to give an alert when the certificate changes, even if it's a "legitimate" change.

            I really do not like having to tell my mom, "Yeah it's pretty secure. Just look for the lock icon, and click it and review the certificate and make sure it points to the website you think you're accessin

        • One last flaw is that I'm not positive a 256-bit namespace is quite large enough to handle all names that will ever be generated given that the birthday paradox effectively halves this to 128-bits

          GUIDs are 128 bits. Have GUID collisions ever been a problem?

          • Also, IPv6 uses 128 bit addresses. So unless you expect more than one actual name per possible IPv6 address, this should be completely sufficient.

            • I'm thinking one name per gram of matter in the solar system with each getting a new key every 30 years or so. Would the whole thing scale up to a matryoshka brain without breaking? So yeah, I'm thinking really big and being a bit silly.

          • Yeah, I'm thinking really big and being a bit silly. I'm thinking one name per gram of matter in the solar system. Would the whole thing scale up to a matryoshka brain without breaking?
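The deliberately silly scenario above can be checked with back-of-envelope arithmetic (figures assumed here: solar-system mass of roughly 2x10^30 kg, names rekeyed every 30 years over 10 billion years):

```python
# Rough capacity check for a 256-bit name space, using assumed figures.
mass_grams = 2e33            # ~one solar mass, in grams
rekey_period_years = 30
lifetime_years = 10e9        # assumed: order of the Sun's lifetime
names = mass_grams * (lifetime_years / rekey_period_years)

birthday_bound = 2.0 ** 128  # collision comfort zone for 256-bit names
print(f"names generated: {names:.1e}")
print(f"birthday bound:  {birthday_bound:.1e}")
# Expected collisions scale like names^2 / 2^257; at this scale that
# ratio exceeds 1, so the grandparent's worry is not entirely absurd.
print(names > birthday_bound)  # True
```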

    • by PopeRatzo (965947) *

      Good luck. It's a worthy way to spend your energy and time.

      Where can I donate to support this project?

  • by Anonymous Coward

    Texting one time pads is the ONLY SOLUTION.

    • by Roogna (9643)

      I realize the comment was posted as a joke of sorts, but in the sense of taking it seriously... say you've got someone with AT&T-based Internet access and AT&T cell access. Then you're trusting the phone provider not to intercept and replace the one-time pad with their own in a MITM attack. While texted pads are probably a fair amount of security for, say, your WoW account, I wouldn't consider them secure for the communications that really matter. As the article says, when SSL was developed a lot

  • RTFA (Score:4, Informative)

    by Tigger's Pet (130655) on Monday April 11, 2011 @02:40PM (#35784304) Homepage

    I just hope that the many people who will post on here, with all their different opinions, will actually take the time to read the article first. I know that is asking a lot on /., but I can hope. Moxie Marlinspike (what a great name, by the way) has really done a great piece of work here, and it deserves to be read and digested before being critiqued.

    • While I agree with you about the problem, I have yet to see a fix that isn't worse in the transition. Short of heavy regulation of the certificate authorities... And I hate government regulation.
      • It shouldn't be government regulation. No one country has the right to try and regulate the Internet, even though both the US and the UK seem to think they do far too often.
        This is yet another reason why we (that's the global 'we', not just the few) need a totally independent Internet regulation authority, funded by every government, to oversee the whole thing. They could make decisions for the good of the Internet, not for the good of their own little corner of the world, or because industry is threateni

        • The problem would be (as it is with the UN) to get all the individual forces that invariably would compose the backing of that organization to cooperate.
        • by AmiMoJo (196126)

          Hahahahahahhahahahahahhahahahaha... aha... Wait, you were being serious?!

          You want a government funded institution made up of citizens of various nations to be completely independent and unmolested by political desires? Can you think of one example when something like that has worked?

    • It was a wonderfully terse piece of work. He's got a talent for writing. You won't be wasting your time, RTFA.
    • Re:RTFA (Score:5, Interesting)

      by Burdell (228580) on Monday April 11, 2011 @03:00PM (#35784556)

      I RTFAd, and a few things jump out at me:

      - Attacking GoDaddy's trust because Bob Parsons went hunting in Africa to help farmers. Way to bring politics into a supposed technical discussion.

      - Confusing management of the DNS root with domain takedowns done at the registrar level.

      - Repeated use of "forever", as if certificates don't expire (and protocols never change).

      I think DNSSEC could be used to augment SSL security. For example, sites could list valid key IDs in a DNSSEC-signed record. You still use CA-signed certs, but a rogue CA can't also edit your DNSSEC-signed record. It is also much easier to monitor DNS for somebody trying to change something.
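The augmentation described above is close to what later became DANE/TLSA: publish a digest of the server's certificate or key in a DNSSEC-signed record and have the client cross-check it. A minimal sketch of the client-side check only (record fetching and DNSSEC validation are omitted; names are illustrative):

```python
import hashlib

def cert_matches_dns_pins(cert_der: bytes, pinned_hashes: set[str]) -> bool:
    """Return True if the certificate's SHA-256 digest appears in the set
    of hashes retrieved from a (hypothetical) DNSSEC-signed record.
    A rogue CA could mint a technically valid cert, but could not also
    forge the domain holder's signed DNS record."""
    digest = hashlib.sha256(cert_der).hexdigest()
    return digest in pinned_hashes

cert = b"example DER bytes"
pins = {hashlib.sha256(cert).hexdigest()}
print(cert_matches_dns_pins(cert, pins))       # True
print(cert_matches_dns_pins(b"rogue cert", pins))  # False
```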

      • by DarkOx (621550)

        OK, many of your points are fair, but your issue with the word "forever" really isn't. Many of the root CA certs don't expire for something like 30 years! On an Internet that has only really existed that long, has only been public for a little under 20 years, and has had SSL for even less, 30 years is for all practical discussion the same thing as forever.

      • Re:RTFA (Score:4, Insightful)

        by Artraze (600366) on Monday April 11, 2011 @03:14PM (#35784700)

        I agree about the safari nonsense. Still, GoDaddy is a sleazy company that often seems to cater more to scammers and spammers than to people who just want a domain name.

        The domain takedowns bit is basically referring to the fact that ICANN is not untouchable. Practically, this isn't _too_ much different than the DHS having a trusted root certificate (as they're _probably_ the only ones that can manipulate ICANN). However, it does mean that you can't un-trust the DHS (and maybe Chinese) root certificates because the manipulation will be happening in the background. (Which isn't to say they can't/don't manipulate Verisign at the moment, but I hope you get the point.)

        "Forever" is a relative term. As far as I'm concerned, it means long enough to exploit a vulnerability. Say... a month? Certificates don't expire that fast, and protocols are glacial by comparison. We're still using SSL, after all, how long do you think it'll be until we replace its replacement? Forever. Maybe even literally.

        Those points made, I do agree that DNSSEC probably wouldn't hurt; the more independent sources of trust the better. Augmenting it further with a traditional web of trust would be even better too.

      • by yincrash (854885)
        Are you positive GoDaddy is being picked on because of the hunting thing? There are definitely many more reasons to not trust the security of GoDaddy.
        • Re:RTFA (Score:4, Informative)

          by Culture20 (968837) on Monday April 11, 2011 @06:06PM (#35786586)

          Are you positive GoDaddy is being picked on because of the hunting thing?

          I am. The link to the hunting is in a sentence denouncing GoDaddy's trustworthiness based on his personal trustworthiness (without other reasons cited).

          There are definitely many more reasons to not trust the security of GoDaddy.

          Would have been nice for TFA to state them. Sure, we here at /. know those reasons, but the populace at large doesn't. Most people think GoDaddy is a porn site.

      • by Raenex (947668)

        Attacking GoDaddy's trust because Bob Parsons went hunting in Africa to help farmers. Way to bring politics into a supposed technical discussion.

        I agree bringing in politics was a bad call. Then again, you're going to defend that dickhead CEO as "helping farmers"? Seriously, get real. He's just out to shoot some big elephants and give himself an erection. If he really wants to help farmers, he'd help them put up fences and use their wildlife for tourism.

      • by ekhben (628371)

        The two things that jumped out at me were that Moxie has made a faulty assumption on the trust model of DNSSEC, and that Moxie has made a faulty assumption on the trust model of web certification.

        Web certification is for relying parties to determine that a host is authorised to act on behalf of a domain holder.

        DNSSEC is for relying parties to eliminate the need to trust the distributed database of DNS.

        The question at the bottom of the article would lead to this if it were actually answered. Who do I

      • by Karellen (104380)

        Way to bring politics into a supposed technical discussion

        I thought the main point of his article is that deciding who to trust, how long to trust them, and what to trust them with, is a political, not a technical, problem.

    • It's a hard subject to write about. The article contains some interesting ideas, along with a fair amount of coarse hyperbole which, unfortunately, serves to make it less credible. And nobody who writes professionally should have trouble using words like "effect" and "proscribe" correctly.

      The thing is, readers take writing competence as a proxy for competence generally. It's not an unreasonable strategy. Any reader who's not already a subject matter expert is going to have to accept the writer's expe
      • It shouldn't (and doesn't) require proposal of a good idea in order to shit all over a bad one. Why do you want to attach an arbitrary bit of work to an already valuable service?

    • Oh, absolutely. You just have to get over his annoying, made-up-sounding name. I'll issue a thorough review of his ideas if and only if I can do it under the name Krypto McCypher.

      • by dash (363)
        You may publish under the pseudonym Krypto McCypher if you wish. You may also believe this hides your real identity. But some of us know the truth, Mr Clexler. Some of us are smarter than you think!
  • The main issue (Score:5, Insightful)

    by dev.null.matt (2020578) on Monday April 11, 2011 @02:53PM (#35784472)

    It seems like the real problem is that any good solution to this issue will, by necessity, require the user to make informed decisions about who to trust and who not to trust. Based on the state of non-technical scamming, the success of confidence men throughout history, and the fact that most people just want their browser to take them to whatever is linked off their friends' facebook pages, I can't see that this will ever be resolved. I mean, unless we decide to trust some body to make these decisions for people. Unfortunately, that pretty much brings us back to our current problem.

    That's the main problem I see with the author's notion of "trust agility": it requires action from Joe Sixpack users who just want their browser to work in the same way their TV does.

    • Re:The main issue (Score:4, Insightful)

      by jonbryce (703250) on Monday April 11, 2011 @03:06PM (#35784626) Homepage

      The problem is that any non-technical user is going to ask what buttons they need to press to get the website to work, and will then press them blindly no matter what.

    • by DarkOx (621550)

      Well, maybe we need to admit there might not be a solution. Sometimes you can't solve social problems with technical solutions, and what is more social than the concept of trust?

    • Re:The main issue (Score:4, Interesting)

      by increment1 (1722312) on Monday April 11, 2011 @03:58PM (#35785212)

      There is a reasonably straightforward technical solution, which could be implemented in a future SSL protocol, to resolve the issue of trust when you already have an account on a site. A host site can add the hash of your password to the symmetric key used after the key exchange, and your browser can then do the same on your side. This is essentially using a shared secret (the hash of your password) as part of the symmetric key. The result is that no one in the middle can intercept your communication even if they have compromised the certificate.

      Since most attacks are done on people who already have accounts, this is a decent improvement in security. It will not, however, prevent spoofing a site before you have an account on it, so extra precaution would need to be taken.

      The implementation of this protocol would require that when initiating an https session with the server, you need to input your account credentials to your browser (not posted to the host), which then uses them to establish the SSL session.
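The mixing step described above can be sketched in a few lines: fold a slow hash of the password into the key produced by the key exchange, on both sides. Everything here is illustrative (names, salt handling, iteration count), and a deployed design would use a PAKE such as SRP instead, to resist offline password guessing:

```python
import hashlib
import hmac

def mix_session_key(negotiated_key: bytes, password: str, salt: bytes) -> bytes:
    """Derive the final symmetric key from both the key-exchange output
    and a hash of the user's password. A MITM holding a rogue
    certificate still cannot compute this key without the password."""
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.new(pw_hash, negotiated_key, hashlib.sha256).digest()

server_key = mix_session_key(b"dh-shared-secret", "hunter2", b"per-user-salt")
client_key = mix_session_key(b"dh-shared-secret", "hunter2", b"per-user-salt")
print(server_key == client_key)  # True: both ends derive the same key
mitm_key = mix_session_key(b"dh-shared-secret", "wrong-guess", b"per-user-salt")
print(mitm_key == client_key)    # False: wrong password, wrong key
```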

      • by emt377 (610337)

        There is a reasonably straightforward technical solution, which could be implemented in a future SSL protocol, to resolve the issue of trust when you already have an account on a site. A host site can add the hash of your password to the symmetric key used after the key exchange, and your browser can then do the same on your side. This is essentially using a shared secret (the hash of your password) as part of the symmetric key.

        I like this. In addition, when logging in there's no reason to send the password, when it could be hashed with a random initialization vector. All the browser has to send is proof of knowledge of the password, not the password itself. The password only needs to be sent verbatim when it's changed and during registration. This overcomes a weakness in your proposal, namely that your scheme requires authentication to set up the connection; so it can't be used for the authentication itself. The site also ne

        • I like this. In addition, when logging in there's no reason to send the password, when it could be hashed with a random initialization vector. All the browser has to send is proof of knowledge of the password, not the password itself. The password only needs to be sent verbatim when it's changed and during registration. This overcomes a weakness in your proposal, namely that your scheme requires authentication to set up the connection; so it can't be used for the authentication itself. The site also needs to provide a challenge to prevent playbacks, I suppose that could be part of the IV.

          My intention in my original description was that your password is not sent to the site, it is used for authentication by verification of the establishment of the encrypted stream. So you would go to the https site (or maybe it would be httpss for super secure... ;-), and you would be prompted (by your browser, not the site) for your credentials. You input your username and password, and only your username gets sent to the site (via regular https or what have you), and then both the remote site and your br
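The proof-of-knowledge exchange sketched in this subthread amounts to a challenge-response: the server sends a random nonce, and the client returns an HMAC keyed by the password hash. This is a minimal illustration only; real protocols such as SCRAM add per-user salting, iterated hashing, and mutual authentication:

```python
import hashlib
import hmac
import os

def make_challenge() -> bytes:
    # The server sends a fresh random nonce, so a captured response
    # cannot be replayed later.
    return os.urandom(16)

def respond(password_hash: bytes, challenge: bytes) -> bytes:
    # The client proves knowledge of the password hash without ever
    # sending the password (or its hash) over the wire.
    return hmac.new(password_hash, challenge, hashlib.sha256).digest()

stored = hashlib.sha256(b"hunter2").digest()     # what the server keeps
challenge = make_challenge()
response = respond(hashlib.sha256(b"hunter2").digest(), challenge)
print(hmac.compare_digest(response, respond(stored, challenge)))  # True
```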

      • > There is a reasonably straight forward technical solution, that
        > could be implemented in a future SSL protocol, to resolve the
        > issue of trust when you already have an account on a site. A host
        > site can add the hash of your password to the symmetric key
        > used after the key exchange, your browser can then do the same
        > on your side.
        > This is essentially using a a shared secret (the hash of your
        > password) as part of the symmetric key. The result is that no one
        > in the middle can

      • This already exists, it's a standardised part of TLS called TLS-SRP and TLS-PSK. No browser implements it, because it would make PKI look bad and CAs redundant if implemented.

    • by Artraze (600366)

      You've hit the nail on the head: security isn't easy. It requires some effort, knowledge, and concern to implement correctly. For security minded individuals, all this has, in many ways, been solved for a while now:
      http://en.wikipedia.org/wiki/Web_of_trust [wikipedia.org]
      The problem is that establishing and maintaining trust requires a basic understanding of what's happening and some effort. It's not hard, but Joe Sixpack doesn't want to learn anything. It's hard enough to ask them to make sure there's a little

      • by Nursie (632944)

        Why should I trust a web of trust?

        I trust my friends, and maybe their judgement in friends. Beyond that, why would I trust anyone to verify anything?

        The CA system for HTTPS is hopelessly broken, this much is clear, but I genuinely don't get how a web of trust is any better. An extended group of people vouching for each other is not my idea of trustworthy either.

    • If the system is designed such that anyone can choose who they are going to trust, then people who can't make that decision on their own can still rely on others, such as the browser vendors, to make good default decisions for them. As it is now, it is unfeasible for either individuals or browser vendors to stop trusting large CAs due to the disruption it would cause.

  • by MobyDisk (75490) on Monday April 11, 2011 @02:53PM (#35784480) Homepage

    I tried to click the link, but my employer blocks thoughtcrime.org.

  • Just as soon as we figure out how to get every browser on every platform updated to support the new standard before it goes live.
    • Re: (Score:3, Informative)

      by Anonymous Coward

      The idea isn't to replace SSL, just the authenticity mechanism the browsers employ. Most of what's on the table allows browsers to use the new system and old system simultaneously, with a "both must pass" or "either can pass" setting. So it's not the transition that is difficult.

  • Every SSL web site I care about requires me to log in. Why not just make mutual password knowledge part of the SSL handshake and be done with it? Then even if a TLA or someone from a convention in Vegas decides to take over the world, at least the billions of people who have already established trust won't have to worry.

    There is still a problem of initial trust when establishing an account but by punting that to the edge people would be better able to make their own decisions. Is SSL good enough? Or would

    • > Every SSL web site I care about requires me to login. Why not just
      > make mutual password knowledge part of the SSL handshake and
      > be done with it?

      Or a random challenge encrypted with your GPG key. Advantage is, you could use that for every such site.

    • Every SSL web site I care about requires me to login. Why not just make mutual password knowledge part of the SSL handshake and be done with it?

      Because that would make CAs and PKI redundant, so no browser vendor will even consider it.

  • by GameboyRMH (1153867) <gameboyrmh@NoSpAM.gmail.com> on Monday April 11, 2011 @03:07PM (#35784632) Journal

    Why not switch to self-signed certs + a notary system like Perspectives? [networknotary.org] It would at least be an improvement on today's situation, since there would be no need for CAs and there would be some MITM prevention built into the system.

    • by Terrasque (796014)

      Agreed. Perspectives is the best solution I've seen for this problem so far.

      The only problem I see with it at the moment is either if

      1. An attacker control all your internet access from before you install it
      2. An attacker control the majority of the Perspectives servers.

      But I don't see either of those as impassable problems. Number 1 can be solved by a trusted install (e.g. from a CD-ROM), and number 2 can be solved by crowdsourcing (and some cleverness, like requiring the servers to be in different subnets / countries -

      • by tepples (727027)

        1. An attacker control all your internet access from before you install it
        2. An attacker control the majority of the Perspectives servers.

        3. An attacker controls the SSL server's connection to the Internet. This is likely in countries with less protection of speech.

  • by NicknamesAreStupid (1040118) on Monday April 11, 2011 @03:09PM (#35784652)
    Like many things that are too hard to grasp or solve at a technical level, people tend to shift focus to something more discernible. In this case, it is the fact that millions of people who are a part of SSL need to be persuaded to change. This is a "the devil you know versus the devil you don't know" choice. Since most of the unknown devils are of the same paradigm as the known devil, the status quo will remain. True, there may be a catastrophe that frightens everyone into adopting whatever seems like a safe harbor at the moment. Barring that, it will probably take a new paradigm, perhaps a transparent network that can converge very, very fast, making most catastrophic exploits ephemerally limited in scope. True, that would be far from perfect, but then, what is close to perfect?
  • I don't have a big problem with the way the chain of trust works. I have software on my computer that allows me to manage the certificates that I trust. That way, I can decide for myself. Since I don't actually want to bother to do so, I defer to my operating system vendor's judgment. They provide a package containing a list of trusted certificates, which I then use. I can have as much or as little control as I want. I think this is a good system.

    What I do have a problem with is the fact that many applications will use cleartext connections without complaint, but give ominous warnings when using TLS with self-signed certificates. Sure, self-signed certificates don't provide authentication, but neither do clear connections. With TLS, at least I get encryption. This should be a step up in security. At the very least, security is no worse than without TLS.

    I am OK with a warning being shown the first time I connect to a service with a non-trusted SSL certificate, but I feel applications should take a page from SSH here: give a warning that isn't too ominous, and offer the chance to save the public key. Then, next time I connect, if the key matches, go right ahead without a warning. And shout if the key does not match. This should provide good security if the first contact is uncompromised. Importantly, it matches the scariness of the warnings with the risk of the situation.
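The SSH-style "trust on first use" behaviour described above is easy to sketch. A real implementation would persist the pins to disk, the way SSH keeps known_hosts:

```python
import hashlib

known_hosts: dict[str, str] = {}  # hostname -> pinned certificate digest

def check_tofu(host: str, cert_der: bytes) -> str:
    """First contact: pin the certificate and warn only mildly.
    Later contacts: stay silent on a match, shout on a change."""
    digest = hashlib.sha256(cert_der).hexdigest()
    pinned = known_hosts.get(host)
    if pinned is None:
        known_hosts[host] = digest
        return "first-use: pinned"
    return "ok" if pinned == digest else "ALERT: certificate changed!"

print(check_tofu("example.com", b"cert-v1"))  # first-use: pinned
print(check_tofu("example.com", b"cert-v1"))  # ok
print(check_tofu("example.com", b"cert-v2"))  # ALERT: certificate changed!
```

This matches the scariness of the warning to the risk: only a changed key, not a merely unknown one, triggers the loud alert.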

    • by DarkOx (621550)

      I have software on my computer that allows me to manage the certificates that I trust. That way, I can decide for myself. Since I don't actually want to bother to do so, I defer to my operating system vendor's judgment.

      What choice do you have other than deferring to your operating system or browser vendor's judgment? I think you are telling yourself something to avoid a feeling of helplessness. I know you can remove Comodo and such now that they have had a breach, but what can you really do about the others? Do you have the resources to audit their practices? Once you remove a CA, how do you establish trust with any sites that use them as their authority? Do you call customer service and get them to read a thumbprint to you

    • by AmiMoJo (196126)

      The messages are warning you that a supposedly trustworthy server might not be what it claims to be. Unencrypted connections are by their nature untrustworthy so unless you want a warning whenever one is opened (hint: that would be about 20 if you open Google's search results in Chrome) there seems little point in reminding you of that fact.

      The real problem is not about you, it is about the organisations that rely on SSL like banks. If users get scary security warnings they won't want to use online banking.

    • I feel applications should take a page from SSH here: give a warning that isn't too ominous, and offer the chance to save the public key.

      Which doesn't help if there is a man in the middle from day one. This has happened in the wild (https://bugzilla.mozilla.org/show_bug.cgi?id=460374).

  • by Animats (122034) on Monday April 11, 2011 @03:53PM (#35785128) Homepage

    Certificate Authorities issue "Relying Party Agreements", which specify their obligations to users relying on their certificates. Some of these specify financial penalties payable to end users. Over the years, as with EULAs, these have been made so favorable to the CAs as to make them meaningless. (See, for example, Verisign's relying party agreement. [thawte.com] Or, worse, the one from Starfield, GoDaddy's CA. [godaddy.com])

    Now it's time to push back.

    The Mozilla Foundation should issue a tough standard for CA Relying Party Agreements as a condition for getting a root cert into Mozilla: one that makes CAs financially responsible for false certs they issue, with a minimum liability of at least $100,000. The CA must be required to post a bond. A third-party consumer-oriented organization like BEUC (in the EU) or Consumers Union (in the US), not the CA, must decide claims.

    The technology behind SSL is fine. The problem is allowing CAs that aren't doing due diligence on their customers to have root certificates in major browsers. Mozilla all by itself has enough power to tighten up standards in this area. All it takes is the will.

    • I'm not sure Mozilla all by itself has anywhere near that level of power, but I absolutely stand behind the concept--right now, the only thing the CAs have to lose is their reputation, as if that matters at all. Hey, remember that time you stood around with your friends talking about how Verisign will just give a certificate out to any Joe Schmoe?

      The only thing a corporation cares about is money, so if you want me to trust you, put your money on the line. I trust that you'll do everything necessary to prote

  • A real system to take care of this issue is possible.

    There's a number of problems here, and each one needs to be addressed, but luckily none of them are really all that contradictory. I'll detail a practical system:

    Problem 1: A random user needs to be able to connect into the verification system easily.
    Solution: Handle this in the same way DNS is handled. When setting up your connection settings (IP address, DNS, etc), include a new setting such as 'SSL Verification Address Book' (henceforth referred to as

    • by RyanZA (2039006)

      Replying to my own comment, but...

      I realized after posting that this isn't required in internet settings at all, obviously.

      This system could be implemented as a Firefox plugin (without the DHCP part). Closer to how bit torrent works, you could install the plugin, then browse to your chosen SVAB's website and download their SVAB server's address and key into your plugin. This does leave you with transferring the key once over HTTP - but it could also be transferred via disk if required, etc. I'd also say tha

  • I sure wish people would stop saying "SSL" when what they mean is "https".

    SSL is fine. Other protocols based on SSL (notably, SSH) are fine.

    The problem is that https was designed badly, uses SSL in a grossly insecure manner, and is therefore fundamentally broken.

    If the protocol were designed securely, the browser would check that the server cert presented either is the *same* cert that was presented last time the site was visited or at least is signed by the *same* authority cert as signed the previous one.
    • by Nursie (632944)

      Came here to say this. Shame most people won't get the difference.

      The problem is with the trust infrastructure around HTTPS. SSL the protocol is not at issue here.

      Moxie Marlinspike is clearly a great security analyst and a pretty good writer, but it would be nice if he had mentioned the difference. SSL has far wider uses than the web, many of which make no use of public trust infrastructure, do not use http (so do not suffer from SSLStrip attacks), are limited to good ciphersuites that don't use out of date

    • In addition, if a new certificate is used, it could additionally be signed with the previous certificate to signify that it's a valid successor. Then when presented with a new, unknown certificate for a known site, the browser could verify the "chain signature" with the known previous certificate, and thus verify that the site is the same. This would additionally catch the case where someone buys a domain (and thus can legally get a certificate for it), but uses it for a completely different service.
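The "chain signature" idea above can be sketched with the third-party `cryptography` package. Ed25519 keys stand in for the certificates' key pairs, and a plain byte string stands in for the replacement certificate; a real scheme would define exactly which fields of the new certificate get signed:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The site's old key signs the new certificate, marking it a successor.
old_key = Ed25519PrivateKey.generate()
new_cert_bytes = b"DER bytes of the replacement certificate"
succession_sig = old_key.sign(new_cert_bytes)

# A browser that pinned the old certificate verifies the succession
# before silently accepting the new one.
try:
    old_key.public_key().verify(succession_sig, new_cert_bytes)
    print("valid successor")
except InvalidSignature:
    print("unknown certificate: warn the user")
```

A domain squatter who buys the name later can get a CA-signed certificate, but cannot produce this signature, which is what catches the "same domain, different service" case.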

  • by bradgoodman (964302) on Monday April 11, 2011 @09:48PM (#35788470) Homepage
    I do like the PGP/GPG model as well, but my belief is that it could be extended more into the "real-world".

    For example - if I get a (GPG Encrypted) Email from a RedHat employee, how do I know it's real? Simple. RedHat employees put their GPG Fingerprints on their physical-old-school business cards.

    If businesses took this approach too, it would work.

    For example, why doesn't my bank print their SSL fingerprint (assuming they have those - not really sure) on my credit card, letterhead, and other "official" materials?

    Taking this kind of approach literally takes the CA out of the equation. I trust the certificate because I know it's valid - not because someone else says it is.
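The fingerprint the parent wants printed on a card is just a digest of the certificate, formatted for humans. A sketch, with placeholder bytes standing in for the real DER-encoded certificate:

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint in the colon-separated form browsers usually
    display, short and stable enough to print on paper and compare by eye."""
    digest = hashlib.sha256(cert_der).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

print(fingerprint(b"bank certificate DER bytes"))
```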

    P.S. With a name like Moxie Marlinspike - it seems like he should have a handlebar mustache.

    • "The Monkeysphere project's goal is to extend OpenPGP's web of trust to new areas of the Internet to help us securely identify servers we connect to, as well as each other while we work online. The suite of Monkeysphere utilities provides a framework to transparently leverage the web of trust for authentication of TLS/SSL communications through the normal use of tools you are familiar with, such as your web browser or secure shell." See: http://web.monkeysphere.info/ [monkeysphere.info]
