Encryption

W3C Releases First Working Draft of Web Crypto API

From David Dahl's weblog: "Good news! With a lot of hard work – I want to tip my hat to Ryan Sleevi at Google – the W3C Web Crypto API First Public Working Draft has been published. If you have an interest in cryptography or DOM APIs and especially an interest in crypto-in-the-DOM, please read the draft and forward any commentary to the comments mailing list: public-webcrypto-comments@w3.org" This should be helpful in implementing the Cryptocat vision. Features include a secure random number generator, key generation and management primitives, and cipher primitives. The use cases section suggests multi-factor auth, protected document exchange, and secure (from the) cloud storage: "When storing data with remote service providers, users may wish to protect the confidentiality of their documents and data prior to uploading them. The Web Cryptography API allows an application to have a user select a private or secret key, to either derive encryption keys from the selected key or to directly encrypt documents using this key, and then to upload the transformed/encrypted data to the service provider using existing APIs." Update: 09/19 00:01 GMT by U L : daviddahl commented: "I have built a working extension that provides 'window.mozCrypto', which does SHA2 hash, RSA keygen, public key crypto and RSA signature/verification, see: https://addons.mozilla.org/en-US/firefox/addon/domcrypt/ and source: https://github.com/daviddahl/domcrypt I plan on updating the extension once the Draft is more settled (after a first round of commentary & iteration)"
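
To make the "encrypt before upload" use case concrete, here is a rough sketch written against the promise-based crypto.subtle surface as it later stabilized (the 2012 draft exposed a different, event-based interface); the endpoint URL, iteration count, and field names are illustrative only.

    async function encryptAndUpload(file, password) {
      var salt = crypto.getRandomValues(new Uint8Array(16));
      var iv = crypto.getRandomValues(new Uint8Array(12));

      // Derive an AES key from a user-supplied secret instead of handling raw key bytes in script.
      var baseKey = await crypto.subtle.importKey(
        "raw", new TextEncoder().encode(password), "PBKDF2", false, ["deriveKey"]);
      var aesKey = await crypto.subtle.deriveKey(
        { name: "PBKDF2", salt: salt, iterations: 100000, hash: "SHA-256" },
        baseKey, { name: "AES-GCM", length: 256 }, false, ["encrypt"]);

      // Encrypt locally; the service provider only ever receives ciphertext.
      var ciphertext = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv: iv }, aesKey, await file.arrayBuffer());

      // Hypothetical upload endpoint; the salt and IV are not secret and travel with the data.
      return fetch("https://storage.example/upload", {
        method: "POST",
        body: new Blob([salt, iv, new Uint8Array(ciphertext)])
      });
    }
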
  • Anyone know which browsers and httpds are planning to support this soon? WebKit?

  • Eventually, I would hope there's a backend server the app is talking to.
    Why can't the server do it? Thin client, fat client... Why are you making my client fat, sir?
    • Re:So, Why? (Score:5, Informative)

      by DragonWriter ( 970822 ) on Tuesday September 18, 2012 @11:24AM (#41375445)

      Why can't the server do it?

      If the server does the encryption, then the server has to see the unencrypted content.

      If the client does the encryption, the server doesn't have to see the unencrypted content.

      Also, if the server does the work and you have a million clients, then the server has to do a million units of work rather than the clients each doing one unit of work. This can make the server more impacted by traffic spikes and provide a less-consistent and sometimes lower-quality user experience than just offloading that work to the client.

      Alternatively, it's more expensive (more CPU = more $$) for the server operator, who often owns the app. So there's an incentive to build apps in a way that the work is offloaded.

    • The server can't decrypt the page for you. That would eliminate the entire point of encrypting the page in the first place. The server encrypts the page and gives it to you, then your browser decrypts it using this interface specified by the W3C. Are you actually dumb, or is that just a hobby?

        • That's SSL, and it has been around for 17 years. This API is useful for other stuff, like data that never leaves the client unencrypted.

        • by Lennie ( 16154 )

          This is more about encrypting your data in the client and storing the data encrypted on the server, so when a server compromise happens your data can't be easily stolen.

  • by InvisibleClergy ( 1430277 ) on Tuesday September 18, 2012 @11:21AM (#41375379)

    It was because NearlyFreeSpeech doesn't support HTTPS, and I wanted to implement some sort of encryption. So, I figured that my server could encrypt pagelets and send them, and then the client could use a previously-established key to decrypt the pagelets, attaching them to the DOM structure in a logical way. The problem is, since JavaScript explicitly disallows XSS, I couldn't figure out a way to contact a separate key authority server. This meant that however I did it, I'd be (more) vulnerable to a man-in-the-middle attack.

    Looking this over, it looks like this specification doesn't solve that issue. I know that key authorities can be compromised, but it's better to require two points of failure rather than one.

    • The problem is, since JavaScript explicitly disallows XSS, I couldn't figure out a way to contact a separate key authority server.

      JavaScript doesn't "explicitly disallow XSS". Most browsers (through implementations of the still-in-draft Content Security Policy, and, for IE, additionally through its own "XSS filter") include means of restricting XSS, but those browsers also allow pages to control whether and how those XSS-limiting features are applied.

      • There are also XSS-allow headers you can send in the HTTP response, called "HTTP access control" (CORS), and you can also get around it with things like JSONP.
        • Expanding on that JSONP mention to hopefully save someone a googling...

          XMLHttpRequest calls are subject to the same-origin policy, so they can't be used for XSS. However, SCRIPT tags don't have this restriction, even for tags that are dynamically created using JavaScript. JSONP is a trick that leverages this to implement XSS.

          The main limitation is that JSONP can't be used to call non-JSONP web services. So changes to the third-party service may be needed in order to support JSONP.

          As an aside, dynamically loading i
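
          A minimal JSONP sketch of the trick described above; the endpoint and callback name are made up for illustration:

          function fetchViaJsonp(url, callbackName, onData) {
            // The remote service must cooperate by wrapping its JSON in a call
            // to our callback, e.g. it responds with:  handleKey({"key": "..."})
            window[callbackName] = function (data) {
              delete window[callbackName];
              document.head.removeChild(script);
              onData(data);
            };
            // SCRIPT elements are not bound by the same-origin policy, even when created dynamically.
            var script = document.createElement("script");
            script.src = url + "?callback=" + encodeURIComponent(callbackName);
            document.head.appendChild(script);
          }

          // Usage against a hypothetical key-authority endpoint:
          fetchViaJsonp("https://keys.example/lookup", "handleKey", function (data) {
            console.log("got key id:", data.key);
          });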

      • Oh, derp. I don't know what I was thinking, then.

    • by dgatwood ( 11270 )

      Uh... JavaScript has allowed cross-site XHR for going on four years now. It does, however, require appropriate configuration on the server you're contacting. The bigger problem with that design is that if your web hosting server doesn't support HTTPS, how will the third-party server handing out authentication tokens set the token on the server side?

      No, this is better handled through a DH key exchange. Then both sides have a shared symmetric key, and both sides can store it locally (with client-side stora
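
      For what it's worth, a sketch of that DH idea using the crypto.subtle surface and syntax as they later stabilized (ECDH was not specified in this form in the 2012 draft); the function name is illustrative:

      async function sharedKeyWith(peerPublicKey) {
        var mine = await crypto.subtle.generateKey(
          { name: "ECDH", namedCurve: "P-256" }, false, ["deriveKey"]);
        // Both sides derive the same AES key from their own private key plus the peer's public key.
        var shared = await crypto.subtle.deriveKey(
          { name: "ECDH", public: peerPublicKey },
          mine.privateKey,
          { name: "AES-GCM", length: 256 },
          false,                              // non-extractable: raw bytes never exposed to script
          ["encrypt", "decrypt"]);
        // Note: without authenticating the public keys, this exchange is still open to MitM.
        return { myPublicKey: mine.publicKey, sharedKey: shared };
      }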

        • I was thinking about this originally in January of 2011, and I think I remember finding people mentioning XHR, but nothing beyond scant mentions. No good "what is this and how do I do it" documentation. I was originally thinking of a DH key exchange, but that requires you either to store it each session, which means each session is vulnerable to MitM, or to use HTML5 things that were not widespread two years ago.

        • by dgatwood ( 11270 )

          Right. HTML5 local storage is a fairly recent addition. You might also have been able to use a cookie with the "secure" flag set, which means the cookie is sent only over HTTPS connections, but AFAIK can be accessed in JavaScript code locally. I'm not certain whether such cookies are accessible through JavaScript that arrived over unencrypted HTTP, though, so that might not work.

          Regarding cross-origin XHR, it's pretty straightforward. It works just like regular XHR. The only difference is that the ser
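
          A sketch of what that cross-origin XHR looks like in practice; the third-party URL is hypothetical, and the remote server has to opt in with a response header along the lines of "Access-Control-Allow-Origin: https://your-site.example":

          var xhr = new XMLHttpRequest();
          xhr.open("GET", "https://auth.example/token");   // hypothetical token service
          xhr.onload = function () {
            if (xhr.status === 200) {
              console.log("token:", xhr.responseText);
            }
          };
          xhr.onerror = function () {
            // Fired (with no detail exposed) when the CORS check fails.
            console.log("cross-origin request blocked or failed");
          };
          xhr.send();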

          • by dkf ( 304284 )

            Right. HTML5 local storage is a fairly recent addition. You might also have been able to use a cookie with the "secure" flag set, which means the cookie is sent only over HTTPS connections, but AFAIK can be accessed in JavaScript code locally. I'm not certain whether such cookies are accessible through JavaScript that arrived over unencrypted HTTP, though, so that might not work.

            You're supposed to be able to mark a cookie as being unavailable to JavaScript (well, as being only for use with HTTP connections; secure transport of the cookie is an orthogonal attribute), but that's something I wouldn't rely on working, and it's also easy to disrupt from JS; there's nothing to stop any cookie from being overwritten with something else. Cookies aren't designed for deep security.
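
            A small illustration of both points, assuming a hypothetical session cookie; the Set-Cookie line is an HTTP response header, shown here only as a comment:

            // Hypothetical server response header (not JavaScript):
            //   Set-Cookie: session=abc123; Secure; HttpOnly
            // With HttpOnly set, the cookie never shows up to script:
            console.log(document.cookie);        // "session" is absent here
            // ...while any cookie that is visible to script can simply be clobbered:
            document.cookie = "prefs=attacker-chosen; path=/";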

    • by Riskable ( 19437 )

      Tip: WebSockets don't have any cross-origin limitations. You can connect to anything from anywhere.

      So there you go. Now you can make that key authority server :D
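
      A sketch of that idea; the URL and message format are invented for illustration. (Strictly speaking, the authority still sees the browser's Origin header and can refuse the connection, but there is no same-origin block on the client side.)

      var ws = new WebSocket("wss://keyauthority.example/keys");
      ws.onopen = function () {
        ws.send(JSON.stringify({ op: "getPublicKey", user: "alice" }));
      };
      ws.onmessage = function (event) {
        var reply = JSON.parse(event.data);
        console.log("key from authority:", reply.publicKey);
      };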

  • Microsoft releases a less secure copy of the W3C Web Crypto API, already implemented in Internet Explorer 10 and called SecureXaml, while citing the changes as "features".

  • by Anonymous Coward

    They are never, ever good. Just more stupid crap that stresses me out and makes me tired.

  • Douglas Crockford tends to disagree [crockford.com]. And he's not alone [matasano.com].

    How do they mitigate these inherent security problems of the JavaScript platform in the API draft? With XSS, I can always overwrite the browser's crypto API object, replacing it with a rogue implementation.

    My understanding has been that JavaScript in its present form is not a viable platform for cryptography.

    • by Lennie ( 16154 )

      I believe both Firefox and Chrome have support for:

      http://www.w3.org/TR/CSP/ [w3.org]

      which allows more control over where code can be loaded from.

      Actually, I think having crypto as part of the browser has a bigger chance of success than just implementing the crypto in JavaScript, as some people clearly have already done. You don't want to implement a cryptographically secure pseudorandom number generator in JavaScript; it will never be secure.
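
      For illustration, a CSP policy is just a response header (shown as a comment below, header name and sources are examples only), and the browser-provided CSPRNG was already exposed to script at the time:

      // Hypothetical CSP response header (not JavaScript):
      //   Content-Security-Policy: default-src 'self'; script-src 'self'
      // Browser-provided CSPRNG, shipping in Firefox and Chrome:
      var bytes = new Uint8Array(16);
      window.crypto.getRandomValues(bytes);   // filled by the browser, not by JS math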

      • CSP will be a huge help in reducing attack vectors. Another benefit is that key material is unavailable to the DOM: current JS libraries have no way to make key references opaque and truly hide the private and secret key material from the DOM. This spec lets the application only ever reference keys by ID instead of handling the actual key material.
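
        A sketch of that, using the promise-based shape the API later took; with extractable set to false, script only ever holds an opaque handle:

        crypto.subtle.generateKey(
          { name: "AES-GCM", length: 256 },
          false,                                // not extractable: exportKey() on it will reject
          ["encrypt", "decrypt"]
        ).then(function (key) {
          // 'key' is an opaque CryptoKey reference; the raw bytes are never exposed to the page.
          console.log(key.type, key.extractable);   // "secret" false
        });
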
    • This. Providing proper crypto primitives in the JS standard library is a good thing, I suppose, but it doesn't solve any (and I do mean any) of the underlying problems with things like CryptoCat. CC actually had quite good crypto primitives (implemented from scratch in JS, but apparently implemented well).

      The problem was that every time a user wanted to use CC, they had to download the page (and its JS) from the CC server... and there are so many ways to attack that. An obvious one is to insert a backdoor in

      • That's the point of this API, I imagine. If it is included in the browser then there is nothing to intercept and replace. Also, it can have some privileged status where its methods can't be overwritten by other scripts.
        • Um... no. No part of any of the attacks I described requires any interception or replacement of the crypto code (I thought I made that clear). You're still going to have to serve a webpage though. In fact, in order to use these new crypto functions, you're going to have to serve script as well.

          I (the attacker, whether via MitM or server control or some other mechanism) can modify that to my heart's content. I don't even have to modify the existing scripts; just inject my own that captures every keystroke se

  • Why not use MashSSL? That seems like a simpler and better solution.
  • Why PKCS#1v1.5? (Score:4, Interesting)

    by cryptizard ( 2629853 ) on Tuesday September 18, 2012 @12:13PM (#41376093)
    The API has two padding modes for RSA, PKCS#1v1.5 and OAEP. OAEP is provably secure. That is, if the underlying scheme (RSA) is a secure public key cipher, then RSA combined with OAEP is a semantically secure encryption scheme that is resistant to chosen-plaintext attacks. On the other hand, not only is PKCS#1v1.5 not provably secure, it has been known [jhu.edu] for years [cryptograp...eering.com] to be vulnerable to real world attacks.

    Most of the time when you see people using it today it is for backwards compatibility, but in this case they are designing a brand new API. Why not go with the one which we know to be secure instead of encouraging the use of a dangerously vulnerable scheme?
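
    For comparison, selecting OAEP is just a matter of naming it when generating and using the key; this sketch uses the API shape that later shipped and illustrative parameters:

    crypto.subtle.generateKey(
      { name: "RSA-OAEP",
        modulusLength: 2048,
        publicExponent: new Uint8Array([1, 0, 1]),   // 65537
        hash: "SHA-256" },
      false, ["encrypt", "decrypt"]
    ).then(function (keyPair) {
      var plaintext = new TextEncoder().encode("secret");
      return crypto.subtle.encrypt({ name: "RSA-OAEP" }, keyPair.publicKey, plaintext);
    }).then(function (ciphertext) {
      console.log("RSA-OAEP ciphertext length:", ciphertext.byteLength);
    });
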
    • by B-Con ( 782416 )
      In theory the Web Crypto API might interact with data encrypted with non-Web Crypto API applications. They may want to preserve compatibility with other APIs.
        • Well, in that case there should be a note in big red letters that says "not recommended for use, backwards compatibility only" or something to that effect.
  • ... it's a bunch of random thoughts. Most of the current "draft" consists of "TBD" or "here are some ideas that need to be fleshed out". This looks like it's years from reality, at which point it'll have turned into another CDSA-sized monstrosity containing the union of every feature requested by every vendor ever.
    • I have built a working extension that provides 'window.mozCrypto', which does SHA2 hash, RSA keygen, public key crypto and RSA signature/verification, see: https://addons.mozilla.org/en-US/firefox/addon/domcrypt/ [mozilla.org] and source: https://github.com/daviddahl/domcrypt [github.com] I plan on updating the extension once the Draft is more settled (after a first round of commentary & iteration)
      • I have built a working extension that provides 'window.mozCrypto', which does SHA2 hash, RSA keygen, public key crypto and RSA signature/verification,

        No offence, but that's about a hundredth of what SSLeay (the thing that came before OpenSSL) was doing 15 years ago. That's a long, long way to go before you have a general-purpose crypto API usable by web developers.

  • I'm interested to see how this would work with the WebRTC API, to allow for browser-based encrypted P2P communications.
