Security

Null-Prefix SSL Certificate For PayPal Released

An anonymous reader writes "Nine weeks after Moxie Marlinspike presented at Defcon 17, null-prefix certificates that exploit the SSL certificate vulnerability are beginning to appear. Yesterday, someone posted a null-prefix certificate for www.paypal.com on the full-disclosure mailing list. In conjunction with sslsniff, this certificate can be used to intercept communication to PayPal from all clients using the Windows Crypto API, for which a patch is still not available. This includes IE, Chrome, and Safari on Windows. What's worse, because of the OCSP attack that Moxie also presented at Defcon, this certificate cannot be revoked." Update: 10/06 23:19 GMT by KD: Now it seems that PayPal has suspended Marlinspike's account.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Tuesday October 06, 2009 @05:49PM (#29664077)
    ...it is thought that more people are going to be using Macs and Linux in the future.
  • by Darkness404 ( 1287218 ) on Tuesday October 06, 2009 @05:53PM (#29664113)
    The people who need to make sure everything is secure in order for the web to function have waited longer than -9 weeks- to get something fixed? When the thing was presented at... Defcon? What else do these people have to do other than fix these -major- flaws? When something is shown at Defcon, BlackHat, HOPE or any other major security conference, the first thing for these people to do would be to fix the flaw. 9 weeks is inexcusable.
    • Re: (Score:3, Informative)

      by Anonymous Coward

      Actually, this attack has been known a lot longer than that.

      I'm really glad the security product we developed uses OpenSSL even on Windows. The MS Crypto API was greatly desired at the time because it made the binary distribution a lot smaller. Originally everything was developed using OSSL because our stuff is cross-platform. Good thing we never found the time to switch over to CAPI on Windows.

    • Perhaps it's a bit like being in a crowd when someone gets shot... everyone knows an ambulance needs to be called right away, but they stand there looking at each other thinking, "should I make the call?" Diffusion of responsibility?
    • by bertok ( 226922 ) on Tuesday October 06, 2009 @06:49PM (#29664639)

      The people who need to make sure everything is secure in order for the web to function have waited longer than -9 weeks- to get something fixed? When the thing was presented at... Defcon? What else do these people have to do other than fix these -major- flaws? When something is shown at Defcon, BlackHat, HOPE or any other major security conference, the first thing for these people to do would be to fix the flaw. 9 weeks is inexcusable.

      The problem is that this is not just some buffer overflow where you can replace a single function call with an equivalent one that does a safety length check. Security holes that depend on '\0' characters in strings exploit a systematic flaw in the Windows API design: the mix of two entirely different and incompatible types of strings all over the place. The 'native NT' API uses Unicode strings with an explicit length, but the Win32 API and the C/C++ libraries usually use null-terminated strings. The dirty compromise is to use null-terminated strings together with an explicit length. Naively, one would think that this is now compatible with both, but it isn't - the NT API strings are a superset of the C-style API strings, because they can contain \0 characters, which the latter cannot handle (a short C sketch of exactly this follows at the end of this comment).

      This is a glaring flaw that has been known for many years and will probably never be completely fixed. The SysInternals guys wrote a nice article about it once, I think, but I can't find it any more; it's lost in the mists of time. It's been exploited repeatedly, too. You can create files and registry entries with \0 in them, and then none of the user-mode tools will be able to modify or delete them, including Explorer and the command-line tools. Viruses and other malware make use of this 'feature' often.

      What really shits me is that Microsoft hasn't learned a thing. They talk big about security, but it's just talk. For example, the entire ASP.NET API suffers from a similar mismatch of encodings flaw: All of the data binding controls fail to properly HTML encode strings coming from a database. This makes virtually all ASP.NET applications ripe for exploits via XSS or other script injection attacks. The one time I wrote an ASP.NET app, I had to spend weeks going through and replacing all of the simple-looking bind statements with explicit calls to a method that would both bind and encode. Even in the upcoming 4.0 release, the flaw is still there. I suspect that it won't ever get fixed.

      If Microsoft can sit on related security holes for years, don't hold your breath for a patch for this one. Even if they do fix it, I suspect they'll do something half-assed, like create a patch for IE only, instead of fixing the cryptographic subsystem as a whole.
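
      To make the mismatch concrete, here is a minimal, self-contained C sketch. The counted_str struct is a stand-in for a counted string, not the real Windows type, and the CN is the one from the certificate in the story: the CA validated the full counted string (whose registered domain is secureconnection.cc), but anything that later funnels the name through a C-string API sees only www.paypal.com.

      #include <stdio.h>
      #include <string.h>

      struct counted_str {          /* stand-in for a counted, "NT-style" string */
          const char *buf;
          size_t      len;          /* explicit length; may span an embedded '\0' */
      };

      int main(void)
      {
          /* The CN from the null-prefix certificate, as the CA saw it. */
          static const char cn_buf[] = "www.paypal.com\0ssl.secureconnection.cc";
          struct counted_str cn = { cn_buf, sizeof cn_buf - 1 };

          printf("counted length: %zu bytes\n", cn.len);           /* 38 */
          printf("strlen() sees:  %zu bytes (\"%s\")\n",
                 strlen(cn.buf), cn.buf);                          /* 14, "www.paypal.com" */

          /* A naive C-string comparison passes for the wrong certificate... */
          if (strcmp(cn.buf, "www.paypal.com") == 0)
              puts("C-string compare: certificate \"matches\" www.paypal.com");

          /* ...while a length-aware comparison does not. */
          if (cn.len != strlen("www.paypal.com") ||
              memcmp(cn.buf, "www.paypal.com", cn.len) != 0)
              puts("length-aware compare: no match - the full CN is different");

          return 0;
      }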

      • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Tuesday October 06, 2009 @07:09PM (#29664777)

        All of the data binding controls fail to properly HTML encode strings coming from a database. This makes virtually all ASP.NET applications ripe for exploits via XSS or other script injection attacks. The one time I wrote an ASP.NET app, I had to spend weeks going through and replacing all of the simple-looking bind statements with explicit calls to a method that would both bind and encode. Even in the upcoming 4.0 release, the flaw is still there. I suspect that it won't ever get fixed.

        To be fair, that's the kind of thing Microsoft really can't fix: plenty of people depend on outputting HTML stored in the database, and making escaping the default would break these users. We can debate the usefulness of Microsoft's compatibility-über-alles approach, but you can't fix that problem and preserve backward compatibility.

        • by andymadigan ( 792996 ) <.moc.liamg. .ta. .nagidama.> on Tuesday October 06, 2009 @07:38PM (#29665001)
          In fact, most SDKs out there would likely have a similar "flaw". In Java land you need to do the escaping yourself, and there isn't a built-in function to do XML or HTML escaping. You just need to know to handle it.
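
          The escaping itself is only a few lines in any language. Here is a quick sketch in C, just to make "do the escaping yourself" concrete (the helper name is made up): encode on output, covering the handful of characters that matter in an HTML context.

          #include <stdio.h>

          /* Hand-rolled HTML escaper (illustrative; name and interface made up). */
          static void html_escape(FILE *out, const char *s)
          {
              for (; *s; s++) {
                  switch (*s) {
                  case '&':  fputs("&amp;",  out); break;
                  case '<':  fputs("&lt;",   out); break;
                  case '>':  fputs("&gt;",   out); break;
                  case '"':  fputs("&quot;", out); break;
                  case '\'': fputs("&#39;",  out); break;
                  default:   fputc(*s, out);       break;
                  }
              }
          }

          int main(void)
          {
              /* Untrusted value straight from a database or form field. */
              const char *from_db = "<script>alert('xss')</script>";

              printf("<span>");
              html_escape(stdout, from_db);   /* encode on output, not on input */
              printf("</span>\n");
              return 0;
          }
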
          • Re: (Score:3, Informative)

            by goofy183 ( 451746 )

            True, but the core Java language doesn't ship with any nice HTML widgets. I believe JSF either does escaping by default or at least has a single app-wide setting to enable it by default. The Spring MVC framework has similar options; with one line I can enable XML and JS escaping in all content written out by UI components. Being backwards compatible is one thing, but not having an option to do default escaping is just opening your developer base up to all sorts of issues.

        • Re: (Score:3, Insightful)

          by bertok ( 226922 )

          All of the data binding controls fail to properly HTML encode strings coming from a database. This makes virtually all ASP.NET applications ripe for exploits via XSS or other script injection attacks. The one time I wrote an ASP.NET app, I had to spend weeks going through and replacing all of the simple-looking bind statements with explicit calls to a method that would both bind and encode. Even in the upcoming 4.0 release, the flaw is still there. I suspect that it won't ever get fixed.

          To be fair, that's the kind of thing Microsoft really can't fix: plenty of people depend on outputting HTML stored in the database, and making escaping the default would break these users. We can debate the usefulness of Microsoft's compatibility-über-alles approach, but you can't fix that problem and preserve backward compatibility.

          There's no backwards compatibility to preserve here: ASP.NET was a completely new framework, written from the ground up. It should have done escaping correctly right from the start. Ideally, there should be a flag that you can toggle on and off at the level of individual text fields, controls, or a whole page, and the default should be safe.

          Storing HTML in databases is one thing, and there are controls for emitting such data, such as XML, Literal or Placeholder controls, but that's a special case where a page is assembled fro

      • by nametaken ( 610866 ) on Tuesday October 06, 2009 @07:26PM (#29664907)

        For example, the entire ASP.NET API suffers from a similar mismatch of encodings flaw: All of the data binding controls fail to properly HTML encode strings coming from a database. This makes virtually all ASP.NET applications ripe for exploits via XSS or other script injection attacks.

        I would be pretty upset if everything I pulled from DB was automagically HTML encoded. I protect against XSS where it needs to be done. There are places where HTML encoding your data would not work. I do, however, always use parameterized inserts to protect against sql injection on top of an appropriate string cleaning function. Few things aggravate me like shitty ad-hoc inserts and the absence of string cleaning tied to a client-driven interface.
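
        For the SQL injection half, the pattern is the same everywhere. Here is a sketch with SQLite's C API (the table and value are just examples; any parameterized API works the same way): the untrusted value is bound as data, never spliced into the SQL text.

        #include <stdio.h>
        #include <sqlite3.h>

        int main(void)
        {
            sqlite3 *db = NULL;
            sqlite3_stmt *stmt = NULL;
            const char *untrusted = "O'Brien'); DROP TABLE users; --";

            sqlite3_open(":memory:", &db);
            sqlite3_exec(db, "CREATE TABLE users (name TEXT)", NULL, NULL, NULL);

            /* Ad-hoc string pasting would let the value above rewrite the query.
               Bound as a parameter, it is stored verbatim and nothing else runs. */
            sqlite3_prepare_v2(db, "INSERT INTO users (name) VALUES (?)", -1, &stmt, NULL);
            sqlite3_bind_text(stmt, 1, untrusted, -1, SQLITE_TRANSIENT);

            if (sqlite3_step(stmt) == SQLITE_DONE)
                puts("row inserted safely");

            sqlite3_finalize(stmt);
            sqlite3_close(db);
            return 0;
        }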

      • Re: (Score:2, Informative)

        by Jaime2 ( 824950 )
        I just tried it with ASP.Net 2.0. A TextBox, HTMLInputText, div, and span control all escaped HTML properly. A Label did not properly escape the Text property. I can't think of very many situations where you would use user-supplied values for label text and where a span wouldn't be more appropriate. By default, TextBoxes don't allow HTML to be submitted at all. BTW, ASP.Net 2.0 is four years old.
        • by bertok ( 226922 ) on Tuesday October 06, 2009 @10:36PM (#29666095)

          I just tried it with ASP.Net 2.0. A TextBox, HTMLInputText, div, and span control all escaped HTML properly. A Label did not properly escape the Text property. I can't think of very many situations where you would use user-supplied values for label text and where a span wouldn't be more appropriate. By default, TextBoxes don't allow HTML to be submitted at all. BTW, ASP.Net 2.0 is four years old.

          Well, I just tested it with 3.5, and it's still just as broken as when I first tried it with 2.0.

          First of all, "div" and "span" aren't controls at all - they're simply markup elements - and neither supports data binding (which is what I was talking about), and neither does any kind of encoding at all, so I think you might be missing my point entirely. Also, "Label" is not that rare - it's the default control inserted by the GUI designer in Visual Studio if you bind a text field in a FormView, and as you noticed, it fails to encode.

          Second, while some controls do perform encoding, this only works sometimes - usually if the target control is a "Literal", or effectively the same (e.g. if a Literal control is generated by a data-bound control as a child control). As far as I know, the Literal control is the only one that has a "Mode" property that can be used to toggle HTML encoding, so most other text fields are not encoded.

          For example, if you bind the "Text" property of a HyperLinkField of a DataGrid, then no HTML encoding is done, and no encoding options are available. The only option is to do a manual bind to a code-behind method that performs the encoding for you.

          What particularly shits me is how random the encoding is. Sometimes it works (literals), sometimes it doesn't (hyperlinks), but then sometimes it randomly works again, such as the Alt text of Image fields. It's not documented either!

          Is this the quality and attention to security you'd expect from the world's biggest software company? Random, unpredictable, undocumented, insecure behavior in their flagship web framework? Really?

      • Re: (Score:2, Insightful)

        by techeddy ( 1651409 )
        Actually, they have addressed HTML encoding in ASP.net 4.0: http://haacked.com/archive/2009/09/25/html-encoding-code-nuggets.aspx [haacked.com] I agree it has taken quite a while, but sometimes one does need to output both with and without encoding, so I find it nice to have an explicit and easily identifiable way to do both.
      • by QuantumRiff ( 120817 ) on Tuesday October 06, 2009 @08:44PM (#29665429)

        I have never understood why, for years, you have been able to create a folder with a space at the end of its name from a script. Try, just try, to delete that folder... You can't create it in Explorer, you can't delete it in Explorer... in fact, the only fix I have found is to hope to God it's a long file name, drop to a command prompt, and delete it with "Del folder~1".

        Years and years...

        Speaking of which, I got to try that in Server 2008 and Windows 7... It's a fun way to use 3 lines of script to really piss off your IT co-workers...

        • by Anonymous Coward on Tuesday October 06, 2009 @10:00PM (#29665875)

          I have never understood why, for years, you have been able to create a folder with a space at the end of its name from a script. Try, just try, to delete that folder... You can't create it in Explorer, you can't delete it in Explorer... in fact, the only fix I have found is to hope to God it's a long file name, drop to a command prompt, and delete it with "Del folder~1".

          Well, the documentation for Windows Explorer specifically states that it may not support all the naming conventions of the underlying file systems. Of course, it would be entirely reasonable to expect it to fully support the naming conventions of any Microsoft file system, but MS seems to operate under an unusual definition of "reasonable"...

          You don't need a script to create such folders, just the command prompt. This will work just fine: mkdir ".\Space \". Even better, dir /X may fail to reveal this as a long filename (by definition, any filename containing a space is a long filename even if it's eight or fewer characters in length), in which case there's no way to use dir to make it obvious there's an abomination in the list of folders.

          Note that mkdir "Space " won't give you the trailing space in the folder name, at least not on anything earlier than Vista or 2003 (never tried this trick on anything after XP). Similarly, rmdir "Space " fails to remove the directory, but you can remove it with rmdir ".\Space \".

          File this under "Stupid cmd.exe tricks". (A Win32-level version of the same trick is sketched at the end of this comment.)

          Speaking of which, I got to try that in Server 2008 and Windows 7... It's a fun way to use 3 lines of script to really piss off your IT co-workers...

          Heh, create three sibling directories named "stuck" where they have one, two, and three trailing spaces - then sit back and watch the consternation. It will look like there are three folders with identical names under the same folder (impossible!), and none of them can be deleted with Explorer. Pure, evil fun.

          - T
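
          Incidentally, the same trailing-space directories can usually be created and removed from C by handing the "\\?\" form of the path to the wide Win32 calls, which skips the normalization step that strips trailing spaces and dots. A Windows-only sketch (the path is a placeholder; behavior observed on NTFS, not guaranteed elsewhere):

          #include <windows.h>
          #include <stdio.h>
          #include <wchar.h>

          int main(void)
          {
              /* "\\?\" bypasses Win32 path normalization, so the trailing space survives. */
              const wchar_t *dir = L"\\\\?\\C:\\temp\\Space ";

              if (CreateDirectoryW(dir, NULL))
                  wprintf(L"created %ls|\n", dir);    /* '|' marks where the name ends */
              else
                  wprintf(L"create failed: %lu\n", GetLastError());

              /* Explorer and a plain rmdir choke on this name; the prefixed form removes it. */
              if (RemoveDirectoryW(dir))
                  wprintf(L"removed %ls|\n", dir);
              else
                  wprintf(L"remove failed: %lu\n", GetLastError());

              return 0;
          }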

  • Wow? (Score:4, Funny)

    by Anonymous Coward on Tuesday October 06, 2009 @05:54PM (#29664129)

    Moxie Marlinspike - that's a goblin name if I ever saw one.

  • Is that... (Score:2, Interesting)

    by petronije ( 1650685 )
    a bug or a feature?
  • by mindstrm ( 20013 ) on Tuesday October 06, 2009 @06:05PM (#29664227)

    With CNs like www.paypal.com\0ssl.secureconnection.cc

    Shouldn't the CA who issued the certificate bear *some* of the blame here?

    It just seems logical....

    • by ekhben ( 628371 ) on Tuesday October 06, 2009 @07:35PM (#29664971)

      Ahh, you've discovered why SSL on the web is fundamentally broken -- CAs have no incentive to act responsibly, since their customers are certificate requestors, not relying parties. And certificate requestors like CAs who don't have heavy process and high fees.

      I believe the only way forward is for browsers to change the model: associate a certificate SKI with a web site on first visit, warn if that changes. Don't worry about certificate validity, since the hierarchical trust model has been compromised from the root.

      • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Tuesday October 06, 2009 @07:39PM (#29665003)

        CAs have no incentive to act responsibly, since their customers are certificate requestors, not relying parties. And certificate requestors like CAs who don't have heavy process and high fees.

        Especially Comodo [theregister.co.uk]:

        Five minutes later I was in the possession of a legitimate certificate issued to mozilla.com - no questions asked - no verification checks done - no control validation - no subscriber agreement presented, nothing

      • I believe the only way forward is for browsers to change the model: associate a certificate SKI with a web site on first visit, warn if that changes. Don't worry about certificate validity, since the hierarchical trust model has been compromised from the root.

        Something else that should work (once DNSSEC is actually implemented everywhere) would be to list your SSL fingerprint as a DNS entry. This is less secure in some ways (say, if someone hijacks your account with whoever you bought your domain from), but better in others (better assurance on first visit, you can change the fingerprint if your server is compromised and someone else gets hold of your cert's private key).
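
        The check itself is cheap either way: hash the DER-encoded certificate and compare it with whatever you pinned, whether that came from the first visit or from a DNS record. A sketch with OpenSSL - the pinned value below is a placeholder, and SHA-256 is just one common choice of fingerprint:

        #include <stdio.h>
        #include <string.h>
        #include <openssl/evp.h>
        #include <openssl/pem.h>
        #include <openssl/x509.h>

        int main(int argc, char **argv)
        {
            if (argc != 2) { fprintf(stderr, "usage: %s cert.pem\n", argv[0]); return 2; }

            FILE *fp = fopen(argv[1], "r");
            if (!fp) { perror("fopen"); return 1; }

            X509 *cert = PEM_read_X509(fp, NULL, NULL, NULL);
            fclose(fp);
            if (!cert) { fprintf(stderr, "not a PEM certificate\n"); return 1; }

            unsigned char md[EVP_MAX_MD_SIZE];
            unsigned int md_len = 0;
            X509_digest(cert, EVP_sha256(), md, &md_len);   /* hash of the DER encoding */

            char hex[2 * EVP_MAX_MD_SIZE + 1] = "";
            for (unsigned int i = 0; i < md_len; i++)
                sprintf(hex + 2 * i, "%02x", md[i]);

            /* Placeholder: the real value would come from the first visit or a DNS record. */
            const char *pinned = "0000000000000000000000000000000000000000000000000000000000000000";

            printf("fingerprint: %s\n", hex);
            puts(strcmp(hex, pinned) == 0 ? "match" : "MISMATCH - warn the user");

            X509_free(cert);
            return 0;
        }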

    • With CNs like www.paypal.com\0ssl.secureconnection.cc

      Shouldn't the CA who issued the certificate bear *some* of the blame here?

      So I can't have a .paypal.com\0ssl.secureconnection.cc subdomain of secureconnection.cc?

    • by buchner.johannes ( 1139593 ) on Tuesday October 06, 2009 @09:03PM (#29665535) Homepage Journal

      Jacob Appelbaum presented a wildcard cert that you can use for any domain a week ago. Not sure why this is a story when a paypal-only forged cert comes out.

      https://www.noisebridge.net/pipermail/noisebridge-discuss/2009-September/008400.html [noisebridge.net]

      Note that you can create an SSL cert for any subdomain you host. I.e., the CA root gives you *.example.com, and you sub-certify a certificate for mail.myhome.example.com. So you cannot blame a root CA for this issue, as anyone in the hierarchy can create a \x00 cert.

  • Update (Score:5, Informative)

    by Hatta ( 162192 ) * on Tuesday October 06, 2009 @06:25PM (#29664415) Journal

    Sounds like PayPal should be freezing everyone's account until this is fixed.

  • by magsol ( 1406749 ) on Tuesday October 06, 2009 @06:25PM (#29664417) Journal
    Because that is totally going to fix the problem.
    • Because that is totally going to fix the problem.

      It sure as hell will. They should have done that 9 weeks ago!

    • If you don't shoot the bearers of bad news, people will keep bringing it to you.

      • by dfay ( 75405 )

        If you don't shoot the bearers of bad news, people will keep bringing it to you.

        Awesome. This is a quote I'm going to remember for a long time!

  • FTA :

    "It won't work for exploiting the bug for software written with the WIN32 api, they don't accept (for good
    reason) *!"

    Como?

  • by Anonymous Coward on Tuesday October 06, 2009 @06:38PM (#29664541)

    For more information about null-prefix attacks, the video is here [defcon.org].

  • ..that I closed my PayPal account. :-)
  • by eyepeepackets ( 33477 ) on Tuesday October 06, 2009 @06:55PM (#29664689)

    Kirk: How is the messenger, Bones?

    McCoy: He's dead, Jim.

    Kirk: Well, I suppose our mission here is accomplished.

    McCoy: Yes, I suppose you're right.

  • by Monkier ( 607445 ) * on Tuesday October 06, 2009 @08:16PM (#29665257)

    what usually happens:
    * you request a cert common-name=serverbox.mydomain.com from a Certificate Authority (CA)
    * CA determines you are authorized to make this request on behalf of mydomain.com
    * serverbox.mydomain.com serves down the signed cert, your browser makes sure website == common-name == serverbox.mydomain.com

    what these clever guys discovered:
    * you can request a cert common-name=paypal.com\0.mydomain.com
    * CA determines you are authorized to make this request on behalf of mydomain.com
    * a man-in-the-middle sits between you and paypal.com and serves down this cert; the victim's browser checks website == common-name, but the comparison stops at the \0, so it sees common-name == paypal.com (whoops!)
    * victim sees paypal.com in their browser with that reassuring padlock
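
    A rough sketch of those two checks in C (the helper names are made up): the CA looks at the whole common name, while a client that copies it into a C string truncates at the embedded \0.

    #include <stdio.h>
    #include <string.h>

    /* What a sloppy CA effectively verifies: does the CN end in a domain the
       requester controls?  It works on the full byte range, so the '\0' is just
       another byte in the middle. */
    static int ca_check(const char *cn, size_t cn_len, const char *owned_domain)
    {
        size_t dlen = strlen(owned_domain);
        return cn_len >= dlen && memcmp(cn + cn_len - dlen, owned_domain, dlen) == 0;
    }

    /* What a vulnerable client effectively does: treat the CN as a C string and
       compare it with the host the user asked for. */
    static int client_check(const char *cn, const char *requested_host)
    {
        return strcmp(cn, requested_host) == 0;
    }

    int main(void)
    {
        static const char cn[] = "paypal.com\0.mydomain.com";
        size_t cn_len = sizeof cn - 1;   /* full length, embedded NUL included */

        printf("CA check (requester owns mydomain.com): %s\n",
               ca_check(cn, cn_len, "mydomain.com") ? "approved" : "rejected");
        printf("client check (user typed paypal.com):   %s\n",
               client_check(cn, "paypal.com") ? "match - padlock shown" : "mismatch");
        return 0;
    }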

  • by durnurd ( 967847 ) on Tuesday October 06, 2009 @09:16PM (#29665613) Homepage
    I'm rather fond of this bit of ignorance:

    The certificate is the latest to target a weakness that causes browsers, email clients, and other SSL-enabled apps to ignore all text following the \ and 0 characters
