Deserialization Issues Also Affect .NET, Not Just Java (bleepingcomputer.com) 187

"The .NET ecosystem is affected by a similar flaw that has wreaked havoc among Java apps and developers in 2016," reports BleepingComputer. An anonymous reader writes: The issue at hand is in how some .NET libraries deserialize JSON or XML data, doing it in a total unsecured way, but also how developers handle deserialization operations when working with libraries that offer optional secure systems to prevent deserialized data from accessing and running certain methods automatically. The issue is similar to a flaw known as Mad Gadget (or Java Apocalypse) that came to light in 2015 and 2016. The flaw rocked the Java ecosystem in 2016, as it affected the Java Commons Collection and 70 other Java libraries, and was even used to compromise PayPal's servers.

Organizations such as Apache, Oracle, Cisco, Red Hat, Jenkins, VMware, IBM, Intel, Adobe, HP, and SolarWinds all issued security patches to fix their products. The Java deserialization flaw was so dangerous that Google engineers banded together in their free time to repair open-source Java libraries and limit the flaw's reach, patching over 2,600 projects. Now a similar issue has been discovered in .NET. This research was presented at the Black Hat and DEF CON security conferences. On page 5 [of this PDF], the researchers included reviews of all the .NET and Java apps they analyzed, pointing out which ones are safe and how developers should use them to avoid deserialization attacks when working with JSON data.

Comments Filter:
  • Just don't use JSON or XML. You can thank me later.

    • by vadim_t ( 324782 )

      So what do you recommend instead?

      • Re:Simpler solution (Score:5, Interesting)

        by hord ( 5016115 ) <jhord@carbon.cc> on Sunday August 13, 2017 @11:12AM (#55003031)

        JSON or YAML are probably both fine. XML is simply wasteful and unnecessary. Personally I think we should be using something like s-expressions (lisp-like). People hate them because of the parens but every other encoding has as many negative points in different ways. The advantage is that the syntax is far simpler to understand and parse leading to safer software. Some might say that having an "executable" format is bad but I'd point to bugs like this as being proof that even "text" formats are just executables in disguise. The Lisp creed is "data is code" and I've come to agree.

      • by Anonymous Coward

        ASN.1

        • by skids ( 119237 )

          (mandatory missing sarcasm tag warning)

          Not that many developers would base a decision on an AC slashdot post, but...

    • by Sloppy ( 14984 )

      I understand why you'd recommend against JSON since it was originally intended to be an expression (and some fuckwits would eval() it) rather than really intended to do quite the same thing as, say, Python's pickles. But what's the beef with XML?

      • Seriously? It's not like there weren't plenty of ways to store data that were far less verbose, more self-documenting, and took up less space and cpu both to create and search through.

    • Re:Simpler solution (Score:5, Informative)

      by angel'o'sphere ( 80593 ) <{angelo.schneider} {at} {oomentor.de}> on Sunday August 13, 2017 @12:23PM (#55003317) Journal

The serialization format has nothing to do with the deserialization vulnerabilities.

Your comment is prescient and face-palm worthy... prescient because it is clear and succinct.

        Face-palm worthy because a few years ago, a lot of these bugs were found in XML Java deserializers. A lot of people said, "Don't use XML! It's insecure!" then went off to write the same frameworks, but using JSON instead. They ended up with all the same bugs.

        I guess next people will rewrite them in YAML or binary.....nah, binary is scary, you never know what people could put in there!
        • Bugs in XML deserialization don't allow for arbitrary code execution.
Neither do JSON or YAML.

          So, what exactly would be the attack vectors (in a VM) via text only (de)serialization?
I mean: buffer overflows, putting code on the stack or changing return addresses for JSRs obviously are impossible.

The main thing is when you deserialize an object that has a constructor that does something (or a setter or a getter that does something). Since there are many objects of this type in the Java/C# standard library, an attacker can craft a serialized copy of one of these objects and send it over the wire. The deserializer will happily deserialize it.

            Buffer overflows are kind of rare these days. Because of things like ASLR, they are hard to exploit. It's mainly about logic bugs of various types.
And exactly that is the reason why 'standard' deserialization of objects in Java/JVM uses neither ctors nor setters.
              No idea about .Net

              • Which one is the standard deserializing library in Java?
The built-in ObjectOutputStream and ObjectInputStream.

They require serialized objects to implement either java.io.Serializable or java.io.Externalizable.

                  https://docs.oracle.com/javase... [oracle.com]
                  https://docs.oracle.com/javase... [oracle.com]

                  ( Why google finds the 7 version and not the 8 as first hits is beyond me :D )

The vulnerability comes from the option to override "readObject()". Serialized data objects usually contain the classes as well. So when you read them, you also read and link the code, and hence use the supplied "readObject()".
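The readObject() hook described above can be sketched in plain Java with the built-in ObjectInputStream. The Gadget class and its printed side effect below are hypothetical stand-ins for a real gadget chain; a real exploit would do something far worse than println:

```java
import java.io.*;

// Hypothetical "gadget" class: its private readObject() hook runs code
// during deserialization, before the caller ever sees the object.
class Gadget implements Serializable {
    String cmd = "echo pwned";

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // A real attack might call Runtime.getRuntime().exec(cmd) here.
        System.out.println("side effect ran during deserialization: " + cmd);
    }
}

public class DeserDemo {
    public static void main(String[] args) throws Exception {
        // "Attacker" produces the bytes...
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        new ObjectOutputStream(buf).writeObject(new Gadget());

        // ..."victim" merely deserializes them; the hook fires on its own.
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray()));
        in.readObject();
    }
}
```

Note that the victim never invokes any method on the gadget; readObject() runs as part of stream decoding itself, which is the whole problem.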

    • by hey! ( 33014 )

      Yep. Gin up your own solution with the exact same security flaws.

      I don't care how smart you are; everyone else is collectively smarter than you are. From a security standpoint you want to use popular frameworks that take security seriously and respond to the inevitable exploits promptly. Doing things in an idiosyncratic way is not protection because (a) systems can be probed using black-box methods like fuzzing and (b) chances are your way of doing it has been used thousands of times before.

    • I remember life before XML or JSON. It wasn't pretty. I've reverse-engineered the .doc and .xls file formats. It was a time when everybody made up their own file formats, and there were no libraries to help you read and write those formats. No, thank you, I'll live with the potential serialization issues.

      • And there's your problem - you or your user was using a shitty format. This is a long-solved issue. Even plain text or SDF or tab-delimited or fixed field width are quick and easy to implement, and variable-field-width can also be made self-documenting with just a bit of work. All are far easier to implement than xml or json, and if it's become corrupted, you'll usually be able to see exactly where pretty quickly and recover everything else.

        • by vadim_t ( 324782 )

          The parent is talking about .doc and .xls formats. These are absolutely not suitable for something as simple as tab or fixed field formats. They can contain arbitrary data like embedded images and videos. They have a very complex markup system. They have features like versioning, scripts, and oodles of metadata. They have to deal with arbitrary data of arbitrary length. They can attach arbitrary amounts of parameters to some piece of text. .doc and similar is one of the few cases where XML is actually not o

          • In fact, the new docx and xlsx formats are implemented in XML.

            There are many data sets that don't work well as CSV. Anything, for example, that has one-to-many relationships such as customer order history with names, addresses, billing info, etc., doesn't work well as CSV. That's the whole point of XML / JSON--you can easily store and retrieve data sets that are more complex than a spreadsheet. And that is just about everything.

            • Arbitrary data serialization was solved back before the PC was invented. See the following ASCII control codes

              0x1c - FS - File separator The file separator FS is an interesting control code, as it gives us insight in the way that computer technology was organized in the sixties. We are now used to random access media like RAM and magnetic disks, but when the ASCII standard was defined, most data was serial. I am not only talking about serial communications, but also about serial storage like punch cards, p
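For what it's worth, those separator codes still work today. A small Java sketch using RS (0x1E) to delimit records and US (0x1F) to delimit fields; the names and values are made up:

```java
public class SeparatorDemo {
    public static void main(String[] args) {
        final char RS = 0x1E;  // ASCII record separator
        final char US = 0x1F;  // ASCII unit (field) separator

        // Two made-up records, each with two fields.
        String data = "alice" + US + "42" + RS + "bob" + US + "7";

        for (String record : data.split(String.valueOf(RS))) {
            String[] fields = record.split(String.valueOf(US));
            System.out.println(fields[0] + " -> " + fields[1]);
        }
    }
}
```

The upside is that no escaping is needed as long as the payload is text that never contains the control codes; the downside, as the reply below notes, is that the result is not human-readable.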

              • There is nothing that is inherently more secure about ASCII control codes over XML or JSON. And it's inherently less human-readable. There's a reason the world has moved on past ASCII control codes!

          • Then just send them as one big binary blob with a list of offsets, sizes, and file names to the beginning of each separate file as a virtual header using a tab between each offset, size, and file name, followed by a cr. Be a hell of a lot more compact, and extraction of an individual file is as simple as an lseek to the offset, read the size and filename and read(size) number of bytes. Modify as needed, and you can store ANYTHING pretty much in its original form. XML is not needed. Same as emojis. The world
        • by Dog-Cow ( 21281 )

          You are such a stupid shit that it's amazing that you can even string together coherent phrases. Or is someone ghost-writing for you?

No more stupid than the folks who solved serialization back in the 60s using 0x1c, 0x1d, 0x1e, and 0x1f to store multiple databases each with their own tables in a single file. And certainly not stupid enough to throw out a solution that was simple and worked for a piece of shit just because it's trendy.
  • by zifn4b ( 1040588 ) on Sunday August 13, 2017 @10:27AM (#55002897)
Real developers use an XML or JSON reader instead of using direct deserialization. Trust me, I've built systems both ways, and deserialization directly into objects is no bueno. You end up with enough problems from version compatibility alone to negate the benefits. There are performance issues as well.
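One way to read the data without binding it into objects is a pull parser from the JDK. A minimal sketch using javax.xml.stream, where the <user> document and its field names are made up for illustration; the code pulls out scalar values and the program, not the parser, decides what to build from them:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;

public class ReaderDemo {
    public static void main(String[] args) throws Exception {
        String xml = "<user><name>alice</name><age>42</age></user>";

        XMLInputFactory factory = XMLInputFactory.newInstance();
        // Refuse DTDs: also closes off XXE-style attacks on the XML layer.
        factory.setProperty(XMLInputFactory.SUPPORT_DTD, false);
        XMLStreamReader r = factory.createXMLStreamReader(new StringReader(xml));

        String name = null;
        int age = -1;
        String field = null;
        while (r.hasNext()) {
            int event = r.next();
            if (event == XMLStreamConstants.START_ELEMENT) {
                field = r.getLocalName();
            } else if (event == XMLStreamConstants.CHARACTERS && field != null) {
                // We only accept the scalars we expect; unknown data is ignored.
                if (field.equals("name")) name = r.getText();
                else if (field.equals("age")) age = Integer.parseInt(r.getText().trim());
            } else if (event == XMLStreamConstants.END_ELEMENT) {
                field = null;
            }
        }
        System.out.println(name + " " + age);
    }
}
```

Because no class is ever instantiated from attacker-controlled type information, there is no gadget to trigger; the worst a hostile document can do is fail validation.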
    • It's a trap! (Score:3, Interesting)

      by Anonymous Coward

      Completely agree. We used .net binary serialization/deserialization because it was such a quick way to get things up and running...with like two lines of code. The fact that the serialized objects were about 10x bigger than they needed to be was not a problem.

      It turns out the namespaces are included in the serialized data, so the moment we did an ounce of lightweight refactoring we broke it. It took us less than a day to write our own serializer, but an extra three days of combined manpower to get a form

    • You are 100% correct.
Unfortunately, going by the number of projects affected by the bug, it seems that most programmers are not "real programmers".

    • Trust me I've built systems both ways and deserialization directly into objects is no bueno.

Yeah, running an auto-deserializer on untrusted data is basically guaranteed to be a security flaw. The NSA and FSB will pwn you at that point, along with anyone else who wants to (just ask PayPal).

    • Absolutely correct. Any additional development overhead or memory use is acceptable in return for the gained compatibility, reliability and security.

  • by Anonymous Coward

This is a programming problem that can happen anywhere. No language is immune. No project is automatically secure from exploits, or able to patch its framework universally for all deployments.

    Java and .NET will always have security issues, along with literally every other programming language. Anyone shocked, surprised, upset, or hostile to that concept is in the wrong profession.

    Assume everything is compromised. Assume nothing is secure. Design around that assumption and you will survive.

    • by peppepz ( 1311345 ) on Sunday August 13, 2017 @11:36AM (#55003149)
      The title is sensationalistic. Even the original bug the author talks about, calling it repeatedly a "Java" bug, was actually a bug in the Apache Commons Collections library, not in the platform, and it could only be triggered if a server using the library allowed customers to provide serialized data for itself to deserialize, which is severely wrong in the first place (it's akin to eval()-ing client-provided text).
    • by Tablizer ( 95088 )

      Assume everything is compromised. Assume nothing is secure. Design around that assumption and you will survive.

But you won't be able to compete with shortcut takers. They will look more productive than you. The penalty for shortcut taking just isn't large enough, I hate to say. I'm just the messenger.

  • by RyanFenton ( 230700 ) on Sunday August 13, 2017 @10:41AM (#55002949)

I'm kind of surprised this hasn't already grown into a more prominent issue over time.

Performance issues I can stomach - there's going to be some unavoidable parsing logic no matter how you go about translating between runtime and storage or network representations - but instead, large swaths of objects just get ignored in major libraries. Unity, for instance, can't serialize dictionaries and many other objects in its default serializer - which is a major oversight.

    Google actually has provided some rather nice tools to help with this - I tend to use their 'Protocol buffer' libraries for their rather nice serialization options. This doesn't address security on its own - nothing does completely, but designing careful locked signal processing and independent cross-checking steps can help a lot. Well-salted encryption alone won't really save you.

My pet peeve with protocol buffers is the need to give everything an index number, with no real auto-numbering for rapid design - I can see the logical need, to be able to rely on that order for processing - it's just an extra babysitting step that gets me sometimes. For what it does, it's still the best I've found to be consistent between diverse projects and still leaving room for decent security.

    Ryan Fenton
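The explicit index numbers described above look like this in a hypothetical .proto file (the message and field names are made up):

```proto
syntax = "proto3";

// Hypothetical message: every field carries an explicit index number.
// The numbers, not the names, identify fields on the wire, so they must
// never be reused or renumbered once any client depends on them.
message User {
  string name = 1;
  int32 age = 2;
}
```

That wire-level numbering is what lets old and new schema versions decode each other's messages, which is also why the compiler can't safely auto-number for you.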

    • by hord ( 5016115 )

      I've looked at protocol buffers but everything I've ever read about people actually using them in production says they are a nightmare over time because they are binary. Supposedly the object versioning alleviates some of this but I think people were complaining about how to deal with mandatory fields over time. I can't remember but I suspect this plus JavaScript being in the browser is what makes JSON so prevalent. I have no idea why XML is used. I can't even think of a single advantage it has over any

  • Google engineers banded together in their 20% time.
  • by Anonymous Coward on Sunday August 13, 2017 @11:35AM (#55003147)

JSON only defines a bunch of basic data types. It defines no ability to run anything. These bugs are in the (de)serialization layer above it, which uses JSON as a transport and extends the meaning of the stored data to be able to deserialize higher-level objects.

    JSON or XML are not the problem here. The same problem could happen if you serialized to CSV or TXT or anything else for that matter.

    • by Tablizer ( 95088 )

      It's probably a problem with "generic" reconstruction of objects based on data. If the data is used to (re) construct objects, then some objects can potentially have behavior because that's how objects are defined. If the data is "clever" enough, it may end up constructing objects you don't want.

      It's probably better to parse out to low-level "scalar" values and hand-code the part that stuffs them into objects or databases rather than let a parser actually build objects or object trees itself.

      • It's probably better to parse out to low-level "scalar" values and hand-code the part that stuffs them into objects or databases rather than let a parser actually build objects or object trees itself.

        This is exactly right. Because the data is untrusted, you need to verify it anyway, and adding parsing code to that usually doesn't add much overhead (it can often be the same code).

In the DEF CON talk they made a strong case that these generic deserialization libraries are extremely difficult, if not impossible, to use securely. They were just grabbing at low-hanging fruit; as soon as you've imported these libraries, you're compromised. They didn't even discuss ways that the libraries might be used incorre

      • by pjt33 ( 739471 )

        It's probably better to parse out to low-level "scalar" values and hand-code the part that stuffs them into objects or databases rather than let a parser actually build objects or object trees itself.

        If you're dealing with enough different datatypes then it might be a big development and maintenance saving to have a generic object builder in your deserialiser. The key is to make it so that you whitelist the datatypes it will deserialise.
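On Java 9+, the JDK's own ObjectInputFilter mechanism implements exactly this kind of whitelist for the built-in serialization. A minimal sketch; the filter pattern and the use of Date as the rejected class are illustrative only:

```java
import java.io.*;
import java.util.Date;

public class WhitelistDemo {
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        new ObjectOutputStream(buf).writeObject(o);
        return buf.toByteArray();
    }

    static Object readFiltered(byte[] bytes) throws Exception {
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(bytes));
        // Whitelist: allow only java.lang.String; "!*" rejects every other class.
        in.setObjectInputFilter(
            ObjectInputFilter.Config.createFilter("java.lang.String;!*"));
        return in.readObject();
    }

    public static void main(String[] args) throws Exception {
        // Whitelisted type deserializes normally.
        System.out.println(readFiltered(serialize("harmless string")));

        // Anything else is rejected before its readObject() hook can run.
        try {
            readFiltered(serialize(new Date()));
        } catch (InvalidClassException e) {
            System.out.println("rejected non-whitelisted class");
        }
    }
}
```

The sub-object problem the reply below raises is real, though: the filter sees every class in the stream, so the whitelist has to cover the whole object graph, not just the top-level type.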

        • by Tablizer ( 95088 )

          I see a problem with white-listing. Objects are often part of a bigger ecosystem. You may have to white-list sub-sets of objects to do it right, making it non-trivial to guarantee you didn't leave a current or future hole.

          You are right that it might be a big saving to have auto-object generation, but at a risk.

  • Can someone explain what the problem is here? Serialized objects are just code, and if you're running untrusted code you've got bigger problems than bugs in your serialization libraries.

    • Re:I don't get it (Score:4, Informative)

      by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Sunday August 13, 2017 @01:37PM (#55003639) Homepage

      General rule of thumb as always... a vague security announcement is never as big a deal as its title makes it out to be.

      There really isn't much of a problem. Reading TFA, a few vulnerabilities have been discovered in a couple applications and libraries. None of these were part of .NET, and no systemic issues in how people code for .NET have been found.

They aren't part of .NET itself, just third-party libraries.
As stated in the linked document, for JSON.NET to be vulnerable, you have to explicitly set an option that makes it less secure.

    As with encryption and security libraries, you are better off using well-established libraries like JSON.NET than rolling your own. A solo developer, or corporate team, just doesn't have the resources or time to work out all the security vulnerabilities, as can be done with a dedicated library.

This is not surprising! We recently discovered some "billion laughs"-style DoS attacks that exploit vulnerabilities in Java, and ported some of them to .NET and Ruby. Details here: http://drops.dagstuhl.de/opus/... [dagstuhl.de] (paper; there is also an artefact to run the attacks in a VM), and the source code is here: https://bitbucket.org/jensdiet... [bitbucket.org] . We did have some problems porting this from Java to .NET but managed eventually. Interestingly, some of these problems were caused by a bug in .NET: a broken contract
