New Destover Malware Signed By Stolen Sony Certificate

Trailrunner7 writes: Researchers have discovered a new version of the Destover malware that was used in the recent Sony Pictures Entertainment breaches, and in an ironic twist, the sample is signed by a legitimate certificate stolen from Sony. The new sample is essentially identical to an earlier version of Destover that was not signed. Destover has been used in a variety of attacks in recent years and it's representative of the genre of malware that doesn't just compromise machines and steal data, but can destroy information as well. The attackers who have claimed credit for the attack on Sony have spent the last couple of weeks gradually releasing large amounts of information stolen in the breach, including unreleased movies, personal data of Sony employees and sensitive security information such as digital certificates and passwords. The new, signed version of Destover appears to have been compiled in July and was signed on Dec. 5, the day after Kaspersky Lab published an analysis of the known samples of the malware.
  • by ruir ( 2709173 ) on Wednesday December 10, 2014 @09:17AM (#48563587)
    Gets better every time. This is not news anymore, it is a damn Mexican soap opera.
    • Well, maybe this will be a wakeup call for them.

      From the sounds of it Sony's overall approach to security was quite inadequate, and they've had several systems hacked over the last few years.

      Only now instead of customer data, they're getting hit where it hurts and might have to actually take this seriously.

      • by Anonymous Coward

        You say that as though Sony's security practices are not normal for all Fortune 500 companies. There are probably a few shining examples of good behavior, but I haven't worked for a company in the last 15 years that cared to do more than the bare minimum. Even then it was only if they HAD to do it.

        Fact of the matter is, we are seeing that there will always be someone out there with far more time and resources to test your system's security than you ever will. They are also likely much smarter than you beca

        • by tlhIngan ( 30335 )

          You say that as though Sony's security practices are not normal for all Fortune 500 companies. There are probably a few shining examples of good behavior, but I haven't worked for a company in the last 15 years that cared to do more than the bare minimum. Even then it was only if they HAD to do it.

          One of the more famous hacks happened a few years ago, where hackers broke into one Sony website, then used the same vulnerability the next day to break into other Sony websites in another country. And repeated the

          • It's one thing to have a vulnerability. It's another to not have it patched on all vulnerable sites.

            Not that defending Sony is a habit I keep ...

            When you're talking about a company like Sony, which is really a zillion separate entities under one umbrella, I find it hard to believe Sony could have reacted so quickly as to lock down all other sites in one day.

            I'm betting no company that big is capable of responding that fast.

            I've worked for far smaller companies which had huge disconnects and delays between r

            • by arth1 ( 260657 )

              When you're talking about a company like Sony, which is really a zillion separate entities under one umbrella, I find it hard to believe Sony could have reacted so quickly as to lock down all other sites in one day.

              I'm betting no company that big is capable of responding that fast.

              It depends on what kind of people they have hired. Those who actually take an interest and read underground news multiple times a day are more likely to have fixes in place long before the need is evaluated, determined and requested up and down the corporate chain.
              Even if it should break with policies to patch without permission, a halfway decent sysadmin would invoke emergency powers in cases like this, and do the paperwork later.

              • by Rich0 ( 548339 )

                When you're talking about a company like Sony, which is really a zillion separate entities under one umbrella, I find it hard to believe Sony could have reacted so quickly as to lock down all other sites in one day.

                I'm betting no company that big is capable of responding that fast.

                It depends on what kind of people they have hired. Those who actually take an interest and read underground news multiple times a day are more likely to have fixes in place long before the need is evaluated, determined and requested up and down the corporate chain.
                Even if it should break with policies to patch without permission, a halfway decent sysadmin would invoke emergency powers in cases like this, and do the paperwork later.

                Vulnerabilities come out every week. If you have to violate policy every week to keep up with them, then if the company wants to stick with the broken policy you're going to end up getting fired.

                A big principle in large companies is division of labor and responsibility. Nobody really feels like they own the whole thing, so nobody cares enough to stick their neck on the line. It is easier to just shrug your shoulders like everybody else when the big disaster happens. If you do work your heart out you at

            • Or this could be the ultimate honeypot... Sounds like a movie in the making... Everybody's breaking out the popcorn. Sony is selling the popcorn.

    • When this is all over somebody's going to win a Pulitzer Prize for an article laying out the whole saga.
    • It's like my grandpa always used to say, "Kid, you DO NOT fuck with the HD-DVD Promotion Group!"

    • I vote for Erik Estrada to play Ken Kutaragi.

      Dos Mujeres. Uno Carmageddon.

  • by Anonymous Coward on Wednesday December 10, 2014 @09:29AM (#48563617)

    Anyone working in IT will have no doubt come across those who I refer to as the "Certificate Crazies".

    These are people who, when confronted with a security issue of some sort, immediately try to remedy it with certificates.

    They insist on using certs everywhere from ssh authentication to signing apps. If certificates can be used, even if it makes the work unnecessarily awkward or even if it doesn't actually help in any way, they will insist on using certificates.

    And then normal people work around the awkwardness that certificates often bring, rendering them irrelevant.

    In practice, a certificate is nothing more than a long password that's impossible for a normal human to memorize. So it ends up in a file somewhere, if not several "somewheres", where it can be easily stolen. Unlike the password in somebody's head or even on a sticky note behind the monitor, these certificate files can often be stolen remotely!

    Meanwhile, the "Certificate Crazies" deny that this is a problem, even when confronted with stolen certificates that have been misused!

    After railing against passwords for so long, how do these "Certificate Crazies" often suggest getting around problems with stolen certificates? Why, they recommend using a short, human-friendly password that's needed in order to use the certs!

    These people are a joke.

    • What exactly is your point here? That certificates are worthless and shouldn't be used? (They aren't) That they are overused? (possibly). What's the alternative here anyways? two factor auth? I mean, I'm trying hard to find value in your post, but I'm not getting a lot of good points out of it. People misuse certs for sure, but that doesn't mean that as an authentication mechanism that they are useless.
      • by sjames ( 1099 )

        More that some people think certs are magic. They figure "just use a cert" and all is well without a single thought to setting up or maintaining an internal CA and properly securing the signing keys.

        That and somehow thinking that identities in run of the mill certs are somehow incontrovertible facts rather than understanding that the identity is no stronger than the verification procedures of the weakest trusted CA.

        My personal complaint about them is the screwy storage formats for the things.

    • How does signing executables work under Windows? I mean, some small company, let's say "John Doe Software" probably does not have its certificate verified down to the root level. Wouldn't this mean that anyone can create an executable with a certificate that says "John Doe Software"? Then I wouldn't know which one is authentic. Have I misunderstood something?
      • by 0123456 ( 636235 )

        I mean, some small company, let's say "John Doe Software" probably does not have its certificate verified down to the root level.

        Uh, yes, it does, otherwise it would be useless. But any CA can sign a certificate for that company, so it's as strong as the least secure CA.

        • by Rich0 ( 548339 )

          I mean, some small company, let's say "John Doe Software" probably does not have its certificate verified down to the root level.

          Uh, yes, it does, otherwise it would be useless. But any CA can sign a certificate for that company, so it's as strong as the least secure CA.

          Or the least principled one. You have to trust every CA in every country on the planet. If you're a US defense contractor should you be trusting some Chinese CA? If you're a Chinese defense contractor, should you be trusting Verisign? The SSL trust model is a joke.

      • by blackpaw ( 240313 ) on Wednesday December 10, 2014 @09:50AM (#48563733)

        I work for a small company that signs its code - pretty much required if you want to install in any enterprise these days.

        It's a certificate chain - we purchase a cert from a provider such as Verisign. They request basic proof of identity - business registration, contact number etc. They create a cert for us signed by them. Their cert is signed by Microsoft.

        We sign our app with our cert - anyone accessing the binary signed by us can verify it hasn't been altered and our cert was signed by Verisign, which was signed by Microsoft.

        Note that all this provides is proof that the exe was created by us. It in no way guarantees that we aren't distributing our own malware etc. But what it does provide is a way of tracing an exe back to the signer.
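The chain described in the comment above can be sketched with a toy model. This is illustration only: HMAC with a shared secret stands in for real RSA/ECDSA signatures, and all the names ("Microsoft", "Verisign", "John Doe Software") simply follow the hypothetical example in the thread.

```python
# Toy model of a certificate chain. Real code signing uses X.509
# certificates and asymmetric signatures; here HMAC with the issuer's
# secret is a deliberate stand-in so the sketch stays stdlib-only.
import hashlib
import hmac

def sign(secret: bytes, payload: bytes) -> bytes:
    # Stand-in for a real digital signature over the payload.
    return hmac.new(secret, payload, hashlib.sha256).digest()

def issue_cert(issuer_name, issuer_secret, subject_name, subject_pubkey):
    # The issuer binds the subject's name to its public key by signing both.
    payload = f"{issuer_name}->{subject_name}:".encode() + subject_pubkey
    return {"issuer": issuer_name, "subject": subject_name,
            "pubkey": subject_pubkey, "sig": sign(issuer_secret, payload)}

def verify_cert(cert, issuer_secret) -> bool:
    payload = f"{cert['issuer']}->{cert['subject']}:".encode() + cert["pubkey"]
    return hmac.compare_digest(cert["sig"], sign(issuer_secret, payload))

# Root (e.g. Microsoft) signs the intermediate (e.g. Verisign),
# which signs the vendor's leaf cert.
root_secret, inter_secret = b"root-secret", b"verisign-secret"
inter_cert = issue_cert("Microsoft", root_secret, "Verisign", b"verisign-pub")
leaf_cert = issue_cert("Verisign", inter_secret,
                       "John Doe Software", b"jds-pub")

# Walking the chain leaf-to-root succeeds for the genuine certs...
assert verify_cert(inter_cert, root_secret)
assert verify_cert(leaf_cert, inter_secret)

# ...and tampering with the leaf's public key breaks verification,
# which is why a second "John Doe Software" cert can't impersonate the first.
forged = dict(leaf_cert, pubkey=b"attacker-pub")
assert not verify_cert(forged, inter_secret)
```

The point the toy makes is the same one the comment makes: the chain proves who signed, not that what was signed is benign.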

        • by Smerta ( 1855348 ) on Wednesday December 10, 2014 @10:53AM (#48564203)

          First of all, kudos to your small shop for actually signing your executables. I still find myself needing to install software from companies ($100M+ companies) that don't sign their executables (IAR Systems (ARM cross compiler), I'm looking at you, for example...)

          Anyway, I just wanted to clarify one thing that you wrote, because a lot of people don't understand the security implications:

          Note that all this provides is proof that the exe was created by us

          Technically, all this provides is proof that the exe was created by someone who has your private signing key. That's exactly what's going on here with Sony. The whole signing / certificate thing works, right up to the point where the signing key is leaked or extracted. I know you know this, but it's important enough IMO that it merits re-stating...

          • by ledow ( 319597 )

            Is this not why we have CRLs, though?

            You can't guarantee your key won't be stolen and used to sign malware. But you can say that you'll revoke it when that's the case, and re-sign your official software with the new key.

            Sure, it's a pain, and I don't know if Sony have done this - but the facility is there for the original owner to say "Actually, no, that's no longer a trusted cert... here, have this one instead".
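The revocation mechanism the comment describes can be reduced to a very small sketch. This is illustration only: a real CRL is a signed list of serial numbers published by the CA (RFC 5280), and the serial number below is hypothetical.

```python
# Toy sketch of CRL-style revocation. A real verifier fetches a signed
# CRL (or queries OCSP) from the CA; here a plain set models the
# published list of revoked serial numbers.
revoked_serials: set[str] = set()

def revoke(serial: str) -> None:
    # The CA adds the compromised cert's serial to its revocation list.
    revoked_serials.add(serial)

def is_trusted(cert: dict) -> bool:
    # In reality the chain/signature check happens first; this models
    # only the revocation lookup a verifier performs afterwards.
    return cert["serial"] not in revoked_serials

# Hypothetical serial number, for illustration only.
sony_cert = {"serial": "00:11:22:33", "subject": "Sony Pictures"}
assert is_trusted(sony_cert)

revoke(sony_cert["serial"])       # cert leaks; CA publishes the serial
assert not is_trusted(sony_cert)  # binaries signed with it now fail
```

As the sibling comments note, the pain is the fallout: everything legitimately signed with the same cert fails the check too, until it is re-signed with a replacement.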

        • by jbengt ( 874751 )
          No matter how hard you try to fail-safe the use of software made by someone other than yourself, the entire thing boils down to the user's trust in the vendor anyway.
          The problem is that this entire chain depends on trust in each and every one of the links, and if any one of them is compromised (or even compromises itself on purpose - NSA?), it becomes a trusted attack vector.
        • I work for a small company that signs its code - pretty much required if you want to install in any enterprise these days.

          It's a certificate chain - we purchase a cert from a provider such as Verisign. They request basic proof of identity - business registration, contact number etc. They create a cert for us signed by them. Their cert is signed by Microsoft.

          We sign our app with our cert - anyone accessing the binary signed by us can verify it hasn't been altered and our cert was signed by Verisign, which was signed by Microsoft.

          Note that all this provides is proof that the exe was created by us. It in no way guarantees that we aren't distributing our own malware etc. But what it does provide is a way of tracing an exe back to the signer.

          Not if Verisign or Microsoft were compromised and new fake certificates were signed with the compromised master keys, like in the case with Sony here.

      • by MouseR ( 3264 )

        Your certificate is authenticated by checking against its parent certificate authority. That parent also has a parent. Rinse and repeat until you reach one of the top certificate authorities. There are seven of those [wikipedia.org] (or just about?).

        For as long as the parents are valid and your certificate is valid, then it's considered signed.

        VeriSign, a top certificate authority back in 2001, made the news because its DB got compromised. All certificates underneath were disabled and the whole tree had to be re-created.

    • You're confusing authentication with identity verification.

    • by Alioth ( 221270 )

      Signing certificates are normally encrypted. Stealing the file will do no good unless you know the decryption passphrase. For example, to get a package into our local debian repository such that it can install/upgrade in our production environment, you'd not only need the gpg signing keys, but the 60+ character passphrase (which is NOT written down) to go with it.

    • by IamTheRealMike ( 537420 ) on Wednesday December 10, 2014 @10:49AM (#48564187)

      In practice, a certificate is nothing more than a long password

      Fail. A certificate contains a public key. This is nothing like a password. You're thinking of a private key. The whole point of a certificate is that you can prove your identity to someone without sending them your password.

      Unlike the password in somebody's head or even on a sticky note behind the monitor, these certificate files can often be stolen remotely!

      Double fail. Firstly, nobody actually steals certificates. Certificates are public. When someone says something was signed with a "stolen cert", what they actually mean is "stolen private key the public part of which is contained in a certificate signed by a trusted third party", but that's a mouthful, so we simplify and say "stolen cert".

      Secondly, private keys can and absolutely should be protected with a password! Or they can be kept in special hardware. However, as you may have noticed, Sony got pwned pretty hard so presumably whatever private key was stolen either had no password, or they were able to just keylog the password when it was used.

      These people are a joke.

      The joke is on you ..... certificates are not a replacement for passwords and if you think they are, you didn't understand what they're used for.
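The point above about keeping private keys password-protected can also be sketched with a stdlib-only toy. This is illustration only: PBKDF2 derives a keystream from the passphrase, and the XOR stream cipher is a deliberately crude stand-in for the PKCS#8/PKCS#12 encryption a real key store applies; the key bytes and passphrase are made up.

```python
# Toy sketch of passphrase-protecting a private key at rest.
# Real tooling (openssl, gpg, Windows DPAPI) uses a proper authenticated
# cipher; the XOR below only illustrates "stolen file != usable key".
import hashlib
import os

def protect(private_key: bytes, passphrase: str, salt: bytes) -> bytes:
    # Derive a keystream as long as the key from the passphrase.
    stream = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                                 200_000, dklen=len(private_key))
    # XOR is its own inverse, so the same call also decrypts.
    return bytes(a ^ b for a, b in zip(private_key, stream))

key = b"-----BEGIN TOY PRIVATE KEY-----"   # made-up key material
salt = os.urandom(16)

blob = protect(key, "hunter2", salt)       # this is what lands on disk
assert blob != key                         # stolen file alone is useless
assert protect(blob, "hunter2", salt) == key   # passphrase recovers it
```

Which is exactly why a keylogged passphrase, or a key kept unencrypted for convenience, collapses the whole scheme back to "whoever copied the file can sign".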

      • by arth1 ( 260657 )

        Fail. A certificate contains a public key. This is nothing like a password. You're thinking of a private key. The whole point of a certificate is that you can prove your identity to someone without sending them your password.

        I see what you're trying to tell him, but you make it sound like there is a technical difference between private and public keys.
        They are really just two keys of a key pair, and anything locked with one can only be unlocked by the other. Which one is named "private" and which one is named "public" does not really make a technical difference.
        It's customary to have the shorter of the two keys be designated the public key, and also common to store a copy of the public key with the private key for convenience,

    • In practice, a certificate is nothing more than a long password that's impossible for a normal human to memorize. So it ends up in a file somewhere, if not several "somewheres", where it can be easily stolen.

      If certificates are used correctly they are stored in some kind of certificate store where they cannot just be "stolen".

      In the Windows certificate store, when you import a certificate, the default is to set the key to "non-exportable". Non-exportable means that you'll never get the key from that store - at least not from your user context (given that it is stored encrypted but on the local disk, a "root" user with access to physical disk sectors could theoretically reconstruct the key - but not without run

    • by hey! ( 33014 )

      In practice, a certificate is nothing more than a long password that's impossible for a normal human to memorize. So it ends up in a file somewhere,

      In other words you have no idea how they work, and you deal with that by calling people who advocate them "crazy".

  • What benefit does the attacker get by signing the malware with a company's certificate?
    • by Anonymous Coward
      Well for a start all the chumps who have computers/devices that auto-approve certificates by Sony are now compromised without even knowing it.
      • by slaker ( 53818 )

        So... people who play Everquest or own off-brand prosumer content creation software?

      • Well for a start all the chumps who have computers/devices that auto-approve certificates by Sony are now compromised without even knowing it.

        So their devices automatically run all software from Sony? Can you give an example how they would get compromised without knowing it?

        • So their devices automatically run all software from Sony? Can you give an example how they would get compromised without knowing it?

          Well, just off the top of my head ... the Playstations probably use the signatures ... and Sony makes Vaio laptops ... they make Smart TVs.

          So, if you can target any of that stuff, it could be pretty easy. I'm not a security guy, but once the certificate is compromised you can start from there.

          Hell, send them a malicious URL which direct them to a system you control and tell t

          • Well, just off the top of my head ... the Playstations probably use the signatures

            I'm looking forward to next week's headline: "Massive PS4 Botnet Discovered."

            • Unlikely, Sony's branches are independent. If it was Sony Pictures, it was probably a Sony Pictures-related certificate. As an example, SCEfoo (Playstation Sony) wasn't affected by some SOE (Sony's PC game division) troubles some time back, and SOE wasn't affected by that big PSN breach. SOE and SCEfoo have separate logins and infrastructures.

          • Well, just off the top of my head ... the Playstations probably use the signatures ... and Sony makes Vaio laptops ...

            SOE and SCEfoo are different branches. They use separate logins and separate backends. SCEfoo wasn't affected by some SOE troubles and vice versa.

      • If true ... fuuuuuuuu .... My SONY TV just updated about an hour ago.

        Um ...

        Should I be worried? [Serious answers only, plz]

    • by geogob ( 569250 ) on Wednesday December 10, 2014 @09:45AM (#48563695)

      The aim of signing is to assure users that the software they install is authentic (and assumed to be safe). Most users will blindly trust even non-signed software and drivers... almost no user will suspect a signed package. That's already something.

      Furthermore, it also adds a bit to the drama of the whole story. For the hackers it's a bit like sitting on the throne with the crown on their head after having killed the king. They obviously like to humiliate their prey, and to that effect compromising their certificates in this way is wonderfully effective.

    • What benefit does the attacker get by signing the malware with a company's certificate?

      Last time a popup came up for a security cert and you clicked "Accept" did you notice that checkbox "Always trust this source" or something like that?

      That's why. If anyone can now claim they're Sony, there are a lot of people that will not even get a warning and their computer/browser/whatever will just implicitly trust the cert.

    • What benefit does the attacker get by signing the malware with a company's certificate?

      Windows has a mechanism where kernel-mode drivers must be signed. For certain mandatory, early-load drivers (e.g. anti-malware tools, measured boot tools) the drivers must be signed by Microsoft. But Windows allows other kernel-mode drivers to be loaded as long as they are signed using a valid, non-revoked code-signing cert from (IIRC) Verisign.

      Kernel-mode drivers can obviously access memory in kernel-mode. This is a common way for malware to gain a foothold on a Windows machine. It is really hard to ensure t

      • by aiht ( 1017790 )

        Expect this certificate to be revoked in the near future. This will close that avenue, and cause all machines to refuse to load the malware driver signed by the cert.

        And cause all machines with legitimate Sony drivers (if there is such a thing?) signed with the same cert to refuse to load those too.

        • Expect this certificate to be revoked in the near future. This will close that avenue, and cause all machines to refuse to load the malware driver signed by the cert.

          And cause all machines with legitimate Sony drivers (if there is such a thing?) signed with the same cert to refuse to load those too.

          Unfortunately, yes. Sony will have to re-issue those legitimate drivers and sign them with a new cert. That is actually a good reason why a code signing certificate for widely distributed software absolutely should reside within an HSM, which will make the private key impossible to steal.

  • by NotDrWho ( 3543773 ) on Wednesday December 10, 2014 @09:51AM (#48563737)

    Just yesterday, they were the bastion of trustworthy software. Now this!

  • Systems Affected [symantec.com]: Windows 2000, Windows 7, Windows 95, Windows 98, Windows Me, Windows NT, Windows Server 2003, Windows Server 2008, Windows Vista, Windows XP ..
  • The scale of the Sony hack should have prompted the system admin to revoke any and all certs that had the slightest possibility of being compromised. You can't keep the hackers out of your new fixed system if you still honor the certs they stole.
    • by jandrese ( 485 )
      Because CRLs suck and using them is a last resort. Plus, Sony has to re-issue the certs first or it will break existing consumer equipment. There is a chicken and egg problem where you want to push down the new certs securely before you invalidate the old ones, otherwise the consumers will get a warning about an improperly signed server trying to mess with the security on their machine.

      Or they have to wait for some third party (Windows Update for instance) to push it out, which takes time.
  • Apparently not. (Score:4, Informative)

    by Anonymous Coward on Wednesday December 10, 2014 @02:04PM (#48565563)

    From ISC SANS

    "Update: Turns out that the malware sample that Kaspersky was reporting on was not actual malware from a real incident. But the story isn't quite "harmless" and the certificate should still be considered compromised. A researcher found the certificate as part of the SONY data that was widely distributed by the attackers. The filename for the certificate was also the password for the private key. The researcher then created a signed copy of an existing malware sample retrieved from Malwr, and uploaded it to Virustotal to alert security companies. Kaspersky analyzed the sample, and published the results, not realizing that this was not an "in the wild" sample. [1] The certificate has been added to respective CRLs."

  • Well, it wouldn't be the first Sony-signed rootkit...
