Security

Hackers Are Selling Legitimate Code-signing Certificates To Evade Malware Detection (zdnet.com) 50

Zack Whittaker, writing for ZDNet: Security researchers have found that hackers are increasingly using code-signing certificates to bypass security appliances and infect their victims. New research by Recorded Future's Insikt Group found that hackers and malicious actors are obtaining legitimate certificates from issuing authorities in order to sign malicious code. That's contrary to the view that, in most cases, certificates are stolen from companies and developers and repurposed by hackers to make malware look more legitimate. Code-signing certificates are designed to give your desktop or mobile app a level of assurance by making apps look authentic. Whenever you open a code-signed app, it tells you who the developer is and provides assurance that the app hasn't been tampered with in some way. Most modern operating systems, including Macs, only run code-signed apps by default.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward on Sunday February 25, 2018 @04:42PM (#56184881)
    Can I purchase a cert that helps?
  • by KBentley57 ( 2017780 ) on Sunday February 25, 2018 @04:51PM (#56184905)
    "Most modern operating systems, including Macs , only run code-signed apps by default." 1. Acquire source 2. $COMPILER 3. ./a.out I must not understand, anything really. Can someone clear this up, or is this just some slow Sunday news?
    • by Anonymous Coward on Sunday February 25, 2018 @04:58PM (#56184929)

      Your point is well taken. There are tons of ways to run unsigned code. Ignore that factual mistake. The point is that CAs are a very weak point in the system (as we all knew). Turns out delegating the keys to your kingdom, so to speak, isn't the best idea. The stewards of public trust, in this area like most others, are corrupt and greedy.

    • by Gravis Zero ( 934156 ) on Sunday February 25, 2018 @05:10PM (#56184979)

      google are so hard! https://www.manpagez.com/man/1... [manpagez.com]

    • by Dutch Gun ( 899105 ) on Sunday February 25, 2018 @07:35PM (#56185493)

      Are you asking "why can I compile and run locally-built apps?"

      Macs (and I think Windows) set a special attribute on files that have been downloaded from potentially untrustworthy sources, such as the internet. It's not completely correct to say that Macs will only run signed apps by default. Rather, by default, they only run apps downloaded from an untrusted source if those apps are signed with a valid code certificate. Needless to say, a locally compiled application doesn't have this attribute set.

      To demonstrate this in practice, try creating a local web server. Something like (if I recall correctly):

      python -m SimpleHTTPServer (Python 2; on Python 3, python3 -m http.server)

      Open a browser to "localhost:8000"

      Then, "download" your executable and save it. Try to execute it, and you'll see your macOS protest, because that "unsafe" attribute has been set via the process of downloading and saving the executable via the browser. Once run, that attribute is cleared, and the program is considered safe for execution without further checks.

    • by Jaime2 ( 824950 ) on Monday February 26, 2018 @11:52AM (#56188025)

      It means that ZDNet copied and pasted text from a barely readable article at Recorded Future and made no effort to figure out what the original author was trying to say. The Recorded Future article was mostly of the "the sky is falling" type, with very little actual analysis, so figuring out what they were trying to say wouldn't have helped all that much anyway.

      At the end of the day, what they were saying was that anti-malware software often uses a scoring system, and code that's signed with a legitimate certificate starts off with a higher score than unsigned or improperly signed code, and therefore gets through some defenses. Pretty much everyone already knew this. They also said that issuers of signing certificates have profit pressure to lower the bar on validation. This is also not new, and it's the primary reason that some security people see centralized certificate authorities as a "bad idea".

      The quote you reference is an incomplete analysis. macOS and Windows both have a simple protection mechanism built in for software downloaded from the Internet: they ask an "Are you sure" question if the executable is not signed by a trusted source. This was never designed to stop all malware, but it does a good job of filtering much of it. If a computer has more sophisticated anti-malware installed, it will do more analysis and might block malware even if it's signed. Both articles described the feature incompletely and failed to mention that the behavior is actually a pretty good deterrent that either causes malware writers to choose a different deployment avenue or significantly raises their cost.
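      The scoring idea described above might be sketched like this; the weights, threshold, and names are all invented for illustration and don't reflect any real product:

```python
def malware_score(validly_signed: bool, heuristic_hits: int) -> int:
    """Toy anti-malware score: higher means more suspicious (invented weights)."""
    score = 50
    if validly_signed:
        score -= 30   # a legitimate signature lowers suspicion up front
    score += 20 * heuristic_hits
    return score

BLOCK_THRESHOLD = 60  # invented cutoff: at or above this, the sample is blocked

# The same suspicious behavior slips past the threshold when signed:
print(malware_score(validly_signed=True, heuristic_hits=1))   # 40 -> allowed
print(malware_score(validly_signed=False, heuristic_hits=1))  # 70 -> blocked
```

      The point of the sketch is only that a signature shifts the starting score, so identical behavior can land on opposite sides of the cutoff.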

  • by Anonymous Coward on Sunday February 25, 2018 @04:52PM (#56184913)

    Modern app appers know that modern apperating apps only app APP-signed apps, NOT LUDDITE code-signed software!

    Apps!

  • by FrankSchwab ( 675585 ) on Sunday February 25, 2018 @04:53PM (#56184917) Journal

    So, we've found out in the past that some Certificate Authorities are about as trustworthy as the guy offering you Rolexes from the back of his van. At least he's open about the fact that he'll sell one to anyone.

    From that, we realized that a modern browser trusts innumerable CAs, and any one of them can issue rogue certificates.

    And now we realize that, on top of those, because the certificate issuance process isn't handled inside the client company, anyone who can acquire the credentials of someone who can log in to Digicert or whoever can issue rogue certificates. And keeping credentials secret has been shown, in the current world, to be almost impossible.

    And yet we continue to write checks to CAs for certificates that we can't trust.

  • by oldgraybeard ( 2939809 ) on Sunday February 25, 2018 @05:49PM (#56185161)
    Isn't that the whole basis of the trust system's response: that certs can be revoked?

    Just wondering? I guess if you got bit in the mean time you would be irked. But future things could be stopped? Maybe? Wondering?

    Just my 2 cents ;)
    • by mysidia ( 191772 ) on Sunday February 25, 2018 @07:02PM (#56185381)

      Isn't that the whole basis of the trust system's response: that certs can be revoked?

      The revocation mechanism is designed to help with the rare case that the code signer's private key is compromised. It's NOT designed to facilitate the CA doing safety reviews on code they've signed in order to identify it as malware and cancel the signature.

      For performance reasons, the valid/revoked status is generally cached at a minimum, and some clients won't necessarily even check for revocation without a patch/upgrade being sent out to manually blacklist the cert. The HARD end date on a cert is its expiration date; revocation is not a very dependable facility, at least not without additional measures.
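      The fail-open caching behavior described above could be sketched roughly like this; the class, the two-week TTL, and the fetch interface are all invented for illustration:

```python
import time

class CachedRevocationChecker:
    """Toy sketch of CRL caching that fails open, as described above."""

    def __init__(self, fetch_crl, ttl_seconds=14 * 24 * 3600):
        self.fetch_crl = fetch_crl   # callable returning a set of revoked serials
        self.ttl = ttl_seconds
        self.cached = None           # last CRL we managed to download
        self.fetched_at = 0.0

    def is_revoked(self, serial):
        stale = self.cached is None or time.time() - self.fetched_at > self.ttl
        if stale:
            try:
                self.cached = self.fetch_crl()
                self.fetched_at = time.time()
            except OSError:
                # Fail open: with no fresh CRL we keep trusting the cert,
                # so the hard cutoff is really the cert's expiration date.
                if self.cached is None:
                    return False
        return serial in self.cached

checker = CachedRevocationChecker(lambda: {"deadbeef"})
print(checker.is_revoked("deadbeef"))  # True: serial is on the CRL

def unreachable_ca():
    raise OSError("CRL server unreachable")

offline = CachedRevocationChecker(unreachable_ca)
print(offline.is_revoked("deadbeef"))  # False: fail-open while offline
```

      The offline case is the whole problem: a revoked signing cert keeps working for any client that can't (or won't) fetch a fresh CRL.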

      • by oldgraybeard ( 2939809 ) on Sunday February 25, 2018 @08:00PM (#56185553)
        If that is true, what is the purpose? Why do we use it? Just my 2 cents ;)
        • by Opportunist ( 166417 ) on Monday February 26, 2018 @06:20AM (#56186709)

          It's mostly not entirely dependable because it happens so rarely (or happened, at least) that we still keep finding loopholes and faulty implementations.

        • by Jaime2 ( 824950 ) on Monday February 26, 2018 @10:34AM (#56187557)

          The purpose is to allow for a mechanism to recover from CA compromise, discovered protocol weakness, or private key compromise. If implemented properly, it would serve these purposes well. Unfortunately, the implementation of Certificate Revocation List checking has historically favored ease of access over security. It wasn't until a few years ago that some major web browsers checked for revocation at all.

          There are many reasons for this failure. Some security professionals don't like the whole idea of the CA hierarchy and therefore don't put a lot of value in checking revocation status. These professionals would rather check for fingerprint changes and implement some sort of consensus-based mechanism to decide whether to trust a new cert. Some non-security-minded folk think that checking for revocation only helps in very rare instances, hurts in the much more common case of the CRL being temporarily unavailable, and also hurts performance, mostly in added connection latency.

          Think about this: a browser goes to an https site, gets a cert during the handshake and decides to check if the cert is revoked. The browser reads the CRL URL from the cert, and goes to download it. The CRL URL is almost certainly https, so it gets a cert when making that connection (unless the CRL is at the same domain as the content). This process repeats until there is a loop or until one of the CRLs is already in memory. Worst case, the CRL is on one of the sites that we are checking for revocation. In the event of a private key compromise, the bad actor can simply man-in-the-middle both the content and the CRL check. Presto, revocation check defeated. Absent this worst-case scenario, all of these checks are serialized, at least doubling the time it takes to connect.
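          The chain of CRL lookups described above can be sketched as a toy walk over an invented host-to-CRL mapping; the hosts and the mapping are made up, and nothing here reflects a real resolver:

```python
def crl_chain(cert_host, crl_host_of, cached):
    """Follow CRL locations, as described above, until the chain loops
    or reaches a CRL that is already in memory (the 'cached' set)."""
    chain, seen = [], set()
    host = cert_host
    while host not in seen:
        seen.add(host)
        crl_host = crl_host_of[host]   # where this host's CRL lives
        chain.append(crl_host)
        if crl_host in cached:
            break                      # already have it; no new fetch
        host = crl_host
    return chain

# example.com's CRL lives at ca.example, whose CRL points back at
# example.com: a loop, i.e. the worst case described above, where one
# compromised key lets an attacker MITM both the content and the check.
print(crl_chain("example.com",
                {"example.com": "ca.example", "ca.example": "example.com"},
                cached=set()))
```

      Every hop in the returned chain is a serialized network round trip before the original connection can proceed, which is the latency cost being described.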

          • by Anonymous Coward on Monday February 26, 2018 @12:05PM (#56188119)

            You sound new to the industry. CRL's designed purpose, like everything else in technology, is irrelevant to what it can be used for. Further, any CA can simply use cert stapling to force revocation upon every verification or download attempt.

            Worst case, the CRL is on one of the sites that we are checking for revocation. In the event of a private key compromise, the bad actor can simply man-in-the-middle both the content and the CRL check.

            Not with pinning. In addition, that would require the CA root cert private key to be compromised, and that key is always kept offline at professional CAs. That's like worrying about a specially crafted novel virus that bypasses all current vendor security, crafted explicitly to gain access to your smartphone selfies. Sure, it's possible, but unless your last name is preceded by a political title there is nothing to worry about.

            Absent this worst-case scenario, all of these checks are serialized, at least doubling the time it takes to connect.

            No. Please stop spreading disinformation about the network security field. Clear your browser cache, open the inspector, and watch the network timings. For a better view, download Wireshark and learn how to use it. Learn what OCSP is. These things are taught in high schools now. People start in helpdesk for $10 an hour with this knowledge. Why don't you have it? Why are you writing as if you are an authority when cracking open a freshman textbook will show that you are twenty years behind and lacking theoretical understanding?

            • by mysidia ( 191772 ) on Monday February 26, 2018 @02:38PM (#56189349)

              Absent this worst-case scenario, all of these checks are serialized, at least doubling the time it takes to connect.

              This is why operating systems cache CRLs for weeks: to avoid serializing the CRL check with requests. The issue is similar for running programs vs. visiting a website, except users expect programs to launch even faster, and even while offline or disconnected from the internet, so the system has even less time to check for revocation on a code certificate.

              The PKI standards simply were not designed to handle revocation in an acceptable manner for end-user computing, so revocation checks are widely delayed or permitted to fail. If you want to revoke a cert AND have high effectiveness, then it needs to be a high-profile revocation with an announcement, and a software update for certificate blacklisting that users and administrators are alerted to apply to their systems quickly.

              The CRL URL is almost certainly https

              No. From what I see, most of the time the CRL URL is most certainly non-HTTPS. It makes sense: there would ultimately be a circular dependency if CRL servers were HTTPS. Also, when the CRL is an HTTP file, the file being downloaded generally has to be signed by the CA itself; one of the cool things about OCSP (for CAs that support it) is that the CA can delegate a separate certificate to handle revocations through the OCSP server, so the CA certificate does not have to be kept online to sign every new CRL.

            • by Jaime2 ( 824950 ) on Monday February 26, 2018 @04:34PM (#56190241)
              Pinning is an alternative to certificate revocation. The fact that pinning does not suffer the same weaknesses makes none of the above statements about certificate revocation invalid. Actually, I even said this:

              These professionals would rather check for fingerprint changes

              ... which is just a simplified explanation of pinning.
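              Fingerprint-based pinning, as mentioned above, might look roughly like this minimal sketch; the byte strings stand in for real DER-encoded certificates:

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a certificate's raw (DER) bytes."""
    return hashlib.sha256(cert_der).hexdigest()

def pin_ok(cert_der: bytes, pinned: set) -> bool:
    # Trust only certs whose fingerprint was pinned earlier; a substituted
    # cert fails the check even if a CA would still call it "valid".
    return fingerprint(cert_der) in pinned

good_cert = b"original certificate bytes"  # placeholder, not real DER
pinned = {fingerprint(good_cert)}
print(pin_ok(good_cert, pinned))                       # True
print(pin_ok(b"attacker certificate bytes", pinned))   # False
```

              This is why pinning sidesteps revocation entirely: the client compares against what it saw before rather than asking the CA anything.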

  • by superwiz ( 655733 ) on Sunday February 25, 2018 @06:33PM (#56185279) Journal
    Shouldn't it be "hackers are buying..." instead of "hackers are selling..."?
  • by Anonymous Coward on Sunday February 25, 2018 @06:38PM (#56185299)

    I wish I could have some form of code-signing pseudonym. The software I write would benefit from a code-signing certificate to authenticate that it's from me, but I certainly don't want my private details plastered all over it.

    • by Anonymous Coward on Monday February 26, 2018 @09:37AM (#56187275)

      "private details" is actually only your name and possibly country/city of residence.
      Most people would have to provide more information than that just to get a domain...
      But I agree that there would be advantages to a reputation based system, which wouldn't really care what you are called and whether you are a dog or not, just that you have released good stuff before.
      You can just sign your releases with GPG though. Sure Windows won't handle it automatically and it only is useful for advanced users thus, but certainly better than nothing.

  • by Anonymous Coward on Monday February 26, 2018 @01:05AM (#56185991)

    It's sad when those people can get certs more easily than independent developers can. We need something like Let's Encrypt that verifies applications based on a site or something similar.

    • by Anonymous Coward on Monday February 26, 2018 @09:46AM (#56187331)

      The even more ridiculous part is that, as an open-source developer, you can at least get relatively cheap certificates easily enough, but those aren't good enough to get crash reports! Completely ridiculous, as the software can of course just include its own crash reporter for the same effect, just with more duplicated code and a higher risk of security issues. As a result, open-source projects don't do it. When it's a big enough project, that then results in Microsoft themselves analyzing the bugs and reporting them to the developers! WTF?!?
      Things like this just remind me how Microsoft is still completely incapable of understanding or even accommodating open source.
      Another example is their store and the method for including "converted" desktop apps. Not only do you need a code-signing certificate, you also need to answer loads of ridiculous questions.

  • by Anonymous Coward on Monday February 26, 2018 @09:50AM (#56187347)
    Have no fear APK will be along shortly to tell everyone about how hosts can stop you from being a victim of this by using his magic hosts file engine.
  • by Anonymous Coward on Monday February 26, 2018 @10:30AM (#56187537)

    The "trust" industry sells certificates in a way that makes it seem like you can trust any signed code, even though all they're doing is identity verification. Visit half a dozen software download sites and you'll easily run into malware wrapping installers that are signed with EV certificates. It's infuriating when an individual code signing certificate requires a visit to a notary, hard to meet requirements, and, after all that, gets blocked by the WIndows SmartScreen. As if my personal identity is less valuable than some throw away corporation registered by an industrial scale malware distributor.

    Last time I tried to get a code signing certificate from COMODO, they wanted ME to give THEM a link to an official list of notaries in my jurisdiction. Well, there isn't one, and the process for proper notary validation costs several times what they charge for the code signing certificates, so I'm positive none of the notary verification for personal code signing certificates in my jurisdiction gets done.

  • by michael_wojcik ( 4610715 ) on Monday February 26, 2018 @01:30PM (#56188801)

    They're not selling certificates. The CAs are selling the certificates, which are public documents once they're created.

    The "hackers" are selling the private keys that correspond to the certificates.

    This is a perfectly sensible, if unethical, business model. The incentive to keep the key private is to avoid diluting (usually to nothing) the value of certificate as a proof of provenance. Someone who obtains a code-signing certificate with the intent of selling the key doesn't have that incentive.

    And the headline's emphasis is wrong. As the summary and TFA mention, the key finding is that these resold keys are displacing stolen keys for signing malware. And "legitimate" is imprecise, since (according to the research) while the certificates were obtained directly from CAs, that was under false pretenses, with stolen credentials. So if the researchers are correct, this is more a shift from stealing signing keys to stealing credentials used to obtain certificates for keys generated by the attacker. That's not new; it's just more common than was popularly thought.
