Security

The CCleaner Malware Fiasco Targeted at Least 20 Specific Tech Firms (wired.com) 151

An anonymous reader shares a report: Hundreds of thousands of computers getting penetrated by a corrupted version of an ultra-common piece of security software was never going to end well. But now it's becoming clear exactly how bad the results of the recent CCleaner malware outbreak may be. Researchers now believe that the hackers behind it were bent not only on mass infections, but on targeted espionage that tried to gain access to the networks of at least 20 tech firms. Earlier this week, security firms Morphisec and Cisco revealed that CCleaner, a piece of security software distributed by Czech company Avast, had been hijacked by hackers and loaded with a backdoor that evaded the company's security checks. It wound up installed on more than 700,000 computers. On Wednesday, researchers at Cisco's Talos security division revealed that they've now analyzed the hackers' "command-and-control" server to which those malicious versions of CCleaner connected. On that server, they found evidence that the hackers had attempted to filter their collection of backdoored victim machines to find computers inside the networks of 20 tech firms, including Intel, Google, Microsoft, Akamai, Samsung, Sony, VMware, HTC, Linksys, D-Link and Cisco itself. In about half of those cases, says Talos research manager Craig Williams, the hackers successfully found a machine they'd compromised within the company's network, and used their backdoor to infect it with another piece of malware intended to serve as a deeper foothold, one that Cisco now believes was likely intended for industrial espionage.


  • by goombah99 ( 560566 ) on Thursday September 21, 2017 @09:07AM (#55238117)

    If you've never read Ken Thompson's "Reflections on Trusting Trust", here it is:
    https://www.ece.cmu.edu/~gange... [cmu.edu]

    Malware is slowly moving up the software chain to where this is becoming increasingly plausible.

    • Difficult to see that level of trust being achieved in this day of ad-ridden smartphone apps that demand privileges far beyond what is needed (yet are so often granted because look! shiny virtual candy and puppies and magic swords and achievements and levels, and you wouldn't want to consider those 2000 hours and $1200 you spent building your city a waste, would you?)

    • If you simply wish to verify that you are not getting a trojan embedded into your binary by a compiler, then you need to cross-compile a compiler from multiple compilers on multiple architectures and then compare the binaries each of the cross-compiled compilers produces. An example of this would be building GCC for x86 using itself and using Clang/LLVM on ARM (targeting x86). If the resulting builds of the GCC-for-x86 compiler produce identical binaries, then it's extremely unlikely that either compiler is compromised.

      • To do that, you would first need to make sure that the programs could be built with deterministic compilation. I don't believe that many projects have put in the time necessary to do that. That also ignores any optimizations or other features different compilers may use on the source code when compiling it.

        https://en.wikipedia.org/wiki/Deterministic_compilation [wikipedia.org]

          • You misunderstand. The point is to compile a compiler using multiple platforms and compilers, and then use the resulting compilers to build a program. If each built version of the compiler produces the same program binary, it's unlikely to be compromised. This works because you are using the same compiler to build the program binary; only the way that compiler itself was built differs.

          • and what does your turtle rest on?

            • is it a turtle binary or the turtle source code? ;)

            • It's multiple turtles. Ideally, one for each elephant's foot. The idea is that They aren't going to compromise them all.

              Suppose I take source code for the clang compiler, and compile it with clang, g++, Visual C++, and as many other compilers as I can get. Odds are that one of those compilers hasn't been compromised by Them, or at least not every one by the same Them. If everything's on the up and up, all of these compiled versions of clang should produce essentially the same code, so if two of them disagree, at least one of them is suspect.
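
              A minimal sketch of that comparison, assuming a hypothetical build_clang.sh wrapper that builds clang's source with a given host compiler, and assuming the build is deterministic (fixed timestamps, paths, and so on); this is only an illustration of the diverse double-compiling idea, not anyone's actual tooling:

                import hashlib
                import subprocess

                def sha256(path):
                    # Hash a produced compiler binary for comparison.
                    with open(path, "rb") as f:
                        return hashlib.sha256(f.read()).hexdigest()

                def build_clang(cc, out):
                    # Hypothetical wrapper: build clang's source with compiler `cc` into `out`.
                    subprocess.run(["./build_clang.sh", "--cc", cc, "--out", out], check=True)

                parents = {"clang": "clang++", "gcc": "g++", "msvc": "cl"}
                stage2 = {}
                for name, cc in parents.items():
                    build_clang(cc, f"stage1-{name}")                 # stage 1: parent compiler builds clang
                    build_clang(f"stage1-{name}/bin/clang++",
                                f"stage2-{name}")                     # stage 2: that clang rebuilds itself
                    stage2[name] = sha256(f"stage2-{name}/bin/clang++")

                # If the build is deterministic and none of the parents inject anything,
                # every stage-2 binary should be bit-for-bit identical; a mismatch means
                # at least one toolchain (or the build itself) differs.
                if len(set(stage2.values())) == 1:
                    print("stage-2 clang binaries match:", next(iter(stage2.values())))
                else:
                    for name, digest in sorted(stage2.items()):
                        print(name, digest)

              If I remember Wheeler's write-up right, the full scheme also compares a stage-2 rebuild against the binary you actually ship and run, but the core of it is this kind of bit-for-bit comparison.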

      • Yes, many of us know about David Wheeler and his idea. Like so many ideas it works in theory, but not in practice. Trying to get the same source code to compile under different versions of GCC is hard enough. Getting it to compile in such a reflexive manner is not something that happens in reality, I'm afraid.
        • "Yes, many of us know about David Wheeler and his idea."

          It's the first I've heard of him or his idea.

          "Trying to get the same source code to compile under different versions of GCC is hard enough."

          I'm not talking about using multiple versions of the same compiler, I'm talking about compiling a single version of a compiler using cross-architecture compilation and completely different compilers. The result is a set of similar binaries of the same compiler for the same platform and target. Despite being built differently, those compilers will produce identical binaries if they are not infected. Writing a trojan that will embed itself regardless of platform, operating system, and host compiler is a far harder problem.

          • You don't have any understanding of how compilers work. I PROMISE you that the result will not be anything even close to identical binaries.
            • I'm going to cut you a break and attribute this to miscommunication.

            • Why not? Let's take NSA C++. If it's written in reasonably portable C++, without undefined behavior or significant unspecified or implementation-defined behavior, it will compile to very different binaries on different platforms with different compilers. However, if all of those compilers are standard-conforming and the code is standard-compliant, the different binaries will do the same thing. Given identical input, they will produce output according to the abstract C++ execution model, and the implementation details won't change the observable behavior.

              • Let's start with the fact that compilers are highly complex beasts, and are never written "in reasonably portable C++, without undefined behavior or significant unspecified or implementation-defined behavior", and then add to that the fact that there is not a one-to-one mapping of source code to assembly. Each compiler will take a different approach to implementing the source as assembly, and indeed different compiler options and targets will change the resultant binary, often in radical ways. The same compiler can produce very different binaries from the same source.
                • I suspect you're overstating the amount of implementation-dependent behavior in compilers, although it's been twenty years since I looked into it. Otherwise, I don't see how gcc and clang would be that portable.

                  However, the idea is not that two compilers spit out binaries that look alike. The idea is that, given a program, two compilers will spit out binaries that act alike. Two compiler binaries that act alike will put out mostly identical code given some source code.

                  The mapping of source to assembly ...

                  • "However, the idea is not that two compilers spit out binaries that look alike. The idea is that, given a program, two compilers will spit out binaries that act alike. Two compiler binaries that act alike will put out mostly identical code given some source code. "

                    This is completely false, and in the final summation you are contradicting yourself: you say that they won't look alike but will act alike, which would mean they do look alike.

                    • Given portable source code, that doesn't rely on undefined behavior or effects of unspecified or implementation-dependent behavior, any good C++ compiler will produce code that is identical to any other in accesses to volatile objects and calls to system I/O routines (that being what the Standard requires). It won't be the same object code, because there's lots of different ways to accomplish the same thing.

                      Therefore, the output of good compilers, given the source code of a compiler, will be different binaries that behave the same way.

                    • Right, but the whole theory, and it is a broken theory, is that you can prove they do the same thing by looking at the executable generated. It is easy to prove this is false. Build GCC with -S, -O2, and -O3 and compare the generated code. They will be radically different, and that is using the same source and compiler. For extra credit, prove that Thompson's malicious code doesn't only activate when built with -O2.
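
                      A quick way to see that for yourself (example.c here is any C source file you happen to have lying around; the flags are just the ones mentioned above):

                        import subprocess

                        # Emit assembly for the same source at two optimization levels,
                        # then diff the results; they will usually differ substantially.
                        for opt in ("-O2", "-O3"):
                            subprocess.run(["gcc", "-S", opt, "example.c",
                                            "-o", f"example{opt}.s"], check=True)
                        subprocess.run(["diff", "-u", "example-O2.s", "example-O3.s"])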
    • It was always plausible. It seems you didn't read it or don't understand what you read.
  • ...for outlining why I thought specific 32-bit platforms were chosen, like those used in corporate computing: corporations tend to keep their existing 32-bit image over time, even on 64-bit hardware, rather than migrating to a 64-bit OS. Home computers have been sold with essentially only 64-bit OSes preinstalled for several years, so only ancient home computers and business computers are still 32-bit. It's a natural filter that reduces the amount of unwanted traffic to the command-and-control servers.

    • by Maritz ( 1829006 )
      I suspect your assertion that corporations mainly use old 32 bit computers is largely bollocks. It's 2017.
      • by TWX ( 665546 )

        You misread it. It isn't that corporations mainly run a 32-bit OS; it's that one won't find 32-bit OSes anywhere else besides corporations.

  • Ben Kenobi: ...so you can see it was cleaning them...from a certain point of view.

  • by wardrich86 ( 4092007 ) on Thursday September 21, 2017 @09:24AM (#55238219)
    Seems weird that major tech firms would even bother with the likes of CCleaner... I'd assume they'd just re-image the PCs once they start getting fucky. In fact, I'm not even sure that most people use CCleaner.
    • by TWX ( 665546 ) on Thursday September 21, 2017 @09:35AM (#55238271)

      If it's simply a hop-off point, all you need is one engineer who operates outside of his IT department whose specific software needs mandate he has local admin rights on his computer. He runs the tool he uses at home instead of calling IT, and suddenly his box is now the initial penetration point to access the company network.

      • Application whitelisting would at least provide an audit trail in this case, if not block the installation attempt altogether, provided the whitelist is controlled by another department.
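
        A rough sketch of the decision and the audit trail such a control gives you (the allow-list file name and log path are made up for illustration; real products hook process or installer execution rather than being invoked by hand like this):

          import hashlib
          import json
          import logging
          import sys

          logging.basicConfig(filename="app-allowlist-audit.log", level=logging.INFO,
                              format="%(asctime)s %(levelname)s %(message)s")

          def sha256_of(path):
              # Stream the file so large installers don't have to fit in memory.
              h = hashlib.sha256()
              with open(path, "rb") as f:
                  for chunk in iter(lambda: f.read(65536), b""):
                      h.update(chunk)
              return h.hexdigest()

          def allowed_to_run(path, allowlist_file="allowlist.json"):
              # allowlist.json: {"sha256": ["<hash of an approved binary>", ...]}
              with open(allowlist_file) as f:
                  approved = set(json.load(f)["sha256"])
              digest = sha256_of(path)
              if digest in approved:
                  logging.info("ALLOW %s sha256=%s", path, digest)
                  return True
              logging.warning("DENY %s sha256=%s", path, digest)  # this is the audit trail
              return False

          if __name__ == "__main__":
              sys.exit(0 if allowed_to_run(sys.argv[1]) else 1)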

    • I was wondering about this myself. I've never, ever seen the likes of CCleaner used in a professional setting. But, clearly, some do.

  • by Anonymous Coward

    My rule of thumb is never trust a source with foreign ties. We learned from Kaspersky that it's hard to tell whether such a vendor is completely above board or not. Experts have said since Windows 7 that a registry cleaner is absolutely not recommended and could do more harm than good. Obviously they were not thinking in terms of malware. But don't install stuff on your PC that isn't needed.

    • by Vlad_the_Inhaler ( 32958 ) on Thursday September 21, 2017 @10:16AM (#55238501)

      All I have learned from Kaspersky is that some politician alleged Kaspersky may possibly be spying. No evidence, nothing. Nothing to indicate the politician knows anything beyond the Internet being a series of tubes, either. Everything else followed on from there.
      I actually trust Kaspersky to do the job more than I trust a lot of the competition; they have discovered some serious state-sponsored malware in the past. I don't know if Symantec still makes virus scanners, but when Google, Mozilla et al. start initiating the process to "untrust" their certificates, I wouldn't run one of their scanners in a sandbox.

      • by theCat ( 36907 )

        That Kaspersky is as good as they are might be a good reason for nation states and global corporations to want to give them a hard time. IT has clearly become a modern munition, everyone is playing with fire, and there is a perverse incentive to undermine tools that make that play harder or less fruitful.

    • "My rule of thumb is never trust a source with foreign ties."

      Which implies that you do trust domestic sources. It sounds like you should reevaluate how and what you decide to trust.

    • I don't trust software with foreign or domestic ties, and I feel a lot safer from Putin's snoops than Trump's. Russia has no legal authority over me, and no reason to be particularly concerned about me, unlike the US. I'll grant you that I don't know whether Kaspersky does anything for the Russian government, but I don't know whether the domestic products do anything for the US government. I know that no anti-virus that failed to detect the Sony rootkit is on my side.

    • by dddux ( 3656447 )
      I completely agree. That's why I don't trust Windows.
  • Comment removed based on user account deletion
  • CCleaner was always garbage that hosed the registry and "cleaned up" /TEMP. Completely useless and in many cases caused problems due to removal of placeholder registry items.

  • Anyone who thought that CCleaner was "security software" has no business using it, let alone submitting an article to Slashdot about it.

    It's a junk/orphan file cleanup utility. Not "security software". Not antivirus or anti-malware. Where do these idiots come from reporting this shit?

  • The code and techniques look like APT17 aka DeputyDog - hacking into tech firms, military and governments for the Chinese government for at least 10 years.

    They realized CCleaner was a fantastic indirect vector into a whole lot of firms, and god knows what else they've got their fingers in that people haven't noticed since most firms are Equifax level incompetent with security.

  • I use Avast free for a lot of my clients. Since CCleaner is run by them, does that imply that I shouldn't trust Avast either?
  • They give the illusion of security behind the wall.

    If everything was exposed naked to the internet, it would have to be designed properly to be secure in the first place.

    "Sneaking behind a corporate firewall" only works if the machines behind that wall are not properly protected from each other.
