Security

GPU Malware Can Also Affect Windows PCs, Possibly Macs 49

itwbennett writes: A team of anonymous developers who recently created a Linux rootkit that runs on graphics cards has released a new proof-of-concept malware program that does the same on Windows. A Mac OS X implementation is also in the works. The problem the developers are trying to highlight lies not with the operating systems, such as Windows or Linux, nor with the GPU (graphics processing unit) vendors, but rather with existing security tools, which aren't designed to scan the random access memory used by GPUs for malware code.
This discussion has been archived. No new comments can be posted.

  • by flowerp ( 512865 ) on Tuesday May 12, 2015 @05:39AM (#49671377)

    I question why anyone would go that route for writing malware. When you start using the OpenCL APIs, your graphics cards will clock up and leave their low power states. The graphics card resource utilization (compute, memory transfers, memory usage) is shown by monitoring tools such as GPU-z and command line tools such as nvidia-smi. You can't hide anything on the GPU.
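
    A minimal sketch of that kind of check, using NVIDIA's NVML library (the interface behind nvidia-smi). It is only an illustration under a few assumptions: an NVIDIA driver with the NVML headers and libnvidia-ml installed (link with -lnvidia-ml), and device index 0 being the card of interest:

    #include <stdio.h>
    #include <nvml.h>   /* NVIDIA Management Library: the counters nvidia-smi reports */

    int main(void) {
        if (nvmlInit() != NVML_SUCCESS) {
            fprintf(stderr, "NVML init failed (no NVIDIA driver?)\n");
            return 1;
        }

        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
            nvmlUtilization_t util;   /* percent of time the GPU / memory controller were busy */
            nvmlMemory_t mem;         /* VRAM usage in bytes */
            if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS &&
                nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS) {
                printf("GPU busy: %u%%  VRAM used: %llu MiB\n",
                       util.gpu, (unsigned long long)(mem.used >> 20));
            }
        }

        nvmlShutdown();
        return 0;
    }

    Run in a loop, that is enough to spot a card that clocks up while you aren't using it, which is exactly the tell described above.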

    • by Anonymous Coward on Tuesday May 12, 2015 @06:04AM (#49671431)

      They go that way because it's there, obviously. Malware writers look for blind spots, and that is a blind spot.

      If the malware writers don't find it, then the NSA hackers will. Remember how that NSA malware lived in the hard disk's flash firmware, not on the disk itself? That may in turn have had help from the hard disk vendors providing the NSA with the code for their hard disks:

      http://www.stuff.co.nz/technology/digital-living/66279485/NSA-hiding-Equation-spy-program-on-hard-drives

      Or the NSA phone spyware that installs and runs itself in the phone's baseband modem chip, not in the phone's application processor. That is easier because there are fewer modem makers than phone makers, with Qualcomm LTE basebands being common across many handsets.

      Or the spyware that runs inside a USB drive's flash controller chip, not on the computer?

      If you don't notice the activity on your CPU why would you notice it on your GPU?

      • Found the link (Score:3, Informative)

        by Anonymous Coward

        There are lots of OSes in your computer that can be hacked already: the firmware in your wireless keyboard dongle, your wifi module, or your ethernet card. Even Apple's HDMI *CABLE* contains an ARM chip (doing video transcoding) that happens to be upgradeable, i.e. hackable by malware. Vendors leave the firmware flashable since it might need a bug fix later, but in doing so they leave it exposed to malware.

        Apple's HDMI cable contains an ARM chip that can be 'upgraded':
        http://www.theverge.com/2013/3/1/4055758/why-does

    • There are many ways it could be quiet, acoustically at least, on the desktop:
      - you use a fanless graphics card. duh!
      - your GPU is integrated into the CPU, and the combination is low power enough or a big enough heatsink and fans are installed
      - you got a low power midrange graphics card with a dual slot design and big enough heatsink and/or fan(s)
      - you messed with the fan profile, clocks, BIOS etc. to keep it quiet
      - certain high end cards are conservatively clocked to stay quiet (Quadro, Titan)

      On laptop

    • by gstoddart ( 321705 ) on Tuesday May 12, 2015 @07:36AM (#49671757) Homepage

      Honestly, it's an attack vector.

      Assuming that a particular attack vector couldn't ever happen sounds rather shortsighted.

      What's more likely ... that this takes more work, but people will do it anyway, for the same reasons they always write malware? Or that they'll just throw up their hands?

      Because if there's money to be made, or fun to be had ... why the hell wouldn't they exploit anything they can?

    • The number of people who would be able to tell their GPU is being used when they're not using it is probably below half a percent. Hell, before this article I'd probably have chalked it up to some poorly coded program I'd installed on purpose, or just dust buildup on the heat sink.
    • I sincerely doubt you could tell the difference between utilisation from malware and... say... the Windows Aero interface.
      • You might notice if someone is tacky enough to run a hash cracker on the target machine's GPU; but GPUs are ever so good at very, very fast memory access without straining themselves much or bothering the CPU at all. The 'ooh, antivirus isn't scanning your VRAM!!!' issue is practically irrelevant compared to the fact that you've got a more or less flexibly programmable secondary processor that can, in most systems, do whatever the hell it wants to pretty much all the RAM.

        The only saving grace is that it
    • by mlts ( 1038732 )

      You may notice that, as will most Slashdotters... but how many users actually know anything about performance baselines, or know/care about that?

      Most users will just complain that their laptop's battery life is shorter and that their laptop runs hotter, maybe blaming the PC maker for it.

      You can't really hide GPU usage, but most users or AV software are not going to be looking at that subsystem. Think Life of Brian and the Roman legions searching one house multiple times. They won't check what is h

  • by Anonymous Coward on Tuesday May 12, 2015 @05:55AM (#49671415)

    The Linux rootkit doesn't "run on GPUs".
    It allocates a buffer on the GPU and then stores strings in that buffer.
    So they've demonstrated that ... you can store data in RAM.
    Whoop-dee-fucking-doo.
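
    For reference, that core idea can be reproduced in a few lines of OpenCL host code: allocate a device buffer, copy bytes into it, read them back. This is only a sketch of the idea, assuming an OpenCL runtime and a GPU device are present (link with -lOpenCL); most error handling is omitted:

    #include <stdio.h>
    #include <string.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* "Hide" a string in GPU memory... */
        const char payload[] = "some bytes parked in video RAM";
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, sizeof(payload), NULL, NULL);
        clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, sizeof(payload), payload, 0, NULL, NULL);

        /* ...and read it straight back. Nothing executes on the GPU here. */
        char readback[sizeof(payload)] = {0};
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(payload), readback, 0, NULL, NULL);
        printf("%s\n", readback);

        clReleaseMemObject(buf);
        clReleaseCommandQueue(q);
        clReleaseContext(ctx);
        return 0;
    }

    Which is to say: the "rootkit" part is the loader on the CPU; the GPU is just being used as storage.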

    • What......RAM is vulnerable too? I didn't realize!!
    • by DrYak ( 748999 ) on Tuesday May 12, 2015 @07:38AM (#49671763) Homepage

      except that with big vendors like Nvidia, there is no memory protection (no good IOMMU support yet).

      So they've demonstrated that ... you can store data in RAM.
      Whoop-dee-fucking-doo.

      Except that, due to the above, that *RAM* happens to be accessible to anyone who cares to give it a try.

      CPU:
      Imagine you have a program in which you are editing your *super secret* document.
      Imagine that there's a different program running in user space.
      That program can't access your document - there's an MMU on the CPU enforcing memory protection. A process can't reach into a memory block it hasn't been explicitly granted access to.

      GPU:
      Now imagine that the editor displays your document on the screen. It goes through the compositor into a buffer on the graphics card (either GPU RAM, or RAM accessed by the GPU), before finally getting assembled for display on the screen (that's normal behaviour).
      Now imagine you're also running a 3D game on the same computer. That game uses OpenCL to compute its physics. ...but...
      one of the shaders running on the GPU is actually a trojan: instead of reading from a buffer used for the physics computation, it reads through a pointer to the location in CPU RAM where the display buffer of the first program sits.
      It shouldn't be allowed to do so, but it does anyway (e.g. with Nvidia's proprietary drivers). The request goes through and the game can siphon the output of the super secret editor. There is (currently) nothing to enforce such memory protection and prevent one GPU shader from peeking into the buffers of another application. Currently any shader running on the GPU can peek from any location it wants to.

      Nvidia and AMD need to properly implement support for IOMMU & the MMU inside the GPU itself.
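
      The CPU half of that argument is easy to demonstrate: under a normal OS, a process that dereferences memory it hasn't been granted simply gets a fault. A small sketch (the address is made up for illustration; any unmapped address behaves the same way):

      #include <stdio.h>
      #include <signal.h>
      #include <unistd.h>

      static void on_segv(int sig) {
          /* The MMU refused the access; the kernel delivered SIGSEGV instead of data. */
          (void)sig;
          const char msg[] = "SIGSEGV: the MMU blocked the access\n";
          write(STDOUT_FILENO, msg, sizeof(msg) - 1);
          _exit(0);
      }

      int main(void) {
          signal(SIGSEGV, on_segv);

          /* Hypothetical address of "someone else's document" - not mapped in this process. */
          volatile char *p = (volatile char *)0x12345678000ULL;
          printf("read: %c\n", *p);   /* never reached: the read faults first */
          return 0;
      }

      That fault is exactly the enforcement the parent says is missing for code running on the GPU.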

      • Re: (Score:3, Informative)

        by Anonymous Coward

        except that with big vendors like Nvidia, there is no memory protection (no good IOMMU support yet).

        So then why not demonstrate that instead of "look mommy, I can store data in a buffer I own"?

        Currently any shader running on the GPU can peek from any location it wants to.

        Go ahead, try it. Hint: it won't work.

        Nvidia and AMD need to properly implement support for IOMMU & the MMU inside the GPU itself.

        They did. About 8 years ago.

      • by LoRdTAW ( 99712 )

        Nvidia and AMD need to properly implement support for IOMMU & the MMU inside the GPU itself.

        The IOMMU solved this by giving each I/O device its own virtual I/O memory space. So no, the GPU can't randomly read protected memory. Nvidia and AMD don't have to implement anything, as this is the job of the IOMMU, not the endpoint device itself.

        This was the same problem as the FireWire DMA exploit. Essentially, before the IOMMU it was possible for any PCI card to read any memory it wanted to. Firew
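
        Whether that protection is actually in force depends on the platform and kernel configuration. On Linux, one rough check is whether any IOMMU groups have been populated under sysfs; a small sketch assuming the standard /sys/kernel/iommu_groups layout (an empty result means the kernel hasn't enabled the IOMMU, not necessarily that the hardware lacks one):

        #include <stdio.h>
        #include <dirent.h>

        int main(void) {
            /* With an active IOMMU, the kernel puts every DMA-capable device into a numbered group here. */
            DIR *d = opendir("/sys/kernel/iommu_groups");
            int groups = 0;
            if (d) {
                struct dirent *e;
                while ((e = readdir(d)) != NULL) {
                    if (e->d_name[0] != '.')   /* skip "." and ".." entries */
                        groups++;
                }
                closedir(d);
            }
            if (groups > 0)
                printf("IOMMU active: %d group(s) present\n", groups);
            else
                printf("No IOMMU groups found - DMA isolation is likely not enforced\n");
            return 0;
        }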

        • The IOMMU solved this by giving each I/O device its own virtual I/O memory space. So no, the GPU can't randomly read protected memory. Nvidia and AMD don't have to implement anything, as this is the job of the IOMMU, not the endpoint device itself.

          ...except that their drivers don't use it. Yes, there's an IOMMU in modern CPUs. No, current GPU drivers don't use it fully. (According to several sources covering this proof-of-concept, neither Nvidia's nor AMD's drivers properly use the IOMMU to isolate the GPU. They basically just grant the device wholesale access to memory.)

          Also, at least modern GPUs from Nvidia have an MMU on the graphics card (doing the same job, but from the perspective of the GFX card: preventing 2 shaders from 2 different 3D applications fro

          • by LoRdTAW ( 99712 )

            ...except that their drivers don't use it. Yes, there's an IOMMU in modern CPUs. No, current GPU drivers don't use it fully. (According to several sources covering this proof-of-concept, neither Nvidia's nor AMD's drivers properly use the IOMMU to isolate the GPU. They basically just grant the device wholesale access to memory.)

            I misunderstood you due to bad verbiage: "No, current GPU drivers don't use it fully." The driver has nothing to do with enabling the IOMMU.
            The IOMMU automatically maps a device into its

  • by Anonymous Coward

    I think this is a solid idea and worth doing.

  • by Anonymous Coward

    While it is true that antivirus stuff currently tends not to peek at GPU address space, given the frankly shit job that AV has managed with its current access to RAM and HDD, the implied "let Norton GPU 2015 protect you!" message seems like it might eat a lot of expensive memory bandwidth for very little reward. How about CPU vendors (especially looking at you, Intel) stop pretending that an IOMMU is some kind of fancy enterprise feature, rather than virtually essential to not being caught with your pants down w

  • Wow, a complete FUD story on this.

    Nothing runs on the video card; it just stores string data there.

    Slashdot now taking stories from WIRED?

  • I really don't want McAfee or Norton scanning my GPU's RAM and "quarantining" the (false) positives.
  • Sigh. (Score:5, Informative)

    by ledow ( 319597 ) on Tuesday May 12, 2015 @06:57AM (#49671593) Homepage

    1) It's misleading. The code is not executing on the GPU, it's just stored there.

    2) It's obvious. If you're just storing code as data, it doesn't matter what OS you use to do that.

    3) It's blatant pandering to media. Two stories (at least) on this, no extra content besides the bleeding obvious.

    4) It's a symptom of stupidity. If your only safety comes from being able to scan RAM or storage devices and find a "signature" amongst them of a known virus, you're an idiot. It's a stupid, pointless waste of time and computing resources. That there's an area of RAM available that DOESN'T have live protection built into existing antivirus is not shocking at all. Hell, you could store string stuff in the TPM chip, or in the HPA of a hard drive, or in an onboard EEPROM, or anywhere else that antivirus can't scan. They'll be unable to "certify" it as safe (as if they could anyway!) and will have to rely on somehow spotting the loader program before execution, no matter what variant of it is used or how the actual data payload is encrypted. (Hint: They can't. Antivirus is exclusively "after the horse has bolted" security.)

    5) Really, Slashdot?

    • by chihowa ( 366380 )

      WRT 4) It's not really a symptom of stupidity as much as a matter of limited choices. The OS can, and should, be hardened to prevent privilege escalation by malicious programs, but there's no universal and foolproof way to identify a user-run malicious program. A database of blacklisted known viruses isn't ideal, but it's not a bad approach. Especially if it's coupled with a whitelist of assumed benign programs (code signing). Without completely locking everything down and turning the general purpose comp

    • by mlts ( 1038732 )

      When cleaning PCs of malware, I find almost all of them have either perfectly functioning AV programs, or appear to. AV is useful from a legal-eagle standpoint [1].

      As a usable tool of defense, I'd say that adblocking, blocking by IP address, using a hosts file, virtualization, and putting the web browser in a container/sandbox/VM will go far further in keeping malware at bay than any AV program. That, and not running randomly downloaded executables.

      We have had oddball places to store code since early on. In
