Security

Malware Running On Graphics Cards

An anonymous reader writes "Given the great potential of general-purpose computing on graphics processors, it is only natural to expect that malware authors will attempt to tap the powerful features of modern GPUs to their benefit. In this paper, the authors demonstrate the feasibility of implementing malware that can utilize the GPU (PDF) to evade virus scanning applications. Moreover, the authors discuss the potential of more sophisticated attacks, like accessing the screen pixels periodically to harvest private data displayed on the user's screen, or tricking the user by displaying false, benign-looking information when visiting rogue web sites (e.g., overwriting suspicious URLs with benign-looking ones in the browser's address bar)."
  • by Yvan256 ( 722131 ) on Monday September 27, 2010 @11:24AM (#33712860) Homepage Journal

    It says slashdot.org in my URL bar, but for the last few months the comments of users appear to be from digg.

  • by Reilaos ( 1544173 ) on Monday September 27, 2010 @11:24AM (#33712864) Homepage

    With this technology, new, more sophisticated Rickrolling is now possible.

    • by jpapon ( 1877296 ) on Monday September 27, 2010 @11:45AM (#33713226) Journal

      cudaMemcpy(d_rickAstley, h_rickAstley, AMOUNT_OF_TIME * TO_GIVE_U_UP * sizeof(float), cudaMemcpyHostToDevice);

      d_RickRolled<<<NEVER_GONNA, LET_U_DOWN>>>(d_rickAstley);

    • Rewriting arbitrary stuff on your screen is possible for any virus without "infesting" the GPU. You own the OS, so you can force the GPU to paint anything you want without running specialized code on it. Actually, simply changing the contents of your screen was all that most old DOS viruses did. The interesting part is the "hiding from detection" part, not the "yet another way to write stuff on the screen" part.

      • by neumayr ( 819083 )
        That so? I thought owning a modern OS wouldn't give you the hardware access you had in the ol' days of DOS, not by a long shot. Just freely drawing around the screen doesn't work. Getting your code run by the GPU OTOH..
        Maybe I should actually read what this story is about, seems interesting.
        • That so? I thought owning a modern OS wouldn't give you the hardware access you had in the ol' days of DOS, not by a long shot. Just freely drawing around the screen doesn't work.

          Yeah I hate these modern OSes where all the apps, games, screensavers, desktop buddies etc can't just freely output to the screen.. it makes Photoshop and CAD really difficult to use :s

    • Re: (Score:2, Flamebait)

      Comment removed based on user account deletion
      • Comment removed based on user account deletion
        • I am not a CUDA developer, but I'm pretty sure that yes, it only works on nVidia GPUs, and no, you don't need to install the CUDA SDK to be able to run CUDA apps; all you need is the hardware. So it would be a pretty big target really.

          I know that malware authors would think of doing this eventually, but I still think that giving them ideas like this is Bad. Just look at what happened with Twitter this week - someone releases a harmless proof of concept type attack, and now already the script kiddies are comin

  • by RyuuzakiTetsuya ( 195424 ) on Monday September 27, 2010 @11:32AM (#33712988)

    except instead of doing that, it looked for the textures that in-game ads were generating anyway and swapped in other textures.

    My friends looked at me like I was evil and crazy.

  • I will show them... (Score:5, Interesting)

    by halfEvilTech ( 1171369 ) on Monday September 27, 2010 @11:33AM (#33713000)

    "Moreover, the authors discuss the potential of more sophisticated attacks, like accessing the screen pixels periodically and harvest private data displayed on the user screen"

    I guess we just change all fields to mask the entries with ****, or if we want to really fool them, use dots.

    • Yes, because I always display all private data as a bunch of asterisks. Passwords aren't the only private thing.
      • by jpapon ( 1877296 )
        I actually made a plugin once that did exactly that... it made all the text onscreen asterisks (or just random garbled text), and only showed you the actual text when you highlighted it.

        Of course, I could never find a use for it besides installing it on friends' computers when they weren't looking...

        • Would you mind sharing the code?

          I'm trying to create a similar plugin, only converting a specific type of text to garbage; the problem is that I don't have time to start from scratch and I can't find anything similar to start with as a base.
          • by jpapon ( 1877296 )
            Let me look around... This was back in Uni, but I may have it on one of my old archive externals at home.
    • Malware writers have been doing something like that for a while now. Mainly to deal with screen keyboards that some lending institutions require. I'm not sure that it's particularly prevalent at this point, but the technology is already there to exploit that.
    • You're really uncreative. What about those on-screen keyboards banking sites use? Just access the display data, combine with a click logger and presto! You now have access to a person's banking.

  • by blai ( 1380673 ) on Monday September 27, 2010 @11:35AM (#33713020)
    Should read "nvidia adds twitter and pop3 integration to newest line of GPUs"
  • imagine (Score:3, Insightful)

    by KillaGouge ( 973562 ) <gougec17 AT msn DOT com> on Monday September 27, 2010 @11:36AM (#33713048)
    Imagine starting to be targeted for specific porn habits. No amount of private browsing would keep the ads from showing up on your computer.
    • Re: (Score:3, Funny)

      by PPH ( 736903 )

      Gotta log off now and start working on an algorithm to detect the presence of areola color and texture.

  • by arivanov ( 12034 ) on Monday September 27, 2010 @11:36AM (#33713050) Homepage

    I used to run a small computer repair and write-to-order software shop for a living while at Uni, together with two other people. One of them had that idea around 1994. In those days it was just to store the code in video RAM pages that are not directly accessible to a scanner and keep a small polymorphic bootstrap routine in main memory.

    What goes around comes around. Looks like this is using a similar approach. Even if you compute some stuff on the card you still need a bootstrap within the main system to use it and talk back to the "mothership".
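
    Roughly the modern shape of it, as a toy sketch (the names and the trivial XOR "packing" here are mine, purely for illustration; the paper's real packer is obviously more involved): a tiny host-side bootstrap hands a scrambled buffer to the card, a kernel decodes it there, and the plaintext comes back.

      #include <cuda_runtime.h>
      #include <cstdio>

      // Toy "unpacker": each thread XOR-decodes one byte of the payload.
      __global__ void xorDecode(unsigned char *buf, size_t len, unsigned char key) {
          size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
          if (i < len) buf[i] ^= key;
      }

      int main() {
          const unsigned char key = 0x5A;
          unsigned char packed[] = { 'h' ^ key, 'i' ^ key, '\0' ^ key };  // "hi", XORed

          unsigned char *d_buf;
          cudaMalloc(&d_buf, sizeof packed);

          // This is the bootstrap that still has to live on the host:
          // copy the packed bytes in, launch the decoder, copy the result out.
          cudaMemcpy(d_buf, packed, sizeof packed, cudaMemcpyHostToDevice);
          xorDecode<<<1, 256>>>(d_buf, sizeof packed, key);
          cudaMemcpy(packed, d_buf, sizeof packed, cudaMemcpyDeviceToHost);

          printf("decoded: %s\n", (char *)packed);  // a scanner only ever saw XORed bytes
          cudaFree(d_buf);
          return 0;
      }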

    • by postbigbang ( 761081 ) on Monday September 27, 2010 @12:04PM (#33713518)

      Now that I think of it, my electric razor, with new programming, was trying to attack me this morning, or so it seemed...

    • Re: (Score:3, Interesting)

      by Rich0 ( 548339 )

      I agree that somehow the code has to get into the GPU, which means a bootstrap of some kind from the main CPU. I'm not sure it has to remain in the main memory for any period of time, however, as long as the graphics card has DMA access back into main memory.

      I'm not sure how memory protection works on the most modern systems, but at least in the past DMA had wide-open access to everything. So, if the graphics card needed to get back into the CPU for a short time, it could just modify the interrupt descriptor table

      • by TheRaven64 ( 641858 ) on Monday September 27, 2010 @12:27PM (#33713878) Journal
        DMA is not a problem. It goes via the GART (and has since the AGP days), so the GPU can only see the bits of memory that it is explicitly shown. A bigger problem is that separate processes may not be isolated from each other on the GPU, so your WebGL program and your window server may be running in the same virtual address space on the GPU. Your WebGL program is then free to read or write any window's contents, as long as it can find the correct virtual address for the buffers.
    • I used to run a small computer repair and write-to-order software shop for a living while at Uni, together with two other people. One of them had that idea around 1994. In those days it was just to store the code in video RAM pages that are not directly accessible to a scanner and keep a small polymorphic bootstrap routine in main memory.

      So which one of you guys is the guilty party?

    • Re: (Score:3, Informative)

      by faragon ( 789704 )
      A big problem in 1994 was the poor quality of DRAM used in graphics cards and/or tight DRAM timings (many SVGA cards had overclocked DRAM, especially the ones running on the 32-bit VESA Local Bus for i80486 CPUs).
  • Popups 2.0 (Score:5, Interesting)

    by BradleyUffner ( 103496 ) on Monday September 27, 2010 @11:36AM (#33713056) Homepage

    This should make for some wonderful new kinds of pop up ads that can't be dismissed or in any way taken out of focus.

    • Indeed, this would be one step harder to deal with than those stupid javascript prompts which can't be canceled without popping up again or killing the entire browser.
    • Well, that's already an issue - I've seen malware that makes a fake "Blue Screen of Death" with a warning about viruses. You can't Alt+Tab or Alt+F4 out of it, Ctrl+Alt+Delete does nothing; it acts just like a Blue Screen of Death except for 2 things: your mouse is still visible and moves, and it'll actually go away if you leave it for 5 minutes - instead of everyone's initial reaction to reboot the machine.

      I always wondered how it was they managed to bypass all the other functions that I should have had available

      • by Mashiki ( 184564 )

        By using rootkits and system-level injections. That's the only way to do it, and Vundo (and variants) is probably the most famous offender.

  • by Doc Ruby ( 173196 ) on Monday September 27, 2010 @11:39AM (#33713112) Homepage Journal

    User and role based authentication/authorization is essential to security, but not sufficient. A machine that brings authentication/authorization down to the process level would be more secure.

    I'd like a PC that enforced access control on each running process. Every call to any HW, whether CPU, MMU, GPU, or any bus, would require authentication. A crypto ASIC with scores of simultaneous auth units, pointing at each process space and the ACL table, could do the auth in just a few extra clock ticks per operation: at startup and randomly every dozen or so calls, and more frequently when there's a "heightened alert", either by network notification or during and after other security events like DoS attacks and malware discovery.
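
    In software, the per-call check I'm describing amounts to something like this toy sketch (all names invented); the whole point of the ASIC is to make this lookup cost a few clock ticks instead of a software table probe on every HW call:

      #include <unordered_map>
      #include <cstdint>
      #include <cstdio>

      enum Device { CPU_OP, MMU_OP, GPU_OP, BUS_OP };

      // The ACL table: for each process, a bitmask of hardware it may touch.
      static std::unordered_map<uint32_t, uint32_t> acl;

      // Every call to any HW would pass through a check like this one.
      static bool authorized(uint32_t pid, Device dev) {
          auto it = acl.find(pid);
          return it != acl.end() && (it->second & (1u << dev));
      }

      void gpuSubmit(uint32_t pid /*, command buffer, ... */) {
          if (!authorized(pid, GPU_OP)) {               // deny (or kill) on failure
              printf("pid %u: GPU access denied\n", pid);
              return;
          }
          // ... actually hand the work to the GPU ...
          printf("pid %u: GPU work submitted\n", pid);
      }

      int main() {
          acl[1234] = (1u << CPU_OP) | (1u << GPU_OP);  // grant pid 1234 CPU + GPU
          gpuSubmit(1234);   // allowed
          gpuSubmit(9999);   // not in the table: denied
          return 0;
      }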

    • Re: (Score:2, Funny)

      by Anonymous Coward

      I, too, want my system to crawl to a halt due to a bunch of authentication overhead.

      • Then don't get the auth ASIC I helpfully described, or use it infrequently the way I helpfully described.

    • by Movi ( 1005625 )

      This is more or less how the new game consoles (X360 and PS3) work.

    • So I'm guessing you've never used any modern security focused operating system.

      We've had everything you've mentioned for years ... probably 30 years or more.

      The problem is, if you want to get anything done in a reasonable amount of time, you're going to turn all that stuff off. When you hook EVERY syscall things tend to get mighty slow. Go run some software in a profiler and you'll get a rough idea of just how shitty it performs in the real world.

      There's a reason general purpose operating systems don't do

      • I have. All of which is why I want the machine to have an ASIC that effectively adds an AUTH instruction taking only a few clock cycles, rather than doing this essential operation in software every time, and that also caches credentials and grants in ASIC RAM or a LUT.

  • So when can we expect the GPU port of the nam-shub to protect us from the Cult of A5h3rah?

  • Malware everywhere (Score:1, Interesting)

    by Anonymous Coward

    I have seen stories here on Slashdot about botnets running on routers.

    What's the next device to be infected? Network printers? SSDs with that little ARM to perform GC? NICs?

    • Re: (Score:3, Interesting)

      There is malware that runs on network printers already. There was the Hoots worm that printed out the picture of an owl with "O RLY" on it.
  • Driver problem (Score:5, Interesting)

    by TheRaven64 ( 641858 ) on Monday September 27, 2010 @11:53AM (#33713356) Journal

    Modern GPUs include memory protection, so different processes can be prevented from reading each other's VRAM, just as they can be prevented from reading each other's RAM. This is not always used by the drivers, which may just map the entire physical VRAM into the GPU's virtual address space. With properly written drivers, this is much harder.

    The big malware potential comes from WebGL. This allows you to run arbitrary GLSL code in the browser's (GPU) address space. Although you probably can't take over the entire display, you can potentially take over the entire browser window without permission. Hopefully, the driver will give you entirely separate GPU address spaces per GL context, but given how incompetent AMD and nVidia's driver teams have demonstrated themselves to be, I doubt it.

    • That sounds nice in theory, but I know that at least with NVIDIA's computing driver there isn't a whole lot of memory protection going on. My experience has been that it's not terribly difficult to crash your entire system with a user-level program.
    • then this gives us hope for the Gallium 3D driver stack to fix this and implement proper memory protection on hardware that supports it. Well, too bad for Windows users.

      and nonetheless, AdBlock+ and NoScript will very probably block GLSL as they do with any other scripting language. So shoddy websites won't be able to assault you with unstoppable full-window ads. Well, too bad for IE users.

  • I don't want to blow the horn or wave the flag unless I know it's true. But given the various access levels and things that Linux uses in X.org and all that, I wonder if those same issues are more or less likely in a Linux + X situation?

    To my understanding, there is no direct reading or writing to the screen. There is screen capture functionality, but I don't know how it works or whether it is simply a standard feature of the X window system (and either way, is THAT a vulnerability to be wary of?).

    In Windows

    • Re: (Score:3, Informative)

      by TheRaven64 ( 641858 )

      No, you're thinking at the wrong level. The problem is that every application that gets an OpenGL context can upload programs to the GPU and run them. Fine in theory, and a modern GPU has the ability to isolate different contexts' memory from each other, but the drivers don't always use it (and don't always use it correctly when they do). If you're using an nVidia or ATi blob driver, then you have the same code controlling the GPU as a Windows user, so if the vulnerability is on Windows it will also be on Linux.

  • Sigh (Score:3, Insightful)

    by Dancindan84 ( 1056246 ) on Monday September 27, 2010 @12:02PM (#33713486)
    Headline: "Malware Running On Graphics Cards"
    TFS/TFA: "Here's a paper showing that malware on graphics cards is theoretically possible and could possibly evade detection."

    If you were trying to sensationalize the headline, you might as well have thrown "won't anyone think of the children!?!?" in there as well.
    • Headline: "Malware Running On Graphics Cards" TFS/TFA: "Here's a paper showing that malware on graphics cards is theoretically possible and could possibly evade detection." If you were trying to sensationalize the headline, you might as well have thrown "won't anyone think of the children!?!?" in there as well.

      No kidding! This is just as bad as that rail-gun rocket launcher headline from 2 weeks ago that had nothing to do with crazy weapons.

  • "I was really not watching porn, it was just the virus that infected my geforce!"

  • Does anyone find it disturbing that taxpayers' money is used to do the bad guys' work for them? I can understand researching anti-malware strategies, but why are these people given money to come up with bad things to do to my computer?

    • by blair1q ( 305137 ) on Monday September 27, 2010 @12:29PM (#33713898) Journal

      Before you can build a wall, you have to imagine someone walking over the imaginary line at the edge of your yard.

      Or you can figure out that a wall would have been useful after they come into your yard, but then it's too late.

      See, most taxpayers understand that we pay taxes to prevent crime; we don't wait until it happens and then rail that the government isn't doing anything about it.

      • Except for the loud mouths that seem to think that the only acceptable solution to crime is more jails and longer sentences.

        For the most part if you wait until you're attacked to deal with it, you've already lost and you're pretty much stuck with damage mitigation at best.
          • Better than judges being forced to release criminals because the jails are full, as happens frequently in Brazil.
          • by blair1q ( 305137 )

            Most people get the point after being arrested. Others after being jailed for a few days.

            It's when you're releasing violent, repeat, or crazy offenders for budgetary reasons that you know your legislature is missing the point.

    • by Hatta ( 162192 )

      No, not in the least. You have to know where you are vulnerable to know what to protect. This is much like medical research: you have to know how to induce a disease in order to test new treatments.

    • by martas ( 1439879 )
      no, because security through obscurity can only go so far. and not having realized that this entire new class of exploits is possible is obscurity.
  • Does having the GPU in the CPU make it easier to get onto the system / make it harder to get rid of?

  • If you were able to use the GPU to brute-force a password hash or similar authentication token for the system, you could install a rootkit on the card's option ROM.

    1. It'd get to run with ring 0 access on each boot before the OS has a chance to do anything.
    2. On EFI systems it'd have access to a TCP stack, full FAT and NTFS filesystem access, all included in the EDK [transact.net.au]. So it could update itself on the fly each boot.

    The video card makes a great trojan horse to house your malware.
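
    The brute-force half is embarrassingly parallel -- every thread just tests one candidate. Something like this toy sketch (the "hash" is a deliberately silly stand-in and every name is invented, just to show the shape of it):

      #include <cuda_runtime.h>
      #include <cstdio>

      // Deliberately toy "hash" of a 4-letter lowercase password (FNV-1a); a stand-in only.
      __host__ __device__ unsigned toyHash(const char p[4]) {
          unsigned h = 2166136261u;
          for (int i = 0; i < 4; ++i) h = (h ^ (unsigned char)p[i]) * 16777619u;
          return h;
      }

      // Each thread tries one of the 26^4 possible four-letter passwords.
      __global__ void crack(unsigned target, int *found) {
          unsigned idx = blockIdx.x * blockDim.x + threadIdx.x;
          if (idx >= 26u * 26u * 26u * 26u) return;
          char cand[4];
          unsigned n = idx;
          for (int i = 0; i < 4; ++i) { cand[i] = 'a' + n % 26; n /= 26; }
          if (toyHash(cand) == target) *found = (int)idx;   // report the matching guess
      }

      int main() {
          const char secret[4] = { 'g', 'p', 'u', 'x' };
          unsigned target = toyHash(secret);                // pretend this was stolen

          int *d_found;
          cudaMalloc(&d_found, sizeof(int));
          cudaMemset(d_found, 0xFF, sizeof(int));           // -1 means "not found yet"

          crack<<<(26 * 26 * 26 * 26 + 255) / 256, 256>>>(target, d_found);

          int idx;
          cudaMemcpy(&idx, d_found, sizeof(int), cudaMemcpyDeviceToHost);
          if (idx >= 0) {
              char pw[5] = { 0 };
              for (int i = 0; i < 4; ++i) { pw[i] = 'a' + idx % 26; idx /= 26; }
              printf("recovered: %s\n", pw);
          }
          cudaFree(d_found);
          return 0;
      }
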
  • So does this mean that IE9 with its GPU acceleration can be used as the avenue for attack?
  • I advise reading "My Other Computer is Your GPU" by myself and Jason Rodzik from earlier this year:

    http://dank.qemfd.net/dankwiki/images/d/d2/Cubar2010.pdf [qemfd.net]

    It covers these topics, and many more. :D

  • Once a virus is running on a system, it can shut down virus scanners, and all that kind of stuff. That is always the case. Doesn't matter where it runs. The key is to keep it from running and that's what virus scanners do. They are the doormen, they stop people who are on the "Bad list" from coming in. Well even if your virus was GPU based, it'd still have to come in and execute on the CPU like normal. There is no way to directly load something in the GPU and run it there. As such the virus scanner would ge

    • by c0lo ( 1497653 )

      Oh, executable packing/encryption. Ya viruses haven't done that since always. Sorry guys, but virus scanners are wise to that. They check for packed code.

      So really, I don't see anything special here.

      What is special: your AV would run at CPU speed while the packers/unpackers run at GPU speed. It would be like trying to catch a bank robber who drives a Ferrari while riding your push-bike. Granted, you can still control the "roads", but control them too tightly and I'll throw away your AV solution because it makes my computer run like a 386 at 40 MHz on an everyday basis.

      Plus this kind of malware might be easier to deal with: just shut down GPU processing. Unlike the CPU, that is a feasible thing to do. So if a system is infected, the GPU gets turned off, the malware cleaned, the GPU restarted. Graphics cards still work fine when addressed in the old "just a bunch of pixels" mode.

      I imagine that a wise AV would use the GPU to actually run the unpacking and analysis/detection code faster.
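
      As a toy sketch of what "use the GPU for detection" could look like (signature, buffer and names all invented; a real engine would run something like Aho-Corasick over many signatures at once): one thread per byte offset, each checking whether the signature starts there.

        #include <cuda_runtime.h>
        #include <cstdio>
        #include <cstring>

        #define SIG_LEN 8

        __constant__ unsigned char d_sig[SIG_LEN];   // the signature, in constant memory

        // Each thread checks whether the signature starts at its own byte offset.
        __global__ void scan(const unsigned char *buf, size_t len, int *hit) {
            size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
            if (i + SIG_LEN > len) return;
            for (int k = 0; k < SIG_LEN; ++k)
                if (buf[i + k] != d_sig[k]) return;
            *hit = (int)i;                           // pattern found at offset i
        }

        int main() {
            const unsigned char sig[SIG_LEN] = { 'E', 'V', 'I', 'L', 'S', 'I', 'G', 0 };
            unsigned char sample[4096];
            memset(sample, 0, sizeof sample);
            memcpy(sample + 1234, sig, SIG_LEN);     // plant the pattern in the test buffer

            unsigned char *d_buf; int *d_hit;
            cudaMalloc(&d_buf, sizeof sample);
            cudaMalloc(&d_hit, sizeof(int));
            cudaMemcpy(d_buf, sample, sizeof sample, cudaMemcpyHostToDevice);
            cudaMemcpyToSymbol(d_sig, sig, SIG_LEN);
            cudaMemset(d_hit, 0xFF, sizeof(int));    // -1 means "clean"

            scan<<<((unsigned)sizeof sample + 255) / 256, 256>>>(d_buf, sizeof sample, d_hit);

            int hit;
            cudaMemcpy(&hit, d_hit, sizeof(int), cudaMemcpyDeviceToHost);
            if (hit >= 0) printf("signature found at offset %d\n", hit);
            else          printf("clean\n");

            cudaFree(d_buf); cudaFree(d_hit);
            return 0;
        }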

        • You are confused into thinking GPUs are magical "everything acceleration" devices. Please remember that if that were the case, we'd just use them as CPUs instead. It isn't the case. GPUs are special kinds of processors, called stream processors. There are only some things they are good at. Other things, they are much slower than CPUs. You also miss that the virus has to execute on the CPU first. There is no way, none, in Windows to directly execute on a GPU. Just not how the system works. You have to have a CPU

        • by c0lo ( 1497653 )

          You are confused into thinking GPUs are magical "everything acceleration" devices.

          Oh, am I now? That's incorrect, but I wonder what made you think so?

          So the virus has to come in, and then launch on the CPU, then it can unpack code to the GPU and execute it. The CPU process will still have to keep running the whole time, because that is how the system schedules resource time.

          With the minor correction that the CPU may be scheduled to other processes/threads (until the GPU finishes and releases the lock on the memory), you are correct.

          Again I fail to see what any of this gains. A virus scanner doesn't even need to look at the GPU to deal with this.

          I never said the AV scanner would need to look at the GPU (only that it would be advisable for the AV scanner to use the GPU itself).

          I assert that the AV scanner cannot make any decision based solely on the behavior of "program asks the GPU to do something then starts executing from me

  • by dmitriy ( 40004 ) on Monday September 27, 2010 @06:10PM (#33717598) Journal
    None of the described future attacks are feasible. The shared framebuffer is not accessible to applications directly, for security reasons (the authors think this is "unfortunate"); direct access to the framebuffer is not "inevitable" in the future -- a much better technique is to use driver-controlled fast GPU blits, so the data doesn't leave the GPU. Non-timesharing is a non-issue -- the driver can detect timeouts and reset the hardware (TDR on Vista).

    So the only issue is polymorphic virus that may use GPGPU decryption. If this happens, scanners will start using CUDA, or GPU virtualization.
  • And why not - it's a time-honored tradition to make code run anywhere you can. Those who owned C64s (especially those who read Transactor) will recall that for a while there, programming the 1541 floppy-drive CPU was just about as cool as could be.

    Nowadays systems are so complex, and tools to study them / keep an eye on them are so relatively clueless, there could be^H^H^H^H^H^H^H^H are dozens of things going on in our PCs that we are blissfully unaware of. Very unfortunate ... making things simple enoug
  • It's harder, therefore the effort-vs-reward ratio is not good enough unless they have some really good malware. Just slap some code into a PDF and you're all set.
