Security Graphics The Internet

WebGL Flaw Leaves GPU Exposed To Hackers 120

recoiledsnake writes "Google spent a lot of time yesterday talking up WebGL, but UK security firm Context seems to think users should disable the feature because it poses a serious security threat, and the US Computer Emergency Readiness Team is encouraging people to heed that advice. According to Context, a malicious site could pass code directly to a computer's GPU and trigger a denial of service attack or simply crash the machine. Ne'er-do-wells could also use WebGL and the Canvas element to pull image data from another domain, which could then be used as part of a more elaborate attack. Khronos, the group that organizes the standard, responded by pointing out that there is an extension available to graphics card manufacturers that can detect and protect against DoS attacks, but it did little to satisfy Context — the firm argues that inherent flaws in the design of WebGL make it very difficult to secure."
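The cross-domain attack described in the summary is essentially a timing side channel: shader execution time can depend on the pixel values being sampled, so a malicious page can infer pixels of an image it is never allowed to read directly. Here is a deliberately simplified simulation of the idea — no real GPU or WebGL involved, and every name is illustrative:

```python
# Toy simulation of a timing side channel: the "shader" takes longer on
# brighter pixels, and the "attacker" recovers pixel values purely from
# the observed cost, never reading the image directly.

SECRET_IMAGE = [0, 255, 17, 203]  # cross-origin pixels the page can't read

def render_cost(pixel_index):
    """Stand-in for a measured frame time: cost scales with pixel value."""
    return 1 + SECRET_IMAGE[pixel_index]  # the attacker only sees this number

def infer_pixel(pixel_index):
    """Recover a pixel from timing alone (trivial in this toy cost model)."""
    return render_cost(pixel_index) - 1

recovered = [infer_pixel(i) for i in range(len(SECRET_IMAGE))]
print(recovered)  # matches SECRET_IMAGE without any direct read
```

A real attack is far noisier — it has to average many frame timings per pixel — but the structure is the same: the only "output" the attacker needs is how long rendering took.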
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • dupe (Score:4, Informative)

    by erroneus ( 253617 ) on Friday May 13, 2011 @11:39AM (#36119492) Homepage

    dupe dupe dupe

    • Re:dupe (Score:4, Insightful)

      by Shotgun ( 30919 ) on Friday May 13, 2011 @11:45AM (#36119538)

      Do you mean that the article is a dupe, or that Google is duplicating the mistake Microsoft made with ActiveX and the whole "it is so convenient to let anyone in the world do whatever they please on my computer" mentality?

      • by oGMo ( 379 )
        Yes.
      • by AmiMoJo ( 196126 )

        It seems incredible that Google would allow such a blatant security hole after making so much effort to sandbox and partition Chrome.

        • Re: (Score:3, Insightful)

          by Desler ( 1608317 )

          Don't worry, just like with the previous story they'll just claim it wasn't a flaw in Chrome (despite it bypassing the Chrome sandbox) and downplay it.

          • The whole thing... (Score:4, Insightful)

            by fyngyrz ( 762201 ) on Friday May 13, 2011 @12:58PM (#36120368) Homepage Journal

            ...is part of a serious cultural error being made: an impetus by hopeful marketers towards applications that run in/on the browser rather than in the user's machine. Both putting data "in the cloud" and running apps "from the cloud" are fraught with pitfalls; insightful users (a minority, as always) will resist this trend with traditional in-machine applications and fully local storage of data. The rest will suffer as corporations (continue to) misuse their data.

            The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know. There is no remedy for this kind of error, either -- once you hand your data over, you have lost control of it, and in turn, you have lost control over the consequences of random third parties misusing your information.

            The good news is that we have a broad set of extremely powerful applications available to us that run well in the local environment. Word processors, spreadsheets, sound, image and video editors, music and video library engines, educational software and a whole host more are all very well populated with traditional applications, so for the thinking user, there is no need to "go to the cloud" for classic compute tasks. Instead, the net can be used for communications, both as its heritage dictates and as the most sensible domain fit, while personal data and execution permissions remain secure in and at the local environment.

            To help protect yourself, I suggest beginning by disabling flash, scripting and use only CSS/HTML in the web-facing interface. As a side benefit, surfing is much more pleasant without pop-overs, flash ads, and many other corporate infections of the network.

Neither Google nor any other corporation has your best interests in mind. Start from that understanding, and the world will make considerably more sense.

            • Re: (Score:3, Insightful)

              insightful users (a minority, as always) will resist this trend with traditional in-machine applications and fully local storage of data.

              Let's hope those insightful users are also insightful enough to actually have backups.

              The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know.

              And it's not possible to avoid both of these, sorry. In fact, it's not possible to avoid the latter at all.

              The good news is that we have a broad set of extremely powerful applications available to us that run well in the local environment.

              The bad news is that any local application has at least as much access as these web apps do.

              To help protect yourself, I suggest beginning by disabling flash,

              Thus "protecting" yourself from YouTube, FreeFillableForms (the only way to file US taxes online that I know of), etc.

              scripting

Thus "protecting" yourself from things like Gmail, Google Instant Search, and... Do I really need to spell it out?

              • The browser application has only the data I allow it.

                You really think you have that much control? I can assure you that you don't know if your entire disk has been uploaded or not..

                • And how's that? Barring a vulnerability in the browser, that's not going to happen. Barring a massive conspiracy through a fairly large open source community, that vulnerability is not going to be deliberate.

                  Do I really need to spell this out?

                  • Nothing to do with open source..

                    Look in the hardware, where the trade secrets hide out.. There is nothing to protect you from that.. In fact the law already protects them from your attempts at discovery... I shouldn't need to spell that out..

                    • Which again makes the browser completely irrelevant to this conversation.

                      If they've pwned my hardware, or my OS, or anything else that runs at Ring 0 or better, I'm fucked. Hell, even if it's not in Ring 0, if it just has root, I'm fucked. May as well give up and go home. I have no choice but to trust those who have that amount of control over my machine.

                      However, I don't have to trust more than that. For instance, so far as I know, Blackboard hasn't contributed to any of the hardware or root-level software

                    • Which again makes the browser completely irrelevant to this conversation.

                      That might explain why I didn't mention the browser, no?

                      I have no choice but to trust those who have that amount of control over my machine.

                      Correct

                      ...but they don't get my credit cards, SSN, email, etc.

                      You sure?

                      Really, what was your point?

You said it.. trust no one.. Trust nothing you can't verify.. Otherwise feel free to take your chances, but until you can verify it, you won't convince me that you have any privacy in networked communications...

                    • Which again makes the browser completely irrelevant to this conversation.

                      That might explain why I didn't mention the browser, no?

                      Then we've gone pretty far offtopic.

                      ...but they don't get my credit cards, SSN, email, etc.

                      You sure?

                      As sure as I reasonably can be. Where would they get them from?

                      You said it.. trust no one.. Trust nothing you can't verify..

                      These are contradictory statements, unless you're going to claim you can't verify anyone or anything.

                      until you can verify it, you won't convince me that you have any privacy in networked communications...

                      I can verify, through at least one open source implementation, how SSL works, at every level. I know precisely what the risks are before I deem them acceptable.

                      I can verify the security of any browser I choose, and what it allows or doesn't allow. I can verify the security of every plugin I allow except Flash,

              • by fyngyrz ( 762201 )

                Let's hope those insightful users are also insightful enough to actually have backups.

                Yes, let's. Of course, on the other hand, you have to hope Google hasn't lost your email or isn't down when you need it, don't you? As opposed to your trusty email program, which is right there all the time. And lets you do things like automatically assign identities to your replies, or delete your attachments, or view your email in non-prop fonts... and doesn't surf your email for keywords... and doesn't have employees

                • Yes, let's. Of course, on the other hand, you have to hope Google hasn't lost your email or isn't down when you need it, don't you?

                  That's a bit unfair, don't you think? Aside from how rare this is, if we can assume savvy users are going to back things up, why wouldn't we assume those same users know how to backup gmail?

                  You can certainly avoid doing it over channels like the Internet, though,

                  Not if you're going to be as paranoid as you'd like to be. How do you know a given local app isn't phoning home with your data? How do you know your hardware isn't lying to you about it?

                  For that matter, if you're going to avoid executing software which has come to you through the Internet, you aren't going to be patching

                  • Thanks so much for taking the time to write this. Great points all around. Well played.

                  • by fyngyrz ( 762201 )


                    That's a bit unfair, don't you think? Aside from how rare this is...

No, I don't think it's unfair at all. What does "Google is down" mean to someone trying to use GMail? It can mean Google itself isn't responding, which happens from time to time. It can also mean that the path between Google and you is down, which also happens. Or it can mean that the path is so clogged with other material that using GMail isn't practical. Or it can mean that your local ISP or the company network is down or doing maintainance. In all of these are situations you can read and write email with a local client;

                    • That's a bit unfair, don't you think? Aside from how rare this is...

                      No, I don't think it's unfair at all. What does "Google is down" mean to someone trying to use GMail? It can mean Google itself isn't responding, which happens from time to time.

                      Well, again, rare. Rare enough that it tends to make headlines.

                      It can also mean that the path between Google and you is down, which also happens.

                      Much more likely is the path from you to your ISP. Beyond that, we're again talking about the sort of outages which make headlines.

                      Or it can mean that the path is so clogged with other material that using GMail isn't practical. Or it can mean that your local ISP or the company network is down or doing maintainance. In all of these are situations you can read and write email with a local client;

                      Unless you're using IMAP.

                      Or, if you want to be fair, you can enable offline mode for either Gmail or IMAP. It is exactly as simple for Gmail as it is for IMAP -- and even if it were more complex, it's possible to use IMAP with Gmail.

                      And then there are the substandard limits of GMail as compared to a full-featured client, of which there are plenty...

                      Which? I actually miss a few features from Gmail with my local clients, though I imag

              • by EdIII ( 1114411 )

You did not mention the whole "3rd party" thing either.

                Yeahhhh........

                Okay. Let's spend a fuck ton of money on MS SQL servers, coding departments, internal backups ONLY (remember... anything "offsite" is 3rd party, unless you are so rich in the IT budget you can afford TWO data centers), develop non-web based front ends to applications (because anything browser based is bad and stupid) where you have to version control and a competent IT dept to keep all computers uniform and up to date to run your loc

            • To help protect yourself, I suggest beginning by disabling flash, scripting and use only CSS/HTML in the web-facing interface.

              How about you get hold of something that lets you whitelist which sites are allowed to use Flash and Javascript...? That way your computer will still be able to do something useful.

              • by fyngyrz ( 762201 )


                How about you get hold of something that lets you whitelist which sites are allowed to use Flash and Javascript...? That way your computer will still be able to do something useful.

                If your computer can't do anything useful without flash and javascript, then frankly... well, never mind.

                If you allow a site to operate those technologies in your execution space, you are allowing third -- and fourth, and 5th, and etc. -- parties access to your hardware. Not a good idea. Your "trusted" site will likely sell som

            • The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know. There is no remedy for this kind of error, either -- once you hand your data over, you have lost control of it, and in turn, you have lost control over the consequences of random third parties misusing your information.

              Well, I have bad news for you. As soon as you start up your computer, even before the operating system starts up, your c

The previous story was about how a plugin-- which by all counts is installed on >90% of machines-- had a flaw, and was exploitable on any of those machines. Google, in an attempt to ensure timely application of plugin updates, bundled that plugin with Chrome (which is one of the reasons I have started looking at MSI rollouts of Chrome-- if they're going to have flash no matter what I say, at least it will be up to date).

            Fair enough to call the software package "Chrome" vulnerable, while technically the p

      • by Bengie ( 1121981 )

        "WebGL is implemented in Google Chrome[4], Mozilla Firefox 4 and in development releases of Safari and Opera."

        Khronos Group is making WebGL. Blame them, not Google. Khronos Group also makes OpenGL. Google is just following standards and bragging about how they're implementing them first.

        • by Anonymous Coward

          You don't have to implement every standard, and you certainly don't have to implement Khronos Group standards.

          In fact, a company has a responsibility not to implement standards that are ill thought-out. Standard does not, by itself, mean good.

I've never been precisely clear on why "passing instructions directly to the GPU" is different than youtube / h.264 "passing data directly to be decoded by GPU" or JIT'd javascript "being passed directly to the CPU". Surely it is possible to do such things securely? It's not like WebGL allows, for example, access to the GPU BIOS, or overclocking functions, or fan speed, and presumably they would limit the other things that can be done.

        I mean, games like WoW display data on the graphics card that is pulled o

      • Comment removed based on user account deletion
        • by Shotgun ( 30919 )

          Did the security team take a vacation or what?

          Security team? I don't think you and I are referencing the same Microsoft. 8*)

          What we need is a new language, built from the bottom up for security, that can do CPU and GPU virtualization so the code doesn't need to know squat about what you have to run the site.

          It's called Java. People rejected it because it was virtualized and ran in a sandbox 8*(

      • by makomk ( 752139 )

        Do you mean that the article is a dupe, or that Google is duplicating the mistake Microsoft made with ActiveX and the whole "it is so convenient to let anyone in the world do whatever they please on my computer" mentality?

        Hopefully the former, because the latter just isn't true. TFA has fearmongering talk about how WebGL gives websites "direct access to the hardware", but that just isn't true. All it allows websites to do is to request a restricted set of operations. Furthermore, what data they can access is also restricted by a combination of hardware and software checks. What's more, we've already had examples of graphics driver bugs that could be exploited by websites - for remote code execution, even - before WebGL even

    • why so many dupes these days? :s
    • Comment removed based on user account deletion
  • by MostAwesomeDude ( 980382 ) on Friday May 13, 2011 @11:50AM (#36119588) Homepage

    http://www.youtube.com/watch?v=WgbK0ztUkDM&feature=player_detailpage#t=3195s [youtube.com] is the video. In short, I asked the NaCl guy whether they knew what they were doing by letting NaCl clients access GPUs directly. His response was that they were doing everything WebGL does to protect the system from malicious code. That's unfortunately not sufficient.

    • by Anonymous Coward

In Second Life there are denial of service attacks on the GPU going on as we speak. People who gather at infohubs to voice chat are occasionally knocked offline by them. It can clear a room full of people in seconds. Basically, the attacker wears a spiked sculpty (a texture that turns into geometry) that takes far more processing than the GPU can handle rendering. It looks like static on your screen essentially, you lag like hell, and then you crash hard. The only way to stop it is if you get lucky and smash t

    • by gl4ss ( 559668 )

      well that's the new google engineers card, blame others.

the whole idea of NaCl, though, is like going back to the Windows 3.11-era style (it's actually convenient if a document can have code in it, in a trusted environment; flash and such are half-way houses to that), really.

      but who would use an os without native binaries? accountants maybe, for a short while. a really short while.

    • by CODiNE ( 27417 )

      Ahhh... well you see NaCl really means Sodium Chloride or common table salt.

      Sodium or Chlorine by themselves are toxic, yet when combined they cancel out each other's effects and make our food yummier.

      This is how Google plans to use native code execution and direct GPU access together. True, separately they are dangerous and leave a computer vulnerable to hackers. Yet, when combined, incredibly they cancel out each other's effects and make the internet yummier!

  • by Anonymous Coward

Isn't there a point at which keeping your equipment safe from intruders is such a hassle that it's no longer worth even having the equipment? This constant battle to defend your computer is tiresome. I can't imagine not having a computer, but this is exhausting.

    • by pushing-robot ( 1037830 ) on Friday May 13, 2011 @11:57AM (#36119684)

      Welcome to "Everyone Else"—we're happy to have you as a member! Here's your complimentary iPad.

The iPad is still vulnerable to flaws in the browser and apps - for example, the PDF exploits which enabled 1-click jailbreak (and could just as easily enable 1-click malware) - and to flaws in the app review process - which enabled a tethering app to be approved and installed by thousands before it was taken down.

    • by Desler ( 1608317 )

      This just in: Guy who works for Mozilla downplays issues in a standard originating from Mozilla.

      • Except WebGL comes from Khronos Group. Ya know, the same guys that brought us OpenGL. Mozilla is part of the founding body, but you can't say it originated from them.
        • by Desler ( 1608317 )

WebGL grew out of the Canvas 3D experiments started by Vladimir Vukićević at Mozilla. Vukićević first demonstrated a Canvas 3D prototype in 2006. By the end of 2007, both Mozilla[6] and Opera[7] had made their own separate implementations.
          In early 2009 Mozilla and Khronos started the WebGL Working Group.

          Sorry, what eventually became WebGL originated from Mozilla and then later Mozilla and Khronos started the working group to standardize it. So yes, you can say it originated from them.

        • If my kid throws a rock that breaks your window, I probably won't get away with "Yeah, I'm a part of his founding body, but that rock didn't originate with me". It's technically true but effectively worthless as a disclaimer.
          • by Desler ( 1608317 )

            It's not even technically true since the Canvas 3D that became standardized as WebGL was originally created and implemented by Mozilla.

          • Desler's statement was that the WebGL standard originates "from Mozilla". Statement is false.

Why do you feel the need to defend something so obviously wrong? No one's playing the blame game in this thread; they're just fighting gross misinformation.

    • by LO0G ( 606364 )

      That's... Interesting. Buffer overflow attacks were once considered "speculative at best".

      Here's a question: What happens when you take drivers which were designed to run only local content (and thus have never been hardened against malicious content) and expose their entire API surface to the internet?

      The answer is similar to the answer to the question: What happens when you take network services which were designed to run only intranet content (and thus have never been hardened against malicious content

      • Here's a question: What happens when you take drivers which were designed to run only local content (and thus have never been hardened against malicious content) and expose their entire API surface to the internet?

        The driver vendors are coerced into hardening their drivers against malicious content. After that, web developers can write awesome apps, and card vendors can sell more cards, so everybody profits!

  • Horrible Article (Score:5, Interesting)

    by ace123 ( 758107 ) on Friday May 13, 2011 @12:10PM (#36119830) Homepage

This is nothing more than a scary article about the well-known risk of denial-of-service and using shaders to extract pixels from a remote image -- and the media is slurping it up, using sentences like "run arbitrary code on the GPU ... render an entire machine unusable". Ugh.

It's a completely over-hyped article about something the spec designers have known since day one. The article treats the fact that bad OpenGL drivers can crash a computer as a security hole, something which driver vendors are actively participating in resolving for future cards.

    I wasted my time reading that whole report a few days ago, and it basically said nothing that wasn't obvious and well-known. The only thing new is they are showing that there is no way to stop GPU code from extracting pixels from remote images embedded in a canvas, which is a real "security" hole, though there's not a whole lot of use for this.

    Basically, the extent to which this *should* affect webgl is that they will disallow textures from remote sites -- in other words, it could add an extra annoying implementation step for collaborative spaces that could include models from multiple sites. Also, they might choose to add an Infobar to prevent arbitrary websites from crashing the computer or making it run slowly.

    However, thanks to the media slurping this up and using words like "run arbitrary code on the GPU", "render an entire machine unusable", etc., people who read these articles and know nothing about the subject (i.e. idiots) will start to ask browser vendors to turn those features off. But to be honest, I hope people aren't this stupid and "FUD"-y articles like these are forgotten.

    Also, the title is plain misleading -- a denial of service attack on buggy drivers should not be described as "Leaves GPU Exposed". A website can not in any way take advantage of crashing a user's computer, and browser vendors will quickly respond with a blacklist patch when they learn of the affected GPU.

    If you disagree with anything I said, feel free to comment and I'll explain in more detail why what the article describes are not "security issues" in WebGL.
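The fix sketched above -- disallowing textures from remote sites -- amounts to a same-origin check before an image may be bound as a WebGL texture (browsers later relaxed this for CORS-approved images). A rough sketch of that kind of check, with hypothetical names, not any browser's real API:

```python
# Sketch of a same-origin policy check for WebGL textures: an image
# loaded from another origin is treated as tainted and rejected as
# shader input. All names here are illustrative.

from urllib.parse import urlparse

def origin(url):
    """An origin is the (scheme, host, port) triple of a URL."""
    p = urlparse(url)
    return (p.scheme, p.hostname, p.port)

def can_use_as_texture(page_url, image_url):
    """Allow only same-origin images (this toy ignores CORS headers)."""
    return origin(page_url) == origin(image_url)

page = "https://example.com/game.html"
print(can_use_as_texture(page, "https://example.com/tex.png"))   # True
print(can_use_as_texture(page, "https://victim.org/secret.png")) # False
```

The cost, as the parent notes, is exactly the annoying implementation step for collaborative spaces: a model pulled from another site can no longer be textured directly.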

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      "A website can not in any way take advantage of crashing a user's computer"

      Except those crashes are usually caused by buffer overflows which eventually lead to a well-crafted attack that causes remote code execution.

      • by ace123 ( 758107 )

        You said usually. That's not true in 99% of cases -- if you read the article (which is unfortunately slashdotted), the specific crashes in this case are in locking up the GPU itself by taking too long to render frames. This means that the computer will reset due to its watchdog timer, not because any malicious code was executed on the CPU.

        You are correct that the occasional graphics driver might be buggy, and that's why Mozilla has a whitelist of graphics manufacturers and cards that are actively patching t

    • Re:Horrible Article (Score:4, Interesting)

      by Mysteray ( 713473 ) on Friday May 13, 2011 @12:40PM (#36120186)

      I agree it's misleading to imply that there's a specific 'flaw' that leaves the GPU 'exposed'. That's the entire point of WebGL: to expose the GPU to web applications. Whether or not you think that's a good idea depends on where you fall on the security vs. functionality spectrum. It's an interesting discussion.

      Look at it this way: GPUs are extremely complex hardware/software combination systems representing a huge attack surface. They're designed either for zero-cost (integrated graphics) or maximum game performance. Security has never been a big driver for this market. Newer graphics engines like WebGL allow the GPUs to be programmed with somewhat arbitrary code. These programs need lightning-fast parallel access to several different kinds of memory and the security model for this programming environment looks something like an afterthought.

      Once again, the developers probably thought they didn't need to put security first since the primary use case was running trusted applications on single-user systems (e.g., games).

      It's not uncommon to see crash bugs in GPU systems. They look a heck of a lot like the blue screens that used to plague MS Windows. There's no reason to think these bugs will be any less exploitable than those of Windows XP SP 0. We've seen this play out with Adobe Acrobat reader, Flash, and any number of other binary browser plugins. Hopefully the graphics developers are better, but their challenge is much harder too.

In short, all the ingredients in the recipe for disaster are present. It's probably only a matter of time before exploitable vulnerabilities surface. I don't think we should kill off WebGL altogether, but the right thing to do is to put the focus on its security.

      Personally, I look forward to using it, but I'm going to turn it off by default. I'm counting on noscript to let me enable it selectively. This is just good practice anyway.

    • However, thanks to the media slurping this up and using words like "run arbitrary code on the GPU", "render an entire machine unusable", etc., people who read these articles and know nothing about the subject (i.e. idiots) will start to ask browser vendors to turn those features off.

      The other response said it better, but you must disable WebGL if you want a secure browsing experience. It's going to take ten years for the manufacturers to get this right: you will still be reading about consequential WebGL

      • Right, we've seen many security companies trying some publicity stunts with half-assed theoretic threats.

That's quite a dangerous world we live in, with all those "specially constructed JPEG images that can execute arbitrary code" and all.

Right now the only plausible thing in there is DoS - which would be the fault of drivers, not an inherent flaw in the standard.

        Crossdomain image data access could be a security breach, but it won't "make it very difficult to secure".

        • That's quite dangerous world we live in, with all those "specially constructed JPEG images that can execute arbitrary code" and all.

          Right now the only plausible thing in there is DoS - which would be fault of drivers, not inherent standard flaw.

          Don't be so sure... it only took 14 days for the GDI+ JPEG exploit [infosecwriters.com] to go from being a "half-assed theoretic" threat to being an "actually attacking people's computers out there" threat. Security companies certainly try to scare up business--I won't disagree with you there--but theoretical threats are the larval form of actual threats.

I don't think you can just blame the driver manufacturers either. The exploits that will be found are inevitable given the history and realities of graphic driver development,

Full disclosure, I know very little about WebGL other than a few things I've read and what's in this particular TFA. On the other hand, I consider myself quite knowledgeable about computers in general. I'm wondering what the feasibility is of some sort of social engineering attack based around accessing the video buffers in the GPU and essentially snooping what other windows are open on your computer. Complex examples I thought of involve using OCR to determine information about the user that will convince t

    • I especially like this part:

      Fundamentally, WebGL now allows full (Turing Complete) programs from the internet to reach the graphics driver and graphics hardware which operate in what is supposed to be the most protected part of the computer (Kernel Mode)

      It elegantly scares ordinary people with sciency words and silently implies equality/correlation between "arbitrary (Turing Complete) programs sent for GPU execution via kernel-level drivers" and "arbitrary programs executed in the most protected part of the computer (Kernel Mode)".

      By the way, aren't there moddable games out there that allow modders to use their own shaders?
      I can't seem to remember, when was the last "a malicious shader in mod for game X exploits the vulnerabilit

  • by Anonymous Coward

    gpu shaders cannot access main memory. if there are driver errors that cause undesired side effects these should be fixed but the underlying architecture is sound.

    can we please move beyond 2D graphics. go whine about real security problems. this is not one of them.

    • by Desler ( 1608317 )

      go whine about real security problems. this is not one of them.

      I seem to remember Microsoft telling us the same thing about ActiveX...

    • by 0123456 ( 636235 )

      gpu shaders cannot access main memory.

      They can if you configure a texture buffer in main memory, e.g. by exploiting a driver bug to configure the buffer in the wrong place.

      Though I agree that the warnings in this post and the previous one are overhyped.
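The bug class described in the comment above -- a texture buffer ending up in the wrong place via a driver bug -- can be modeled in a few lines. This is a purely illustrative toy, not real driver behavior: the point is that the shader's own accesses stay in bounds, and safety rests entirely on where the driver placed the window.

```python
# Toy model of a buffer-placement driver bug: a texture "window" into
# memory mapped at the wrong offset lets perfectly in-bounds shader
# reads return data outside the intended buffer.

MEMORY = list(range(100))          # pretend address space
TEX_BASE, TEX_SIZE = 10, 16        # where the texture is supposed to live

def shader_read(base, offset):
    """Relative read; the shader never exceeds TEX_SIZE, but safety
    depends entirely on the driver having chosen `base` correctly."""
    assert 0 <= offset < TEX_SIZE   # the shader itself never goes OOB
    return MEMORY[base + offset]

legit = shader_read(TEX_BASE, 0)    # reads the texture, as intended
leaked = shader_read(50, 0)         # buggy driver set base=50: a leak
print(legit, leaked)                # 10 50
```

This is why "the shader is range-checked" and "the system is safe" are different claims: the range check is only as good as the mapping it checks against.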

  • by asa ( 33102 ) <asa@mozilla.com> on Friday May 13, 2011 @12:21PM (#36119970) Homepage

    Slashdot really should have published a link to this response from Mozilla http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]

    • by Desler ( 1608317 )

      Yes, the most non-biased source of information is an employee of the company who originated the technology and was a founding member of the standardization group.

  • I've already had a newer version of Firefox crash an older X11 display driver. Absolutely rock solid on Firefox 3.6 and down, and every other program I want to run. But the new GPU acceleration in Firefox? Could cause all of X11 to go away.

    And Flash inside Firefox would pretty much guarantee a visit from the Coredump Gods. Fixed with a newer driver, but man is updating annoying.

    Kinda sad that "web browsing" is the most intense job run on my work machine. I would have thought the massively parallel buil

  • First, WebGL sends shader source code to the browser and the code is compiled and executed in OpenGL. This is no different from running any other OpenGL program on your machine. The remote attacker cannot make the GPU execute arbitrary hardware instructions, only whatever source he sends.
The shaders pretty much execute in a sandbox (a shader on the GPU can only access bound buffers: textures, vertex buffers, constant buffers, render targets, etc.). The access outside these buffers is not possible because the ha
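The range checking described in the comment above can be sketched in miniature: a "shader" may only touch buffers explicitly bound to it, and any index outside a bound buffer is rejected rather than reaching other memory. This is a conceptual sketch, not how any real GPU enforces it:

```python
# Minimal sketch of bound-buffer range checking: in-range reads succeed,
# anything past the binding raises instead of touching other memory.

class BoundBuffer:
    def __init__(self, data):
        self.data = list(data)

    def read(self, i):
        if not 0 <= i < len(self.data):   # hardware-style range check
            raise IndexError("access outside bound buffer rejected")
        return self.data[i]

texture = BoundBuffer([7, 8, 9])
print(texture.read(2))     # 9: inside the binding, allowed
try:
    texture.read(3)        # one past the end: blocked
except IndexError as e:
    print(e)
```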

  • by KewlPC ( 245768 ) on Friday May 13, 2011 @02:24PM (#36121364) Homepage Journal

    As with the previous article, this is much ado about nothing.

    The GPU can only run "arbitrary code" in the loosest possible sense. What happens is that an OpenGL or WebGL application gives the shader source code to the driver, which then compiles it into the native GPU instructions. You *can* pre-compile your shaders in OpenGL ES 2.0, but even then it's just intermediary bytecode, and the bytecode is vendor-specific.

Furthermore, GLSL, the language used for OpenGL and WebGL shaders, is *very* domain-specific. It has no pointers, and no language support for accessing anything outside the GPU other than model geometry and texture data. *AND* it can only access the model geometry and texture data that the application has provided to it, and for GPUs that don't have any on-board VRAM it's up to the *driver* to determine where in shared system memory the texture will be located.

    And you can't get around using shaders on modern GPUs. Modern GPUs don't have a fixed function pipeline, it's not in the silicon at all. For apps that try to use the old OpenGL fixed function pipeline, the driver generates shaders that do what the fixed function pipeline *would* have done based on the current state. Drivers won't keep emulating the old fixed function pipeline forever, though.

    • by KewlPC ( 245768 )

      Also, Windows (the most likely target of any attack) has had the ability since Vista to restart the GPU if it hangs (which is the only real attack possible when it comes to shaders: use a shader that is so computationally intensive the GPU becomes unresponsive). This isn't bullet proof, of course, but if Windows isn't able to restart the GPU after a few seconds of unresponsiveness then that's a *Windows* bug.
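The watchdog mechanism described above (Windows calls it Timeout Detection and Recovery; the default delay is reportedly around two seconds) reduces the long-running-shader attack to an annoyance. A sketch of the pattern with a simulated clock instead of a real GPU -- the constant and names are illustrative:

```python
# Sketch of a TDR-style watchdog: any "GPU" job exceeding its time
# budget is killed and the device reset, so a DoS shader costs the
# attacker a driver reset rather than the whole machine.

TIMEOUT = 2.0  # seconds of unresponsiveness tolerated (illustrative)

def run_with_watchdog(job_duration):
    """Return how a simulated GPU job ends under the watchdog."""
    if job_duration > TIMEOUT:
        return "reset"      # shader hogged the GPU: driver restarts it
    return "completed"

print(run_with_watchdog(0.016))   # a normal 60 fps frame: completed
print(run_with_watchdog(30.0))    # a DoS shader: reset, desktop survives
```

As the parent says, a watchdog isn't bulletproof -- a failure to recover after the timeout is itself an OS or driver bug -- but it turns "render the machine unusable" into "lose a couple of seconds."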

  • What on earth could go wrong... *facepalm*

    But I guess it still beats the shit of the disease that is Flash.
