WebGL Flaw Leaves GPU Exposed To Hackers
recoiledsnake writes "Google spent a lot of time yesterday talking up WebGL, but UK security firm Context seems to think users should disable the feature because it poses a serious security threat, and the US Computer Emergency Readiness Team is encouraging people to heed that advice. According to Context, a malicious site could pass code directly to a computer's GPU and trigger a denial of service attack or simply crash the machine. Ne'er-do-wells could also use WebGL and the Canvas element to pull image data from another domain, which could then be used as part of a more elaborate attack. Khronos, the group that organizes the standard, responded by pointing out that there is an extension available to graphics card manufacturers that can detect and protect against DoS attacks, but it did little to satisfy Context — the firm argues that inherent flaws in the design of WebGL make it very difficult to secure."
Horrible Article (Score:5, Interesting)
This is nothing more than a scare piece about the well-known risk of denial of service and of using shaders to extract pixels from a remote image -- and the media is slurping it up, with sentences like "run arbitrary code on the GPU ... render an entire machine unusable". Ugh.
It's a completely over-hyped article about something the spec designers have known since day one. It treats the fact that buggy OpenGL drivers can crash a computer as a security hole, something driver vendors are already actively working to resolve for future cards.
I wasted my time reading that whole report a few days ago, and it basically said nothing that wasn't already obvious and well known. The only new thing is that they show there is no way to stop GPU code from extracting pixels from remote images embedded in a canvas, which is a real "security" hole, though there's not a whole lot of use for it.
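For context, the pixel extraction Context demonstrated is a timing side channel: a shader is written so that its run time depends on the value of the pixel it reads, and the attacker infers the pixel by measuring render times. Here is a minimal sketch of the principle in plain JavaScript -- no GPU involved; `secretPixel` and the work loop are purely illustrative stand-ins for a texture fetch and a value-dependent shader branch:

```javascript
// Illustrative model of a value-dependent timing channel (no WebGL here).
// A real attack times GPU frames; we just count value-dependent work.

// Stand-in for one pixel of a cross-origin image the attacker
// cannot read directly (hypothetical value).
const secretPixel = 137; // 0..255

// Stand-in for a shader whose cost depends on the pixel value:
// it does `value` units of work, so "render time" leaks the value.
function renderCost(value) {
  let work = 0;
  for (let i = 0; i < value; i++) work++;
  return work; // proxy for elapsed time
}

// The attacker only observes cost, never the pixel itself, and
// recovers the value by comparing against costs of known values.
function extractPixel(observeCost) {
  const observed = observeCost();
  for (let guess = 0; guess <= 255; guess++) {
    if (renderCost(guess) === observed) return guess;
  }
  return -1;
}

const recovered = extractPixel(() => renderCost(secretPixel));
```

The point is that no read access to the secret is ever needed; the leak rides entirely on how long the work takes.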
Basically, the extent to which this *should* affect WebGL is that implementations will disallow textures from remote sites -- in other words, it could add an extra annoying implementation step for collaborative spaces that pull models from multiple sites. They might also add an infobar to stop arbitrary websites from crashing the computer or making it run slowly.
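Disallowing remote textures unless the remote site opts in amounts to an origin check before texture upload. A hedged sketch of that policy follows -- the function and data shapes are hypothetical, loosely modeled on the CORS rules browsers apply to cross-origin images:

```javascript
// Hypothetical policy check a browser might run before allowing an
// image to be uploaded as a WebGL texture.
function mayUseAsTexture(pageOrigin, image) {
  // Same-origin images are always allowed.
  if (image.origin === pageOrigin) return true;
  // Cross-origin images are allowed only if the remote server
  // explicitly opted in via CORS (Access-Control-Allow-Origin).
  return image.corsApproved === true;
}

// Example: a page on example.com considering three images.
const sameSite   = { origin: 'https://example.com', corsApproved: false };
const remote     = { origin: 'https://other.net',   corsApproved: false };
const remoteCors = { origin: 'https://other.net',   corsApproved: true };
```

So `remote` would be rejected while `sameSite` and `remoteCors` go through -- which is exactly the "extra annoying step" for multi-site content: the other sites have to serve their models and textures with CORS headers.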
However, thanks to the media slurping this up and using phrases like "run arbitrary code on the GPU" and "render an entire machine unusable", people who read these articles and know nothing about the subject (i.e. idiots) will start asking browser vendors to turn those features off. But to be honest, I hope people aren't that stupid and that FUD articles like these are quickly forgotten.
Also, the title is plain misleading -- a denial-of-service attack on buggy drivers should not be described as "Leaves GPU Exposed". A website cannot in any way take advantage of crashing a user's computer, and browser vendors will quickly respond with a blacklist patch once they learn which GPUs are affected.
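The blacklist response mentioned above is in fact what browsers do: ship a list of known-bad GPU/driver combinations and fall back to software rendering (or disable WebGL) on a match. A rough sketch, with a made-up list format -- real browsers use richer rules, such as OS and driver-version ranges:

```javascript
// Hypothetical driver blacklist: disable WebGL when the user's
// GPU/driver matches a known-crashy entry (vendor/device made up).
const blacklist = [
  { vendor: 'ExampleCorp', device: 'GX100', maxBadDriver: '8.15' },
];

// Compare dotted version strings numerically: is a <= b?
function versionLte(a, b) {
  const pa = a.split('.').map(Number), pb = b.split('.').map(Number);
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const x = pa[i] || 0, y = pb[i] || 0;
    if (x !== y) return x < y;
  }
  return true; // equal versions still count as affected
}

function webglAllowed(gpu) {
  return !blacklist.some(entry =>
    entry.vendor === gpu.vendor &&
    entry.device === gpu.device &&
    versionLte(gpu.driver, entry.maxBadDriver));
}
```

A driver update past the bad range re-enables the feature without the user doing anything.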
If you disagree with anything I said, feel free to comment and I'll explain in more detail why what the article describes are not "security issues" in WebGL.
Re:Horrible Article (Score:4, Interesting)
I agree it's misleading to imply that there's a specific 'flaw' that leaves the GPU 'exposed'. That's the entire point of WebGL: to expose the GPU to web applications. Whether or not you think that's a good idea depends on where you fall on the security vs. functionality spectrum. It's an interesting discussion.
Look at it this way: GPUs are extremely complex hardware/software systems representing a huge attack surface. They're designed either for zero cost (integrated graphics) or for maximum game performance; security has never been a big driver in this market. Newer APIs like WebGL let GPUs be programmed with somewhat arbitrary code. These programs need lightning-fast parallel access to several different kinds of memory, and the security model for this programming environment looks like an afterthought.
Once again, the developers probably thought they didn't need to put security first since the primary use case was running trusted applications on single-user systems (e.g., games).
It's not uncommon to see crash bugs in GPU systems; they look a heck of a lot like the blue screens that used to plague MS Windows. There's no reason to think these bugs will be any less exploitable than those of Windows XP SP0. We've seen this play out with Adobe Acrobat Reader, Flash, and any number of other binary browser plugins. Hopefully the graphics developers are better, but their challenge is much harder, too.
In short, all the ingredients of a recipe for disaster are present. It's probably only a matter of time before exploitable vulnerabilities surface. I don't think we should kill off WebGL altogether, but the right thing to do is put the focus on its security.
Personally, I look forward to using it, but I'm going to keep it turned off by default. I'm counting on NoScript to let me enable it selectively. This is just good practice anyway.
Second Life - Already A Proven Attack Vector (Score:1, Interesting)
In Second Life, denial-of-service attacks on the GPU are going on as we speak. People who gather at infohubs to voice chat are occasionally knocked offline by them; it can clear a room full of people in seconds. Basically, the attacker wears a spiked sculpty (a texture that turns into geometry) that is far more work than the GPU can handle rendering. Your screen essentially looks like static, you lag like hell, and then you crash hard. The only way to stop it is if you get lucky and smash the key combo for disabling Volumes, or already had it disabled -- the combo is CTRL ALT SHIFT 9. Volumes are prims (geometry primitives), which include sculpties as well as regular geometry (cubes, spheres, toruses, etc.), so disabling them makes hair, buildings, floors, walls, jewelry, shoes, etc. all vanish. You're left looking at a pretty blank world with more or less naked avatars. The people who commit these attacks are usually either trolling for the lulz, or they have a chip on their shoulder because they've had their asses handed to them in an argument over mic in that hub -- which is often the case. I'd say it happens at least 2 or 3 times a day, if not more, in the korea1 infohub for example. It's also been reported that it can potentially damage your graphics card. If you can't stop it with the key combo fast enough, you're best off killing the application with Task Manager as soon as you notice it happening.
Re:ffs (Score:4, Interesting)
Can you promise that no SIMD scatter can be performed with offsets that it shouldn't?
Yes I can. For instance, on ATI r6xx it can only go to a surface defined by SX_MEMORY_EXPORT_BASE/SX_MEMORY_EXPORT_SIZE, described on page 127 here: http://www.x.org/docs/AMD/R6xx_3D_Registers.pdf
In addition, system memory is mapped to the GPU via the GPU VM page table, so the GPU can only touch pages that were allocated by the process and that the kernel driver mapped into the VM graphics context. See /usr/src/linux/drivers/gpu/drm/radeon and grep for VM_.
So there are two layers of hardware-enforced protection, in addition to a software command-buffer parser that checks the addresses. Safe enough for you?
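That software layer -- a command-buffer parser validating addresses -- can be sketched as a bounds check of every GPU address against the buffers the kernel driver actually mapped for that context. A simplified illustration, not the actual radeon checker:

```javascript
// Simplified model of a command-stream checker: every address the
// submitted command buffer references must fall inside a buffer
// object (BO) that the kernel mapped for this GPU context.
function validateCommands(mappedBuffers, commands) {
  const inSomeBuffer = addr =>
    mappedBuffers.some(bo => addr >= bo.base && addr < bo.base + bo.size);
  // Reject the whole submission if any referenced address is unmapped.
  return commands.every(cmd => inSomeBuffer(cmd.addr));
}

// One mapped BO covering 0x1000..0x1fff for this context.
const ctx = [{ base: 0x1000, size: 0x1000 }];
```

A submission touching only mapped memory passes; one reaching outside any BO is rejected before the hardware ever sees it.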