WebGL Flaw Leaves GPU Exposed To Hackers
recoiledsnake writes "Google spent a lot of time yesterday talking up WebGL, but UK security firm Context seems to think users should disable the feature because it poses a serious security threat, and the US Computer Emergency Readiness Team is encouraging people to heed that advice. According to Context, a malicious site could pass code directly to a computer's GPU and trigger a denial of service attack or simply crash the machine. Ne'er-do-wells could also use WebGL and the Canvas element to pull image data from another domain, which could then be used as part of a more elaborate attack. Khronos, the group that organizes the standard, responded by pointing out that there is an extension available to graphics card manufacturers that can detect and protect against DoS attacks, but it did little to satisfy Context — the firm argues that inherent flaws in the design of WebGL make it very difficult to secure."
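For the curious, here is a minimal sketch of the cross-domain image step Context describes, assuming a 2011-era browser that still permits cross-origin textures (the URL and the timing step are illustrative assumptions, not Context's actual proof of concept):

// Sketch of the cross-domain texture step (pre-restriction browser
// assumed). readPixels() on the result is blocked, so the leak is
// indirect: a shader whose running time depends on pixel values is
// timed frame by frame. The URL below is hypothetical.
var canvas = document.createElement('canvas');
var gl = canvas.getContext('experimental-webgl');
var img = new Image();
img.onload = function () {
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // The other domain's image is now raw texel data in GPU memory.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
  // ...render it with a pixel-dependent shader and time each frame...
};
img.src = 'http://victim.example/private-image.png';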
dupe (Score:4, Informative)
dupe dupe dupe
Re:dupe (Score:4, Insightful)
Do you mean that the article is a dupe, or that Google is duplicating the mistake Microsoft made with ActiveX and the whole "it is so convenient to let anyone in the world do whatever they please on my computer" mentality?
Re: (Score:2)
It seems incredible that Google would allow such a blatant security hole after making so much effort to sandbox and partition Chrome.
Re: (Score:3, Insightful)
Don't worry, just like with the previous story they'll just claim it wasn't a flaw in Chrome (despite it bypassing the Chrome sandbox) and downplay it.
The whole thing... (Score:4, Insightful)
The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know. There is no remedy for this kind of error, either -- once you hand your data over, you have lost control of it, and in turn, you have lost control over the consequences of random third parties misusing your information.
The good news is that we have a broad set of extremely powerful applications available to us that run well in the local environment. Word processors, spreadsheets, sound, image and video editors, music and video library engines, educational software and a whole host more are all very well populated with traditional applications, so for the thinking user, there is no need to "go to the cloud" for classic compute tasks. Instead, the net can be used for communications, both as its heritage dictates and as the most sensible domain fit, while personal data and execution permissions remain secure in and at the local environment.
To help protect yourself, I suggest beginning by disabling Flash and scripting, and using only CSS/HTML in the web-facing interface. As a side benefit, surfing is much more pleasant without pop-overs, Flash ads, and many other corporate infections of the network.
Neither Google nor any other corporation has your best interests in mind. Start from that understanding, and the world will make considerably more sense.
Re: (Score:3, Insightful)
insightful users (a minority, as always) will resist this trend with traditional in-machine applications and fully local storage of data.
Let's hope those insightful users are also insightful enough to actually have backups.
The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know.
And it's not possible to avoid both of these, sorry. In fact, it's not possible to avoid the latter at all.
The good news is that we have a broad set of extremely powerful applications available to us that run well in the local environment.
The bad news is that any local application has at least as much access as these web apps do.
To help protect yourself, I suggest beginning by disabling Flash,
Thus "protecting" yourself from YouTube, FreeFillableForms (the only way to file US taxes online that I know of), etc.
scripting
Thus "protecting" yourself from things like Gmail, Google Instant Search, and... Do I really need to spell it
Re: (Score:1)
The browser application has only the data I allow it.
You really think you have that much control? I can assure you that you don't know if your entire disk has been uploaded or not..
Re: (Score:2)
And how's that? Barring a vulnerability in the browser, that's not going to happen. Barring a massive conspiracy through a fairly large open source community, that vulnerability is not going to be deliberate.
Do I really need to spell this out?
Re: (Score:1)
Nothing to do with open source..
Look in the hardware, where the trade secrets hide out.. There is nothing to protect you from that.. In fact the law already protects them from your attempts at discovery... I shouldn't need to spell that out..
Re: (Score:2)
Which again makes the browser completely irrelevant to this conversation.
If they've pwned my hardware, or my OS, or anything else that runs at Ring 0 or better, I'm fucked. Hell, even if it's not in Ring 0, if it just has root, I'm fucked. May as well give up and go home. I have no choice but to trust those who have that amount of control over my machine.
However, I don't have to trust more than that. For instance, so far as I know, Blackboard hasn't contributed to any of the hardware or root-level software
Re: (Score:1)
Which again makes the browser completely irrelevant to this conversation.
That might explain why I didn't mention the browser, no?
I have no choice but to trust those who have that amount of control over my machine.
Correct
You sure?
Really, what was your point?
You said it.. trust no one.. Trust nothing you can't verify.. Otherwise feel free to take your chances, but until you can verify it, you won't convince me that you have any privacy in networked commu
Re: (Score:2)
Which again makes the browser completely irrelevant to this conversation.
That might explain why I didn't mention the browser, no?
Then we've gone pretty far offtopic.
...but they don't get my credit cards, SSN, email, etc.
You sure?
As sure as I reasonably can be. Where would they get them from?
You said it.. trust no one.. Trust nothing you can't verify..
These are contradictory statements, unless you're going to claim you can't verify anyone or anything.
until you can verify it, you won't convince me that you have any privacy in networked communications...
I can verify, through at least one open source implementation, how SSL works, at every level. I know precisely what the risks are before I deem them acceptable.
I can verify the security of any browser I choose, and what it allows or doesn't allow. I can verify the security of every plugin I allow except Flash,
Re: (Score:2)
Yes, let's. Of course, on the other hand, you have to hope Google hasn't lost your email or isn't down when you need it, don't you? As opposed to your trusty email program, which is right there all the time. And lets you do things like automatically assign identities to your replies, or delete your attachments, or view your email in non-prop fonts... and doesn't surf your email for keywords... and doesn't have employees
Re: (Score:2)
Yes, let's. Of course, on the other hand, you have to hope Google hasn't lost your email or isn't down when you need it, don't you?
That's a bit unfair, don't you think? Aside from how rare this is, if we can assume savvy users are going to back things up, why wouldn't we assume those same users know how to back up Gmail?
You can certainly avoid doing it over channels like the Internet, though,
Not if you're going to be as paranoid as you'd like to be. How do you know a given local app isn't phoning home with your data? How do you know your hardware isn't lying to you about it?
For that matter, if you're going to avoid executing software which has come to you through the Internet, you aren't going to be patching
Re: (Score:2)
Thanks so much for taking the time to write this. Great points all around. Well played.
Re: (Score:2)
No, I don't think it's unfair at all. What does "Google is down" mean to someone trying to use GMail? It can mean Google itself isn't responding, which happens from time to time. It can also mean that the path between Google and you is down, which also happens. Or it can mean that the path is so clogged with other material that using GMail isn't practical. Or it can mean that your local ISP or the company network is down or doing maintaina
Re: (Score:2)
That's a bit unfair, don't you think? Aside from how rare this is...
No, I don't think it's unfair at all. What does "Google is down" mean to someone trying to use GMail? It can mean Google itself isn't responding, which happens from time to time.
Well, again, rare. Rare enough that it tends to make headlines.
It can also mean that the path between Google and you is down, which also happens.
Much more likely is the path from you to your ISP. Beyond that, we're again talking about the sort of outages which make headlines.
Or it can mean that the path is so clogged with other material that using GMail isn't practical. Or it can mean that your local ISP or the company network is down or doing maintenance. All of these are situations in which you can read and write email with a local client;
Unless you're using IMAP.
Or, if you want to be fair, you can enable offline mode for either Gmail or IMAP. It is exactly as simple for Gmail as it is for IMAP -- and even if it were more complex, it's possible to use IMAP with Gmail.
And then there are the substandard limits of GMail as compared to a full-featured client, of which there are plenty...
Which? I actually miss a few features from Gmail with my local clients, though I imag
Re: (Score:2)
You did not mention the whole "3rd party" thing either.
Yeahhhh........
Okay. Let's spend a fuck ton of money on MS SQL servers, coding departments, internal backups ONLY (remember... anything "offsite" is 3rd party, unless you are so rich in the IT budget you can afford TWO data centers), develop non-web-based front ends to applications (because anything browser-based is bad and stupid) where you need version control and a competent IT dept to keep all computers uniform and up to date to run your loc
Re: (Score:2)
To help protect yourself, I suggest beginning by disabling Flash and scripting, and using only CSS/HTML in the web-facing interface.
How about you get hold of something that lets you whitelist which sites are allowed to use Flash and Javascript...? That way your computer will still be able to do something useful.
Re: (Score:2)
If your computer can't do anything useful without flash and javascript, then frankly... well, never mind.
If you allow a site to operate those technologies in your execution space, you are allowing third -- and fourth, and 5th, and etc. -- parties access to your hardware. Not a good idea. Your "trusted" site will likely sell som
Re: (Score:1)
Well, I have bad news for you. As soon as you start up your computer, even before the operating system starts up, your c
Re: (Score:2)
The previous story was about how a plugin-- which by all counts is installed on >90% of machines-- had a flaw, and was exploitable on any of those machines. Google, in an attempt to ensure timely application of plugin updates, bundled that plugin with Chrome (which is one of the reasons I have started looking at MSI rollouts of Chrome-- if they're going to have Flash no matter what I say, at least it will be up to date).
Fair enough to call the software package "Chrome" vulnerable, while technically the p
Re: (Score:2)
"WebGL is implemented in Google Chrome[4], Mozilla Firefox 4 and in development releases of Safari and Opera."
Khronos Group is making WebGL. Blame them, not Google. Khronos Group also makes OpenGL. Google is just following standards and bragging about how they're implementing them first.
Re: (Score:1)
You don't have to implement every standard, and you certainly don't have to implement Khronos Group standards.
In fact, a company has a responsibility not to implement standards that are ill thought-out. Standard does not, by itself, mean good.
Re: (Score:2)
I've never been precisely clear on why "passing instructions directly to the GPU" is different from YouTube / h.264 "passing data directly to be decoded by GPU" or JIT'd JavaScript "being passed directly to the CPU". Surely it is possible to do such things securely? It's not like WebGL allows, for example, access to the GPU BIOS, or overclocking functions, or fan speed, and presumably they would limit the other things that can be done.
I mean, games like WoW display data on the graphics card that is pulled o
Re: (Score:2)
Did the security team take a vacation or what?
Security team? I don't think you and I are referencing the same Microsoft. 8*)
What we need is a new language, built from the bottom up for security, that can do CPU and GPU virtualization so the code doesn't need to know squat about what you have to run the site.
It's called Java. People rejected it because it was virtualized and ran in a sandbox 8*(
Re: (Score:2)
Do you mean that the article is a dupe, or that Google is duplicating the mistake Microsoft made with ActiveX and the whole "it is so convenient to let anyone in the world do whatever they please on my computer" mentality?
Hopefully the former, because the latter just isn't true. TFA has fearmongering talk about how WebGL gives websites "direct access to the hardware", but that just isn't true. All it allows websites to do is to request a restricted set of operations. Furthermore, what data they can access is also restricted by a combination of hardware and software checks. What's more, we've already had examples of graphics driver bugs that could be exploited by websites - for remote code execution, even - before WebGL even
I asked about this at Google I/O! (Score:5, Insightful)
http://www.youtube.com/watch?v=WgbK0ztUkDM&feature=player_detailpage#t=3195s [youtube.com] is the video. In short, I asked the NaCl guy whether they knew what they were doing by letting NaCl clients access GPUs directly. His response was that they were doing everything WebGL does to protect the system from malicious code. That's unfortunately not sufficient.
Second Life - Already A Proven Attack Vector (Score:1, Interesting)
In Second Life there are denial of service attacks on the GPU going on as we speak. People who gather at infohubs to voice chat are occasionally knocked offline by them. It can clear a room full of people in seconds. Basically, the attacker wears a spiked sculpty (a texture that turns into geometry) that takes far more processing to render than the GPU can handle. It looks like static on your screen essentially, you lag like hell, and then you crash hard. The only way to stop it is if you get lucky and smash t
Re: (Score:2)
Well, that's the new Google engineers' card: blame others.
The whole idea of NaCl, though, is like going back to the Windows 3.11 era (it's actually convenient if a document can have code in it, in a trusted environment; Flash and such are halfway houses to that), really.
But who would use an OS without native binaries? Accountants, maybe, for a short while. A really short while.
Re: (Score:2)
Ahhh... well you see NaCl really means Sodium Chloride or common table salt.
Sodium or Chlorine by themselves are toxic, yet when combined they cancel out each other's effects and make our food yummier.
This is how Google plans to use native code execution and direct GPU access together. True, separately they are dangerous and leave a computer vulnerable to hackers. Yet, when combined, incredibly they cancel out each other's effects and make the internet yummier!
When will it end? (Score:1)
Isn't there a point at which keeping your equipment safe from intruders is such a hassle that it's no longer worth even having the equipment? This constant battle to defend your computer is tiresome. I can't imagine not having a computer, but this is exhausting.
Re:When will it end? (Score:4, Funny)
Welcome to "Everyone Else"—we're happy to have you as a member! Here's your complimentary iPad.
Re: (Score:2)
The iPad is still vulnerable to flaws in the browser and apps - for example, the PDF exploits which enabled 1-click jailbreak (and could just as easily enable 1-click malware) - and to flaws in the app review process - which enabled a tethering app to be approved and installed by thousands before it was taken down.
Re: (Score:2)
Yeah, and a Honda Civic isn't immune to mechanical problems. But when it breaks down, it's somebody else's problem.
Re: (Score:2)
You can fix your own Civic. A better example might have been a Maybach, a Nissan GTR or a McLaren F1.
"speculative at best..." (Score:2)
http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]
Re: (Score:1)
This just in: Guy who works for Mozilla downplays issues in a standard originating from Mozilla.
Re: (Score:1)
WebGL grew out of the Canvas 3D experiments started by Vladimir Vukićević at Mozilla. Vukićević first demonstrated a Canvas 3D prototype in 2006. By the end of 2007, both Mozilla[6] and Opera[7] had made their own separate implementations.
In early 2009 Mozilla and Khronos started the WebGL Working Group.
Sorry, what eventually became WebGL originated from Mozilla and then later Mozilla and Khronos started the working group to standardize it. So yes, you can say it originated from them.
Re: (Score:1)
It's not even technically true since the Canvas 3D that became standardized as WebGL was originally created and implemented by Mozilla.
Re: (Score:2)
Desler's statement was that the WebGL standard originates "from Mozilla". That statement is false.
Why do you feel the need to defend something so obviously wrong? No one's playing the blame game in this thread; they're just fighting gross misinformation.
Re: (Score:2)
That's... Interesting. Buffer overflow attacks were once considered "speculative at best".
Here's a question: What happens when you take drivers which were designed to run only local content (and thus have never been hardened against malicious content) and expose their entire API surface to the internet?
The answer is similar to the answer to the question: What happens when you take network services which were designed to run only intranet content (and thus have never been hardened against malicious content
Re: (Score:2)
Here's a question: What happens when you take drivers which were designed to run only local content (and thus have never been hardened against malicious content) and expose their entire API surface to the internet?
The driver vendors are coerced into hardening their drivers against malicious content. After that, web developers can write awesome apps, and card vendors can sell more cards, so everybody profits!
Re: (Score:2)
Yeah, and how long did it take Microsoft to harden Windows?
Re: (Score:2)
Back in the 1990s when you reported a buffer overflow to a company, their usual answer was "So what, it's only a theoretical vulnerability, it can't be used to attack our product - all you can do is crash my app, you can't get reliable remote code execution".
In the intervening 10 years, most companies no longer feel that way.
To me, the "speculative at best..." comment sounds disturbingly similar to those old dismissals of buffer overflows.
Horrible Article (Score:5, Interesting)
This is nothing more than a scary article about the well-known risk of denial-of-service and using shaders to extract pixels from a remote image -- and the media is slurping it up, using sentences like "run arbitrary code on the GPU ... render an entire machine unusable". Ugh.
It's a completely over-hyped article about something the spec designers have known since day one. The article takes the fact that bad OpenGL drivers can crash a computer to mean there's a security hole, something which driver vendors are actively participating in resolving for future cards.
I wasted my time reading that whole report a few days ago, and it basically said nothing that wasn't obvious and well-known. The only thing new is they are showing that there is no way to stop GPU code from extracting pixels from remote images embedded in a canvas, which is a real "security" hole, though there's not a whole lot of use for this.
Basically, the extent to which this *should* affect webgl is that they will disallow textures from remote sites -- in other words, it could add an extra annoying implementation step for collaborative spaces that could include models from multiple sites. Also, they might choose to add an Infobar to prevent arbitrary websites from crashing the computer or making it run slowly.
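For comparison, the 2D canvas already handles the same problem by "tainting": once a cross-origin image is drawn, pixel reads throw a security error. A minimal sketch (hypothetical URL) of the same-origin behavior WebGL would need to adopt for remote textures:

// Drawing a cross-origin image taints the canvas; reading it back fails.
var canvas = document.createElement('canvas');
var ctx = canvas.getContext('2d');
var img = new Image();
img.onload = function () {
  ctx.drawImage(img, 0, 0);        // allowed: the image can be displayed
  try {
    ctx.getImageData(0, 0, 1, 1);  // blocked: the canvas is now tainted
  } catch (e) {
    alert('origin-tainted canvas: ' + e.name); // SECURITY_ERR in 2011 browsers
  }
};
img.src = 'http://other-site.example/image.png'; // hypothetical URL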
However, thanks to the media slurping this up and using words like "run arbitrary code on the GPU", "render an entire machine unusable", etc., people who read these articles and know nothing about the subject (i.e. idiots) will start to ask browser vendors to turn those features off. But to be honest, I hope people aren't this stupid and "FUD"-y articles like these are forgotten.
Also, the title is plain misleading -- a denial of service attack on buggy drivers should not be described as "Leaves GPU Exposed". A website can not in any way take advantage of crashing a user's computer, and browser vendors will quickly respond with a blacklist patch when they learn of the affected GPU.
If you disagree with anything I said, feel free to comment and I'll explain in more detail why what the article describes are not "security issues" in WebGL.
Re: (Score:3, Insightful)
"A website can not in any way take advantage of crashing a user's computer"
Except those crashes are usually caused by buffer overflows which eventually lead to a well-crafted attack that causes remote code execution.
Re: (Score:2)
You said usually. That's not true in 99% of cases -- if you read the article (which is unfortunately slashdotted), the specific crashes in this case involve locking up the GPU itself by taking too long to render frames. This means that the computer will reset due to its watchdog timer, not because any malicious code was executed on the CPU.
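To make that concrete, here is a sketch of the kind of shader that trips the watchdog; the loop bound is an illustrative assumption. Nothing corrupts memory or runs on the CPU -- the GPU simply stops responding until the OS resets the driver:

// A fragment shader that does far too much work per pixel.
var gl = document.createElement('canvas').getContext('experimental-webgl');
var src =
  'precision mediump float;\n' +
  'void main() {\n' +
  '  vec4 c = vec4(0.0);\n' +
  '  for (int i = 0; i < 1000000; i++) {\n' +  // illustrative bound
  '    c += vec4(sin(float(i)));\n' +          // keeps the loop from being optimized away
  '  }\n' +
  '  gl_FragColor = c;\n' +
  '}\n';
var shader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(shader, src);
gl.compileShader(shader);
// Drawing with this shader stalls the GPU long enough to trigger the
// watchdog (e.g. Windows TDR after ~2 seconds), which resets the driver.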
You are correct that the occasional graphics driver might be buggy, and that's why Mozilla has a whitelist of graphics manufacturers and cards that are actively patching t
Re: (Score:1)
That's not a web standards issue.
Except for when that standard relies heavily on those very same video drivers just to work?
Re: (Score:1)
Read this article from a Mozilla dev:
http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]
Note the last paragraph:
Therefore, unless something is horribly wrong in a particular graphics driver (and Mozilla/Google were careful which companies they whitelist in this regard), the worst case is a bug in the code compiler -- which is probably about as like
Re: (Score:2)
Read this article from a Mozilla dev:
http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]
Note the last paragraph:
Therefore, unless something is horribly wrong in a particular graphics driver (and Mozilla/Google were careful which companies they whitelist in this regard), the worst case is a bug in the code compiler -- which is probably about as likely as a bug in any Javascript interpreter or Adobe Flash.
You are certainly aware then that the shader compilers in the OpenGL drivers are among the buggiest compilers I've ever worked with. In my experience, a bug in any of these shader compilers is, at least at the moment, much, much more likely than one in the browser's JS interpreter.
Also, they would need the "triple-step dance" for each different buggy driver, each of which would get patched in weeks at most, and each of which probably affects 0.1% of the market due to the diversity of hardware, OS and browser -- you have to consider payoff of exploit as well.
Uhm, 0.1% of the market? Not a chance. The compilers especially are shared between different drivers. Again, I can't speak for ATI or Intel, but nVidia has basically exactly one driver for all their platforms and hardware variants
Re:Horrible Article (Score:4, Interesting)
I agree it's misleading to imply that there's a specific 'flaw' that leaves the GPU 'exposed'. That's the entire point of WebGL: to expose the GPU to web applications. Whether or not you think that's a good idea depends on where you fall on the security vs. functionality spectrum. It's an interesting discussion.
Look at it this way: GPUs are extremely complex hardware/software combination systems representing a huge attack surface. They're designed either for zero-cost (integrated graphics) or maximum game performance. Security has never been a big driver for this market. Newer graphics engines like WebGL allow the GPUs to be programmed with somewhat arbitrary code. These programs need lightning-fast parallel access to several different kinds of memory and the security model for this programming environment looks something like an afterthought.
Once again, the developers probably thought they didn't need to put security first since the primary use case was running trusted applications on single-user systems (e.g., games).
It's not uncommon to see crash bugs in GPU systems. They look a heck of a lot like the blue screens that used to plague MS Windows. There's no reason to think these bugs will be any less exploitable than those of Windows XP SP 0. We've seen this play out with Adobe Acrobat reader, Flash, and any number of other binary browser plugins. Hopefully the graphics developers are better, but their challenge is much harder too.
In short, all the ingredients in the recipe for disaster are present. It's probably only a matter of time before exploitable vulnerabilities surface. I don't think we should kill off WebGL altogether, but the right thing to do is to put the focus on its security.
Personally, I look forward to using it, but I'm going to turn it off by default. I'm counting on NoScript to let me enable it selectively. This is just good practice anyway.
Re: (Score:2)
Adobe Flash vulnerabilities aren't so much security risks to the site; they're a risk to the user's computer running the browser. The attacker gets to choose whether to attack the browser via Adobe Flash or to attack it via WebGL. Unless the user is specifically running Firefox+NoScript, the site itself has little to do with it.
This is why the discussion about Flash and WebGL security is important for users to be involved in, not just hackers and website authors.
Re: (Score:1)
The other response said it better, but you must disable WebGL if you want a secure browsing experience. It's going to take ten years for the manufacturers to get this right: you will still be reading about consequential WebGL
Re: (Score:1)
Right, we've seen many security companies trying some publicity stunts with half-assed theoretic threats.
That's quite a dangerous world we live in, with all those "specially constructed JPEG images that can execute arbitrary code" and all.
Right now the only plausible thing in there is DoS - which would be the fault of the drivers, not an inherent flaw in the standard.
Cross-domain image data access could be a security breach, but it hardly makes WebGL "very difficult to secure".
Re: (Score:1)
That's quite a dangerous world we live in, with all those "specially constructed JPEG images that can execute arbitrary code" and all.
Right now the only plausible thing in there is DoS - which would be the fault of the drivers, not an inherent flaw in the standard.
Don't be so sure... it only took 14 days for the GDI+ JPEG exploit [infosecwriters.com] to go from being a "half-assed theoretic" threat to being an "actually attacking people's computers out there" threat. Security companies certainly try to scare up business--I won't disagree with you there--but theoretical threats are the larval form of actual threats.
I don't think you can just blame the driver manufacturers either. The exploits that will be found are inevitable given the history and realities of graphic driver development,
Re: (Score:1)
Full disclosure: I know very little about WebGL other than a few things I've read and what's in this particular TFA. On the other hand, I consider myself quite knowledgeable about computers in general. I'm wondering what the feasibility is of some sort of social engineering attack based around accessing the video buffers in the GPU and essentially snooping what other windows are open on your computer. Complex examples I thought of involve using OCR to determine information about the user that will convince t
... or just some cheap publicity. (Score:1)
I especially like this part:
Fundamentally, WebGL now allows full (Turing Complete) programs from the internet to reach the graphics driver and graphics hardware which operate in what is supposed to be the most protected part of the computer (Kernel Mode)
It elegantly scares ordinary people with sciency words and silently implies equality/correlation between "arbitrary (Turing Complete) programs sent for GPU execution via kernel-level drivers" and "arbitrary programs executed in the most protected part of the computer (Kernel Mode)".
By the way, aren't there moddable games out there that allow modders to use their own shaders?
I can't seem to remember, when was the last "a malicious shader in mod for game X exploits the vulnerabilit
ffs (Score:1)
GPU shaders cannot access main memory. If there are driver errors that cause undesired side effects, these should be fixed, but the underlying architecture is sound.
Can we please move beyond 2D graphics? Go whine about real security problems. This is not one of them.
Re: (Score:1)
Go whine about real security problems. This is not one of them.
I seem to remember Microsoft telling us the same thing about ActiveX...
Re: (Score:2)
GPU shaders cannot access main memory.
They can if you configure a texture buffer in main memory, e.g. by exploiting a driver bug to configure the buffer in the wrong place.
Though I agree that the warnings in this post and the previous one are overhyped.
Re:ffs (Score:4, Interesting)
Can you promise that no SIMD scatter can be performed with offsets that it shouldn't?
Yes I can. For instance, in ATI r6xx it can only go to a surface defined by SX_MEMORY_EXPORT_BASE/SX_MEMORY_EXPORT_SIZE, described on page 127 here: http://www.x.org/docs/AMD/R6xx_3D_Registers.pdf
In addition to that, system memory is mapped to the GPU via the GPU VM page table, so the GPU sees only pages that were allocated by the process and that the kernel driver mapped into the VM graphics context. See /usr/src/linux/drivers/gpu/drm/radeon and grep for VM_.
So there are two layers of hardware enforced protection in addition to software command buffer parser that checks the addresses. Safe enough for you?
Re: (Score:1)
Yes, because the place to get a neutral opinion is someone who works for the company who was the originator of WebGL. Gee, no conflict of interest there...
How About the Response, Slashdot? (Score:4, Informative)
Slashdot really should have published a link to this response from Mozilla http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]
Re: (Score:1)
Yes, the most non-biased source of information is an employee of the company who originated the technology and was a founding member of the standardization group.
Been there, crashed X11 (Score:2)
I've already had a newer version of Firefox crash an older X11 display driver. Absolutely rock solid on Firefox 3.6 and down, and every other program I want to run. But the new GPU acceleration in Firefox? Could cause all of X11 to go away.
And Flash inside Firefox would pretty much guarantee a visit from the Coredump Gods. Fixed with a newer driver, but man is updating annoying.
Kinda sad that "web browsing" is the most intense job run on my work machine. I would have thought the massively parallel buil
it is all fear mongering (Score:2)
First, WebGL sends shader source code to the browser and the code is compiled and executed in OpenGL. This is no different from running any other OpenGL program on your machine. The remote attacker cannot make the GPU execute arbitrary hardware instructions, only whatever the driver compiles from the source he sends.
The shaders pretty much execute in a sandbox (a shader on the GPU can only access its bound buffers: textures, vertex buffers, constant buffers, render targets, etc.). Access outside these buffers is not possible because the ha
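A sketch of that path (illustrative shaders; 2011-era context name): the site hands the browser GLSL source as a string, and the driver's compiler and validator sit between that string and the hardware, so anything they reject never executes:

// The site supplies shader *source*, never raw GPU instructions.
function compile(gl, type, source) {
  var shader = gl.createShader(type);
  gl.shaderSource(shader, source); // source goes to the driver...
  gl.compileShader(shader);        // ...which translates and validates it
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader)); // rejected source never runs
  }
  return shader;
}
var gl = document.createElement('canvas').getContext('experimental-webgl');
var vs = compile(gl, gl.VERTEX_SHADER,
  'attribute vec2 p; void main() { gl_Position = vec4(p, 0.0, 1.0); }');
var fs = compile(gl, gl.FRAGMENT_SHADER,
  'precision mediump float; void main() { gl_FragColor = vec4(1.0); }');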
Re: (Score:2)
All they whine about in this article is that WebGL can use images from another domain as textures. That has nothing to do with GPU security; it is a feature and/or flaw in HTML5/WebGL which allows this image to be used as a texture in the first place.
Still Much Ado About Nothing (Score:3)
As with the previous article, this is much ado about nothing.
The GPU can only run "arbitrary code" in the loosest possible sense. What happens is that an OpenGL or WebGL application gives the shader source code to the driver, which then compiles it into the native GPU instructions. You *can* pre-compile your shaders in OpenGL ES 2.0, but even then it's just intermediary bytecode, and the bytecode is vendor-specific.
Furthermore, GLSL, the language used for OpenGL and WebGL shaders, is *very* domain-specific. It has no pointers, and no language support for accessing anything outside the GPU other than model geometry and texture data. *AND* it can only access the model geometry and texture data that the application has provided to it, and for GPUs that don't have any on-board VRAM it's up to the *driver* to determine where in shared system memory the texture will be located.
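To illustrate, a representative GLSL ES fragment shader (names are illustrative): everything it can read arrives through its declared inputs, and the language gives it no pointer type and no way to name memory the application didn't bind:

// Representative GLSL ES fragment shader.
precision mediump float;
uniform sampler2D u_texture; // texture data the application bound
varying vec2 v_texCoord;     // interpolated per-vertex data

void main() {
  // texture2D() is the only "memory read" available here, and the
  // hardware clamps/wraps it to the bounds of the bound texture.
  gl_FragColor = texture2D(u_texture, v_texCoord);
}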
And you can't get around using shaders on modern GPUs. Modern GPUs don't have a fixed function pipeline, it's not in the silicon at all. For apps that try to use the old OpenGL fixed function pipeline, the driver generates shaders that do what the fixed function pipeline *would* have done based on the current state. Drivers won't keep emulating the old fixed function pipeline forever, though.
Re: (Score:2)
Also, Windows (the most likely target of any attack) has had the ability since Vista to restart the GPU if it hangs (which is the only real attack possible when it comes to shaders: use a shader that is so computationally intensive the GPU becomes unresponsive). This isn't bulletproof, of course, but if Windows isn't able to restart the GPU after a few seconds of unresponsiveness then that's a *Windows* bug.
Allowing 3D graphics in web pages? (Score:2)
What on earth could go wrong... *facepalm*
But I guess it still beats the shit out of the disease that is Flash.
Re: (Score:2)
YouTube [youtube.com] isn't large enough for you?
Re: (Score:1)
Sure, but you
Re: (Score:1)
Sure, as long as you ignore the fact that the vast majority of the content on YouTube is still served exclusively via Flash.
Re: (Score:2)
If you use Windows 7, there's an extension [interopera...ridges.com] to play H.264 videos on Firefox using the system codec.
Re: (Score:1)
Is it?
Yes. There are still tons of videos that are available only in the old Sorenson video codec, served up through Flash.
Re: (Score:2)
This [craftymind.com] seems to be "so, sluggish" to you?
Re: (Score:2)
How the fuck can you comment on HTML5's video performance when your shitty browser doesn't even support it?
YOU "piss orf".
Re: (Score:2)
No, the point is that if you're one of the outliers who still hasn't moved to a modern browser that supports basic HTML5 video...
Get a real browser.
It works in Firefox and Opera. I can't test it in Chrome.
Re: (Score:2)
I'm running a dual-core 1.8GHz Pentium with whatever graphics card HP decided to slap in (the driver is identified as "Intel(R) Q965/Q963 Express Chipset Family") and it plays about as smoothly as I care to have any YouTube videos look.
Re: (Score:2)
Runs at about 4fps on my N900, even when blowing up the video, so something's seriously wrong with your computer.