New Firefox Standard Aims to Combat Cross-Site Scripting
Al writes "The Mozilla Foundation is to adopt a new standard to help websites prevent cross-site scripting (XSS) attacks. The standard, called Content Security Policy, lets a website specify which Internet domains are allowed to host the scripts that run on its pages. This breaks with Web browsers' tradition of treating all scripts the same way: it requires that websites put their scripts in separate files and explicitly state which domains are allowed to serve them. The Mozilla Foundation chose this design because it lets sites decide whether to adopt the restrictions. 'The severity of the XSS problem in the wild and the cost of implementing CSP as a mitigation are open to interpretation by individual sites,' Brandon Sterne, security program manager for Mozilla, wrote on the Mozilla Security Blog. 'If the cost versus benefit doesn't make sense for some site, they're free to keep doing business as usual.'"
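(For the curious: under Mozilla's proposal a site opts in by sending an extra HTTP response header listing the domains allowed to serve its scripts. A hypothetical policy - the header name and exact directive syntax were still in flux at the time, so treat this as a sketch rather than the final grammar - might look like:

```http
X-Content-Security-Policy: allow 'self'; script-src 'self' https://scripts.example.com
```

Here 'self' means the page's own origin, and scripts.example.com is a made-up example of a whitelisted third-party host. Inline scripts and anything served from other domains would be refused.)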
Cost vs. Benefit? (Score:4, Interesting)
"If the cost versus benefit doesn't make sense for some site, they're free to keep doing business as usual."
The author gave the best reason for not implementing this.
The benefits of this, like those of various other security measures, won't be seen until it's tested. The cost of that testing? Way too high compared to the current cost of operation. This is a very hard thing to prove out, and unless it's already built into a shop's development standards, I doubt any existing deployments will switch.
Which would you take, the option which delays production for a week, or the option to just hit "next"?
Article on this and related technologies (Score:3, Interesting)
Use a file? (Score:1, Interesting)
Headline: Google other ad publishers revenues drop (Score:2, Interesting)
This is great for Firefox users... (Score:2, Interesting)
Re:Good idea (Score:5, Interesting)
If I say that my site trusts domain1.com, but domain1.com isn't using this and ends up having all sorts of dodgy scripts they're passing along, would this block them, or would they count as coming from domain1.com?
Domain1 wouldn't need to use this - it's a client-side security measure. If your site uses it and declares its trusted third parties, that's enough.
Also, what is "passing along" supposed to mean? Scripts (or any other stuff) would either come from domain1 or not. If not, it wouldn't be trusted.
If domain1 proxies scripts from other sources, they come from domain1 as far as HTTP is concerned - and they would be trusted.
The problem I see, however, is domain1 declaring additional trusted domains when delivering its scripts, thereby allowing a kind of "cascaded domain trust" that would pretty much defeat the new system. This could easily be prevented, though, by not accepting additional trusted domains from third-party elements.
Re:How does this change userland? (Score:2, Interesting)
That reminds me - recently I've had to tell NoScript to allow scripts from fsdn.com in order to browse slashdot.org successfully. I *know* that FSDN is Slashdot's parent company, but it doesn't seem right that I can't use Slashdot's discussion interface without giving permission to all of FSDN.
Similarly, I've recently had to allow gstatic.com and/or googleapis.com to use Google-enabled websites that worked fine before.
As the parent post points out, it's getting harder for a user to selectively narrow permissions.
Re:Old Standard to Prevent All Attacks (Score:3, Interesting)
Why is this modded troll?
99.99999% of attacks are the result of:
Malicious ads and clickthrough "offers" after a sale is processed
Vulnerabilities in PDF, Flash, etc.
Malicious content uploaded by users (JavaScript, SQL injection, malformed JPEGs, what have you)
Domain hijacking
General "LOL I GOT UR PASSWORD" shenanigans
Re:as an end user (Score:3, Interesting)
NoScript does this.
Which brings me to the observation that, at least as far as I can tell from the blurb, this entire thing sounds a bit redundant given the ready availability of NoScript. Why not just make NoScript part of the default Firefox install instead?
Re:Old Standard to Prevent All Attacks (Score:3, Interesting)
Sexconker is modded a troll - quite unfairly. Cross-site scripting sucks, simple as that. I go to a site, and the first thing I see is NoScript's popup telling me that anywhere between 2 and 20 sites want to run scripts in my browser. I click the popup to see WHO wants to run scripts. Sometimes it's easy to see who wants to do what, and deciding to allow site A but not site B is quite simple.
Often enough, it's just not that simple. I want to see some stupid Flash presentation, and the only way to see it is to enable Flash. Unfortunately, three different sites are offering Flash. Which one do I want? I choose one to allow, and I get rickrolled.
That is hamshite, nothing more and nothing less. The original site should be hosting its own material, or it should supply a link to the Flash presentation. Cross-site scripting is a ripoff that just helps confuse the security conscious. And God knows there are far too few users who are conscious of security at all. (I'd like to see a scientific poll showing just how many users really are brain dead - it has to be over 20%, and might be over 50%.)
Re:Good idea (Score:1, Interesting)
That is a different problem. The problem with this specification is that, when enabled, it doesn't allow you to use inline scripts anymore - i.e., you can no longer directly trust *your own domain* unless you use out-of-line scripts, which is enormously constraining for a large class of applications.
In particular, many applications generate JavaScript on the fly. The only way to handle that under this specification would be to generate lots of little temporary script files that the browser requests on a second pass. The performance hit from those secondary requests is a serious problem, due to round-trip latency.
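One workaround, sketched here in Python (the markup and the /static/app.js path are hypothetical illustrations, not anything from the spec), is to keep the script itself static and move the dynamic values into the page as data that the external script reads at load time:

```python
import json

def render_inline(user):
    # The pre-CSP pattern: dynamic values baked into an inline <script>,
    # which a CSP-enforcing browser would refuse to run.
    return '<body><script>var user = %s;</script></body>' % json.dumps(user)

def render_external(user):
    # A CSP-friendly alternative: the page carries only data, and a static
    # external file (/static/app.js, a made-up path) reads the attribute.
    # Real code must HTML-escape the payload fully; only quotes are
    # handled here for brevity.
    payload = json.dumps(user).replace('"', '&quot;')
    return ('<body data-user="%s">'
            '<script src="/static/app.js"></script></body>' % payload)
```

Because the external file never changes, it caches well - which would partly offset the extra round trip the parent worries about.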
Speaking of which, HTTP should be extended to let web servers push the expected follow-on requests for script files, images, and frames (up to a reasonable limit and under reasonable constraints) into the browser's in-memory cache, eliminating the round-trip latency of those requests - i.e., the browser could fulfil them immediately from the cache, because the responses would already be there by the time it finished parsing the page.
Re:Next step (Score:2, Interesting)
How about first getting the PHP developers to add a sane and logical way of sanitizing HTML?
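For what it's worth, most of the "sanitizing" being asked for is really output escaping - rendering markup inert rather than whitelisting safe tags - and other languages' standard libraries already provide it (PHP's closest built-in is htmlspecialchars). A minimal Python illustration:

```python
import html

tainted = '<script>alert("xss")</script>'   # attacker-controlled input
safe = html.escape(tainted)                 # escapes &, <, >, and quotes
# safe == '&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;'
```

Escaping like this neutralizes injected script on output; CSP is a second, complementary layer for when a site gets it wrong.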