New Firefox Standard Aims to Combat Cross-Site Scripting
Al writes "The Mozilla Foundation is to adopt a new standard to help web sites prevent cross-site scripting (XSS) attacks. The standard, called Content Security Policy, will let a website specify which Internet domains are allowed to host the scripts that run on its pages. This breaks with Web browsers' tradition of treating all scripts the same way, by requiring that websites put their scripts in separate files and explicitly state which domains are allowed to serve them. The Mozilla Foundation selected this implementation because it allows sites to choose whether to adopt the restrictions. 'The severity of the XSS problem in the wild and the cost of implementing CSP as a mitigation are open to interpretation by individual sites,' Brandon Sterne, security program manager for Mozilla, wrote on the Mozilla Security Blog. 'If the cost versus benefit doesn't make sense for some site, they're free to keep doing business as usual.'"
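To make the mechanism concrete, here is a minimal sketch of how a browser might interpret such a policy. The header syntax shown follows the modern `Content-Security-Policy` format rather than Mozilla's original draft wording, and `scripts.example.com` is a hypothetical allowed host; real CSP matching also handles schemes, ports, and wildcards, which this sketch omits.

```python
# Sketch: parse a CSP header's script-src directive and decide
# whether a script host is permitted. Simplified: exact host names
# and 'self' only; hosts are hypothetical.

def parse_script_src(header):
    """Return the source list of the script-src directive, if any."""
    for directive in header.split(";"):
        parts = directive.split()
        if parts and parts[0] == "script-src":
            return parts[1:]
    return []

def script_allowed(sources, page_host, script_host):
    """'self' matches the page's own host; otherwise require an exact match."""
    for src in sources:
        if src == "'self'" and script_host == page_host:
            return True
        if src == script_host:
            return True
    return False

header = "default-src 'self'; script-src 'self' scripts.example.com"
sources = parse_script_src(header)
print(script_allowed(sources, "example.com", "scripts.example.com"))  # True
print(script_allowed(sources, "example.com", "evil.example.net"))     # False
```

The key point the summary makes is that enforcement happens in the browser, but the policy itself is authored by the site.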
as an end user (Score:2, Insightful)
I really hope the default policy is "only allow scripts from the current domain" and "do not allow the site to override my choice".
How does this change userland? (Score:5, Insightful)
I'm sorry, but NO site can be trusted 100% from a user's perspective... and giving site owners the tools to help prevent XSS from their side doesn't help with the fact that users still shouldn't trust absolutely.
The reason something like this scares me is that it lulls users into a higher level of trust... and doesn't protect them from hacked sites, or sites that choose not to implement this.
Of course, I'm slightly paranoid. And of course, this isn't transparent to Joe Sixpack, so he's going to trust|!trust based on whatever it is he's basing it on now. And for security-critical sites like banks, this is a good thing... but I try very hard to make sure my friends & family are a bit paranoid too, so they'll take precautions.
Re:Good idea (Score:5, Insightful)
First thoughts on that:
If I say that my site trusts domain1.com, but domain1.com isn't using this and ends up having all sorts of dodgy scripts they're passing along, would this block them, or would they count as coming from domain1.com?
RFC? (Score:4, Insightful)
Is this 'standard' endorsed by anyone else, or written up as part of an RFC? Calling something a standard when you are the only ones doing it sounds like a certain company that was started by Bill and Paul.
I am not trying to troll here; I am all for the solution. I just want to make sure this is properly documented and shared with the right entities (think W3C).
Re:How does this change userland? (Score:4, Insightful)
Dare I say it?
Site XXXX is attempting to run a script on site YYYY.
(C)ANCEL or (A)LLOW?
All snark aside, why would I allow either of those domains to run a script on slashdot.org? Since I trust slashdot to a certain extent, I would allow from scripts.slashdot.org. But allowing scripts from a completely different domain? No way.
The point is that my security policy is annoying to implement. For mybank.com I need to enable scripting. But if things were perfect, I could enable it only for scripts from $SUBDOMAIN.mybank.com, so I don't get hosed by scripts from $HACKERSITE.bankmy.com. And if legitimate sites are hosting their scripts on entirely different domains... well, that would have to change. Instead I have to take an all-or-nothing approach, since the sites where I need security the most are the ones where I need to enable scripting. That just sucks.
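The per-subdomain policy this poster wants maps directly onto CSP's host wildcards: a site can whitelist `*.mybank.com` without also admitting a lookalike like `bankmy.com`. A sketch of that matching, with hypothetical host names; real CSP source matching also considers schemes and ports, which this omits:

```python
def host_matches(pattern, host):
    """Match a host against a CSP-style source pattern.
    '*.mybank.com' matches any subdomain (but not mybank.com itself,
    as in real CSP); a plain host must match exactly."""
    if pattern.startswith("*."):
        return host.endswith(pattern[1:])  # suffix match on ".mybank.com"
    return host == pattern

print(host_matches("*.mybank.com", "scripts.mybank.com"))     # True
print(host_matches("*.mybank.com", "hackersite.bankmy.com"))  # False
```

This is exactly the distinction the all-or-nothing userland toggle cannot express.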
Re:This is great for Firefox users... (Score:3, Insightful)
Well, if Firefox users find it effective, then other companies will follow suit. It's just a standard Mozilla is adopting; even though it seems to have been defined in-house, that won't stop anyone else from implementing it.
Re:How does this change userland? (Score:2, Insightful)
Slashdot is currently pushing js from c.fsdn.com.
I think you have a pretty dim view of the ecosystem (or maybe you are visiting some really marginal sites, who knows). For the most part, a given page you visit is not going to contain malicious code that sniffs for an HTTPS cookie for your banking site and then mysteriously steals all your money. I say this confidently, as I am quite certain the bad guys are much happier with the simpler task of installing malware keyloggers.
The only browser exploit I have personally encountered came from the server getting compromised (well, the account for a domain, probably not the whole server): obfuscated JavaScript had been appended to the bottom of a JavaScript file the page loaded. The payload was a PDF exploit, but I had that particular hole locked down (or it simply didn't work in the version of Reader I use...), so no harm done. Entertainingly, it was a blog post about web security (I let them know and they fixed it).
eBay and MySpace? (Score:3, Insightful)
CSP is effectively server-side NoScript. And it isn't exactly new either. This has been in development as a Firefox extension for at least a year. The article mentions it being first crafted back in 2005.
The issue I take with this article is that it suggests this feature could be integrated into eBay or MySpace. Those two giants are exactly the wrong market for it: any site that allows users to post their own content is not going to survive the wrath it would catch if users had to explicitly allow the domains they want scripts to run on. For a corporate Web site, yes; but for something aimed at the masses, or those of us who run a CMS? I don't see that happening anytime soon.
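For sites that do opt in, "server-side NoScript" amounts to attaching one response header per page. A minimal WSGI-style sketch of that opt-in, assuming the modern `Content-Security-Policy` header name (Mozilla's early drafts used `X-Content-Security-Policy`); the app and policy string are illustrative:

```python
# Sketch: a WSGI middleware that stamps a CSP header on every response.

def csp_middleware(app, policy):
    """Wrap a WSGI app so every response carries the CSP header."""
    def wrapped(environ, start_response):
        def start(status, headers, exc_info=None):
            headers = list(headers) + [("Content-Security-Policy", policy)]
            return start_response(status, headers, exc_info)
        return app(environ, start)
    return wrapped

# Hypothetical app, for demonstration only.
def hello_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>hello</html>"]

secured = csp_middleware(
    hello_app, "default-src 'self'; script-src 'self' scripts.example.com"
)
```

The hard part for an eBay or a MySpace is not this plumbing but curating the source list, which is the commenter's point.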
Re:Good idea (Score:3, Insightful)
The other major problem with this solution is that it requires changes at the web site level.
In other words, you're only safe if the web site author opts into the security solution.
What are the chances that the hundreds of millions of web sites out there will all opt into this feature?
Re:as an end user (Score:5, Insightful)
Because, as a user, I might not know which of the 47 different domains CNN pulls scripts from are *supposed* to be serving scripts and which are some guy trying to get my Facebook account details (not that I have one, or read the CNN site regularly, largely because of the number of bloody domains they pull scripts from). The owners of the CNN site, on the other hand, *will* know which domains they're supposed to be pulling scripts from, and can say so to the browser.
Re:as an end user (Score:3, Insightful)
Sounds like a bug rather than a feature to me. This would just enable CNN and others to continue the practice, removing any pressure on them to fix their broken website.
Re:How does this change userland? (Score:3, Insightful)
And I know that fsdn.com is also a trusted site.
Funnily enough, I know I don't want fsdn.com's content because the side bar is annoying bloatware that cripples the utility of the site. I'm very glad to have NoScript on the case, blocking it for me. (Which makes me wonder how many other horror websites there are out there whose horrible bloat I've been saved from by virtue of my browsers blocking XSS.)
Re:as an end user (Score:5, Insightful)
NoScript solves a different problem, which is that you don't trust the site. What this aims to solve is the problem of knowing what the site itself considers trustworthy, so that you're not required to issue a blanket statement of distrust: If you trust the site, you can (supposedly) trust its own trust list.
Re:How does this change userland? (Score:1, Insightful)
Of course, I'm slightly paranoid, too. That aside, for security-critical sites like banks, there is no such thing as a good dependency on JavaScript. These banks teach their users to behave insecurely:
1. Please behave securely: have some tea and antivirus installed, don't tell anybody your PIN, and update your system and browser regularly. Now, proceed to log in.
2. [Login doesn't work]
3. In order to use this site, please allow JavaScript.
4. [Login doesn't work]
5. DAMN1 ALLOW jAVAsCRIPT111[1]
6. [Stupid user allows JavaScript globally]
7. Profit!
[1] Caps-Lock pun intended ...