Mozilla Experiments With Site Security Policy
An anonymous reader writes "Mozilla has opened comments for a new experimental browser security policy, dubbed Site Security Policy (SSP), designed to protect against XSS, CSRF, and malware-laced IFRAME attacks, which infected over 1.5 million Web pages earlier this year. Security experts and developers are excited because SSP extends control over Web 2.0 applications that allow users to upload/include potentially harmful HTML/JavaScript such as on iGoogle, eBay Auction Listings, Roxer Pages, Windows Live, MySpace / Facebook Widgets, and so on. Banner ads from CDNs have had similar problems with JavaScript malware on social networks. The prototype Firefox SSP add-on aims to provide website owners with granular control over what the third-party content they include is allowed to do and where it's supposed to originate. No word if Internet Explorer or Opera will support the initiative."
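The summary's idea of "granular control over … where it's supposed to originate" amounts to the site declaring a whitelist of script origins that the browser then enforces. As a rough sketch (the header name and directive syntax here are hypothetical — the real SSP draft was still open for comment), a server-side helper that builds such a policy string might look like:

```python
# Illustrative sketch of an SSP-style script-origin whitelist.
# "X-Site-Security-Policy" and the "script-src" directive syntax are
# assumptions for illustration, not the actual draft's wording.

def build_script_policy(allowed_origins):
    """Return a policy string permitting scripts only from the page's own
    origin ("self") plus the explicitly whitelisted third-party origins."""
    return "script-src " + " ".join(["self"] + list(allowed_origins))

policy = build_script_policy(["https://widgets.example.com"])
# A web framework would then attach this to every response, e.g.:
#   response.headers["X-Site-Security-Policy"] = policy
```

A compliant browser receiving that header would refuse to execute script from any origin not listed, regardless of what markup an attacker managed to inject into the page.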
Why not just include NoScript by default? (Score:1, Insightful)
But why trust site administrators? (Score:1, Insightful)
But the real problem is still on the client end.
For example, I block google-analytics.com, because I don't want to be a part of whatever Google stores by means of "ga.js".
Perhaps I'm misunderstanding SSP, but the last thing I want my web browser to do is automatically "decide" that, for example, my trust in slashdot.org is to be automatically extended to Javascript hosted by google-analytics.com, solely because Slashdot's site administrators have chosen to trust google-analytics.com. (Or ad.doubleclick.net, which is now owned by Google...)
In my eyes, that sort of behavior constitutes adding a security hole, not plugging it.
To use an example that doesn't quite hit so close to home, I just about flipped when my bank added a lovely little "feature" to its login box that would cut down its support costs by enabling me to "chat with a live person". I'll turn on Javashit when I log in to my bank because I trust my bank, but I'll be damned if I'm going to let a wholly-unrelated outfit like liveperson.com (who are a legitimate company, but they run the Internet equivalent of outsourced phone banks and chatbots) run their Javashit every time I log onto my bank, especially since I've never once felt the need to "chat with a live person" while trying to bank online.
Liveperson.com, much like Doubleclick and Google Analytics, got blocked in the HOSTS file and in the router. Those aren't options for non-technical users, but I fear that SSP may wind up enabling more holes/leaks than it closes. How much do you trust your site administrators? My bank's probably trustworthy. Slashdot's probably more interested in my data than Google is. But some random web forum whose administrator may not know his ass from a hole in the ground? The answer's still "no".
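The HOSTS-file approach the parent describes just null-routes the unwanted domains so the browser never reaches them. A sketch of the relevant entries (domain names are the ones mentioned above; the exact hostnames a given tracker uses may differ):

```
# Illustrative /etc/hosts (or Windows hosts file) entries that null-route
# third-party script hosts. 0.0.0.0 is often preferred over 127.0.0.1
# because it fails immediately instead of probing a local web server.
0.0.0.0 www.google-analytics.com
0.0.0.0 ad.doubleclick.net
0.0.0.0 liveperson.com
```

As the parent notes, this works per-machine (or per-router) and requires knowing the domains in advance, which is exactly what non-technical users can't be expected to do.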
Re:FINALLY (Score:5, Insightful)
I block everything by default, and rarely grant things permanent permissions, but trying to figure out which one is causing the site to not work is a real pain in the ass. Really it's something which shouldn't be expected of a user. The JSON and XSS vulnerabilities really ought to be enough to convince at least financial institutions not to include that kind of crap on their sites.
But of course, this assumes that the system works, is well designed and protects against the things that developers really shouldn't be doing automatically. But I'll be giving it a whirl just to see if it helps at all.
Re:Whos Job Is It? - Security (Score:3, Insightful)
It's the job of the authorities to lock up the crackers and other people that commit electronic crimes. It's your job to lock your front door.
At the end of the day, the authorities and your ISP can't do anything until the threat is known. By then the damage is done.
99.9% of stuff can be mitigated with NoScript, a $50 firewall, and a slight change in behavior (stop trying to steal music, porn, movies and software). If you can't take these steps, I'm genuinely amazed that you are capable of showering, starting your car and driving to work and functioning. Most people can, but won't. It sort of flows into that whole "not accepting responsibility for stuff I do" attitude that pervades modern society.
Typical transaction:
Dumbass (to self): Sweet I just got the entire collection of every rolling stones song ever made for free! 8)
>click
Computer: all of your files are belong to us. Send payment to uvebeenpwned@yahoo.com or you won't get your files back, MU HU HA HA.
Dumbass (to IT buddy): My computer is running slow and all my files got encrypted. I think I have spyware.
IT Buddy: You use p2p?
Dumbass: no, I have no idea what happened, I swear I dun't download...
-r
Re:Why not just include NoScript by default? (Score:1, Insightful)
Who is responsible for creating the seamless browsing experience? Users shouldn't have to block scripts at cnn.com and then waste time scratching their head while looking at their noscript menu to try and figure out whether or not the content is to be trusted. Content providers should be validating all remote linking, whether it's ads, mashups, or whatever--ahead of time, or they don't deserve your traffic!
SSL + SSP = Safer Web Apps (Score:5, Insightful)
As I commented here [hackademix.net], SSL and SSP are orthogonal technologies whose correct and joint adoption should be required for any website performing sensitive transactions: the former ensuring integrity and, to a certain extent, identity; the latter defining and guarding application boundaries.
Those websites should encourage their users to adopt an SSP-compliant browser, and compliant browsers should educate users to prefer SSP-compliant sites with visual cues, just like we're already doing with EV-SSL (and for better reasons in this case, maybe).
On my side, I'm considering highlighting valid SSL + restrictive SSP websites as more reliable candidates for NoScript whitelisting.
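The whitelisting heuristic described above (valid SSL *and* a restrictive SSP policy, with each condition necessary but not sufficient on its own) could be sketched like this — the function name and the "restrictive means no wildcard script source" check are assumptions for illustration:

```python
# Hypothetical sketch of the NoScript-whitelisting heuristic: a site is a
# stronger candidate only when it serves valid SSL AND declares an SSP
# policy that actually restricts script origins.

def whitelist_candidate(has_valid_ssl, ssp_policy):
    """Return True if the site merits highlighting as a whitelist candidate.

    ssp_policy is the site's declared policy string, or None if absent.
    A policy that allows scripts from anywhere ("script-src *") is treated
    as non-restrictive and therefore disqualifying.
    """
    restrictive = ssp_policy is not None and "script-src *" not in ssp_policy
    return has_valid_ssl and restrictive
```

Note how the two signals are orthogonal, as the parent says: SSL without a policy says nothing about application boundaries, and a policy served over plain HTTP can be stripped or forged in transit.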