
Mozilla Experiments With Site Security Policy

An anonymous reader writes "Mozilla has opened comments for a new experimental browser security policy, dubbed Site Security Policy (SSP), designed to protect against XSS, CSRF, and the malware-laced IFRAME attacks that infected over 1.5 million Web pages earlier this year. Security experts and developers are excited because SSP extends control over Web 2.0 applications that let users upload or include potentially harmful HTML/JavaScript, such as iGoogle, eBay Auction Listings, Roxer Pages, Windows Live, MySpace/Facebook widgets, and so on. Banner ads served from CDNs have had similar problems with JavaScript malware on social networks. The prototype Firefox SSP add-on aims to provide website owners with granular control over what the third-party content they include is allowed to do and where it's supposed to originate. No word yet on whether Internet Explorer or Opera will support the initiative."
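
For a sense of how such a policy might be expressed, here is a hypothetical sketch. The draft syntax was still being worked out at the time, so the header name and directives below are invented for illustration and are not the actual SSP grammar:

    # Hypothetical illustration only: the header name and directive syntax
    # are invented for this sketch, not taken from the SSP draft.
    # Intent: scripts may load only from the page's own origin or from a
    # designated widget host, and the page may not embed any IFRAMEs.
    HTTP/1.1 200 OK
    Content-Type: text/html
    X-Site-Security-Policy: script-src self https://widgets.example.com; frame-src none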
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday June 06, 2008 @03:17PM (#23685935)
    It's what I use for XSS protection. NoScript's security can get kind of annoying sometimes, but it has been useful for some pages I've come across that have tried to load my user/pass data across an affiliate site.
  • by Anonymous Coward on Friday June 06, 2008 @03:36PM (#23686227)

    The prototype Firefox SSP add-on aims to provide website owners with granular control over what the third-party content they include is allowed to do and where it's supposed to originate.

    But the real problem is still on the client end.

    For example, I block google-analytics.com, because I don't want to be a part of whatever Google stores by means of "ga.js".

    Perhaps I'm misunderstanding SSP, but the last thing I want my web browser to do is automatically "decide" that, for example, my trust in slashdot.org is to be automatically extended to Javascript hosted by google-analytics.com, solely because Slashdot's site administrators have chosen to trust google-analytics.com. (Or ad.doubleclick.net, which is now owned by Google...)

    In my eyes, that sort of behavior constitutes adding a security hole, not plugging it.

    To use an example that doesn't quite hit so close to home, I just about flipped when my bank added a lovely little "feature" on its login box that would cut down its support costs by enabling me to "chat with a live person". I'll turn on Javashit when I log in to my bank because I trust my bank, but I'll be damned if I'm going to let a wholly unrelated outfit like liveperson.com (a legitimate company, but one that runs the Internet equivalent of outsourced phone banks and chatbots) run its Javashit every time I log onto my bank, especially since I've never once felt the need to "chat with a live person" while trying to bank online.

    Liveperson.com, much like Doubleclick and Google Analytics, got blocked in the HOSTS file and in the router. Those aren't options for non-technical users, but I fear that SSP may wind up enabling more holes/leaks than it closes. How much do you trust your site administrators? My bank's probably trustworthy. Slashdot's probably more interested in my data than Google is. But some random web forum whose administrator may not know his ass from a hole in the ground? The answer's still "no".
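
    For reference, that kind of HOSTS-file block amounts to entries along these lines (illustrative only; real blocklists run much longer, and these are just the domains named above):

        # Unix: /etc/hosts
        # Windows: C:\Windows\System32\drivers\etc\hosts
        # Point each unwanted domain at the local machine so requests go nowhere.
        127.0.0.1  google-analytics.com
        127.0.0.1  www.google-analytics.com
        127.0.0.1  ad.doubleclick.net
        127.0.0.1  www.liveperson.com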

  • Re:FINALLY (Score:5, Insightful)

    by hedwards ( 940851 ) on Friday June 06, 2008 @03:52PM (#23686435)
    This seems like something that would be very useful in this day and age. NoScript ends up being very annoying because a lot of sites will link in a large number of scripts from other servers. It's difficult a lot of the time because you really have to investigate to find out what akamai.net does, for example, and then there's admdt.com and the like. The list is oftentimes 20 different sites, with no good way of restricting the permissions to just the current site.

    I block everything by default, and rarely grant anything permanent permissions, but trying to figure out which one is causing the site to not work is a real pain in the ass. Really, it's something that shouldn't be expected of a user. The JSON and XSS vulnerabilities really ought to be enough to convince at least financial institutions not to include that kind of crap on their sites.

    But of course, this assumes that the system works, is well designed, and automatically protects against the things that developers really shouldn't be doing. I'll be giving it a whirl just to see if it helps at all.
  • by rgviza ( 1303161 ) on Friday June 06, 2008 @04:18PM (#23686817)
    If you care about your machine and what happens to it, it's your job. If you just want to flail around in anger when your box becomes a big paperweight, leave it up to someone else.

    It's the job of the authorities to lock up the crackers and other people that commit electronic crimes. It's your job to lock your front door.

    At the end of the day, the authorities and your ISP can't do anything until the threat is known. By then the damage is done.

    99.9% of stuff can be mitigated with NoScript, a $50 firewall, and a slight change in behavior (stop trying to steal music, porn, movies, and software). If you can't take these steps, I'm genuinely amazed that you are capable of showering, starting your car, driving to work, and functioning. Most people can, but won't. It sort of flows into that whole "not accepting responsibility for stuff I do" attitude that pervades modern society.

    Typical transaction:
    Dumbass (to self): Sweet, I just got the entire collection of every Rolling Stones song ever made for free! 8)
    >click
    Computer: all of your files are belong to us. Send payment to uvebeenpwned@yahoo.com or you won't get your files back, MU HU HA HA.

    Dumbass (to IT buddy): My computer is running slow and all my files got encrypted. I think I have spyware.
    IT Buddy: You use p2p?
    Dumbass: no, I have no idea what happened, I swear I dun't download... /snicker

    -r
  • by Anonymous Coward on Friday June 06, 2008 @04:27PM (#23686981)
    If a website owner is going to include links from a third-party site, it absolutely SHOULD be his/her responsibility to validate each and every one of those sites, if necessary, in a granular fashion. NoScript is a band-aid for the failure of content owners to do this.

    Who is responsible for creating the seamless browsing experience? Users shouldn't have to block scripts at cnn.com and then waste time scratching their heads over their NoScript menu trying to figure out whether or not the content is to be trusted. Content providers should validate all remote linking (ads, mashups, whatever) ahead of time, or they don't deserve your traffic!
  • by Giorgio Maone ( 913745 ) on Friday June 06, 2008 @05:17PM (#23687659) Homepage

    As I commented here [hackademix.net], SSL and SSP are orthogonal technologies whose correct and joint adoption should be required for any website performing sensitive transactions: the former ensuring integrity and, to a certain extent, identity; the latter defining and guarding application boundaries.

    Those websites should encourage their users to adopt an SSP-compliant browser, and compliant browsers should educate users to prefer SSP-compliant sites with visual cues, just as we're already doing with EV-SSL (and for better reasons in this case, maybe).

    For my part, I'm considering highlighting valid SSL + restrictive SSP websites as more reliable candidates for NoScript whitelisting.
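
    (To make the joint adoption concrete: a sensitive page would be served over SSL *and* would declare its own application boundaries. As in the sketch near the top of the story, the header name and directives here are hypothetical; SSP's actual grammar was still in draft.)

        # Hypothetical sketch, not real SSP syntax:
        # - transport layer: the page is served only via https://, so SSL
        #   supplies integrity and, to an extent, server identity;
        # - application layer: the response declares that scripts may come
        #   only from its own origin and that no IFRAMEs may be embedded.
        HTTP/1.1 200 OK
        Content-Type: text/html
        X-Site-Security-Policy: script-src self; frame-src none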
