Reddit Javascript Exploit Spreading Virally

Nithendil writes "guyhersh from reddit.com describes the situation (warning: title NSFW): Based on what I've seen today, here's what went down. Reddit user Empirical wrote JavaScript code that, if you copied and pasted it into the address bar, would instantly spam a comment by replying to every comment on the page and submitting the replies. Later, xssfinder posted a proof of concept where hovering over a link would automatically run JavaScript. He then got the brilliant idea to combine the two scripts, tested it, and it spread from there."
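
For illustration, a defanged sketch of the general shape such a payload takes. The selectors here are invented, not Reddit's actual markup, the submit step is commented out, and the hover half of the combined exploit amounted to wrapping something like this in an onmouseover attribute:

    // Defanged sketch of a self-replicating comment payload (illustrative).
    // PAYLOAD carries the script's own source, so every comment it posts
    // contains another copy - that is the "viral" part.
    var PAYLOAD = '<a href="#" onmouseover="/* ...this script... */">bait</a>';
    var forms = document.getElementsByTagName("form");   // every reply form
    for (var i = 0; i < forms.length; i++) {
        var box = forms[i].getElementsByTagName("textarea")[0];
        if (!box) continue;
        box.value = PAYLOAD;     // paste a copy of itself as the comment text
        // forms[i].submit();    // deliberately left disabled in this sketch
    }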
  • Re:NoScript (Score:3, Interesting)

    by CKW ( 409971 ) on Monday September 28, 2009 @09:47AM (#29564931) Journal

    I love how *their* mistake causes viral problems in YOUR browser. All one needs is some sort of cross site vulnerability now and ...

  • Reddit Hacks (Score:3, Interesting)

    by jDeepbeep ( 913892 ) on Monday September 28, 2009 @09:55AM (#29565031)
    This is nothing new. There is a quiet tradition of Reddit users finding the weak points of the site, like this [reddit.com] for example.

    Putting javascript:$(".up").click() in the address bar upvotes everything on the page.
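
    (Assuming the site's $ is jQuery, that one-liner works because $(".up") matches every upvote arrow on the page and .click() fires each one's click handler. A plain-DOM equivalent, for illustration:)

        // Plain-DOM equivalent of the jQuery one-liner (illustrative):
        // find every element with class "up" and fire its click handler.
        var arrows = document.getElementsByClassName("up");
        for (var i = 0; i < arrows.length; i++) {
            arrows[i].click();
        }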
  • Myspace (Score:3, Interesting)

    by RalphSleigh ( 899929 ) on Monday September 28, 2009 @10:36AM (#29565557) Homepage
    Reminds me of a very similar worm that hit myspace years ago:

    http://web.archive.org/web/20060208182348/namb.la/popular/tech.html [archive.org]

    Same thing, find a way of executing javascript and then have it self-replicate by posting itself all over the site.
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Monday September 28, 2009 @11:22AM (#29566157) Journal

    Hi there - you must have just popped in from some alternate universe

    Yep. It's called Google Chrome -- or, more accurately, the Chromium nightly. Javascript executes quickly, and I don't have to wait for an entire separate page to load. Additionally, if I have to wait, the "submit" button has a countdown timer.

    And regardless of speed, it is convenient to have that much more context on the page. For example, right now, I can see your post and mine, and I can expand the parents if I need to. If I was replying from the main discussion, I could scroll up to see the whole discussion. Yes, I know about tabs, but even switching with keyboard shortcuts isn't as nice as being able to actually see a few posts of context as I type.

    In this universe, javascript is noticeably slower - in many cases it's so slow as to be unusable.

    Which browser?

  • Re:proof of concept (Score:1, Interesting)

    by Anonymous Coward on Monday September 28, 2009 @11:26AM (#29566251)

    The implications of XSS vulnerabilities are much greater than you describe. Read this white paper [virtualforge.de], particularly pages 19-27, to see them.

    Give me an unfixed XSS vulnerability on a trusted site with top secret archives of sensitive material [xssed.org] and I can show you all sorts of mischief, including retrieval of the secret archive, administrator impersonation, password theft, phishing, page defacing, and much more. The opportunities are endless. I've seen this in recent legal documentation.

    I can think of all sorts of exploits that would be specific to Reddit that could steal valuable information. I won't outline them here, as some jackass will go do them.

  • by lwsimon ( 724555 ) <lyndsy@lyndsysimon.com> on Monday September 28, 2009 @11:31AM (#29566333) Homepage Journal

    Amen. I've gotten into the habit of structuring the document, outputting the data into readable form, then using CSS and JS to make it look and behave how I want it to.

    There are some pages where "no access without javascript" is acceptable - but they are few and far between. For the most part, you should be able to use Lynx and view the content.
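
    (A minimal sketch of that approach, with made-up IDs: the form below already works as plain HTML in Lynx, and the script only layers behavior on top when it happens to be available.)

        // Enhance a normal, working HTML form - never replace it.
        document.addEventListener("DOMContentLoaded", function () {
            var form = document.getElementById("search");   // hypothetical form
            if (!form) return;
            form.addEventListener("submit", function (e) {
                e.preventDefault();               // enhancement: submit via XHR
                var xhr = new XMLHttpRequest();
                xhr.open("GET", form.action + "?q=" +
                         encodeURIComponent(form.elements["q"].value));
                xhr.onload = function () { /* update the page in place */ };
                xhr.send();
            });
        });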

  • by Anonymous Coward on Monday September 28, 2009 @12:17PM (#29567095)

    What about "failing gracefully because if your customer is big and important enough, it means enough people will care and it can lead to class action suits for violating section 501 in the US if it's too disgraceful for a screen reader to work"

  • by horza ( 87255 ) on Monday September 28, 2009 @01:26PM (#29568223) Homepage

    Absolutely right for your personal homepage. A professional web designer would not be able to get away with this. This kind of laziness translates directly into additional support costs for the client. And each time Microsoft recommends turning off Javascript due to a 0-day exploit, you are cutting off more than 1%.

    I can't think of any cases where it is ok to not fail gracefully. I hope you are not talking about just using client side validation, one of the most used cases for Javascript but where you must always fail over to doing server side also. Can you give an example?

    Phillip.
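
    (One common answer to horza's question, sketched with a made-up field: write the check once, run it in the browser for convenience, and run it again on the server, because the client can always bypass the first copy.)

        // Shared rule: usable client-side for quick feedback...
        function isValidEmail(s) {
            return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(s);  // deliberately simple
        }
        // Browser (convenience only):
        //   form.onsubmit = function () { return isValidEmail(email.value); };
        // Server (the real gate - never trust the client):
        //   if (!isValidEmail(req.body.email)) { reject the request }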

  • Re:Mod parent down (Score:3, Interesting)

    by Firehed ( 942385 ) on Monday September 28, 2009 @01:30PM (#29568303) Homepage

    Tools like that aren't foolproof, especially since browsers go out of their way to parse malformed input (unless you're serving content as application/xml, in which case the browser will just show an ugly parse error). I can't speak to that tool, not having used it, but all it takes is one attacker finding yet another way to build a broken script tag that a browser will still run, one the tool doesn't yet know about, and all your efforts are for nothing.

    I think the parent's suggestion of BBCode is safer overall, but the safest solution is to not allow users to format content at all.
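
    (The no-formatting option is also the easiest to get right: escape everything on output so user text can never become markup. A minimal sketch:)

        // Escape-on-output: user text stays text, never becomes a tag.
        function escapeHtml(s) {
            return s.replace(/&/g, "&amp;")
                    .replace(/</g, "&lt;")
                    .replace(/>/g, "&gt;")
                    .replace(/"/g, "&quot;")
                    .replace(/'/g, "&#39;");
        }
        // container.innerHTML = escapeHtml(userText);  // "<script>" renders inert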

  • by wowbagger ( 69688 ) on Monday September 28, 2009 @01:33PM (#29568363) Homepage Journal

    OK, consider this assertion:

    Web pages and web applications are different, so perhaps we need a new URL type?

    Consider: the original purpose of HTTP was HyperTEXT Transfer Protocol: a means to have linked TEXT pages. Thus, such pages were not written in a Turing Complete language (indeed, they weren't any form of "active" language at all, just a markup presentation layer). As such, they were simple to evaluate from a security standpoint.

    Since then, the web has evolved into a collection of Web pages (text, graphics, but basically NOT "active") and Web Applications (things like Google Maps).

    To be an "active" page pretty much requires a Turning Complete language, and it is impossible to fully say that a Turing Complete language is "safe" (at a minimum, you cannot guarantee halting, so you have a denial of service attack if nothing else).

    What if we separate the idea of a "Web Page" and a "Web Application", and put certain rules on each (Web Pages should not require Turing Complete behavior to operate, Web Applications should be bounded in where they fetch code from)? When the user selects a Web Application, the browser can check whether the application has been cleared to run by the user previously, and if not, ask them "You have selected to run a web application 'FutzorJooMachine' from 'evilbad.example.com' - are you sure?" Ideally, the web application should provide to the browser a list of sites and components it plans on using (and the browser should ENFORCE that only those items are used) - see the sketch at the end of this comment.

    Now, if the only difference between a Web Page and a Web Application is the MIME type, then you are going to have a hard time letting the user know when he is about to step on a mine - so what if we create a new transport type, "WATP" (Web Application Transport Protocol), which is the same as HTTP in implementation but has a different default link type, and different permissions from the browser.

    That way, things that are supposed to be Web Pages (Yes, 'You Cannot Delete Messages Without Javascript'-Slashdot I am looking at you) can be constrained to a safe set of behaviors (as in NO JAVASCRIPT OF ANY FORM, INCLUDING FLASH), and the web apps can be identified as such and allowed to do what they need to do AT THE USER'S DISCRETION.
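
    (To make the proposal concrete, a hypothetical WATP manifest - the names and format are invented for this sketch; no browser consumes anything like it today:)

        // Hypothetical manifest a Web Application would declare up front,
        // and which the browser would ENFORCE (format invented here).
        var manifest = {
            name:    "FutzorJooMachine",
            origin:  "apps.example.com",
            scripts: ["apps.example.com", "cdn.example.com"], // only these may supply code
            net:     ["api.example.com"]                      // only these may be contacted
        };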

  • by BikeHelmet ( 1437881 ) on Monday September 28, 2009 @03:38PM (#29570731) Journal

    There is not a single brake pedal! And worse, the W3C or MS or Mozilla or whoever could introduce a new gas pedal, and you the website operator have to filter out the new gas pedal when it's introduced.

    Undid my mods, but I had to post this.

    There used to be a brake pedal. I think it was Firefox 1.5 where this code didn't evaluate any tags:

    element.appendChild(document.createTextNode(sText));

    The solution, therefore, was to manually parse italic/bold/a tags, to append those elements - and then create a text node inside. A perfect working DHTML/DOM solution, properly sanitized!

    However, with Firefox 3, text nodes now evaluate HTML tags. This handy function went out with eval usage for local callbacks. :/ Opera and Chrome also evaluate some (all?) tags for appended text.
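
    (Roughly the kind of helper being described: whitelist a few tags, build them as real elements, and push everything else through createTextNode so it stays inert. A from-memory sketch, not the original code; links are omitted because href values need their own validation.)

        // Append sText to element, honoring only <b>, <i> and <em> tags;
        // all other input is appended as inert text nodes.
        function appendSanitized(element, sText) {
            var parts = sText.split(/(<\/?(?:b|i|em)>)/);   // crude tokenizer
            var stack = [element];
            for (var i = 0; i < parts.length; i++) {
                var m = parts[i].match(/^<(\/?)(b|i|em)>$/);
                if (!m) {                                   // plain text
                    stack[stack.length - 1].appendChild(
                        document.createTextNode(parts[i]));
                } else if (!m[1]) {                         // opening tag
                    var el = document.createElement(m[2]);
                    stack[stack.length - 1].appendChild(el);
                    stack.push(el);
                } else if (stack.length > 1) {              // closing tag
                    stack.pop();
                }
            }
        }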

  • by Anonymous Coward on Monday September 28, 2009 @04:21PM (#29571515)
    Failing gracefully tends to make pages work better for normal users, too. E.g., the old use of Java applets / Flash for navigation buttons on websites where an image + CSS would have worked: it made the page unusable in lynx and kept users of other browsers from opening links in new windows or new tabs.

"Money is the root of all money." -- the moving finger

Working...