Reddit Javascript Exploit Spreading Virally 239

Nithendil writes "guyhersh from reddit.com describes the situation (warning: title NSFW): Based on what I've seen today, here's what went down. Reddit user Empirical wrote JavaScript code such that if you copied and pasted it into the address bar, you would instantly spam that comment by replying to all the comments on the page and submitting it. Later, xssfinder posted a proof of concept where if you hovered over a link, it would automatically run JavaScript. He then got the brilliant idea to combine the two scripts, tested it, and it spread from there."
  • by Anonymous Coward on Monday September 28, 2009 @09:41AM (#29564863)

    I don't know. Sounds good !!

  • proof of concept (Score:3, Insightful)

    by yincrash ( 854885 ) on Monday September 28, 2009 @09:45AM (#29564907)
    seriously. using the 'onhover' event is considered inventive enough to call it a proof of concept?
    • Re:proof of concept (Score:4, Informative)

      by immortalpob ( 847008 ) on Monday September 28, 2009 @09:53AM (#29564995)
      This is a flaw in Reddit's comment system that allows the poster to get JavaScript code executed. A comment system should not allow you to use "onhover"; that is the point.
      • by Otto ( 17870 )

        A comment system should not allow you to insert javascript code of any kind, period. How exactly did he slip this past the filters? Does reddit even have filters?

        Regardless, I've added reddit.com to my blocklist. Simple immunity. :)
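
        For a concrete picture of the hole being described, here is a minimal Python sketch (hypothetical code, not reddit's actual filter; the spread() payload and the function names are made up) of a filter that drops on* event-handler attributes while leaving the rest of the markup alone:

        from html.parser import HTMLParser

        class EventAttrStripper(HTMLParser):
            """Re-emit markup, dropping any on* event-handler attribute (sketch only)."""
            def __init__(self):
                super().__init__(convert_charrefs=False)
                self.out = []
            def handle_starttag(self, tag, attrs):
                kept = [(k, v) for k, v in attrs if not k.lower().startswith("on")]
                text = "".join(f" {k}" if v is None else f' {k}="{v}"' for k, v in kept)
                self.out.append(f"<{tag}{text}>")
            def handle_endtag(self, tag):
                self.out.append(f"</{tag}>")
            def handle_data(self, data):
                self.out.append(data)
            def handle_entityref(self, name):
                self.out.append(f"&{name};")
            def handle_charref(self, name):
                self.out.append(f"&#{name};")

        def strip_event_attrs(markup):
            parser = EventAttrStripper()
            parser.feed(markup)
            parser.close()
            return "".join(parser.out)

        # Hovering over this "harmless" link would run script if it got through:
        print(strip_event_attrs('<a href="#" onmouseover="spread()">click me</a>'))
        # -> <a href="#">click me</a>

        A whitelist (only keep tags and attributes you explicitly allow) is safer than a blacklist like this, which is the point made further down the thread.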

        • by TheLink ( 130905 ) on Monday September 28, 2009 @10:37AM (#29565569) Journal
          Years ago I actually proposed to the W3C and the mozilla bunch to add a tag to disable dynamic stuff like javascript.

          Basically it would work something like this:

          <shield lock="some_random_hard_to_guess_string_here" enabled="basic_html_only">
          The browser will only recognize basic HTML stuff here; it won't recognize javascript or any _future_ dynamic stuff that the W3C or browser people think of
          </shield unlock="some_random_hard_to_guess_string_here">

          The some_random_hard_to_guess_string_here would be different for each page.

          The idea is that while the website should still have filters, even if in the future the W3C or browser wiseguys create some newfangled way of inserting javascript or some other dynamic content that the filters do not protect against (since it's new and the filters have not been updated), the browser will just ignore the new stuff that some hacker inserts when it's between the tags.

          To me the current state of things is a bit crazy - basically it's like having a car with 1000 gas pedals (tags) and to stop the car you have to make sure all 1000 pedals are not pressed (escaped or filtered). There is not a single brake pedal! And worse, the W3C or MS or Mozilla or whoever could introduce a new gas pedal, and you the website operator have to filter out the new gas pedal when it's introduced.

          With something like this tag there is a brake pedal, so even if you don't manage to filter out all the 1000 gas pedals, the brake helps to keep stuff safe.

          If they had implemented such a tag, the google and myspace worms would not have worked for so many browsers.

          FWIW, this sort of worm is not new. I managed to find a hole in advogato some years ago (iframe worm) - and hence my suggestion to the W3C and Mozilla.

          But it seems to me that NONE of them are really interested in improving security. They're all just interested in inventing new gas pedals for people (and hackers) to step on. They're not even interested in creating a single brake pedal. They just pay lip service to security.

          See the thing is - it's not too difficult to code a browser to go "OK from now on there's no such thing as javascript till I see a valid unlock tag", so even if there is a browser parsing bug and a hacker manages to insert javascript via a stupid browser bug (that the website filters naturally do not and cannot cater for) it does NOT matter - since javascript will be disabled - between those tags the browser will be respecting the flag that says "I do not know javascript, java and all that fancy stuff" - it does not even have to parse javascript - since for all intents and purposes between those tags, the browser does not know there's such a thing as javascript (or activex or flash etc).

          This is very useful for sites that have to include 3rd party content - sites like slashdot or webmail sites or even sites that serve up ads from 3rd parties.
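
          To make the idea concrete, here is a rough Python sketch of the server side of such a scheme. The shield/unlock tag is the hypothetical markup proposed above (no browser implements it), and render_comment_block and the token length are made up for illustration:

          import secrets

          def render_comment_block(filtered_comment_html):
              # Mint a fresh random token for every page load. An attacker who
              # submitted a comment earlier cannot know it, so they cannot forge
              # a matching unlock tag to break out of the shielded region.
              token = secrets.token_urlsafe(32)
              return (
                  f'<shield lock="{token}" enabled="basic_html_only">'
                  f"{filtered_comment_html}"
                  f'</shield unlock="{token}">'
              )

          # Even if a comment contains "</shield>" or a <script> tag that slipped
          # past the filters, it does not match the token, so (under the proposal)
          # the browser keeps treating the region as plain basic HTML.
          print(render_comment_block('nice post </shield> <script>spread()</script>'))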
          • by sukotto ( 122876 )

            So then people would first have to paste a comment "" and THEN paste the comment with the exploit code?

            • No, the entire point is that any exploit would need to know the "unlock" code in order to operate, yet the unlock code is generated when the page is sent to the browser, long after the exploit has been submitted.
            • by sukotto ( 122876 )

              strange. that showed up in the preview. let's try again:

              paste a comment "</shield>"

          • You can't assign attributes to end tags. XML/HTML won't let you do that, and extending it to be able to do so would be a bit of a revolution. Too many existing parsers rely on the current behaviour. But maybe you could possibly do something along the lines of '<startshield key="lalala" /> stuff <endshield key="lalala" />', although I believe that'd also be a bit of a hack.

            What we actually really need, and what is the real solution, is just a little more careful programming on the server side. Write a func
          • by Timmmm ( 636430 ) on Monday September 28, 2009 @11:31AM (#29566317)

            Well that's an overly complicated and... well *wrong* way to do it. The correct solution is:

            1. Escape all <'s and >'s and &'s in the input.
            2. Interpret BB-code to add links & basic formatting.

            Simple.
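
            In Python terms, step 1 is a one-liner with the standard library (a sketch; render_user_text is a made-up name and the BB-code pass of step 2 is left out):

            import html

            def render_user_text(raw_comment):
                # Step 1: entity-escape &, <, > (and quotes) so user input is inert text.
                safe = html.escape(raw_comment, quote=True)
                # Step 2 (not shown): translate a small BB-code subset such as [b]...[/b]
                # into tags that the site itself generates.
                return safe

            print(render_user_text('<a onmouseover="spread()">hi</a>'))
            # -> &lt;a onmouseover=&quot;spread()&quot;&gt;hi&lt;/a&gt;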

            • Mod parent down (Score:3, Informative)

              by bluej100 ( 1039080 )
              The correct solution is a whitelisted HTML parser and generator, like HTML Purifier [htmlpurifier.org].
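
              The same approach sketched in Python, using the bleach library as a stand-in for HTML Purifier (an assumption of this sketch, not something the parent or reddit used): anything not on the whitelist is simply dropped.

              import bleach  # third-party whitelisting sanitizer (pip install bleach)

              ALLOWED_TAGS = ["a", "b", "i", "em", "strong", "p", "blockquote", "code"]
              ALLOWED_ATTRS = {"a": ["href", "title"]}  # note: no on* attributes, ever

              dirty = '<a href="#" onmouseover="spread()">hover me</a>'
              print(bleach.clean(dirty, tags=ALLOWED_TAGS, attributes=ALLOWED_ATTRS))
              # onmouseover is not whitelisted for <a>, so it is removed:
              # <a href="#">hover me</a>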
              • Re: (Score:3, Interesting)

                by Firehed ( 942385 )

                Tools like that aren't foolproof, especially since browsers go out of their way to attempt to parse malformed input (unless you're serving content as application/xml, in which case the browser will just show an ugly parse error). I can't speak about that tool, not having used it, but all it takes is one hacker finding yet another way to create a broken script tag that a browser will still run but that the tool doesn't yet know about, and all your efforts are for nothing.

                I think the parent's suggestion of BBCode is saf

            • There are many situations other than forum posting where it is desirable to include third-party content in your site. Advertisements are the first thing that jump to mind, but web widgets are also becoming popular. Having some browser markup that will limit what the third-party code can do would enable this to be done safely, without having to trust the third party or load and filter third-party content server-side.

            • Re: (Score:3, Insightful)

              by TheLink ( 130905 )

              That's all very nice and simple till stuff like UTF8, UTF7, etc get involved...

              See:
              http://nedbatchelder.com/blog/200704/xss_with_utf7.html [nedbatchelder.com]
              http://www.securityfocus.com/bid/31183/discuss [securityfocus.com]
              http://ha.ckers.org/blog/20060817/variable-width-encoding/ [ckers.org]

              You don't have to believe me when I tell you there are 1000 (or more) gas pedals and no brake pedal and it's a crazy situation. But that's the truth as I see it.

              I daresay many of the website folks who have been burnt before will believe me. Yes you can and SHOULD use th
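
              The UTF-7 trick from the first link is easy to demonstrate in Python (the payload is the classic published example, not anything reddit-specific): a filter scanning for angle brackets sees nothing, yet the same bytes decode to a script tag if the page is ever interpreted as UTF-7.

              payload = "+ADw-script+AD4-alert('xss')+ADw-/script+AD4-"

              # A naive filter looking for angle brackets finds nothing suspicious...
              print("<" in payload, ">" in payload)           # False False

              # ...but decoded as UTF-7 the very same bytes are a script element.
              print(payload.encode("ascii").decode("utf-7"))  # <script>alert('xss')</script>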

            • by AusIV ( 950840 ) on Monday September 28, 2009 @04:06PM (#29571265)
              Reddit does escape all of those symbols, and they use Markdown [wikipedia.org] for adding links. Still, they managed to get owned by an obscure vulnerability that was discovered only because their code is open source.

              And that's the point TheLink was trying to make. It would be far simpler to tell the browser not to accept javascript in a certain block of code than it is to explore all the possible exploits that could be leveraged against your alternative markup language. There are hundreds if not thousands of places you can make mistakes, and it could be remedied by a single mechanism that prevented javascript from existing in certain blocks of code.

          • Re: (Score:3, Interesting)

            by BikeHelmet ( 1437881 )

            There is not a single brake pedal! And worse, the W3C or MS or Mozilla or whoever could introduce a new gas pedal, and you the website operator have to filter out the new gas pedal when it's introduced.

            Undid my mods, but I had to post this.

            There used to be a brake pedal. I think it was Firefox 1.5 where this code didn't evaluate any tags:

            element.append(document.createTextNode(sText));

            The solution, therefore, was to manually parse italic/bold/a tags, to append those elements - and then create a text node inside. A perfect working DHTML/DOM solution, properly sanitized!

            However, with Firefox 3, text nodes now evaluate HTML tags. This handy function went out with eval usage for local callbacks. :/ Opera and Chrome also evaluate some(all?) tags for appende

    • Re: (Score:3, Insightful)

      by MathFox ( 686808 )
      It is just a reminder to programmers of public forum software of how important input sanitization is.

      Apparently the damage was limited to only one site... But similar hacks could be done on other vulnerable sites.

  • NoScript (Score:5, Insightful)

    by corychristison ( 951993 ) on Monday September 28, 2009 @09:45AM (#29564909)

    "NoScript FTW!" comments commencing in 3... 2... 1...

    I skimmed the FAQ on the first link, and it seems reddit is responsible for not scrubbing input.

    Next!

    • Re: (Score:3, Interesting)

      by CKW ( 409971 )

      I love how *their* mistake causes viral problems in YOUR browser. All one needs is some sort of cross site vulnerability now and ...

      • The obvious solution is that you need to get revenge. Go start a popular site for owners of popular sites so you can cause viral problems in THEIR browser.
    • Re:NoScript (Score:4, Insightful)

      by RiotingPacifist ( 1228016 ) on Monday September 28, 2009 @11:26AM (#29566239)

      Cue me reposting my views on noscript being a pretty crappy tool for modern web security then.

      NoScript comes from a broken way of thinking: "you can identify attacking sites and trusted sites". The attack code for this was coming from reddit.com (a site you have to allow in order to use reddit). The only way this sort of bug can be protected against is by use of javascript filtering tools such as controldescripts [mozdev.org] that filter javascript requests by type and domain; with such a tool it would be possible to protect yourself much more effectively.

      mouseclick is submitting info -> allow
      mouseover is requesting data -> allow
      mouseover is submitting data -> request user confirmation
      javascript function is doing something weird -> request user confirmation
      javascript is trying to use a known exploit* -> deny and notify user (as a workaround for 0-days, simply blocking the bad JS calls will protect users much faster than browsers usually get patched)
      ...etc

      You could also combine this with domain checking to have lists of pages where you allow
      *no-js (untrusted),
      *simple-JS (google, youtube, etc) but [it might allow functionality but could prevent tracking],
      *complex-js (facebook, etc) [all the ajax stuff means simple-JS wouldn't work]
      *all-JS (fancynewsite.com) [even the complex list of functions you allow just isn't enough]

      Such tools could also help the paranoid among us use websites that require JS, by disabling mouse tracking and the sending of data on non-click actions.

      As long as people stick to the broken thinking of trusted/untrusted domains, there is little chance of this actually happening. The worst thing about noscript is that for an unknown site you often have to allow JS on it to see what it looks like, so unless you plan on only browsing sites you've already been to and those that don't use javascript, it is completely useless yet its users claim, nay genuinely think they are more secure!
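
      As a toy illustration of the shape of those rules (entirely hypothetical Python; this is not how the linked extension actually works, just the idea of per-action policies):

      # Map (what triggered the script, what it is trying to do) to a decision.
      POLICY = {
          ("click", "submit"): "allow",
          ("mouseover", "fetch"): "allow",
          ("mouseover", "submit"): "confirm",   # the reddit worm falls under this rule
      }

      def decide(trigger, effect):
          # Anything not covered by an explicit rule asks the user.
          return POLICY.get((trigger, effect), "confirm")

      print(decide("mouseover", "submit"))  # confirm -> the worm cannot silently post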

      • Re: (Score:3, Insightful)

        by 0123456 ( 636235 )

        The worst thing about noscript is that for an unknown site you often have to allow JS on it to see what it looks like, so unless you plan on only browsing sites you've already been to and those that don't use javascript, it is completely useless yet its users claim, nay genuinely think they are more secure!

        If I go to an unknown site and it doesn't display anything useful without JS then I generally go somewhere else; if the developers are so inept that they can't make their site do something useful without it then the site is probably a heap of steaming monkey poo or a malware distributor.

        Back in the real world, it's hard to see how allowing arbitrary JS to run on your system can be considered 'more secure' than only running it from sites you trust. This 'exploit' is nothing to do with insecurity, it's to do

    • by Neoncow ( 802085 )

      Neither aedes nor I work for reddit. We were simply reporting what was known at the time, to prevent further spreading and to keep users from panicking (people were thinking they were going to get banned for spamming, worrying about loss of karma, et cetera).

      The admins acted within an hour. KeyserSosa is an admin; his username is highlighted in red and has an [A] next to it.

  • by mcgrew ( 92797 ) * on Monday September 28, 2009 @09:49AM (#29564953) Homepage Journal

    guyhersh from reddit.com describes the situation (warning: title NSFW)

    Does anybody have a SFW link? Something like this certainly must have more than one FA.

  • Reddit Hacks (Score:3, Interesting)

    by jDeepbeep ( 913892 ) on Monday September 28, 2009 @09:55AM (#29565031)
    This is nothing new. There is a quiet tradition of Reddit users finding the weak points of the site, like this [reddit.com] for example.

    Putting javascript:$(".up").click()() in the address bar upvotes everything on the page.
  • by Anonymous Coward

    Indeed, it will educate people to surf with javascript turned off, and it will hopefully educate webmasters to stop programming their sites in a way that requires javascript even for basic functionality.

    Anyone who believes this has simply never written a web application. Javascript and cookies are absolutely essential to any web programmer who wishes to have any type of dynamic content on a page. It annoys me to no end when someone says the solution to security holes is to turn these features off. The solution is for programmers to stop being idiots and write secure code, both in web applications and in the browsers themselves.

    • by Anonymous Coward on Monday September 28, 2009 @10:21AM (#29565353)

      As a web developer, I beg to differ. There is absolutely no excuse for writing a page that doesn't 'fail gracefully' when javascript isn't present. Let's face it, for every reputable page out there (att.net, youtube.com, etc) there are a hundred others designed by average joe-schmo webprogrammers. And lord only knows if they designed their page securely, and lord only knows if someone has hacked them and injected malicious scripts. I seem to recall hearing a few weeks ago that the majority of malicious scripts were being put into hollywood celebrity gossip sites that people were hitting off their google searches.

      For me, the solution is to just whitelist the sites I visit frequently, only allowing scripts/cookies when I know they can be trusted. I'm not saying that you shouldn't design without javascript, but I am saying that you shouldn't assume that everyone visiting your page is going to have it. Besides, how hard is it to write a page that vomits up its contents in a readable form when the javascript doesn't run to position all the css objects? It doesn't have to look pretty, but it should be usable.

      • Re: (Score:3, Interesting)

        by lwsimon ( 724555 )

        Amen. I've gotten into the habit of structuring the document, outputting the data into readable form, then using CSS and JS to make it look and behave how I want it to.

        There are some pages where "no access without javascript" is acceptable - but they are few and far between. For the most part, you should be able to use Lynx and view the content.

      • by spike2131 ( 468840 ) on Monday September 28, 2009 @12:08PM (#29566947) Homepage

        There is absolutely no excuse for writing a page that doesn't 'fail gracefully' when javascript isn't present.

        Yes there is. Making your page fail gracefully takes extra time and resources, which could be put to better use than supporting the 1% of users who choose to handicap their browsers by turning off javascript.

        Failing gracefully is an important concern, but it's not the only concern, and should be balanced against other priorities.

        • Re: (Score:3, Insightful)

          by psydeshow ( 154300 )

          The idea is to build the page in fail-state first, and then use JavaScript to enhance it. Or in other words, build your DOM and then restyle, add event listeners, etc.

          It doesn't take extra time, and it's a great technique for future-proofing your pages. It also makes them accessible to people who, for whatever reason, can't take advantage of teh javascript. If your website is in the US, and is big enough for anyone to care, ADA compliance pretty much requires it.

        • Re: (Score:3, Interesting)

          by horza ( 87255 )

          Absolutely right for your personal homepage. A professional web designer would not be able to get away with this. This kind of laziness translates directly into additional support costs for the client. And each time Microsoft recommends turning off Javascript due to a 0-day exploit, you are cutting off more than 1%.

          I can't think of any cases where it is ok to not fail gracefully. I hope you are not talking about just using client side validation, one of the most used cases for Javascript but where you must

    • by aardvarkjoe ( 156801 ) on Monday September 28, 2009 @10:32AM (#29565507)

      The solution is for programmers to stop being idiots

      Any proposal that relies on any group of people to not be idiots is doomed to failure.

    • The solution is for programmers to stop being idiots and write secure code

      Yeah because that mantra has really caught on, especially with Microsoft employees.

      Face it, programs are written by people, people are made to f*** up on an epic scale [wikipedia.org], therefore you need to be ready to handle epic f*** ups or just not play ball. Granted, you don't get the same dynamic experience, but that's the trade-off. I'm sure the guy you're quoting understands that.

    • by ultranova ( 717540 ) on Monday September 28, 2009 @10:39AM (#29565585)

      Anyone who believes this has simply never written a web application. Javascript and cookies are absolutely essential to any web programmer who wishes to have any type of dynamic content on a page.

      So by advising people to disable Javascript, I'm doing my part for killing off "Web Applications" and getting us back to good old Web Pages. Excellent.

      Seriously, why would I want "dynamic content" when all that really means is a thousand pauses as more data is fetched? Give me static pages whenever possible. Better yet, give me a single large static page rather than a dozen small pages, so I don't have to wait while the next page is being loaded and rendered.

      The solution is for programmers to stop being idiots and write secure code, both in web applications and in the browsers themselves.

      The solution is to understand that most web sites are not applications from the user's point of view, and to stop stuffing them full of scripts that do nothing but slow things down.

      • by k8to ( 9046 )

        Hooray for ultranova.

        There's a few rare cases where I actually want a web application. Most of the web applications I view as totally useless or inferior to native applications.

        Most web pages aren't even bad web applications, they're just WEB PAGES. Don't require javascript to do amazingly trivial things like.. load the content.

    • by naasking ( 94116 )

      Well, neither cookies nor JavaScript are strictly necessary. REST demonstrates that URLs suffice. JavaScript certainly makes it more pleasant, and cookies can be used to address some usability problems (though they are currently abused).

  • Already fixed. (Score:3, Informative)

    by complete loony ( 663508 ) <Jeremy@Lakeman.gmail@com> on Monday September 28, 2009 @10:08AM (#29565183)

    KeyserSosa: Thanks for this (and thanks, aedes). I'm going to steal his idea and post here as well. We've fixed a couple of underlying bugs in markdown.py, and will write a blog post for those interested once the dust settles. We've also gone through and deleted the offending comments. This exploit was a good old-fashioned worm, and its only purpose seems to have been to spread (and spread it did). The effect was limited to the site, and no user information was compromised.

    So obviously this is no longer spreading.

  • A Good Idea (Score:5, Insightful)

    by CopaceticOpus ( 965603 ) on Monday September 28, 2009 @10:13AM (#29565255)

    Hey, everyone, there is a javascript exploit on Reddit! Click on these links to Reddit to learn more.

    Incidentally, this old sock smells awful. You should smell it.

  • Myspace (Score:3, Interesting)

    by RalphSleigh ( 899929 ) on Monday September 28, 2009 @10:36AM (#29565557) Homepage
    Reminds me of a very similar worm that hit myspace years ago:

    http://web.archive.org/web/20060208182348/namb.la/popular/tech.html [archive.org]

    Same thing, find a way of executing javascript and then have it self-replicate by posting itself all over the site.
  • by Thanshin ( 1188877 ) on Monday September 28, 2009 @10:41AM (#29565603)

    Can you imagine the same people in other fields of science?

    "...Hey guys, look! I made the black hole generator we were theorizing yesterday! See? I just have to press this button and

  • Comment removed based on user account deletion
  • Oh cool, now I can finally create the signature virus!

"When the going gets tough, the tough get empirical." -- Jon Carroll

Working...