Security | The Internet

Thwarting New JavaScript Malware Obfuscation

I Don't Believe in Imaginary Property writes "Malware writers have been obfuscating their JavaScript exploit code for a long time now and SANS is reporting that they've come up with some new tricks. While early obfuscations were easy enough to undo by changing eval() to alert(), they soon shifted to clever use of arguments.callee() in a simple cipher to block it. Worse, now they're using document.referrer, document.location, and location.href to make site-specific versions, too. But SANS managed to stop all that with an 8-line patch to SpiderMonkey that prints out any arguments to eval() before executing them. It seems that malware writers still haven't internalized the lesson of DRM — if my computer can access something in plaintext, I can too."
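To make the summary concrete, here is a toy sketch of both sides (illustrative only, not the actual exploit code or the SANS patch; all names are made up). The decoder keys its cipher on its own source via arguments.callee, so swapping eval() for alert() changes the key and breaks decryption, while a page-level eval wrapper roughly approximates what the SpiderMonkey patch does at the interpreter level:

    // Toy decoder: the key depends on the decoder's own source text,
    // so editing the script (e.g. eval -> alert) breaks the cipher.
    function decode(payload) {
        var key = arguments.callee.toString().length % 256;
        var out = "";
        for (var i = 0; i < payload.length; i++) {
            out += String.fromCharCode(payload.charCodeAt(i) ^ key);
        }
        eval(out); // run the decoded payload
    }

    // Page-level approximation of the countermeasure: log whatever reaches eval()
    // before executing it. (The real fix patches the engine, not the page.)
    var realEval = window.eval;
    window.eval = function (src) {
        console.log("eval() called with:\n" + src);
        return realEval(src);
    };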
  • SANS (Score:1, Funny)

    by Anonymous Coward

    I'm still not sure what I think of them.
    I mean, it's a great idea. But they update their diary every day, which means for the most part, it's totally boring crap. Today's entry is a little different.

    I still think of SANS as a bunch of old guys all sort of pontificating about the most mundane things. Wind back 5 years and I think they had a valid part to play, especially with the amount of viruses and worms flying around. These days, not so much. Security is so much higher on everyone's radar that they're a bit old in the tooth now; the Internet just isn't that risky anymore.

    • Re: (Score:3, Insightful)

      by sm62704 ( 957197 )

      But they update their diary every day, which means for the most part, it's totally boring crap.

      Welcome to my slashdot journal (NSFW)

      they're a bit old in the tooth now

      Piece of cake, easy as pie. The saying is "long in the tooth", comrade.

      the Internet just isn't that risky anymore.

      You're not paying attention.

  • Is it just me, or is this way of getting around it mind-blowingly obvious?

    The techniques the malware writers are using are quite interesting, though; I've never heard of arguments.callee before.

  • It seems that malware writers still haven't internalized the lesson of DRM -- if my computer can access something in plaintext, I can too.

    In fact, that's the lesson from any digital copy protection scheme, some of which predate DRM (or at least the term DRM).

    • by panaceaa ( 205396 ) on Tuesday July 15, 2008 @03:02PM (#24202109) Homepage Journal

      I'm glad you highlighted that line of the summary. The point of the obfuscation was to slow down analysis of the code and require special tools (SpiderMonkey) that average web users don't have. Here the malware author clearly won. The article author spent hours figuring out a new obfuscation technique and writing an article about it. If there are malware detectors, they have to be updated to detect the new obfuscations.

      This is not the traditional DRM argument. No one's trying to decode a video or music file they have legal rights to access. This is a malware arms race: the point IS to hide what's going on, not to lock things down. What's more interesting here, and not even discussed, is the parallel between Javascript malware development and computer viruses. The technique the author uncovered is an adaptation of polymorphic virus concepts into web malware. And while the technique is something many developers could come up with, I haven't heard of it being used in practice yet, so it's likely a noteworthy step in the arms race.

      • The point IS to hide what's going on, not to lock things down.

        Right, this is not security through obscurity, folks.
        It is obscurity through obscurity!

        • Oh that'll never work.
        • Plus Firebug 1.2 already does what their patch does. If you want to see what the final execution result is, click the dropdown in the Scripts window to see the text of all eval() calls.

          How long until they do setTimeout("final code", 1) instead of eval(), and how long until they do document.write("<div id='foo' onclick='malware.code;'></div>"); document.getElementById('foo').onclick(); etc.? As gp said, it's a malware arms race; they're changing their obfuscation techniques to bypass the current detection tools.
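          A rough, illustrative sketch of those two eval()-free paths (decodedPayload is a harmless placeholder, not real malware):

              // Illustrative only: two ways to run a decoded string without calling eval().
              var decodedPayload = "console.log('payload would run here')";

              // 1. setTimeout with a string argument compiles and executes it much like eval():
              setTimeout(decodedPayload, 1);

              // 2. Write an element with an inline handler, then invoke the handler directly:
              document.write("<div id='foo' onclick=\"" + decodedPayload + "\"></div>");
              document.getElementById('foo').onclick();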

      • by Anonymous Coward

        > I'm glad you highlighted that line of the summary. The point of the obfuscation was to slow down analysis of the code and require special tools (SpiderMonkey) that average web users don't have. Here the malware author clearly won. The article author spent hours figuring out a new obfuscation technique and writing an article about it.

        Actually, it looks like they spent more time writing about it than solving it. That 8-line patch isn't exactly complex, and it appears that it'll kill most techniques they're currently using.

      • by kesuki ( 321456 )

        "The point of the obfuscation was to slow down analysis of the code and require special tools (SpiderMonkey) that average web users don't have."

        NoScript is an easy Firefox add-on. If you're advising 'normal' people how to use the internet safely, you'd tell them to use NoScript and only allow the really trusted sites that they can't live without.

        Personally, I don't trust any site that much. Besides, with NoScript, Slashdot tells you to use the old-style layout instead of the new one.

      • Here the malware author clearly won

        No more than the RIAA has "won" by imposing DRM on their music downloads and on iTunes. In that case it only takes one cracked version to leak out before everyone else gets the benefit of the original cracker's work ad infinitum. Likewise, once the detection is worked into a Javascript blocking and filtering tool, such as no-script [mozilla.org], everyone using no-script benefits from the original analysis (it doesn't have to be cracked afresh each time it comes up). So the malware author is really only inconveniencing the first person who has to analyze it.

      • by asrail ( 946132 )

        SpiderMonkey is Mozilla's JavaScript engine written in C; they also have one written in Java, called Rhino, and one written in JavaScript whose name I don't remember.

        SpiderMonkey is used in the Firefox browser, so millions of people already have it handy.

        It's fairly easy to make a graphical interface that lets any end user copy the data or save it to a file.

        Either way, you have to update your techniques when new malware appears.
        The better your heuristic, the less you have to update.

  • Funding (Score:1, Flamebait)

    by Joebert ( 946227 )
    Who pays these freakin' people?!

    I could sit here all day coming up with ways to put ghosts in the machine!
    • Sorry, your job application has been denied. If you don't know where we are, you don't belong here. :)
      • by Joebert ( 946227 )
        Apparently I don't belong on either side of the fence. I suppose I'll just have to burn it down with my bait of flame. :)
  • This is too much. Now we'll all have to download a pre-validator for JavaScript to view the code ("what does this code do? I can't read this, I'm an 80-year-old grandmother...") before going to the webpage and viewing it... It sucks to go on the web these days!

    • by loftwyr ( 36717 )

      No, no... it prints it out to itself and then executes it. You'll see no more than you did before.

      Unless you're a hacker, and then it will stick its tongue out at you first.

  • Baby & Bathwater? (Score:5, Informative)

    by XanC ( 644172 ) on Tuesday July 15, 2008 @02:15PM (#24201275)

    There are certainly legitimate uses of eval, and legitimate reasons to "obfuscate": for instance, to compress the script that you send to each and every client. The savings in bandwidth for you (and for them, especially if they're on dialup) can add up. For example: http://www.javascriptcompressor.com [javascriptcompressor.com]
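    As a rough illustration of that kind of compression (a toy sketch only; real packers such as the base62 encoder linked above are far more involved, and every name here is made up), long identifiers are swapped for short tokens and a small eval() stub restores and runs the code on the client:

        // Toy "packer": shorten identifiers, ship a stub that rebuilds and evals the source.
        var source = "function timeSinceLastUpdate(){ return Date.now() - lastUpdateTimestamp; }";
        var dictionary = { "0": "timeSinceLastUpdate", "1": "lastUpdateTimestamp" };

        var packed = source
            .replace(/timeSinceLastUpdate/g, "_0")
            .replace(/lastUpdateTimestamp/g, "_1");

        // Unpacking stub that runs in the browser: substitute the tokens back, then eval.
        var unpacked = packed.replace(/_(\d)/g, function (m, d) { return dictionary[d]; });
        eval(unpacked); // defines timeSinceLastUpdate() as if the full source had been sent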

    • gzip compression is just as effective.
      • by XanC ( 644172 )

        Well, if you leave off the base62 encode option, you get a file that's "prepped" for better gzipping. But of course that doesn't require an eval, which was the point of this whole thread, so you're right about that.

        I've also noticed, though, that IE will barf on long Javascript files, so doing the base62 compression on the Javascript even with gzip is a workaround for that.

    • by TLLOTS ( 827806 )

      Simply minifying your scripts by stripping out comments and unneeded whitespace will do almost as good a job as compressing them with eval. Add in gzipping and the difference is negligible, plus there's no additional client-side delay from decompressing the script on each and every page that uses it.

    • It isn't an either/or choice, but programs with verbose variable names (which is typically one of the first targets of javascript compression: "replace timeSinceLastUpdate with r") compress disgustingly well. You may find that the gzip compression is effective enough that the obfuscation isn't worth the various attendant headaches (maintaining two versions of the code, etc).

  • document.referrer (Score:4, Interesting)

    by Anders ( 395 ) on Tuesday July 15, 2008 @02:16PM (#24201299)
    I turn off referrer headers for privacy (set network.http.sendRefererHeader to 0 in about:config in Firefox). Now it seems that it can also save me from malware :-). Why would you want it enabled, anyway?
    • Re: (Score:2, Informative)

      by ypctx ( 1324269 )
      Many sites won't work without it, mainly to prevent "hotlinking".
      • Re: (Score:2, Informative)

        by Anders ( 395 )

        Many sites won't work without it, mainly to prevent "hotlinking".

        That is about as effective as User-Agent sniffing.

        This Firefox addon [mozilla.org] gives you arbitrary Referer headers on a per-site basis.

      • Re: (Score:2, Informative)

        by ArcticFlood ( 863255 )

        Most "hotlinking prevention" methods (either in a .htaccess or in PHP) that I've seen allow an empty referrer, since no referrer usually means the request came from a bookmark or a URL entered by hand. Since this also lets people copy and paste links to the site, these methods are generally pointless unless there is a real problem.

        • by ypctx ( 1324269 )
          If you allow no referrer for a web page, you will usually get no traffic from outside, like from search engines and other pages that might link to you.
          If you allow any referrer for an image, you are allowing anybody to embed this image into their page, thus stealing your bandwidth. To prevent that, you only allow your own pages to refer to your own images. Of course this can be spoofed manually by the client, but that's too complicated for most people.
          A funny thing is that the "HTTP_REFERER" header name is wrongly spelled; the misspelling of "referrer" was in the original HTTP spec and it stuck.
          • Re: (Score:2, Informative)

            by ArcticFlood ( 863255 )

            I was unclear. I meant an empty referrer, which occurs when you weren't referred by a URL (such as typing the URL manually or clicking a bookmark). If you prevent the use of an empty referrer, your page cannot be bookmarked or manually typed in the address bar, which is why it is allowed.

    • Re:document.referrer (Score:4, Interesting)

      by geminidomino ( 614729 ) * on Tuesday July 15, 2008 @02:20PM (#24201381) Journal

      I turn off referrer headers for privacy (set network.http.sendRefererHeader to 0 in about:config in Firefox). Now it seems that it can also save me from malware :-).

      Why would you want it enabled, anyway?

      Silly websites that check it as some sort of "security." Easily foiled by sending the site's own URL as the referer though.

      • by vrmlguy ( 120854 )

        I turn off referrer headers for privacy (set network.http.sendRefererHeader to 0 in about:config in Firefox). Now it seems that it can also save me from malware :-).

        Why would you want it enabled, anyway?

        Silly websites that check it as some sort of "security." Easily foiled by sending the site's own URL as the referer though.

        Of course, that might revive any Javascript malware.

      • I turn off referrer headers for privacy (set network.http.sendRefererHeader to 0 in about:config in Firefox). Now it seems that it can also save me from malware :-).

        Why would you want it enabled, anyway?

        Silly websites that check it as some sort of "security." Easily foiled by sending the site's own URL as the referer though.

        Even that doesn't work for all sites. Newegg [newegg.com], for example, won't let you finish checking out if you forge the referrer. I had to add an exception for it to RefControl.

        P.S. I have RefControl.

      • Re: (Score:2, Interesting)

        by Nos. ( 179609 )
        I check the referrer header for images on some sites, not for security, but to cut down on bandwidth thieves hotlinking them. On more than one occasion folks on busy forum sites have linked to my images, which costs me bandwidth. Checking that the referrer is either the local site or blank reduces that waste to virtually zero. Yes, some will still get through, but the few minutes it takes to add the check to the virtual host configuration in Apache is well worth it.
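        Not the Apache configuration described above, but a minimal Node.js sketch of the same check (hostname and responses are placeholders): allow the local site or a blank referrer, reject everything else.

            // Illustrative hotlink check: serve the image only when the Referer header
            // is empty (bookmark, hand-typed URL) or points at our own site.
            var http = require("http");

            http.createServer(function (req, res) {
                var referer = req.headers.referer || "";
                var allowed = referer === "" || referer.indexOf("http://example.com/") === 0;

                if (!allowed) {
                    res.writeHead(403);
                    res.end("Hotlinking not permitted");
                    return;
                }
                res.writeHead(200, { "Content-Type": "text/plain" });
                res.end("...image bytes would be served here...");
            }).listen(8080);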
        • I agree with your method, but "bandwidth thief" is misplaced. Nothing wrong with a referrer check, and I don't hotlink on forums... it's rude. But that's it. Rude, at best... not thievery. You posted the image, and it's your hosting bill. So if you only want it served in a specific context (certain referrer, certain browser, who knows) it's your responsibility to host it that way. Otherwise, people's browsers are asking for the image, and you're serving it. If you don't like giving out Halloween candy, don't answer the door.
          • by Nos. ( 179609 )

            If you don't like giving out Halloween candy, don't answer the door.

            It's more like my neighbour handing out the candy I bought. He gets the "credit" while I paid for the goodies.

    • by antdude ( 79039 )

      Some sites hate it and I get blocked, missing images, etc. :(

    • by Snaller ( 147050 )

      "Why would you want it enabled, anyway?"

      To access the thousands of sites which check it to make sure nobody is "stealing" their bandwidth.

    • by Bisqwit ( 180954 )

      Because website authors can use the referrer field to improve their services, by figuring out which access patterns are most common, and which links should be made more or less prominent.
      By hiding that information, you are depriving them of that possibility, and you are therefore depriving the Internet of a certain means of becoming better.

      • by IdeaMan ( 216340 )

        I thought this was Slashdot, and privacy trumped all?
        Any time we give information away it gets used against us. Thanks to one of the previous posters as of today I now use RefControl.

  • Some sites could potentially use it to aid in navigation. It's not a great option, but it can be better than relying on back buttons, especially if the site uses lots of forms.

    I've never actually used it like that (I prefer to store that kind of thing in session variables if I'm forced to), but I could see someone doing so.
  • DRM Lesson (Score:3, Insightful)

    by MyLongNickName ( 822545 ) on Tuesday July 15, 2008 @02:18PM (#24201329) Journal

    It seems that malware writers still haven't internalized the lesson of DRM — if my computer can access something in plaintext, I can too.

    The malware writers don't need a 100% success rate. They are simply trying to get their software on enough machines to build a nice bot empire.

    • Exactly. The idea behind JavaScript obfuscation is to get past automated tools (antivirus engines), not flesh-and-blood analysts, and it does that job very well. It really isn't the same thing as DRM.
  • stop (Score:5, Funny)

    by ypctx ( 1324269 ) on Tuesday July 15, 2008 @02:18PM (#24201335)

    stop all that with an 8-line patch to SpiderMonkey

    Cool, and now malware engineers will lose their jobs, you insensitive clods! Internet Explorer to the rescue!

  • Let's hope they don't try stuff like this:

    http://en.wikipedia.org/wiki/Rob_Northen_copylock [wikipedia.org]

    It was hard. I guess there's no way to use special CPU modes here, but you could still knock up a large amount (megs) of seemingly random data containing code that you decrypt a few bytes at a time, re-encrypting the 'code' you just executed, and hide the actual guts of your program within thousands of jumps, loops and other messed-up logic.

  • It's not obfuscation (Score:5, Informative)

    by Anonymous Coward on Tuesday July 15, 2008 @02:49PM (#24201879)

    Sure, it may look like the attacker is cleverly trying to obfuscate their malware from prying eyes, but usually they couldn't care less about that. By the time you get around to reversing their code, they've already gotten the bulk of their victims anyway.

    Rather, they're most often using it to make the code easy to replicate elsewhere. A lot of places they'll host it will inadvertently hiccup on certain characters in the code and change them. Like < to &lt;, or + to space, or newline characters ending the string. Using an encoder that converts everything to alphanumeric characters makes it much easier to guarantee successful propagation.

    Especially true for XSS worms.
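    For illustration only, an encoder along those lines: the payload travels as bare decimal character codes, so the characters hosts tend to mangle (<, +, quotes, newlines) never appear in transit. The payload string is a harmless placeholder.

        // Encode: turn every character of the payload into its decimal char code.
        var payload = "console.log('payload would run here')";
        var encoded = [];
        for (var i = 0; i < payload.length; i++) {
            encoded.push(payload.charCodeAt(i));
        }

        // Decode and execute on the target page: only digits, commas and a short
        // decoder stub have to survive whatever rewriting the host applies.
        eval(String.fromCharCode.apply(null, encoded));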

  • Surely, that should read -
    evil()
    ?
    It is, after all, an inbuilt function of JavaScript.
  • Isn't that the thing I've got turned off in my preferences?
  • by X ( 1235 ) <x@xman.org> on Wednesday July 16, 2008 @09:09AM (#24211955) Homepage Journal

    I did a fairly detailed analysis [xman.org] of an instance of typical JavaScript malware these days.
