New JavaScript-Based Timing Attack Steals All Browser Source Data

Trailrunner7 writes "Security researchers have been warning about the weaknesses and issues with JavaScript and iframes for years now, but the problem goes far deeper than even many of them thought. A researcher in the U.K. has developed a new technique that uses a combination of JavaScript-based timing attacks and other tactics to read any information he wants from a targeted user's browser and from sites the victim is logged into. The attack works on all of the major browsers, and researchers say there's no simple fix to prevent it."
  • by Anonymous Coward

    Disable Javascript.

    • Re: (Score:3, Informative)

      by Anonymous Coward

      Disable Javascript.

      You might as well stay off of the Web then.

I tried that a couple of times. I couldn't do any banking, use my brokerage account, or use any financial sites, and other content would not display correctly.

      Unfortunately, JavaScript has become a necessity for the Web.

      I can't think of any website that actually worked without it.

      • You could try enabling it on your bank's website.

        • by Anonymous Coward

          You could try enabling it on your bank's website.

          Which I did.

          The trouble is, very few websites work without it.

          In other words, I was whitelisting every website that I visited.

JavaScript is used so much that I never came across a website that would function without it.

          No JavaScript == No Web.

          • by Anonymous Coward

This is just not true. Most of the sites I visit work fine with JS off (NoScript). Any site using Unobtrusive JavaScript will work fine. I use SSBs (site-specific browsers) for the ones that require credentials and JS. In other cases, I just don't use the site.

            Second, add RequestPolicy. Then you can enable JS per-site but be free from all of the cross-domain attacks*. It also requires that you build up a whitelist, but makes you MUCH safer online.

            *won't help with stored XSS, but not much will without major changes.

Yup - I used to be a religious user of NoScript, but gave up when I found myself allowing JavaScript even on, uh, less than trustworthy sites.

I found myself allowing JavaScript even on, uh, less than trustworthy sites.

Well, that's on you. I use it too, to good effect. I have a few websites I trust more than most, but I still disallow popups and scripts from second-level sites, and it appears to prevent cross-site scripting effectively enough. Not using it... what's your point?

              • My point is that when even untrusted sites are whitelisted, I wasn't really protecting myself anymore, just being anal. I'm sure I was slightly safer that way, but it was no longer worth the effort to me. Clearly you value your time and effort differently than I do mine in relation to the amount of protection you are afforded.

          • Which I did.

            The trouble is, very few websites work without it.

            In other words, I was whitelisting every website that I visited.

JavaScript is used so much that I never came across a website that would function without it.

            No JavaScript == No Web.

            There's a reason you're posting AC...?

            a) Your story sounds unlikely
            b) If you think your security is worth less than a mouse click then you'll get the browsing experience you deserve.

        • web 1.0 ftw!
      • by dicobalt ( 1536225 ) on Sunday August 04, 2013 @10:50AM (#44470173)
        NoScript is your friend.
        • Re: (Score:3, Informative)

          by Anonymous Coward

Other fix: disable iframes.

        • by plover ( 150551 ) on Sunday August 04, 2013 @11:37AM (#44470415) Homepage Journal

JavaScript is cool for offering great content. But why would anyone allow JavaScript from non-primary-domain sources? Advertisers may want their readers to have a "rich, interactive, dynamic experience". Fine, they can offer that: on their own site, after users click through from a static image on yours.

The rest of the linked-in JavaScript out there is mostly analytics, which does not benefit you as a user.

And as a web site operator, you can be pretty sure that customers don't want to be pwned just because of a script brought in by your site. Should you really be linking to others that offer it?

The GP said "he's whitelisting everything." He's doing it wrong: allow JavaScript from servers in *.domain.com for any given page, then selectively enable it from sites that add features you care about, like Disqus and Vimeo. It's not a long list, and once you've whitelisted vimeo and vimeocdn for one site, you're not constantly enabling them on others.
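(On the site-operator side, a hedged sketch of what "don't link to others" can look like in practice: a Content-Security-Policy header, which browsers were beginning to enforce around this time, refuses to execute scripts from anywhere but your own origin. The Node.js handler below is illustrative, not from the article.)

---
// Minimal sketch (hypothetical Node.js server): instruct browsers to run
// scripts only from this site's own origin, so a linked-in or injected
// third-party <script> simply never executes in CSP-aware browsers.
var http = require('http');

http.createServer(function (req, res) {
  res.setHeader('Content-Security-Policy', "script-src 'self'");
  res.setHeader('Content-Type', 'text/html');
  res.end('<html><body>Hello</body></html>');
}).listen(8080);
---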

          • by marauder ( 30027 )

A lot of people load their jQuery libraries or whatnot from a CDN. In fact I think that's the preferred behaviour. There are multiple CDNs, so the list is a bit longer and more annoying than you'd think.

            Some links for background:
            http://encosia.com/3-reasons-why-you-should-let-google-host-jquery-for-you/ [encosia.com]
            http://royal.pingdom.com/2012/07/24/best-cdn-for-jquery-in-2012/ [pingdom.com]
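(The pattern those links describe is usually paired with a local fallback, so a blocked or unreachable CDN doesn't break the page. A sketch of the common idiom; the version number and local path are placeholders:)

---
<!-- Load jQuery from the Google CDN, then fall back to a locally hosted
     copy if the CDN copy didn't arrive (version and path hypothetical). -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>
  window.jQuery || document.write(
    '<script src="/js/jquery-1.10.2.min.js"><\/script>');
</script>
---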

            • by plover ( 150551 )

That considers only the performance viewpoint. As a web developer, it's valuable to him because A) he's not responsible for hosting the latest version himself, B) he's not paying to deliver it to his viewers, and C) his users can reuse the already-cached copy of the script they got visiting a different site.

Security-wise, it's risky. If someone's encountered malware that's stored a poisoned version of jquery in their web cache, and they go to your site, they're already pwned - and now they're on your site…

              • It's not just security to worry about - although that isn't really a big concern as most CDNs hopefully never fall to exploits themselves...

Working in a "content agency," we've used CDN-hosted JS before. Not shabby, but I prefer to keep it on our servers. It doesn't really save you on point B - if you're hosting the page, host the damn JS. The caching is nice, but your JS should be small anyway... hopefully. We ran into an issue with jQuery Mobile over CDN - we were pointing to the latest stable, but surely…

              • by cas2000 ( 148703 )

                it's worse than that - if a site uses a CDN to host javascript (e.g. common stuff like jquery) then the user has to allow javascript from that CDN.

                That's not just the jquery or whatever they need to view one particular site, it's every other script hosted on that CDN by any other site they happen to visit or stumble upon.

                The end result if you do this is almost identical to just enabling javascript everywhere.

it also puts the CDN operators in a privileged position of being able to spy on users across a huge number of sites.

          • by pmontra ( 738736 )
As a NoScript user I have to allow Cloudflare or many sites don't work. Many sites don't degrade gracefully nowadays, and they put their main JS on accelerators like Cloudflare.
          • by jfengel ( 409917 )

They also like to link to sites like jquery and google and other sites that host basic JavaScript libraries they depend on. I'd just as soon they download the code and serve it from their own domain, but this way they get automatic, dynamic upgrades and bug fixes.

          • Analytics do benefit you as a user. That's how companies find out things. They find out what is popular so they can devote more resources to it.

            Plenty of other examples but maybe you can think them through yourself.

            • by plover ( 150551 )

              Analytics allow site owners to see interesting data about their pages - how many visits, etc. The owners can theoretically improve their pages so I get a better experience. And it's free! Who doesn't love a free service?

              But the well-known rule is that if someone's giving you something for free, you are the product, not the consumer. So they're selling you out the back end. Think about what they now have to sell.

The answer is that the analytics providers are tracking behavior from search to sale…

          • So how do you deal with sites that use a CDN domain? As I understand performance 'best practice', you should deliberately host your static assets on a secondary domain so that it doesn't ever get sent cookies. You mention vimeo.com and vimeocdn.com, but many CDN domains aren't quite so obviously named - even here on /. we seem to have slashdot.org and a.fsdn.com and www.gstatic.com (not sure if gstatic is serving any JS, but you get the idea).

FWIW, I use Adblock + Ghostery to avoid most of the cross-domain…

            • by plover ( 150551 )

              Like you, I'm running Adblock Plus, Ghostery, and NoScript, and I'm manually whitelisting them in NoScript. It's definitely not a process that is ready for the general public. And like you I find that some sites are indeed difficult to unwind to discover what should really be enabled to restore functionality, and what is not valuable to me.

When I need something from a site, I usually walk the list of NoScript "forbids", enabling them one at a time until it works. I start with any that might appear to be…

        • by jfengel ( 409917 ) on Sunday August 04, 2013 @02:36PM (#44471283) Homepage Journal

          Frenemy. Or rather, lots of web sites are my frenemies, scooping up Javascript from dozens of web sites with no clear indication that they're aware of the interactions or trustworthiness of those sites. Slate.com is my particular nemesis here; I once counted two dozen separate sites that would have had to be enabled before the site could run as its designers intended, some of them down 4 and 5 layers of indirection.

NoScript, which treats everybody as an enemy until told otherwise, requires an awful lot of hand-holding before permitting that. NoScript I trust (more or less) to be on my side, but lots of web site designers consider it the enemy, and that makes our mutual encounters... tense.

Only because devs expect to be able to use it. Interactive websites worked before the latest fad for JS, and they still do! As for one that works without: you are on one!

        • Wrong. Take a look at the page source...
          • I see lots of tags but I believe they are rendered completely ineffectual thanks to NoScript.

            • You believe? What is this, faith based science? :)

I guess he could have studied the NoScript code (and the Firefox code it interacts with) in detail to make sure that NoScript indeed does what it is supposed to do in all cases. However, he decided not to do that work, but instead to trust the author of NoScript to provide what he advertises - probably encouraged by the fact that there seem to be no publicly known failures of NoScript in this regard.

      • Re: (Score:3, Interesting)

        by Teun ( 17872 )
Today I booted up the WinXP partition on a netbook that normally runs Kubuntu. The last time was over a year ago, and I thought: why not update while it's still possible?

Java popped up explaining there was an update, and I let it install.
Once the install was done, I was surprised to be asked my permission to run a check on the Java website; I was even given the option to tick a box to 'always trust Java from this publisher'.

Does the latest Java version now have such a site-by-site or publisher-dependent protection?

      • by DrSkwid ( 118965 )

        You obviously weren't doing it right.

        NoScript for Firefox and equivalent for Chrome have a site-by-site whitelist. One only enables JS as required.

I have also found it instructive as to the amount of third-party access sites are willing to sell. Online newspapers seem particularly keen; e.g. WashingtonTimes.com has JavaScript served up from 15 domains.

Combine that with other add-ons like Ghostery and Self-Destructing Cookies and you will be unpleasantly surprised at the unnecessary crap that's thrown at you…

      • by cas2000 ( 148703 )

        use a separate browser (preferably on a separate machine or VM, or at least a separate login account) with js enabled for your banking.

        Unfortunately, JavaScript has become a necessity for the Web.

not quite a necessity, but many sites are over-using jQuery and other JavaScript toolkits (even for basic stuff like links that can and should be done as plain a-href tags) and the web is *far* worse for it. it makes the web slow and frequently causes 100% CPU utilisation (which is no easy task on a modern 4 or 6 core machine)…

    • by gl4ss ( 559668 )

or add random delays to drawing and fix the bugs which allow viewing of other sites' sources and screenshotting.

those are really a bigger deal than finding out whether a link has a visited class applied to it or not.

Yes, adding random delays in the browser should fix the problem for the timing attack.
        • by plover ( 150551 )

Random delays only slow an attacker down; they may not prevent the attack.

          Let's say you had a reply that took 25 milliseconds if it was cached, and 75 milliseconds if it wasn't. To fix it, you add a delay from 0-100 milliseconds to a reply. The attacker would just have to repeat his attack about six times to see the average response time. He'd figure it out soon enough.
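(A sketch of why averaging beats uniform random padding, using plover's numbers; timeOnce stands in for whatever repeatable probe the attacker has:)

---
// Illustrative only: true cost is 25ms (cached) or 75ms (uncached), and the
// defense adds a uniform 0-100ms delay. Both cases gain ~50ms on average,
// so the 50ms gap survives: ~75ms vs ~125ms after a handful of samples.
function averageTiming(timeOnce, samples) {
  var total = 0;
  for (var i = 0; i < samples; i++) {
    total += timeOnce(); // one noisy measurement per call
  }
  return total / samples;
}
---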

          • by gl4ss ( 559668 )

            well, always act like the link was visited before or like it never was.

getting the contents of iframes is the bigger issue here anyway. I'm not sure why he is making such a big deal about the timing attack, since the timing attack in principle has nothing to do with the iframe content peeking or the gfx effect filter bug??

    • by Anonymous Coward

      Disable Javascript.

The fucking assholes in the Firefox project decided against all common sense to take away the option to disable JavaScript on a global/per-site basis.
      But hey, the modern rage is all about taking away options so...

    • Re: (Score:3, Insightful)

      by jarle.aase ( 1440081 )
      I agree.

If enough users disable JavaScript, sites will be forced to provide a content-generating back-end alternative. JS is becoming the new Flash: opening wide up for vulnerabilities and draining your laptop's battery.

  • by Anonymous Coward

    like disabling javascript?

• Or just port the browser over to Java. Then the attack can't tell the difference between a slow link and the obnoxious garbage collector kicking in. :P
    • Have you actually tried this? Are there any sites left that don't rely on javascript?

      You might as well just disable your browser.

      • Do you mostly surf porn sites? I find that something like 80% of web sites I browse display just fine without javascript. And the remaining 20% can often be substituted with equally good sites that do display without javascript.

  • by Anonymous Coward

Seems like turning off JavaScript should be a simple fix for a JavaScript-based attack.

    • by Horshu ( 2754893 )
That's akin to turning off Flash to get rid of ads. Sounds like a good - no, great - idea, until you run into the problem of so many sites depending on it. A better fix would be for browsers to allow disabling JS on a per-site basis, or better yet, to allow disabling individual JS APIs (yeah, it could turn site behavior into a clusterf$%k, but I would give up red meat to be able to disable window.open())
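(Per-API disabling can be approximated today from a user script; a sketch, assuming it runs before any page script does:)

---
// Hypothetical user-script sketch: stub out window.open so page scripts
// that call it get null back instead of a popup.
window.open = function () {
  console.log('window.open blocked');
  return null;
};
---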
      • That's akin to turning off Flash to get rid of ads. Sounds like a good - no, great - idea, until you run into the problem of so many sites depending on it.

        Not very many sites depend on Flash. Mainly video and online game sites. And there's always the option to whitelist.

Now that 99% of video is on YouTube and therefore accessible via HTML5, is Flash actually used by any significant site for anything but games and ads? I guess a few porn sites use it for video?

        For about three years I haven't had Flash installed in my main browser and I haven't missed it. Maybe twice a year I open my other browser to see one of the above.
      • by 0123456 ( 636235 )

        That's akin to turning off Flash to get rid of ads. Sounds like a good - no, great - idea, until you run into the problem of so many sites depending on it.

        I uninstalled Flash a while ago. Other than youtube, I run into maybe one site a month that won't work without Flash, and they're clearly run by retards so I'm better off not going there.

  • My browser won't let me open the target web site because it thinks it's nasty!

  • I thought it would be secure like my telecom....

    Sanitary like the stadium men's room.

    And trustworthy like my bank and credit score.

    Now I am very upset that I'll just have to wear clothing in public, once again. There goes the neighborhood.

  • by Natales ( 182136 ) on Sunday August 04, 2013 @11:20AM (#44470297)
TFA is correct that there isn't anything to patch per se. However, it's possible to mitigate the effects of this by using multiple completely isolated browser sessions for different purposes. Your banking VM should always be used for banking, nothing else. Clear cookies and browser history at the end of the session. Meanwhile, other VMs should be used for their own specific purposes, with their own security configurations.

This is very well implemented in Qubes OS [qubes-os.org] but can also be implemented via regular VMs. The guys at Bromium [bromium.com] also have an interesting approach to this issue via microvirtualization using hardware.

Net-net, the important thing is to make sure that whatever the attacker can get is irrelevant in the big picture of things.
    • by omz13 ( 882548 )
      That's all very well and good, but do you think the average web surfer even knows what you're talking about? Any solution needs to be baked into the bog-standard browsers instead of asking users to do VM magic.
      • by Reziac ( 43301 ) *

True (though I was interested to learn of these products)... so how does one go about that, design-wise? Run each part of the browser in its own VM, so it can't see the rest without user intervention?

  • by Anonymous Coward

JavaScript seems like it would be the easier fix (I use three browsers for different tasks, one of which is for everyday browsing and has JavaScript turned off), but JavaScript is necessary for pretty much everybody. Think Gmail and Google Maps. Sure, Google could support Thunderbird and have native map clients for everybody, but that would require a lot of work (arguably less, since JavaScript is so nasty, but they have kludged around that for the most part already...). So the simpler answer is a no-frames/iframes policy.
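(The no-iframes idea already has a server-side form that sites can deploy today: the X-Frame-Options response header, which tells browsers not to render your pages inside anyone's frame. A hypothetical Node.js sketch:)

---
// Refuse to be rendered inside an iframe at all, which blunts the
// screenshot-the-iframe variant of this attack for your own pages.
var http = require('http');

http.createServer(function (req, res) {
  res.setHeader('X-Frame-Options', 'DENY'); // or 'SAMEORIGIN'
  res.end('not frameable');
}).listen(8080);
---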

    • That's what I was wondering--no iframes.
And for the folks who hate JavaScript for whatever reason, apparently they think they can substitute some UNBREAKABLE software for it.
      Clue for the clueless--software is only unbreakable when it remains unpopular.
    • I think you underestimate just how many sites rely on frames. Gmail uses them for some functionality for one, though I dunno how critical it is.
      • by pmontra ( 738736 )
iframes are the only safe way to inject CSS and JS into a third-party page with no fear of conflicts with local code. Think of widgets: Facebook social widgets were using iframes last time I checked.
  • Fuck with the timing.

    Huh? What do you mean, "that would require today's programmers to at least know OF Assembler"...

    • Fuck with the timing.

      Huh? What do you mean, "that would require today's programmers to at least know OF Assembler"...

But... How would learning asm.js help?

  • by Tetravus ( 79831 ) on Sunday August 04, 2013 @11:29AM (#44470369) Homepage

So the guy figured out that browsers render all links on a page and then reflow any that should be styled to indicate they have already been visited. Apparently you can figure out which links have been reflowed by checking the number of frames that have to be rendered to display a link (a sketch of such a probe follows below). Not a big deal, and if your site uses the same style for visited and unvisited links, not an actual attack vector.
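(A hedged sketch of what such a redraw probe could look like; the element id, the 20ms threshold, and the "expensive" styling are assumptions for illustration, not the researcher's actual code, and browsers may mitigate this:)

---
// Assumes <a id="probe"> is styled with a costly effect (e.g. a large
// text-shadow) so a :visited restyle produces a measurably long frame.
var link = document.getElementById('probe');

function probeUrl(url, done) {
  link.href = url; // may trigger a :visited restyle and repaint
  requestAnimationFrame(function () {
    var t0 = performance.now();
    requestAnimationFrame(function () {
      // Gap between two consecutive frames; a 60fps frame is ~16.7ms,
      // so a long gap suggests the expensive visited-style redraw ran.
      done(url, performance.now() - t0 > 20);
    });
  });
}

probeUrl('http://example.com/', function (url, visited) {
  console.log(url, visited ? 'probably visited' : 'probably not');
});
---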

The second attack, using SVG (or, I assume, canvas) to create a screenshot of what's visible to the end user, could be leveraged for an actual attack - you know, if everyone didn't put iframe-busting code on their pages served over SSL. Vendors can update the SVG rendering system to adhere to the same cross-domain restrictions as other components and not include pixels from iframes in the buffer that is available to inspect via JS, and this hole will be closed.

    Not too much to worry about here, but I'm surprised that SVG doesn't already do this (canvas won't allow JS to work with cross-domain images unless they have been served with a header that marks them as "safe" according to their originating service).
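(The canvas rule in that parenthetical is easy to demonstrate; a sketch with a hypothetical image URL:)

---
// Drawing a cross-origin image without CORS approval "taints" the canvas;
// reading pixels back afterwards then throws a SecurityError.
var img = new Image();
img.onload = function () {
  var canvas = document.createElement('canvas');
  canvas.width = img.width;
  canvas.height = img.height;
  canvas.getContext('2d').drawImage(img, 0, 0);
  try {
    canvas.toDataURL(); // fails: image lacked Access-Control-Allow-Origin
  } catch (e) {
    console.log('canvas is tainted:', e.name); // "SecurityError"
  }
};
// Setting img.crossOrigin = 'anonymous' plus a CORS header on the server
// is the "marked as safe" path the parent comment describes.
img.src = 'http://example.com/image.png'; // hypothetical cross-origin URL
---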

  • by minstrelmike ( 1602771 ) on Sunday August 04, 2013 @11:34AM (#44470401)
What if I just change the CSS so visited and unvisited links are identical?
Would JS then redraw anything at all?
    • by Noishe ( 829350 )

Yep. The issue is the browser code, which essentially boils down to:
---
draw link with normal style
lookup link in visited database
if link exists in database
then draw link with visited style
---
The problem is that visited links get drawn twice, while non-visited links get drawn once. It doesn't matter if the links are styled the same or not, as the browser will still go through the motions, and take additional time in the visited case.

The browser doesn't care if the styles are both the same or not.

Yep. The issue is the browser code, which essentially boils down to:

        • draw link with normal style
        • lookup link in visited database
        • if link exists in database
        • then draw link with visited style

Wouldn't the obvious solution be to change the order: look up the link first, and if the link exists in the visited database, then draw it with the visited style, else draw it with the unvisited style? That shouldn't be any slower (since the DB has to be checked anyway), and it would eliminate the timing attacks discussed…
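(In pseudocode, the reordering proposed above; the names are illustrative:)

---
// One lookup, one draw: visited and unvisited links now cost the same,
// so there is no second repaint for a timing probe to observe.
function drawLink(link) {
  var style = visitedDb.contains(link.href) ? visitedStyle : normalStyle;
  draw(link, style);
}
---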

• Yes. When they say that there's "no easy way to fix" something, they mean it would require that the coders writing web browsers not be fucking idiots. However, history has proved this impossible.
• If you do things in a secure manner (e.g. the OWASP top 10), this should not be a big deal. Turning off JavaScript IMHO is awfully extreme, and few would do it anyway. Obviously iframes should be used judiciously, because they open you to potential cross-site scripting and other undesirable things on your site. Awareness that linking to public libraries is intentional cross-site scripting is critical too. Content and controls pre-populated from user-supplied text must be filtered, and fields like passwords…
  • I'll solve this for half.
  • This is great news! (Score:4, Interesting)

    by StripedCow ( 776465 ) on Sunday August 04, 2013 @12:24PM (#44470673)

    The attack works on all of the major browsers and researchers say there's no simple fix to prevent it.

    This may mean that the web will finally be properly redesigned from scratch, using modern insights!
    It's about time!

    I, for one, am looking forward to running webpages in near-native-speed virtual-machine sandboxes!

  • The Slashdot Web site makes extensive use of JavaScript. If the article is accurate, does that mean Slashdot will abandon such use?

  • by Connie_Lingus ( 317691 ) on Sunday August 04, 2013 @12:51PM (#44470795) Homepage

    you know...the locks that (supposedly) protect you and your loved ones and valuables can be easily picked by people with just a tad bit of training and practice...

    terrorists will strike again and kill lots of people but the odds are beyond tiny it will be you or anyone you know...

    the internet is loaded with potential threats and *maybe* someone will actually build a real site that does everything the article says it can...

i guess im just sick of kneejerk "omfg something is possible so lets all freak out and throw away our freedoms and turn off our browsers and blah blah blah". we live in a world where yes, you just might die in your bed when a giant sinkhole opens up underneath you, and you know what?? that's ok... what's better, that we build a giant police state that gives the illusion of security?

    oh yeah...the u.s. IS doing that...never mind.

  • by Joining Yet Again ( 2992179 ) on Sunday August 04, 2013 @01:18PM (#44470905)

    This sort of timing attack was discussed three years ago on the Mozilla blog. [mozilla.org]

    Could someone elaborate on exactly what hasn't been fixed for the Mozilla-based browsers? Dunno about the rest.

• I'll second (or fifth) the NoScript recommendation. Yes, NoScript can be a bit of a pain in the ass at times, but it sure trims down the amount of crap that runs. JavaScript wasn't designed with security in mind, so it'll never be secure - they can only spackle over the cracks. The best you can do is minimize how much of it runs in the cesspit of the Internet. I also find that I only have to allow it for a few regular sites, so once you're past that there's not much maintenance. Most sites still work without it…

• Use a totally separate browser context for each different place you want to carry out secure, JavaScript-based web activities. Although there are other ways (they need a little coding), the simplest way to do this is to just create multiple users (on your own computer), each designated for visiting the places where you need security (one for each bank account, one for each retailer, one for access to work-related stuff, etc.). Browsers do have special features to do this kind of thing, but I have found they are not as…
