Security Businesses Apple

How to Save Mac OS X From Malware

eXchange writes "Well-known hacker Dino Dai Zovi has written an article at ZDNet discussing last week's discovery of a critical threat to Mac OS X, and the subsequent announcement of a Trojan horse exploiting it. He suggests that Snow Leopard, or Mac OS X 10.6, should integrate more robust means of preventing malware attacks. His suggestions include mandatory code-signing for kernel extensions (so only certified kernel extensions can run), sandbox policies for Safari, Mail, and third-party applications (so these applications cannot touch the rest of the system), and some lower-level changes, such as hardware-enforced Non-eXecutable memory and address space layout randomization."
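Two of those proposals are easy to make concrete. Hardware-enforced NX means the CPU refuses to execute instructions fetched from memory pages not explicitly marked executable. A minimal C sketch of the idea (an illustration only, assuming POSIX mmap/mprotect and an x86 CPU; the one-byte "ret" opcode is the smallest possible "program"):

    /* Illustration of hardware-enforced NX (an assumption-laden sketch,
     * not Apple's implementation): bytes in a writable, non-executable
     * page cannot be run; the same bytes remapped PROT_EXEC can. On an
     * NX-enforcing system the commented-out call dies with a protection
     * fault, which is exactly what stops injected shellcode. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    static const unsigned char code[] = { 0xc3 };   /* x86 "ret" */

    int main(void) {
        size_t sz = 4096;
        unsigned char *page = mmap(NULL, sz, PROT_READ | PROT_WRITE,
                                   MAP_ANON | MAP_PRIVATE, -1, 0);
        if (page == MAP_FAILED) return 1;
        memcpy(page, code, sizeof code);

        /* ((void (*)(void))page)(); */   /* uncomment: NX kills the process */

        /* Remap the same bytes as executable; now the call is legal. */
        if (mprotect(page, sz, PROT_READ | PROT_EXEC) != 0) return 1;
        ((void (*)(void))page)();
        puts("ran code from a PROT_EXEC page");
        return 0;
    }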
  • by rsmith-mac ( 639075 ) on Tuesday June 24, 2008 @11:43AM (#23918063)

    Make Mac OS X like Windows Vista (64-bit Vista has almost all of the things listed in his article).

    If it does get implemented, it'll be interesting to see how Jobs talks it up since Apple wouldn't have been first.

    • by mingot ( 665080 ) on Tuesday June 24, 2008 @11:45AM (#23918109)
      Won't matter. Most malware is installed by the user while installing the latest screensavers, emoticon packs, and browser toolbars. Nothing will ever be able to defeat the uneducated user.
      • Re: (Score:3, Insightful)

        by timster ( 32400 )

        Indeed -- leave it to OS hackers to dream up a worthless technological solution to a UI problem. If the interface were designed to give users the faintest notion of what was happening on their computers, we would see progress. Instead we give people interfaces that pretend to simplify complexity while really just glossing over important details, and then we whine about users being uneducated about the details we've glossed over.

        • by virgil_disgr4ce ( 909068 ) on Tuesday June 24, 2008 @01:08PM (#23920173) Homepage
          It's not the interface's problem, it's the fact that 98% of computer users do not want to and will not learn anything about their computer. Some people will actively refuse to learn anything. So in light of that, the root of the problem is far, far deeper :(
          • by Goeland86 ( 741690 ) <`goeland86' `at' `gmail.com'> on Tuesday June 24, 2008 @01:20PM (#23920429) Homepage

            It's not the interface's problem, it's the fact that 98% of computer users do not want to and will not learn anything about their computer. Some people will actively refuse to learn anything. So in light of that, the root of the problem is far, far deeper :(

            Well then the solution's simple. Give people a license to use a computer. A computer is infinitely more complex than a car, yet you need a driver's license for a car. Failing that, if a user decides NOT to get their "computing license," well, they deserve to be infected by spyware, regardless of OS, browser, etc.

            Making products idiot-proof shouldn't even be a goal. If you want everything to be idiot-proof, you're ensuring that evolution stops. Even the most hard-line Christian can't deny the fact that some people are morons, dangerous or otherwise incapable of contributing to society.

            Hence why we need to keep Darwinism alive in some form or another. Unfortunately the US has too many lawyers who let idiots sue companies into making products idiot-proof, instead of letting idiots manage their population the only way they know how: let the idiots be idiots and see which ones pull through. They're either very lucky, or not that idiotic, if they manage to not kill themselves.

            • Bad car analogy (Score:4, Insightful)

              by DrYak ( 748999 ) on Tuesday June 24, 2008 @02:04PM (#23921427) Homepage

              Give people a license to use a computer. A computer is infinitely more complex than a car, yet you need a driver's license for a car.
              Except that someone trying to drive a car without having learned first will very probably cause an accident, one that could leave several people dead, including both the driver and innocent bystanders.

              A car with an uneducated driver is a potentially very powerful weapon.

              A computer used by an uneducated user... well, at worst he'll wreck his own computer, piss off some innocent web users with the spam his zombified PC spits out, and maybe even have some money stolen if too much personal data is spied on.
              But unless the random guy is operating a computer controlling a nuclear core (and those operators already *are* selected and trained to be good at their job), it's very unlikely that the screw-up will result in deaths.

              That's why you won't see a computer license any time soon: the perceived risk (nobody will die in the end) is much lower than the perceived advantage (internet usage has become pervasive; it's so important and useful that anyone *must* have access to it).

              The only thing you could remotely imagine is a tiered approach to internet security:
              the global net is accessible to anyone, but only common services are found on it. Special services are connected to a different network, which is more secure and more reliable but requires special clearance.

              Think in terms of "Internet freely available for all, Internet2 & GEANT only for hospitals, nuclear reactors and those who pass some license".

              But you can't just shut people off the internet, because our society relies on it, and anyway, nobody will die.

            • by UnknownSoldier ( 67820 ) on Tuesday June 24, 2008 @06:32PM (#23925577)

              > Well then the solution's simple. Give people a license to use a computer.

              Riiiiiight, just like a driver's license prevents traffic accidents, a gun license prevents shootings....

              A license is not an indicator of any safety, wisdom, or experience.

              You can't regulate stupidity or intelligence.

            • by Macgrrl ( 762836 ) on Tuesday June 24, 2008 @07:59PM (#23926609)

              Well then the solution's simple. Give people a license to use a computer. A computer is infinitely more complex than a car, yet you need a driver's license for a car.

              It'll happen sometime after they make it compulsory to have a license to have children -- which, let's face it, are several times more complex than either a car or a computer.

          • Re: (Score:3, Interesting)

            It's not the interface's problem, it's the fact that 98% of computer users do not want to and will not learn anything about their computer.

            Bullshit. How hard is it to create an interface that can easily and consistently show executables and data differently? Seriously, add a red ring around all executables, or something more subtle, just something that isn't duplicated by the icons for data. That would solve a myriad of security problems, and I don't think it would be too onerous for users to learn. But instead we expect them to interpret hundreds of three-letter codes indicating file types, codes which are sometimes visible and sometimes hidd
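            For what it's worth, the hook for such an overlay already exists on Unix-like systems: executability is a mode bit on the file, not part of the name. A rough sketch of the check a file manager could key the proposed ring on (plain C, POSIX stat; the function name is made up for illustration):

            /* Sketch: would this path get the hypothetical "red ring"?
             * Keys on the executable mode bits, the distinction a Unix
             * file manager could render as an icon overlay. */
            #include <stdio.h>
            #include <sys/stat.h>

            static int needs_red_ring(const char *path) {
                struct stat sb;
                if (stat(path, &sb) != 0)
                    return -1;                 /* unreadable: no verdict */
                if (!S_ISREG(sb.st_mode))
                    return 0;                  /* directories etc. aren't programs */
                return (sb.st_mode & (S_IXUSR | S_IXGRP | S_IXOTH)) != 0;
            }

            int main(int argc, char **argv) {
                for (int i = 1; i < argc; i++)   /* errors fall through to "data" */
                    printf("%s: %s\n", argv[i],
                           needs_red_ring(argv[i]) == 1 ? "executable (ring it)" : "data");
                return 0;
            }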

            • by virgil_disgr4ce ( 909068 ) on Tuesday June 24, 2008 @02:26PM (#23921853) Homepage
              Whoa there, tiger. You seem to be missing the point of my post: that most users don't know what an "executable" or "data file" is in the first place, and will likely not use the computer often enough to learn by exposure.

              And I never said that there aren't bad interfaces. I personally think Windows has one of the worst, for the very reasons you describe.

              It's still incredibly important that interfaces are designed logically and efficiently! But any interface nonetheless requires some degree of learning--"intuition" in interfaces is only, in fact, "familiarity."
              • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday June 24, 2008 @03:09PM (#23922633)

                Whoa there, tiger. You seem to be missing the point of my post: that most users don't know what an "executable" or "data file" is in the first place, and will likely not use the computer often enough to learn by exposure.

                How would they know if the user interface makes no distinction? You have to fix the UI first, to reduce the level of education needed to something reasonable. Seriously, most users want to run programs they don't completely trust, and their inability to do so safely is one of the primary causes of insecurity. Current OS's make this incredibly common task very, very onerous. Really the easiest way to do that these days is to buy a VM, install it, configure it appropriately for the program you want to run, create a new image, install an OS, install the program within the OS, and finally run it. That takes money and significant skill and time and is simply too onerous for the normal user.

                But any interface nonetheless requires some degree of learning--"intuition" in interfaces is only, in fact, "familiarity."

                You can call it whatever you want, but different interfaces and the functionality they connect to make a huge difference in how much education, skill, time, and money it takes to compute securely. Until OS's catch up, people constantly calling for education and blaming users are part of the problem, more than the solution, IMHO.

                • How would they know if the user interface makes no distinction?

                  A visual distinction imparts little to no knowledge without context. If the end user doesn't understand what an executable is, the user interface making a distinction becomes meaningless. "There's a red ring around these pictures that launch my word processor, image editor, and web browser. Those all do different things. How are they similar? What gives?" is a more likely reaction.

                • by Sancho ( 17056 ) *

                  Really the easiest way to do that these days is to buy a VM, install it, configure it appropriately for the program you want to run, create a new image, install an OS, install the program within the OS, and finally run it. That takes money and significant skill and time and is simply too onerous for the normal user.

                  Only you shouldn't have to do this. The OS should protect programs from each other, and unless the program needs it, it should not ever be able to see outside of its own memory space. Ideally, it shouldn't be able to see outside its own area for storing files and temporary data. This alone would go a long way towards preventing data leaks from malware.

                  Of course, the user needs to know not to allow the program to elevate privileges. That's where the onerous tasks you mention come in. Make the user type

            • by Chelloveck ( 14643 ) on Tuesday June 24, 2008 @04:48PM (#23924175)

              Seriously, add a red ring around all executables, or something more subtle, just something that isn't duplicated by the icons for data.

              Sure... But only if you can first give me unambiguous definitions of "executable" and "data". Into which category does a Word document fall? How about an HTML file? An arbitrary file without a filename extension?

              Simplistic "solutions" like this have gotten us where we are now. A warning is popped up whenever the user tries to do anything useful with the computer. "Oooh, that file might be dangerous, do you really want to open it?" Give the user a half dozen of those a day and you've trained him to just blindly click "Yes, dammit!" to the security dialogs.

              And that doesn't even begin to address the bigger issue, which is that users are easily tricked into running programs that they shouldn't. "Wow! Some random person just emailed me a picture of Natalie Portman naked in hot grits! Let me just double-click that self-extracting ZIP..." Or, more subtly, "Wow, that Comet Cursor looks really cool. Let me just click 'yes' to all these security warnings, because I really do want to install and run it."

              • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday June 24, 2008 @06:25PM (#23925501)

                Sure... But only if you can first give me unambiguous definitions of "executable" and "data".

                For the most part, that distinction is clear. A few programs blur the lines, but we should probably be asking whether that is a useful thing to do or just a security mess born of lousy design.

                Into which category does a Word document fall? How about an HTML file?

                With properly coded applications, both should be data, if stored locally anyway. When accessed via a browser, we should establish a convention. I see no reason for Word or HTML files to do anything outside the sandbox of the program opening them.

                An arbitrary file without a filename extension?

                That's easy, data... until you change the file to be executable and assign it a proper extension.

                Simplistic "solutions" like this have gotten us where we are now. A warning is popped up whenever the user tries to do anything useful with the computer. "Oooh, that file might be dangerous, do you really want to open it?"

                That's not a simplistic solution, nor even a solution in most cases. It is just a way for the manufacturer to transfer blame for security failures. Most don't even seem to be intended to increase overall security. That doesn't mean you can't make good security changes or simplify things in ways that make things easier for users. Seriously, what we have now is not working.

                And that doesn't even begin to address the bigger issue, which is that users are easily tricked into running programs that they shouldn't.

                This is, in my opinion, a misstatement of the problem. The problem is not that users run programs that they shouldn't. It is that users want to run programs they don't trust, but without significant risk. They can do it today using VMs, but surely OS manufacturers should be able to come up with a more convenient method of letting people run potentially dangerous software in a safe way. The main problem now is that users have to take a gamble: I want to play this game, if it is a game, so I'll guess it isn't malware and give it a try. The OS should be telling them it is malware or, if it is unknown, telling them what it is trying to do before it does it. You'd think this incredibly common use case would be a priority by now, but for the most part only Windows has a big trojan problem, and they also have a monopoly, so why should they care?

                • by Anpheus ( 908711 )

                  What about Excel files? The whole point of an Excel file is to manipulate data, sometimes pulling it from a database or a file.

                  I bring that up because this is how a minority of people use Word documents as well.

            • What is executable and what is data?

              What about a word processing document that supports a macro language, or a bundle of HTML+JavaScript+resources (one of those buzzword-compliant local web 2.0 RIA/flex/flash/AIR application thingies)? In these cases, the user is implicitly opening an executable elsewhere on the system to handle the documents, and now you're trusting that application to properly validate and sandbox those document/programs.

              OK, so you can also flag documents that contain executable data. B

          • by erroneus ( 253617 ) on Tuesday June 24, 2008 @02:26PM (#23921851) Homepage

            Having knowledge is having additional responsibility. It took me quite a while to arrive at that conclusion, but if people can claim they didn't know or don't understand something, they are therefore not responsible for it. This goes well beyond knowing about computers and into all facets of life. For me, knowledge has always been important and desirable, so it was really hard to understand why the majority of people don't want any. But I believe I've hit upon the precise essence of why people don't want to know anything... they don't want it to be their fault.

          • Someday, it's going to get to the point where you're not allowed a computer on a network unless it's maintained and certified by an admin as network-worthy, just like you're not allowed a car on the road unless it's maintained and certified by a mechanic as road-worthy. Until then, we're all doomed to endless spam, and users complaining that they should be able to maintain a complex computer themselves without any effort.

          • the root of the problem is far, far deeper :(
            And by "deeper" you mean "between the keyboard and the chair"
      • by Shivetya ( 243324 ) on Tuesday June 24, 2008 @12:19PM (#23918891) Homepage Journal

        It was always going to happen eventually. Given the increasing market share of OS X, it was only a matter of time before the hackers got interested. Yet even they had to wait until a sufficient base of idiots got into OS X to make their job easier. I know people whose significant other has trashed the home PC more than once by opening or running attachments, even after all the pop-ups. Note the more than once.

        People forget or get in a hurry. It's the hacker's job to exploit that nature. That makes it difficult for the owners of the OS, because even if you require a password to execute something, many people will just type in the password regardless. It's like the story of the young latch-key girl who was told to never, ever let people in the house while mom was gone. Yet she did, three times, and even denied it until shown the film of these people being let in. Worse, she didn't recall it, because it was so automatic. She was distracted by something else, and that distraction let her pass over doing what was right.

        I look at it this way: on my iMac, if that password prompt comes up and I didn't initiate it from an update I know came from Apple, or from a package I downloaded, I am going to cancel the process. Yet I am quite sure my friend's SO would dutifully type the password in. Can't be helped. Sometimes people cannot accept they did something wrong, even when you show them.

      • by vertinox ( 846076 ) on Tuesday June 24, 2008 @12:29PM (#23919157)

        Nothing will ever be able to defeat the uneducated user.

        True, but you can mitigate the damage a single user can do. It's called sandboxing.

        If you prevent installed applications from doing things like putting themselves in startup, hiding themselves from the user, or starting on their own without user intervention, then you've won half the battle right there.

        OS X still allows this via admin rights, which I fear most people run with, but it's a start at least.

        Of course, a malicious one-time application can always wipe the user directory in these situations, but that is what backups are for. However, it's a lot easier to get rid of a malicious program if the OS itself won't allow applications to create startup programs or run in stealth mode.
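        On OS X, "put themselves in startup" mostly means dropping a .plist into one of launchd's well-known directories, so even a user-level audit of what starts automatically is straightforward. A rough sketch (plain C; the directory list below is the usual launchd search path, hard-coded purely for illustration):

        /* Sketch: list launchd job definitions -- the usual places an
         * OS X program registers itself to start automatically.
         * Anything here the user doesn't recognize deserves a look. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <dirent.h>

        static void list_dir(const char *dir) {
            DIR *d = opendir(dir);
            if (!d) return;                      /* directory may not exist */
            struct dirent *e;
            while ((e = readdir(d)) != NULL)
                if (e->d_name[0] != '.')
                    printf("%s/%s\n", dir, e->d_name);
            closedir(d);
        }

        int main(void) {
            const char *dirs[] = {
                "/Library/LaunchAgents", "/Library/LaunchDaemons",
                "/System/Library/LaunchAgents", "/System/Library/LaunchDaemons",
            };
            for (size_t i = 0; i < sizeof dirs / sizeof *dirs; i++)
                list_dir(dirs[i]);

            const char *home = getenv("HOME");   /* per-user agents */
            if (home) {
                char buf[1024];
                snprintf(buf, sizeof buf, "%s/Library/LaunchAgents", home);
                list_dir(buf);
            }
            return 0;
        }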

      • Re: (Score:3, Insightful)

        by vux984 ( 928602 )

        Won't matter. Most malware is installed by the user while installing the latest screensavers, emoticon packs, and browser toolbars. Nothing will ever be able to defeat the uneducated user.

        True enough for the average home user, but the corporate/enterprise/government desktop is a whole other ballpark, and in that environment stuff like sandboxes and driver signing make a lot of sense.

        Also as a 'sophisticated' user, using Vista x64, I quite like the driver signing concept.

        I think it's GREAT that some driver I

        • Screw that. Mandatory driver signing is unacceptable, as it's no longer a general purpose computer strictly under my control. The answer to your question is that NO, you can't sign your own drivers for Vista and/or distribute them to other people to use. It would be like the vendor keeping control of the root account with some super secret password, and only giving the user some crippled 'admin' account without access to the whole computer. When I bought my computer, the OS and all its files became mine, an

          • by jcgf ( 688310 )

            The answer to your question is that NO, you can't sign your own drivers for Vista and/or distribute them to other people to use.

            That should just be an "and" there, shouldn't it? You can sign your own drivers. You can also distribute drivers. You just can't do both with the same driver at the same time (not for technical reasons, though -- licensing ones).

            I don't mind having a certification process for 'safe' drivers, and then have some mechanism for booting in safe mode with only safe drivers loaded if there is a problem with one of the unapproved drivers.

            Well, they do it the other way around (you have to manually disable the signing requirement -- we did it for a CSP at work, using a hex editor and instructions from MS to edit advapi32.dll; it should be similar for drivers), which is kind of halfway to what you want.

          • by vux984 ( 928602 )

            Screw that. Mandatory driver signing is unacceptable, as it's no longer a general purpose computer strictly under my control.

            It is if you have a signing key for that computer.

            The answer to your question is that NO, you can't sign your own drivers for Vista and/or distribute them to other people to use.

            Of course you can sign your own drivers and give them to other people. You have to buy a certificate for that, but lots of companies have managed it, including some very small ones.

            The more interesting scenario

        • Re: (Score:3, Interesting)

          And with that said, IS it possible to sign your own drivers for your own Vista machine? I'd very much like to know what is involved in doing that.

          I only have an indirect answer: according to the vendors of some of the specialized hardware my clients and I use, the only way to use their hardware under Vista is for them to either get their drivers signed by Microsoft, or rewrite their firmware and DLLs to allow using generic drivers. All of them chose to do the rewrite and use the generic driver. For example, several of the devices we use utilize the FT2232 USB chip in the hardware. Originally, the vendors licensed the driver source

          • by vux984 ( 928602 )

            I only have an indirect answer: According to the vendors of some of the specialized hardware my clients and I use, the only way to use their hardware under Vista is for them to either get their drivers signed by Microsoft, or for them to rewrite their firmware and DLLs to allow using generic drivers.

            Yes, if you want to distribute hardware that 'just works', you *have* to get your drivers signed by MS.

            All of them chose to do the rewrite and use the generic driver. For example, several of the devices we use u

      • Re: (Score:2, Interesting)

        by Sentry21 ( 8183 )

        Part of that can be resolved by sandboxing. Prevent screensavers, etc. from being able to access anything on the system outside of a small, well-defined set of resources; have the author define that list, and the system enforce it. Network access? Disk access? Safari RSS feeds? Require authentication and code signing.

        Oh, and make code signing easy, so people don't have to fork out huge amounts of money to sign their code. Apple could provide a signing service, where you have to apply and go through a verifi
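        The enforcement half of that idea already has a toehold in Leopard: a process can voluntarily confine itself with sandbox_init(3). A minimal sketch, assuming Leopard's sandbox.h and its documented built-in profile names (a real screensaver host would want a custom, author-supplied profile instead):

        /* Sketch: a process confines itself before doing anything risky.
         * kSBXProfileNoWrite is one of the named profiles documented for
         * sandbox_init(3): from here on the kernel denies all filesystem
         * writes, whatever the Unix permissions would allow. */
        #include <stdio.h>
        #include <sandbox.h>

        int main(void) {
            char *err = NULL;
            if (sandbox_init(kSBXProfileNoWrite, SANDBOX_NAMED, &err) != 0) {
                fprintf(stderr, "sandbox_init: %s\n", err);
                sandbox_free_error(err);
                return 1;
            }
            FILE *f = fopen("/tmp/sandbox-demo.txt", "w");
            if (f == NULL)
                perror("fopen (denied by the sandbox, as expected)");
            else
                fclose(f);   /* would only succeed outside the sandbox */
            return 0;
        }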

      • by mingot ( 665080 ) on Tuesday June 24, 2008 @06:39PM (#23925689)
        Replying to myself here, and to all above who have proposed solutions: The same day they make an OS/Computer on which a user can't screw himself is the same day they come out with unbreakable DRM. It's the same game, really.
    • which we don't need. If we make the malware AUTHORS more like Vista 64, they won't be able to infect anything else.

      Accept or Deny?

  • by jonwil ( 467024 ) on Tuesday June 24, 2008 @11:47AM (#23918147)

    Signed kernel modules would not just stop malware; they would also stop some of the hacked (and custom-written) kernel modules being used to get OSX to run on non-Apple machines (or being used to make the experience of using OSX on those machines better).

    • Re: (Score:3, Insightful)

      by Hierophant7 ( 962972 )
      Please, the Mach kernel was hacked to bypass TPM; it'll be hacked to bypass driver signing.
      • The point of driver signing isn't to act as a copy protection mechanism. You can boot Vista64 in a mode that'll allow you to load any drivers. The point is to stop programs loading crap into the kernel without the user's knowledge. If you have to put the OS into some kind of very obvious "unsafe mode" then the problem becomes much less serious. Can you imagine malware popping up a dialog explaining some complicated boot sequence to the user?
    • Signed kernel modules would [...] stop some of the hacked (and custom written) kernel modules being used to get OSX to run on non apple machines (or being used to make the experience of using OSX on those machines better)

      Opinions on whether or not this is a good thing are varied.
  • deja vu? (Score:5, Insightful)

    by neongrau ( 1032968 ) on Tuesday June 24, 2008 @11:49AM (#23918175)
    Isn't that exactly the same stuff Microsoft talked about years ago, which many people on Slashdot cried "foul!" about?

    But then again, it all makes sense for Apple. The iPhone's App Store pretty much does all that. And when it works out, Apple might just start a Mac App Store. No executable program launchable if it doesn't originate from the App Store, or only in some sandboxed VM for anything considered insecure. That could even work, but is that really what users want?
    • And when it works out Apple might just start an Mac App Store. No executable program launchable if it doesn't originate from the App Store.
      Developers don't want to have to immigrate to the United States and pay an annual or per-application fee just to develop Macintosh applications. That would only serve to drive smaller developers to Ubuntu.
      • Apple's OS becomes the paragon of security people think it is and Linux gets more devs. Everybody's happy.

    • Code signing (Score:3, Insightful)

      by Sloppy ( 14984 )

      Isn't that exactly the same stuff Microsoft talked about years ago, which many people on Slashdot cried "foul!" about?

      Where Microsoft went wrong with code signing is that they insist the code be signed by them, because the user or administrator is an enemy (i.e. might install a video driver that doesn't respect DRM).

      Code signing is harmless if the machine's administrator is the ultimate authority.

      The issue is: whose interests should the OS serve: the OS maker, the user, or (in the case of malware) anyone w

      • by Sloppy ( 14984 )

        Where Microsoft went wrong with code signing is that they insist the code be signed by them, because the user or administrator is an enemy (i.e. might install a video driver that doesn't respect DRM).
        Oh, and judging by the iPhone, Apple's attitude is identical, so if they implement code signing for MacOS, I expect them to make the same mistake.
        • Microsoft has never attempted to require code signing for drivers. Users have always been able to override that.

          They tried to require it for an easy, warning-free install, but unfortunately a lot of manufacturers attempted to game the system (i.e., hide the warnings in some way or instruct the user to ignore them) -- unsurprisingly, these very same vendors were the ones writing buggy, crash-prone crap.

          Given that most users are their own administrators at home, I don't know who exactly you think should be sign

      • by Anonymous Coward

        Microsoft does not require that the code be signed by them. They simply require that the code be signed, by any certificate issued by a signing authority.
        All the code we develop for Windows is signed by us, and installs perfectly fine on Vista, and Microsoft has never seen a single line of our code.

      • The issue is: whose interests should the OS serve: the OS maker, the user, or (in the case of malware) anyone who manages to get their code onto the machine?

        From the point of view of the OS maker's lawyers: the OS maker.

        IANAL, but as I understand the argument, in order to protect the user's interest, the OS must protect itself from the user, just in case he does something stupid, like authorize the installation of a malicious driver. Otherwise, said user might sue the OS maker, claiming "You put code signing in the OS to protect me from malicious code, so why did it not protect me?"

        • The issue is: whose interests should the OS serve: the OS maker, the user, or (in the case of malware) anyone who manages to get their code onto the machine?

          From the point of view of the OS maker's lawyers: the OS maker.

          IANAL, but as I understand the argument, in order to protect the user's interest, the OS must protect itself from the user, just in case he does something stupid, like authorize the installation of a malicious driver. Otherwise, said user might sue the OS maker, claiming "You put code signing in the OS to protect me from malicious code, so why did it not protect me?"

          Actually the thinking is along very different lines. Think about what it takes to know if random-installer.exe you just downloaded is malware, or not:

          • Is the software from the vendor who claimed to have written it?
          • Can we track this vendor down and sue them, if the software turns out to be malware?
          • If we're downloading s/w from a different site (BitTorrent or download.com), how do we know it was not tampered with before we downloaded it?

          Digital signatures are intended to address these problems:
          - The ha

      • Re:Code signing (Score:4, Insightful)

        by dhavleak ( 912889 ) on Tuesday June 24, 2008 @06:08PM (#23925281)

        Isn't that exactly the same stuff Microsoft talked about years ago, which many people on Slashdot cried "foul!" about?
        Where Microsoft went wrong with code signing is that they insist the code be signed by them, because the user or administrator is an enemy (i.e. might install a video driver that doesn't respect DRM).

        Here's the list of Windows' trusted Root CAs: http://msdn.microsoft.com/en-us/library/ms995347.aspx [microsoft.com]. Only third-parties are on that list -- not Microsoft.

         

        Code signing is harmless if the machine's administrator is the ultimate authority.
        Take a look at CertMgr.exe (specifically, play around with the 'import' function). The administrator is the ultimate authority, and this is the case in XP/2003/Vista/2008.

         

        The issue is: whose interests should the OS serve: the OS maker, the user, or (in the case of malware) anyone who manages to get their code onto the machine? If the OS designer answers that question correctly, then there's no problem with code signing (or other whitelisting approaches).
        I agree. I think you have to admit that MS has addressed these concerns.

         

        Naturally, the author of TFA got it wrong:

        Most kernel extensions are from Apple anyway and for the few common 3rd party ones, they should be required to get a code signing certificate.
        Required by whom? A certificate from whom? And the amount of trust delegated to this CA is what?
        I'd say the author got it right. Your concern is valid, but it's orthogonal to the point of TFA. Code signing is a Good Thing and Apple might implement it -- that's the point of TFA. The third-party approach is the correct way to do it -- that's your point.

        What's sad is the number of people on /. that crucify MS without realizing that their implementation has already addressed all the things they are complaining about (and has done so from day 1).

    • by Ilgaz ( 86384 )

      http://developer.apple.com/releasenotes/Security/RN-CodeSigning/ [apple.com]

      It has nothing to do with the iPhone App Store or Microsoft. It is YOU, the developer, who signs the application, freely.

      I can't blame you for the misunderstanding; thank the iPhone model for that.
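      The verification side of that model is exposed as a plain C API in the Security framework (Code Signing Services). A rough sketch, assuming the SecStaticCode calls are available (the path checked is just an example):

      /* Sketch: ask the OS whether an application's code signature is
       * intact -- the same seal check the system applies before
       * re-granting keychain access to an updated binary.
       * Build: cc verify.c -framework Security -framework CoreFoundation */
      #include <stdio.h>
      #include <CoreFoundation/CoreFoundation.h>
      #include <Security/Security.h>

      int main(void) {
          CFURLRef url = CFURLCreateWithFileSystemPath(
              kCFAllocatorDefault, CFSTR("/Applications/Safari.app"),
              kCFURLPOSIXPathStyle, true);

          SecStaticCodeRef code = NULL;
          if (SecStaticCodeCreateWithPath(url, kSecCSDefaultFlags, &code)
                  != errSecSuccess) {
              fprintf(stderr, "not a recognizable code bundle\n");
              CFRelease(url);
              return 1;
          }

          /* NULL requirement: just verify the seal. A real caller could
           * pass a SecRequirementRef naming the expected signer. */
          OSStatus status = SecStaticCodeCheckValidity(code, kSecCSDefaultFlags, NULL);
          printf("signature: %s\n",
                 status == errSecSuccess ? "valid" : "missing or tampered");

          CFRelease(code);
          CFRelease(url);
          return status == errSecSuccess ? 0 : 1;
      }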

  • Impossible (Score:2, Insightful)

    by katch22 ( 1248646 )
    This doesn't make sense--I always thought Macs were impervious to the simple things that "plague" my Windows PC.
  • Why would a sandbox for Mail, Safari, etc. be necessary if the user isn't running these applications with root privileges?
    • Re: (Score:3, Informative)

      by rsmith-mac ( 639075 )

      Because running as the user is basically just as good. The user doesn't care what a piece of malware has infected or destroyed, only that it has done so.

      • Re: (Score:3, Insightful)

        by cowscows ( 103644 )

        Also, to me as a user, the single most important thing on my computer would be all my documents, which are accessible from my account. Sure, it's not great for my machine to be turned into a spam zombie or whatever, but reinstalling my OS isn't the worst thing in the world. It'd take me a couple hours at most. But recreating all the documents/photos/movies that I've got saved under my account would take much longer, and in many cases be impossible.

        I know that's what backups are for, and I've got backups of

        • Re: (Score:3, Insightful)

          Also, to me as a user, the single most important thing on my computer would be all my documents, which are accessible from my account.

          Unfortunately, having a sandbox protect these documents will greatly limit the usefulness of applications running in it.

          Of course, a web browser or chat client would be least limited. But if you had something legitimate to upload/send, then you are looking at poking holes in the sandbox. With email, even if you never send an attachment or save a received one, it gets complex, because all those messages - and the address book - are valuable to the user. If you keep them in the sandbox, the

          • I fully agree. I'm not suggesting sandboxing my browser out of my files, that'd be far too limiting. I was only attempting to explain to the grandparent post why malware doesn't need root access to be problematic. I guess that wasn't really clear from my original comment.

  • It's a local-only root privilege escalation exploit.

    If you're in a position to exploit this, you're already running code with full local user privileges.

    Once the system is penetrated, it's game over. You don't need to get root access, or Administrator access, or even break out of the "Reduced Security" sandbox to win basically everything that the guy writing the malware actually needs. Multiuser security is there to protect users from each other, not from themselves.

    Recent studies of anti-lock brakes and safety have found that ABS doesn't improve safety overall. It improves braking, letting people brake harder and more smoothly, but drivers get used to it: enough end up depending on ABS that they just brake later, and when they need the extra edge from ABS they've already used it up.

    Before going off half-cocked proposing more layers of complex software that has to work correctly to maintain system integrity (because if it's there, enough software developers will end up depending on it), how about looking at which features of systems promote malware distribution? Design applications so they are inherently safe, rather than filling them with holes and backfilling with kernel patches and warning dialogs.

  • by owsla ( 78381 ) on Tuesday June 24, 2008 @12:02PM (#23918493) Homepage

    Apple already does address space layout randomization in Leopard (Mac OS X 10.5).

    See "Library Randomization" on
    http://www.apple.com/macosx/features/300.html#security [apple.com]

    Notice that the new security features list also includes code signing and sandboxing. The technology is there; it's just not set up throughout the system.
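    Randomization is easy to observe directly: print where the loader put things, then run the binary twice. A minimal sketch (plain C, nothing OS X specific); with randomization on, the addresses change between runs, which is exactly what deprives an exploit of a stable jump target:

    /* Sketch: print the load addresses of a few regions. Under address
     * space / library randomization these differ from run to run; on a
     * non-randomized system they are stable, which is what makes
     * return-to-libc style attacks easy to aim. */
    #include <stdio.h>
    #include <stdlib.h>

    int global_var;                      /* static data segment */

    int main(void) {
        int local_var;                   /* stack */
        void *heap = malloc(16);         /* heap */

        printf("libc printf : %p\n", (void *)&printf);
        printf("static data : %p\n", (void *)&global_var);
        printf("stack       : %p\n", (void *)&local_var);
        printf("heap        : %p\n", heap);

        free(heap);
        return 0;
    }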

    • by argent ( 18001 )

      Address space randomization and no-execute are useful tools.

      Code signing and sandboxing are nothing more than speedbumps, like the stupid security dialogs in Windows that are leaking into OS X.

      The places to strengthen are the front lines, because once the attacker's gotten into a place where he can modify applications or attack an OS sandbox he's already running local code and he's already gotten virtually everything he needs to **** you.

      • Re: (Score:3, Interesting)

        by maxume ( 22995 )

        UAC is as much about putting social pressure on application vendors to write applications that take advantage of the multi-user security as it is about backwards compatibility. It is more about both of those than it is about actual security.

        • by argent ( 18001 )

          UAC is as much about putting social pressure on application vendors to write applications that take advantage of the multi-user security as it is about backwards compatibility.

          I'm not talking about UAC. I'm talking about all the stupid security dialogs that Microsoft has added to Windows over the years. I could have made this comment any time in the last decade... in fact I have. Many times.

          UAC is nothing more than the latest player to tread the boards of Microsoft's Security Theater.

      • by Ilgaz ( 86384 ) on Tuesday June 24, 2008 @05:59PM (#23925189) Homepage

        On OS X, sandboxing is different. Please read a couple of pages from the Apple mailing lists before comparing it to its bad photocopy. OS X doesn't have a problem with applications running under a normal user account, so there is no community that has to be educated with a stick (as MS does).

        Safari.app will be able to say "Here are my directories and the system calls I will make". So Safari won't even see a Framework or System folder. Way more detail at http://www.318.com/techjournal/?p=107 [318.com]

        On OS X Leopard, a couple of deep-level technologies already use sandboxing (Spotlight and Bonjour), and Apple is preparing it for general developer use.

        OS X "stupid security" dialogue works well, so damn well that it is able to figure out Adobe AIR Applications user installed over the web. The "stupid dialogue" could be a life saver in future. I am not speaking about the Windows horrible copy.

        Code signing is not like the Verisign pyramid scheme on Windows; ANY developer can sign their application for free. People actually adopt it, including open source applications like Adium X. There is no "Apple certified" or "Verisign Secure" junk; it is application signing meant to benefit the user and the developer. By signing it, you just make sure your files aren't tampered with after the user trusts it, so no lamers take advantage of your application (and your users' trust). There are no other advantages; OS X treats your application just like unsigned applications. It is not the signing in Microsoft Windows. If the user updates an unsigned application, the OS will prompt to ask whether he/she wants to grant access again, since there is no way to make sure it is the same binary from the very same developer the user trusted in the first place. If the user updates a developer-signed binary in a normal way and the signature is the same, it doesn't prompt.

        Read this for more info:
        http://adiumx.com/blog/2008/04/adium-application-security-and-your-keychain/ [adiumx.com]

        • I've been using UNIX for 30 years, I've worked on safety-critical software and in the control systems industry for 20 years, and I was solely responsible for network security for over a decade of that. I'm pretty familiar with this stuff.

          On OS X, sandboxing is different. Please read a couple of pages from the Apple mailing lists before comparing it to its bad photocopy.

          The problem is that it is not in principle possible to build a sandbox around an application like Safari that would both permit it to do the useful things it is supposed to do and prevent it from doing malicious things.

          * If Safari can make connections to websites, then Safari can make connections to botnet peers and engage in attacks on websites.

          * If Safari can send mail, it can send spam.

          * If Safari can read my keychain, it can read my website passwords and pass them to an attacker.

          * If Safari can open my bank's web page, it can transfer money out of my account.

          * If Safari can upload files, it can upload them places I don't want it to access.

          * If Safari can download files, it can "download" garbage over the files I value.

          * If Safari can do the things I need Safari to do, a compromised Safari can do the things I don't want it to do.

          A sandbox cannot protect the things on my computer that I care about from the applications that manipulate them. The only secure sandbox is one that does not allow the application to access any non-volatile resources on my computer, except those that are strictly restricted to the sandbox and not used by any other application. Oh, and it can't make network connections, except under very specific conditions... for example, the Java sandbox lets the application connect back to the originating site.

          THAT is a security sandbox.

          I don't think I would be happy running Safari or Mail under something like that.

          OS X "stupid security" dialogue works well, so damn well that it is able to figure out Adobe AIR Applications user installed over the web.

          But you want to run them, don't you, so you go ahead and approve them, and you are trained to approve these dialogs. I've watched that scenario play out time and time again, with the same people coming back to me saying "I clicked the wrong button again, I think I've got a virus".

          By signing it, you just make sure your files aren't tampered with after the user trusts it, so no lamers take advantage of your application (and your users' trust).

          I was building the tripwire configuration for my Cheswick-Bellovin bastion firewall back when Steve Jobs was still at NeXT. I know about the capabilities, restrictions, limitations, and drawbacks of far more pervasive and complete file security mechanisms than what Apple has implemented. Particularly the drawbacks...

          If an attacker is in a position to modify my applications, then there is nothing OS X can do to stop him; he has already got the keys to the kingdom. He already has remote root access, however achieved, and he's not going to hide a trojan horse inside Mail.app; he's going to hide it in /private/etc/somethingobscure, running as root, and use Mach injection to patch Mail.app on the fly.

          As for your linked story: "If you mess with the Adium binary in any way, you will invalidate the signature, and access to secure resources -- specifically keychain items where your passwords are stored -- will be disallowed by Mac OS X."

          That's a hell of a drawback. That by itself is enough to make me hold off installing Leopard until I've got time to look up how to disable that paranoid security theatre.

    • The technology is there, it's just not setup throughout the system.

      Is having a security tool and not using it system-wide any different from not having it at all?
      • The technology is there, it's just not setup throughout the system.
        Is having a security tool and not using it system-wide any different from not having it at all?

        Yes. You can use it for high-risk applications.

  • by psydeshow ( 154300 ) on Tuesday June 24, 2008 @02:18PM (#23921719) Homepage

    I don't care what kind of malware it might be, you can pry the CoolBook Controller extension from my cold dead hands!

    Third-party extensions by dodgy developers are often required to extend the lame control panels that Cupertino sees fit to bless us with. I shudder every time I install an update to smcFanController or CoolBook, but if I don't want my laptop running at 170°F, what other choice do I have?

    Signing isn't going to make the problem go away. I won't trust these random developers just because they have a certificate. If Apple engineers had time to certify the code itself, they would have time to fix the problems in OSX and firmware that require the use of third-party extensions in the first place.

  • ba dump ump (Score:4, Funny)

    by fred fleenblat ( 463628 ) on Tuesday June 24, 2008 @02:19PM (#23921735) Homepage

    No word yet on MacOS 10.8 Cougar, to be designed with the "active" older woman in mind.

  • I think the "Repair Permissions" thing should be extended to check/repair/normalise user home directory permissions too.

    I am speaking of making the "Reset home directory permissions" functionality from the Leopard DVD boot an option in Disk Utility. Also, Disk Utility should alert users about SUID files, whether they have a BOM entry or not, and label them clearly without creating panic. They say "These messages are true but not cause for concern." No, it is a very big concern. An unexpected SUID file on Unix is always a
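    That audit can be done mechanically; a rough sketch of the scan being asked for (plain C, POSIX nftw), flagging every set-uid or set-gid file under a tree:

    /* Sketch: walk a tree and report every set-uid/set-gid file -- the
     * "unexpected SUID file" check the comment above wants Disk Utility
     * to surface clearly instead of waving away. */
    #define _XOPEN_SOURCE 500
    #include <stdio.h>
    #include <sys/stat.h>
    #include <ftw.h>

    static int check(const char *path, const struct stat *sb,
                     int typeflag, struct FTW *ftwbuf) {
        (void)ftwbuf;
        if (typeflag == FTW_F && (sb->st_mode & (S_ISUID | S_ISGID)))
            printf("%s: %s%s\n", path,
                   (sb->st_mode & S_ISUID) ? "set-uid " : "",
                   (sb->st_mode & S_ISGID) ? "set-gid" : "");
        return 0;                        /* keep walking */
    }

    int main(int argc, char **argv) {
        const char *root = (argc > 1) ? argv[1] : "/usr";
        if (nftw(root, check, 32, FTW_PHYS) != 0) {   /* don't follow symlinks */
            perror("nftw");
            return 1;
        }
        return 0;
    }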

  • "Mandatory code signing for any kernel extensions. I dont want to have to worry about kernel rootkits, hyperjacking, or malware infecting existing kernel drivers on disk. Most kernel extensions are from Apple anyway and for the few common 3rd party ones, they should be required to get a code signing certificate."

    It is possible to run old kernel extensions on OS X, and many people benefit from it. Kernel extensions have "minimum" and "maximum" version values. You can't expect every company to release Leopard sign
