
How to Save Mac OS X From Malware

eXchange writes "Well-known hacker Dino Dai Zovi has written an article at ZDNet discussing last week's discovery of a critical threat to Mac OS X, and another announcement of a Trojan horse exploiting this discovery. He suggests that Snow Leopard, or Mac OS X 10.6, should integrate more robust means of preventing malware attacks. Some of the suggestions he has include mandatory code-signing for kernel extensions (so only certified kernel extensions can run), sandbox policies for Safari, Mail, and third-party applications (so these applications cannot do anything to the system), and some lower-level changes, such as hardware-enforced Non-eXecutable memory and address space layout randomization."
  • by rsmith-mac ( 639075 ) on Tuesday June 24, 2008 @11:43AM (#23918063)

    Make Mac OS X like Windows Vista (64bit Vista has almost all of the things listed in his article).

    If it does get implemented, it'll be interesting to see how Jobs talks it up since Apple wouldn't have been first.

  • by mingot ( 665080 ) on Tuesday June 24, 2008 @11:45AM (#23918109)
    Won't matter. Most malware is installed by the user along with the latest screensavers, emoticon packs, and browser toolbars. Nothing will ever be able to defeat the uneducated user.
  • deja vu? (Score:5, Insightful)

    by neongrau ( 1032968 ) on Tuesday June 24, 2008 @11:49AM (#23918175)
    Isn't that exactly the same stuff Microsoft talked about years ago, that many people on Slashdot cried "foul!" about?

    But then again it all makes sense for Apple. The iPhone's App Store pretty much does all that. And when it works out, Apple might just start a Mac App Store. No executable program would be launchable unless it originated from the App Store, or anything considered insecure would run only in a sandboxed VM. That could even work, but is that really what users want?
  • by Hierophant7 ( 962972 ) on Tuesday June 24, 2008 @11:51AM (#23918241)
    please, the mach kernel was hacked to bypass TPM, it'll be hacked to bypass driver-signing.
  • Impossible (Score:2, Insightful)

    by katch22 ( 1248646 ) on Tuesday June 24, 2008 @11:53AM (#23918291)
    This doesn't make sense--I always thought Macs were impervious to the simple things that "plague" my Windows PC.
  • It's a local-only root privilege escalation exploit.

    If you're in a position to exploit this, you're already running code with full local user privileges.

    Once the system is penetrated, it's game over. You don't need to get root access, or Administrator access, or even break out of the "Reduced Security" sandbox to win basically everything that the guy writing the malware actually needs. Multiuser security is there to protect users from each other, not from themselves.

    Recent studies of anti-lock brakes and safety have found that ABS doesn't improve safety in general. It improves braking, by letting people brake faster and more smoothly, but people get used to it: enough drivers end up depending on ABS that they just brake later, and when they need the extra edge from ABS they've already used it up.

    Before going off half-cocked proposing more layers of complex software that has to work correctly to maintain system integrity (because if it's there, enough software developers will end up depending on it), how about looking at which features of systems promote malware distribution? Design applications so they are inherently safe, rather than filling them with holes and backfilling with kernel patches and warning dialogs.

  • by timster ( 32400 ) on Tuesday June 24, 2008 @12:03PM (#23918509)

    Indeed -- leave it to OS hackers to dream up a worthless technological solution to a UI problem. If the interface was designed to give users the faintest notion of what was happening on their computers, we would see progress. Instead we give people interfaces that pretend to simplify complexity while really just glossing over important details, and then we whine about users being uneducated about the details that we've glossed over.

  • by vertinox ( 846076 ) on Tuesday June 24, 2008 @12:29PM (#23919157)

    Nothing will ever be able to defeat the uneducated user.

    True, but you can mitigate the damage a single user can do. It's called sandboxing.

    If you prevent applications from putting themselves in startup, hiding themselves from the user, or starting on their own without user intervention, then you've won half the battle right there.

    OS X still allows this with admin rights, which I fear most people run with, but it's a start at least.

    Of course, a malicious one-time application can always wipe the user directory in these situations, but that is what backups are for. However, it's a lot easier to get rid of a malicious program if the OS itself won't allow applications to register themselves at startup or run in stealth mode.

  • by iminplaya ( 723125 ) on Tuesday June 24, 2008 @12:46PM (#23919617) Journal

    hardware-enforced Non-eXecutable memory?

    Unless you could turn it off, it just sounds like DRM. Why we let third-party stuff do anything to the OS is totally beyond me. Yeah, let's leave the cockpit door wide open.

  • Code signing (Score:3, Insightful)

    by Sloppy ( 14984 ) on Tuesday June 24, 2008 @01:01PM (#23919999) Homepage Journal

    Isn't that exactly the same stuff Microsoft talked about years ago, that many people on Slashdot cried "foul!" about?

    Where Microsoft went wrong with code signing is that they insist the code be signed by them, because they treat the user or administrator as an enemy (i.e. one who might install a video driver that doesn't respect DRM).

    Code signing is harmless if the machine's administrator is the ultimate authority.

    The issue is: whose interests should the OS serve: the OS maker, the user, or (in the case of malware) anyone who manages to get their code onto the machine? If the OS designer answers that question correctly, then there's no problem with code signing (or other whitelisting approaches).

    Naturally, the author of TFA got it wrong:

    Most kernel extensions are from Apple anyway and for the few common 3rd party ones, they should be required to get a code signing certificate.
    Required by whom? A certificate from whom? And the amount of trust delegated to this CA is what?
  • by vux984 ( 928602 ) on Tuesday June 24, 2008 @01:05PM (#23920087)

    Won't matter. Most malware is installed by the user along with the latest screensavers, emoticon packs, and browser toolbars. Nothing will ever be able to defeat the uneducated user.

    True enough for the average home user, but the corporate/enterprise/government desktop is a whole other ballpark, and in that environment stuff like sandboxes and driver signing make a lot of sense.

    Also as a 'sophisticated' user, using Vista x64, I quite like the driver signing concept.

    I think it's GREAT that some driver I download, or some source code for a driver I download and compile myself, or even a driver I might write myself from scratch, can't by default run on everyone's computers.

    That's a good barrier to rootkits etc. Even if a naive user says 'I agree' the driver still won't load. And if a rootkit does get signed, the keys can be revoked at MS, and a gazillion PCs will be immune next time they update.

    It's a good system.

    Of course, it has its frustrations - OSS drivers, home-made drivers, etc. won't work. And as a result:

    Most of the chatter on the net about it is 'how to disable driver signing', 'how to bypass it', etc. Yet the question people SHOULD be asking is: "How do I sign a driver to run on MY PC?"

    THAT WOULD BE FAR MORE USEFUL.

    It is after all YOUR PC, and you should be allowed to run any driver you want on it. So there *should* be a way of signing it for your PC. As the owner I should have my own private signing key, and anything I sign should run on any PC that has my public key trusted on it. Obviously stuff I sign with this key won't run on your PC because you won't have my public key trusted on your systems, but that's fine and as it should be.

    Of course, this is somewhat at odds with the RIAA/MPAA/DRM objectives with driver signing. But so what, people should be demanding the keys to their computers, and getting them.

    Code/driver signing isn't evil; it's on par with putting a lock on your car or home. Not giving the owners the keys is evil.

    And with that said, IS it possible to sign your own drivers for your own Vista machine? I'd very much like to know what is involved in doing that.
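    The owner-as-CA scheme sketched above can be illustrated with plain OpenSSL. To be clear, this is not the actual Vista kernel-mode signing toolchain (that path involves makecert/signtool and, for self-signed certificates, enabling test-signing mode via bcdedit); all the filenames below are made up, and the point is only the mechanics of signing with a key that the machine owner holds.

```shell
# Sketch of "the owner holds the signing key": sign a binary with a
# private key and verify it against the owner's public certificate.
# Filenames are illustrative; this is NOT the Windows driver toolchain.

# 1. The machine owner generates a private key and a self-signed cert.
openssl genrsa -out owner.key 2048
openssl req -new -x509 -key owner.key -out owner.crt -days 365 \
    -subj "/CN=My Own PC"

# 2. "Sign" a driver: hash it and sign the hash with the owner's key.
echo 'pretend this is a driver' > mydriver.bin
openssl dgst -sha256 -sign owner.key -out mydriver.sig mydriver.bin

# 3. A machine that trusts owner.crt checks the signature at load time.
openssl x509 -in owner.crt -pubkey -noout > owner.pub
openssl dgst -sha256 -verify owner.pub -signature mydriver.sig mydriver.bin

# 4. A tampered binary fails the check, so it would be refused.
echo 'rootkit payload' >> mydriver.bin
openssl dgst -sha256 -verify owner.pub -signature mydriver.sig mydriver.bin \
    || echo "signature check failed, driver refused"
```

    Either way the keys stay with the machine's owner, which is exactly the parent's point: anything signed with my key runs only on boxes that have chosen to trust my certificate.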

  • by virgil_disgr4ce ( 909068 ) on Tuesday June 24, 2008 @01:08PM (#23920173) Homepage
    It's not the interface's problem, it's the fact that 98% of computer users do not want to and will not learn anything about their computer. Some people will actively refuse to learn anything. So in light of that, the root of the problem is far, far deeper :(
  • Re:Sandbox? (Score:3, Insightful)

    by cowscows ( 103644 ) on Tuesday June 24, 2008 @01:32PM (#23920677) Journal

    Also, to me as a user, the single most important thing on my computer would be all my documents, which are accessible from my account. Sure, it's not great for my machine to be turned into a spam zombie or whatever, but reinstalling my OS isn't the worst thing in the world. It'd take me a couple hours at most. But recreating all the documents/photos/movies that I've got saved under my account would take much longer, and in many cases be impossible.

    I know that's what backups are for, and I've got backups of my important stuff, but the world is an imperfect place and not everything gets backed up.

  • Bad car analogy (Score:4, Insightful)

    by DrYak ( 748999 ) on Tuesday June 24, 2008 @02:04PM (#23921427) Homepage

    Give people a license to use a computer. A computer is infinitely more complex than a car, yet you need a driver's license for a car.
    Except that someone trying to drive a car without having learned first will very probably cause an accident, which could even leave several people dead, including both the driver and innocent bystanders.

    A car with an uneducated driver is a potential very powerful weapon.

    A computer used by an uneducated user... well, at worst he'll screw up his own computer. Maybe he'll piss off some other innocent web users with the spam that the zombied PC spits out. He might even eventually get some money stolen, if too much personal data is spied on.
    But unless the random guy is operating a computer controlling a nuclear core (and those already *are* selected and trained to be good at their job), it's very unlikely that the screw-up will result in deaths.

    That's why you won't see a computer license any time soon: the perceived risk (nobody will die in the end) is much lower than the perceived advantage (internet usage has become pervasive; it's so important and useful that anyone *must* have access to it).

    The only thing that you could remotely imagine is a tiered approach to internet security:
    the global net is accessible to anyone, but only common services are found on it. Special services are connected to a different network, which is more secure and more reliable but requires special clearance.

    Think in terms of "Internet freely available for all, Internet2 & GEANT only for hospitals, nuclear reactors and those who pass some license".

    But you can't just shut people off the internet, because our society relies on it, and anyway, nobody will die.

  • by psydeshow ( 154300 ) on Tuesday June 24, 2008 @02:18PM (#23921719) Homepage

    I don't care what kind of malware it might be, you can pry the CoolBook Controller extension from my cold dead hands!

    Third-party extensions by dodgy developers are often required to extend the lame control panels that Cupertino sees fit to bless us with. I shudder every time I install an update to smcFanController or CoolBook, but if I don't want my laptop running at 170F what other choice do I have?

    Signing isn't going to make the problem go away. I won't trust these random developers just because they have a certificate. If Apple engineers had time to certify the code itself, they would have time to fix the problems in OSX and firmware that require the use of third-party extensions in the first place.

  • by virgil_disgr4ce ( 909068 ) on Tuesday June 24, 2008 @02:26PM (#23921853) Homepage
    Whoa there, tiger. You seem to be missing the point of my post: that most users don't know what an "executable" or "data file" is in the first place, and will likely not use the computer often enough to learn by exposure.

    And I never said that there aren't bad interfaces. I personally think Windows has one of the worst, for the very reasons you describe.

    It's still incredibly important that interfaces are designed logically and efficiently! But any interface nonetheless requires some degree of learning--"intuition" in interfaces is only, in fact, "familiarity."
  • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday June 24, 2008 @03:09PM (#23922633)

    Whoa there, tiger. You seem to be missing the point of my post: that most users don't know what an "executable" or "data file" is in the first place, and will likely not use the computer often enough to learn by exposure.

    How would they know, if the user interface makes no distinction? You have to fix the UI first, to reduce the level of education needed to something reasonable. Seriously, most users want to run programs they don't completely trust, and their inability to do so safely is one of the primary causes of insecurity. Current OS's make this incredibly common task very, very onerous. Really, the easiest way to do it these days is to buy a VM product, install it, configure it appropriately for the program you want to run, create a new image, install an OS, install the program within the OS, and finally run it. That takes money and significant skill and time, and is simply too onerous for the normal user.

    But any interface nonetheless requires some degree of learning--"intuition" in interfaces is only, in fact, "familiarity."

    You can call it whatever you want, but different interfaces and the functionality they connect to make a huge difference in how much education, skill, time, and money it takes to compute securely. Until OS's catch up, people constantly calling for education and blaming users are part of the problem, more than the solution, IMHO.

  • by Chelloveck ( 14643 ) on Tuesday June 24, 2008 @04:48PM (#23924175)

    Seriously, add a red ring around all executables, or something more subtle, just something that isn't duplicated by the icons for data.

    Sure... But only if you can first give me unambiguous definitions of "executable" and "data". Into which category does a Word document fall? How about an HTML file? An arbitrary file without a filename extension?

    Simplistic "solutions" like this have gotten us where we are now. A warning is popped up whenever the user tries to do anything useful with the computer. "Oooh, that file might be dangerous, do you really want to open it?" Give the user a half dozen of those a day and you've trained him to just blindly click "Yes, dammit!" to the security dialogs.

    And that doesn't even begin to address the bigger issue, which is that users are easily tricked into running programs that they shouldn't. "Wow! Some random person just emailed me a picture of Natalie Portman naked in hot grits! Let me just double-click that self-extracting ZIP..." Or, more subtle, "Wow, that Comet Cursor looks really cool. Let me just click 'yes' to all these security warnings, because I really do want to install and run it."

  • Re:Sandbox? (Score:3, Insightful)

    by UnderCoverPenguin ( 1001627 ) on Tuesday June 24, 2008 @04:57PM (#23924305)

    Also, to me as a user, the single most important thing on my computer would be all my documents, which are accessible from my account.

    Unfortunately, a sandbox that protects these documents will greatly limit the usefulness of the applications running in it.

    Of course, a web browser or chat client would be least limited. But if you had something legitimate to upload/send, then you are looking at poking holes in the sandbox. With email, even if you never send an attachment, or save a received attachment, it gets complex, because all those messages - and the address book - are valuable to the user. If you keep them in the sandbox, they are open to theft and corruption. If outside the sandbox, you are poking holes, again. Other applications (word processors, drawing tools, etc) have their own legitimate needs for reading/writing files.

    Ultimately, it gets down to a choice between protecting the users so much the computer becomes just a fancy TV, or letting the users make mistakes and hope you can afford to defend yourself for failing to protect them.

  • by BalkanBoy ( 201243 ) on Tuesday June 24, 2008 @05:28PM (#23924741)

    Isn't that just another way of saying, ignorance is bliss? :)

  • by Ilgaz ( 86384 ) on Tuesday June 24, 2008 @05:59PM (#23925189) Homepage

    On OS X, sandboxing is different. Please read a couple of pages from Apple mailing lists before comparing it to its bad photocopy. OS X hasn't got a problem with applications running under a normal user account, so there is no community that has to be educated with a stick (like MS does).

    Safari.app will be able to say "Here are my directories and the system calls I will make". So Safari won't even see a Framework or System folder. Way more detail at http://www.318.com/techjournal/?p=107 [318.com]

    On OS X Leopard, a couple of deep-level technologies already use sandboxing (Spotlight and Bonjour), and Apple is preparing it for general developer use.

    The OS X "stupid security" dialogue works well, so damn well that it is able to figure out Adobe AIR applications the user installed over the web. The "stupid dialogue" could be a life saver in the future. I am not talking about the horrible Windows copy.

    Code signing is not like the Verisign pyramid scheme on Windows; ANY developer can sign their application for free. People actually adopt it, even open source applications like Adium X. There is no "Apple certified" or "Verisign Secure" junk; it is application signing, meant to benefit the user and the developer. By signing, you just make sure your files aren't tampered with after the user trusts the application, so no lamers can take advantage of your application (and your users' trust). There are no other advantages; OS X treats your application just like unsigned applications. It is not the signing in Microsoft Windows. If the user updates an unsigned application, the OS will prompt to ask whether he/she wants to grant access again, since there is no way to make sure it is the same binary from the very same developer the user trusted in the first place. If the user updates a developer-signed binary in the normal way and the signature is the same, it doesn't prompt.

    Read this for more info:
    http://adiumx.com/blog/2008/04/adium-application-security-and-your-keychain/ [adiumx.com]
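    For reference, the "here are my directories and the system calls I will make" idea corresponds to the Scheme-like sandbox profiles ("Seatbelt") that Leopard's sandbox facility uses. The profile below is a hypothetical, heavily simplified sketch, not Apple's actual Safari policy, and the exact operation names and path filters vary between OS releases (early profiles used regex path filters):

```scheme
; Hypothetical browser profile -- illustrative only.
(version 1)
(deny default)                                          ; start from nothing
(allow file-read* (subpath "/Users/me/Library/Safari")) ; own support files
(allow file-write* (subpath "/Users/me/Library/Caches/Safari"))
(allow network-outbound)                                ; talk out, never listen
```

    A profile along these lines could in principle be tried with `sandbox-exec -f profile.sb some-command`, though whether Safari itself honours an external profile is another matter.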

  • Re:Code signing (Score:4, Insightful)

    by dhavleak ( 912889 ) on Tuesday June 24, 2008 @06:08PM (#23925281)

    Isn't that exactly the same stuff Microsoft talked about years ago, that many people on Slashdot cried "foul!" about?
    Where Microsoft went wrong with code signing is that they insist the code be signed by them, because they treat the user or administrator as an enemy (i.e. one who might install a video driver that doesn't respect DRM).

    Here's the list of Windows' trusted Root CAs: http://msdn.microsoft.com/en-us/library/ms995347.aspx [microsoft.com]. Only third-parties are on that list -- not Microsoft.

     

    Code signing is harmless if the machine's administrator is the ultimate authority.
    Take a look at CertMgr.exe (specifically, play around with the 'import' function). The administrator is the ultimate authority, and this is the case in XP/2003/Vista/2008.

     

    The issue is: whose interests should the OS serve: the OS maker, the user, or (in the case of malware) anyone who manages to get their code onto the machine? If the OS designer answers that question correctly, then there's no problem with code signing (or other whitelisting approaches).
    I agree. I think you have to admit that MS has addressed these concerns.

     

    Naturally, the author of TFA got it wrong:

    Most kernel extensions are from Apple anyway and for the few common 3rd party ones, they should be required to get a code signing certificate.
    Required by whom? A certificate from whom? And the amount of trust delegated to this CA is what?
    I'd say the author got it right. Your concern is valid, but it's orthogonal to the point of TFA. Code signing is a Good Thing and Apple might implement it -- that's the point of TFA. The third-party approach is the correct way to do it -- that's your point.

    What's sad is the number of people on /. that crucify MS without realizing that their implementation has already addressed all the things they are complaining about (and has done so from day 1).

  • by UnknownSoldier ( 67820 ) on Tuesday June 24, 2008 @06:32PM (#23925577)

    > Well then the solution's simple. Give people a license to use a computer.

    Riiiiiight, just like a driver's license prevents traffic accidents, a gun license prevents shootings....

    A license is not an indicator of any safety, wisdom, or experience.

    You can't regulate stupidity or intelligence.

  • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday June 24, 2008 @06:35PM (#23925617)

    The only way to fix this is with mandatory access control, but how will a normal untrained user set these up properly? How will we prevent them from screwing up the secure default settings?

    The normal user should probably not have to set them up at all. Rather, ACLs should be certified by security companies who review the software, looking for problems and malware, and then feed that data to the OS. These could be free and community-driven like ClamAV is now, payware like Norton and the like, or supplied by the OS vendor. Ideally, the user should be able to subscribe to them and weight them as they like.

    I don't see any reason why MAC can't be transparent to the user, except in weird edge cases. Users should only have to do anything when software is not pre-installed, is not identifiable from one of the services I describe, and wants to exceed the strict sandbox that untrusted software defaults to. For normal users, that should pretty much mean they never have to set up an ACL and are only prompted if they are dealing with malware. They can learn that if they see such a prompt, something fishy is going on, they should not run it (the default), and maybe they should look into the source of the software more closely. For advanced users who want to run custom or company-specific software: well, they are advanced and can deal with it.

  • by mingot ( 665080 ) on Tuesday June 24, 2008 @06:39PM (#23925689)
    Replying to myself here, and to all above who have proposed solutions: The same day they make an OS/Computer on which a user can't screw himself is the same day they come out with unbreakable DRM. It's the same game, really.
  • I've been using UNIX for 30 years, I've worked on safety-critical software and in the control systems industry for 20 years, and I was solely responsible for network security for over a decade of that. I'm pretty familiar with this stuff.

    On OS X, sandboxing is different. Please read a couple of pages from Apple mailing lists before comparing it to its bad photocopy.

    The problem is that it is not in principle possible to build a sandbox around an application like Safari that would both permit it to do the useful things it is supposed to do and prevent it from doing malicious things.

    * If Safari can make connections to websites, then Safari can make connections to botnet peers and engage in attacks on websites.

    * If Safari can send mail, it can send spam.

    * If Safari can read my keychain, it can read my website passwords and pass them to an attacker.

    * If Safari can open my bank's web page, it can transfer money out of my account.

    * If Safari can upload files, it can upload them places I don't want it to access.

    * If Safari can download files, it can "download" garbage over the files I value.

    * If Safari can do the things I need Safari to do, a compromised Safari can do the things I don't want it to do.

    A sandbox cannot protect the things on my computer that I care about from the applications that manipulate them. The only secure sandbox is one that does not allow the application access to any non-volatile resources on my computer, except those strictly restricted to the sandbox and not used by any other application. Oh, and it can't make network connections, except under very specific conditions... for example, the Java sandbox lets the application connect back to the originating site.

    THAT is a security sandbox.

    I don't think I would be happy running Safari or Mail under something like that.

    The OS X "stupid security" dialogue works well, so damn well that it is able to figure out Adobe AIR applications the user installed over the web.

    But you want to run them, don't you, so you go ahead and approve them, and you are trained to approve these dialogs. I've watched that scenario play out time and time again, with the same people coming back to me saying "I clicked the wrong button again, I think I've got a virus".

    By signing, you just make sure your files aren't tampered with after the user trusts the application, so no lamers can take advantage of your application (and your users' trust).

    I was building the tripwire configuration for my Cheswick-Bellovin bastion firewall back when Steve Jobs was still at NeXT. I know about the capabilities, restrictions, limitations, and drawbacks of far more pervasive and complete file security mechanisms than what Apple has implemented. Particularly the drawbacks...

    If an attacker is in a position to modify my applications, then there is nothing OS X can do to stop him; he has already got the keys to the kingdom. He already has remote root access, however achieved, and he's not going to hide a trojan horse inside Mail.app, he's going to hide it in /private/etc/somethingobscure, running as root, and use Mach injection to patch Mail.app on the fly.

    As for your linked story: "If you mess with the Adium binary in any way, you will invalidate the signature, and access to secure resources -- specifically keychain items where your passwords are stored -- will be disallowed by Mac OS X."

    That's a hell of a drawback. That by itself is enough to make me hold off installing Leopard until I've got time to look up how to disable that paranoid security theatre.
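    The tripwire approach mentioned above can be sketched in a few lines of portable shell: record a baseline of file hashes while the system is known-good, then compare later. Real tripwire also tracks permissions and inodes and keeps its database somewhere an attacker can't rewrite it; this toy version only shows the idea (`sha256sum` here; `shasum -a 256` is the OS X equivalent), and every filename is made up.

```shell
# Toy tripwire: baseline the hashes of a tree, then detect tampering.
mkdir -p demo
echo 'original contents' > demo/app

# 1. Record the baseline while the system is known-good.
find demo -type f -exec sha256sum {} + > baseline.txt

# 2. Later, simulate an attacker modifying a file...
echo 'trojan horse' >> demo/app

# 3. ...and re-check every file against the baseline.
if sha256sum -c baseline.txt >/dev/null 2>&1; then
    echo "all files intact"
else
    echo "TAMPERING DETECTED"
fi
```

    The drawback the parent alludes to applies here too: an attacker who can modify your applications can usually also modify the baseline, unless it lives somewhere he can't reach.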

  • by Macgrrl ( 762836 ) on Tuesday June 24, 2008 @07:59PM (#23926609)

    Well then the solution's simple. Give people a license to use a computer. A computer is infinitely more complex than a car, yet you need a driver's license for a car.

    It'll happen sometime after they make it compulsory to have a license to have children - which, let's face it, are several times more complex than either a car or a computer.

  • by KURAAKU Deibiddo ( 740939 ) on Tuesday June 24, 2008 @10:12PM (#23927835) Homepage

    Basically, there isn't a huge difference between how Mac OS X handles log files (apart from Leopard using bzip2 [wikipedia.org] for compression, instead of the gzip [wikipedia.org] that Hardy Heron uses). Logs are in /var/log on both operating systems, and provided that you're using the default Gnome UI on Ubuntu, you can use the Gnome System Log viewer to view them.

    You can pull this up by going to System > Administration > System Log, or by typing gnome-system-log into Terminal.

    For more information on logging in Ubuntu (with pictures, no less), you might take a look at either this random Google search result [watchingthenet.com] or this one [cyberciti.biz]. The first has more screen grabs for illustration. ;)

    On Mac OS X, you'd use Console [wikipedia.org], which can be found in Applications > Utilities.
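    Since both systems keep plain-text logs under /var/log and differ mainly in the rotation compressor, the command line works the same on either. The log line and filenames below are fabricated so the commands are runnable anywhere, but the pattern (zcat for gzip rotations, bzcat for bzip2 ones) is the real one.

```shell
# Simulate a live log plus rotated copies in both compression styles.
printf 'Jun 24 12:00:01 host sshd: error: failed login\n' > syslog
gzip -c syslog > syslog.1.gz     # gzip rotation (Hardy Heron style)
bzip2 -c syslog > syslog.0.bz2   # bzip2 rotation (Leopard style)

# Rotated logs can be grepped without unpacking them to disk:
zcat syslog.1.gz | grep -c error          # prints 1
bzcat syslog.0.bz2 | grep -i 'failed login'

# Search the live log and every gzip rotation in one pass:
{ cat syslog; zcat syslog.*.gz; } | grep -c error   # prints 2
```

    The GUI viewers (Console, gnome-system-log) are front ends over these same files, so either route shows the same data.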
