
Malware Could Grab Data From Stock iPhones 127

Posted by timothy
from the swamp-of-bog-standard dept.
Ardisson writes "Swiss iPhone developer Nicolas Seriot presented last night a talk on iPhone Privacy in Geneva. He showed how a malicious application could harvest personal data on a non-jailbroken iPhone (PDF) and without using private APIs. It turns out that the email accounts, the keyboard cache content and the WiFi connection logs are fully accessible. The talk puts up several recommendations. There is also a demo project on github."
  • by bobdotorg (598873) on Friday December 04, 2009 @02:30AM (#30321398)

    I felt a great disturbance in the Smug, as if millions of fanboys suddenly cried out in terror and were suddenly silenced.

    • Re: (Score:3, Insightful)

      by JohnBailey (1092697)

      I felt a great disturbance in the Smug, as if millions of fanboys suddenly cried out in terror and were suddenly silenced.

      Don't be daft. Nothing can silence fanboys.

  • by Anonymous Coward

    This isn't any different from any other computer. Users can run software that has access to their personal files.

    • by timmarhy (659436) on Friday December 04, 2009 @02:39AM (#30321456)
      So Apple products aren't secure. Could have fooled me with the mindset around here.
      • by SJ (13711) on Friday December 04, 2009 @02:43AM (#30321474)

        Isn't it more of a case that someone has found a bug, and now it's over to Apple to fix it?

        Or is that just applying far too much logic to an Apple-related topic...

        • by Serious Callers Only (1022605) on Friday December 04, 2009 @03:54AM (#30321720)

          No, it's a case of a binary with permissions being able to access public APIs (as intended). Most platforms currently have this problem in some form - if you run an authorised program you must trust the author to some extent, though I think Android has some better controls. So they really need a change in policy rather than to fix a bug (though perhaps filesystem access to user prefs could be considered a bug).

          Apple have sandboxed apps, so they can't access data from other apps or the main system (save user prefs) unless it's through public APIs.

          What this article points out is that those public APIs provide access (because it is useful, and sometimes essential for apps to function) to your address book, phone number etc. and also to the file system for your user, under '/var/private/mobile', which lets you see the system prefs (most of which are accessible via private APIs anyway). The keyboard cache (though without passwords) is worrying, so this is a hole that needs to be fixed.

          What Apple could do is offer more fine grained control (as they do with location services for example) so that apps cannot access data like phone numbers without explicit permission from the user. Obviously this needs some thought, as the last thing you want is a forest of permission dialogs for each app when it starts up, but it's certainly doable without much trouble. The file system access to system preferences would probably need to be locked down too.

          However this is not some new security breach or bug - it's been known about since day one, but it is something that needs to be pointed out repeatedly until Apple fixes it, because most users will not be aware of it, and it does have privacy implications.

          Note that apps we run on our desktop systems (Mac, Windows, Linux) currently have few such controls and have access to a lot of data about us which we might prefer to keep private - similar sandboxing is required there too.

          • by Anonymous Coward on Friday December 04, 2009 @10:15AM (#30323288)

            No, it's a case of a binary with permissions being able to access public APIs (as intended). Most platforms currently have this problem in some form - if you run an authorised program you must trust the author to some extent, though I think Android has some better controls.

            Blackberries have very granular access controls. When an application tries to do something it isn't authorized to do, the user is asked for permission (or these permissions can be allowed/denied at the server level).

            Everything from access the gps, access email, sms, address book, http connections, https connections can be allowed/denied separately.

            So google maps on my blackberry is configured to connect to google (to download maps), but nothing else. It can't connect to the internet at large. This greatly reduces security concerns.

          • Um, you actually need access to the keyboard cache for some programs. It's a very useful thing to have. There is very little difference between what happens here and in any other OS using standard APIs. I can really screw up a Windows box if you will just run my program. So given that, let's see how many other OSs we can screw up using standard APIs.
            • Um, you actually need access to the keyboard cache for some programs.

              Of course, but I wonder if it should be a per-application cache rather than storing all sorts of strings from all apps (I think it is global). That would make more sense, and remove any possibility of malicious use.

              There is very little difference between what happens here and in any other OS using standard APIs I can really screw up a windows box if you will just run my program.

              Agreed, but that doesn't mean we shouldn't look for something better. I would welcome the chance to restrict desktop apps with policies I decide myself as to which data etc they can access on my computer - default to a locked down configuration and let me allow them access if I wish. The vast maj

              • If it were global, it would have made a great way to cut and paste; but since that obviously wasn't possible via standard APIs, I doubt it was global.
              • default to a locked down configuration and let me allow them access if I wish.
                Wow, sounds like exactly what Vista did and everybody immediately turned off because it was annoying as hell.
      • When you consider what Jailbreak *is* (root-level exploit) I thought this was already fairly well established? Especially when you consider how quickly each successive jailbreak has been released, and how little effort some have required. Say what you will about their histories, but Apple still hasn't gotten the wake-up call regarding how paranoid you really have to be for software security - something MS had thoroughly bashed into its head over the last decade.

        Keyboard cache is a good example - turns out that the keystrokes entered during bootup (such as to enter a hard drive decryption passphrase/PIN) remain in memory and can be retrieved after the system has booted. Obviously, this is a problem for things like TrueCrypt and Microsoft's BitLocker. Except, by the time the vulnerability was revealed, Microsoft had already fixed it. That kind of twisty thinking is what Apple has yet to show any particular knack for.

        • something MS had thoroughly bashed into its head over the last decade.

          And yet they are still by far the most exploited and exploitable OS in the world. Simply knowing about the fact that you need to be secure does nothing to protect any users.
          • by toadlife (301863)

            If OS X were on 85% of the world's desktops, it would be by far the most exploited and exploitable OS in the world.

            • No more than any other UNIX system is, this is true. Unix has also had the security thing pounded into it, but they still make mistakes. What's your point?
        • by rsborg (111459)

          ...Apple still hasn't gotten the wake-up call regarding how paranoid you really have to be for software security...

          And how paranoid do you have to be? It's silly. Any lock can be opened, there is no such thing as complete security.

          Apple is still making money hand over fist, and there has yet to be any iPhone malware released that actually compromised a large number of users' data.

          I bet their security team is really paranoid, but have to deal with their usability and other teams to make sure that the paranoia d

      • by mcgrew (92797) *

        so apple products aren't secure

        "Secure" is a relative term. My house is reasonably secure from burglars, but compared to my bank it's not secure at all. Compared to a lean-to in the woods my house is incredibly secure.

        There is no such thing as absolute security, but Mac users don't have to worry about picking up a virus just by surfing the internet. Neither do Linux users. Apparently, iPhone users do in fact have to worry about having info stolen; at least until they fix this design flaw.

    • by Mr2001 (90979) on Friday December 04, 2009 @02:47AM (#30321496) Homepage Journal

      It is different from Android, actually. Android runs each app under a separate user ID, and one app can't access another app's data unless the other app explicitly allows it to. Typically this access will go through the standard Android permission system, so the user will see when they install the app that it's requesting permission to read their SMS logs or whatever.
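      The install-time grant described above is driven by the app's manifest; a minimal sketch of what such a declaration looks like (the package name and the choice of permissions here are invented for illustration):

```xml
<!-- AndroidManifest.xml -- illustrative only; package name is made up -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.smsviewer">
    <!-- These are shown to the user at install time; anything not
         declared here is simply denied by the OS at runtime. -->
    <uses-permission android:name="android.permission.READ_SMS" />
    <uses-permission android:name="android.permission.INTERNET" />
</manifest>
```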

      • That's interesting; is it basically using RBAC?

      • by mjwx (966435) on Friday December 04, 2009 @04:03AM (#30321752)

        It is different from Android, actually. Android runs each app under a separate user ID, and one app can't access another app's data unless the other app explicitly allows it to. Typically this access will go through the standard Android permission system, so the user will see when they install the app that it's requesting permission to read their SMS logs or whatever.

        Whilst I'm not disagreeing with you - Android has a very good security model, and enforcing separate UIDs and permissions is essential to that - this still won't stop the less intellectually endowed users from just clicking yes and permitting malware to read their private data.

        To paraphrase Ron White: there is no pill to fix stupid, you can't fix stupid, and neither can Google.

        In other words, we'll still suffer from the stupid acts of moronic users; the good part is that more astute users will suffer fewer attacks.

        • by dbcad7 (771464)
          A cyanide pill should do the trick.
        • by Zebedeu (739988)

          Typically this access will go through the standard Android permission system, so the user will see when they install the app that it's requesting permission to read their SMS logs or whatever.

          Which to me is a missing feature in Android -- the ability to enable/disable each permission individually.
          As it stands now, you're presented with a laundry list of permissions which the app requests upon installation, and you either accept it wholesale, or cancel the installation.

          I think Java ME does it correctly. You can look at the app's properties and enable/disable each policy, and even set it to ask once, or every time that feature runs.

          Sure, this would break the business model of most of those advertis

          • You can look at the app's properties and enable/disable each policy,
            Until you make the whole app useless, and you might as well not install it. That is, if you are in fact knowledgeable enough to know what all those things are anyway.
            • by Zebedeu (739988)

              I guess the app would install normally with full access, but you'd have an advanced preferences screen where you could set these options.

              I don't agree with the idea of limiting important functionality because of novice users.

        • by Inda (580031)
          I hit my thumb with a hammer once. Make that twice, three times, four... I'll probably do it again in my lifetime.

          Can you, as an astute user, claim that you've never hit OK to a dialog by mistake? Maybe you were expecting one dialog, but received another.
          • by mjwx (966435)
            I was talking in general.

            Can you, as an astute user, claim that you've never hit OK to a dialog by mistake? Maybe you were expecting one dialog, but received another.

            We've all done that, which was your point.

            Android lacks a centralised place to enable/disable permissions; to be fair, Android is less than two years old, and I'm certain there would be a third party program that could do this.

            However, the permission system in Android works: when you install an application, Android checks which APIs it access

        • So basically you're saying Android is doing as much as it can do about what it can control, and not much about things it can't control. None of that is good news for Apple who is not doing enough of the former and pissing off a lot of people with their attempts at the latter.
    • If they can access the keychain, then it is a flaw. The keychain is stored encrypted and controlled by a daemon (which sets the flag preventing debuggers from attaching when it starts). When an app requests data from the keychain, it must be authorised by the user to access that specific key. It is not able to access any other key and the authorisation is invalidated if the binary is modified.
  • by SuperKendall (25149) on Friday December 04, 2009 @02:36AM (#30321440)

    There's actually not much surprising here (at least for an iPhone developer) but two things were interesting:

    1) It can read EXIF data from your image library (including GPS tags, if any) by just reading the library directly. In theory you are not supposed to do that, but instead go through an API - which annoyingly gives you only raw image data with no EXIF.

    2) Your "location" is reported, without the dialog that normally arises asking you if you want to reveal your location. Alarming at first, until you look and realize what it's really done is found the last location Maps knew about. Since you don't run Maps all the time this data really doesn't mean that much and is not real time as you get with real CoreLocation calls.

    One other thing of note is that a great deal of this involves poking about in /var/mobile/... at preference and temporary files. Given that Apple is now scanning for strings in app review, I'm not sure if an app that included these techniques would actually make it to the app store. Even if you obfuscated the string, the filesystem could simply report if anything under that directory were being accessed and what the call stack was like, though I think it unlikely they would go to these lengths.

    • the filesystem could simply report if anything under that directory were being accessed and what the call stack was like,

      ...only if they managed to exercise all of the code. So, you couldn't actually download and execute code remotely, but I bet you could trigger something based on a date, or on some web service.

      Based on how randomly they seem to accept and reject apps these days, I wouldn't be surprised if a few made it through.

    • Re: (Score:3, Interesting)

      One other thing of note is that a great deal of this involves poking about in /var/mobile/... at preference and temporary files....Even if you obfuscated the string, the filesystem could simply report if anything under that directory were being accessed and what the call stack was like, though I think it unlikely they would go to these lengths.

      They'd be better just to lock down access to the files which apps have no business accessing directly - get system apps to save their preferences elsewhere, for example, or restrict permissions artificially for sandboxed apps via the filesystem APIs and refuse access to all files except the sandbox. That way, even if someone gets past the filters (that's a game of whack-a-mole really, and the current controls are easy to defeat), they cannot access the files.

      They need to move to restricting access fully at the

    • Re: (Score:3, Informative)

      by TheRaven64 (641858)
      Scanning for strings is pretty easy to circumvent. You can just concatenate the path components in code. 'var' and 'mobile' are quite innocuous strings. The same is true for private APIs, by the way, because Objective-C lets you look up both classes and methods by name.
  • It's great that there are those making people aware of what data might be accessed by malicious apps on any platform. The question is: can this be avoided? Restricting the data this app was able to access would also stop applications from doing some pretty useful things: accessing the address book, reading files on the filesystem, autocomplete (this is the keyboard cache mentioned), accessing pictures, etc. This is a balancing act between allowing enough freedom to produce good software and being too restri

    • by netsharc (195805)

      Actually, the dialogs-way is the way BlackBerry does it, usually with an option to "Don't show this again", i.e. "always say yes" or "always say no". It does this for many features, like contacts reading, access to the messages (apps can hook and get a notification each time a new message arrives), and even telephony (yes, the API actually lets you send DTMF tones). There's an extra layer: apps accessing protected APIs must be digitally signed, so they have your identity (just like Apple does). And you can cha

  • by iamacat (583406) on Friday December 04, 2009 @02:46AM (#30321490)

    It depends on manual app approval process and ability to ban/sue developers who abuse the system. There is probably also a kill switch to delete the app from existing devices that Apple hasn't yet had to activate for catastrophic malware. Runtime-enforced security has been tried with J2ME and nobody liked the app functionality. In fact people are not willing to live with Java's limitations on desktop either. Perhaps someday such a system will become viable with much more powerful mobile hardware and better thought out security system that allows more functional legitimate apps (for example, user will be able to give an app access to some or all e-mail as an intuitive option).

    • by Mr2001 (90979) on Friday December 04, 2009 @03:23AM (#30321620) Homepage Journal

      Perhaps someday such a system will become viable with much more powerful mobile hardware and better thought out security system that allows more functional legitimate apps

      It's already here, and it's called Android. When you install an app, it tells you what permissions [android.com] the app is requesting, and you can cancel if you're suspicious. Most operations that you'd consider potentially harmful or privacy-violating (reading various types of personal data, accessing the internet, making phone calls, preventing the phone from sleeping, etc.) can only be performed if the app listed the relevant permissions in its manifest.

      It's not perfect... you know what the app is capable of doing, but not what it actually will do. Without looking at the code, you can't tell if the app that requests "read GPS position" and "access the internet" is going to send your GPS position to someone over the internet, or if the two features are unrelated. But it does prevent surprises like the ones in TFA.

      • When you install an app, it tells you what permissions [android.com] the app is requesting, and you can cancel if you're suspicious.

        I thought 'security through user awareness' had been shown to be a fail paradigm.

        • When you install an app, it tells you what permissions [android.com] the app is requesting, and you can cancel if you're suspicious.

          I thought 'security through user awareness' had been shown to be a fail paradigm.

          It is because most users don't know what those permissions mean to them.

      • by Viol8 (599362) on Friday December 04, 2009 @07:16AM (#30322382)

        Just like on Windows, your non-techie user is just going to end up learning a Pavlovian response to any such permissions dialog and click OK no matter what. Yes, you can blame the user, but ultimately these are supposed to be simple-to-use gadgets for people who have more important (to them) things in their life to worry about than application access permissions they probably don't even understand. So you can't really blame users for treating a gadget that's marketed as simple to use in a simple way.

        • by DJRumpy (1345787)

          I prefer an Apple-controlled approach. Why don't they authorize apps with certain access levels? If an app is submitted and indicates it needs ACL access to the FS and the Internet, then I would prefer that Apple runs some Apple-owned and signed script that blesses that app with access to those areas, specific files, and so on. I don't want to see UAC popups on my iPhone. Seems like if these apps were given controlled ACL access only to those items that they submitted and were granted, that even later a

          • If you want to be really depressed, type 'man sandbox' into an OS X system. Not only does the kernel (the same kernel used on the iPhone) have support for fine-grained sandboxing, Apple actually ships a set of five default policies for restricted apps. It would be trivial to provide profiles for games and so on for the iPhone, but for some reason Apple doesn't bother.
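            Those policies are plain-text, Scheme-like profiles; a rough sketch of what a locked-down per-app profile might look like (syntax modelled on the profiles shipped in /usr/share/sandbox on 10.5; rule names vary between releases and the directory here is a placeholder, so treat this as illustrative, not a working configuration):

```scheme
;; Hypothetical seatbelt-style profile -- not a tested configuration.
(version 1)
(deny default)                        ; start from nothing
(allow file-read* file-write*
       (subpath "/private/var/mobile/Applications/THIS_APP"))  ; own sandbox only
(allow network-outbound)              ; let the app talk to its own servers
```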
            • by DJRumpy (1345787)

              Disappointing. If the infrastructure is already built in, you would think it would be somewhat trivial to enable and enforce those policies.

              Odd that they haven't done so already.

              • Re: (Score:3, Interesting)

                by TheRaven64 (641858)
                They introduced this mechanism a few years ago, with 10.5. It's used to isolate the mDNS responder (which is why the security hole in that a couple of years ago was a DoS on 10.5 and a remote root hole on 10.4 and Windows), but not much else. It's a real shame that when you download a binary from the Internet, they pop up a window when you run it giving the choice of running with full privileges or not at all, rather than in a couple of predefined sandbox configurations. I was expecting 10.6 to inc
        • True to some extent, and it can be mitigated to some extent by making the default value "deny" for every privilege. Unless you convince Mr Joe Schmoe to grant privilege, the app won't get anything. Since Joe Schmoe is unlikely to do it, apps will be developed without assuming such privilege will be available.

          In fact this is one of the fundamental reasons why *nix applications run nicely with user privileges and the Windows applications barf if root privileges are not available. The *nix was originally deploy

        • Yes, you can blame the user, but ultimately these are supposed to be simple-to-use gadgets for people who have more important (to them) things in their life to worry about than application access permissions they probably don't even understand.

          Sorry, but these are not simple gadgets any more so than a motor vehicle is a 'simple gadget'. ANY complex computing and/or net-connected device deserves as much respect in use as driving a motor vehicle on the highway.

          There is a minimum amount of understanding required to safely operate a motor vehicle, and there is considerable potential harm that can come to oneself or to others in both scenarios.

          If people don't take the time to learn about the device and its proper and safe operation, they de

          • by iamacat (583406)

            Are you really suggesting that people should have to take classes to safely download and play games on their cell phone?

        • Well, there are a couple of things to note here.

          The first is that yes, some users will not read what is in front of them no matter what you do. Such is life.

          The second is that Android's implementation of this dialog is about as good as you'll ever get. I wrote an Android app and was very surprised at how many queries I got regarding permissions. There have been several cases whereby app devs shipped apps that listed permissions they clearly didn't need, and then uploaded fixed versions with minimized privile

      • by iamacat (583406)

        How exactly are the permissions enforced? Did Google implement some kind of filesystem with application-specific ACLs? Was there a serious effort to close the backdoors, such as updating another app's shared libraries or reading GPS coordinates from system logs? Are applications prevented from taking over the whole screen and mimicking another app's interface to trick users into entering their passwords?

        Unless a comprehensive solution is implemented, this is just security theater, in that only legit apps wi

        • by Mr2001 (90979)

          How exactly are the permissions enforced? Did Google implement some kind of filesystem with application-specific ACLs?

          Each app gets its own user ID and has no read access to files owned by other apps. To obtain data from another app, you have to go through the system-wide content provider interface. The other app has to actively support that, and it can enforce permission checks on the caller in code and/or in its manifest.

          Was there a serious effort to close the backdoors, such as updating another apps shared libraries or reading GPS coordinates from system logs?

          Yes. One app can't overwrite files belonging to another app, and system logs aren't world readable.

          Are applications prevented from taking over the whole screen and mimicking another app's interface to trick users into entering their passwords?

          Not exactly -- apps always take up the whole screen, and the system can't stop you from making your app

      • by Ilgaz (86384)

        J2ME, which Android was based on, had that concept for years. It is a J2ME concept, you know - the VM which trendy developers have ignored for years even though its installed base is well beyond 700M.

    • by 0ld_d0g (923931)

      What happened to MMU based security? User & Kernel address space division seems to work for desktop OSs. Why hasn't it been adopted for the mobile platform?

      • by Dog-Cow (21281)

        Probably for the same reason 0ld d0g hasn't embraced intelligence. It's not relevant.

      • MMU-based app sandboxing has been adopted on the mobile platform. That's how Android works (and the iPhone too, actually - it just sucks at it).
  • Closed system (Score:4, Informative)

    by Anubis IV (1279820) on Friday December 04, 2009 @03:32AM (#30321646)
    The security the iPhone uses is called a "closed system" and an "approval process," both of which I believe we've heard about here before in great detail, and the attack they're talking about is nothing more than a trojan, essentially. There have already been a few such apps that have sprung up over the years on the iPhone (I recall hearing reports of one that harvested your contacts), but Apple has been quick to squash them so far. Whether that will continue or is even a viable strategy as things scale up remains to be seen, however.
    • The problem is that nobody knows if it's really just "a few apps that have sprung up" or if there is actually a systematic problem with apps harvesting your personal details. The companies you're thinking of were busted because they actually phoned the users up themselves en masse, rather than simply (e.g.) selling the data. The Android approach is far more trustworthy because it doesn't rely on (very) fallible humans trying to inspect binary blobs.
  • In the wild Apple product that is as easy as MS.
      • I'm sure they'll be enchanted to switch to it on such a vague, misleading report: "if it's as insecure as the Windows we use, surely we can trust it with the security of the nation"

    • Trust me, it won't be that easy on Windows Mobile. Windows Mobile has a security model similar to Symbian and J2ME.

  • by Jane Q. Public (1010737) on Friday December 04, 2009 @04:52AM (#30321926)
    ""Swiss iPhone developer Nicolas Seriot presented last night a talk on iPhone Privacy in Geneva"

    No, Nicolas Seriot did not present a talk to "last night". Nor was was his talk about "iPhone Privacy in Geneva".

    Try this: "Last night in Geneva, noted Swiss iPhone developer Nicolas Seriot presented a talk on iPhone Privacy".

    There. Fixed that for you.
    • by Phroggy (441)

      If you'd step outside your own borders once in a while, you'd recognize this sentence structure as something that works in a foreign language, but has been translated into English by a non-native speaker. Constructive criticism is useful, but please don't be unkind.

        • If you'd step outside your own borders once in a while, you'd recognize this sentence structure as something that works in a foreign language, but has been translated into English by a non-native speaker.

        Yes, hence the subject line.

        Constructive criticism is useful, but please don't be unkind.

        What part of that post was "unkind?" It was a fairly straightforward, if blunt, explanation of why the sentence structure makes its meaning ambiguous and how it could be clarified.

    • "There." is not a sentence; I thought you were using English...
    • by Jay L (74152) *

      No, Nicolas Seriot did not present a talk to "last night". Nor was was his talk about "iPhone Privacy in Geneva".

      And from you, we should take advice?

    • by sootman (158191)

      > There. Fixed that for you.

      Hmm, I'm looking for the joke, but either I'm not seeing it or it wasn't funny.

      Oh, wait, you actually fixed it for him. Gotcha. Nice work. :-)

  • There's an app for that?

    • by mgblst (80109)

      HAHAHAHA, this gets funnier with every story about the iphone or Apple. Please kill yourself.

  • by redstar427 (81679) on Friday December 04, 2009 @06:22AM (#30322216)

    This baby is overclocked and water cooled, running at 2.3 GHz!
    It's so fast when talking on the phone, my friends sound like Alvin and the Chipmunks.

  • This is news? (Score:4, Informative)

    by argent (18001) <peter@slashdot.2 ... m ['nga' in gap]> on Friday December 04, 2009 @06:47AM (#30322294) Homepage Journal

    You install an application on your computer. That application has access to stuff stored on your computer. This is news?

    Wake us up when you have a remote exploit.

    • You wake up and smell the coffee. The days when you can trust all application developers to play nice are gone. Even when you install an application yourself on your own computer, you have to assume the application is not trusted and set up privileges explicitly about what it can and can't do. The only question is how to specify these privileges in an easy to use and enforceable manner.
      • by Dog-Cow (21281)

        Viruses and trojans have existed since the days of the original Apple, Commodore and Atari home computers. The days you allude to never existed.

        • The parent was not talking about viruses and trojans. He is talking about knowingly installing an application. Some of the functions of the applications are beneficial and actually desired by the user. So much so that he went to the app store and paid money for it. Then once installed, in addition to doing what it should do, it is snooping around and phoning home personal details. That is more recent.
          • by Tim C (15259)

            Actually, you described exactly what a trojan is - an app that performs (or claims to perform) a desirable function, but surreptitiously also performs an undesirable one.

            • by cheros (223479)

              .. with the added bonus of getting money for that installation too; allow me to omit the obligatory Microsoft joke in this context.

              I think the primary (and dangerous) underlying assumption is that an app you pay for is somehow safer..

      • The days when you can trust all application developers to play nice are gone.

        They were gone before the Internet was called the Internet.

        If you install every application that shows up on your computer without paying any attention to what you're getting... you're going to be sorry. I've cleaned up enough malware from people's computers over the past twenty years to know that.

        If an application has the ability to do useful things, it has the ability to do dangerous things. Sandboxing applications while still le

      • Ya know, I'd rather just use my computer instead of running around trying to make every last bit secure. Privacy is an illusion for a useful member of society, and the time I spend not being overly paranoid is more time spent with my family.
    • It is news on smart phones, whose security models were built on lessons learned from desktop and mobile trojans. Apple ignored those lessons and is now paying for it.

      http://wiki.forum.nokia.com/index.php/Symbian_Platform_Security_Model
      http://developers.sun.com/mobility/midp/articles/permissions/
      http://technet.microsoft.com/en-us/library/cc512651.aspx

      In fact, the security scene has kind of given up on the iPhone and iPod. Let whoever buys one make up his/her own mind.
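
      The permission models linked above share one idea: the app declares up front what it may touch, and the platform, not a human reviewer, enforces it. As a widely known example of the same install-time declaration pattern (Android is not one of the links above, it's shown here only for illustration), a minimal Android-style manifest excerpt:

      ```xml
      <!-- The app must declare every sensitive capability at install time;
           anything not declared here is denied to it by the platform. -->
      <manifest xmlns:android="http://schemas.android.com/apk/res/android"
                package="com.example.demo">
          <uses-permission android:name="android.permission.INTERNET" />
          <uses-permission android:name="android.permission.READ_CONTACTS" />
      </manifest>
      ```

      The user can then decide at install time whether, say, a flashlight app really needs to read contacts.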

      • I don't believe there was any actual content in your post. But I'm not sure. It's still not news, and no one is paying for anything. What exactly are you talking about?
    • One day, a dirty rival of Apple or a psychopath will use one of these "theoretical" exploits that Apple has been ignoring for years. You won't need to check the news that day; your newspaper simply won't arrive, since the Quark/InDesign machine they use won't function.

      I don't hope for it, but Apple is really inviting it...

      • by argent (18001)

        You won't need to check the news that day; your newspaper simply won't arrive, since the Quark/InDesign machine they use won't function.

        What's a newspaper?

    • Why are your goals so low? Shouldn't Apple be showing the rest of the industry how it's done?
      • by argent (18001)

        Shouldn't Apple be showing the rest of the industry how it's done?

        You want to mind-meld Steve Jobs and Theo de Raadt?

  • by dawilcox (1409483) on Friday December 04, 2009 @10:28AM (#30323420)
    Why is it that every time something like this is discovered in a Microsoft product, it's Microsoft's fault for not providing a more secure operating system, but when it happens to other companies, malware is just a fact of life?
  • Apple may not have the best approval process based on how engineers review apps for functionality, but this is NOT the only process apps go through...

    Apple has a whole series of programs that crawl the source code of each app (which is how apps are submitted, not in compiled form). First of all, they look for apps that touch unapproved APIs, and summarily reject them (with thus far 1 exception noted recently, and they were told not to include that function in their next release...) next, any app that acce

    • No, what you say is wrong.

      Firstly, Apple examines binaries, not source code. Secondly, they're looking for non-malicious usage of private APIs. It's quite easy to build dynamic/obfuscated code that their simple symbol-dump process can't detect. Thirdly, as the presentation notes, nothing stops an app from changing its behavior after review, based on an internet fetch or the date. Fourthly, I don't know how you can think data transfers are "monitored". It's easy to hide data such that it looks boring (lik
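
      To illustrate the "dynamic/obfuscated code" point: a symbol dump only reveals names the binary references statically. A hedged sketch in plain C (the general technique, not actual iPhone code; the function and string fragments are arbitrary, and `dlsym` stands in for whatever runtime lookup the platform offers):

      ```c
      #define _GNU_SOURCE  /* for RTLD_DEFAULT on glibc */
      #include <stdio.h>
      #include <dlfcn.h>

      /* Resolve a function whose name is assembled at runtime, so the
       * final name never appears as one literal symbol reference that a
       * static symbol dump of the binary would reveal. */
      static void *lookup(const char *part1, const char *part2) {
          char name[64];
          snprintf(name, sizeof name, "%s%s", part1, part2);
          /* Search the symbols of everything already loaded. */
          return dlsym(RTLD_DEFAULT, name);
      }

      int main(void) {
          /* "pu" + "ts" -> puts(); the fragments could just as easily
           * arrive over the network after review, as noted above. */
          int (*fn)(const char *) = (int (*)(const char *))lookup("pu", "ts");
          if (fn)
              fn("resolved via runtime lookup");
          return 0;
      }
      ```

      A reviewer dumping this binary's symbols sees `dlsym` and some string fragments, not the name of the function actually called.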

  • Yes, malware could grab data from stock iPhones in much the same way that I could be President of the United States. Wikipedia has the scoop, plus a lot of other juicy details, here: http://en.wikipedia.org/wiki/United_States_Constitution

    Granted, the probability of my becoming Prez is likely lower; particularly if my opponents find out that I hang out around here.
