Private Data On iOS Devices Not So Private After All
theshowmecanuck (703852) writes with this excerpt from Reuters summarizing the upshot of a talk that Jonathan Zdziarski gave at last weekend's HOPE conference:
Personal data including text messages, contact lists and photos can be extracted from iPhones through previously unpublicized techniques by Apple Inc employees, the company acknowledged this week. The same techniques to circumvent backup encryption could be used by law enforcement or others with access to the 'trusted' computers to which the devices have been connected, according to the security expert who prompted Apple's admission. Users are not notified that the services are running and cannot disable them, Zdziarski said. There is no way for iPhone users to know what computers have previously been granted trusted status via the backup process or block future connections.
If you'd rather watch and listen, Zdziarski has posted a video showing how it's done.
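For the curious, the pairing ("trust") records at the heart of this live on the computer side, not the phone: on OS X they are typically stored under /var/db/lockdown, one property list per device UDID. The following is a minimal Python sketch, not part of Zdziarski's tooling; the path, field names, and required permissions are assumptions that vary by OS version, and it only shows one computer's side of the relationship, not every host the phone has ever trusted.

#!/usr/bin/env python3
# Minimal sketch: list the iOS pairing ("trust") records stored on a Mac.
# Assumes the usual /var/db/lockdown location and plist-format records;
# the exact path, field names, and permissions vary by OS version.
import glob
import os
import plistlib

LOCKDOWN_DIR = "/var/db/lockdown"  # typically readable only by root

for path in glob.glob(os.path.join(LOCKDOWN_DIR, "*.plist")):
    udid = os.path.splitext(os.path.basename(path))[0]
    try:
        with open(path, "rb") as fh:
            record = plistlib.load(fh)
    except OSError:
        print("%s: unreadable (try running as root)" % udid)
        continue
    # A pairing record holds, among other things, a HostID and the
    # certificates the host uses to authenticate itself to the device.
    print("device %s is paired with HostID %s" % (udid, record.get("HostID", "?")))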
Re: (Score:2)
Apparently not so safe.
Re: (Score:3, Insightful)
These *attacks* require the attacker to have the keys from a trusted computer. Is your linux secure if you give somebody the root pass? Is your house safe if you give a friend the keys? These "security" headlines are just clickbait.
Re: (Score:2)
JZ's response here: http://www.zdziarski.com/blog/... [zdziarski.com]
dropping some fact bombs on this conversation.
Re: (Score:3)
It's enough for a friend's PC to be compromised: one you connected your iPhone to once, a year ago, just to recharge the battery, and don't even remember now. Once his computer is compromised, your phone becomes compromised as well and vulnerable to remote attacks.
That's a rather different story from the one you described above.
Re: (Score:1)
Sure man, trivial. It happens to everybody every day of the week. Seriously, do you guys have a bit of common sense? If you have malware slurping the keys, the malware can already be slurping the synced data of the phone, which is the point of this attack. Why take a roundabout route to something you already have access to on the machine? For the lulz? And don't tell me there might be data on the phone that is not on the machine, because then I'd claim you wouldn't be syncing the phone in the first place, neither to Apple's iCloud nor to your own machine.
All the case scenarios you guys are painting are the equivalent of xkcd 538.
Um... hello there? XKCD 538 is important here. Just look at the Slashdot stories and you will see abuses left and right, from every single government out there.
Take the UK: under RIPA, a judge can ask a person for their password 30-50 times, and each "no" is 3-4 years in Her Majesty's prison system. Other places like Syria and much of the Middle East will answer a "no" with 240VAC to the regions of the body normally used for reproduction... and likely to family members too.
So, it is a big concern.
Re: (Score:1)
It's hardly "copying the keys". It's simply connecting the device to some PC and then finding yourself vulnerable to remote attacks. Once you are aware that something like that is possible, it of course makes sense to be careful, but otherwise, how would you even expect it to be possible? Especially if you're not tech-savvy? No sane security design should allow something like that, especially on things like mobile phones or tablets that are often connected to various other devices in various places.
Re: (Score:2)
No, you can't retrieve anything from my computers in that way even with my root password.
The encryption keys are my own and I have full control over them.
In the TFA case, Apple has control over your keys.
Re: (Score:2)
Except that no sane person allows remote root logins.
Re: (Score:1, Informative)
In the TFA case, Apple has control over your keys.
False. The private keys are unique to the phone and the paired device. The public keys are shared between the two when they are paired. Apple doesn't have the private keys (or the public keys, for that matter), and thus cannot read either side of the communication.
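As a rough illustration of that pairing model (a generic sketch only, not Apple's actual lockdown protocol): each side generates its own key pair and only the public halves are exchanged, so a party that never held either private key has nothing it can decrypt with or impersonate.

# Generic sketch of a pair-and-exchange model; NOT Apple's actual lockdown
# protocol, just the general idea that only public keys change hands.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def make_keypair():
    # Each side generates and keeps its own private key.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_pem = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    return private_key, public_pem

phone_private, phone_public = make_keypair()
host_private, host_public = make_keypair()

# Only the PEM-encoded public keys are exchanged during "pairing"; anyone
# who never saw the private halves cannot read either side's traffic.
print(phone_public.decode().splitlines()[0])
print(host_public.decode().splitlines()[0])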
Re: (Score:2)
"By default, the Root account password is locked in Ubuntu. This means that you cannot login as Root directly or use the su command to become the Root user. "
Perhaps you can define exactly what it means to "jailbreak" an iDevice? Seems you do something to gain "root" access? Remember when simply going to a website would root your phone?
Re: (Score:2)
sudo xterm or sudo mc is how I get a root prompt under Ubuntu
Re: (Score:2)
Nice to see the fanboy glossing over the issue...
Yes, you did gloss over the issue of Google being a Hipster marketing company.
Stallman was right (Score:5, Insightful)
These so-called "smart telephones" aren't telephones at all; they are computers. Computers that you cannot control. And if you aren't controlling them, who is?
Some folks thought Richard Stallman was crazy for saying no-one should run software or use hardware that is based on clandestine (proprietary, hidden) knowledge. This latest revelation is just one reason he was right all along.
Re:Stallman was right (Score:5, Informative)
Fortunately, if someone wants a "smartphone" that is under full control of the user, there are a few choices: the Openmoko Neo Freerunner, the OpenPhoenux GTA04, or the latest device in development, the Neo900 ( http://neo900.org/ ).
The last one even goes further and implements monitoring of some unavoidably closed parts, like the GSM modem. All of them have proper modem isolation, so the modem cannot access the main RAM; on most recent mainstream smartphones it can, which possibly renders any software encryption moot.
Re:Stallman was right (Score:4, Informative)
Not sure about that particular case, but there are some legal requirements that, I believe, entail controls that are not user-controllable. Things like frequency, signal encoding, etc. Those seem like reasonable constraints, so long as we aren't using spread spectrum, which, IIUC, is illegal.
Given that, modem isolation is probably the just and reasonable approach to take.
Re: (Score:2)
That is just obfuscation. Not making the baseband firmware open-source certainly makes it more annoying to mess with that stuff, but it certainly doesn't make it impossible. It is misguided in any case, as anybody can build a cell phone jammer/spoofer/cloner/whateverer out of their own parts running whatever code they want to.
But, I have no doubt that the clowns at the FCC would give any company that wanted to implement an open-source baseband a hard time, so even though it really doesn't make the device
Re: (Score:2)
Sure, but what about the service part? How do we know what happens to the data that goes into and out of these phones? Can we encrypt from our phones to the other end: voice communications, texting, etc.?
Re: (Score:2)
Let's not go off the deep end. Stallman is a lunatic...
Re: (Score:2)
Stallman is crazy. Even crazy people can be right about a few things here and there, but overall he's a zealot. The joke goes "even a stopped watch is right a couple times a day - though you need a second working watch to see when."
The Hurd has been under development since 1983. Three decades, and still no stable version [gnu.org]? When he started the HURD we didn't have the web, nor the Internet. If we had waited for Stallman to actually ship, we would have lost out on a lot (both good and bad, but mostly good).
Re: (Score:1)
There's only one operating system in existence today that is worthy of even a small degree of trust: OpenBSD.
OpenBSD is the only operating system I know of that is open source, continually undergoes rigorous review, and has developers who put security above all else.
Since OpenBSD is the only operating system that is anywhere close to being secure, the only type of secure mobile device would be one running OpenBSD. I'm not aware of any of those, so it's obvious that any device not running OpenBSD should be considered insecure to begin with.
I'm an OpenBSD user, but just remember that the software a computer runs isn't the only thing that can be doing evil things. Realistically, you're never completely safe; even if I decided to completely trust the OpenBSD developers, my NIC could be siphoning my data. I don't blindly trust them, of course, not that I have evidence they do evil things, but it's the best OS I can figure for me, since I'm not a programmer and can't write my own. That and it's so simple to configure compared to the
it's the future (Score:3, Insightful)
The more we buy devices whose master is someone else, the more things of this very nature will become a problem.
Do not buy devices that you do not control after you buy them. You must be able to run any kernel and any userspace you want, you must be able to control the machine top to bottom. If you give this up in exchange for convenience, then you will be taken advantage of by companies that don't have your interests at heart.
Re: (Score:2, Interesting)
You got modded down by Apple fans for telling the truth.
Re: (Score:2)
True words. Sadly, people consider things that are trendy or have more raw power as more valuable, even if they don't really need it. When someone actually comes up with a device that you can control (instead of it controlling you), all they hear is "meh, too slow", "too expensive", "no capacitive screen? are you joking?"
You would expect people to be more sensible than that, especially in the post-Snowden era.
Re: (Score:3)
Unfortunately, no, I wouldn't "expect people to be more sensible than that, especially in the post-Snowden era", even though this actually isn't the post-Snowden era. He's still around, and still occasionally releasing new tidbits.
I normally expect people to be short-sighted, and to have little memory of history. I regret that I'm rarely disappointed.
So... (Score:5, Insightful)
If you store sensitive stuff on your iPhone, don't make backups from it onto an insecure/unencrypted computer.
And if you were making backups from anything secure onto anything insecure, it is time to revise your security policy.
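If you do keep backups around, one belt-and-braces option is to encrypt the backup folder yourself before it lands anywhere you don't fully control. Below is a minimal sketch using Python's tarfile plus the third-party cryptography package; the paths are hypothetical placeholders, and turning on iTunes' own "Encrypt iPhone backup" option is the simpler route.

# Sketch: wrap a backup folder in a symmetrically encrypted archive before
# moving it to less-trusted storage. The paths are hypothetical; enabling
# iTunes' own "Encrypt iPhone backup" option is the simpler route.
import io
import tarfile
from cryptography.fernet import Fernet

BACKUP_DIR = "/path/to/MobileSync/Backup/DEVICE-UDID"  # hypothetical path
OUT_FILE = "backup.tar.gz.enc"
KEY_FILE = "backup.key"  # keep this somewhere separate and safe

key = Fernet.generate_key()
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    tar.add(BACKUP_DIR, arcname="backup")

with open(OUT_FILE, "wb") as out:
    out.write(Fernet(key).encrypt(buf.getvalue()))
with open(KEY_FILE, "wb") as kf:
    kf.write(key)
print("wrote %s; the archive is useless without %s" % (OUT_FILE, KEY_FILE))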
Re: (Score:2)
Also, turning off this behavior (where plugging a phone into a computer and pressing "OK", without any further authentication, allows siphoning) is pretty hard to do. You need to download a wonky piece of software called Apple Configurator to do it. It's usually meant for corporate/educational bulk deployments, and the UI shows this.
Article got it wrong (Score:5, Informative)
Almost all the reports are getting the gist of the paper wrong; any press summary that doesn't go into the paper to understand it will get it wrong. The paper goes into deep detail about several Apple services that, while protected by several layers of security that can be bypassed, can transfer data in the clear. There are also several services that don't have any obvious client software that connects to them.
It's a rather deep hacker-style dive into iOS.
A good video about this is by TWiT Network. At http://twit.tv/sn465 [twit.tv] Security Now ep 465 has expert Steve Gibson explain the actual paper.
Re: (Score:1)
You lost me when you said "expert Steve Gibson". If by "expert" you mean "shameless self-promoting security wannabe", then OK.
Re: (Score:3)
You lost me when you said "expert Steve Gibson". If by "expert" you mean "shameless self-promoting security wannabe", then OK.
No. These are examples of shameless, self-promoting wannabes:
https://en.wikipedia.org/wiki/... [wikipedia.org]
https://en.wikipedia.org/wiki/... [wikipedia.org]
Steve Gibson at least provides genuinely useful information most of the time and from what I can see does a decent job of teaching non-technical folks to understand and implement good security practices. He's a little hard to take in large doses when I've seen him on This Week in Tech and his website hurts my eyes, but I wouldn't paint him with such a broad brush. He doesn't seem
Re: (Score:2)
Hmm, like the AC joke below, I'm a bit torn when you said "Security Expert" for Steve Gibson. Aside from prodigious self-promotion, as far as actual security talent goes, Steve is both good and bad. I may listen to this one, because it is more in his wheelhouse: describing in easier terms a complicated subject previously researched and digested by someone else.
He's much less useful when making declarations of what to do - he's too enamored of assembly (which can lead to more security holes - the
Horribly Inaccurate (Score:1)
The "researcher" and Reuters forgot to clearly call out that for the information to be extracted with the developer tools an iOS device must be trusted. Trust is established by plugging the device into a computer and the device MUST be unlocked.
This is akin to giving someone you don't trust a key to your house.
Re: (Score:2)
Trusted by whom? I don't think there's any requirement that the purchaser of the device trust the "trusted" data extractor. IIUC it could become trusted before the customer ever received the device, or anytime it's in for service.
So this *probably* means that J. Random Hacker can't access the information, if the assertion is true. It doesn't say anything about Apple, their employees, or anyone they share information with... transitively.
Re:Horribly Inaccurate (Score:5, Insightful)
Trusted by whom? I don't think there's any requirement that the purchaser of the device trust the "trusted" data extractor. IIUC it could become trusted before the customer ever received the device, or anytime it's in for service.
Step 1: Plug iOS device into a Mac.
Step 2: Unlock iOS device.
Step 3: Click on YES when the iOS device asks if it should trust the computer.
The critical part is Step 2, which you can only perform if you know how to unlock the device. In other words, if you know the passcode. But if you know the passcode, then you can do _anything_ with the phone. That's what the passcode is there for.
So basically, this security "expert" found a way for a thief to enter my home through the backdoor, as long as the thief has the keys for my front door.
Re: (Score:2)
So basically, this security "expert" found a way for a thief to enter my home through the backdoor, as long as the thief has the keys for my front door.
This security "expert" [zdziarski.com] has a very solid background and street cred in the field of iOS forensics so I would not dismiss him so lightly.
Re: (Score:2)
I am not sure he was really dismissing him, just clarifying what he actually did with a useful analogy. I would not dismiss the GP so easily. He may not have the same background, but he is right. Right trumps background every time....
FUD (Score:3, Informative)
It only works with a trusted computer AND the device being unlocked.
If you gave your device PIN to someone, they already have your data and don't need to do this.
Re: (Score:2)
It's still a problem if users don't know about it and have no way to disable it.
Re: (Score:2)
Not sure why I am responding to an AC that is just clearly making crap up, but what the hell. So, in summary, you just made that crap up.
Expectation of privacy (Score:2)
Re: (Score:3)
If you're doing something incriminating, don't use paper either. Governments have spent literally centuries figuring out how to make a piece of paper spill its secrets.
At this point, expecting the government not to be able to get its hands on your data if it really wants to is pretty damn naive. Folks who think that are like the mid-17th-century folks who tried to skimp on telling the King their last names. They could make it work for the first few decades, but eventually the bureaucracy figured out the tec
Re: (Score:2)
That works, when you remember to do it.
Lots of these little privacy tips are like that. They work great, as long as you specifically remember to do them even when it's 3:30 AM and you've had a few too many to drink.
Wrong. (Score:1)
Already debunked.
Irresponsible post.
Apple's Admission? (Score:4, Informative)
When did Apple admit to anything? They said the researcher was wrong and described the settings that he found and what they are used for! I would trust Apple over Google any day! Eric Schmidt and his colleagues have lied so many times that the whole company isn't trustworthy!
http://support.apple.com/kb/HT6331
http://www.macrumors.com/2014/07/22/apple-ios-backdoors-support-document/
Nothing new here (Score:4, Informative)
iPhones have always been able to sync data out of their secure storage to the user's computer since launch. How did people think USB sync worked? Magical leprechauns that flew out of your phone carrying the data?
Heck, one of these is the developer daemon that runs on the phone to install apps from Xcode. Again, how exactly did people think Xcode did that?
These tools all require the phone be logged in, and that the right key exchange take place.
I can't tell if the "security researcher" here is just trolling, has never actually used an iPhone, or is just stupid.
Re: (Score:2)
The funny thing is, even the most naive user in the world would understand that giving their passcode to a government agent or to Apple would give them access to the data on that device. Sure. Should iOS have a way to change the encryption key? Sure, why not.
BlackBerry... (Score:5, Interesting)
and yes, full disclosure, I own a Z10. I also find it to be the best smartphone I've ever owned, with battery life that my Android friends can only dream about.
Re: (Score:1)
I own a Z10 as well. Not sure what you're raving about battery-wise. It will (barely) survive a weekend with basically no use, which is better than other phones but still not saying much.
Re: (Score:3)
BlackBerry, the company that routinely gives governments all over the world access to customers' "secure" data without the customers doing anything at all. Are you high? You must be high if you purchased a Z10.
Who's down with O.P.E? (Score:1)
Yeah, you know me! Why trust Other People's Encryption? If you encrypt data yourself, you control who can decrypt it - unless all crypto algorithms are compromised. When Google or Apple encrypt on your behalf, you don't really know what they're doing.
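A minimal sketch of that idea, encrypting data yourself with a key only you hold before it ever touches a sync service (a generic AES-GCM example using the Python cryptography package; nothing here is specific to Apple's or Google's services):

# Generic sketch of "encrypt it yourself before it leaves your hands".
# Nothing here is specific to Apple's or Google's services.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # only you hold this key
nonce = os.urandom(12)                      # must be unique per message
plaintext = b"contacts, photos, whatever you would normally sync"

ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
# Upload nonce + ciphertext; whoever stores it cannot read it without the key.
assert AESGCM(key).decrypt(nonce, ciphertext, None) == plaintext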
Re: (Score:2)
I am not sure what you mean; it is clear you are the one who did not read or comprehend either the article or the post you responded to.
Re: (Score:2)
Huh? You think the user unlocking their device with their pass code and then agreeing to trust a computer the device is physically connected to is the same as a backdoor?