
FBI Asks Apple To Help Unlock Two iPhones (nytimes.com) 134

An anonymous reader quotes a report from The New York Times: The encryption debate between Apple and the F.B.I. might have found its new test case. The F.B.I. said on Tuesday that it had asked Apple for the data on two iPhones that belonged to the gunman in the shooting last month at a naval base in Pensacola, Fla., possibly setting up another showdown over law enforcement's access to smartphones. Dana Boente, the F.B.I.'s general counsel, said in a letter to Apple that federal investigators could not gain access to the iPhones because they were locked and encrypted and their owner, Second Lt. Mohammed Saeed Alshamrani of the Saudi Royal Air Force, is dead. The F.B.I. has a search warrant for the devices and is seeking Apple's assistance executing it, people familiar with the investigation said.

Apple said in a statement that it had given the F.B.I. all the data "in our possession" related to the Pensacola case when it was asked a month ago. "We will continue to support them with the data we have available," the company said. Apple regularly complies with court orders to turn over information it has on its servers, such as iCloud data, but it has long argued that it does not have access to material stored only on a locked, encrypted iPhone. Before sending the letter, the F.B.I. checked with other government agencies and its national security allies to see if they had a way into the devices -- but they did not, according to one of the people familiar with the investigation.
"The official said the F.B.I. was not asking Apple to create a so-called backdoor or technological solution to get past its encryption that must be shared with the government," the report adds. "Instead, the government is seeking the data that is on the two phones, the official said."

"Apple has argued in the past that obtaining such data would require it to build a backdoor, which it said would set a dangerous precedent for user privacy and cybersecurity." Apple did not comment on the request.
Comments Filter:
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Wednesday January 08, 2020 @09:10AM (#59598704)
    Comment removed based on user account deletion
    • by hey! ( 33014 )

      Because that's not necessarily true. The data is still there, but Apple doesn't know what it means. That does NOT, however, mean Apple couldn't find out.

      The key to decrypt that data still exists, in the "security enclave" of the iPhone processor. While it may not be practical for a third party to get at that key without physically breaking into the CPU package, Apple may have a way in ... by software updates. After all, it's the phone's firmware that controls access to the key, not the owner.

      • Any firmware update that overrides the key in some way and lets the FBI in is a backdoor. Once this backdoor exists, the FBI will demand to have the code for it so they can break into iPhones without asking Apple. Then, two bad things happen. One, we need to trust the FBI to only use this in appropriate circumstances and never abuse it. (If you believe this wouldn't be abused at all, then I've got a bridge to sell you.) Two, the backdoor will leak out in some way. Once it does, all manner of foreign governments and random hackers will get their hands on it.

        • Heck - you really think it would have to get into the hands of the FBI before it leaked?

          If a backdoor exists, the key will inevitably spread. Not even Apple can prevent 100% of leaks.

        • by hey! ( 33014 )

          Encryption doesn't quite work that way. You can't "override the key". But you *can* grant access to it, or give other people the use of it.

          I'm not saying that's right or wrong, I'm saying don't take the simplistic model of the *intended effect* for what *actually is possible*. The difference between those things is where security always fails.

        • Any firmware update that overrides the key in some way and lets the FBI in is a backdoor. Once this backdoor exists, the FBI will demand to have the code for it so they can break into iPhones without asking Apple. Then, two bad things happen. One, we need to trust the FBI to only use this in appropriate circumstances and never abuse it. (If you believe this wouldn't be abused at all, then I've got a bridge to sell you.) Two, the backdoor will leak out in some way.
          • by Pieroxy ( 222434 )

            foreign governments and random hackers

            It doesn't have to leak. Once it exists, any and all governments will ask for it. Look at the recent Russian law to force a backdoor onto mobile phones.

            Should random hackers also ask for it?

        • I wish I hadn't already used my mod points. I would mod you up.

          The following does not apply to YOU, the poster I am replying to. Why idiots on this board don't immediately understand this, and instead make up arguments about nefarious users and evil Apple without actually having any knowledge, is beyond me.

      • Not the way the security enclave works. What the FBI asked Apple to do last time was to give the phone a special update that lowers its security settings, specifically around passcode attempts: first override the setting that wipes the phone after 10 failed attempts, and then change the time between attempts to zero. After consecutive failed attempts the phone makes you wait longer and longer, until the delay settles at an hour. This made a brute-force attack impractically long.
        • by hey! ( 33014 )

          If the user is locking with a PIN, brute force is a practical attack. That's why there's an attempt limit after which the secret key is wiped. Change that, and the phone is as good as unlocked.

          • by larwe ( 858929 )
            How do they get a firmware update onto the phone without first unlocking the phone?
          • Actually, the passcode-checking code is designed to always take 80ms per passcode attempt, independent of the speed of the iPhone, and the check can only be performed on that phone. So 80ms per guess is an unbeatable lower bound for brute-forcing the passcode.

            So even if all other protection were completely broken, a six-digit passcode would take about a day to crack, eight digits about three months, and ten digits about 25 years; and you can also use an alphanumeric passcode.
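
            As a quick sanity check on those figures, here is a back-of-the-envelope sketch in Python, assuming only the 80ms-per-attempt floor described above; the numbers are illustrative, not Apple's published figures.

              # Back-of-the-envelope exhaustion times at the 80 ms/attempt floor
              # described above; a sketch, not Apple's published figures.
              ATTEMPT_SECONDS = 0.080   # hardware-enforced minimum per passcode try

              def worst_case_seconds(search_space):
                  """Seconds to try every passcode at 80 ms per attempt."""
                  return search_space * ATTEMPT_SECONDS

              for label, space in [("6 digits", 10**6),
                                   ("8 digits", 10**8),
                                   ("10 digits", 10**10)]:
                  secs = worst_case_seconds(space)
                  print(f"{label}: {secs / 86400:.1f} days "
                        f"({secs / 31_536_000:.1f} years)")
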
            • by hey! ( 33014 )

              Sure, and what is the chance, do you think, that the user chose an eight-character alphanumeric code that's not in a dictionary?

              • Pretty high. The problem is that people can't easily remember anything longer than 8 characters and so reuse passwords. 8 characters is pretty weak when the forced time between attempts is not high. Using a GPU as a cracker, it can take maybe an hour, because the limit is the speed of the computer components' I/O: CPU, memory, etc.
                • You apparently didn't see the unbreakable 80ms-per-attempt lower limit designed into the HARDWARE of the Secure Enclave.

                  • You apparently didn't read what I was responding to. The poster asked how unique 8-character passwords could be. I responded that they could be unique, but that's not the problem. 8 characters is easily breakable these days with GPUs, with the limit being the computer's I/O. In the case of the iPhone, brute-force attacking a 6-digit or 8-character passcode is not practical, as Apple has restricted the I/O to limit the attempts. It doesn't matter if your GPU can guess billions of passcodes per second if the ph
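
                    To put rough numbers on that contrast, here is an illustrative Python sketch comparing a hypothetical offline GPU rate of a billion guesses per second (an assumption, not a benchmark) against the on-device 80ms floor for an 8-character alphanumeric code.

                      # Rough contrast between an offline GPU guess rate and the
                      # on-device 80 ms floor. The 1e9 guesses/sec figure is an
                      # illustrative assumption, not a benchmark of any real GPU.
                      import string

                      charset = len(string.ascii_letters + string.digits)  # 62
                      space_8 = charset ** 8        # every 8-char alphanumeric code

                      gpu_rate = 1e9                # assumed offline guesses/second
                      device_rate = 1 / 0.080       # on-device: 12.5 guesses/second

                      print(f"offline GPU: {space_8 / gpu_rate / 3600:.0f} hours")
                      print(f"on-device:   {space_8 / device_rate / 31_536_000:.0f} years")
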
          • What do you mean “after the secret key is wiped”? I’m not sure that is how the enclave works.
            • by hey! ( 33014 )

              I've read Apple's white paper on the enclave. The encryption key is held in the enclave, with no direct access possible. This makes it possible to "wipe" the phone without having to actually write to all the bits in RAM. You just wipe the key, and all the information in RAM becomes permanently unrecoverable, unless you can somehow reconstruct the key.

              • That makes no sense, as wiping the main key basically makes the chip useless for decrypting that data. There would be no need for further attempts after the key is wiped, as the main key is used to generate, directly and indirectly, all the other keys. The only use would be if you cared about reusing the phone rather than the data, which you've actually made harder by not wiping the whole device.
                • by hey! ( 33014 )

                  Yes. That's the point. If you want to wipe your phone fast (e.g. remotely after it was stolen), it's practically instantaneous.

                  • There are two settings to overcome: 1) wipe after 10 failed attempts and 2) increase the time between attempts to an hour after so many failures. The first one makes a brute-force attack nearly impossible unless the passcode is miraculously guessed in the first 10 attempts. The second one requires 114 years to go through the combinations of a 6-digit passcode, which makes it impractical. 114 years is not a practical time for a brute force; it is possible but not practical.
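
                    For reference, the 114-year figure is straightforward arithmetic, assuming a flat one-hour delay per attempt once the lockouts max out:

                      # Arithmetic behind the "114 years" figure, assuming one attempt per hour.
                      combinations = 10 ** 6        # all 6-digit passcodes
                      hours = combinations          # one attempt per hour
                      years = hours / (24 * 365)
                      print(f"{years:.0f} years")   # ~114 years
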
          • Except that an extremely security-conscious iOS user can enter an up to 53-character alphanumeric/symbol passphrase. That pretty much negates brute-forcing. Add 2FA to that, and it's pretty much Game Over for brute forcing. And with the owner dead, that eliminates the $5 wrench "decryption method".

      • Why are they asking Apple for help? They could crack the AES-256 encryption used to encrypt the data. I realise that's really difficult, but they could ask Vincent Rijmen and Joan Daemen for help.

        /sarcasm
        • Just get Uri Geller on the case. He reckons he's erased secret discs on their way to Moscow, so the key on an iPhone right in front of him should be no problem. I'm surprised the FBI haven't asked already, considering they are apparently such big fans of his.

          https://news.sky.com/story/uri... [sky.com]
      • by mysidia ( 191772 )

        While it may not be practical for a third party to get at that key without physically breaking into the CPU package

        This shouldn't be off the table... Providing a procedure to reliably get the key after physically breaking open the CPU package ought to be an acceptable answer, even if that means a deep analysis of the chip under an electron microscope may be required.

        Apple may have a way in ... by software updates. After all, it's the phone's firmware that controls access to the key

        Well; clearly, if there

        • Apple should not be allowed to use "preserving security" as a defense for failing to provide a procedure that uses physical access to get at the encrypted form of all persistent data stored encrypted on the chips, the non-encrypted form of all persistent data physically stored non-encrypted on the chips, and full documentation of all encryption used for encrypted persistent storage on the physical device.

          Sorry.

          The policeman's job is only easy in a Police State.

          If you like that, Uncle Putin will welcome you with open arms, Comrade.

          Most of the rest of us Americans still hold dear the few freedoms we still have.

        • Apple should not be allowed to use "preserving security" as a defense for failing to provide a procedure that uses physical access to get at the encrypted...

          Really? Why "should" not? Do you have some ethical or moral principle behind that assertion?

          What you're proposing is just the sort of "back door" security experts warn against. Yes, there are legitimate uses for such a procedure. Problem is, if the procedure exists, someone will exploit it. And that's the tradeoff: how likely is it we'll use the back door for good versus how likely will it be used for evil?

          50 years of computer security research and development says that the only way to make a reasonably trus

          • by mysidia ( 191772 )

            Really? Why "should" not? Do you have some ethical or moral principle behind that assertion?

            Yes: compliance with lawful warrants. Just the same way a safe manufacturer provides law enforcement experts with the technical details required to physically defeat its safes, if they have a search warrant and the owner cannot or will not provide the combination that unlocks the safe.

            What you're proposing is just the sort of "back door" security experts warn against. [...]

            No, it's not at all.

      • We could make a similar argument about the suspect's brain. I wonder if we will ever consider it ethical in the future to put the head of a perp on ice in order to harvest what data we can.
    • The data on the phone died with the owner's brain. Why can't any of those authoritarian thugs accept the concept?

      Because it really doesn't die, any more than a hardcopy version would "die" when its creator dies. Just because it is a bunch of 0's and 1's doesn't make it somehow different and subject to a different set of rules. How and when it is reasonable to allow authorities access to information, hardcopy or electronic, is a separate issue from the information's format.

      I personally am glad Apple built strong encryption and security measures into the iPhone; but am not under the illusion the data i

      • Comment removed based on user account deletion
        • Yes, the data for sure is a hard copy. But the means to decipher it died with the brain. They can print out all those 1's and 0's to their hearts' content. Good luck figuring out what the hell it means.

          Hardly. Apple conceivably can turn the information on the phone into a usable format; after all, that is exactly what the phone does.

          • Except that’s not how the phone is designed. Every single file on the drive is encrypted with a 256-bit key that resides in the Secure Enclave which is on the hardware. So removing the drive is useless. Tampering with the Secure Enclave could wipe it out which means those files are essentially useless.
            • Except that’s not how the phone is designed. Every single file on the drive is encrypted with a 256-bit key that resides in the Secure Enclave which is on the hardware. So removing the drive is useless. Tampering with the Secure Enclave could wipe it out which means those files are essentially useless.

              My point is Apple could conceivably access the data without damaging it, since they control the OS. It has nothing to do with removing the drive, just accessing the data on it. Whether they would is another question.

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Wednesday January 08, 2020 @11:01AM (#59599008)
      Comment removed based on user account deletion
      • there's no reason to suppose that, just because evidence was obtained from a dead terrorist's iPhone, the people brought to justice would be (1) tortured, (2) have their due process rights violated, or (3) be denied a fair trial.

        And I would argue that the possibility of those people being "brought to justice" is not worth the risk associated with a precedent of Apple being able to decrypt device data when asked by anyone.

        Like it or not, law enforcement has a perfect right to investigate crimes and bring those responsible to justice.

        That right is not being interfered with. They are free to do whatever they like with the phones in their possession. Fundamentally "rights" do not require anyone else to do anything. In fact the free exercise of rights requires people or organizations to not do things. For instance, I have a right to be secure in

        • by larwe ( 858929 )
          "And I would argue that the possibility of those people being "brought to justice" is not worth the risk associated with a precedent of Apple being able to decrypt device data when asked by anyone."

          The horrible problem here is, the only thing that is restraining Apple from opening up that data isn't a technical design that makes the data actually safe; it is a business decision. Not a justice decision, not a safety decision, not a moral "we need to make the world a better place" decision, but purely a *BU

          • by Big Boss ( 7354 )

            That's the argument the government and law enforcement use. But is it accurate? Based on the whitepapers on the secure enclave and the security design they used, it's debatable at best. If you are designing something like this, you have to assume that if YOU can get the data, an attacker can. So you have to at least attempt to prevent that. The keys are held in a secure part of the chip that physically limits access attempts and tries to detect attacks and wipe itself if it thinks it's being attacked. They

            • by larwe ( 858929 )
              Isn't the argument the government is using "You have to give us a way in, and people need to accept the security risks inherent in that"? I agree with you that there is - probably - no way AAPL can unlock an individual phone (though who knows - maybe there's a JTAG interface they can use to flash firmware onto the phone that will allow brute-force password attacks). But they can certainly a) provide the aforementioned permanent backdoor/escrow (yey Clipper chips, remember?), and/or b) make future versions o
        • by Zak3056 ( 69287 )

          Fundamentally "rights" do not require anyone else to do anything.

          I'm offtopic here, but the above captures the current culture war perfectly. Some people have redefined rights from "things you cannot keep me from doing" to "things you have to do FOR me." Use my preferred pronouns, stop talking because I don't want to hear what you have to say, pay off the debt I've accrued, etc. With that context, it's easy to see how comments like yours come to be.

          Also, to dogpile on your response to the above (which was quite well said!) "law enforcement" (i.e. the government) does

      • No one is saying they were angels. Apple is saying that, as a matter of principle, they don't have to help the FBI in this way. They have turned over everything they have, but the FBI wants to cross some lines. If the FBI gets it, then Russia, China, and Saudi Arabia will come asking for it.
        • Re: (Score:3, Informative)

          by SCPRedMage ( 838040 )

          Actually, Apple is saying as a matter of principle, they designed the iPhone so they can't help the FBI; the FBI is asking for help decrypting the data on the phone, which requires the key that's only stored on the phone itself, and the phone is designed to prevent access to said key. The FBI is asking for something Apple does not have and cannot get.

          Apple can't help them with this phone, and they're refusing to change the software so they can help them with future cases.

        • Apple is saying as a matter of principle they don't have to help the FBI in this way.

          As a matter of principle, Apple should be saying they can't help the FBI in this way. "Sorry, the encryption requires the key that only the owner knew. We can't crack it any more than you can."

          • My point of clarification is that Apple had already handed over any iCloud data they have access to. Breaking the phone encryption is not something they are willing to do.
      • Like it or not, law enforcement has a perfect right to investigate crimes and bring those responsible to justice.

        And Apple apparently is on board with that, in that they say they cooperate by handing over data in Apple's possession.

        Apple also consistently says it has no ability to unlock a phone, a device not actually in Apple's possession. That's what is annoying: why does the FBI keep coming back, essentially asking "OK, I know you couldn't unlock the last dozen phones we asked you about, but how about this one?" No, they can't, and nothing has changed since the last request.

        • why does the FBI keep coming back

          To keep the pressure on. Those in the FBI who actually understand what's going on want Apple to change things so they can help. The constant stream of requests is a way to pressure Apple to do this.

          • by hjf ( 703092 )

            Yes. Once they reach critical mass, they will go on TV saying "iPhones are the #1 devices used by criminals to commit crimes, because they can't be investigated by law enforcement. Apple is supporting criminals. ISIS is using Apple devices." Throw in some "child predator" in the mix, and you've got yourself a case for public outcry.

            Apple, WHY DO YOU HATE AMERICA SO MUCH!

            Just wait.

        • by tlhIngan ( 30335 )

          Apple also consistently says it has no ability to unlock a phone, a device not actually in Apple's possession. That's what is annoying: why does the FBI keep coming back, essentially asking "OK, I know you couldn't unlock the last dozen phones we asked you about, but how about this one?" No, they can't, and nothing has changed since the last request.

          Because eventually it'll work. Not by wearing Apple out, but by forcing the public to be on "their side".

          It doesn't have to work on this phone. Or the next dozen.

      • The FBI already has his call records, so they know everyone he communicated with. The NSA (probably) logs everything (phone/text/email) so they can already get everything the phone contains.
      • The people who are being investigated are being investigated for violent crimes, not for showing opposition to the government, and virtually the only reason the FBI has for trying to obtain this information is to determine who else was involved and to either stop subsequent attacks and/or bring those responsible to justice.

        Completely irrelevant.

      • Very recently the Inspector General's Report on 2016 FBI Spying revealed a scandal of historic magnitude [theintercept.com], exposing not only the FBI but also the media (namely corporate media, which publishes stories that are often uncritically repeated on sites like /.). The language of "authoritarian thugs" is putting it lightly, given the FBI's record.

        This is a test case not just for Apple but for the public: will the public stand in defense of a software proprietor whose software is unavailable for inspection, improvement, an

    • The data on the phone died with the owner's brain. Why can't any of those authoritarian thugs accept the concept?

      Data doesn't die, and there's nothing wrong with a phone search after obtaining a warrant from a judge, much like a house search and/or a financial background check.

      If we were talking about authoritarian thugs this shooting probably wouldn't have happened in the first place.

    • The data on the phone died with the owner's brain. Why can't any of those authoritarian thugs accept the concept?

      (Emphasis mine.) I'm pretty pro-privacy. But the previous time this came up (San Bernardino mass shooter), the pro-privacy crowd and pro-Apple crowd cheering Apple on failed to realize or deliberately ignored that the iPhone in question belonged to the San Bernardino County government. It was a work phone, purchased and owned by the county, and assigned to the shooter. Apple refused to help the ph

      • Comment removed based on user account deletion
      • Being the owner of a device does not give you the right to compel the manufacturer to redesign the code running that device with new capabilities that will harm other owners of said device.
      • None of what you typed means anything.

        Apple should not build backdoors into their devices. Apple should not work to crack their devices. Apple should not do anything, in any way, to compromise the security of their devices.

  • all for show (Score:2, Insightful)

    by v1 ( 525388 )

    The FBI has always had access to the data on encrypted phones. "physical access trumps most security", there's nothing new there. In the case of the iPhone, the encryption is solid, but the 4-6 digit passcode renders it worthless if you have physical access.

    What the FBI is doing here is trying to encourage the big threats to store data on their devices that the FBI can rifle through after they commit a mass shooting or something. They don't need Apple to assist them with anything, and they do not want Ap

    • Re:all for show (Score:5, Informative)

      by AmiMoJo ( 196126 ) on Wednesday January 08, 2020 @09:55AM (#59598772) Homepage Journal

      No, they don't have access.

      The encryption key is stored in a secure chip. It has physical security so that if you try to tamper with it there is a very good chance you will destroy the key.

      The firmware requires the passcode. If you get it wrong 10 times it will erase the key permanently.

      The only way in is to find a flaw in the firmware. This is not unique to Apple either, these kinds of chip are common now.
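
      For anyone unfamiliar with how such a limiter behaves, here is a highly simplified, hypothetical Python model of the retry-and-wipe policy described above; the delay values are made up for illustration, and this is not Apple's actual Secure Enclave firmware.

        # Hypothetical model of the retry-and-wipe policy described above
        # (escalating delays, then key erasure after 10 failures).
        # Illustrative only; not Apple's actual Secure Enclave firmware.
        import time

        MAX_ATTEMPTS = 10
        # Delay (seconds) after the Nth failure; the last value repeats.
        DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]

        class EnclaveModel:
            def __init__(self, passcode, key):
                self._passcode = passcode
                self._key = key          # file-encryption key; None once wiped
                self._failures = 0

            def try_unlock(self, guess):
                if self._key is None:
                    raise RuntimeError("key erased; data unrecoverable")
                if guess == self._passcode:
                    self._failures = 0
                    return self._key     # caller can now decrypt storage
                self._failures += 1
                if self._failures >= MAX_ATTEMPTS:
                    self._key = None     # wipe: storage stays ciphertext forever
                    raise RuntimeError("too many failures; key erased")
                time.sleep(DELAYS[min(self._failures - 1, len(DELAYS) - 1)])
                return None
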

      • No, they don't have access.

        The encryption key is stored in a secure chip. It has physical security so that if you try to tamper with it there is a very good chance you will destroy the key.

        The firmware requires the passcode. If you get it wrong 10 times it will erase the key permanently.

        The only way in is to find a flaw in the firmware. This is not unique to Apple either, these kinds of chip are common now.

        Here's Google's perspective [googleblog.com], and Google is pushing [android.com][*] the rest of the Android ecosystem in the same direction.

        [*] See last paragraph of section 9.11.2, in particular the last sentence of that paragraph.

      • by idji ( 984038 )
        Can't they copy the bits elsewhere and brute-force the key? Or probe the key with a tunneling microscope?
    • Re: (Score:2, Informative)

      by Anonymous Coward
      The key on the chip is only half of it; the other half is the passcode. The hardware part of the key is created by dropping carbon nanotubes onto a grid of connectors - some connect and some are open, making it more random than an RNG. This means that inspecting the chip with something like an electron microscope has a good chance to destroy the nanotubes, making the key unrecoverable before you
    • The FBI has always had access to the data on encrypted phones. "physical access trumps most security", there's nothing new there. In the case of the iPhone, the encryption is solid, but the 4-6 digit passcode renders it worthless if you have physical access.

      You need to read before Posting.

      You can set several more secure options, including an up to 52 (IIRC) character alphanumeric passphrase, erase after 10 attempts, remote wipe, etc.

      Read first, then Post:

      https://support.apple.com/en-g... [apple.com]

  • Guess the newer iPhones are a much tougher nut to crack.
    And who knows, maybe the phone is NOT using a 4- or 6-digit passcode; long alphanumeric/symbol passphrases are also possible...

    • I haven't kept up with the details, but it seems that Apple patched whatever flaw Cellebrite was using. Their expensive phone-breaking machines are worthless for newer iPhones unless Cellebrite updates them.
  • "The official said the F.B.I. was not asking Apple to create a so-called backdoor or technological solution to get past its encryption that must be shared with the government," the report adds. "Instead, the government is seeking the data that is on the two phones, the official said."

    So... how do you get to the data without a backdoor, given that the owner of the phone is deceased? That's like a burglar claiming they didn't want to break the window or door on a house, they just wanted access to the liquor inside.
    • This was their sly way of saying "If you have a crack, please use it, and hand us a decrypted image of the phone, please". Meaning they would be satisfied with just a copy of the decrypted data, and were not pressuring Apple to give up the crack methodology or firmware to them, just the data.

      There is more likelihood that Apple, if the crack existed, would perhaps use it internally but not have to give it up. However, it may be a trap - once Apple used it, that confirmed that there IS a way into the newer

  • by rmdingler ( 1955220 ) on Wednesday January 08, 2020 @10:03AM (#59598798) Journal

    There are companies that do this for a living. Cellebrite [wired.com] notoriously cracked the San Bernardino shooter's phone.

    tldr, 06/19:

    "On Friday afternoon, the Israeli forensics firm and law enforcement contractor Cellebrite publicly announced a new version of its product known as a Universal Forensic Extraction Device or UFED, one that it's calling UFED Premium. In marketing that update, it says that the tool can now unlock any iOS device cops can lay their hands on, including those running iOS 12.3, released just a month ago. Cellebrite claims UFED Premium can extract files from many recent Android phones as well, including the Samsung Galaxy S9."

    Last year, Apple upgraded security to cripple another company's decryption product, Grayshift's GrayKey device, but if the ever-evolving tech is attainable by multiple for-profit companies on the free market, the FBI most likely has it, too.

  • Comment removed based on user account deletion
    • by EvilSS ( 557649 ) on Wednesday January 08, 2020 @10:49AM (#59598936)

      You don't have to break the encryption. You can pass a law outlawing any features which cause the phone to self-wipe after X failed attempts unless the phone is under MDM.

      Sure, and I could retire selling my old, un-updated iOS devices then.

      Son, if they have physical access to your phone it's either gone or you're about to get a lesson via $5 wrench on why encryption has far more limited ability to protect you than you might like to believe.

      That only works if you aren't already dead.

      • by hjf ( 703092 )

        Sure, and I could retire selling my old, un-updated iOS devices then.

        So they will pass a law banning all IMEIs of older, now illegal, iPhones.

  • ... that made a living breaking into iPhones? The police were paying for just this type of forced entry by an Israeli firm [timesofisrael.com].

    Israeli tech company says it can unlock all iPhones ever made, some Androids
    Cellebrite, believed to be the company hired by the FBI to hack into San Bernardino killer’s phone in 2016, has faced criticism for hiding phones’ vulnerabilities from Apple
    By TOI staff, 17 June 2019, 2:17 pm

  • In my experience, an Apple ID password can be reset by having Apple send a reset to the associated email address. So shouldn't the FBI try to gain control of the deceased's email accounts, first? Then they wouldn't even need to talk to Apple - aside from the automated "talking" to get the password reset...
