Security Hardware Technology

Secret Chips in Replacement Parts Can Completely Hijack Your Phone's Security (arstechnica.com) 62

Dan Goodin, writing for ArsTechnica: People with cracked touch screens or similar smartphone maladies have a new headache to consider: the possibility the replacement parts installed by repair shops contain secret hardware that completely hijacks the security of the device. The concern arises from research that shows how replacement screens -- one put into a Huawei Nexus 6P and the other into an LG G Pad 7.0 -- can be used to surreptitiously log keyboard input and patterns, install malicious apps, and take pictures and e-mail them to the attacker. The booby-trapped screens also exploited operating system vulnerabilities that bypassed key security protections built into the phones. The malicious parts cost less than $10 and could easily be mass-produced. Most chilling of all, to most people, the booby-trapped parts could be indistinguishable from legitimate ones, a trait that could leave many service technicians unaware of the maliciousness. There would be no sign of tampering unless someone with a background in hardware disassembled the repaired phone and inspected it. The research, in a paper presented this week (PDF) at the 2017 Usenix Workshop on Offensive Technologies, highlights an often overlooked disparity in smartphone security. The software drivers included in both the iOS and Android operating systems are closely guarded by the device manufacturers, and therefore exist within a "trust boundary."
This discussion has been archived. No new comments can be posted.

  • Phone manufacturers (Score:4, Interesting)

    by Dan East ( 318230 ) on Friday August 18, 2017 @10:54AM (#55040595) Journal

    I wonder which phone manufacturers sponsored this FUD. Technically possible? Sure. Any evidence it has ever occurred in the wild? No. Would this sort of malicious hardware have to transmit data in some way to offload the stolen information, thus raising alarms in various corporate type networks and the like? Eventually.

    • How can a screen or digitizer communicate to the outside world? It likely isn't on a bus where it can ask the radio or NIC to packetize stuff it feels like. At best, it can record taps on a screen, but getting those out would be a different story. Perhaps for physical snooping where the device is captured later on (say to glean someone's PIN), but for a remote attacker, it isn't that feasible.

      • If it knows what touches to simulate, then it can use any of a number of ways of doing so when the user isn't actively handling their phone (e.g. after it has been set down, but not yet auto-locked).
        • If it knows what touches to simulate, then it can use any of a number of ways of doing so when the user isn't actively handling their phone (e.g. after it has been set down, but not yet auto-locked).

          But unless it knows exactly WHAT App is receiving those "taps" (which the display and digitizer most assuredly do NOT), so the fuck what?

      • How can a screen or digitizer communicate to the outside world? It likely isn't on a bus where it can ask the radio or NIC to packetize stuff it feels like. At best, it can record taps on a screen, but getting those out would be a different story. Perhaps for physical snooping where the device is captured later on (say to glean someone's PIN), but for a remote attacker, it isn't that feasible.

        Right. Because the user taps in several places on the screen; but unless the display/digitizer is privy to exactly WHAT App is running in the foreground, those taps and swipes are USELESS outside of the phone.

        FUD.

        • by tlhIngan ( 30335 )

          Right. Because the user taps in several places on the screen; but unless the display/digitizer is privy to exactly WHAT App is running in the foreground, those taps and swipes are USELESS outside of the phone.

          FUD.

          No, it's not. Because if you log where you touch on the screen and where you swipe, you can probably figure out what's going on.

          Look at the lock screen on your phone, and your keypad is probably laid out like every other keypad out there. In fact, it looks remarkably like the phone keypad too (if

          • Right. Because the user taps in several places on the screen; but unless the display/digitizer is privy to exactly WHAT App is running in the foreground, those taps and swipes are USELESS outside of the phone.

            FUD.

            No, it's not. Because if you log where you touch on the screen and where you swipe, you can probably figure out what's going on.

            Look at the lock screen on your phone, and your keypad is probably laid out like every other keypad out there. In fact, it looks remarkably like the phone keypad too (if you're using a PIN). So any succession of taps in that region of the screen with the relatively wide spacing may be either a phone number, or the PIN code to unlock your phone.

            Ditto with the keyboard - if you're making a bunch of taps in the lower 1/3rd of the screen, I don't need to know what you're running in order to guess you might be typing something. If I record the locations of the taps and then try to play them back with various scalings of the keyboard, I might be able to recreate what you typed.

            Heck, I might log when the touch screen chip is turned off, so I can tell that when you power it up your screen is probably locked, and note the next few taps and swipes.

            I'm sorry; my lock screen doesn't have a keypad, nor do I swipe to unlock.
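
            [Editor's note: a minimal sketch of the inference described in the quoted comment above. It assumes a hypothetical 3x4 lock-screen keypad occupying the lower part of a 1080x1920 screen; the keypad bounding box, coordinates, and layout are illustrative guesses, not figures from the paper.]

                # Hypothetical keypad layout: digits 1-9 in three rows, 0 centered in a fourth row.
                KEYPAD = [
                    ["1", "2", "3"],
                    ["4", "5", "6"],
                    ["7", "8", "9"],
                    [None, "0", None],
                ]

                # Assumed bounding box of the keypad on screen: (x0, y0, x1, y1).
                KEYPAD_BOX = (90, 1000, 990, 1800)

                def tap_to_digit(x, y, box=KEYPAD_BOX, grid=KEYPAD):
                    """Map one logged (x, y) tap to a keypad digit, or None if it falls outside the box."""
                    x0, y0, x1, y1 = box
                    if not (x0 <= x <= x1 and y0 <= y <= y1):
                        return None
                    col = min(int((x - x0) / (x1 - x0) * 3), 2)  # 3 columns
                    row = min(int((y - y0) / (y1 - y0) * 4), 3)  # 4 rows
                    return grid[row][col]

                def taps_to_pin(taps):
                    """Turn a logged tap sequence into a candidate PIN string."""
                    digits = (tap_to_digit(x, y) for x, y in taps)
                    return "".join(d for d in digits if d is not None)

                # Four taps logged right after the touch controller woke up.
                print(taps_to_pin([(250, 1100), (550, 1300), (850, 1500), (550, 1700)]))  # prints "1590"

            The same replay-with-scaling idea applies to an on-screen keyboard: a logger that only sees coordinates can still test them against a handful of common layouts offline.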

    • Not only is it FUD, but it could be done with brand-new phones. Thousands of people have access to the supply chain and at any point could pull inventory, modify/replace the original parts, and swap them back in. The fact is that there is no reasonable commercial incentive for the random repair person at a store to spy on the random customer that has his screen replaced, and it would be super simple to catch the responsible party. Talk about hard evidence!

      • if you're a front for organized crime, then it's an easy way to hijack people's phones

        • But like I said, super easy to trace back and with tons of hard evidence, both on-premises and in all of your victim's hands. Plus the circumstantial evidence of all the ripped off people having a common experience of using your business. Much better off with a software hack. I mean, you have the person's phone in your possession... why not just install whatever software that you want - where there is at least some degree of plausible deniability?

          • But like I said, super easy to trace back and with tons of hard evidence, both on-premises and in all of your victim's hands. Plus the circumstantial evidence of all the ripped off people having a common experience of using your business. Much better off with a software hack. I mean, you have the person's phone in your possession... why not just install whatever software that you want - where there is at least some degree of plausible deniability?

            Good luck doing that with a signed OS, like iOS... You can't even install a LEGIT, but no-longer-signed, version of iOS; let alone some backroom-hacked FrankenWare version.

            • OK, but if you can't do it with physical access to the phone, then a screen hack also won't help you.

          • The hardware is not something a small-scale actor would be able to create. At the moment this takes nation-state, and maybe organized-crime or Teamsters, levels of involvement to create a hack and then re-close the phone. Done properly, the hardware would be selective about the data it captures and when it sends it. And if you're a bad-actor shop, you only install the spy hardware on select victims' phones and use legitimate hardware on others. You limit your footprint. If you are a bad actor on t
    • by WD ( 96061 ) on Friday August 18, 2017 @11:15AM (#55040711)

      Perhaps you're not familiar with how security research works. Stopping at "is this being exploited in the wild now?" is shortsighted.

      For some background, read:
      https://blog.osvdb.org/2017/08... [osvdb.org]
      (about "L0pht, Making the theoretical practical since 1992.")

    • Dumb question here. Why do we trust Apple or Samsung parts more than Huawei?

      • In theory, Apple and Samsung have a lot more to lose. Apple especially, since their reputation as a phone provider rides on how secure their devices are, and if something is discovered, there are many rivals who will be happy to capitalize on the loss. Samsung, similar.

        Huawei? Not as much, as they are in a different market segment.

        • In theory, Apple and Samsung have a lot more to lose. Apple especially, since their reputation as a phone provider rides on how secure their devices are, and if something is discovered, there are many rivals who will be happy to capitalize on the loss. Samsung, similar.

          Huawei? Not as much, as they are in a different market segment.

          Exactly.

      • by bsDaemon ( 87307 ) on Friday August 18, 2017 @12:02PM (#55041047)

        Apple and Samsung devices and software have been evaluated and validated against FIPS 140-2, Common Criteria and Commercial Solutions for Classified (CSfC) standards and are considered safe enough for use by the US government and others which respect those certifications (such as the 20+ countries in the Common Criteria Recognition Agreement).

        Huawei has financial and political ties to the Chinese government, which has a well known history of taking "cyber" action for both political and industrial espionage purposes, in addition to siding with adversarial countries such as Russia, North Korea, etc. on a number of issues.

        Therefore, Apple and Samsung are probably better choices from a trustworthiness standpoint. On the other hand, they're largely manufactured and assembled in the PRC and would be targets for this kind of supply-chain-infiltration hardware-implant attack. It'd just be less easy to accomplish than embedding implants or back doors into the hardware of one of their own companies.

        • Apple and Samsung devices and software have been evaluated and validated against FIPS 140-2, Common Criteria and Commercial Solutions for Classified (CSfC) standards and are considered safe enough for use by the US government and others which respect those certifications (such as the 20+ countries in the Common Criteria Recognition Agreement).

          Huawei has financial and political ties to the Chinese government, which has a well known history of taking "cyber" action for both political and industrial espionage purposes, in addition to siding with adversarial countries such as Russia, North Korea, etc. on a number of issues.

          Therefore, Apple and Samsung are probably better choices from a trustworthiness standpoint. On the other hand, they're largely manufactured and assembled in the PRC and would be targets for this kind of supply-chain-infiltration hardware-implant attack. It'd just be less easy to accomplish than embedding implants or back doors into the hardware of one of their own companies.

          PERFECT answer!

          Mod Parent "Informative"!

      • Dumb question here. Why do we trust Apple or Samsung parts more than Huawei?

        Because by now, any nefarious transmissions would have long ago been discovered by people like you.

    • thus raising alarms in various corporate type networks and the like?

      Only if you assume that no one ever uses a network outside their corporate network and that all networks used employ various ways to detect this data transmission. For most consumers, the norm is not to have such high security. They don't employ such detection methods and they connect to outside networks all the time.

      Also, consumers are far more likely to buy these 3rd-party parts than someone with a corporate phone, who will most likely send it to their company for repair, which will use genuine parts.

    • Smart "chip in the middle" devices would wait until you were off wifi and on the LTE or other telecom data. Or if they were really suave, even if you were on wifi they'd use the telecom communications channel.
  • Such as faulty/counterfeit batteries used in Galaxy Note 4s during repair [engadget.com].

  • by mlw4428 ( 1029576 ) on Friday August 18, 2017 @11:06AM (#55040665)
    Once you give up physical access to your device, you give up security. This is no different than the possibility that a locksmith could use a copy of a key he made for you. It's stupid.
    • This is why the USER should be able to set the locks themselves. The 'physical access' loophole can be defeated with 'Trust, but Verify' methods.
    • by Sloppy ( 14984 )

      Once you give up physical access to your device, you give up security.

      And when it comes to phones, that happens before you even buy it. The idea of a phone's security being subverted is laughable. It never had any security! It was always someone else's computer.

      Granted, you would probably prefer your phone to have n masters above you, rather than n+1. But for high values of n, the more you care about that, the less sense it makes. You should probably worry more about n and less about the +1. Solve the real

  • So fine, your screen part has a malicious "touch logger" capability... how does it send data? Oh yeah, it CAN'T.

    This sounds like FUD to make sure customers use the most expensive repair channel - the original manufacturer - to have the work done.

  • "A hacked touchscreen can inject pre-scripted touch events into the event screen"

    Of course, this assumes:

    a) the device is unlocked

    b) the malicious driver can guess where the required touch zones are located (no small feat, considering the diversity of softkey layouts (e.g., Samsung vs Nexus vs LG vs HTC), homescreen launchers, and the layout of app drawers (depending upon what the user installed)).

    c) Since malware (in addition to the driver itself) is almost a requirement (given a & b), the hardware itsel
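
    [Editor's note: to make condition (b) concrete, here is a rough analogue of "pre-scripted touch events" replayed from a connected host with adb. The real attack described in the paper injects events on the touch-controller bus inside the phone; this sketch only illustrates the idea of scripted taps, and every coordinate below is an illustrative guess, which is exactly why guessing touch zones across layouts is hard.]

        import subprocess
        import time

        # Assumed, made-up coordinates; real positions vary by device, launcher,
        # and installed apps, which is the objection raised in (b).
        SCRIPT = [
            (540, 1800),  # tap where an app icon is assumed to sit
            (540, 600),   # tap an assumed menu entry
            (900, 300),   # tap an assumed confirmation button
        ]

        def replay(taps, delay=0.5):
            """Replay a scripted tap sequence on a connected Android device via adb."""
            for x, y in taps:
                subprocess.run(["adb", "shell", "input", "tap", str(x), str(y)], check=True)
                time.sleep(delay)

        replay(SCRIPT)

    A blind replay like this only does something useful if the attacker's guesses about the foreground app and screen layout happen to be right, which is the crux of the objection above.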

  • I'll be sure to pay out the a$$ for the expensive parts with spyware approved by the shiny people.

  • by gillbates ( 106458 ) on Friday August 18, 2017 @12:13PM (#55041145) Homepage Journal

    Ken Thompson's Reflections on Trusting Trust [win.tue.nl] is well worth a read. Long story short, anyone with access to the hardware/software stack of your machine can compromise its security.

    These attacks are not merely theoretical. The key to good security is to make the cost of compromise greater than the value of whatever would be gained by it. For the average person, their privacy is not worth the effort of surreptitiously installing hardware. However, if you're a Palestinian terrorist... you may just want to have someone else purchase/service your electronic devices, as the Israeli equivalent of the CIA has planted explosives in the cellphones of Palestinians (and successfully carried out assassinations this way).

  • This just shows that you should always order by mail order or by Amazon, so that the NSA can install their own chips inside instead of the other ones a repair shop would install.

