Five Years After the Equation Group HDD Hacks, Firmware Security Still Sucks (zdnet.com)

In a report published today, Eclypsium, a cyber-security firm specializing in firmware security, says that the issue of unsigned firmware is still a widespread problem among device and peripheral manufacturers. From a report: According to researchers, many device makers still don't sign the firmware they ship for their components. Furthermore, even when they do sign a device's firmware, they don't enforce checks of the firmware signature every time the driver/firmware is loaded, but only during installation. Researchers say this leaves the door open for malicious actors to tamper with local firmware after it has been installed, in order to plant persistent and nearly invisible malware on user devices. To prove their point, the Eclypsium team disclosed vulnerabilities in four types of peripheral firmware in their report -- for touchpads/trackpads, cameras, WiFi adapters, and USB hubs. "Apple performs signature verification on all files in a driver package, including firmware, each time before they are loaded into the device, to mitigate this type of attack," the Eclypsium team said. "In contrast, Windows and Linux only perform this type of verification when the package is initially installed." But while some might be quick to blame the operating systems for not enforcing stricter firmware signing practices, the Eclypsium team isn't on board with that view.
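A minimal sketch of what load-time verification looks like, using Python's cryptography package; the file names and the RSA + SHA-256 scheme are illustrative assumptions, not Eclypsium's or any vendor's actual mechanism:

```python
# Sketch: verify a firmware blob against the vendor's public key every time
# it is loaded, not just at install time. File names and the RSA/PKCS#1 v1.5/
# SHA-256 scheme are hypothetical.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def load_firmware(blob_path: str, sig_path: str, pubkey_path: str) -> bytes:
    with open(pubkey_path, "rb") as f:
        pubkey = serialization.load_pem_public_key(f.read())
    blob = open(blob_path, "rb").read()
    sig = open(sig_path, "rb").read()
    # Raises InvalidSignature if the blob was tampered with after install.
    pubkey.verify(sig, blob, padding.PKCS1v15(), hashes.SHA256())
    return blob  # handed to the device only if verification passed

try:
    firmware = load_firmware("touchpad.fw", "touchpad.fw.sig", "vendor_pub.pem")
except InvalidSignature:
    raise SystemExit("refusing to load tampered firmware")
```

Checking a signature only at install time, as the report says Windows and Linux do, amounts to running the verify step once and then trusting the file on disk forever after.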
  • Security is not a selling point. You will see the same in every area; if you think that's bad, take a look at the firmware of consumer electronics, like TVs, NAS boxes, or anything that goes by the buzzword "Internet of Things".

    You can see it best when you go to one of the big retailers, who will usually have a little card with a lot of checkboxes next to the product advertising its capabilities. You will also notice that none of those capabilities have anything to do with security. And you will also notice that

    • by hey! ( 33014 )

      Even when buyers understand that security is important, they have no way of evaluating the security of the product.

      Most laymen don't even have the basic intellectual framework to understand system security. They fall victim to the "fallacy of composition" -- believing that systems are secure if and only if they are made entirely out of secure components. Vendors themselves might not understand that the impracticality of brute force overcoming the algorithms they use doesn't mean their product is hard to c

      • I also do not understand enough about physics or car design to understand what's going on in a crash test, but I can tell if a car is a safety hazard or not -- at the latest when someone tells me. And, lo and behold, safety features are a selling point with cars.

        Why? Just because they can literally kill you if their design is crap? Maybe it's time we develop an IoT device that can kill people to get them to wake up.

        • by hey! ( 33014 )

          Here's the thing though: nature isn't actually *hostile*. It isn't *trying* to find new ways to kill you in a crash. So developing a crash test that will work across all models and vendors is pretty straightforward, especially as they all do more or less the same thing.

          You really can't build those kinds of black box tests for *software*. Even leaving aside the intelligent opponent aspect of the problem, that'd be like building a safety test that works for cars, front end loaders, and kitchen stoves. You

          • Oh please, like the makers of IoT nightmares invented something new. 9 out of 10 of those gadgets are some sort of Linux implementation with standard tools, standard services and standard vulnerabilities. It's not like I have to dig deep into my security knowledge; most of the time it's nothing more than finding some way to connect (nmap helps), finding out what version of the software is running (since they usually just do a standard compile, it will announce it), then entering the ancient version they use in e

        • Are you sure?
          One of the best safety features put in automobiles was the creation of crumple zones.
          However, if you look at a crash test, you might think the older car is safer, because the automobile takes less damage since it is so rigid -- while the crash test dummies show the newer car has the better survival rate.

          A lot of people think they are getting a secure device because it is physically heavy, has ISO stamps on the box, or passed this and that security check. However there could be a wi

          • by sjames ( 1099 )

            That's a lot of the problem. They assumed Linux = secure. The more subtle truth is that Linux certainly can be quite secure, but not if you deliberately punch holes in it and otherwise do all the don'ts.

            As for crumple zones, they do enhance passenger safety but I wish they wouldn't crumple until necessary. Perhaps reinforcements that fail brittle at a threshold force.

        • by sjames ( 1099 )

          No, security theater is a selling point in cars. Many cars share the same key (enough that people occasionally find out by accident). Many cars are vulnerable to relay attacks that convince the car that the key on your nightstand is right next to the door. Many cars have a "cheat code" to bypass the immobilizer, known only to the manufacturer and every car thief in the world. The biggest 'security feature' in a car is that you can't steal them at a distance (yet), so the risk is hard for the thief to

          • "Endless stream of updates"?

            Care to inform me about the magical IoT device that gets updates after the mandated warranty period? Hell, most of them don't get any at all. And yes, that includes things like TVs and even NAS systems.

    • I think you are generally right when it comes to "common" consumers. Still, there is a market for those looking for truly secure stuff. I recently learned about POWER9-based desktops, and even though they are kinda pricey I am seriously contemplating buying one, specifically because the source for all of their firmware is available and the CPUs don't have the backdoors Intel/AMD have.
    • As long as there was still a way to install any driver I wanted, I wouldn't mind if it checked.

      Windows checks on installation now; this lets me still jump through some hoops and install unsigned drivers if I want to.

      But that's not really the same as firmware anyway? If the device itself doesn't check, what does it matter? If Windows itself is already hacked enough to install said thing, who cares -- it's already compromised. And how the firmware is updated on the attached devices is left to the device manufacturers themselves.

    • by cusco ( 717999 )

      Not just consumer devices: HID, the largest seller of security equipment in the world, doesn't sign its firmware or require a signature when it's installed. Most of the professional-grade CCTV cameras out there are in the same boat.

  • by Luthair ( 847766 ) on Tuesday February 18, 2020 @11:03AM (#59739388)

    In order to exploit these presumably you would need to have root access on the system.

    Consider: if hardware starts to check signatures, that will prevent hobbyists from creating custom firmware, e.g. bypassing Lenovo's whitelist of WiFi cards or making your optical drive region-free.

    • by sinij ( 911942 )

      In order to exploit these presumably you would need to have root access on the system.

      I think it is more accurate to say that you need to have root access to a system. This could be a supply-chain attack, where your brand-new hard drive shows up with malware and your OS does not validate the firmware it is running.

      • Re:Does it Matter? (Score:4, Insightful)

        by Bert64 ( 520050 ) on Tuesday February 18, 2020 @11:38AM (#59739494) Homepage

        If you have access to the supply chain, then you can almost certainly subvert the signing checks in any case.
        If you compromised the original vendor you can get the key and sign your own compromised firmware.

        Plus, if users are unable to replace the firmware themselves, they will be at the mercy of the vendor -- a lot of vendor firmware is garbage with very short support cycles. You'll be even worse off, because you'll not only have an insecure product, but also no way to fix it yourself.

        There are countless devices out there -- Android phones, DVR systems, routers, etc. -- that are usable precisely because someone has created replacement firmware for them. The original firmware supplied with these devices is long since out of support and full of security holes that will never be patched. The only alternative for such devices is landfill.

        • What's to say you didn't insert your own backdoored chips -- an identical copy of the silicon that can run the "verified" firmware but now has a few undocumented functions?

          You CANNOT perfectly lock down a system.
          • by Bert64 ( 520050 )

            Exactly...
            You can lock it down enough to severely disrupt your paying customers.
            You can't lock it down enough to stop serious malicious actors from compromising it and therefore your paying customers.

      • by AmiMoJo ( 196126 )

        I'm not sure this attack is very practical. Anyone with physical access can just reprogram or replace the flash memory chip on the HDD's PCB. For malware to be able to do it, it would have to have already 0wned the system so hard that it's not clear what more it could gain by attacking the firmware. Persistence, I suppose, but if you are worried about that then it's better to boot a clean OS from a live CD to access your data anyway.

      • And it gets wiped by the RAID card, ZFS, Ceph, or whatever RAID system is in use.

    • In order to exploit these presumably you would need to have root access on the system.

      This is incorrect. To reprogram a USB device, you only need to have direct IO access to the device. All major OSes allow users to do this by default.

      This basically means that any executing malware could reprogram many USB devices to be malicious.
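      A minimal sketch of the kind of user-space device access being described, using the pyusb library; the vendor/product IDs and the vendor-specific request are hypothetical placeholders, and whether a device accepts such a transfer depends entirely on its firmware:

      ```python
      # Sketch: user-space code talking directly to a USB device via pyusb.
      # The IDs and the vendor-specific request below are hypothetical; the
      # point is only that plain device I/O access is all that's involved.
      import usb.core

      dev = usb.core.find(idVendor=0x1234, idProduct=0x5678)  # hypothetical IDs
      if dev is None:
          raise SystemExit("device not found")

      # A vendor-specific OUT control transfer (bmRequestType 0x40). A device
      # whose update protocol is unauthenticated would accept payloads like
      # this from any process that can open it.
      payload = b"\x00" * 64  # placeholder data, not a real firmware image
      dev.ctrl_transfer(0x40, 0x01, 0, 0, payload)
      ```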

      Consider, if hardware starts to check signatures then that will prevent hobbiests from creating custom firmware

      You're not wrong.

    • Yes, it matters. If you like open source and believe you should be able to change the software running in your devices, you should be AGAINST enforcement of firmware signing.

    • by AHuxley ( 892839 )
      It matters for the NSA and GCHQ.
      That's their easy 'collect it all' plaintext access, lost if security actually works per user, service, device, network, and nation.
    • by sjames ( 1099 )

      One of the more interesting approaches to that I have seen is on the control board of a 3D printer. Normally the bootloader refuses to run firmware that wasn't signed by the manufacturer. However, the 3D printing community is very much about DIY and customizing your tools, so there is a small tab on the controller board. If you snap the tab off (signifying your agreement that the warranty will be further limited), it will allow arbitrary firmware to run (sketched below).

      In many cases, it would be good enough security to on
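      A minimal sketch of that snap-off-tab policy, with the GPIO read and the run/halt handlers as hypothetical stand-ins; signed firmware always runs, and unsigned firmware runs only once the tab is gone:

      ```python
      # Sketch of the snap-off-tab bootloader policy described above.
      # gpio_read(), TAB_PIN, run_firmware() and halt() are hypothetical.
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives import hashes, serialization
      from cryptography.hazmat.primitives.asymmetric import padding

      TAB_PIN = 17  # hypothetical pin, shorted to ground until the tab is snapped off

      def gpio_read(pin: int) -> int:
          return 0  # stand-in for a board-specific register read; 1 = tab removed

      def run_firmware(blob: bytes) -> None:
          print("booting", len(blob), "bytes")  # stand-in for the real jump

      def halt(reason: str) -> None:
          raise SystemExit(reason)

      def signature_valid(blob: bytes, sig: bytes, pubkey_pem: bytes) -> bool:
          pubkey = serialization.load_pem_public_key(pubkey_pem)
          try:
              pubkey.verify(sig, blob, padding.PKCS1v15(), hashes.SHA256())
              return True
          except InvalidSignature:
              return False

      def boot(blob: bytes, sig: bytes, pubkey_pem: bytes) -> None:
          # Signed firmware always runs; unsigned firmware runs only if the
          # owner has physically snapped the tab off the board.
          if signature_valid(blob, sig, pubkey_pem) or gpio_read(TAB_PIN) == 1:
              run_firmware(blob)
          else:
              halt("unsigned firmware and tab intact: refusing to boot")
      ```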

  • Wrong "solution" (Score:5, Insightful)

    by Hizonner ( 38491 ) on Tuesday February 18, 2020 @12:00PM (#59739584)

    This obsession with signing stuff has got to stop.

    Signatures don't solve the fundamental problem, which is that binary blobs are untrustworthy regardless of who signs them. There are too many chances to suborn or infiltrate whoever is producing the blob, not to mention that there's no way to check the quality.

    The "sign everything and have the device check the signatures" approach assumes manufacturers and everybody who works for them are trustworthy, and not just at some single point in time, but forevermore. That's crazy.

    The right answer is for manufacturers to:

    1. Publish the full source code and designs for everything they sell, and not use any inaccessible toolchains.
    2. Use reproducible builds, so that anybody can verify that an update is what they say it is (a minimal sketch of that check follows this list).
    3. Not store the firmware in the device itself to begin with, except, if absolutely necessary, for an immutable, unupdatable "golden" copy used to boot the OS. Once the OS is up, firmware should be loaded from the OS, every time. Yes, that means that if the golden copy is wrong, you may be hosed... but only in the relatively controlled boot-time environment, and even at boot time, you're only exposed to one potentially bad version, rather than the possibility that any signed version may be bad.
    4. Then start signing stuff. But you don't need the device to check the signatures, because by requiring the firmware to be reloaded at every boot, you've removed its usefulness as a malware persistence vehicle. Without the temporal loophole of persistent updatable firmware, it's safe to just run whatever firmware the OS gives you, since you're supposed to be doing the OS' bidding anyway.
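    A minimal sketch of the check that step 2 enables, assuming the vendor publishes the source and the build is deterministic; the file names and the make invocation are hypothetical:

    ```python
    # Sketch: with reproducible builds, anyone can rebuild the firmware from
    # the published source and confirm the vendor's update is bit-for-bit what
    # that source produces. Paths and build command are hypothetical.
    import hashlib
    import subprocess

    def sha256(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Rebuild from the published source tree (assumed deterministic).
    subprocess.run(["make", "-C", "firmware-src", "firmware.bin"], check=True)

    if sha256("firmware-src/firmware.bin") != sha256("vendor-update/firmware.bin"):
        raise SystemExit("vendor update does not match the published source")
    print("update reproduces from source")
    ```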

    The signing step is the last one because it's not very useful without the other ones.

    Of course, they're not gonna do any of that, either, but to actually get any real security, that's what you need.

    • The right answer is for manufacturers to:

      1. Publish the full source code and designs for everything they sell, and not use any inaccessible toolchains.

      Your recipe is great from a theoretical security perspective, but in real-world terms it breaks down right here. It creates a conflict between end-user security and manufacturer competitive position, or at least the manufacturer's perception of their competitive position, and the user is basically always going to lose that battle. Very, very few device makers are going to be okay with open sourcing their firmware, and that position is just getting stronger as more and more "hardware" functionality is actu

      • by Hizonner ( 38491 )

        I claim that people's ability to make things physically blow up is just as "real" as manufacturers' competitive positions. And some of these security issues could in fact make things physically blow up.

        You're right that manufacturers would never give out such information willingly, and you are also right that various social scruples will probably prevent governments or even customers from forcing them to do so. That is a reality. It's also a reality that even if they did, it would only be a first step.

    • Isn't this kind of how it works now (at least in better devices)?

      You have basically some immutable ROM code that does something really simple, like copy the first N blocks from a storage device into the first Y pages of RAM, then jump to executing whatever it loaded at $Y.

      Except now, what happens is that the ROM just does whatever it does and then jumps to some fixed point in flash, and then the "firmware" loads the operating system. I say "better" devices above because the better devices have a button (a
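      A minimal Python simulation of that first-stage pattern; the block size, block count, and disk image name are illustrative assumptions, and the "jump" is of course just a function return here:

      ```python
      # Sketch: the ROM's "copy the first N blocks to RAM, then jump" pattern,
      # simulated. BLOCK_SIZE, N_BLOCKS and the image name are hypothetical.
      BLOCK_SIZE = 512
      N_BLOCKS = 64

      def rom_stage0(disk_image: str) -> bytes:
          # The immutable ROM's entire job: read N blocks from the start of
          # the storage device into RAM, then transfer control to them.
          with open(disk_image, "rb") as disk:
              stage1 = disk.read(BLOCK_SIZE * N_BLOCKS)
          return stage1  # "jump": hand the loaded code control

      stage1_code = rom_stage0("disk.img")  # the next stage loads the real OS
      ```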

    • So hardware goes Windows-only??
      Why not just go back to a Kickstart ROM for the OS.

      • by tlhIngan ( 30335 )

        Why not just go back to kickstart rom for the OS.

        It resulted in a very interesting Unix port on Macintosh called A/UX. The thing booted into a very light version of the Mac OS (with full GUI) which then launched A/UX. But since the GUI was already there, the loader and A/UX had full utilization of it so you got a GUI progress bar as the kernel booted, among other things.

        And then when it was done, it spawned full terminal windows with the Mac borders and such because A/UX supported Mac applications, Unix app

    • by cusco ( 717999 )

      The right answer is for manufacturers to:

      Publish the full source code and designs for everything they sell, and not use any inaccessible toolchains.

      And are you competent to read and analyze that source code? Do you have the spare time and tools to do it? My answer is "No" and "No", and that will be the answer for over 99 percent of the population, since writing device firmware and drivers is a very specialized profession. Open sourcing software is not a magic wand; how many years did Linux, probably the most-analyzed OSS package around, carry some really serious security failures that were only recently found?

      • by Hizonner ( 38491 )

        Yes, I am competent to do it. Yes, I have the tools to do it. No, I do not have the time to do it for most devices I use. But I might be able to do it for one or two.

        You also missed the part where, however competent I may be, I can miss things. It's pretty easy to write a subtle, exploitable bug that's hard to notice.

        You are, of course, right that bugs sit there in the open, even in more accessible software for a long time and don't get discovered, partly because those bugs aren't obvious, partly because no

    • Signatures don't solve the fundamental problem, which is that binary blobs are untrustworthy regardless of who signs them.

      Not true. Trust is a variable scale, not some magical absolute. Open source is the gold standard, but just because something is a binary doesn't mean it can't be trusted. There is a certain amount of trust afforded to any verifiable entity with an address and the potential to be at the other end of a lawsuit, and a significant amount afforded to those who stand to lose from reputational damage as well.

      It's why I wouldn't trust you to look after $50000 for me, but I do trust the bank, an enti

  • Just what we need: HP-only HDDs with a 1.5-3.0x markup in servers.

  • Chromebooks (running Linux underneath) validate their system on every boot (even more thoroughly than Apple does, I would say). It is actually really well thought out, but it limits what can be natively installed on the system.

    Here is a link to Google's design for Verified Boot in ChromeOS:
    https://www.chromium.org/chrom... [chromium.org]
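    A minimal sketch of the hash-chain idea behind verified boot, where each read-only stage pins the hash of the next stage; the in-memory stand-ins and plain SHA-256 are simplifications of ChromeOS's actual signed, block-level scheme:

    ```python
    # Sketch: each boot stage carries the expected hash of the next stage and
    # refuses to continue on a mismatch. Stage contents are placeholders;
    # ChromeOS's real Verified Boot signs and verifies at the block level.
    import hashlib

    def sha256(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    kernel = b"kernel image bytes"                           # hypothetical stage
    rw_firmware = b"RW firmware " + sha256(kernel).encode()  # pins kernel hash
    ro_expected_rw_hash = sha256(rw_firmware)  # baked into write-protected RO flash

    def verified_boot() -> None:
        if sha256(rw_firmware) != ro_expected_rw_hash:
            raise SystemExit("RW firmware modified: entering recovery")
        if sha256(kernel) != rw_firmware[-64:].decode():
            raise SystemExit("kernel modified: entering recovery")
        print("chain verified, booting kernel")

    verified_boot()
    ```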

      Chromebooks (running Linux underneath) validate their system on every boot (even more thoroughly than Apple does, I would say). It is actually really well thought out, but it limits what can be natively installed on the system.

      Here is a link to Google's design for Verified Boot in ChromeOS: https://www.chromium.org/chrom... [chromium.org]

      I sat through a presentation from Google about the Chromebook firmware recovery process and I can tell you right now that it is absolutely trivial to get a Chromebook to roll back to the factory microcode release which, with almost 100% certainty, has known security vulnerabilities. Not only do I refuse to trust an advertising company to sell me hardware, but I know from their own mouths that their security is, in practice, not as good as you think it is.
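      A minimal sketch of the anti-rollback check such a downgrade is supposed to run into (and, per this account, apparently does not), assuming a monotonic minimum-version counter in tamper-resistant storage; the version numbers and the nv_read/nv_write helpers are hypothetical:

      ```python
      # Sketch: even a correctly signed image is rejected if it is older than
      # the minimum version recorded in tamper-resistant storage (e.g. a TPM
      # NVRAM counter). nv_read/nv_write are hypothetical stand-ins.
      _nv = {"min_fw_version": 7}  # simulated non-volatile counter

      def nv_read(key: str) -> int:
          return _nv[key]

      def nv_write(key: str, value: int) -> None:
          _nv[key] = value

      def accept_update(image_version: int, signature_ok: bool) -> bool:
          if not signature_ok:
              return False              # signature check comes first
          if image_version < nv_read("min_fw_version"):
              return False              # signed but stale: a rollback attempt
          nv_write("min_fw_version", image_version)  # ratchet forward only
          return True

      assert accept_update(8, True) is True
      assert accept_update(5, True) is False  # e.g. the factory release
      ```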

      • I'm curious what you mean by "Firmware" and "Microcode". The way you describe this seems to indicate you think rolling back to a previously signed release is a huge problem.

        If by Firmware you mean the BIOS/bootloader, then I am pretty sure that "rolling back" would typically not impact the overall operation of the operating system that is booted by the old firmware. If there is some vulnerability in the firmware, you may be able to exploit it to boot something malicious, perhaps, but I think you woul
