Security

Second Root Cert-Private Key Pair Found On Dell Computer (threatpost.com) 65

msm1267 writes: A second root certificate and private key, similar to eDellRoot [mentioned here yesterday], have been found on a Dell Inspiron laptop, along with an expired Atheros Authenticode certificate and private key used to sign Bluetooth drivers. The impact of these two certs is limited compared to the original eDellRoot cert. The related eDellRoot cert is also self-signed but has a different fingerprint than the first one; according to the results of a scan conducted by researchers at Duo Security, it has been found on only two dozen machines. Dell, meanwhile, said late on Monday that it will remove the eDellRoot certificate from all Dell systems moving forward. For existing affected customers, it has provided permanent removal instructions (.DOCX download), and starting today it will push a software update that checks for the eDellRoot cert and removes it.
This discussion has been archived. No new comments can be posted.
  • Unavoidable (Score:4, Interesting)

    by Ed Tice ( 3732157 ) on Tuesday November 24, 2015 @11:17AM (#50994601)
    I feel bad for those who switched from Lenovo to Dell after the SuperFish fiasco.
    • I feel bad for those who switched from Lenovo to Dell after the SuperFish fiasco.

      I read this as "SuperFish Taco"...anyone else?

    • Re:Unavoidable (Score:5, Insightful)

      by fuzzyfuzzyfungus ( 1223518 ) on Tuesday November 24, 2015 @11:56AM (#50994995) Journal
      The only consolation is that 'superfish' was clear evil, executed with some degree of effectiveness, while the current Dell thing appears to be an unbelievable failure to grasp even the concepts behind safe certificate handling; but without an overt evil objective.

      It is, at least, possible that stupid will be cured by enough 3rd-party testing; but evil is harder to expunge.

      That said, the level of stupid on display here (especially for a company that is supposed to know how to, say, sign and deploy device drivers, and run a website with a secure order form) is pretty terrifying. Bugs are bad, but at least some of them are subtle. Adding a trusted root cert with an easily extractable private key to a huge number of customer systems isn't a 'bug'; it's insanity.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      It's completely avoidable. Do your homework on a new laptop (manufacturer doesn't matter.) Make sure it has good Linux compatibility. Buy it and install your favorite distro. I've been doing this for the past 10 years. It's great because you benefit from the lower price (thanks to all the shovelware) without having to actually live with the shovelware.

  • by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Tuesday November 24, 2015 @11:18AM (#50994613) Homepage Journal

    private key used to sign Bluetooth drivers has been found on a Dell Inspiron laptop

    So, the happy owners of the affected laptops can now issue certificates and/or sign drivers, which will be accepted as genuine by other owners of Dell hardware?

    Seriously? If so, that's just too dumb to be malicious...

    • by gstoddart ( 321705 ) on Tuesday November 24, 2015 @11:49AM (#50994933) Homepage

      Seriously? If so, that's just too dumb to be malicious...

      Companies are so bad about security these days that I refuse to differentiate between stupidity and malice.

      If they do it to sell ads, or they do it to make support easy but don't have proper security people review it ... I don't see much difference.

    • by Anonymous Coward

      The term you're looking for is "criminally negligent".

    • by hey! ( 33014 )

      So, the happy owners of the affected laptops can now issue certificates and/or sign drivers, which will be accepted as genuine by other owners of Dell hardware?

      Seriously? If so, that's just too dumb to be malicious...

      It's not too dumb to be willful negligence -- defined in legal dictionaries as "Intentional performance of an unreasonable act in disregard of a known risk..."

      Having the know-how to do such a thing necessarily entails knowledge of why it's a bad idea. So either an engineer acted in breach of professional ethics, or managers rode roughshod over the engineers' objections.

      • by mi ( 197448 )

        So either an engineer acted in breach of professional ethics, or managers rode roughshod over the engineers' objections.

        This suspicion would've made sense if there were some profit opportunity there. But I cannot see one...

        Making their own CA recognizable as valid by users of their computers would've been understandable — and even acceptable. But what possible use is publishing your private key?

        Perhaps, it is to be able to deny responsibility for bad software later, but that's a little too far-fetched...

        • by hey! ( 33014 )

          But what possible use is publishing your private key?

          Perhaps, it is to be able to deny responsibility for bad software later, but that's a little too far-fetched...

          Well, we're not talking about publishing THE private key to anything Dell cares about. We're talking about publishing A private key that Dell can use to do things on the client's machine that undermine the security model. Why? Well, there are lots of potential ways to create revenue or cut costs that way. For example, Lenovo did it so they could inject ads into web pages that were supposedly cryptographically protected from tampering.

          • by mi ( 197448 )

            For example Lenovo did it so they could inject ads into web pages that were supposedly cryptographically protected from tampering

            This makes no sense. Why do you need your private key to be located on the users' computer for that?

            • by hey! ( 33014 )

              It's not *your* private key. It's a private key that the browser is configured to trust.

              • by mi ( 197448 )

                It's not *your* private key.

                I know, it is not mine, darn it. It is Dell's.

                It's a private key that the browser is configured to trust.

                Yes, but the browser does not need to have access to the private key to establish that trust — that's the whole point of public/private key cryptography.

                The question was — and remains — why does this private key need to be present on the user's computer, if the sole goal is to show the user ads as "trusted"?
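mi's point, that trust verification needs only the public half of the key, can be illustrated with a toy textbook-sized RSA signature (a sketch only; real certificate validation uses X.509 chains and 2048-bit-plus keys, and these numbers are the classic classroom example, not anything Dell shipped):

```python
# Toy RSA illustrating that verification needs only the PUBLIC key.
# Numbers are textbook-sized (p=61, q=53); real keys are 2048+ bits.
p, q = 61, 53
n = p * q                 # 3233, part of the public key
e = 17                    # public exponent
d = 2753                  # private exponent -- the part Dell shipped to everyone

def sign(digest: int) -> int:
    """Signing requires the PRIVATE exponent d."""
    return pow(digest, d, n)

def verify(digest: int, signature: int) -> bool:
    """Verification requires only the PUBLIC pair (e, n)."""
    return pow(signature, e, n) == digest

digest = 123              # stand-in for a certificate hash
sig = sign(digest)
print(verify(digest, sig))        # True: a browser can check trust with (e, n) alone
print(verify(digest + 1, sig))    # False: tampered content fails
```

The asymmetry is the whole point: distributing (e, n) to every laptop is how trust is supposed to work; distributing d alongside it is what made eDellRoot a disaster.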

            • by DarkOx ( 621550 )

              for example Lenovo did it so they could inject ads into web pages that were supposedly cryptographically protected from tampering

              This makes no sense. Why do you need your private key to be located on the users' computer for that?

              Why? Because otherwise you can't defeat the certificate-checking logic of the local SSL stack. You need 'a' private key there for a trusted root CA so you can generate certificates on the fly that other parts of the system will see as valid.

              Browser tries google.com -> you intercept it -> you go fetch the cert from the original destination IP -> you validate it (or don't) -> you generate a new cert based on the content of the one you got and sign it with the private key -> send the response to the browser ( wh
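The interception flow described above can be sketched end to end, with a toy signature scheme standing in for X.509 (all names and numbers here are invented for illustration; real interceptors use a full TLS stack):

```python
import hashlib

# Toy model of the MITM flow: the machine's trust store holds the root CA's
# PUBLIC key, but (as with eDellRoot) the PRIVATE key also shipped on the
# box, so anyone can mint certs the machine will accept.
p, q, e, d = 61, 53, 17, 2753   # textbook RSA parameters
n = p * q
TRUST_STORE = {"eDellRoot": (e, n)}   # what the browser consults

def digest(msg: str) -> int:
    """Stand-in for hashing a certificate's contents."""
    return int.from_bytes(hashlib.sha256(msg.encode()).digest(), "big") % n

def issue_cert(subject: str, priv: int) -> dict:
    """The interceptor mints a fresh cert for any site, on the fly."""
    return {"subject": subject, "issuer": "eDellRoot",
            "sig": pow(digest(subject), priv, n)}

def browser_accepts(cert: dict) -> bool:
    """The victim's browser: check the signature against the trusted root."""
    pub_e, pub_n = TRUST_STORE[cert["issuer"]]
    return pow(cert["sig"], pub_e, pub_n) == digest(cert["subject"])

# Browser tries google.com -> interceptor mints a cert -> browser accepts it.
fake = issue_cert("google.com", d)
print(browser_accepts(fake))   # True: the MITM succeeds
```

Note the interceptor never needs the real site's key at all; possession of the locally trusted root's private key is sufficient.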

              • by mi ( 197448 )

                you generate a new cert based on the content of the one you got and sign it with the private key

                If that's what it is, why would you permanently store the private key on the machine? You can generate a new one at will — because the browser is configured to trust your CA...

                Nah, I tend to go with Hanlon's Razor [wikipedia.org]: never attribute to malice that which is adequately explained by stupidity.

    • I've actually seen this before with OpenVPN setups. The standard setup procedure has you generate the keys and certificates on the server, but doesn't make clear which files are the private keys and which are public. One of the guides now carefully points out which files you're supposed to keep secret. But I've seen several OpenVPN setups where someone didn't know better and just installed the client, then copied all the config files (all the certificates and keys) from the server to the client.

      Explai
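A quick sanity check for the mistake described above is to scan a client's config directory for files containing PEM private-key blocks. This is a hypothetical helper, not part of OpenVPN, and easy-rsa file names vary by version:

```python
import os

def find_private_keys(config_dir: str) -> list[str]:
    """Return files under config_dir containing PEM private-key blocks.
    On an OpenVPN client, nothing should match except the client's own key."""
    hits = []
    for root, _dirs, files in os.walk(config_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    if "PRIVATE KEY-----" in fh.read():
                        hits.append(path)
            except OSError:
                continue  # unreadable file; skip it
    return hits
```

Anything this turns up on a client beyond the client's own key (and especially ca.key) means the server-side secrets were copied over wholesale.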
    • This part is actually FUD I think. This particular Dell private key does not chain up to a trusted root CA.

      Also - Windows will only silently install drivers that are Microsoft WHQL-signed; Microsoft is the only one who signs those drivers, and this key does not chain up to that either.

      At most you could sign a driver with this key, and install said driver onto a machine that had the public key already installed - assuming you had local admin as well - and for a user mode driver (like a printer) it will give you

  • by retroworks ( 652802 ) on Tuesday November 24, 2015 @11:19AM (#50994629) Homepage Journal
    My new XPS 15 9050 had just arrived, and when I tested it I found it vulnerable; now I'm looking forward to implementing the fix over the holiday. In the meantime, the fact that Firefox protected the machine on the test websites (and Chrome and Explorer did not) caused me to swap to Firefox on all my other machines, just because I appreciate that they had my back.
    • by DarkOx ( 621550 ) on Tuesday November 24, 2015 @11:31AM (#50994755) Journal

      You need to wait for the holiday to delete a certificate out of your trusted roots on your personal machine? Wow.

      Secondly, Firefox did not protect you from anything; the fact that they don't share the system cert store did. Yeah, it worked out in your favor this time, but I honestly don't think Mozilla's failure to integrate with system certificate stores is a win in general. It's actually one of the biggest reasons I think about leaving my beloved SeaMonkey for something else.

      For one thing, you now have not one but two certificate stores you need to audit. That sucks! If a CA says they have been compromised, I have to remember to fix it in two places instead of one. That isn't a security win. Many users probably don't even realize they don't use the system trusts, so if they get instructions to fix an issue by removing a CA, they will likely fail to fix the Mozilla-based browser.

      Second, in managed environments revoking a trust in Mozilla isn't easy to script out, which means Firefox and SeaMonkey installs likely just don't get fixed; again, not a security win.

      Frankly, I think it's rather a shame Mozilla does not provide at least the option to use the system trusted roots.

      • I agree that not using the system-provided certificate storage is a disadvantage; but I'd be curious to know if you've actively had lousy luck with certutil [mozilla.org], or whether it works but is more of a pain than just using group policy to manipulate the Windows-native store?
        • by DarkOx ( 621550 )

          The latter; certutil works fine, but you have to build some custom fix packages to use it, which can get complex if you have cases where those installations are not in the default locations.

          I.e., non-local-admin users can't install FF to its usual places, so they install it to a directory inside their profile. Now you are playing find-the-Firefox/SeaMonkey-install.
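The find-the-install problem is the scriptable part: NSS keeps the trust database (cert8.db/cert9.db) in each profile, so one can walk home directories for those files and point certutil at each hit. This is a sketch; the walk root and error handling are assumptions to adapt per platform:

```python
import os
import subprocess

def firefox_profiles(home: str) -> list[str]:
    """Find Firefox/SeaMonkey profile dirs by their NSS cert database files."""
    profiles = []
    for root, _dirs, files in os.walk(home):
        if "cert9.db" in files or "cert8.db" in files:
            profiles.append(root)
    return profiles

def remove_cert(profile: str, nickname: str = "eDellRoot") -> None:
    """Delete a trusted cert from one profile with NSS certutil (-D).
    check=False: missing nickname in a given profile is not fatal."""
    subprocess.run(
        ["certutil", "-D", "-n", nickname, "-d", f"sql:{profile}"],
        check=False,
    )
```

Run firefox_profiles over each user's home directory, then remove_cert on every hit; that covers installs (and profiles) living outside the default locations.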

      • Or better yet, download the drivers to a USB drive and wipe the machine with a fresh image from the media creation tool or an ISO from Microsoft's website.

        I have not seen anyone use the malware- and spyware-bloated factory image since last decade. You always do a clean install.

    • by quetwo ( 1203948 )

      It's not that they have your back -- it's that they use their own certificate chain of trust that doesn't rely on the OS. It's baked into the source code, and can't be updated unless you upgrade versions (also, if one gets blacklisted, you don't notice it either).

  • Steps (Score:2, Informative)

    by Anonymous Coward

    Step 0: Don't buy any equipment from a manufacturer that supports Microsoft Windows Platform Binary Table (WPBT).

    Step 1: Wipe any pre-existing OS on your equipment.

    Step 2: Stop buying anything from vendors (Lenovo, now Dell) who are proven to do this shit.

  • These companies are just plain sleazy.

    My next computer won't be a Dell or IBM or OEM for that matter.

    I think it's about time for an open source computer.
  • A word document? (Score:4, Interesting)

    by jlv ( 5619 ) on Tuesday November 24, 2015 @12:11PM (#50995139)
    Why were the removal instructions provided as a word document? They couldn't just have a simple web page with pictures?
  • by Lumpy ( 12016 ) on Tuesday November 24, 2015 @12:23PM (#50995273) Homepage

    We don't get clean reinstall DVDs; Microsoft allows the builder to put whatever crap they want on the computer. Honestly, it's all Microsoft's fault.

    Go back to requiring a MICROSOFT-PRESSED installation DVD shipped with the machine, with the install done from a clean image and no extra crap allowed on the machine. Yes, that means they have to use decent chipsets instead of the crap-tastic stuff like Marvell and other really low-end china dog food devices.

    • Buy your gear from a quality vendor (e.g. Taiwanese OEM) and that's what you'll get. Require the cheap-crap vendors to not provide cheap crap? That's why we have options.

  • Asymmetric crypto always looks like the cat's meow at first, and then over time you find out that it sucks hairy donkey balls.
