Microsoft Has No Plans To Patch New Flaw

Trailrunner7 writes "Microsoft has acknowledged the vulnerability that the new Stuxnet malware uses to launch itself via .lnk files, but said it has no plans to patch the flaw right now. The company said the flaw affects most current versions of Windows, including Vista, Server 2008, and Windows 7 32- and 64-bit. Meanwhile, the digital certificate belonging to Realtek Semiconductor that was used to sign a pair of drivers for the new Stuxnet rootkit has been revoked by VeriSign. The certificate was revoked Friday, several days after news broke about the new malware and the troubling existence of the signed drivers."
  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Sunday July 18, 2010 @02:55PM (#32944624) Homepage Journal

    Couldn't they just make driver signatures verify against the hardware they support instead of the OS? Screw the OS saying whether or not it's legit; does the actual hardware it's meant for say it's legit code?

    • by beelsebob ( 529313 ) on Sunday July 18, 2010 @02:57PM (#32944636)

      Yes, that's working out really well for Motorola's publicity department with the Droid X just now.

      • Re: (Score:3, Interesting)

        by Khyber ( 864651 )

        There is a small difference to note, however: one is addressing an entire hardware set (Motorola), the other is using code from a piece of hardware (is it a sound card or network driver certificate that got jacked?).

        Actually, bad example. Let me see what my medicated brain can re-think.

        It's more like this: Motorola is stopping you from using hardware you purchased in the manner you wish via a hardware security check, whereas someone usurped a certificate from Realtek and used it to bypass security.

        • by Drew M. ( 5831 ) on Sunday July 18, 2010 @03:19PM (#32944806) Homepage

          Did you even read the summary? Realtek's signing keys were stolen. That's why Verisign revoked them. Putting the verification keys in hardware wouldn't fix this issue.

          • Re: (Score:3, Insightful)

            by AusIV ( 950840 )
            If anything, it would make things worse because they'd be harder to revoke.
          • by Khyber ( 864651 )

            Did you read my idea? Run verification key PLUS CODE through the hardware itself. If the key matches the hardware but the code produces BS results in the hardware (such as nonsensical static when it should get several test tones), then it gets denied.
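            (A minimal sketch of what that idea might look like, in Python, with a simulated device check; the reference values and the run_on_device hook are purely hypothetical stand-ins for whatever a vendor would actually burn into the card, not anything real hardware exposes:)

            import hashlib

            # Hypothetical values the vendor would fix at manufacture: the hash of the
            # blessed driver image and the output it should produce for a fixed
            # self-test input ("several test tones").
            EXPECTED_DRIVER_HASH = hashlib.sha256(b"blessed driver image").hexdigest()
            EXPECTED_SELF_TEST_OUTPUT = b"test tones"

            def device_accepts_driver(driver_image, run_on_device):
                # Step 1: key/hash check performed by the device, not the OS.
                if hashlib.sha256(driver_image).hexdigest() != EXPECTED_DRIVER_HASH:
                    return False
                # Step 2: functional check: run the code against a fixed input and
                # compare with the reference ("test tones" vs "nonsensical static").
                return run_on_device(driver_image, b"self-test-input") == EXPECTED_SELF_TEST_OUTPUT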

            • So, you're saying you want to be stuck with the buggy driver that ships with the hardware, rather than the at least semi-stable one that ships a year later?

            • Run verification key PLUS CODE through the hardware itself. If the key matches the hardware but the code produces BS results in the hardware (such as a nonsensical static when it should get several test tones,) then it gets denied.

              Nothing short of fully sentient artificial intelligence can tell malicious code from non-malicious code. And even that can only make an educated guess, and will be wrong every now and then.

        • Re: (Score:3, Informative)

          Excellent idea. In that way, when companies refuse to develop free drivers for GNU/Linux, we won't be able to make our own because the hardware will reject them. And all of that just because Microsoft refuses to make a secure operating system because they want to keep users buying new versions, antivirus software, etc. And because the users refuse to switch to an operating system that works.

          Brilliant idea.
          • by drsmithy ( 35869 ) <drsmithy@gm[ ].com ['ail' in gap]> on Sunday July 18, 2010 @05:47PM (#32945636)

            And all of that just because microsoft refuses to make a secure operating system [...]

              Can you outline what features and capabilities of a "secure operating system" are missing from Windows?

            • Re: (Score:2, Troll)

              by cynyr ( 703126 )

              lack of a *.lnk-based rootkit, the ability to audit the source, the lack of ability to run 99% of the viruses in the wild. [1]

              Can you run any version of Windows from something like a ramdisk, so there is no real way to write to the disk? How about the old "start the system up, shut it down, but leave iptables running" router hack? A highly transparent bug/flaw reporting system, with a quick turnaround?

              If you hear of a mac mini pro, let me know. :)

              [1]yes yes, all strawmen, but the issue for me is the last ve

            • by mcgrew ( 92797 ) *

              How about fixing vulnerabilities when they're found, for starters? That is what the discussion is about: MS refusing to fix a known vulnerability in their newest operating systems. This is an EPIC security fail. Then there's getting rid of the deeply flawed "security through obscurity" approach, where they know about a vuln but stupidly, arrogantly, and unethically assume nobody else will find it. And getting rid of ActiveX. And that's just for starters, the list is almost endless. MS appears to not take security seriously.

          • by Khyber ( 864651 )

            Indeed, I should patent it quickly, so that it may not come to pass without my blessing!

          • by westlake ( 615356 ) on Sunday July 18, 2010 @08:38PM (#32946604)

            And because the users refuse to switch to an operating system that works.

            The number of PC users is about 1 to 1.2 billion, based on most estimates I've seen. That would put the number of Windows users at 900 million to 1 billion, at all skill levels.

            I will take that as pretty strong evidence that the Windows OS works just fine for those who use it.

            In that way, when companies refuse to develop free drivers for GNU/Linux, we won't be able to make our own because the hardware will reject them.

            I suspect that signed drivers are inevitable, whatever your platform.

            • by Ciggy ( 692030 )

              The number of PC users is about 1 to 1.2 billion, based on most estimates I've seen. That would put the number of Windows users at 900 million to 1 billion...I will take that as pretty strong evidence that the Windows OS works just fine for those who use it.

              Join the dole, 3 million can't be wrong

            • Re: (Score:3, Insightful)

              by mcgrew ( 92797 ) *

              The number of PC users is about 1 to 1.2 billion, based on most estimates I've seen. That would put the number of Windows users at 900 million to 1 billion, at all skill levels. I will take that as pretty strong evidence that the Windows OS works just fine for those who use it.

              I don't think I've ever met a non-nerd that even knows what an OS is. When I tell people there's a free replacement for Windows that doesn't get viruses, their jaws drop; they have no clue. Windows came with their computer and it's all they know.

        • Re: (Score:3, Funny)

          by PopeRatzo ( 965947 ) *

          let me see what my medicated brain can re-think.

          Did you bring enough to share with the whole class?

    • The ATI video card I have fails hard on XP64, so instead I got a driver made by some random guy who has nothing to do with ATI, and it works great. If I were stuck using only drivers that were ATI-approved, I'd be majorly SoL.

      I'm all for having the hardware verify that the driver actually is a valid driver for the hardware in question, just make sure that's ALL it does, or we'll lose the ability to use someone's hack to force a piece of hardware to work.

    • by Arainach ( 906420 ) on Sunday July 18, 2010 @04:12PM (#32945108)
      That eliminates the possibility of revoking a certificate if one is compromised. Also, it leads to situations like the TI calculator incident, which Slashdot seems to hate.
      • by cynyr ( 703126 )

        Yep, and it's what the "Tivo" clause in GPLv3 is for. I bought the hardware, I can do as I like with it, including blend it, make it into a rocket (not for shooting at something, but like a model rocket), use it to prop the window open, etc. The reason TI doesn't like it is that they sell the same hardware with additional software features for a premium, and people buying a low-end calculator and flashing advanced firmware onto it hurts their profits.

      • It also assumes the hardware has enough processing power - on its own - to handle approving the driver. That's not easy when the driver very possibly supplies the firmware that the hardware executes.

    • by RCL ( 891376 ) on Sunday July 18, 2010 @04:35PM (#32945226) Homepage
      I don't like security news, precisely because it results in overreactions like yours.

      We should not care about security too much. Security is the opposite of freedom, and by concentrating our efforts on security we may end up with a completely locked environment.

      It's better to tolerate a certain threshold of hijacked/owned computers than to require hardware to verify the software.
      • We should not care about security too much. Security is the opposite of freedom, and by concentrating our efforts on security we may end up with a completely locked environment.

        Welcome to the physical world. If you do not like security and are afraid to be locked out of your own house, you are free to remove the lock on your front door.

      • Security is the opposite of freedom,

        No, it isn't. You do not have to sacrifice freedom to gain security. Yes, that's what the authorities have been telling you forever, but that's just because they want/like the power that comes from limiting freedom, and use people's fears to make them think that they will be more secure if their freedoms are reduced. But it's bollocks.

        Freedom is not antithetical to security. You can have both. In fact, it has generally been shown that the less free a society is (think police states), the less secure its ordinary citizens actually are.

        • by RCL ( 891376 )
          Freedom is about having choices. Security is about limiting the choices. They are inherently in conflict.

          I don't see how you can have both. You can have some trade-off between those, but not both. Law is one of such trade-offs - you aren't free to kill people, but you're safe from being killed yourself.

          And as a former Soviet citizen I can testify that living in a police state IS safe, if you agree to follow the rules (i.e. if you limit your freedom...). Whether or not this means that the state itself is
    • Couldn't they just make driver signatures verify against the hardware they support instead of the OS?

      That's a really, really bad idea.

      Drivers are for hardware, yes, but they're for software too. As soon as you switch to that type of signature verification model, you lose the ability to load drivers for virtual hardware, like ImDisk. [ltr-data.se] Microsoft's iSCSI initiator is also a virtual mass storage driver, and that wouldn't work either.

      There's probably a gazillion other examples, but generally speaking, driver and software signing as it's currently implemented is working well enough for most things. It's

    • What about non-hardware drivers, like anti-virus drivers, virtual devices, etc? Or drivers for generic devices like USB HIDs? And if a manufacturer's certificate gets compromised, what do you do? Require people to update their hardware or face an increased risk of malware? Require people to reflash their hardware? How do you secure the reflash process? What if it crashes in the process? Do you have bricked hardware?
    • by Lehk228 ( 705449 )
      Are you suggesting having every hardware device include a microprocessor that requests a copy of its drivers from the host system and validates them? Because that sounds expensive and fragile.
  • I'm not getting it. There's a security problem and MS refuses to fix it? Really? How many times has this happened before? It's happened enough that I didn't even blink at it. It's like saying a politician told a lie. So?
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      it's hardly an OS problem if some wanker has written a nasty driver then signed it with a legit cert
      Damn, I consider most of my Linux wifi drivers malicious.

      • Re: (Score:3, Insightful)

        by 0123456 ( 636235 )

        it's hardly an OS problem if some wanker has written a nasty driver then signed it with a legit cert

        I somewhat disagree: it clearly shows the flaws in an either/or trust model of that kind. Either it's signed and it's trusted to do anything at all to your system or it's not trusted to do anything at all... you only need one rogue signing key to break that model.

        • Re: (Score:3, Insightful)

          by TheRaven64 ( 641858 )
          Do you propose a better model? How about the Linux model, where if the user decides to load it then it can do absolutely anything with the system? Of course, it would be great to be able to run drivers in unprivileged mode, but until we have an IOMMU in every system that won't actually buy any security (a malicious driver can just tell the device to DMA random data from anywhere in physical memory to the device and then back to the driver's address space, or data from the driver's address space into another part of physical memory).
          • by 0123456 ( 636235 ) on Sunday July 18, 2010 @04:14PM (#32945114)

            Do you propose a better model?

            Yes, don't trust anything unless you absolutely have to. In user land, for example, we have SELinux and Apparmor to prevent applications from accessing things they shouldn't; protecting the kernel is obviously harder.

            How about the Linux model, where if the user decides to load it then it can do absolutely anything with the system?

            Generally speaking, Linux drivers are only installed if signed by the distro repository, and you have to trust that key: if it's compromised you're toast. Windows has three bazillion drivers signed by three bazillion keys and only one needs to be compromised.

            Nor will Linux drivers be loaded automatically from a random USB key just because you browsed there.
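            (For what it's worth, the repository-signature check being described can be reproduced by hand; a rough Python sketch, assuming an RPM-based distro and a hypothetical downloaded package path, and noting that exact output and exit codes vary between rpm versions:)

            import subprocess

            # Check a downloaded package against the keys already imported into the RPM
            # keyring before handing it to the package manager.
            pkg = "/tmp/example-driver-module.rpm"   # hypothetical path, for illustration
            result = subprocess.run(["rpm", "--checksig", pkg], capture_output=True, text=True)
            print(result.stdout.strip())
            # A non-zero exit status, or "NOKEY"/"NOT OK" in the output, means the package
            # is not signed by a key you trust and should not be installed.
            if result.returncode != 0 or "NOKEY" in result.stdout or "NOT OK" in result.stdout:
                raise SystemExit("refusing to install unsigned/untrusted package: " + pkg)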

            • by rawler ( 1005089 ) <ulrik...mikaelsson@@@gmail...com> on Sunday July 18, 2010 @04:59PM (#32945368)

              Generally speaking, Linux drivers are only installed if signed by the distro repository

              Actually, for most distros, "drivers" (code executed as root, which is the main barrier in a Linux system) are installed if they're signed by _any_ key in the keyring, including 3rd-party repositories.

              Many people add 3rd-party repositories to access newer versions of various packages, or packages not included in the distro, significantly increasing the attack surface. If you manage to get hold of a key for any of those repository signers, you pretty much have root access to thousands or millions of users' machines.

              One of the things Linux distributions must really rethink is the concept of 3rd-party software, and how it can be integrated and allowed more safely than it is today.

              One idea could be a special repository system for 3rd-party packages, chrooted to a separate container and not allowed to execute any scripts during installation (or allowed, but at non-root privileges). Another idea could be per-user installs of 3rd-party apps that install to $HOME/.local or similar, and never as root.

              • This is still far fewer keys than a Windows install has, and you still need to get the package into the repos as well (so you need to steal a key and crack a repo server).

                In addition, no key will let you attack more than a fraction of Linux installs. It would be nasty if someone got hold of an Ubuntu key (as it's the most popular distro) and they managed to place a package on the Ubuntu repo server, but most Linux installs would not even be exposed to the risk, and only those Ubuntu users who install the malware would be affected.

              • I have exactly three keys on my keyring: Adobe, Fedora stable, Fedora testing. I seem to remember that, with the addition of RPM Fusion, this is all that Fedora users ever install, statistically.

                My roommate has an HP printer, Wacom tablet, nVidia graphics card, Logitech trackball, Intel motherboard, and Creative soundcard. Not counting plug'n'play drivers, he's already going to have more keys to track if he ever upgrades from XP.

                Just an anecdote.

            • by sjames ( 1099 )

              Nor will Linux drivers be loaded automatically from a random USB key just because you browsed there.

              That's the big point. It doesn't matter if drivers are signed or not nor does it matter if someone steals a random signing key IFF the OS doesn't go installing drivers from random USB keys that get plugged in.

              USB devices are well defined so that as long as the vendor doesn't do something incredibly stupid like hiding all of the functionality behind a vendor specific extension, you don't usually need a bunch of special drivers.

              For the exceptions or where the USB driver is just enough to let a userspace progr

                • Still, on most Linux distros you're talking about maybe a dozen keys that the user themselves specifically trusted, and the chances of any 2 Linux users trusting the same 3rd party will be remarkably small. Under the Windows model, any VeriSign-trusted certificate will get you access; there have got to be millions of those (unless they restrict drivers to a different root certificate than app or website signing, but even then it'll be in the thousands), and if any one is compromised then every Windows user would be at risk.

                • by sjames ( 1099 )

                  Which is why I say that simply being signed is not a good enough reason to just let some driver on a random device be installed.

            • Yes, don't trust anything unless you absolutely have to. In user land, for example, we have SELinux and Apparmor to prevent applications from accessing things they shouldn't; protecting the kernel is obviously harder.

              You can set Windows to trust even less. In general a user can't install drivers at all on Windows, it takes an administrator to do it. If the administrator decides to install something without checking it well first, you're boned no matter what other steps you took.

              Nor will Linux drivers be loaded automatically from a random USB key just because you browsed there

              This is not a problem with the model, it is a bug in the implementation. Are you saying Linux doesn't have any coding errors in it?

              With this bug, the code only runs as the current user. So if the current user isn't an administrator, there's no risk of it installing a driver.

    • >>>It's like saying a politician told a lie.

      Yes, but some people still think politicians/government are completely honest, so they need a reminder from time to time that they aren't. Likewise some people think Windows is safe. Just this morning a Slashdotter posted that Windows is no more insecure than Linux. This story proves them wrong. (If this were Linux it would be fixed within a week by some resourceful OSS programmer.)

    • Reading the referenced articles and Microsoft's sites... they're not refusing to fix it. They said they're investigating and there are no plans to release an immediate fix. At best, this summary could be stretched to "urgent 0-day attack vector that Microsoft hasn't released a fix for". I wish there was a way to rate articles as flamebait. Some days Slashdot is just like playing the "Telephone Game". Sigh!
  • Source? (Score:5, Insightful)

    by Arainach ( 906420 ) on Sunday July 18, 2010 @03:03PM (#32944688)
    I know Slashdot's editorial standards have dropped, especially when it comes to Anti-Microsoft articles, but there is no link here to any article that claims Microsoft has no plans to patch the flaw. Do we even have editors anymore?
    • Re: (Score:2, Troll)

      by jwilhelm ( 238084 )

      Microsoft statement via Technet blog:
      http://blogs.technet.com/b/mmpc/archive/2010/07/16/the-stuxnet-sting.aspx [technet.com]

      • Re:Source? (Score:5, Informative)

        by Arainach ( 906420 ) on Sunday July 18, 2010 @03:14PM (#32944770)
        That's from their Anti-Malware team talking about how they detect it. Nowhere does it say that they have no plans to fix the bug.
    • Re:Source? (Score:5, Informative)

      by alexhs ( 877055 ) on Sunday July 18, 2010 @03:16PM (#32944784) Homepage Journal

      there is no link here to any article that claims Microsoft has no plans to patch the flaw.

      To be fair, the summary states

      it has no plans to patch the flaw right now

      Which is in the 2nd link actually.

      Microsoft said it is investigating the flaw and looking at possible solutions, however there was no clear indication that the company intends to patch the flaw in the near future.

      Well, from that quote to the summary there is quite a stretch, but what did you expect?

      • "no clear indication" isn't exactly a definitive response from Microsoft at all. It just means that one source hasn't heard a plan in *either* direction (to patch now/not patch now). Lots of room for ambiguity there, in my opinion.
    • by drsmithy ( 35869 )

      I know Slashdot's editorial standards have dropped, especially when it comes to Anti-Microsoft articles, [...]

      That's not really correct. Slashdot has excellent editorial standards when it comes to Anti-Microsoft articles, and has been serving up some of the best ones on the Internet for going on a decade now.

    • I swear we get this article every couple of months. The Google researcher's patch was released this patch cycle after Slashdot gave us a dozen breathless articles about how MS won't fix it. It's our Two Minutes Hate. Yet people still buy it. I guess if you're so anti-corporate you'll believe anything that is compatible with your bias. It's like guys who are into 9/11 conspiracy theories who later talk to you about UFO abductions and the hushed-up car that gets 100 mpg or somesuch. They want to believe bullshit.

  • I didn't put it through exhaustive tests, but I actually tried to make some link files, put them on a USB drive, and have them install something when I accessed the shortcuts in Windows Explorer. No luck whatsoever. I looked for some working examples but I couldn't find any, either.

    And funny, I did some work for a large oil/gas company that stored the config files for some flowmeters on USB thumb drives and left them in the battery boxes. It was really fun when the first wave of thumb drive viruses hit! T

    • Well, it's clearly some kind of bug in the icon handler for shortcuts, as Microsoft's workaround is to disable that with Regedit, which results in every shortcut having the generic file icon (a rather plain-looking Start menu results). I'd guess it's some sort of buffer overrun related to custom icons in the shortcut or something like that. Quite nasty really: you look at a directory with Explorer and Windows will execute code, because Microsoft seemingly can't load an icon without it causing a major problem.

      Ju
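      (For reference, the workaround being described reportedly amounts to blanking the default value of the shortcut icon-handler registry key; a small read-only Python sketch of where that lives, assuming the key path as the advisory described it. Actually clearing it needs admin rights and turns every shortcut icon generic:)

      import winreg  # Windows-only standard library module

      # The icon handler Explorer uses for .lnk files lives under this key; the
      # advisory's workaround was reportedly to blank its (Default) value.
      KEY_PATH = r"lnkfile\shellex\IconHandler"

      with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, KEY_PATH) as key:
          handler_clsid, _type = winreg.QueryValueEx(key, "")  # "" = the (Default) value
          print("Shortcut icons are handled by COM object:", handler_clsid)

      # Applying the workaround would mean exporting this key as a backup, setting the
      # (Default) value to an empty string from an elevated process, and restarting
      # explorer.exe; undo it later by restoring the saved value.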

    • The MS Security Advisory (2286198) [microsoft.com] states,

      The vulnerability exists because Windows incorrectly parses shortcuts in such a way that malicious code may be executed when the user clicks the displayed icon of a specially crafted shortcut.

      Sounds like a vulnerability in the way the Details panel in Explorer is updated when the user highlights a maliciously malformed icon. However, the MS page did indicate that the user has to actually click on the icon, so it appears that simply autoplaying the drive would not be enough to infect you (unlike some of the PDF/JPEG - or was it PNG - exploits that I seem to remember, which infected your computer through the code that generated the Explorer thumbnails).

      • Well, looks like I might have misunderstood the advisory. Further down in the details:

        When attempting to load the icon of a shortcut, the Windows Shell does not correctly validate specific parameters of the shortcut. ... An attacker could present a removable drive to the user with a malicious shortcut file, and an associated malicious binary. When the user opens this drive in Windows Explorer, or any other application that parses the icon of the shortcut, the malicious binary will execute code of the attacker’s choice on the victim system.

        From this, it appears that the part about clicking was rather misleading in that you do not have to click the icon for the exploit to execute.

        However, it also requires a secondary malicious binary in addition to the malicious shortcut, and it seems to me that AV software could easily be updated to detect and clean this sort of malformed file. In the meantime, I would suggest disabling Autoplay and using particular caution with removable drives.
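        (In that spirit, a deliberately crude heuristic sketch in Python, not real AV logic; flagging a shortcut that sits next to loose binaries on a removable drive, and the particular extensions checked, are just assumptions for illustration:)

        import os

        SUSPECT_PAYLOAD_EXTS = {".dll", ".tmp", ".exe"}  # assumed payload extensions

        def flag_suspicious_shortcuts(drive_root):
            """Report directories where .lnk files sit next to loose binaries, since
            this exploit needs both a crafted shortcut and a companion binary."""
            findings = []
            for dirpath, _dirs, filenames in os.walk(drive_root):
                shortcuts = [f for f in filenames if f.lower().endswith(".lnk")]
                payloads = [f for f in filenames
                            if os.path.splitext(f)[1].lower() in SUSPECT_PAYLOAD_EXTS]
                if shortcuts and payloads:
                    findings.append((dirpath, shortcuts, payloads))
            return findings

        for where, lnks, bins in flag_suspicious_shortcuts("E:\\"):  # hypothetical drive letter
            print(where, lnks, bins)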

  • Whose fault is it? (Score:5, Interesting)

    by KlomDark ( 6370 ) on Sunday July 18, 2010 @04:35PM (#32945228) Homepage Journal

    I think Microsoft is right on this issue. This problem is truly not theirs, except for the degree to which it negatively affects them. (About which they can do little except attempt spin control.)

    They designed their driver verification process intelligently: by requiring that drivers be signed with a certificate from an appropriate third-party certificate authority (VeriSign in this case), they left the business of managing encryption keys to the established, so-called "experts".

    Part of obtaining a high-trust VeriSign cert such as a driver-signing key is that the company requesting it must sign and comply with a detailed set of procedures describing the physical and organizational processes for handling and storing the keys in a very secure and documented "chain of trust".

    Here the security chain was broken by a (previously) trusted third party. We'll probably find that Realtek caused the issue by not properly following the chain-of-trust requirements; how else would a rogue employee be able to sign his malicious driver?

    <CoolStoryBro>
    A decade ago, I was a systems engineer for the internet banking division of a large bank that owned a bunch of other regional banks, and I was a "primary key custodian" (A defined role in the chain of trust requirements), so I was the one who would handle the technical details as far as getting the cert created and installing it on the web banking servers. (Just SSL certs rather than driver signing certs, but at the core they're the exact same thing.)

    The amount of procedural rigmarole for handling the certs was complex, and well thought out. I would create our private key in front of a few handpicked suits from corporate and data security who would observe me as I created our unsigned private key, then I would look away while one of the security people entered a complex password that I was not allowed to know, then I would get the cert signed by VeriSign (which would require the security guy to re-enter the password that I did not know), then we would get the certs back, print out several copies, seal them in an envelope, and all of us would sign it and take it to a safety deposit box. The security guys were not allowed to have a copy of the unsigned private key, and I was not allowed to know the password to the VeriSign-signed (VeriSigned?) key.

    [And it's been 10 years since I worked there, and the certs were only one-year certs (renewed each year going through the same type of process), so don't come try to hold me hostage for any info about the bank, my info expired 9 years ago! :) ]
    </CoolStoryBro>
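    (The key-and-CSR half of that ritual looks roughly like this today; a minimal sketch with the Python cryptography library, where the subject name and passphrase are obviously made up. In the story the passphrase would be typed by the security officer and never known to the key custodian:)

    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    # 1. The custodian generates the private key.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # 2. Build the certificate signing request that goes off to the CA for signing.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Example Bank"),   # made-up subject
            x509.NameAttribute(NameOID.COMMON_NAME, "banking.example.com"),
        ]))
        .sign(key, hashes.SHA256())
    )

    # 3. Store the private key encrypted under a passphrase only the security
    #    officer knows; the custodian keeps the file, never the password.
    key_pem = key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.BestAvailableEncryption(b"typed-by-the-security-officer"),
    )
    csr_pem = csr.public_bytes(serialization.Encoding.PEM)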

    So it looks like Realtek may have dropped the ball on their cert-handling procedures. Maybe VeriSign was lacking in their process auditing as well. Who knows? (I don't.)

    But to blame this one on Microsoft is asinine; how were they supposed to do anything differently?

    I suppose Microsoft could release a Windows update that revokes trust for any cert signed by VeriSign, but that would be devastating to online commerce: VeriSign has a near monopoly on the certificate market, so encryption would suddenly stop working for nearly all online businesses overnight.
    // But on the bright side: all those sites would still work in the morning on Linux, giving it a huge boost! :)
    /// But on the dark side: all those sites would still work in the morning on Macs as well, giving the idiocracy movement a huge boost as well. :(

    • Re: (Score:2, Informative)

      The flaw that isn't going to be fixed "in the near future" is the "if a shortcut's icon is shown in Windows Explorer, then automatic execution of malicious code may occur" one (perhaps this is some sort of buffer overflow in the icon parameter reader?). The best workaround? Disable the display of icons for shortcuts. Attack vectors? WebDAV, USB sticks, and LAN shares mostly. To that end, I'd imagine Microsoft is directly at risk given they likely have multiple rather huge LANs and it's already been demonst

      • The flaw that isn't going to be fixed "in the near future" is the "if a shortcut's icon is shown in Windows Explorer, then automatic execution of malicious code may occur"

        I’m still not sure where that idea comes from. Microsoft admitted a flaw in the icon display code for shortcuts (“When attempting to load the icon of a shortcut, the Windows Shell does not correctly validate specific parameters of the shortcut.”), so presumably they will be patching it shortly.

    • Re:Whose fault is it? (Score:5, Informative)

      by causality ( 777677 ) on Sunday July 18, 2010 @06:15PM (#32945756)

      But to blame this one on Microsoft is asinine; how were they supposed to do anything differently?

      Do you have any familiarity whatsoever with this situation?

      Windows has an acknowledged flaw/vulnerability related to its handling of .lnk files (shortcuts). That flaw is being exploited to install this malicious driver. The problem has been greatly compounded by the fact that the driver is signed by a previously-trusted private key, but this is not the original flaw. Normally the act of merely plugging in a USB thumbdrive does not immediately install system software such as device drivers. It is that acknowledged .lnk flaw that makes this possible.

      If you can install a hardware driver with an exploit, you can also install a worm, rootkit, etc. This attack happens to install a device driver. If Realtek's private key had never been compromised, then instead of installing a malicious device driver, you'd have Windows users plugging in infected USB thumbdrives and immediately becoming members of botnets. The flaw is in the Windows system and its handling of shortcut files.

      It is that flaw and only that flaw for which Microsoft is being blamed.

      I suppose Microsoft could release a Windows update that revokes trust for any cert signed by VeriSign

      Why would they do that when VeriSign can revoke only this specific Realtek cert? In fact, that's exactly what they have done.

      Seriously. Did you even bother to read the summary? At all? I'll quote it for you. This is the summary, verbatim:

      "Microsoft has acknowledged the vulnerability that the new Stuxnet malware uses to launch itself via .lnk files, but said it has no plans to patch the flaw right now. The company said the flaw affects most current versions of Windows, including Vista, Server 2008, and Windows 7 32- and 64-bit. Meanwhile, the digital certificate belonging to Realtek Semiconductor that was used to sign a pair of drivers for the new Stuxnet rootkit has been revoked by VeriSign. The certificate was revoked Friday, several days after news broke about the new malware and the troubling existence of the signed drivers."

      Emphasis is mine. Now go clean the egg off your face.

  • by goodmanj ( 234846 ) on Sunday July 18, 2010 @04:56PM (#32945348)

    I'm no Windows expert, but isn't this exactly the way the certificate system is supposed to operate? This sounds like a security success story, not a failure.

    Driver needs certificate to work with OS. Driver is found to contain security flaw. Certificate is revoked, OS refuses to recognize driver, security hole is closed. Now driver manufacturer has to clean up their act before their drivers are allowed back in the house.

    The headline reads "Microsoft has no plans to patch new flaw", but isn't the certificate revocation at least as good as a patch? More so, because it seals off any *other* undiscovered bugs in the driver? Or am I missing something?

    • by causality ( 777677 ) on Sunday July 18, 2010 @06:23PM (#32945790)

      I'm no Windows expert, but isn't this exactly the way the certificate system is supposed to operate? This sounds like a security success story, not a failure.

      Driver needs certificate to work with OS. Driver is found to contain security flaw. Certificate is revoked, OS refuses to recognize driver, security hole is closed. Now driver manufacturer has to clean up their act before their drivers are allowed back in the house.

      The headline reads "Microsoft has no plans to patch new flaw", but isn't the certificate revocation at least as good as a patch? More so, because it seals off any *other* undiscovered bugs in the driver? Or am I missing something?

      Please see this post [slashdot.org] where I correct a similar false notion. Then, please berate your teachers for failing to transmit basic reading comprehension skills to you. Hint: the signed malicious device driver is incidental and is not the flaw that Microsoft may or may not patch.

      Sorry for the tone but I just don't see what part of this is difficult to understand.

  • by mysidia ( 191772 ) on Sunday July 18, 2010 @07:07PM (#32946044)

    The article doesn't say it, and at no time was Microsoft reported as saying there were no plans to patch this bug.

    Just because you are unaware of them reporting they will release a patch does not mean they have no plan to patch it.

    They have offered workarounds and appear to be treating this seriously.

    Just because it's the weekend and they haven't told you there will be a patch available Monday DOES NOT mean they are ignoring or refusing to work on patching this.

  • On OS X, you have to go into Keychain Access and its preferences to enable OCSP checking of certificate revocation. Does the Windows mechanism for checking certificate revocation run by default?

    So, revoking the certificate won't mean a thing until some Windows update (i.e. updated root certificates) comes along. That would, of course, change if Microsoft took it seriously enough to ship a 5 KB (yes, kilobyte) out-of-band Windows update which wouldn't require a reboot and couldn't possibly cause issues.

    Don't they have slightest
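    (For what it's worth, the revocation check in question is an OCSP query; a rough Python sketch with the cryptography library, assuming you already have the certificate and its issuing CA certificate on disk as PEM files. The file names are placeholders:)

    import urllib.request

    from cryptography import x509
    from cryptography.x509 import ocsp
    from cryptography.x509.oid import AuthorityInformationAccessOID, ExtensionOID
    from cryptography.hazmat.primitives import hashes, serialization

    # Placeholder file names: the certificate to check and the CA that issued it.
    cert = x509.load_pem_x509_certificate(open("signing_cert.pem", "rb").read())
    issuer = x509.load_pem_x509_certificate(open("issuing_ca.pem", "rb").read())

    # The OCSP responder URL is published in the certificate's AIA extension.
    aia = cert.extensions.get_extension_for_oid(
        ExtensionOID.AUTHORITY_INFORMATION_ACCESS).value
    ocsp_url = next(d.access_location.value for d in aia
                    if d.access_method == AuthorityInformationAccessOID.OCSP)

    # Build the request, POST it, and read back GOOD / REVOKED / UNKNOWN.
    req = ocsp.OCSPRequestBuilder().add_certificate(cert, issuer, hashes.SHA1()).build()
    http_req = urllib.request.Request(
        ocsp_url, data=req.public_bytes(serialization.Encoding.DER),
        headers={"Content-Type": "application/ocsp-request"})
    resp = ocsp.load_der_ocsp_response(urllib.request.urlopen(http_req).read())
    print("Certificate status:", resp.certificate_status)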

  • ...if the reason for the delay in fixing the bug is purely commercial...

    Think about it.

    MS probably owns a fair whack of shares in most of the big AV vendors. MS tips off the vendors about the exploit and they find a way to mitigate the effects (not fix the problem).

    The vendors then use the month or so between MS's scheduled updates to panic the masses into thinking they need to renew their AV subscription to cope with this new virus attack.

    Once they have milked the masses for a month or so of re-subscriptions, MS then releases the fix.