Microsoft Taking Longer to Fix Flaws

An anonymous reader writes "A look back at the last three years of security patches from Microsoft shows Redmond is taking at least 25 percent longer to issue patches for "critical" vulnerabilities, now averaging around 135 days to issue a fix. The exception appears to be with "full disclosure" flaws, for which Redmond issued fixes in an average of 46 days last year."
  • So they're concentrating efforts on the full disclosure exploits... and this is bad why?
    • I was thinking the same thing. They announced months ago that they would only release updates on certain days of the month unless it was highly critical.
    • by kg4gyt ( 799019 ) on Wednesday January 11, 2006 @11:23AM (#14446240)
      Focusing on the exploits or not, 46 days is a long time to wait for a critical fix.
      • by saleenS281 ( 859657 ) on Wednesday January 11, 2006 @11:36AM (#14446340) Homepage
        when you're accountable to that many customers with so many "supported" configurations, it takes a while to test. They don't have the luxury of most Linux distros, where if it breaks some obscure program they can go "whoops, well, tell the author to write a fix for his app".
          • when you're accountable to that many customers with so many "supported" configurations, it takes a while to test. They don't have the luxury of most Linux distros, where if it breaks some obscure program they can go "whoops, well, tell the author to write a fix for his app". And yet Debian manages to consistently not break stuff despite supporting more architectures than Microsoft could dream of.

          Apart from that time a while back when they had to transition between GCC versions, that could have been bett

            • architecture != software packages. And definitely != enterprise software packages. Veritas, Oracle, anyone?

              I won't even begin to go into how many times a Red Hat update has "broken" both of these.
            • "architecture != software packages. And definitely != enterprise software packages."

              Well, to be accurate, Debian as a distro supports a number of packages that dwarfs what Microsoft supports. Just look at a list. Now multiply that by the number of platforms, compared to MS's platforms, which is just one. As for 'enterprise' packages, yes, many of the Debian (and Linux in general) packages are smaller than things like Exchange or Veritas, but many are also on par as well. So the statement "definitely !=
              • by Lifewish ( 724999 ) on Wednesday January 11, 2006 @12:16PM (#14446671) Homepage Journal
                Strictly from a customer-is-always-right point of view, what's their excuse? Not enough testers? Not enough programmers? Not enough managers?
                I'd go with "not enough clearly-defined interfaces". If software producers are forced to use undocumented APIs to get their product working fast/well enough, it seems obvious that any behind-the-scenes changes are going to break a whole load of products.
        • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Wednesday January 11, 2006 @11:40AM (#14446386)
          when you're accountable to that many customers with so many "supported" configurations, it takes a while to test.
          What is this "a while"?

          Is it a day?
          Is it a week?
          Is it a month?

          Doesn't Microsoft have enough money to maintain images of different configurations just for such testing?

          Doesn't Microsoft have the people who could automate such testing?

          Is the problem that they don't have enough money? Or that they don't have people who are smart enough? Or that they just aren't doing it?
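
          For what it's worth, a minimal sketch of the kind of automation being asked about, assuming a library of prebuilt configuration images; the vmctl tool, image names, and patch name are all hypothetical:

            # Hypothetical sketch: fan a candidate patch out across a library of
            # preserved OS/configuration images and collect pass/fail results.
            # "vmctl", the image names, and the patch name are all made up.
            import subprocess

            IMAGES = [
                "win2k-sp4-base", "win2k-sp4-office", "winxp-sp1-base",
                "winxp-sp2-base", "winxp-sp2-ie6-office", "win2k3-sp1-iis",
            ]

            def passes_on(patch: str, image: str) -> bool:
                """Boot a clean snapshot, apply the patch, run the test suite."""
                try:
                    subprocess.run(["vmctl", "start", image], check=True)
                    subprocess.run(["vmctl", "exec", image, "install", patch], check=True)
                    return subprocess.run(["vmctl", "exec", image, "run-tests"]).returncode == 0
                finally:
                    # Roll the image back so the next run starts clean.
                    subprocess.run(["vmctl", "revert", image], check=True)

            failures = [img for img in IMAGES if not passes_on("example-patch.exe", img)]
            print("failed on:", failures or "nothing")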
          • by Anonymous Coward
            I think it's more along the lines of how long it takes to build the binaries for the new components and run them through a battery of automated test scripts.

            The software that we write at my current employer is a complex vector editing system and image RIPing. Our regression test suite can take up to 3 days to run. Whoops, that last fix broke something in abc.dll that depended on some behavior coming from def.dll. That will take a day to fix, 4 hours to build and rerun the test suite. Rinse, repeat until
            • It's more likely how long it takes to run that battery of test scripts on several hundred "typical" hardware configurations. It takes a while; we should not berate MS for testing, if indeed that is what is happening.

              In all likelihood they are diverting resources from patching to Vista so they can ship it sooner. This is bad.
            • The software that we write at my current employer is a complex vector editing system and image RIPing. Our regression test suite can take up to 3 days to run. Whoops, that last fix broke something in abc.dll that depended on some behavior coming from def.dll. That will take a day to fix, 4 hours to build and rerun the test suite. Rinse, repeat until no more errors. An average fix may take us up to 10 days to code, test and deploy for patching.

              The question is, how many people and machines do you have dedica

          • Is the problem that they don't have enough money? Or that they don't have people who are smart enough? Or that they just aren't doing it?

            No, the problem is it takes time.

            Much like you can't produce a baby in a month just by getting 9 women in the same room.

        • by freshman_a ( 136603 ) on Wednesday January 11, 2006 @12:04PM (#14446569) Homepage Journal

          when you're accountable to that many customers

          When who's accountable? The disclaimer included with the last MS security update I downloaded read as follows:

          In no event shall Microsoft Corporation or its suppliers be liable for any damages whatsoever including direct, indirect, incidental, consequential, loss of business profits or special damages, even if Microsoft Corporation or its suppliers have been advised of the possibility of such damages.

          Now, unless I misunderstood, it's telling me that if I install said security patch, and it breaks something, I can't hold MS accountable.
          • by Itchy Rich ( 818896 ) on Wednesday January 11, 2006 @01:00PM (#14447056)

            Now, unless I misunderstood, it's telling me that if I install said security patch, and it breaks something, I can't hold MS accountable.

            You may or may not be able to hold them accountable in court, but third party adjudication is not the only form of accountability.

            If Microsoft didn't bother to test their patches carefully they'd risk upsetting their corporate customers, and hence their bottom line.

            • You mean they will upset the company's IT department. I hardly think that would trouble management that much.
            • So it is a delicate balance between pissing them off with a delay and pissing them off with a potentially broken fix. They can reduce the total number of pissed-off users in two ways.
              1- Produce and test a fix faster. I'll assume that since this is Microsoft and they have a lot of money in the bank, they could afford a few more coders and testers.
              2- Release a work around/fix with some simple testing and only release the official patch after some amount of testing has been completed. This allows the sy
          • Now, unless I misunderstood, it's telling me that if I install said security patch, and it breaks something, I can't hold MS accountable.

            There's accountable, and then there's accountable.

            Let's say MS releases a patch that ends up causing major problems for mission critical systems at a nonzero number of Fortune 500 companies. The next time those companies are looking at major systems overhauls, do you think they're going to seriously consider MS products?

            Sure, MS isn't liable if their products caus
            • by Fruit ( 31966 )
              Let's say MS releases a patch that ends up causing major problems for mission critical systems at a nonzero number of Fortune 500 companies. The next time those companies are looking at major systems overhauls, do you think they're going to seriously consider MS products?
              Actually, yes. Just like they always have.
            • Let's say MS releases a patch that ends up causing major problems for mission critical systems at a nonzero number of Fortune 500 companies. The next time those companies are looking at major systems overhauls, do you think they're going to seriously consider MS products?

              Good point. Similarly, Let's say MS releases a product that ends up causing major problems for mission critical systems at nearly every Fortune 500 company, a product that requires them to spend exorbitant amounts of money and resources

        • All flagship companies should look to set an example. They, after all, have the money and resources to fix their own problems. It's not as if Microsoft are short of either.
      • Focusing on the exploits or not, 46 days is a long time to wait for a critical fix.

        Fixes like this have to be tested and re-tested, which is not exactly something you do... yawn... while you wait for the espresso machine to finish filling up that paper cup. I used to work for a *NIX vendor where the usual procedure was to offer a workaround to plug up the security hole. The patch was then developed and sent off for testing from where it would sometimes return for a rework because it caused unexpected prob
        • I used to work for a *NIX vendor where the usual procedure was to offer a workaround to plug up the security hole. The patch was then developed and sent off for testing from where it would sometimes return for a rework because it caused unexpected problems in some other part of the OS.

          How long ago? What were your userbase demographics like?

  • Unfortunately, this trend seems to plague many of the major application vendors as well.
  • by Godeke ( 32895 ) * on Wednesday January 11, 2006 @11:20AM (#14446203)
    I was expecting to find a scathing review of the patch process, but instead found a fairly reasonable assessment of the realities of issuing security patches: disclosed vulnerabilities get patched faster in an attempt to protect users from the most probable exploit vectors, whereas undisclosed vulnerabilities give the breathing room to do more testing and attempt to repair related flaws that are discovered in the process.

    That doesn't make me happy with the current situation, but it does make sense to react quickly (even if it puts the reaction at risk of being a problem itself) when something is actively being exploited. More quality assurance can be placed on patches that are not actively exploited (although each day increases the chance it will be exploited) and even more quality assurance can be placed on patches for flaws that are unlikely vectors.

    Being responsible for very high reliability networks (our customer facing web and their support servers), high reliability networks (the corporate network, where I can apologize to someone's face if it blows up) and low reliability networks (my own internal network where I can fire anyone who complains) I have different thresholds for pain in the patching process depending on the network involved.

    I'm far more willing to just slap a patch on my internal network: after all, it is my testing ground and it affects me far more than anyone else if it dies. After I have assured myself it isn't total bunk, I will patch our corporate network. Finally, our high reliability network is patched only after the corporate network's servers and clients have given us confidence in the patch. Of course, that means our high reliability network has to be far more insulated (URL scanning proxies in another operating system, tightly controlled trust relationships, intrusion detection, etc) but it is worth the extra effort and cost to avoid a "bum" patch bringing down the show.

    Microsoft may not be reacting perfectly, but I think they are trying to balance corporate stability with the realities of exploitation. It sounds like they do need to throw some more resources to the departments involved to shorten the critical path, but with a system this complex, test cycles are going to be long and involved.
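
    A minimal sketch of the tiered rollout described above, with hypothetical tier names and soak periods (none of this is from the article):

      # Hypothetical sketch of a staged patch rollout: deploy to the
      # lowest-risk network first, soak, and promote only if nothing breaks.
      import time
      from dataclasses import dataclass

      @dataclass
      class Tier:
          name: str
          soak_days: int  # how long the patch must run cleanly before promotion

      ROLLOUT = [
          Tier("internal-lab", soak_days=2),      # the admin's own network
          Tier("corporate", soak_days=7),         # internal business network
          Tier("customer-facing", soak_days=14),  # high-reliability servers
      ]

      def deploy(patch: str, tier: str) -> None:
          print(f"deploying {patch} to {tier}")  # stand-in for the real push

      def soak(days: int) -> None:
          time.sleep(0)  # stand-in for actually waiting out the soak period

      def roll_out(patch: str, broke_something) -> str:
          for tier in ROLLOUT:
              deploy(patch, tier.name)
              soak(tier.soak_days)
              if broke_something(patch, tier.name):
                  return f"{patch} halted at {tier.name}"
          return f"{patch} fully deployed"

      print(roll_out("example-patch", lambda patch, tier: False))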
    • It sounds like they do need to throw some more resources to the departments involved to shorten the critical path, but with a system this complex, test cycles are going to be long and involved.

      Would throwing more resources actually help speed the process, though? More resources (meaning more people) tend to get more done in the same or even longer time. It's not a linear relationship, anyway. And the "more" they get done is not necessarily productive. On its face, adding more resources to the test p

    • Microsoft may not be reacting perfectly, but I think they are trying to balance corporate stability with the realities of exploitation. It sounds like they do need to throw some more resources to the departments involved to shorten the critical path, but with a system this complex, test cycles are going to be long and involved

      It's not just corporate stability. A lot of it is the architecture of their systems. While Dave Cutler designed a nice, highly-modular system in Windows NT, the newest versions of t
      • We use both Windows and Linux in our environment, so I agree that the "UNIX philosophy" is usually a favorable choice for speed of patches and such. However, I have run into instances where someone coded to "buggy behaviour". If you build an app that makes a call and it doesn't work, sometimes you notice "oh, it did X, it just needs Y to work". A better solution would be addressing the flawed component, but when the component is in C and the programmer works in something else, the workaround is often implem
      • While Dave Cutler designed a nice, highly modular system in Windows NT, the newest versions of that OS are a far cry from Cutler's original design. Everything is tightly integrated and various system components do many, many different things these days. So when changing one component, Microsoft programmers have to weigh that change against the rest of the system and all of the software that relies on that component.

        I don't think "modular" and "integrated" mean what you think they mean.

        Combined with truly st

      • The UNIX philosophy has always been and remains: "Do one thing and do it well."

        It has? Why does cat have like 6 different options then, including numbering lines (which we also have nl for)? Why did we move on from ed to vi? And then move on to xemacs? :) -- I still read my email in xemacs/gnus. One word: perl. Another one word: apache-httpd (this one is esp. close to my heart as I wrote my own webserver because apache-httpd had way too many options/bugs).

  • by tont0r ( 868535 ) on Wednesday January 11, 2006 @11:20AM (#14446207)
    a bear shits in the woods.
  • So, you mean to tell me that they fix flaws faster when they have users and system administrators breathing down their necks? Say it ain't so!
  • Meh (Score:4, Insightful)

    by Anonymous Coward on Wednesday January 11, 2006 @11:21AM (#14446213)
    Seems as though the reason stems from the fact that Microsoft actually has to make sure their patches are compatible with the rest of the things they support. As they support more and more hardware and software, the total can only go up.
    • Re:Meh (Score:3, Insightful)

      by varmittang ( 849469 )
      So Linux doesn't? I mean, it runs on more hardware, PPC, SPARC, blah blah put your chip in here. Linux also has multiple languages and lots of programs that need to share the same libraries. Sure you are more likely to have something break in Linux after a patch, but usually a few hours or a day later you have a patch for the program that got broken so it works properly again. I haven't had a program break due to a security patch yet on Linux, but I have on MS. And Linux vendors have their patch
      • Re:Meh (Score:4, Insightful)

        by The Angry Mick ( 632931 ) on Wednesday January 11, 2006 @12:15PM (#14446662) Homepage
        So, again, why does MS take so long?

        The legal department?

      • This one bit me bad the other day:

        http://lists.debian.org/debian-user/2006/01/msg00408.html [debian.org]

        However, the "issued patch" solved the problem. And better yet, I could patch it myself by editing one text file and rebooting.

        So yes; patches can and do break stuff in linux.

        That being said, a similar issue in Windows would have required a reinstall.

        Like the three win2k machines I have here right now which *refuse* to actually use windows update. I'm having to download all patches by hand and force feed them one at a time.
      • So Linux doesn't?

        Nowhere near as much (I'm assuming here by "Linux" you mean "people writing open source software for Linux").

        Sure you are more likely to have something break in Linux after a patch, but usually a few hours or a day later you have a patch for the program that got broken so it works properly again [...]

        And that's the problem it produces.

  • In the Windows Metafile case, MS patched my machine and rebooted it without my consent after just a few days. I guess it all depends on how serious the flaw is.

    http://www.stockmarketgarden.com/ [stockmarketgarden.com]
  • by chriss ( 26574 ) * <chriss@memomo.net> on Wednesday January 11, 2006 @11:21AM (#14446216) Homepage

    The most interesting result of Security Fix's study is that Microsoft took longer to fix a problem if the researcher waited to disclose the problem until after Microsoft published the patch.

    I'd like to know if the time to issue a fix also depends on existing exploits, i.e. is Microsoft faster if there is already an exploit out there? If yes, then it seems obvious that Microsoft does not really put as much effort into fixing bugs as they claim; they're "motivated" by public pressure.

    One explanation for additional delay in case of a not yet disclosed or not yet exploited problem may be more thorough testing, so it may not even be a bad thing. But I'm afraid that the delay is not really "in the best interest of the customers", more in the best interest of Microsoft. I have no proof, but it seems to be the general company policy.

    Chriss

    --
    memomo.net [memomo.net] - brush up your German, French, Spanish or Italian - online and free

    • Another explanation is that banging out a patch to fix the symptom is faster than fixing the problem. When there's exploit code running in the wild, the former is what happens. When there is no evidence that the vulnerability is being exploited, the latter is what happens.
    • Or a simpler explanation might be that, given a certain budget for fixing bugs/security flaws, they have to prioritize, and since bugs that have an exploit out in the wild are much more likely to have a negative impact, they get pushed to the front of the queue... which makes sense to me.

      I don't think they set out to solve X bugs in Y months. I would assume they have a certain number of manhours devoted to fixing bugs, and fix however many they get around to. They can always increase the resources devoted
      • I don't think they set out to solve X bugs in Y months. I would assume they have a certain number of manhours devoted to fixing bugs, and fix however many they get around to. They can always increase the resources devoted to this, yes, but I doubt anyone over there says "oh, this one doesn't have an exploit in the wild, try to take as long as you can to fix it".

        If you had the public reputation of Microsoft and had also declared years ago that from now on security would be priority No. 1, don't you think tha

    • One explanation for additional delay in case of a not yet disclosed or not yet exploited problem may be more thorough testing, so it may not even be a bad thing.

      The problem with this is simply that you can never know that a given exploit is NOT being taken advantage of somewhere. "It's safe for now; nobody knows about it." Meanwhile someone is quietly carrying the goods out of the back door somewhere.

      Just because a flaw isn't being broadcast from the rooftops doesn't mean that it's not be
    • by SgtChaireBourne ( 457691 ) on Wednesday January 11, 2006 @03:57PM (#14448763) Homepage
      There's a lot of misdirection going on here. The day an exploit is made public is not the same as when the bug it uses is reported. Nor is that the same as when the bug is found, nor is that the same as when MS acknowledges the bug.

      We're dealing with a number of different dates, some of which are often months or years apart:

      1. Date bug found by black hat
      2. Date bug found by white hat
      3. Date bug is reported
      4. Date bug is made public
      5. Date exploit is published
      6. Date exploit is found 'in the wild'
      7. Date MS acknowledges the bug
      8. Date MS announces a patch
      9. Date MS releases a patch
      10. Date MS releases a patch that fixes the bug / repairs damage from first patch

      Somehow, being a political movement / cult, MS becomes exempt from the rules of a normal business and from what customers expect. No other device or appliance has had even a fraction of the defects of MS's products without going through a major product recall. Our dear Chairman Bill will go down in history as the man who made bad engineering acceptable, aka the Microsoft Effect.
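
      A sketch of how the intervals between the dates listed above could be tracked; the dates below are invented for illustration, not taken from the article:

        # Invented example timeline for one bug, following the numbering above.
        from datetime import date

        timeline = {
            "reported_to_ms": date(2005, 6, 3),
            "made_public": date(2005, 11, 20),
            "exploit_published": date(2005, 12, 28),
            "patch_released": date(2006, 1, 5),
        }

        def days_between(a: str, b: str) -> int:
            return (timeline[b] - timeline[a]).days

        # The headline "days to fix" depends entirely on which pair you pick:
        print(days_between("reported_to_ms", "patch_released"))  # 216
        print(days_between("made_public", "patch_released"))     # 46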

  • by esac17 ( 201752 ) on Wednesday January 11, 2006 @11:23AM (#14446234)
    In the Linux world, the deployment of a bug fix and discovery of any potential bugs is part of the testing cycle. So you get a quick turnaround time when a bug is reported.

    When Microsoft has to issue a bug fix (and all jokes aside about not testing), I am sure they have a team devoted to testing it, then it has to get sent to all internal Microsoft employees and tested, and then probably even has some initial customer testing with the bigger companies to make sure nothing breaks, and then finally gets released to the public.

    Hopefully 165 or 365 days .. whatever it takes to make sure it is tested is a GOOD thing. I don't want to be their beta tester :)
    • How dare you post such a comment in this story! Where do you think you are?!

      BTW, you're totally right and I completely agree with you.
    • by randyflood ( 183756 ) on Wednesday January 11, 2006 @11:59AM (#14446531) Homepage Journal

      You ask why it is a bad thing if the time between the discovery of a security vulnerability and the time to release a patch is increasing. You acknowledge that in the Linux world, patches are fixed much faster due to their development model. So why is it a big deal if hackers can own your systems for longer without a patch being available? Isn't it obvious? HACKERS CAN OWN YOUR SYSTEM FOR LONGER BECAUSE A PATCH IS NOT AVAILABLE. That is what the big deal is. They can use whatever development model they want. Releasing shoddy patches is only one solution that is available to them. The fact that they are able to cut the time it takes to release a patch in half if a working exploit has been publicly released shows that it is more a matter of what resources they want to bring to bear on the problem rather than the minimum time to release a good patch. Or, another way of stating this: they are 25% less concerned with getting patches out in a timely manner than they used to be. So the importance of security at Microsoft is decreasing.

    • It's a bad thing because Linux's process—which involves having thousands of alpha and beta testers of the patch with direct access to the source code and the knowledge to make that access useful deploy it on their boxes—turns out to produce better patches faster. You, as a user who "doesn't want to be their beta tester", don't have to be. In 5-10 days (not 46 or 135) your distro vendor will have enough evidence that the patch is harmless and effective that they will make it available to you, a

  • by gasmonso ( 929871 ) on Wednesday January 11, 2006 @11:29AM (#14446299) Homepage

    If you look at the data, you will notice that some critical flaws were patched in less than 3-4 weeks. While that may seem long, it is somewhat reasonable due to the amount of verification/validation necessary. People forget that 95% of the world runs on M$ so they have to really test a patch before releasing it.

    On the other hand... because so much of the world depends on M$, they have an obligation to their customers to provide a secure OS and timely patches. Personally, I feel they are doing an "ok" job and seem to be getting better. A lot of vulnerabilities can be avoided just by running your PC behind a router and/or by using a firewall application. Personally, I have NEVER had a virus at home on any of my computers because I take simple preventative measures like running Norton AV and AdAware. I also put all my PCs behind a router.

    http://religiousfreaks.com/ [religiousfreaks.com]
    • Slightly OT, but a legitimate question..
      The background: I've never had a virus at home (well, not since DOS days). I don't run antivirus; I used to run antispyware, but it kept turning up nothing so I stopped. I run 3 windows xp PCs and several linux PCs. I don't use MS products for web browsing or e-mail (ever. period.) I do run windows firewall on my laptops (my wife uses hers at school, and I use mine at work and school, so it's safest), and I have a hardware firewall/router. I have open ports fo
    • Umm... How exactly is MS's track record improving?

      Details, please?

      Why do you feel like they are doing better? Because they release more marketing materials advertising security?

      How is XP now more secure than at release? Is the rate of infection down? (No, it's not.) Are patches being released more quickly? (No, they aren't.)

      I guess the XP firewall is on by default since SP2. I can't think of anything else, however.
    • some critical flaws were patched in less than 3-4 weeks. While that may seem long, it is somewhat reasonable due to the amount of verification/validation necessary. People forget that 95% of the world runs on M$ so they have to really test a patch before releasing it.

      HOW THE HELL can you be so indulgent? Sure, 3-4 weeks may seem reasonable, but the average 135 days can in no way whatsoever be justified by this argument ("they need to QA patches"). Microsoft is a multi-billion-dollar software company wh

    • Personally, I feel they are doing an "ok" job and seem to be getting better.

      No, statistics show they are not getting better (though it looks like Microsoft is putting more effort into improving their patch development process); read TFA: "In 2003, Microsoft took an average of three months to issue patches for problems reported to them. In 2004, that time frame shot up to 134.5 days, a number that remained virtually unchanged in 2005."

      • No, statistics show they are not getting better (though it looks like Microsoft is putting more effort into improving their patch development process).

        That probably reflects the standard problem with large-scale software development - as the product gets larger, the number of bugs increases and the difficulty of fixing each bug also increases. One of the reasons you see so many apps being duplicated and rewritten from the ground up is that it is often easier to start from scratch than fix a flawed progra

    • by EXTomar ( 78739 ) on Wednesday January 11, 2006 @03:55PM (#14448738)
      This is akin to the analogy of cars without modern safety features. "Personally, I have NEVER had a serious injury while driving any car because I take simple preventative measures like buying seat belts, safety glass, and air bags." The question one should be asking is why does the user have to buy "seat belts, safety glass, and air bags" for their computer in the first place? Shouldn't these things be standard features? Turning responsibility around onto the user lets MS off the hook. Users are using Windows as designed and sometimes getting serious malfunctions. It would be one thing if people were abusing their machines and breaking them. It is something else to be normally surfing the internet, reading email, or doing any other nominal activity and hitting a serious problem that leaves their system bare to the hackers. This is squarely Microsoft's problem, not the users'!

      I'm tired of this kind of apologetic excusing of Microsoft. As much as people want to blame the users, it's still all in MS's lap, since many of the problems stem from software doing things that it should never be allowed to do in the first place. AV software, hardware and software firewalls, malware scanners... it's all a hack to stop users from breaking their machines doing normal operations, because MS won't or can't engineer a system that disallows it.

      Years of experience on other systems have shown that computers are complex machines with complex interactions, all of which are prone to error and, worse, exploit if not carefully designed. On the other hand, Microsoft sold most of the world on the promise that Windows is as easy to use as a VCR and requires just as much maintenance, and look at where we are. We have to throw more and more money and time into workarounds while MS takes longer and longer to fix things. Why aren't more people asking why Windows works this way?
    • People forget that 95% of the world runs on M$ so they have to really test a patch before releasing it.

      No, 95% of the desktop world runs on Microsoft. Microsoft certainly doesn't have that kind of marketshare in server systems.

  • by ThinkFr33ly ( 902481 ) on Wednesday January 11, 2006 @11:31AM (#14446308)
    While it is certainly interesting (if true) that Microsoft takes longer to release patches with no known exploits roaming around, I would find it far more interesting to see which causes more harm: the longer patch times or the full disclosure.

    Just because Microsoft releases a patch quicker when full disclosure is used doesn't mean this results in less harm to users. It might take Microsoft 200 days to release a patch, but if the only people who know about the bug are the researchers who discovered it and Microsoft, then the end result is that little harm was done to the users.

    If, however, an easily understandable exploit is posted before Microsoft has fixed the bug, those 45 days might be a lot more dangerous for those users than the 200 days in the previous example.

    Of course, it's very difficult to know if the security researchers who discovered the bug are the only ones with knowledge of that bug. Could other people know about it and be actively using it to compromise machines? Maybe. But I would really like to see some data on this.

    I suspect that the vast majority of major worms and viruses take advantage of well known exploits published on the Internet by usually well meaning security researchers. Certainly all of the major worms I can think of off the top of my head follow this pattern. (MYTOB, LOVGATE, NETSKY, SASSER, ZAFI, SOBER, BAGEL, etc.)

    If so, people really are safer when the exploit is not published before Microsoft releases a patch despite the significant lag time for those fixes.

    So I guess which approach you take depends on your goal. If your goal is the glory of a 0-day exploit, then post away. But if your goal is the security of the end user, maybe you should keep it to yourself for the time being.
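
    A toy model of the 200-day vs. 45-day comparison above, with invented probabilities purely to illustrate the trade-off:

      # Toy model; every number here is made up for illustration.
      def expected_bad_days(days_unpatched: int, p_active_exploit: float) -> float:
          """Time spent unpatched, weighted by the chance that someone is
          actually exploiting the bug during that window."""
          return days_unpatched * p_active_exploit

      # Undisclosed bug: slow patch, but (maybe) almost nobody knows about it.
      print(expected_bad_days(200, 0.01))  # 2.0
      # Fully disclosed bug: patched faster, but the exploit is public.
      print(expected_bad_days(45, 0.90))   # 40.5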
    • That's an insightful and interesting question. The security community hasn't agreed on an answer yet.

      There's a good bibliography of the full disclosure debate [wildernesscoast.org] that will point you to many, many arguments.

      My personal favorite was a big study of many CERT reports which concluded that the practical window of vulnerability begins when the first automated exploit code hits the street and ends only when attackers lose interest. Practically speaking, not enough people install patches to affect the dynamics. Yo

      • You do, I do, my clients do, everyone who listens to us does, but think of all the worms and spyware that have exploited vulnerabilities that had been patchable for months.

        Good point. Hopefully things like auto-updates will mitigate this problem a bit. Since WinXP SP2 comes with auto-updates enabled, I have a feeling this will slowly be changing for the better as more and more people update to SP2 or buy new computers with SP2 pre-installed.
    • You're begging the question of whether the Major Internet Worm/Virus is the main thing worth patching to avoid. I'm not sure that's the case; most vulnerabilities are used in a variety of exploits, and the major malware is just the one that makes the most headlines. Also, most of the ones you mention already had patches available; they weren't zero-day exploits precipitated by "usually well meaning security researchers."

      I appreciate that you've said you need more data, but I think you actually need a lot

    • So I guess which approach you take depends on your goal. If your goal is the glory of a 0-day exploit, then post away. But if your goal is the security of the end user, maybe you should keep it to yourself for the time being.

      You've made a number of incorrect assumptions and failed to consider several important concerns. First, is the vulnerability likely being exploited? Is the vulnerability able to be mitigated by users and if so, are there drawbacks to the fix? What systems would be made vulnerable?

      F

    • Major worms tend to happen after full disclosure, because full disclosure allows people with no particular interest in exploiting a flaw to exploit it, and major worms are generally of little or no value to their creators. This is not to say that any particular unreported flaw is being exploited, but if nefarious types know about a flaw, chances are that the result will be more effective targeted phishing, data theft, etc., not a major worm, and nobody will ever realize that the flaw had been exploited, beca
    • Of course, what we can't see here is the long tail effect. How many Windows boxes are being exploited by holes unknown to the public, but that Microsoft is aware of? There is no easy way to tell.

      Here's a new benchmark that Microsoft would not like.

      T.C.C.M.

      Total Cost of Code Maintenance: how much does it cost to patch and test the base operating system source code per year? Microsoft vs. other commercial operating systems? Vs. open source operating systems?

      The T.C.O Microsoft does not talk about is on ther
    • If so, people really are safer when the exploit is not published before Microsoft releases a patch despite the significant lag time for those fixes.

      I'd counter that with the WMF vulnerability. The details of it were released with no Microsoft patch available. Now, once I know where the vulnerability is, I can protect myself immediately by unregistering the offending DLL and using my registry-monitoring tool to block any attempt by other software to re-register it. Or I can take advantage of a third-party
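
      For context, a minimal sketch of that unregistering step, assuming a Windows host; shimgvw.dll is the library named in Microsoft's WMF advisory, and regsvr32's /u and /s flags are standard:

        # Sketch: apply the widely circulated WMF workaround by unregistering
        # the Windows Picture and Fax Viewer library (shimgvw.dll).
        # Re-run regsvr32 on the same DLL without /u to undo it once patched.
        import os
        import subprocess

        def unregister_wmf_viewer() -> None:
            dll = os.path.expandvars(r"%windir%\system32\shimgvw.dll")
            # /u = unregister, /s = silent (suppress the confirmation dialog)
            subprocess.run(["regsvr32", "/u", "/s", dll], check=True)

        if __name__ == "__main__":
            unregister_wmf_viewer()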

  • by Anonymous Coward on Wednesday January 11, 2006 @11:32AM (#14446313)
    This is a great case for Intrusion Prevention Systems. I have seen many vendors providing "Virtual Software Patches" during the window from when a vulnerability is released to the time that it's actually patched. It's not the ideal solution, but it's definitely one of the best ways to take care of the problem today without waiting for m$ to get their stuff together.

    I'd say that in this week I've seen stuff from 3Com/TippingPoint, Secure Computing, Sonicwall, etc. all about securing WMF fairly quickly after the exploit had been announced.

  • With work going full force on Windows Vista, you would have to understand that Microsoft has other things on its plate. Also, perhaps these issues are a little more difficult to fix.
    • With work going full force on Windows Vista, you would have to understand that Microsoft has other things on its plate. Also, perhaps these issues are a little more difficult to fix.

      I hope you're being facetious because making excuses for the world's largest software company is just plain ignorant.

      They have NO excuse. Period. OpenBSD, a free open source operating system, is constantly auditing its code for security flaws. Windows has millions, perhaps billions, more lines of code to audit; however, they also have
    • Translation:

      "Sit down, shut up, and eat the gruel we put in front of you. We're better than you, smarter than you, and we know whats best."

      I'm not aware of any other software project, free or proprietary, that has as poor a security record as an equivalent Microsoft product.

      Don't blame it on marketshare; otherwise, Apache would lead IIS in terms of infection. And even if it is because of marketshare, you would think that the completely untouched (as in 0 viruses) environment of OS X would be a great targ
  • by Anonymous Coward
    If only they reported a MEDIAN time to fix rather than an AVERAGE. Assuming that the more important repairs are done in under thirty days, I'm willing to overlook the 365-day fixes that push the average way up.

    • If only they reported a MEDIAN time to fix

      OK, so how useful would it be to know that exactly half the patches are over the median time, and exactly half are under? If you want something really useful, we should have the mean time plus a standard deviation or two...
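
      A worked example of the difference, using invented days-to-patch figures (not the actual Microsoft data) to show how a few very slow fixes drag the mean far above the median:

        # Illustrative only: hypothetical days-to-patch figures, not real data.
        from statistics import mean, median, stdev

        days_to_patch = [18, 21, 25, 28, 30, 35, 40, 46, 300, 365]

        print(f"mean:   {mean(days_to_patch):.1f} days")    # 90.8, inflated by two outliers
        print(f"median: {median(days_to_patch):.1f} days")  # 32.5, the typical case
        print(f"stdev:  {stdev(days_to_patch):.1f} days")   # the large spread flags the skew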
  • That's it! I'm putting in a suggestion to my company that we put in this 3rd party patch [redhat.com] for the few Windows servers we have left.
  • Redmond is taking at least 25 percent longer to issue patches for "critical" vulnerabilities, now averaging around 135 days to issue a fix.

    It wasn't necessarily because it actually took longer for them to fix these new vulnerabilities; rather, their marketing department just wanted you to realize the immediate benefits of installing Microsoft Anti-Spyware beta.
    • I don't live under a bridge and eat little children, really.

      The reason it is taking so long to fix vulnerabilities, in my best estimation, is that they have many different applications/OSs to test these patches with, while at the same time they are trying to ramp up the efforts for a smooth release of Windows Vista. Attacks against Windows PCs are increasing by the day, and it is probably much more time consuming to fix the myriad of these vulnerabilities than it was, say, 5 years ago. But that is just a g
  • While releasing non-critical patches on a monthly cycle seems sensible, three months is a very long time. I wish I could believe it was all being spent on testing.

    A danger is that the time difference between patches for undisclosed vs. fully-disclosed vulnerabilities will encourage people to fully disclose without waiting. I hope Microsoft are working to bring down their cycle time for characterising the vulnerability, and developing and testing the patch.

    Does anyone have statistics for the number of bugs f
    • Sorry, guilty of not fully R'ing TFA before posting. To quote:
      One final note: Security Fix did not attempt to determine whether there was a correlation between the speed with which Microsoft issues patches and the quality or effectiveness of those updates.
      They thought that would require too much work to compile.
  • by XMilkProject ( 935232 ) on Wednesday January 11, 2006 @11:47AM (#14446436) Homepage
    The timeframe doesn't seem entirely unreasonable when you consider that they are releasing a patch which will be automatically downloaded and installed on literally tens of millions of computers, most of them without any system administrator to aid in the process.

    That is a daunting task, and I can imagine there's a very lengthy process a patch must go through.

    To Microsoft's credit, I can hardly remember a time that a patch was released which caused any major problems, which in itself is a great achievement given the amazing variety of hardware and software the users may have. There was of course a lot of hype over compatibility issues in SP2, but to the best of my knowledge any actual issues were understood ahead of time and due to compromises that were made intentionally for one reason or another.
    • Good points. The last patch I recall that negatively affected our business environment was Windows NT 4.0 Server SP6. This patch basically broke the TCP/IP stack and brought dozens of my company's servers to their knees. Of course it was partially our stupidity for not testing the patch on a non-production box first :-) Good thing SP6a came out relatively quickly thereafter.

      I know that as consumers we should expect Microsoft to test out their patches, and since back in 1997 I think they are obviously doing
  • by DoktorFuture ( 466015 ) on Wednesday January 11, 2006 @12:04PM (#14446568) Homepage
    I'm sure that the QA aspect of testing the patches takes the most time, because that is where Microsoft has the most to lose.

    Imagine if their patch accidentally disabled * * * TENS OF MILLIONS * * * of computers. If that happened, they'd lose so much consumer confidence -- essentially losing whatever gains (if any) they have made in the last several years (and billions in spending).

    (okay, that did happen on a lot of sp2 systems, and MS is not loved for it)

    MS has to ensure that the patch works on a staggering and dizzying array of systems and architectures (lots of different mobos, Pentiums, AMDs, dual-core CPUs, Xeons, VIA chips), and for dozens upon dozens of applications. That's why you'll often find that they release a patch for NT or other server-based systems before they release it for consumer systems.

    Another reason is that, depending on the type of problem, they will do a full traceability check, cross-reference all their code that references the changed module, and evaluate (probably manually) whether they put that dependency at risk. A huge, horrible job, suitable only for type-A, micro-detail-oriented folks. I wouldn't want to do it!
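
    A minimal sketch of that cross-referencing step, assuming a map of which binaries import which modules is already on hand; all the names here are invented:

      # Hypothetical sketch: given which modules each binary imports, find
      # everything that must be re-evaluated when one module changes.
      DEPENDS_ON = {
          "explorer.exe": ["shell32.dll", "gdi32.dll"],
          "mspaint.exe": ["gdi32.dll"],
          "iexplore.exe": ["shell32.dll", "urlmon.dll"],
      }

      def affected_by(changed_module: str) -> list[str]:
          """Reverse the dependency map: who imports the changed module?"""
          return sorted(
              binary for binary, imports in DEPENDS_ON.items()
              if changed_module in imports
          )

      print(affected_by("gdi32.dll"))  # ['explorer.exe', 'mspaint.exe']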

    If MS disabled TENS OF MILLIONS of computers, you would see a huge shift away from regular Patch Tuesday activities, towards 'install on a test bed' -- an extremely tedious and manual process that everyone would hate. Millions of people would be put out. Seriously bad karma.

    So, they can:

    • Release a damaging patch -> like an A-Bomb wiping away consumer confidence
    • Release a patch late -> some systems might be infected, but often, threats can be mitigated on key systems (firewall rules, policies, use different software), or third party patches appear to fix the problem.
    • Ignore a problem -> Perhaps try to lure people into exploiting it instead of finding new holes? :) Perhaps encouraging the industry to develop technologies like 'IPS' and 'worm crushers'?

    I'm sure at least someone is thinking "Heck: our flaws are the manure in which an entire security industry will grow".

    • Consider:

      Linux runs on ALL those platforms (Intel, AMD, etc.). And more (Alpha, IBM mainframe, SPARC, etc.). There is really no comparison.

      Consider:

      Linux supports more legacy hardware with the OS core.

      Consider:

      Linux vendors typically support 4+ GB of object code with a typical installation.

      Not that I care one way or the other about Linux/Windows comparisons, but this should give you something to think about.

      The obvious conclusion to draw? That the Open Source model is SO SUPERIOR to Microsoft's, that there
    • Imagine if their patch accidentally disabled * * * TENS OF MILLIONS * * * of computers.

      Imagine if their delaying a patch ended up with HUNDREDS OF MILLIONS of disabled computers.

      I, for one, am amazed this hasn't happened yet. Fortunately malware authors haven't gotten to the pure vandalism stage of their development.
  • On Full Disclosure (Score:2, Insightful)

    by SHP ( 8391 )
    A common argument of those who oppose full disclosure is that it does harm by allowing the development of worms, and provides infection vectors for spyware. I personally think the widespread worms are a good thing. They act like wildfire, clearing the underbrush of vulnerable machines.

    What really concerns me is not some 14-year-old kid in Bulgaria playing "my botnet is bigger than yours" games. I'm concerned about hostile governments, terrorist groups, and organized criminals who already have a stable of zero
    • Except worms don't clear the underbrush. Traffic from Slammer, Blaster, and other worms can STILL be seen today. Those machines are still compromised. They will probably remain compromised until the machine dies.

      EVERY vulnerability is a race between the people trying to fix it and the people trying to exploit it. It is NOT possible to "win" every race; the best you can do is set the rules in such a way that winning is much more likely for the good guys than the bad guys.

      I don't care what metric you try t
  • If you're a MICROSOFT GOLD member click here to get the patch !!NOW!! from our super faster servers!!


    All other members please use the public servers. Wait time is 10 minutes to 90 days. Please be patient. OR UPGRADE TO GOLD MEMBERSHIP STATUS!! FIND OUT HOW BY CLICKING HERE!!!
  • I mean, when you consider how long it takes them to put the flaws in their products in the first place, it's only reasonable that it would take them longer to get the flaws back out again, right?
  • by WhiteWolf666 ( 145211 ) <<sherwin> <at> <amiran.us>> on Wednesday January 11, 2006 @01:08PM (#14447138) Homepage Journal
    Why? Because the black hat community is very, very nice to MS.

    I've never met a truly destructive worm or trojan. I don't mean one that disabled systems as a side effect of its operation. I mean one specifically designed to destroy data, and/or BIOS/CMOS/anything flashable.

    A 4-month patch cycle. I imagine that if North Korea, or whoever felt angry about the global economy, decided to try to do something devastating, they could easily prepare some kind of trojan payload that would install itself, replicate for a week or so, and then destroy the system in question. Blow away the BIOS (which won't be noticed until a reboot), blow away the partition table, and then start writing loads of garbage all over the disk.

    Such a worm would break MS. MS execs would be brought before a congressional hearing.

    That is, after banks, airlines, and major companies managed to rebuild some kind of IT infrastructure.

    MS is very lucky that no black hats have decided to do such a thing. I guess it's most likely because no one wants to bring THAT kind of heat down upon themselves.
  • I mean really, why? If I get a patch to, say, gaim, I know that my motherboard and soundcard just aren't going to matter. That's what the job of the OS is, to abstract away those details from the application.

    Why does Microsoft have to test patches to things like browsers against all possible configurations? Why does it matter which CPU or motherboard or soundcard you have for a stupid browser issue?

    This all comes down to the stupidly broken architecture of having a largely monolithic system that has a

"Everything should be made as simple as possible, but not simpler." -- Albert Einstein

Working...