Emergency Software Patches Are on the Rise (nbcnews.com)

Emergency software patches, in which users are pushed to immediately update phones and computers because hackers have figured out some novel way to break in, are becoming more common. From a report: Researchers raised the alarm Monday about a big one: The Israeli spyware company NSO Group, which sells programs for governments to remotely take over people's smartphones and computers, had figured out a new way into practically any Apple device by sending a fake GIF through iMessage. The only way to guard against it is to install Apple's emergency software update. Such emergency vulnerabilities are called "zero days" -- a reference to the fact that they're such an urgent vulnerability in a program that software engineers have zero days to write a patch for it. Against a hacker with the right zero day, there is nothing consumers can do other than wait for software updates or ditch devices altogether.

Once considered highly valuable cyberweapons held mostly by elite government hackers, publicly disclosed zero-day exploits are on a sharp rise. Project Zero, a Google team devoted to identifying and cataloging zero days, has tallied 44 this year alone where hackers had likely discovered them before researchers did. That's already a sharp rise from last year, which saw 25. The number has increased every year since 2018. Katie Moussouris, founder and CEO of Luta Security, a company that connects cybersecurity researchers and companies with vulnerabilities, said that the rise in zero days is thanks to the ad hoc way that software is usually programmed, which often treats security as an afterthought. "It was absolutely inevitable," she said. "We've never addressed the root cause of all of these vulnerabilities, which is not building security in from the ground up." But almost paradoxically, the rise in zero days reflects an online world in which certain individuals are more vulnerable, but most are actually safer from hackers.

  • Maybe add more QA time and stop rushing deadlines?

    It seems like the easier it is to push an update, the less testing you get!

    • Or fundamentally change the way we create complex software because things aren't getting simpler. Just like hardware isn't getting simpler.

        • Instead of waiting for some privileged, sheltered prodigy to decide the standards suddenly don't fit the scope, expand the discussion to the general public (schools) and let the standards be defined by the industry and by responsible hands who care about it. It will mean fewer software billionaires and maybe less open source, but we can't continue on the path we have been on up to now....
        • by Junta ( 36770 )

          I suppose I'm not seeing how this could map to specifics. Broadly speaking there are classes of historical problems with systematic ways to detect/avoid, but a great deal of security vulnerabilities are design issues of random software X that are just uniquely problematic in a specific context.

          Not every piece of code in the world can reasonably have some specific standard to save it. In fact I would say that an open implementation tends to win over a 'standards body' for making useful, reliable, and secure

    • by Junta ( 36770 ) on Wednesday September 15, 2021 @07:09PM (#61800069)

      Security QA is pretty hard. You can have penetration testers come in, but ultimately they will never spend as much time on it as the rest of the world will.

      What I generally see is a lot of analysis tooling plus some manual auditing, which can be more effective than you might think. But if your product attracts enough attention, it's hard to compete with the malicious community manually examining every nuance of how your product works. A very popular product can be subjected to decades of man-hours within a few days of release, and there's no way you can budget enough to compete. You can impress upon all your developers the importance of thinking through the potentially unsafe failure modes of deliberately bad input, and do better than an audit; if your team is pretty good, you can have a solid outcome. But one slip-up is all it takes.

      Deadlines are tricky because it's also a very competitive landscape. Slow and steady does not, in fact, win the race in business. Look no further than Windows, which was a cesspool of security issues for years; the market gave Microsoft more money anyway, which ultimately allowed it to address the fundamental issues in time, rather than rewarding the more hardened desktops that proceeded carefully precisely to avoid the pitfalls Windows suffered from.
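      That "deliberately bad input" discipline can be sketched as a minimal fuzz harness. Everything below is hypothetical (a toy length-prefixed format, not any real product's parser); the point is that malformed input should only ever produce a clean, expected rejection:

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """Toy parser: the first byte declares the payload length."""
    if not data:
        raise ValueError("empty input")
    n = data[0]
    if n > len(data) - 1:
        raise ValueError("declared length exceeds payload")
    return data[1 : 1 + n]

def fuzz(iterations: int = 1000) -> None:
    """Throw random bytes at the parser; anything other than a clean
    ValueError on bad input is a bug worth triaging."""
    rng = random.Random(0)  # seeded for reproducibility
    for _ in range(iterations):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(32)))
        try:
            parse_length_prefixed(blob)
        except ValueError:
            pass  # rejected cleanly -- the desired failure mode

fuzz()
```

      Real-world fuzzers (AFL, libFuzzer, OSS-Fuzz) run coverage-guided versions of this loop for far longer than any release schedule allows, which is exactly the asymmetry the comment above describes.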

      • by swilver ( 617741 )

        QA is not the answer. No matter what kind of QA you do, you'll never find all the problems just by testing. Any large enough piece of software probably contains an order of magnitude more bugs than QA will *ever* discover, no matter how rigorous the process. Short of having other highly experienced developers analyze the source code, testing will simply not find even a fraction of the issues.

        Instead, what you need are good developers, experienced in avoiding security issues, programming in a language with v

        • by Junta ( 36770 )

          I think it is overstating the case to say that almost all vulnerabilities are bad memory handling, and it's a dangerous overstatement. It leads to complacency: "If I just write my code in (Java, Go, Rust) I'm all good; it's only the C developers who have to worry."

          For example, parsing a feature-rich file format as an interchange strategy. Even in a secure language, if you run as a privileged user and the malicious interchange file says 'include /etc/shadow', your secure language will happily do that, because that is, in fact, a feature.
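          The 'include' hazard above is easy to reproduce in a memory-safe language. A sketch, using a hypothetical text-interchange format with include directives (not any real format): the vulnerable version joins an attacker-controlled path unchecked, while the safe version confines resolved paths to a base directory.

```python
import os

def render(text: str, base_dir: str) -> str:
    """Vulnerable: expands 'include <path>' lines with no containment
    check, so 'include ../../etc/shadow' is honored as a feature."""
    out = []
    for line in text.splitlines():
        if line.startswith("include "):
            path = os.path.join(base_dir, line[len("include "):].strip())
            with open(path) as f:
                out.append(f.read())
        else:
            out.append(line)
    return "\n".join(out)

def render_safe(text: str, base_dir: str) -> str:
    """Same feature, but resolved paths must stay inside base_dir."""
    base = os.path.realpath(base_dir)
    out = []
    for line in text.splitlines():
        if line.startswith("include "):
            path = os.path.realpath(
                os.path.join(base, line[len("include "):].strip()))
            if not path.startswith(base + os.sep):
                raise ValueError("include escapes base directory")
            with open(path) as f:
                out.append(f.read())
        else:
            out.append(line)
    return "\n".join(out)
```

          Note that no memory-safety bug is involved anywhere: both versions are idiomatic, bounds-checked code, and the only difference is a design decision about what input is allowed to mean.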

    • by jmccue ( 834797 )
      Maybe it's time to dump Agile as used by large companies? In the cases I have seen, people work just to score points for the iteration instead of spending the time needed on testing.
    • by swilver ( 617741 )

      Or you decode your GIF in a safe language... or perhaps don't run the GIF decoder with enough privileges to take over the system. GIF is not even that complex a format, but I guess performance has to trump safety; we wouldn't want those GIFs to take longer than 1 ms to decode.
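      The privilege-separation idea can be sketched in a few lines: hand untrusted bytes to a short-lived child process and treat whatever comes back as plain data. The worker below is a stand-in (it just reverses the bytes); a real one would call an image library and run under a throwaway UID, a seccomp filter, or a container, so a decoder exploit doesn't own the whole system.

```python
import subprocess
import sys

# Stand-in worker: a real one would perform the actual decode here.
DECODER_WORKER = r"""
import sys
data = sys.stdin.buffer.read()
sys.stdout.buffer.write(data[::-1])
"""

def decode_untrusted(blob: bytes, timeout: float = 5.0) -> bytes:
    """Parse untrusted bytes in an isolated child, not in-process."""
    proc = subprocess.run(
        [sys.executable, "-c", DECODER_WORKER],
        input=blob,
        capture_output=True,
        timeout=timeout,
        check=True,  # a crashed worker surfaces as an error, not a compromise
    )
    return proc.stdout
```

      This is roughly the architecture Chromium's renderer sandbox and Apple's BlastDoor for iMessage use: the code that touches attacker-controlled bytes is the least-privileged component in the system.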

    • by antdude ( 79039 )

      Also, companies: stop letting internal QA people go! Yes, use external testers, but internal paid QA people are important too!

    • by whitroth ( 9367 )

      Maybe stop with "faster is better" and "oh, our competitor put out a new update/version, we have to do it too", and concentrate on the core rather than on bells and whistles most people don't use... and *then* you have time for adequate testing.

  • by Mal-2 ( 675116 ) on Wednesday September 15, 2021 @04:37PM (#61799687) Homepage Journal

    It's not an estimate of severity, it's an estimate of how long until it's exploited in the wild. 0 Day just means it's already being exploited, it does not necessarily indicate a severe threat.

    • Zero day means it could already be exploited, not that it is already being exploited.

      And on the subject of the title: security bugs are not up, patches are up.

    • by Anonymous Coward

      Almost everybody gets this wrong. A 0-day does not have to be actively exploited to be a 0-day. Read the Wikipedia entry; it has some history on the term.

      The short of it is that the current meaning of "0-day" is a vulnerability that has no patch available to fix it.

    • by AmiMoJo ( 196126 )

      Zero day does not refer to the first point at which it was exploited, but the first point at which it was known.

      A zero day vulnerability is brand new, previously unknown and affecting the current version of the software/hardware affected. It also includes enough information to reproduce the flaw, enabling bad actors to build malware around it.

  • by Anonymous Coward
    Argh, stupid reporters, repeating each other's lies instead of reading the actual CVE and Apple's security statements. It's a PDF vulnerability in CoreGraphics.
  • It's bad that software seems to have so many flaws, but it's good that they are being patched quickly, even if it might be disruptive to customers.

    • by MrL0G1C ( 867445 )

      I find that Android is pretty dumb about updates, though. I'll leave the house, look at the phone after it's charged, and it's asking to update at a time when I'm on an expensive connection. Then when I get home and plug it in to charge, with a good free connection, it does nothing.

      Stupid. Why not just automatically update overnight while I'm charging the phone? Learning when a good time to update is would be a good candidate task for AI.
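      It wouldn't even take AI; a plain heuristic covers the complaint above. A hypothetical policy sketch (the field names are made up, not Android's API): update only when the device is charging, on an unmetered connection, idle, and it's overnight.

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    charging: bool
    unmetered_network: bool
    screen_on: bool
    hour: int  # local time, 0-23

def should_update(s: DeviceState) -> bool:
    """Update only when it costs the user nothing."""
    overnight = s.hour >= 23 or s.hour < 6
    return (s.charging
            and s.unmetered_network
            and overnight
            and not s.screen_on)
```

      Android's real JobScheduler can already express most of these constraints (setRequiresCharging, NETWORK_TYPE_UNMETERED, setRequiresDeviceIdle), which makes the behavior described above all the more puzzling.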

    • by antdude ( 79039 )

      It's annoying when there are bugs in patches.. :(

  • How does that coincide with the adoption of open source? To be clear, I'm not knocking open source. I can imagine a literal 50-50 split between open and closed source as a road to success for many projects.
    • by Junta ( 36770 )

      I doubt there is good quality data.

      Open source is a popular target for research, obviously, because researchers can just farm GitHub for all sorts of codebases to analyze, from all sorts of random developers. So some teenager with no security practices posts homework code, or code to manage their game library, and it gets sucked right up into the researchers' target set, right alongside a professionally managed, significant codebase. In this way, there's a mountain of available source code that has ne

  • In order for the NSO Group to develop software capable of penetrating the security defences of platforms like Google’s Android and Apple’s iOS, isn’t it fair to say that what the NSO Group does is to basically commit crimes in the context of the Computer Fraud and Abuse Act?

    I appreciate that this question is secondary to the presence or absence of vulnerabilities in the software platforms concerned, but why aren't companies like NSO Group being prosecuted? If this was a lone hacker
    • by Jeremi ( 14640 )

      Hacking your own computer isn't a crime.

      Hacking someone else's computer (without their permission to do so) is a crime.

      As long as the NSO Group is doing its penetration-testing exclusively on devices that they own (or with the explicit consent of the owners of the devices), no crime has been committed.

    • by AmiMoJo ( 196126 )

      Commercial malware vendors avoid prosecution by only selling to people who are authorized to access the data, usually law enforcement and secret services. As such there is no unauthorized access, at least on paper.

      • by ytene ( 4376651 )
        I can't fault the veracity of your logic here. But I worry that it is "theoretical logic", per the good old adage: "In theory there is no difference between theory and practice; in practice, there is."

        From what little I've read about this sort of spyware being detected, the most prevalent group finding it on their phones has been folk like journalists: a group for whom a justifiable case for interception might be very hard to make. OK, I get that protections of the freedoms
        • by AmiMoJo ( 196126 )

          Oh of course they know it gets abused, but in that case they can argue that they sold it in good faith and the contract said it should not be abused, so they did their part.

  • by Tough Love ( 215404 ) on Wednesday September 15, 2021 @06:25PM (#61799999)

    Sixteen of the 45 are Windows. Four are Android. Zero are Linux.

    Yeah, I know, Android is Linux. If Android development was actually open we would have that fixed too. Doing our best.

    A good share of Apple crapware in there too. It all aligns pretty well with what we've been telling you. Never believe Apple or (especially) Microsoft when they claim they've turned the page. It's just more of the same.

    • by gTsiros ( 205624 )

      Android is Linux-based, but it is not Linux.

      It lacks important components.

      • You're full of it. Fire up a shell and take a look around. Compile a Linux program and run it. It's Linux.

      • Android is Linux-based, but it is not Linux.
        It lacks important components.

        That is literally not how this works. If it's got a Linux kernel and a Unix-like userland, then it's Linux. And if you want to, you can use Linux Deploy (with root) to install a FULL Linux distribution userland on your Android system, or use Termux (with or without root) to install enough missing pieces to make Android behave like the Linux you know and love.

        Frankly though, any Android system with a terminal app installed is immediately recognizable as Linux to anyone who is familiar with it. And some version

    • by AmiMoJo ( 196126 )

      That's only the ones that Google has discovered, though. If you look through the CVE database there are plenty of issues in the Linux kernel that needed patching quickly (fortunately most were responsibly disclosed, so the fix came in the next release cycle), and even more in Linux distros.

  • This article has so much wrong with it that I'm surprised to see this on Slashdot. Or I would be, if this were 20 years ago.

  • When people get used to the idea that a patch is needed immediately to prevent a security hole, don't the patches themselves become a very valuable target? I'm sure they are very well protected - but even the NSA was not able to protect their own data, do we really think Microsoft or Apple are better at it?
    • even the NSA was not able to protect their own data, do we really think Microsoft or Apple are better at it?

      In short, yes. The NSA is unable to attract the best and brightest because they do not pay as much as the private sector, and because you have to get a security clearance and pass a drug test. Putting those three factors together disqualifies almost everyone worth hiring, because almost everyone will want more money and/or fail to get clearance and/or wants to use illicit substances.

  • But it only now starts happening in the psychopathic thug community, where they only ever give a shit if you live or die if it affects their bottom line / cocaine stream.

  • Programming is complicated. Having developed software professionally for years, I would argue you can never harden everything down.

    A high-level language is developed in a lower-level language; there will be flaws in the millions of lines of code that are involved.

    If someone thinks they can QA everything, good luck with that. Do you think you can simulate an OS failure that impacts your software? Even more fun: hardware and driver flaws happen, and in unexpected ways. Will you emulate that the p
