
US Healthcare Giant Norton Says Hackers Stole Millions of Patients' Data During Ransomware Attack (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: Kentucky-based nonprofit healthcare system Norton Healthcare has confirmed that hackers accessed the personal data of millions of patients and employees during an earlier ransomware attack. Norton operates more than 40 clinics and hospitals in and around Louisville, Kentucky, and is the city's third-largest private employer. The organization has more than 20,000 employees, and more than 3,000 total providers on its medical staff, according to its website. In a filing with Maine's attorney general on Friday, Norton said that the sensitive data of approximately 2.5 million patients, as well as employees and their dependents, was accessed during its May ransomware attack.

In a letter sent to those affected, the nonprofit said that hackers had access to "certain network storage devices between May 7 and May 9," but did not access Norton Healthcare's medical record system or Norton MyChart, its electronic medical record system. But Norton admitted that following a "time-consuming" internal investigation, which the organization completed in November, it found that hackers accessed a "wide range of sensitive information," including names, dates of birth, Social Security numbers, health and insurance information, and medical identification numbers. Norton Healthcare says that, for some individuals, the exposed data may have also included financial account numbers, driver's licenses or other government ID numbers, as well as digital signatures. It's not known if any of the accessed data was encrypted.

Norton says it notified law enforcement about the attack and confirmed it did not pay a ransom. The organization did not name the hackers responsible for the cyberattack, but the incident was claimed by the notorious ALPHV/BlackCat ransomware gang in May, according to data breach news site DataBreaches.net, which reported that the group claimed it exfiltrated almost five terabytes of data. TechCrunch could not confirm this, as the ALPHV website was inaccessible at the time of writing.


Comments Filter:
  • Your subscription of Norton 360 Total Protection for Windows might expire on December 11, 2023.
    After the expire date has passed your computer will become susceptible to many different ransomware threats.
    Your PC might be unprotected, it can be exposed to viruses and other malware ...

    Available: 62% 1Yr Renewal Discount: 2 mins 06 seconds left...

    Click here to Renew Subscription! Click "allow" in Windows Defender Security Center. Then enter your password and click "Run". When prompted "Do you want to allow t

  • Kentucky-based nonprofit healthcare system Norton Healthcare has confirmed that hackers accessed the personal data of millions of patients and employees during an earlier ransomware attack

    OK, sounds like more than a ransomware attack, or are they hiding something? I wonder if they had an earlier breach, forgot to report it, and then not long afterwards had the ransomware attack. So they combined the two. To me, "real" ransomware people also stealing data would break their business model.

    If my wild, off-the-wall guess is true, it's a nice way to deflect blame and liability :)

  • Our rich faith history includes founding organizations and other faith communities: Episcopal Church, The United Methodist Church, United Church of Christ, Presbyterian Church, Roman Catholic Church, and software with no real security, system segmentation, isolation or unusual activity detection.

    Seriously, there needs to be HEAVY penalties for data breaches.

    • That's right, punish the victims of crime. Maybe eventually the criminals will leave them alone, if you punish the victims enough times! Make potential victims hire a private army of security experts, and if they don't, too bad for them, they should be obliterated.

      The actual quote:

      Our rich faith history includes founding organizations and other faith communities: Episcopal Church, The United Methodist Church, United Church of Christ, Presbyterian Church and Roman Catholic Church.

      https://nortonhealthcare.com/a... [nortonhealthcare.com]

      We don't treat any other crime with this kind of attitude. If a jewelry store is the victim of a smash-and-grab, we go after the thieves, not the store owners (who clearly didn't pay enough money

      • by smap77 ( 1022907 )

        The comparison is inappropriate. The companies aren't holding something they solely own, and the example diamonds don't care who possesses them. Further, when a data breach happens the company hasn't lost the data, they've lost exclusivity of possession.

        The collected data comes with a responsibility to protect it. If a company doesn't want the responsibility, simply don't possess the data.

        • The diamonds don't care, but the owners of the diamonds do care. The data doesn't care, but the owners of the data do care. And the jewelry store doesn't necessarily own the jewels on display. So yes, I do think it's a good analogy.

          Yes, there is a responsibility to protect the data. But no amount of protection is foolproof. This is also true with physical security. There is no such thing as "enough" security to prevent all types of attacks.

      • We don't treat any other crime with this kind of attitude.

        Wrong. Criminal negligence is absolutely a thing and is prosecuted.

        • You've got a point. This is not that.

          Like most large organizations, they no doubt have spent a lot of money on security. How much spending is "enough"?

            Investment in security isn't merely about the amount of money spent, it's about how well you comply with best practices. I would bet you that they didn't bother with code coverage, fuzzing, or really anything beyond "well, it works". The fact that Windows boxes are even used in hospitals is insane in the first place, because there are literal gigabytes of executable code that is not secure.

            Bottom line: if you have to hope you invested enough in security then you didn't invest enough in security.
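            For anyone unfamiliar with the practice mentioned above: fuzzing just means throwing large volumes of random or malformed input at code and flagging any failure mode it doesn't explicitly handle. A minimal sketch in Python (the parser, its bug, and the harness are all invented for illustration, not anyone's real code):

            ```python
            import random
            import string

            def parse_record(line: str):
                """Toy record parser: 'name,age' -> (name, age).
                Hypothetical stand-in for real input-handling code."""
                if line[0] == "#":          # BUG: IndexError on empty input
                    return None             # treat '#...' as a comment line
                name, age = line.split(",", 1)
                return name.strip(), int(age)

            def fuzz(fn, trials=1000, seed=1):
                """Feed random strings to fn and record any exception that
                is not the documented rejection (ValueError) of bad input."""
                rng = random.Random(seed)
                crashes = []
                for _ in range(trials):
                    s = "".join(rng.choice(string.printable)
                                for _ in range(rng.randint(0, 20)))
                    try:
                        fn(s)
                    except ValueError:
                        pass                # expected: malformed input rejected
                    except Exception as exc:
                        crashes.append((s, type(exc).__name__))
                return crashes

            found = fuzz(parse_record)
            # only the empty string reaches line[0], so every crash is an IndexError
            print(len(found) > 0)
            ```

            Real fuzzers (AFL, libFuzzer, Atheris for Python) are coverage-guided rather than purely random, but the contract is the same: any crash outside the documented failure modes is a bug.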

            • So yes, it's important to invest in security. But are you going to tell me that failing to run proper code coverage and fuzzing, or failing to invest "enough," should be considered criminal? Really? My initial point was that everyone's first reaction is to blame the victim. Even stupid victims don't deserve to be victimized.

              While Windows is as insecure as you say, it's no worse or better than Linux or macOS. The only reason there are fewer breaches on those systems is that the breaches follow the money. I

              • But are you going to tell me that failing to run proper code coverage and fuzzing, or failing to invest "enough," should be considered criminal? Really?

                Yes. There should be strict protocols for all software that protects sensitive information like medical records, but Congress is a bunch of clueless fools and Republicans will vote against all regulation.

                While Windows is as insecure as you say, it's no worse or better than Linux or Mac OS.

                You fail to realize that with open source you get to pick which software and libraries you utilize and can therefore validate their quality. If the quality is lacking, you can improve it or build a replacement. However, with closed source, you get what you get and that's all there is to it.

                To that end, closed source

                • I think you vastly overestimate the capability of developers and companies to adequately evaluate the security implications and quality of the tools and software that they use. Stricter laws would not have prevented Heartbleed. Open source did not prevent Heartbleed. Widespread adoption and inspection and testing of OpenSSL did not prevent Heartbleed.

                  I do not want to live in a country where you make the laws. We software developers would have to go find other work, because your standard of "good enough" is

                  • I think you vastly overestimate the capability of developers and companies to adequately evaluate the security implications and quality of the tools and software that they use.

                    What this really means is that companies should be hiring individuals whose sole job is to inspect and validate their software. Furthermore, it would behoove society to create a certification system (akin to the FDA) for specific versions of shared libraries. If sensitive systems could only be built upon certified libraries (with certs that can be revoked), then we would have a hell of a lot fewer issues.
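                    One mechanical reading of "certified versions with revocable certs" is an allowlist of cryptographic digests: a system may only load a library whose exact bytes hash to a registered value, and revocation is just deleting the entry. A toy sketch in Python (the artifact name and registry are invented for illustration):

                    ```python
                    import hashlib

                    def sha256_hex(data: bytes) -> str:
                        """Digest of the exact artifact bytes."""
                        return hashlib.sha256(data).hexdigest()

                    # Hypothetical certified artifact; in reality this would be
                    # the bytes of a specific shared-library release.
                    artifact = b"pretend these are the bytes of libexample-1.4.2.so"

                    # Hypothetical registry: artifact name -> approved digest.
                    CERTIFIED = {"libexample-1.4.2.so": sha256_hex(artifact)}

                    def is_certified(name: str, data: bytes) -> bool:
                        """Accept only if the digest matches the registry entry;
                        revoking a cert means removing the entry."""
                        expected = CERTIFIED.get(name)
                        return expected is not None and sha256_hex(data) == expected

                    print(is_certified("libexample-1.4.2.so", artifact))         # True
                    print(is_certified("libexample-1.4.2.so", artifact + b"!"))  # False
                    ```

                    This catches tampered or substituted binaries; whether the certified version itself is free of vulnerabilities is a separate question, which is exactly the dispute below.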

                    Stricter laws would not have prevented Heartbleed. Open source did not prevent Heartbleed. Widespread adoption and inspection and testing of OpenSSL did not prevent Heartbleed.

                    And that's where you're wrong. Open source did prevent Heartbleed on up-to-date systems. The only people t

                    • You think "certifications" of shared libraries will prevent security vulnerabilities? Are you even a software developer? The problem with vulnerabilities is that they are difficult to find. If they could be discovered through a certification process, Heartbleed would never have made it into the wild.

                      Open source did not prevent Heartbleed on updated systems. The issue was introduced in 2012, and went undiscovered until 2014. https://en.wikipedia.org/wiki/... [wikipedia.org] That means that for two years, every up-to-date in

                    • The issue was introduced in 2012, and went undiscovered until 2014.

                      Sorry, I got the timeline mixed up.

                      You think "certifications" of shared libraries will prevent security vulnerabilities?

                      Prevent, no. Radically reduce them by orders of magnitude, yes. Perfect security is the ideal but good security is doable.

                      The problem with vulnerabilities is that they are difficult to find.

                      That's only true for well-tested software. There are a LOT of poorly tested software libraries that nobody can be bothered to actually check. In-house systems like what Norton Healthcare built often have gaping security holes.

                      Absolute security is impossible. Harsh laws won't change that.

                      What they will do is vastly improve security practices. Come on, this is a really basic concept.

                      In civilized countries, we go to great lengths to protect victims,

                      Again, negligence is a thing. When

                    • You have a much higher opinion of the value of certifications than I do, based on my experience going through certification processes. They basically prove nothing.

                      And you have no idea what kind of preventative measures Norton Healthcare took to ensure security. You are making assumptions based on facts not in evidence. Negligence is a crime that must be proven, not assumed.

    • by mjwx ( 966435 )

      Our rich faith history includes founding organizations and other faith communities: Episcopal Church, The United Methodist Church, United Church of Christ, Presbyterian Church, Roman Catholic Church, and software with no real security, system segmentation, isolation or unusual activity detection.

      Seriously, there needs to be HEAVY penalties for data breaches.

      As Louisville's third-biggest employer, they're probably going to get a bailout (donations). Too big to fail.

  • ...wanker enhancers.

  • by tiqui ( 1024021 ) on Tuesday December 12, 2023 @01:11AM (#64075403)

    send them all to jail for life. Auction off their estates for money to compensate their victims.

    We, as a society, need to learn to show ZERO tolerance for this sort of total mismanagement by the executive class. All over the nation we have CEOs and other execs dragging down massive paychecks way out of proportion to American historic norms, on a scale that would've made the old robber barons blush, and justifying it by claiming to be the experts whose decision-making skills generate the huge revenues that justify the payouts. Any time one of these jokers is interviewed and asked about the massive compensation, they point out that they have all the responsibility. Well, guess what: the execs are responsible for CHOOSING to run buggy off-the-shelf common (and commonly known to hackers) software, on common (and well-understood by hackers) hardware platforms, and then HOOKING THE DAMNED PILE OF CRAP TO THE INTERNET [facepalm]

    Nobody with a functional brain hooks anything valuable to the internet. Nobody.

    Certainly nobody with an ounce of wisdom puts other people's valuable data on commodity hardware, running MS Windows (ANY flavor), and either hooks it to the internet or allows removable media or wifi connections to it. Anybody who does such a thing deserves the maximum punishment available WHEN (not if) a hack occurs and the data is grabbed.

    Nothing other than draconian punishment for executives who allow this stuff on their watch will put an end to it. Executives need to be so terrified of losing everything, including their freedom, that they demand systems that are unconnected [no system MUST be on the net], unhackable [only code written by lazy jackasses is hackable], and staffed by employees who are not human-hackable [no more secretaries giving out passwords to people mimicking the voice of the boss, etc]. Any punishment that does not devastate the executives in charge will become just a financial risk that they calculate and balance against the costs of proper IT practices, and we already know how that works out: they ALWAYS choose to put their customer and employee data at risk rather than funding a proper unhackable solution. They gamble that the money saved will make the bottom line better and the board will up their personal compensation. The experiment has been run over and over again, and we have our results: CEOs tend to be self-absorbed, reckless morons.

    If you think I am being severe here, consider: every time one of these hacks succeeds, there are large numbers of people whose private data is grabbed... and it's usually people FORCED to hand over their info in order to get healthcare [this NEVER used to happen in America]. In the aftermath, as some stupid CEO heads back to the golf course or yacht and some spokesliar puts out a bit of pablum, SOME of the people whose data was grabbed become victims of ID theft that can take YEARS to clean up. This is a fundamental injustice.
