AI Company Leaks Over 2.5 Million Medical Records

Secure Thoughts reports that artificial intelligence company Cense AI, which specializes in "SaaS-based intelligent process automation management solutions," has leaked nearly 2.6 million medical records on the internet. PCMag reports: [O]n July 7 security researcher Jeremiah Fowler discovered two folders of medical records available for anyone to access on the internet. The data was labeled as "staging data." Fowler believes the data was made public because Cense AI was temporarily hosting it online before loading it into the company's management system or an AI bot.

The medical records are quite detailed and include names, insurance records, medical diagnosis notes, and payment records. It looks as though the data was sourced from insurance companies and relates to car accident claims and referrals for neck and spine injuries. The majority of the personal information is thought to be for individuals located in New York, with a total of 2,594,261 records exposed. Fowler sent a responsible disclosure notice to Cense AI, and public access to the folders was restricted soon after. However, the damage has potentially already been done if others had previously discovered the data. Fowler points out that medical data is the most valuable on the black market, fetching as much as $250 per record. If someone willing to act maliciously came across this data, you can all but guarantee it is being, or already has been, sold.
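
For a sense of scale, here is a quick back-of-the-envelope sketch of the theoretical black-market value implied by the article's own figures (the record count and per-record price above). This is illustrative arithmetic only, not a claim about actual sales:

    # Rough ceiling on black-market value implied by the figures above.
    RECORDS_EXPOSED = 2_594_261   # records reported exposed
    PRICE_PER_RECORD = 250        # high-end per-record price cited (USD)

    total = RECORDS_EXPOSED * PRICE_PER_RECORD
    print(f"Theoretical market value: ~${total:,}")  # ~$648,565,250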
  • by goombah99 ( 560566 ) on Tuesday August 18, 2020 @05:12PM (#60416341)

    But I told you so.
    Collective thought of everyone who understands computers.

    • by ffkom ( 3519199 )
      Yep, where there is a trough, the pigs gather - and such a trove of medical records attracts a lot of pigs.
    • by AmiMoJo ( 196126 )

      How long until someone starts gathering this data and selling it? Kinda like a Facebook API, but with stolen data: enter someone's details and it gives you everything that has been leaked or sold about them.

  • Well that wasn't very intelligent.
  • The only solution is to fine them enough that they can't continue to operate.

    I understand we can't toss any of the C-suite in jail, which would be a much better deterrent.

    We live with the solutions we can implement, not the solutions we want.
    • They are based in New York. If they were in California, the CCPA fine could be $2,500 per record, plus $750 in damages to each person whose record was leaked. For any but the largest companies, the fines for leaking 2 million records are enough to put them out of business (a rough sketch of that arithmetic follows this thread).
      • the data was sourced from insurance companies

        Never mind the leaking. How is this even legal?

        • My understanding is that under HIPAA, insurance companies can give data to other companies (who then fall under HIPAA as well) to do processing and whatever. So presumably this company convinced the insurance companies that they were going to follow HIPAA. Looks like they did a good job.
    • They very well could be fucked. This is a massive HIPAA violation, and the fines can be quite heavy. That's not to mention how they just opened themselves up to lawsuits from their clients. The problem is we need HIPAA-style punishments for all data breaches, not just medical ones.
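
To make the CCPA exposure mentioned above concrete, here is a minimal back-of-the-envelope sketch. The per-record figures ($2,500 civil penalty, $750 statutory damages) are the maximums the comment cites, and the record count is the article's 2,594,261; treating every record as a maximum-penalty violation is a worst-case assumption, not legal analysis:

    # Worst-case CCPA exposure using the figures cited above.
    # Assumes every record draws the maximum civil penalty and every
    # affected consumer claims the full statutory damages.
    RECORDS_EXPOSED = 2_594_261
    FINE_PER_RECORD = 2_500        # max CCPA civil penalty per record (USD)
    DAMAGES_PER_CONSUMER = 750     # max CCPA statutory damages (USD)

    fines = RECORDS_EXPOSED * FINE_PER_RECORD         # $6,485,652,500
    damages = RECORDS_EXPOSED * DAMAGES_PER_CONSUMER  # $1,945,695,750
    print(f"Total exposure: ${fines + damages:,}")    # $8,431,348,250

Even at a small fraction of those statutory maximums, the total dwarfs what a small SaaS company could survive, which is the commenter's point.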

  • by bobstreo ( 1320787 ) on Tuesday August 18, 2020 @05:27PM (#60416411)

    'Cause 2.5 million of 'em could start to add up.

    • "Staging data" seems to imply not what the summary suggests, but that they run a poorly secured staging environment on real data pulled straight from production (a sketch of one common mitigation follows this thread).
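
If that reading is right, the underlying failure is loading raw production records into a less-protected staging environment. A common mitigation is to mask or pseudonymize identifying fields before the data ever leaves production. A minimal sketch, assuming records are plain dicts; the field names here are hypothetical, not Cense AI's actual schema:

    import hashlib

    # Hypothetical PII field names; a real schema would differ.
    PII_FIELDS = {"name", "insurance_id", "diagnosis_notes", "payment_info"}

    def mask_record(record: dict, salt: str) -> dict:
        """Replace PII fields with salted one-way hashes so staging data
        stays joinable (same input -> same token) without being directly
        re-identifiable by anyone who stumbles on the staging copy."""
        masked = {}
        for key, value in record.items():
            if key in PII_FIELDS:
                digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
                masked[key] = "masked:" + digest[:16]
            else:
                masked[key] = value
        return masked

    # Example: what a staging copy of one record might look like.
    prod_record = {"name": "Jane Doe", "state": "NY", "insurance_id": "ABC123"}
    print(mask_record(prod_record, salt="keep-this-out-of-staging"))

Salted hashing is only pseudonymization; for truly sensitive fields, staging environments are usually better served by synthetic or fully redacted data.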

  • Intriguing (Score:5, Funny)

    by OneHundredAndTen ( 1523865 ) on Tuesday August 18, 2020 @06:35PM (#60416629)
    If their AI is making such mistakes then maybe they are close to achieving human-level AI.
  • Maybe if it was really AI it would have noticed glaring security issues before it got to work.
  • by Randseed ( 132501 ) on Tuesday August 18, 2020 @10:19PM (#60417177)
    I worked a few years ago at a major regional hospital that had the worst computer security imaginable. There was a common Windows login. There was a wide open Windows share drive that housed most of the hospital data, including employee lists with birthdays, phone numbers, and social security numbers, call schedules, recipes from the kitchen, poorly written letters, patient lists, checkouts from doctor shifts, and worse. Their security cameras were accessible and used the default password so if you wanted to play voyeur you could watch facilities all over the city from anywhere.

    The killer, though, was when I found a share drive which stored data for the emergency center EHR. They had over 700,000 records sitting there wide open if you knew where to look.

    I brought these concerns up to the CIO. A week later they terminated my contract over some made-up bullshit. I should have turned them in, but I guarantee they would have claimed I "hacked" their system and either sued me or had me prosecuted, or at least trashed my career. Of course, they eventually fixed it, so I let it slide.

    tl;dr Health care organizations don't give one shit about your privacy, and they care about your outcomes only to the extent that it affects their corporate bottom line. And since the hospital was government (county) run and the only one for 100 miles around, they wouldn't have done shit.

    • by Anonymous Coward

      I believe you. I had friends working in hospital IT, and if you discovered a security problem with Epic software, you were not only instafired, but Epic would never allow you to work with their software again, even at another hospital.

  • Why is medical data the most valuable? What can you do with a medical record that makes it worth $250?
  • For all we know, the "AI" could have been doing advertising research.
