AI Company Leaks Over 2.5 Million Medical Records
Secure Thoughts reports that artificial intelligence company Cense AI, which specializes in "SaaS-based intelligent process automation management solutions," has leaked nearly 2.6 million medical records on the internet. PCMag reports: [O]n July 7 security researcher Jeremiah Fowler discovered two folders of medical records available for anyone to access on the internet. The data was labeled as "staging data." Fowler believes the data was made public because Cense AI was temporarily hosting it online before loading it into the company's management system or an AI bot.
The medical records are quite detailed and include names, insurance records, medical diagnosis notes, and payment records. It looks as though the data was sourced from insurance companies and relates to car accident claims and referrals for neck and spine injuries. The majority of the personal information is thought to be for individuals located in New York, with a total of 2,594,261 records exposed. Fowler sent a responsible disclosure notice to Cense AI, and public access to the folders was restricted soon after. However, the damage may already have been done if others discovered the data while it was available. Fowler points out that medical data is the most valuable on the black market, fetching as much as $250 per record. If someone willing to act maliciously came across this data, you can all but guarantee it has been, or will be, sold.
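Taking the article's figures at face value, a quick back-of-the-envelope sketch of what the exposed data set could be worth at the cited upper-end price:

```python
# Illustrative only: multiplies the record count and per-record price
# quoted in the article. Real black-market prices vary widely.
RECORDS_EXPOSED = 2_594_261
PRICE_PER_RECORD = 250  # upper-end figure cited by Fowler, in USD

total_value = RECORDS_EXPOSED * PRICE_PER_RECORD
print(f"Potential black-market value: ${total_value:,}")  # $648,565,250
```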
I don't mean to say I told you so (Score:3)
But I told you so.
Collective thought of everyone who understands computers.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
How long until someone starts gathering this data and selling it? Kinda like a Facebook API but with stolen data, enter someone's details and it gives you everything that has been leaked or sold about them.
AI? (Score:2)
Re:AI? (Score:4, Funny)
It used a Deep Fuckup architecture
The only way to stop this (Score:2)
I understand we can't toss any of the C-suite in jail, which would be a much better deterrent
We live with the solutions we can implement, not the solutions we want
Re: The only way to stop this (Score:3)
Re: (Score:2)
the data was sourced from insurance companies
Never mind the leaking. How is this even legal?
Re: (Score:2)
Re: (Score:2)
They very well could be fucked. This is a massive HIPAA violation, and fines can be quite heavy. That's not to mention how they just opened themselves up to lawsuits from their clients. The problem is we need HIPAA-style punishments for all data breaches, not just medical ones.
What's the current charge for a HIPAA violation? (Score:5, Interesting)
'Cause 2.5 million of 'em could start to add up.
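To make the "could start to add up" point concrete, here is a rough sketch using the original HITECH Act tier-1 civil penalty figures ($100 to $50,000 per violation, with a $1.5M annual cap per identical provision). These numbers are assumptions for illustration; actual penalties depend on the culpability tier, are inflation-adjusted, and treating each record as a separate violation is a simplification.

```python
# Hypothetical HIPAA penalty math, treating each exposed record as one
# violation. Tier figures are the original HITECH amounts (assumption).
RECORDS = 2_594_261
MIN_PER_VIOLATION = 100      # tier-1 floor per violation
MAX_PER_VIOLATION = 50_000   # statutory max per violation
ANNUAL_CAP = 1_500_000       # cap per identical provision, per year

uncapped_min = RECORDS * MIN_PER_VIOLATION
uncapped_max = RECORDS * MAX_PER_VIOLATION
print(f"Uncapped range: ${uncapped_min:,} - ${uncapped_max:,}")
print(f"But capped at roughly ${ANNUAL_CAP:,} per provision per year")
```

Even at the $100 floor, the uncapped figure dwarfs the annual cap, which is exactly why the cap, not the per-record count, usually governs the final fine.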
Re: (Score:3)
"Staging data" seems to imply not what the summary suggests, but that they run a poorly secured staging environment on actual data pulled from production.
Intriguing (Score:5, Funny)
That's why it's not real AI (Score:2)
Nothing new (Score:4)
The killer, though, was when I found a share drive which stored data for the emergency center EHR. They had over 700,000 records sitting there wide open if you knew where to look.
I brought these concerns up to the CIO. A week later they terminated my contract for some made-up bullshit. I should have turned them in, but I guarantee they would have claimed I "hacked" their system and either sued me or had me prosecuted, or at least trashed my career. Of course they eventually fixed it, so I let it slide.
tl;dr Health care organizations don't give one shit about your privacy, and they care about your outcomes only to the extent that it affects their corporate bottom line. And since the hospital was government (county) run and the only one for 100 miles around, they wouldn't have done shit.
Re: (Score:1)
I believe you. I had friends working in hospital IT, and if you discovered a security problem with Epic software, you were not only instafired, but Epic would never let you work with their software again, even at another hospital.
honest question (Score:1)
We Have A Right to Privacy (Score:2)