Security Government Medicine

FDA Calls On Medical Devicemakers To Focus On Cybersecurity

alphadogg writes "Medical device makers should take new steps to protect their products from malware and cyberattacks or face the possibility that the U.S. Food and Drug Administration won't approve their devices for use, the FDA said. The FDA issued new cybersecurity recommendations for medical devices on Thursday, following reports that some devices have been compromised. Recent vulnerabilities in Philips fetal monitors and in Oracle software used in body-fluid analysis machines are among the incidents that prompted the FDA to issue the recommendations."
  • Isn't the normal solution to this kind of problem for those affected (or their families, since they will probably be dead) to sue the manufacturer?

  • by Anonymous Coward

    Like we need more guvmint BS pushing up prices. Look at the cost of an insulin pump, greater than $5K for what is basically a re-purposed pager shell with a syringe pump.

    • by sjames ( 1099 )

      So it's about time they earned some of that $5K. It's not the government pushing the prices up, it's doctors who don't even bother to know the cost of the things they prescribe.

      Government regulation does facilitate the gouging by limiting the number of competing vendors, and it should do something to prevent that side effect, but we really don't want people dropping like flies when their pumps malfunction.

  • Run an internal network with no access to the internet. Limit the internet to certain devices or terminals.

    • by Anonymous Coward

      Which is fine - except a lot of manufacturers want their devices to have Internet access to automatically download patches and data, and to be remotely accessible for maintenance. And they need to communicate with the rest of the hospital network to pass information.

    • by Relic of the Future ( 118669 ) <dales&digitalfreaks,org> on Friday June 14, 2013 @06:31PM (#44012273)
      Since I helped write a system that pulled live data from medical devices (during surgery) to update patient records on the fly, and since, eventually, those records have to be sent to someone else (over the internet): No. You can't just run an internal network with no access to the internet.

      Build layers of security. Don't use hard-wired passwords. I.e., Stop being stupid about security. But no, you can't just air gap.

      • You can air gap, it just requires more work - having a human manually transfer the data using known clean media.
        • This adds a number of significant additional risks:
          It adds a delay.
          It adds the risk that the human will mix records, or will fail to do the job without reporting back.
          It generates confidential waste that needs to be managed.

          I work at a specialist hospital, which gets patients from a wide region, including neighbouring states. The normal way of transferring X-ray/MRI/CT records is by file transfer from one hospital's server to the other. However, for hospitals which are not common "feeders", which haven't

          • Then your procedure is broken. There are many industries which accomplish this without issue. If you hire an untrained person who can't handle verifying the data is there before moving from one location to another, OF COURSE you're going to have issues.
  • Simple standard? (Score:4, Interesting)

    by Okian Warrior ( 537106 ) on Friday June 14, 2013 @05:41PM (#44012013) Homepage Journal

    Network security is an add-on, largely viewed as an externality by corporations.

    I think it's largely because of this (and that's mostly due to Microsoft) that people don't use good security features.

    Suppose the socket layer had a function to generate a key pair, and a function call to set the key used for encoding and decoding. (Possibly a bit in the protocol to send a message using or not-using encryption). If it was that simple most products would use it, certainly safety-certified products would use it.

    (There's Transport Layer Security [wikipedia.org], but it's not really simple to use.)

    Since there is no simple universal way to use good security, everyone ends up having to implement their own version, which costs time and money.

    Simple secure communications should be an OS feature.
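
    A hedged sketch of roughly the kind of two-call interface the parent describes, using Python's standard ssl module (the host name and request are placeholders; certificate provisioning, arguably the genuinely hard part, is glossed over here):

        import socket
        import ssl

        # Sane defaults: certificate verification, modern protocol versions.
        context = ssl.create_default_context()

        with socket.create_connection(("device.example.hospital", 443)) as raw_sock:
            # wrap_socket is essentially the "turn on encryption for this socket" call.
            with context.wrap_socket(raw_sock, server_hostname="device.example.hospital") as tls_sock:
                tls_sock.sendall(b"GET /status HTTP/1.0\r\nHost: device.example.hospital\r\n\r\n")
                print(tls_sock.recv(4096))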

    • Re:Simple standard? (Score:4, Informative)

      by Darinbob ( 1142669 ) on Friday June 14, 2013 @06:46PM (#44012383)

      I suspect most of these devices have either minimal operating systems, home-grown operating systems, or no operating system at all. Even if security is in the network stack, it doesn't fix things. I.e., do you require your hospital to run IPsec everywhere for every device? Having top-of-the-line IPsec-enabled networking doesn't prevent hacking if injecting packets of the right type triggers bugs (i.e., the attack isn't breaking through security to read data, but crashing the machine or corrupting data).

      The other thing is that when these machines are hacked it is very often due to reverse engineering the machines. These don't run Windows or Linux, and there's no pre-built hacker kit available; the attackers have access to actual machines and have cracked them open, read the flash or monitored the bus to figure out what the software is doing or what style of OS it has, scanned through to find out if there's a recognizable file system type, etc. When you're up against sophisticated attacks like that, your built-in OS security isn't going to be much defense.

      I suspect most of these successful attacks are happening on machines that use Windows internally; i.e., an app on a turnkey system, or Windows bolted onto the side of a device to provide a front end. But Windows already has a built-in secure communication feature.

      • You're correct in that the programs should be tolerant of bad data, and much of the safety certification process addresses this issue. For example, as part of the certification process you need to show that buffer overflows cannot happen, that all cases of input data are covered (bad data is handled gracefully), and so on.

        I believe the original article was referring to data transfer and firmware upgrades. These would be conveniently handled over the internet, if only we could guarantee the security and inte
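
        To make "bad data is handled gracefully" concrete, here is a toy sketch (editorial illustration; the packet layout and limits are invented, not from any real device) in which every field of an incoming record is length- and range-checked before use, and anything unexpected is rejected rather than acted on:

            import struct

            PACKET_SIZE = 8                # 2-byte device ID, 2-byte heart rate, 4-byte timestamp
            VALID_HR = range(20, 300)      # reject physiologically impossible readings

            def parse_packet(data: bytes):
                if len(data) != PACKET_SIZE:   # wrong size: drop it rather than overrun anything
                    return None
                device_id, heart_rate, timestamp = struct.unpack(">HHI", data)
                if heart_rate not in VALID_HR:
                    return None
                return {"device_id": device_id, "heart_rate": heart_rate, "timestamp": timestamp}

            print(parse_packet(struct.pack(">HHI", 7, 72, 1371225600)))  # valid record
            print(parse_packet(b"\xff" * 64))                            # oversized garbage -> None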

        • The devices I worked on in the past had protection in firmware and such. The goal, however, was to protect against competitors and unauthorized resellers, not random hackers. I.e., trying to crack down on the second-hand market, where they try to clone the firmware and resell old machines as new, or to sell license features they haven't paid for. Firmware wasn't encrypted in this case but was definitely signed. Encrypting doesn't help much if the attacker has access to the bus.

        • Don't you mean asymmetric encryption? Symmetric encryption means the same key is used, not a public/private key pair. However, you are using it for integrity checking, not confidentiality; you don't mind who can read the firmware binary, you just want to make sure it is not modified in transit. Therefore it is a digital signature you want, not encryption.
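
          A minimal sketch of that signed-but-not-encrypted arrangement (editorial illustration using the third-party Python "cryptography" package; key handling is simplified, and nothing here reflects any particular vendor's scheme). The image stays readable, but the device only accepts it if the signature verifies against the vendor's public key:

            from cryptography.exceptions import InvalidSignature
            from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

            # Vendor side: generate a key pair and sign the firmware image.
            private_key = Ed25519PrivateKey.generate()
            public_key = private_key.public_key()        # this key ships baked into the device
            firmware = b"\x7fELF...firmware image bytes..."
            signature = private_key.sign(firmware)

            # Device side: verify authenticity and integrity before flashing.
            def firmware_is_authentic(image: bytes, sig: bytes) -> bool:
                try:
                    public_key.verify(sig, image)        # raises InvalidSignature on any tampering
                    return True
                except InvalidSignature:
                    return False

            print(firmware_is_authentic(firmware, signature))             # True
            print(firmware_is_authentic(firmware + b"\x00", signature))   # False: modified image rejected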

  • In Soviet Russia pace makes you?

  • by Anonymous Coward

    What if I want to overclock my pacemaker? Will it stop me from installing Linux?

  • Obamacare's Medical-Device Tax Kills Patients, Not Just Jobs [forbes.com]

    The 2.3 percent Obamacare excise tax on medical devices is a tax on sales, not profits.

  • If the FDA really wants secure devices, that means we will have patch cycles for medical devices. This is not a very desirable prospect: What happens if your pacemaker is down after a patch? Will you need a doctor to apply patches?
    • by cusco ( 717999 )
      Of course there's the alternative, manufacturers could spend the time and money to actually make a secure system, but that's just crazy talk. It might lower short term profits and damage some executive's chance at receiving another quarterly bonus, so we can't have that.
      • by Zumbs ( 1241138 )

        Many device manufacturers use Linux or Windows as an underlying OS, as both have a lot of support from development tools and are widely used. This means that a lot of work has been poured into the OS to make it a lot more secure than some home-brew system. On the other hand, it also means that all vulnerabilities in the OS also apply to the device. As long as the device was sealed, this was no problem, but once users want to get data from the device, or even get it to interact with various internet-based servic

      • The problem with implantable devices is that they are severely power constrained, as typically a battery life of less than 5 years is considered unacceptable, with 10 years wanted for something like a cardiac pacemaker.

        This leaves very little power for CPU/communications/encryption functions. Any kind of crypto hardware, or any kind of unnecessary complexity in the firmware (e.g. duplicated bounds checking, etc.), is likely to increase energy consumption and shorten battery life.

        This is becoming less of a pro

          I appreciate your concern about crypto, but note that secure communications are useless if the device has remotely exploitable vulnerabilities. Crypto increases code complexity and therefore the odds of having such a vulnerability. I am more comfortable administering network switches over a serial line rather than over SSH, but I am not sure we can find a similar approach for implanted medical devices.
  • "Seal the holes!" screamed the FDA. "Just not this one, and that one, and that one, which the FBI, CIA, and NSA use."

  • I did a bunch of work a number of years back where we had critical (financial services not medical) computers that we absolutely were not allowed to patch. The solution I implemented was to treat any computer that can't be patched as a mini-DMZ.

    The computer is firewalled from the rest of the network, put on a locked down VLAN and given only specific destinations, ports and so on as required in order to function. The concept of least privilege can and should be used for computers like this just as you would
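
    As a sketch of that least-privilege idea (editorial illustration; the addresses, ports, and labels are invented), the policy can be kept as an explicit allowlist and turned into default-deny egress rules, here simply printed as iptables commands:

        # Default-deny egress for an unpatchable box, opening only what it needs.
        ALLOWED = [
            ("10.0.5.20", 1433, "results database"),
            ("10.0.5.21", 443, "vendor update proxy"),
        ]

        rules = [
            "iptables -P OUTPUT DROP",              # default: nothing leaves the machine
            "iptables -A OUTPUT -o lo -j ACCEPT",   # keep loopback usable
        ]
        for addr, port, label in ALLOWED:
            rules.append(
                f"iptables -A OUTPUT -d {addr} -p tcp --dport {port} "
                f"-m comment --comment '{label}' -j ACCEPT"
            )

        print("\n".join(rules))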

  • I worked in the medical device field for a while. The level of paperwork and documentation required for validation activities is staggering, plus the medical field in general doesn't have as good a handle on fulfilling government requirements as, say, the aviation industry does. The path to take a device from concept to validated, sellable product is a long one. Adding cybersecurity (while a worthy endeavor) will only exacerbate the arduous and hair-tearing experience of developing a product.
  • Don't connect your medical devices to the Internet, and don't use computers that are so easily compromised by connecting to the Internet.

"...a most excellent barbarian ... Genghis Kahn!" -- _Bill And Ted's Excellent Adventure_

Working...