
Bugs In SCADA Software Leave 7,600 Factories Vulnerable

mspohr (589790) writes with this news from the BBC: "The discovery of bugs in software used to run oil rigs, refineries and power plants has prompted a global push to patch the widely used control system. The bugs were found by security researchers and, if exploited, could give attackers remote access to control systems for the installations. The U.S. Department of Homeland Security said an attacker with 'low skill' would be able to exploit the bugs. About 7,600 plants around the world are using the vulnerable software. 'We went from zero to total compromise,' said Juan Vazquez, a researcher at security firm Rapid7 who, with colleague Julian Diaz, found several holes in Yokogawa's Centum CS 3000 software, which was first released to run on Windows 98 and is used to monitor and control machinery in many large industrial installations. The researchers also explored other SCADA software: 'We ended up finding over 1,000 bugs in 100 days.'" The vulnerabilities reported are in Yokogawa's Centum CS 3000 industrial control software.
This discussion has been archived. No new comments can be posted.

  • by AmiMoJo ( 196126 ) * on Saturday April 05, 2014 @04:53AM (#46668185) Homepage Journal

    That looks like library code that the compiler generated, maybe some kind of strcpy variant. As I'm sure you are aware, strcpy does not check buffer sizes, but without knowing the context it is used in, it is hard to say how bad this problem is.
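
    Purely to illustrate the difference (this is obviously not the Yokogawa code, just the textbook pattern, with a made-up 16-byte buffer):

        /* Generic illustration of why an unchecked strcpy is dangerous.
           Not the actual Centum code, just the classic pattern. */
        #include <stdio.h>
        #include <string.h>

        static void unsafe_copy(const char *input)
        {
            char buf[16];
            strcpy(buf, input);   /* no length check: anything longer than 15
                                     chars plus the NUL smashes the stack */
            printf("copied: %s\n", buf);
        }

        static void safer_copy(const char *input)
        {
            char buf[16];
            strncpy(buf, input, sizeof(buf) - 1);  /* bounded copy: truncates */
            buf[sizeof(buf) - 1] = '\0';           /* instead of overflowing  */
            printf("copied: %s\n", buf);
        }

        int main(void)
        {
            safer_copy("short string");
            unsafe_copy("short too");  /* fine with short input; feed it a long,
                                          attacker-controlled string and it isn't */
            return 0;
        }

    If the copy really is sitting in compiler-generated library code, the fix still has to come from Yokogawa; you can't just patch one call site.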

    What you have to keep in mind is that this software was written for Windows 98. Windows 98 doesn't even have filesystem permissions or any real user segregation. There is no firewall by default. Chances are it would be running on some industrial equipment anyway, not connected to an external network. Yes, it uses UDP, but that is actually quite a common technique for processes on the same machine to communicate, or devices within a single piece of industrial equipment. We don't have enough context to know.
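
    For anyone who hears "UDP" and assumes "exposed to the internet", local-only datagram IPC looks roughly like this (POSIX sockets for brevity, the Winsock version is the same idea; the port number and message are invented):

        /* Sketch of loopback-only UDP IPC: one local process poking another.
           Nothing here ever leaves the machine. */
        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/socket.h>
        #include <unistd.h>

        int main(void)
        {
            int fd = socket(AF_INET, SOCK_DGRAM, 0);
            if (fd < 0) { perror("socket"); return 1; }

            struct sockaddr_in dst;
            memset(&dst, 0, sizeof(dst));
            dst.sin_family = AF_INET;
            dst.sin_port   = htons(34567);                  /* hypothetical local service */
            inet_pton(AF_INET, "127.0.0.1", &dst.sin_addr); /* loopback only */

            const char *msg = "STATUS?";
            if (sendto(fd, msg, strlen(msg), 0,
                       (struct sockaddr *)&dst, sizeof(dst)) < 0)
                perror("sendto");

            close(fd);
            return 0;
        }

    The trouble starts when the listening side binds to 0.0.0.0 and somebody plugs the box into a routable network.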

    I write software for embedded systems. I'm talking microcontrollers, not Windows based. The products I make are complex, but have very few computing resources because they have to run for 10 years on a couple of AA batteries; 16 KB of RAM is a luxury. I know that if someone decided to hack one, they could do so easily. Not through incompetence, but through a deliberate decision not to compromise other aspects for extra security. Yeah, we could set up encryption keys on the comms protocol, but then we would need to get the minimum-wage morons who deploy these things to understand how to install and use them. We have enough problems already. And no, we can't employ better people: our customers are the ones doing the deployment. They demand features like being able to send a completely unauthenticated text message from a standard phone to a device installed somewhere and have it execute the commands.
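
    For what it's worth, the kind of thing we keep deciding not to ship looks roughly like this: every command carries a tag computed with a per-device key, and the device drops anything that doesn't verify. (OpenSSL here just to keep the sketch short; on a 16 KB micro you'd use a small standalone HMAC-SHA256 routine instead. The key, the command format and the tag length are all invented for the example.)

        /* Sketch of authenticated commands: sender and device share a key,
           and the device only acts on commands whose HMAC tag checks out. */
        #include <openssl/crypto.h>
        #include <openssl/evp.h>
        #include <openssl/hmac.h>
        #include <stdio.h>
        #include <string.h>

        #define TAG_LEN 32  /* full HMAC-SHA256 tag */

        static const unsigned char shared_key[] = "per-device-secret-set-at-install";

        /* Device side: accept a command only if its tag matches. */
        static int command_is_authentic(const unsigned char *cmd, size_t cmd_len,
                                        const unsigned char *tag)
        {
            unsigned char expected[TAG_LEN];
            unsigned int len = 0;
            HMAC(EVP_sha256(), shared_key, sizeof(shared_key) - 1,
                 cmd, cmd_len, expected, &len);
            return len == TAG_LEN && CRYPTO_memcmp(expected, tag, TAG_LEN) == 0;
        }

        int main(void)
        {
            const unsigned char cmd[] = "VALVE7:OPEN";
            unsigned char tag[TAG_LEN];
            unsigned int len = 0;

            /* Sender side: compute the tag with the same shared key. */
            HMAC(EVP_sha256(), shared_key, sizeof(shared_key) - 1,
                 cmd, sizeof(cmd) - 1, tag, &len);
            printf("accepted: %d\n",
                   command_is_authentic(cmd, sizeof(cmd) - 1, tag));

            tag[0] ^= 0xff;  /* tamper with the tag: command gets rejected */
            printf("accepted after tampering: %d\n",
                   command_is_authentic(cmd, sizeof(cmd) - 1, tag));
            return 0;
        }

    The code is the easy part. Getting a per-device key provisioned correctly by the people doing the installs is the part that kills it.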

    So I imagine in this case it is a closed system, never designed to be connected to a network where it could be attacked with malformed UDP packets. The system has been tested and found to be stable because it never generates such packets itself. The real morons are the ones who tried to network it, presumably against Yokogawa's recommendations.

    This is commercial reality. Just as your car has a massive vulnerability, in that the passenger can reach over, yank the wheel and cause an accident, the manufacturer here probably figured that people who ignore its recommendations are not its problem.

  • by Hamsterdan ( 815291 ) on Saturday April 05, 2014 @08:06AM (#46668671)

    "not ones that are properly installed by competent engineers."

    Depends on how management (bean counters, PHBs and MBAs) listens to said engineers. You'd be surprised what stupid (and not even cost-saving in the long run) decisions companies will make to save a dime now. Until not so long ago, the biggest telco in Canada deployed its wireless routers with only WEP and *NO* admin password on the device, even though WEP was broken about 10 years ago.

    It's not like they don't have competent tech people, but having worked there, I can say that yes, those are exactly the kinds of stupid decisions management will make.
