Researchers Find Slew of Flaws In SCADA Hardware, Software
Trailrunner7 writes "At the S4 security conference this week, 'Project Basecamp,' a volunteer-led security audit of leading programmable logic controllers (PLCs) performed by a team of top researchers, found that decrepit hardware, buggy software, and pitiful or nonexistent security features make thousands of PLCs vulnerable to trivial attacks by external hackers that could cause the devices to crash or run malicious code. 'We were looking for a Firesheep moment in PLC security,' Peterson told the audience of ICS security experts. They got one. 'It's a blood bath mostly,' said Wightman of Digital Bond. 'Many of these devices lack basic security features.' While the results varied from PLC to PLC, the researchers found significant security issues with every system they tested, and some PLCs were too brittle and insecure to even tolerate security scans and probing."
Completely Irresponsible (Score:3, Interesting)
This series of zero-day public disclosures is an abhorrent act that violates almost any professional or ethical code out there.
As these vulnerabilities impact devices which are known to be networked, as well as being in control of critical infrastructure, these "researchers" have at their disposal US-CERT/ICS-CERT, as well as direct contacts with the vendors in question.
They chose to turn this into a marketing stunt to sell tickets to their conference and to attempt to sell consulting services to the control systems industry. Luckily, I see this, and I will NEVER recommend that Rapid7, Dale Peterson, Digital Bond, Dillon Beresford, Jacob Kitchel, Tenable Network Security, or Ruben Santamarta be allowed near ANY critical systems.
These individuals have shown their true colors. The vendors WANT to play along, they WANT to increase security. Instead, these fools did a complete end-run around them and just dropped EXPLOIT CODE into the hands of everyone in the world. These "researchers" clearly do not care about the users of these systems, just the $$ they can milk out of the newly instilled fear.
Re:These things were too successful. (Score:5, Interesting)
On the other hand, even if they were separated from the internet, they were intended to have some sort of communication between a base (such as an office) and a remote station. I once purchased a set of 33.6k programmable modems (Telindus Aster 5) that were set up to dial in automatically to a specific location, with passwords etc. pre-programmed. How easy do you think it would've been for anyone else to dial in to those systems if they knew any of the details?
The fact of the matter is that until the last 5 years, none of these system makers thought about security at all, even though the techs in the field knew how their systems were going to be set up at the customer's site. I've worked with Siemens several times (both on their PLC side and their medical instruments side), and every single time I had to provide the additional security on my (or my client's) network, even though it was never planned for, requested by them, or even brought up in the negotiations when these things were purchased.
And to give you a reason why I think they should plan for it: Windows XP SP1 is what they not only run on their own systems but also set as a requirement for developers using their SDKs (together with some custom packages and Visual Studio 6).
Re:These things were too successful. (Score:5, Interesting)
Right, security is about people. I agree that not every single device needs to be hardened like a fort. This is actually a trap many security folks, especially geeks, fall into. It's wrong because it ends up frustrating everyone else, and they start not cooperating, circumventing security controls, and opening wider holes than existed in the first place.
The most important thing is that everyone understands what things are. If it's well known that, say, some industrial control app is thin on input validation, does simple cleartext, unauthenticated communications with other devices, and has a configuration interface whose password dialog is easily bypassed by rigging up a specific query string, all that might be okay. There might even be very solid reasons for it: it keeps the code base small and easily understood, and might offer performance advantages on limited hardware. Hardware hardened for industrial applications is often limited and expensive. You might ask why even bother with the login screen if it's so easy to defeat; well, it's a lock for honest people. It might help catch "oh, this is aisle 3A bay 15, I'm supposed to be making this change on 3B-15, oops" sorts of mistakes.
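To make the "password dialog bypassed by a specific query string" pattern concrete, here is a minimal sketch of that class of flaw. All names, URLs, and the `maint` parameter are invented for illustration; real devices each had their own specific weaknesses, and this is not a reproduction of any disclosed exploit.

```python
# Hypothetical sketch of a query-string auth bypass, of the kind
# described above. The paths and the "maint" flag are made up.
from urllib.parse import urlparse, parse_qs

def is_authorized(url: str, logged_in: bool) -> bool:
    """Flawed access check: the password dialog only gates the
    normal path, while a crafted query string skips it entirely."""
    params = parse_qs(urlparse(url).query)
    # BUG: a "maintenance" flag meant for factory testing
    # grants access with no credentials at all.
    if params.get("maint") == ["1"]:
        return True
    return logged_in  # otherwise require a real session

# A user without a session is blocked on the normal path...
print(is_authorized("http://plc.local/config", logged_in=False))          # False
# ...but anyone who knows the magic query string walks right in.
print(is_authorized("http://plc.local/config?maint=1", logged_in=False))  # True
```

As the comment argues, a check like this is a lock for honest people: it prevents accidental changes from the wrong bay, but it is a human-interface feature, not a security control.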
Clearly such devices are designed for use on closed networks, and as long as everyone understands that, there is no reason to make things harder for people. It's when USB sticks and laptops start getting plugged in, or someone connects the device to a larger network, that you see problems. Manufacturers need to be upfront about these things being designed for a specific ecosystem: "Yes, it's password protected, but that's a human-interface feature, not a security control..."