
Siemens SCADA Hacking Talk Pulled From TakeDownCon

alphadogg writes "A planned presentation on security vulnerabilities in Siemens industrial control systems was pulled Wednesday over worries that the information in the talk was too dangerous to be released. Independent security researcher Brian Meixell and Dillon Beresford, with NSS Labs, had been planning to talk Wednesday at a Dallas security conference about problems in Siemens PLC systems, the industrial computers widely used to open and shut valves on factory floors and in power plants, control centrifuges, and even operate systems on warships. But the researchers decided to pull the talk at the last minute after Siemens and the US Department of Homeland Security pointed out the possible scope of the problem."
  • by Anonymous Coward on Thursday May 19, 2011 @03:00PM (#36183782)

    Perfect example of security through obscurity. Yeah, everyday script kiddies won't be messing around in the systems, but those dedicated to doing damage or spying have the time and means to get to know the systems. And it's even easier for them because the systems aren't properly secured.

    • As the Iranians found out the hard way, it's difficult to keep an intruder out despite the obscure nature of PLC (most people probably don't even know what that is.)

      • As the Iranians found out the hard way, it's difficult to keep an intruder out despite the obscure nature of PLC (most people probably don't even know what that is.)

        Programmable Logic Controllers.

        I prefer Allen-Bradley PLCs myself.

        • They still making PLCs? I thought they ran out of prefamulated amulite years ago.

          • For those who missed this one, look up the Turbo Encabulator on YouTube. There are too many versions to link to any specific one. A very long-running and hilarious joke.

        • by Svartalf ( 2997 )

          Yeah, they're a bit cleaner. The big problem is that it's not just a Siemens problem. It's endemic throughout the industry in varying ways.

          Networks that are claimed to be air-gapped, but aren't, because of "ease of use" concerns.
          Networks that, because of that risk, shouldn't have a single Windows box on them, but do.
          And so on and so forth.

      • The Iranians didn't find out about the obscure nature of PLCs; they found out it isn't a good idea to buy your infrastructure from foreign countries... See, in the U.S. we are careful to only use... oh, never mind.
        • The high-ticket projects all attract multinational corporations. Those corporations aren't shy about buying smaller-scale operations with technology they want. Even if you do use technology developed only in your own country, is it not sold elsewhere? Are there vulnerable systems anywhere within the local technology entity? Even if they've got 10 vulnerabilities instead of 100,000+, they're still vulnerable.

          I've yet to see one region that didn't have any kind of electronic or software vulnerability whatsoever.

      • by sjames ( 1099 )

        Part of the problem is that they're not actually THAT obscure. The documentation exists and can be had if you're a paying customer.

    • Perhaps the intent is insecurity through obscurity. Can't sabotage your enemy's systems if you tell them where all the holes are.

    • Which is exactly why I really hope these researchers will present their findings to Siemens engineers so that the problems can be patched, and then give a talk about it. The stakes are pretty high with these systems, so hopefully a real fix will augment security via obscurity in this case.
      • by Hatta ( 162192 )

        Why would Siemens bother fixing holes nobody knows about?

        • If they don't, their competitors will.
        • by poity ( 465672 )

          Now that people know the holes exist, the race is on. They can't afford not to.

        • Because if these researchers, acting in a more or less intellectual manner, found them, it is safe to assume that individuals without such a noble goal in mind will find and possibly exploit them. Releasing the information to Siemens first would hopefully prolong the search for the "bad guys" by getting rid of some potential vulnerabilities.
        • The whole industry is riddled with massive holes because we're all tied to legacy OPC, which relies on that massive dog's breakfast called DCOM. The slow adoption of OPC UA and even OPC WCF keeps the whole industry in a situation where it is easier to disable all security than to deal with DCOM, which makes the Siemens issue too easy to exploit. Every single bloody version of Windows has a different way of being configured, so no one bothers to do it right...

          Siemens needs to fix their issues. So does everyone else.

      • by chemicaldave ( 1776600 ) on Thursday May 19, 2011 @03:26PM (#36184090)
        Did you RTFA? That's exactly why they decided not to give the talk: Siemens hasn't fixed the problems. As NSS CEO Rick Moy points out:

        "The vendor had proposed a fix that turned out not to work, and we felt it would be potentially very negative to the public if information was put out without mitigation being available." ... In the past, technology companies have threatened legal action against researchers, but Moy said that in this case the lawyers were not involved. "It's a temporary hold on the information; it's not that it's being buried," he said. "We just don't want to release it without mitigation being out there for the owners and operators of the SCADA equipment."

        • That's a surprisingly refreshing course of action, isn't it? To me, that's how things should work. As long as Siemens follows through and the talk is eventually allowed to proceed, I'd be happy.
        • by Svartalf ( 2997 )

          Heh... If they think that those patches will get deployed in a timeframe measured in anything other than months or years, they're kidding themselves...

          SCADA systems typically don't get patched, and when they do get patched or upgraded, it's a "big thing".

          • Mod parent up. Usually when you patch or upgrade your SCADA system, everything breaks, causing a massive headache. So most people would really rather not.

      • Yes, we can always hope [nytimes.com] that flaws in critical systems will be treated responsibly. Kinda off topic, I know.
    • by LunaticTippy ( 872397 ) on Thursday May 19, 2011 @03:24PM (#36184064)
      At my workplace, all our PLCs are on a process control network. It is isolated from the business network and internet completely. We assume that the PLCs are not secure, and they are business critical. We can't take any chance that a malware outbreak or a hacker causes actual physical effects.

      It makes doing work more difficult, and there are still some attack vectors.
      • by Anonymous Coward

        Fuck you.

        --skynet

      • This was perfectly viable 10-15 years ago. Nowadays, the requirements for data archiving, process data historians, plant-floor management, etc. make it almost impossible to have a truly, completely isolated process network. You always end up having a dual-homed computer or firewall somewhere on that network, and therefore a potential hole.
        • by Svartalf ( 2997 )

          Depends on the design. Properly designed setups will have an air-gap and only transfer data via sneakernet, in the form of a hard disk or similar, going from the SCADA side to the corporate systems. Real-time is desirable, but for some networks having the hole is too much of a risk, especially if you've got a Windows-based HMI system or similar in the mix. Seriously.

          • by imsabbel ( 611519 ) on Thursday May 19, 2011 @05:39PM (#36185666)

              And Stuxnet was transmitted via USB sticks doing the sneakernet stuff...

          • ...Properly designed setups will have an air-gap...

            Very, very few industrial sites have the "air-gap" any more. I suppose all the rest are improperly designed?

            Real-time is desirable, but for some networks having the hole is too much of a risk...

            For whom is it too much of a risk? Power stations? Mines? Water Treatment? None of the sites I work with have an air-gap any more.

            especially if you've got a Windows-based HMI system or similar in the mix. Seriously.

            I'd say that the vast majority of SCADA/HMI systems run on Windows. In critical infrastructure. Without an air gap.

            I sure as hell hope there are other ways of securing a network

          • by ColaMan ( 37550 )

            I work on a PLC system that has a single Ethernet TX pair to the rest of our network. It transmits stats blindly (with the help of a static entry in its ARP table) to a PC on the outside, where a small program listens and collates data. I've heard of similar things with serial, fiber and radio modems, etc.
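
            A minimal sketch of what the listening/collating side of such a blind, transmit-only link might look like, in Python. This is not ColaMan's actual program: the UDP transport, the port number, and the newline-delimited "tag=value" framing are all illustrative assumptions.

            import socket

            LISTEN_ADDR = ("0.0.0.0", 9500)  # hypothetical port; the real one depends on the PLC's configuration

            def collate(datagram, latest):
                # Parse newline-delimited "tag=value" pairs and keep the latest value per tag.
                for line in datagram.decode("ascii", errors="replace").splitlines():
                    tag, sep, value = line.partition("=")
                    if sep:
                        latest[tag.strip()] = value.strip()

            def main():
                latest = {}
                sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                sock.bind(LISTEN_ADDR)
                while True:
                    datagram, _addr = sock.recvfrom(2048)  # receive only; nothing is ever sent back
                    collate(datagram, latest)
                    print(latest)

            if __name__ == "__main__":
                main()

            The design point is that the collector only ever receives; combined with the static ARP entry on the PLC side, no traffic ever needs to flow back toward the control network.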

      • by Svartalf ( 2997 )

        Do you audit it often to make sure it's still air-gapped like you think it is? Many of the audits at power utilities where they had the same thinking found prosumer routers or switches tying the networks together, put in place in a pinch for some ease-of-deployment or ease-of-use reason and then forgotten.

        • Certainly audits are a good thing, but we mustn't forget that we're talking about something that gets in and hides itself well, even deleting itself from some hardware along the way. An audit of hardware still only gives a snapshot in time. That laptop that was briefly plugged in, or machine that briefly had a USB key plugged in, may be long gone. Intrusion detection can help, but with things like traffic to a PLC using the normal ports, it may take deep inspection of every packet to see what's going on,

      • As it should be. But isolation does not require a complete elimination of remote monitoring. Our process control network has a server on it which, via a hardware firewall, pumps data one way to another machine outside that emulates the view of the process network. This basically gives us complete remote monitoring without the ability to send data back to the network.

        It makes it easy, and there are few, if any, attack vectors; when malware spreads around the business network (which happens frequently), it so far has never made it across.
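
        A rough sketch of the send-only "data pump" idea, in Python. The mirror host address, the JSON payload, and the read_process_values() helper are placeholders invented for illustration, not details of the poster's setup; the real one-way guarantee comes from the hardware firewall, not from this code.

        import json
        import socket
        import time

        MIRROR = ("192.0.2.10", 9500)  # hypothetical address of the outside "mirror" machine

        def read_process_values():
            # Placeholder for whatever actually reads tags on the process network.
            return {"widget_count": 42, "sprocket_temp_c": 71.5}

        def main():
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            while True:
                payload = json.dumps(read_process_values()).encode("ascii")
                sock.sendto(payload, MIRROR)  # fire and forget; this process never reads from the socket
                time.sleep(5)

        if __name__ == "__main__":
            main()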

        • That is a good idea. I don't see why it has to be expensive, though.
          • True, not necessarily expensive in absolute terms. Just more expensive than the alternative. You'd be amazed at what some people come up with when "value engineering".

      • by 0xG ( 712423 )

        At my workplace, all our PLCs are on a process control network. It is isolated from the business network and internet completely.

        You are utterly kidding yourself if you think that your PLC network is "isolated". Does anyone ever request data from it? How do you transfer the data... with a USB key maybe? How are the controllers programmed? With a workstation that is plugged into which network... and never the internet? I would strongly suggest that you read up a bit on Stuxnet. The details may blow your mind...

        • At my workplace, all our PLCs are on a process control network. It is isolated from the business network and internet completely.

          You are utterly kidding yourself if you think that your PLC network is "isolated".

          It's not difficult.

          1. Decide what parameters are going to be reported from the secure system to the outside world, and how frequently. Say "widget-count" and "sprocket-temperature", three sprocket temperatures per widget count. (A rough sketch of such a fixed report follows these steps.)

          2. Code your PLC network to count widgets, measure sprocket temperature
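
          A minimal sketch of such a fixed, two-parameter report, in Python. The struct layout, the destination address, the one-second cadence, and the sample() helper are illustrative assumptions; the point is just that only the agreed fields cross the boundary, and only in one direction.

          import socket
          import struct
          import time

          RECORD = struct.Struct("!I3f")   # one widget count plus three sprocket temperatures
          DEST = ("192.0.2.10", 9501)      # hypothetical collector outside the secure network

          def sample():
              # Placeholder for reads from the PLC network.
              return 1234, 70.1, 70.4, 69.8

          def main():
              sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
              while True:
                  widgets, t1, t2, t3 = sample()
                  sock.sendto(RECORD.pack(widgets, t1, t2, t3), DEST)  # send only the agreed fields, one way
                  time.sleep(1.0)

          if __name__ == "__main__":
              main()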

    • by betterunixthanunix ( 980855 ) on Thursday May 19, 2011 @03:29PM (#36184146)
      There is a notion in security engineering of responsible disclosure, which is letting a company know about a vulnerability long enough before you present it to allow the company to fix it and deploy the fix. I believe that what happened here was that the company complained that they did not have enough time to fix the problem and deploy the fix, and that DHS and the researchers agreed with that conclusion. I do not think this is terribly far-fetched, and I doubt that there is a conspiracy to leave vulnerabilities in industrial equipment used here in America, not when the Iranians want to get back at the US and Israel for Stuxnet.
    • Perfect example of security through obscurity. Yeah, everyday script kiddies won't be messing around in the systems, but those dedicated to doing damage or spying have the time and means to get to know the systems. And it's even easier for them because the systems aren't properly secured.

      I'll be at work for a few more hours. In my living room at home there is a suitcase with a lot of cash in it. I didn't lock my front door; I didn't even close it. I won't tell you where I live. Security through obscurity.

  • Secrecy (Score:1, Insightful)

    by grcumb ( 781340 )

    The argument that some knowledge is too dangerous to know is specious and flawed. But I can't tell you how or why for fear of undermining our existing regime of ignorance and ineptitude.

    • Re:Secrecy (Score:5, Insightful)

      by chemicaldave ( 1776600 ) on Thursday May 19, 2011 @03:28PM (#36184130)
      Did you RTFA? They're waiting for Siemens to fix the issues first, a common practice in security research. Siemens and DHS didn't force them to pull the talk and didn't even get lawyers involved. So please stop with your accusations. You clearly lack an understanding of the situation at hand.
      • by Anonymous Coward

        How many times in this topic are you going to ask people if they RTFA?

        This is /., we already know they didn't.

    • What is being argued is that Siemens did not have enough time to patch this vulnerability and deploy that patch in major installations of these systems. I do not doubt it; the real question is whether or not they are busy deploying a fix, and I would not doubt that they are. Stuxnet is out there being studied by people who would use it to attack US factories, if they could, and I would bet that the US government is putting pressure on Siemens to fix the problem. If within a year, the talk is still being
    • by Jack9 ( 11421 )

      > The argument that some knowledge is too dangerous to know is specious and flawed.

      That's not the reasoning given. The knowledge IS known. Some knowledge is dangerous to disseminate. This is a sad fact of humanity, but a fact. Given opportunity and knowledge of vulnerability, you will get attempts to use and abuse knowledge with similar results. People are eager to exercise their imagination and reluctant to exercise restraint or critical thought. I can understand their position.

      • by grcumb ( 781340 )

        > The argument that some knowledge is too dangerous to know is specious and flawed.

        That's not the reasoning given. The knowledge IS known. Some knowledge is dangerous to disseminate. This is a sad fact of humanity, but a fact. Given opportunity and knowledge of vulnerability, you will get attempts to use and abuse knowledge with similar results. People are eager to exercise their imagination and reluctant to exercise restraint or critical thought. I can understand their position.

        Thank you for replying instead of simply down-modding an argument you don't agree with. Others seem to prefer retaliation to debate.

        Let's look at this from another perspective. Everyone knows there are problems with Siemens' PLCs. That's been known since Stuxnet got reverse-engineered. While there's no problem whatsoever with sharing the information about specific vulnerabilities with Siemens - indeed, making sure they're among the first to know - what additional danger would be presented by sharing that information more widely?

  • by Attila Dimedici ( 1036002 ) on Thursday May 19, 2011 @03:08PM (#36183874)
    In other words, if your systems rely on PLC systems from Siemens, you had better hope that no attacker can get through your firewall.
    • Re:In other words (Score:5, Interesting)

      by Charliemopps ( 1157495 ) on Thursday May 19, 2011 @03:36PM (#36184228)
      I used to work in provisioning at a telco, and it entirely depends on who's managing the plant. We'd install circuits in some power plants that were so strict that they insisted on fiber use only. We'd run copper to an access point outside their security perimeter, then have a mux convert it to fiber to run across the perimeter into the facility, where it would terminate in an outer building. Their security plan did not allow ANY outside network connections to the plant itself. They had networked equipment, but it was all housed in an outer building with no connection to the main plant or control systems. They refused to allow copper on the premises because it's relatively easy to splice into and carry elsewhere. Fiber would be much more difficult to splice and bring in.

      Other facilities were less secure. I remember getting a panicked call from someone shouting "The Damns gonna bust!!!" They had a single "Circuit" they paid about $20 a month for that was nothing more than a single copper pair that ran from some building to the local damn. They'd apply +5 volts to the line to open the damn, and -5 volts to close it. They'd reacted too slowly to rising waters, and it had flooded the copper pair they used to control the damn. They wanted us to send a phone tech into their overflowing damn to repair the circuit so they could open it from the safety of their administrative building. They had a hard time understanding my near-hysterical laughter.
      • We usually use dams to hold back water, not damns. Sure, sometimes the damn dam breaks, but that's no reason to damn it from the beginning.

      • by Svartalf ( 2997 )

        Fiber would be much more difficult to splice and bring in.

        Heh... All it takes is a bit more effort, but it'd be a bit more obvious to pop a passive tap in a fiber run, since they're not small. Sadly, it's not sound thinking all the same. The attackers are as likely to attack the end-nodes of the system, where the security is much, much weaker and there's copper to be compromised before it gets to the fiber loops. You can do as much or more damage by dinking with a substation's setup as with the generation

        • You guys are way overthinking this. There is no connection to the outside world by the control equipment. The fiber that came in terminated in buildings outside what would be considered the power plant. I'm not sure what they used it for... likely they could measure data there or something. What the fiber was supposed to prevent was local staff getting bored and running their own bootleg connection into the building so they could watch porn on their critical workstations inside. Anyone on slashdot could pi
    • by Anonymous Coward

      It's not fair to pick on Siemens, there isn't a secure PLC out there.

      • It's not fair to pick on Siemens, there isn't a secure PLC out there.

        That's correct. PLCs do exactly what they're told... no matter who is telling them.

    • No, in other words, you'd better hope you have good network design.

      At our workplace an attacker would need to get through a firewall, ... another firewall, ... and another firewall as they work their way through the business network, then the information network, down to the process control network. That last firewall is a doozy too: one-way communication between two computers only.

  • But the researchers decided to pull the talk at the last minute after Siemens and the US Department of Homeland Security pointed out the possible scope of the problem.

    Don't you mean the DHS told them not to do it, or they would get a thorough anal probing at the airport security check on their way out of town? I'm pretty sure they understood the "scope of the problem" before they started doing the research (which was also probably the motivation for the research).

    • Re: (Score:3, Informative)

      by ArcCoyote ( 634356 )

      Idiot.

      First of all, don't you realize every time you make a joke about "anal probes" at the airport, you're being not-so-subtly homophobic? Same thing with prison-rape jokes. I'm about as much a fan of those jokes as I am of the acts.

      Didn't you read the part where the DHS CERT (a part of US-CERT, which falls under DHS but has nothing to do with the TSA...) told NSS something like, "Um, guys, the patch Siemens released doesn't work, and there are thousands of these devices deployed all over the place, includ

      • First of all, don't you realize every time you make a joke about "anal probes" at the airport, you're being not-so-subtly homophobic?

        Nonsense; it's a reference to bodily violation, which works no matter what your gender or orientation. Just because a man is gay doesn't mean he wants the TSA up his ass.

        NSS decided to play it safe; they weren't forced to do anything. It's called responsible disclosure, and when Siemens gets their products fixed, it will be released.

        Disclosure delayed is disclosure which does

      • You, my familial-basement-dwelling troll, assume coercion and conspiracy is how everything gets done by three-letter agencies.

        I didn't assume it. I learned it by reading the memos from the U.S. government that were leaked.

        First of all, don't you realize every time you make a joke about "anal probes" at the airport, you're being not-so-subtly homophobic?

        Why do you assume that every gay man likes having a government agent stick his gloved hand up his ass? Way to stereotype...

  • Ummmm.... (Score:2, Insightful)

    by jd ( 1658 )

    ...doesn't the existence of a virus that can attack such devices make this a zero-day flaw? The hack is public, since anyone can disassemble the virus that's in the wild and see how it works.

    And, frankly, I don't see it being awfully difficult for any Black Hat with a mind to rip out the prior payload and install one that can attack a wider range of devices. Surely it is in the interests of security for corporations to understand what they can do to mitigate the risk of this.

    The DHS, IMHO, is acting in a

  • ... stick your fingers in your ears and repeat after me, "La-la-la-la-la-la-la..."

    Asking people not to listen (such as the US government telling college students, of all people, not to read ANYTHING about Wikileaks) makes as much sense as telling the speakers not to speak.

  • The people you don't want to know about this stuff already know. The only reason Siemens or others don't want the info made public is to save face.

  • by Hierarch ( 466609 ) on Thursday May 19, 2011 @03:42PM (#36184316)

    A lot of people seem to want to scream about censorship, but they're missing the point. This is one of the best case scenarios I've seen in relations between companies and security researchers.

    For those who can't be bothered to RTFA, here's a summary.

    Researchers found a serious flaw. The company developed a fix. It turned out that the fix was flawed. The company told the researchers about the potential impact of giving the talk before the flaw was fixed, and the researchers voluntarily postponed the talk while a better fix is being built.

    That's it, and it looks like everybody did the best thing they could. Isn't this what we'd want Siemens to do? "You've got a right to give your talk, but we'd like you to postpone it. Here's why. Your call."

    • The info exists. The info is valuable to people who want to do something bad. Valuable information will find a supplier, provided the demand (and pay) is high enough.

      People who want information for nefarious reasons don't care about legal troubles connected with the acquisition of said information. People who want information to prevent said nefarious actions usually cannot ignore the law when trying to get it.

      Question for 100: Who will, now that this talk is not being held, have the information, and who will not have it?

      • Question for 100: Who will, now that this talk is not being held, have the information, and who will not have it?

        The same (good and bad) people have it without the talk, but the rest of the world does not. Although the risk level is still there, it's not increased. If TFA is correct and Siemens is working on a fix, then what's wrong with giving them the time they need and/or working with them?

    • I have a hard time believing that it took Siemens this long to develop a fix. The fact that Stuxnet was designed to compromise Siemens PLCs, and how it accomplished this, has been known for several months now. There's no excuse not to push out a (working) patch within a few months of a huge 0-day being discovered. To have not fixed this by now, especially given the critical applications some PLCs are used in, suggests negligence.

      Responsible disclosure says that you should give the responsible party a reasonab
  • So it would decrease security to give that information to people who pay for a security talk, people who are most likely sent there by companies, companies that possibly use the technology in question?

    Let's think for a while: someone who wants to blow up a dam or nuke a power plant probably doesn't really care too much about "virtual trespassing", aka hacking, and the legal implications thereof, and neither would he bother to second-guess spending some 1000 bucks on someone who would provide this information, whi

    • By not conducting the talk, the risk level is not increased, at least. It's still there, obviously. The companies that would have attended this talk should already be working on isolating SCADA and similar systems as much as possible, with or without the specifics of the talk. I doubt companies are going to be patching SCADA systems themselves without help from Siemens or their vendor. If Siemens is indeed honestly working on a solution, then _delaying_ the talk is entirely reasonable.

  • If the US intelligence services and Siemens had worked together in the past to exploit SCADA vulnerabilities in systems owned by unfriendly nations, why would they want to increase awareness of SCADA problems?
