DHS Chief: What We Learned From Stuxnet

angry tapir writes "If there's a lesson to be learned from last year's Stuxnet worm, it's that the private sector needs to be able to respond quickly to cyber-emergencies (CT: Warning, site contains obnoxious interstitial ads. Blocker advised), according to the head of the US Department of Homeland Security. When Stuxnet hit, the US Department of Homeland Security was sent scrambling to analyze the threat. Systems had to be flown in from Germany to the federal government's Idaho National Laboratory. In short order the worm was decoded, but for some time, many companies that owned Siemens equipment were left wondering what measures, if any, they should take to protect themselves from the new worm."
  • #1 thing learned from Stuxnet:

    Air-gap your production SCADA/embedded stuff.

    • by rlp ( 11898 ) on Tuesday April 26, 2011 @03:24PM (#35946508)

      Air-gap your production SCADA/embedded stuff

      Stuxnet was designed to use USB-flash drives as a transmission vector.

      • by Anonymous Coward on Tuesday April 26, 2011 @03:26PM (#35946536)

        In other words: the real air gap you need to worry about is the one between your employees' ears.

        • In other words: the real air gap you need to worry about is the one between your employees' ears.

          Fact: It is impossible to guarantee zero errors from employees. People make mistakes.

          • Plugging a USB device into a machine you're not supposed to plug it into is not a "mistake"; it is vandalism, theft, or worse, industrial espionage. For that reason, USB should simply be disabled on company computers, unless USB is truly essential to their operation. And I haven't yet seen a machine where USB was essential. Fingerprint scanner, maybe? Get a scanner that plugs into the serial port, FFS!

      • by vlm ( 69642 )

        Some hot glue in the USB holes works wonders on other "secure" systems.

        • by ColdWetDog ( 752185 ) on Tuesday April 26, 2011 @04:04PM (#35946918) Homepage

          Some hot glue in the USB holes works wonders on other "secure" systems.

          Probably would work fairly well for the 'between-the-ears' airgap as well. Worth a try anyway.

        • Some hot glue in the USB holes works wonders on other "secure" systems.

          And if your system relies on USB to talk to the devices it is supposed to be programming, that hot glue isn't so useful.

            • Do you have such devices? I don't have any at my worksite. Everything is serial. Assuming you do communicate between devices via USB - how difficult would it be to use a serial port instead?

              • Do you have such devices? I don't have any at my worksite. Everything is serial. Assuming you do communicate between devices via USB - how difficult would it be to use a serial port instead?

              At a previous employer we had some USB programmers for TI MSP430 processors. Sure, they could have been serial, and we had serial ones. But serial is a legacy port nowadays.

      • and delivered by people willing to give their lives for it (which they likely did).

      • But keep in mind that the worm communicated with C&C servers after installation and was operated remotely.
        • by wsxyz ( 543068 )
          But there was no requirement for direct access to the network. Worm instances on airgapped systems received updates & transmitted information via later worm instances brought via USB stick.
          • by h4rr4r ( 612664 )

            If you are going to airgap, you must also disable the USB ports. Physically, not in software.
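
            For contrast, this is roughly what the software-only version looks like on a Linux host - a minimal sketch, assuming root and a hypothetical file name - and it illustrates the parent's point, because every line of it can be reverted by anyone with root:

            ```python
            #!/usr/bin/env python3
            """Soft-disable USB mass storage on Linux (illustrative sketch only).

            This is exactly the kind of software-only control the parent warns
            about: anyone with root can delete the file and modprobe the driver
            right back in. Desoldered or glued ports have no such undo button.
            """
            import subprocess
            from pathlib import Path

            # Hypothetical file name; any *.conf under /etc/modprobe.d/ works.
            BLACKLIST = Path("/etc/modprobe.d/no-usb-storage.conf")

            def soft_disable_usb_storage() -> None:
                # Stop the usb-storage driver from loading at boot...
                BLACKLIST.write_text(
                    "blacklist usb-storage\n"
                    "install usb-storage /bin/false\n"
                )
                # ...and unload it now if it is already resident.
                subprocess.run(["modprobe", "-r", "usb-storage"], check=False)

            if __name__ == "__main__":
                soft_disable_usb_storage()
            ```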

      • by icebike ( 68054 )

        That's just ONE vector, not the only one.

        Hot glue the USB ports, or disconnect them from the motherboard.
        Your employees have no business sticking USB drives into process control computers.

        The preponderance of USB-Only keyboard/mouse machines is a problem.

        • Your employees have no business sticking USB drives into process control computers.
           
          Until the software, firmware, what-have-you needs to be updated or changed. "We now need to change the rotation speed from X to Y in sub-vector Z." Would you like to do that by keyboard on each of the 25,000 or so machines?

          • What else do you have to do all day? What - you're going to miss a day or six of slashdot reading? Get off yer lazy arse and get to work updating those machines!

            BTW - I've been in a lot of production plants in my lifetime. I mean, a lot. You'll be hard pressed to find a list of plants with 25,000 machines doing similar jobs, all requiring the same or similar updates. Perhaps some corporation like General Motors has that many machines spread out across its corporate landscape, including spare replaceme

        • The port isn't the problem - it's the OS that auto-plays that's the problem
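
          Worth noting: Stuxnet's USB vector was the .LNK icon-rendering flaw (CVE-2010-2568), which fires without classic AutoRun, so killing AutoRun alone would not have stopped it. It is still cheap insurance, though. A minimal sketch of doing it in software on Windows, using the documented NoDriveTypeAutoRun policy value (must run as administrator; 0xFF disables AutoRun for all drive types):

          ```python
          import winreg

          # Documented policy location for AutoRun behaviour (see MS KB967715).
          KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Policies\Explorer"

          def disable_autorun() -> None:
              # 0xFF is the bitmask that disables AutoRun on every drive type.
              key = winreg.CreateKeyEx(
                  winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE
              )
              with key:
                  winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0,
                                    winreg.REG_DWORD, 0xFF)

          if __name__ == "__main__":
              disable_autorun()
          ```
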
    • # thing learned from Stuxnet:

      The human IT factor will always be the weakest link in the computer system equation.
    • by thsths ( 31372 )

      That, and never assume that the payload is harmless. Just because you do not understand it does not mean it does not affect you.

      So why did they have to analyse the code? It is a nice exercise, but for the threat assessment I think it is sufficient to state that the virus is uploading code to your SPS (PLC). It's like having an intruder on your premises - you do not need to understand his motives, but you do need to improve security.

      • Your point notwithstanding, the summary said that people with Siemens equipment - disclaimer: I work for them, but not in that group - needed to know how they might be impacted. Yes, block the holes, but you also need to try to fathom how bad the damage is going to be. What are we looking at here: harmless prank or full enterprise-wide meltdown?
    • by Kennon ( 683628 )
      How to write better detection avoidance considering they wrote it.
    • by thegarbz ( 1787294 ) on Tuesday April 26, 2011 @06:11PM (#35947924)

      #1 thing I've learnt from Stuxnet: People who have no experience with SCADA equipment say "OMGZ TEH HAXORS, Airgap! Airgap! Airgap!", and somehow get modded insightful.

      There is nothing insightful at all about taking the silly approach of simply cutting cables because there may be someone out there with nefarious motives. It's right up there with OH&S departments saying people should wear gloves at all times in case of papercuts.

      Any sizable SCADA system RELIES on network access. We're not talking about one small unit running one compressor, but the type of systems that run entire plants. They must be able to communicate with each other, with asset management systems, and with process historians (all of these on a different network, of course); these machines must be reachable by engineering departments at worst, and at best be accessible to knowledgeable experts in the industry from the other side of the world.

      There are plenty of plants around the world which would turn into oversized holes in the ground if it weren't for the fact that realtime knowledge was accessible remotely. There are many companies which would have been sued out of existence if they put their hands on their hearts in front of congress and said, "Sorry we don't have any data on what has happened, our IT guys said we couldn't network our SCADA systems to the offsite historian, and it has all burnt in a fire".

      Security is NOT an airgap. Security is a complete process, a company culture, and something that needs to be designed into every aspect of network design: limiting access both physical and remote, using a complex hierarchy of firewalls and one-way communications, etc.

      If you want a truly insightful post, maybe read this one below [slashdot.org]. You may learn something.
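
      To make the one-way-communications point concrete, here is a minimal sketch of a fire-and-forget telemetry push from the control network out to a historian. The hostname, port, and tag name are all hypothetical, and a real plant would enforce the one-way property in the firewall or with a hardware data diode rather than in application code:

      ```python
      import json
      import socket
      import time

      # Hypothetical historian endpoint on the business network; in a real
      # plant this path would cross a firewall that permits only this one
      # outbound flow (or a hardware data diode).
      HISTORIAN_ADDR = ("historian.example.com", 5140)

      def push_sample(tag: str, value: float) -> None:
          """Fire one sample and forget it; nothing can talk back on this
          UDP socket unless we explicitly read, which we never do."""
          payload = json.dumps(
              {"tag": tag, "value": value, "ts": time.time()}
          ).encode("utf-8")
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
              s.sendto(payload, HISTORIAN_ADDR)

      if __name__ == "__main__":
          push_sample("compressor_7/discharge_temp_C", 84.2)
      ```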

  • That is the lesson learned.

    So:

    1) keep not only production systems, but everything except communication systems, off the Internet

    2) do not allow users removable media, and apply extreme caution to 'upgrades'

    3) verify by reviewing the source code (or have that done by two or more separate parties)

    -

    You have no source code? Forget about IT security!
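
    Even when the vendor will not show you source, you can at least pin every 'upgrade' to a digest checked out of band before it touches a control system. A minimal sketch (the expected digest is a placeholder, not a real value):

    ```python
    import hashlib
    import sys

    # Known-good digest obtained out of band (e.g., read to you over the
    # phone by the vendor); the value below is a placeholder.
    EXPECTED_SHA256 = "0" * 64

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        digest = sha256_of(sys.argv[1])
        if digest != EXPECTED_SHA256:
            sys.exit(f"REFUSING UPDATE: digest mismatch ({digest})")
        print("digest OK, update may proceed")
    ```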

  • Security 101 (Score:5, Insightful)

    by bragr ( 1612015 ) * on Tuesday April 26, 2011 @03:44PM (#35946736)
    What they should have done:
    1) anyone bringing in flashdrives and plugging them into mission-critical machines should be taken out back and shot, or at least given a stern talking-to. Autorun should be disabled
    2) Any machines brought in from the outside (laptops etc.) should be placed on a separate, untrusted network
    3) Mission-critical machines shouldn't be on a network. If that isn't possible, they should be on a separate network or VLAN with only the machines they need to talk to; at the very least they shouldn't be able to access the internet (the sketch below shows a quick way to audit that)
    4) Always ensure that all security updates are applied promptly and all relevant hardening is performed
    5) At the first sign of such a massive infection across multiple machines and devices, everything should have been taken offline, wiped, reflashed, reinstalled, and brought back up in a known clean environment, with security procedures tightened.
    6) If all of your machines are running version X of OS Y, they will all suffer from the same 0-day attacks. Diversity, where appropriate, is useful.

    This may not have prevented an infection, but it would have definitely reduced its impact. I really question the competency of any IT person who had no idea what to do.
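
    On point 3, here is a minimal egress-audit sketch, run from the control host itself. The probe addresses are arbitrary public resolvers (an assumption, not part of the list above); on a properly isolated control network every probe should fail:

    ```python
    import socket

    # Well-known public endpoints; on an isolated control network
    # every one of these connection attempts should fail.
    PROBES = [("8.8.8.8", 53), ("1.1.1.1", 443)]

    def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        leaks = [(h, p) for h, p in PROBES if can_reach(h, p)]
        if leaks:
            print("EGRESS LEAK - this host can reach:", leaks)
        else:
            print("No direct internet egress detected.")
    ```
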
    • "anyone bringing in flashdrives and plugging them into mission critical should be taken out back and shot,"

      And how do you propose that updates be made to the system? Code them whole-cloth from within the secured network? Without testing the changes on a test system?

      • without autorun.

        hell, if you really want to be paranoid, set up as suggested above and make the important machines only run EXEs signed with a specific key, and be damn careful with what you sign.
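
        In production you would enforce that with OS-level mechanisms such as Authenticode plus AppLocker or Software Restriction Policies, but as an illustration, here is a minimal sketch of checking a detached RSA signature before launching anything. It assumes the third-party 'cryptography' package and hypothetical file paths:

        ```python
        import sys
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives import hashes, serialization
        from cryptography.hazmat.primitives.asymmetric import padding

        def signature_ok(exe_path: str, sig_path: str, pubkey_path: str) -> bool:
            """True only if exe_path carries a valid detached signature made
            with our release key; anything unsigned simply does not run."""
            with open(pubkey_path, "rb") as f:
                pub = serialization.load_pem_public_key(f.read())
            with open(exe_path, "rb") as f:
                data = f.read()
            with open(sig_path, "rb") as f:
                sig = f.read()
            try:
                pub.verify(sig, data, padding.PKCS1v15(), hashes.SHA256())
                return True
            except InvalidSignature:
                return False

        if __name__ == "__main__":
            exe, sig, key = sys.argv[1:4]
            if not signature_ok(exe, sig, key):
                sys.exit("refusing to run: bad or missing signature")
            print("signature OK")
        ```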

      • by bragr ( 1612015 ) *

        "anyone bringing in flashdrives from the outside and plugging them into mission critical should be taken out back and shot,"

        Fixed

      • I propose using USB!!

        However, I propose having USB access on removable PCI cards, or some similar removable interface. Keep the cards locked up unless you are doing an update.

        Sure, a very stupid user could go buy a USB card to play his collection of Lady Gaga hits in the reactor control mainframe, but he's probably more likely to buy a USB player instead of going to the trouble of installing a card and rebooting the system.

        A process engineer I used to work for had a Golden Rule: Design the work space s
    • "1) anyone bringing in flashdrives and plugging them into mission critical should be taken out back and shot,"

      Iran is lucky enough to have that BOFH option.

    • by cusco ( 717999 )
      A SCADA system **IS** a network, even if transmission is over power lines, POTS lines or microwave links. If you mean it shouldn't be on the organization's standard LAN then you'd be right, and in this case it wasn't. Only the terminally stupid connect SCADA networks to their corporate backbones, and most of those have been weeded out by now.
    • Well, items 1), 2) & 3) amount to the same thing with SCADA equipment. Btw: how do you do item 4) if you haven't got one of the first three? Now, having worked with as well as developed SCADA software, I can tell you that the number of "security" patches can sometimes be overwhelming. So in effect, it's very easy to slip a trojan into a SCADA system.

      As to looking at source code (as an earlier poster suggested): good luck with that. 99.99% of SCADA systems are proprietary, closed-source and encu

    • Number 4 is not possible on SCADA machines like those Stuxnet targets, or even on machines like an OSS system in a telco.

      You see, these application makers do not regard the machines as an HP-UX box (or Solaris box, or Sinix box or Windows box) running some software, but as, let's say, an NMS-2000, which, by pure random luck, "happens" to be implemented on HP-UX.

      Therefore, you are not allowed to install the latest patches from HP until the application provider (Nokia, in the case of the NMS-2000; Siemens, in the

    • It's never one IT person, especially for such a massive outbreak or such an important site. Any actual boots-on-the-ground guy could have done what you said, but getting a whole org to do things is just a hair short of infinitely harder.

    • I've been working with SCADA and real-time control systems for 30+ years, and I see one security hole that cannot be plugged by any of the steps you mention.

      Ultimately, data must be *analyzed*. Your telemetry files will somehow have to be brought to an engineer's desktop for that. A system that has no way to transfer data to less secure networks is useless.

      For me, the most secure control system would be a Linux system. In Linux, unlike closed-source OSes, you can configure exactly what's running.
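
      In that spirit, a minimal audit sketch for a hardened Linux control host: parse /proc/net/tcp and print every IPv4 socket in LISTEN state, so you can see at a glance what is actually accepting connections (IPv6 listeners would come from /proc/net/tcp6, which this deliberately skips):

      ```python
      # Minimal listener audit for a Linux control host: every IPv4
      # socket in LISTEN state (hex state 0A) found in /proc/net/tcp.
      def listeners():
          out = []
          with open("/proc/net/tcp") as f:
              next(f)  # skip the header row
              for line in f:
                  fields = line.split()
                  local, state = fields[1], fields[3]
                  if state != "0A":  # 0A == TCP LISTEN
                      continue
                  ip_hex, port_hex = local.split(":")
                  # /proc stores the IPv4 address little-endian in hex.
                  ip = ".".join(str(int(ip_hex[i:i + 2], 16))
                                for i in (6, 4, 2, 0))
                  out.append((ip, int(port_hex, 16)))
          return out

      if __name__ == "__main__":
          for ip, port in listeners():
              print(f"{ip}:{port}")
      ```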

  • Ralph Langner: Cracking Stuxnet, a 21st-century cyber weapon
    http://www.ted.com/ [ted.com]

    http://www.youtube.com/watch?v=CS01Hmjv1pQ [youtube.com]

    In short, he shows/claims the US was behind it.

  • I thought the US wrote this? I still think it was Canada.

  • 1) Warn Boss of vulnerabilities
    2) Boss asks for time/cost estimate to fix
    2a) Boss brings estimate to talking-head meeting
    2b) people protest about their job process changing
    3) estimate sits on Boss's desk for 3 months
    4) Boss golfs with his sis's brother-in-law and they talk security
    5) Boss comes to work next day, calls meeting about security
    6) You remind him of estimate on desk for 3 months
    7) meeting devolves into yucks about golfing/hangover
    8) Boss calls you into office after meeting
    9) Asks you to pick two
  • ...is that the guys at Langner Communications seriously have the best control-system security chops out there.

    ~Sticky
    /My opinions are my own.

  • I thought they would have learned that, with enough private-sector forensics, everything gets traced back to them? Didn't DHS, in conjunction with Siemens and Israel, write this?

    • Sorry, wrong federal agency. I doubt DHS had anything to do with it except to shit themselves when they found out how vulnerable U.S. infrastructure is.
  • "...but for some time, many companies that owned Siemens equipment were left wondering what, if any measures, they should take to protect themselves from the new worm."

    The implication of this statement is that DHS didn't have an immediate answer (outside of pedantic default answers like "unplug your equipment" or "reload software" or anything else from answers.com).

    Gee, let's see -- a new worm never seen before, apparently written by a sophisticated group from the intelligence community and someone's actual

  • According to Iran (who is never wrong about these things, as they will tell you themselves), we wrote this virus in collusion with the Zionist enemy. So why are we now having to go to all this trouble to decode it?
  • is that, as with the events leading up to 9/11, various government entities still don't share information with one another.

    Until they fix that (isn't that what DHS was supposed to be for?), Iran is the least of their problems.
  • Last I checked, DHS is part of the US government. So all they needed to do to find out about Stuxnet was talk to their Federales buddies who helped create it.
  • The way I hear it, Idaho National Labs was able to decode the worm quickly, since it was likely a weaponized exploit from a report they wrote. I'm betting that when DHS got them involved, it was not their first time seeing this equipment, as they audit our infrastructure all the time.

    • Not that they would have known they were involved, since it would have been redacted from their report if DoE decided to pocket the exploit.

  • The same folks who bring us the TSA.

    Based on that alone, I can confidently say that they didn't learn anything from Stuxnet.
