Security

Is Analog the Fix For Cyber Terrorism?

chicksdaddy writes "The Security Ledger has picked up on an opinion piece by noted cyber terrorism and Stuxnet expert Ralph Langner (@langnergroup), who argues in a blog post that critical infrastructure owners should consider implementing what he calls 'analog hard stops' to cyber attacks. Langner cautions against the wholesale embrace of digital systems by stating the obvious: 'every digital system has a vulnerability,' and it is nearly impossible to guarantee that every potentially harmful vulnerability will be discovered during the design and testing phase of a digital ICS product. ... For example, many nuclear power plants still rely on what are considered 'outdated' analog reactor protection systems. While that is a concern (maintaining those systems and finding engineers to operate them is increasingly difficult), the analog protection systems have one big advantage over their digital successors: they are immune to cyber attacks.

Rather than bowing to the inevitability of the digital revolution, the U.S. Government (and others) could offer support for (or at least openness to) analog components as a backstop to advanced cyber attacks. That support could create the financial incentive for aging systems to be maintained and for the engineering talent needed to run them to be nurtured, Langner suggests."
Or maybe you could isolate control systems from the Internet.
  • by mindpivot ( 3571411 ) on Tuesday March 18, 2014 @12:05AM (#46513383)
    the terrorists are like cylons and we need to disconnect all networked computers for humanity!!!
  • by Anonymous Coward on Tuesday March 18, 2014 @12:12AM (#46513407)

    ever been compromised :) Physical kill switches, human operated, are not simply analog (one might argue they are digital at the switch level). Analog might be the wrong word, since analog systems have been repeatedly compromised (from Macrovision to phreaking boxes, etc.). Keep it off a communications network, even off local networks, if it is uber-critical.

  • by TWX ( 665546 ) on Tuesday March 18, 2014 @12:16AM (#46513425)

    said the person volunteering to get up at 3 am to go to the office to reset the a/c system.

    Sounds to me like you need a better A/C system.

    Or you need to not consider an HVAC system so critical that it can't be on the network. Or perhaps you need to design the HVAC system to take only the simplest of inputs from Internet-connected machines through interfaces like RS-422, and otherwise use its own non-connected internal network for the real connectivity (a sketch of that kind of narrow interface follows this comment). And design it to fail safe, so it doesn't shut off and leave the data center roasting if there's an erroneous input.

    And anything that is monitored across three shifts should not be Internet-connected if it's considered critical. After all, if it's staffed around the clock, it shouldn't have to notify anyone offsite.
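A minimal sketch of the kind of narrow interface described in the comment above: the HVAC controller accepts only a tiny, whitelisted command set from the Internet-facing serial link and clamps any setpoint to a safe band, so a bad or malicious input cannot shut cooling off. The command names and limits here are invented for illustration.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical command set accepted from the outside serial link.
 * Anything not on the list is ignored; setpoints are clamped so a bad
 * or malicious request cannot shut cooling off entirely. */
#define SETPOINT_MIN_C 16
#define SETPOINT_MAX_C 27

static int clamp(int v, int lo, int hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Handle one newline-terminated command read from the serial port.
 * Returns the new setpoint, or the current one if the command is unknown. */
int handle_command(const char *line, int current_setpoint_c)
{
    int requested;

    if (strcmp(line, "STATUS\n") == 0) {
        printf("setpoint=%dC\n", current_setpoint_c);
        return current_setpoint_c;
    }
    if (sscanf(line, "SETPOINT %d", &requested) == 1)
        return clamp(requested, SETPOINT_MIN_C, SETPOINT_MAX_C);

    /* Unknown command: log it and change nothing (fail-safe). */
    fprintf(stderr, "ignored unknown command: %s", line);
    return current_setpoint_c;
}
```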

  • by gweihir ( 88907 ) on Tuesday March 18, 2014 @12:20AM (#46513453)

    It is called a self-secure system. Such systems have limiters, designed-in limitations, and regulators that do not permit the system to blow itself up, and there is no bypass for them (short of going there in person and getting physical). This paradigm is centuries old and taught in every halfway reasonable engineering curriculum. That this even needs to be brought up shows that IT and CS do not qualify as engineering disciplines at this time. My guess would be that people have been exceedingly stupid, e.g. by putting the limiters in software in SCADA systems. When I asked my EE student class (bachelor level) what they thought about that, their immediate response was that this is stupid. Apparently CS types are still ignoring well-established knowledge.

  • by Osgeld ( 1900440 ) on Tuesday March 18, 2014 @12:22AM (#46513465)

    Analog is actually more susceptible to interference generated by rather simple devices, as there is no error checking on what's being fed to the system.

    The problem is that your reactor is, for some fucking reason, hooked to the same network as Facebook and Twitter.

  • Good idea (Score:5, Insightful)

    by Animats ( 122034 ) on Tuesday March 18, 2014 @12:32AM (#46513495) Homepage

    There's a lot to be said for this. Formal analysis of analog systems is possible. The F-16 flight control system is an elegant analog system.

    Full-authority digital flight control systems made a lot of people nervous. Airbus aircraft have them, and not only are the computers redundant, there is a second system cross-checking them that runs on a different kind of CPU, with code written in a different language by different people working at a different location. You need that kind of paranoia in life-critical systems.

    We're now seeing web-grade programmers writing hardware control systems. That's not good. Hacks have been demonstrated where car "infotainment" systems have been penetrated and used to take over the ABS braking system. Read the papers from the latest Defcon.

    If you have to do this stuff, learn how it's done for avionics, railroad signalling, and traffic lights. In good systems, there are special purpose devices checking what the general purpose ones are doing. For example, most traffic light controllers have a hard-wired hardware conflict checker. [pdhsite.com] If it detects two green signals enabled on conflicting routes, the whole controller is forcibly shut down and a dumb "blinking red" device takes over. The conflict checker is programmed by putting jumpers onto a removable PC board. (See p. 14 of that document.) It cannot be altered remotely.

    That's the kind of logic needed in life-critical systems.
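The conflict monitor described above is jumper-programmed hardware, not code, but a rough software analogue of the same check looks like this. The phase numbering and conflict table are invented for illustration.

```c
#include <stdbool.h>

#define NUM_PHASES 4

/* Invented conflict table: conflicts[a][b] is true when phases a and b
 * must never show green at the same time (crossing movements). */
static const bool conflicts[NUM_PHASES][NUM_PHASES] = {
    { false, true,  false, true  },
    { true,  false, true,  false },
    { false, true,  false, true  },
    { true,  false, true,  false },
};

/* Checked every cycle against the actual green outputs. If it ever
 * returns true, the controller is forced into flashing-red (all-stop)
 * mode, mirroring what the hard-wired monitor does. */
bool conflict_detected(const bool green[NUM_PHASES])
{
    for (int a = 0; a < NUM_PHASES; a++)
        for (int b = a + 1; b < NUM_PHASES; b++)
            if (green[a] && green[b] && conflicts[a][b])
                return true;
    return false;
}
```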

  • by raymorris ( 2726007 ) on Tuesday March 18, 2014 @12:36AM (#46513515) Journal

    Analog vs. digital, fully connected vs. less connected: all can fail in similar ways. If it's really critical, like nuclear-power-plant critical, use simple, basic physics. The simpler the better.

    You need to protect against excessive pressure rupturing a tank. Do you use a digital pressure sensor or an analog one? Use either, but also add a blowout disc made of metal one-fourth as thick as the rest of the tank (a back-of-the-envelope check follows this comment). An analog sensor may fail. A digital sensor may fail. A piece of thin, weak material is guaranteed to rupture when the pressure gets too high.

    Monitoring temperature in a life-safety application? Pick analog or digital sensors, either one, but you had better have something simple like the vials used in fire sprinklers, or a piece of wax that melts: something simple as hell based on physics. Ethanol WILL boil and wax WILL melt before it gets to 300 F. That's guaranteed, every time.

    New nuclear reactor designs do that. If the core gets too hot, something melts and the fuel drops into a big pool of water. Gravity is going to keep working when all of the sophisticated electronics stops working because "you're not holding it right."
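A back-of-the-envelope check of the blowout-disc idea above, using Barlow's formula for thin-walled vessels: burst pressure is roughly proportional to wall thickness, so a disc one-fourth as thick fails at roughly one-fourth the pressure. The material strength and dimensions below are invented example numbers.

```c
#include <stdio.h>

/* Barlow's formula: P = 2 * S * t / D
 * S = material tensile strength, t = wall thickness, D = diameter. */
int main(void)
{
    const double S_psi  = 60000.0;      /* example steel strength   */
    const double D_in   = 24.0;         /* example tank diameter    */
    const double t_wall = 0.5;          /* tank wall thickness, in  */
    const double t_disc = t_wall / 4.0; /* blowout disc thickness   */

    printf("tank wall bursts near %.0f psi\n", 2.0 * S_psi * t_wall / D_in);
    printf("blowout disc bursts near %.0f psi\n", 2.0 * S_psi * t_disc / D_in);
    return 0;
}
```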

  • by phantomfive ( 622387 ) on Tuesday March 18, 2014 @12:37AM (#46513523) Journal
    The main use case that causes problems with air gaps (AFAIK) is transferring files to the computer that's hooked up to the heavy machinery. People get tired of copying updates over USB, for example, and hook it up. Or they want to be able to reboot their air conditioner remotely.

    And that is the use case that caused problems for Iran with Stuxnet. They had an air gap, but the attackers infected other computers in the area, got their payload onto a USB key, and when someone transferred files to the main target, it got infected. That is my understanding of how that situation went down. But once you start thinking along those lines, you start thinking of other attacks that might work.
  • by DMUTPeregrine ( 612791 ) on Tuesday March 18, 2014 @12:39AM (#46513531) Journal
    That's because CS is math, not engineering. Computer Engineering is engineering, Computer Science is the study of the mathematics of computer systems. CE is a lot rarer than CS though, so a lot of people with CS degrees try to be engineers, but aren't trained for it.
  • No, it's education (Score:5, Insightful)

    by Casandro ( 751346 ) on Tuesday March 18, 2014 @12:39AM (#46513533)

    Such systems are not insecure because they are digital or involve computers or anything like that. (Seriously, I doubt the guy even understands what digital and analog mean.) Such systems are insecure because they are unnecessarily complex.

    Let's take the Stuxnet example. That system was designed to control and monitor the speed at which centrifuges spin. That's not really a complex task; it's something you should be able to solve in much less than a thousand lines of code (a toy sketch of that core loop follows this comment). However, the system they built had a lot of unnecessary features. For example, if you inserted a USB stick (why did it even have USB support?), it displayed icons for some of the files. And those icons can live in DLLs whose stub code gets executed when you load them. So you insert a USB stick and the system will execute code from it... just as advertised in the manual. Other features include remote printing to file, so you can print to a file on a remote computer, and storing configuration files in an SQL database, naturally with a hard-coded password.

    Those systems are unfortunately done by people who don't understand what they are doing. They use complex systems, but have no idea how they work. And instead of making their systems simpler, they actually make them more and more complex. Just google for "SCADA in the Cloud" and read all the justifications for it.
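In the spirit of the "much less than a thousand lines" point above, a toy sketch of the core job: read the spin speed, trip a shutdown on overspeed, and otherwise hold the setpoint. There is no USB handling, remote printing, or SQL database anywhere near it. The hardware-access functions and the limit are hypothetical.

```c
#include <stdbool.h>

#define RPM_MAX_SAFE 11000          /* hypothetical overspeed limit */

/* Hardware access, assumed to be provided elsewhere. */
extern int  read_speed_rpm(void);
extern void set_drive_output(int rpm_setpoint);
extern void trip_shutdown(void);

/* One iteration of the monitoring loop: trip the moment the measured
 * speed exceeds the safe limit, otherwise keep driving the setpoint. */
void control_step(int rpm_setpoint)
{
    int rpm = read_speed_rpm();

    if (rpm > RPM_MAX_SAFE) {
        trip_shutdown();            /* fail to a safe, stopped state */
        return;
    }
    set_drive_output(rpm_setpoint);
}
```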

  • by phantomfive ( 622387 ) on Tuesday March 18, 2014 @12:39AM (#46513539) Journal
    I think his point is that anything that can be accessed remotely by a trusted party can also be accessed remotely by an attacker. The distinction between analog and digital is a red herring.

    Maybe that wasn't his point, but it's still a good one. :)
  • by vux984 ( 928602 ) on Tuesday March 18, 2014 @12:47AM (#46513581)

    My guess would be that people have been exceedingly stupid, e.g. by putting the limiters in software in SCADA systems.

    Or they just did what they were told by management. After all, software solutions to problems tend to be a fraction of the price of dedicated hardware solutions, and can be updated and modified later.

    Apparently CS types are still ignoring well-established knowledge.

    You can't build a SCADA system with *just* CS types, so apparently all your 'true engineers' were also asleep at the wheel. What was their excuse?

    Seriously, get over yourself. The CS types can and should put limiters and monitors and regulators in the software; there's no good reason for them not to ALSO be in there, so that when you run up against them there can be friendly error messages, logs, etc. (a sketch follows this comment). Problems are caught more quickly, and solved more easily, when things are generally still working. This is a good thing. Surely you and your EE class can see that.

    Of course, there should ALSO be fail-safes in hardware for when the software fails, but that's not the programmer's job, now is it? Who was responsible for the hardware? What were they doing? Why aren't those fail-safes in place? You can't possibly lay that at the feet of "CS types". That was never their job.
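A sketch of the software-side limiter being argued for here: clamp the commanded value, log a readable message when you do, and leave the hardware interlock as the last line of defense rather than the first. The quantity, names, and limit are invented.

```c
#include <stdio.h>

/* Hypothetical software limit, set tighter than the hardware interlock
 * so software catches the problem first, with a friendly log entry. */
#define PRESSURE_CMD_MAX_KPA 850.0

double limit_pressure_command(double requested_kpa)
{
    if (requested_kpa > PRESSURE_CMD_MAX_KPA) {
        fprintf(stderr,
                "pressure command %.1f kPa exceeds software limit %.1f kPa; clamping\n",
                requested_kpa, PRESSURE_CMD_MAX_KPA);
        return PRESSURE_CMD_MAX_KPA;
    }
    return requested_kpa;
}
```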

  • by Anonymous Coward on Tuesday March 18, 2014 @01:23AM (#46513699)

    Networked does not imply internet connected. In the same way, if you are using electricity, it does not mean you need to be connected to the electric grid.

    There is no reason to go analog IF people are not stupid.

    Unfortunately, we have plenty of examples that refute your premise. People ARE stupid, including the people who designed the highly vulnerable smart grid that most of the US is now using for power distribution.

  • by volvox_voxel ( 2752469 ) on Tuesday March 18, 2014 @02:37AM (#46513895)

    There are billions of embedded systems out there, and most of them are not connected to the internet. I've designed embedded control systems for most of my career, and can attest to the many advantages a digital control system has over an analog one. Analog still has its place (op-amps are pretty fast and cheap), but it's often quite useful to have a computer do the job. Most capacitors have a tolerance of 20% or so, plus a temperature coefficient, and their values drift. An analog control system can drift over time, and may even become unstable, due to the aging of the components in the compensator (e.g. PI, PID, lead/lag). A microcontroller also wins hands down when it comes to long time constants with any kind of precision (millihertz); it's hard to make very long RC time constants and trust them (a short sketch follows this comment). Microcontrollers and FPGAs are good for a wide range of control loops, including those that are very fast or very, very slow. Microcontrollers let you do things like adaptive control when your plant can vary over time, such as maintaining a precise temperature and ramp time in a blast furnace whose internal volume can change wildly. They also let you easily handle transport and phase lags, and a lot of corner conditions and system changes, all without changing any hardware.

    I am happy to see the same trend with software-defined radio, where we try to digitize as much of the radio as possible, as close to the antenna as possible. Analog parts add noise, offsets, drift, crosstalk, leakage, etc. Digitizing early lets us minimize the analog portion.
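One concrete version of the long-time-constant point made above: a first-order digital low-pass filter with a one-hour time constant is a couple of lines of arithmetic and cannot drift the way a large RC network does as its capacitors age. The sample period and time constant are example values.

```c
/* Discrete first-order low-pass: y += alpha * (x - y), with
 * alpha = dt / (tau + dt). With tau = 3600 s and dt = 1 s this gives a
 * one-hour time constant that no capacitor tolerance can shift. */
#define DT_S  1.0
#define TAU_S 3600.0

double lowpass_step(double y_prev, double x)
{
    const double alpha = DT_S / (TAU_S + DT_S);
    return y_prev + alpha * (x - y_prev);
}
```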

  • by CBravo ( 35450 ) on Tuesday March 18, 2014 @02:47AM (#46513917)
    And to make it even more simple: Everyone, including smart people, makes mistakes.
  • Re:Good idea (Score:4, Insightful)

    by Viol8 ( 599362 ) on Tuesday March 18, 2014 @06:05AM (#46514471) Homepage

    "Code written in a different language is totally helpless here"

    No it isn't. Different languages have different pitfalls: e.g., C code often has hidden out-of-bounds memory access issues, while Ada doesn't, because checking for these is built into the runtime. Also, different languages make people think in slightly different ways about how to solve a problem, which means the chances of them coming up with exactly the same algorithm, and hence possibly exactly the same error, are somewhat lower.
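A small illustration of that pitfall difference: the commented-out line is an out-of-bounds write that C compiles and runs without any runtime check (it is undefined behaviour), whereas Ada's runtime would raise an exception at that point. The guarded version below it is the check a C programmer has to remember to write by hand.

```c
#include <stdio.h>

int main(void)
{
    int readings[8] = {0};
    int i = 8;                     /* one past the end of the array */

    /* readings[i] = 42;              C would accept this silently. */

    if (i >= 0 && i < 8)           /* the check Ada does for you    */
        readings[i] = 42;
    else
        fprintf(stderr, "index %d out of range\n", i);

    return 0;
}
```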

  • by wagnerrp ( 1305589 ) on Tuesday March 18, 2014 @08:48AM (#46514939)
    Only because no one thought to put some sort of rate limiter on the reset command. If you're continually needing to reset something, clearly there is a serious issue that should warrant a tech or engineer being called out to investigate.
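A sketch of the rate limiter being suggested: allow one remote reset, then refuse further resets until a cooldown expires, on the theory that repeated resets mean a technician should be physically investigating. The cooldown interval is an arbitrary example.

```c
#include <stdbool.h>
#include <time.h>

#define RESET_COOLDOWN_S 3600       /* hypothetical: one reset per hour */

static time_t last_reset_time = 0;

/* Returns true and records the time if the reset is allowed; returns
 * false, doing nothing, while still inside the cooldown window. */
bool try_remote_reset(void)
{
    time_t now = time(NULL);

    if (last_reset_time != 0 && (now - last_reset_time) < RESET_COOLDOWN_S)
        return false;               /* too soon: send a tech instead */

    last_reset_time = now;
    return true;
}
```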
