
Kaspersky To Build Secure OS For SCADA Systems

Trailrunner7 writes "Attacks against SCADA and industrial-control systems have become a major concern for private companies as well as government agencies, with executives and officials worried about the potential effects of a major compromise. Security experts in some circles have been warning about the possible ramifications of such an attack for some time now, and researchers have found scores of vulnerabilities in SCADA and ICS systems in the last couple of years. Now, engineers at Kaspersky Lab have begun work on a new operating system designed to be a secure-by-design environment for the operation of SCADA and ICS systems. 'Well, re-designing ICS applications is not really an option. Again, too long, too pricey and no guarantees it will fit the process without any surprises. At the same time, the crux of the problem can be solved in a different way. OK, here is a vulnerable ICS but it does its job pretty well in controlling the process. We can leave the ICS as is but instead run it in a special environment developed with security in mind! Yes, I'm talking about a highly-tailored secure operating system dedicated to critical infrastructure,' Eugene Kaspersky said in an interview."
This discussion has been archived. No new comments can be posted.


  • by Splat ( 9175 ) on Tuesday October 16, 2012 @01:27PM (#41672043)

    Monitoring and "remote support" by KGB included free with every purchase!

    • by Anonymous Coward

      When you say "Monitoring and "remote support" by KGB included free with every purchase!", are you Putin us on?

      • I was russian to say the same thing but you beat me to it. I'm stalin to think that this whole thing is a hoax.
  • by Revotron ( 1115029 ) on Tuesday October 16, 2012 @01:28PM (#41672065)
    They'll never go for it.
    • by jythie ( 914043 )
      Yeah... secure systems tend to be inefficient to use or more work to maintain, so people often just switch off a lot of the security, especially when the systems are being used and maintained by people who just want to use them in order to complete other tasks.
      • by bluefoxlucid ( 723572 ) on Tuesday October 16, 2012 @01:55PM (#41672423) Homepage Journal

        I think a Linux system that used PaX would be easy. Actually, I used to maintain the list of incompatible apps for Gentoo Linux--mostly Java itself, plus a handful of other things that turned out to be broken (and occasionally to have critical security holes, none of which I personally found). The thing about PaX is that when something is killed, it's logged, and you get a wealth of debug data--when your program misbehaves, it usually dies early and it's easier to find the problem. That means developers have an easier time getting their software correct, and the system doesn't do odd, unexpected things (whether from bad software or from being hacked and worm-infested), so the more secure system becomes the more usable and more maintainable system.

        Similarly, for Unix environments, you could work on building out Minix and bolt on services that supply security guarantees like PaX's, and that interpose between the user-space utilities and the OS (because the OS syscall handler is itself a service, you run the program under a DIFFERENT SERVICE) to implement namespaces and act as functional jails--virtualization, semi-virtualization. Services supplied under full microkernels like GNU HURD, L4, or Minix are small and thus easily audited for correctness--and thus improve security.

        It all requires policy, of course. The PaX stuff is policy: no write/execute and no !execute to execute. If that crashes the program, you need to fix the program or remove that policy restriction. Semi-virtualization is mainly a file access policy--hide (can't see it), read-through (can see it, writes are redirected a la UnionFS), read-write (can see and change it, object is shared)--and a resource policy--PIDs, network devices/addresses, etc are hidden or shared. It's on the developer to do that, although forced policy on deployment is possible (you can externally generate a policy). grsecurity has always supplied a learning mode that logs and then develops policy automatically, which you can then audit for monkey business.
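        As a concrete illustration of the W^X half of that policy, here is a minimal C sketch (mine, not the poster's) of the mapping transition that PaX's MPROTECT feature refuses: memory mapped writable cannot later be flipped to executable. On a stock kernel the mprotect() call usually succeeds; under PaX it fails and the attempt is logged.

```c
/* Illustrative only: shows the W^X transition that PaX MPROTECT denies.
 * On a stock Linux kernel the mprotect() below usually succeeds; under
 * PaX it fails with EPERM and the attempt is logged. */
#define _DEFAULT_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    size_t len = 4096;

    /* Map an anonymous, writable (but not executable) page. */
    unsigned char *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                              MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    memset(buf, 0xC3, len);  /* pretend this is injected code (x86 RET) */

    /* Try to flip the same page to executable: a classic exploit step. */
    if (mprotect(buf, len, PROT_READ | PROT_EXEC) != 0) {
        perror("mprotect");  /* under PaX MPROTECT: denied, and logged */
        return 1;
    }

    puts("W^X transition allowed (no MPROTECT-style policy in effect)");
    return 0;
}
```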

        • by Chris Mattern ( 191822 ) on Tuesday October 16, 2012 @02:47PM (#41673069)

          It all requires policy, of course. The PaX stuff is policy: no write/execute and no !execute to execute. If that crashes the program, you need to fix the program or remove that policy restriction.

          And right there you've put your finger precisely on the problem. Fixing the program is hard--if you got it from a vendor, it might well be impossible. Removing the policy, on the other hand, is easy.

          • by jythie ( 914043 )
            And that is the crux of the problem... while OSes have a wealth of security problems, when you drill down into break-ins you generally find that the hole was human (or institutional) in nature. Unintended consequences aside, making something secure and making something functional are mutually exclusive (or at least often conflicting) goals, so humans will often just turn off security when it gets in the way of doing the tasks they need done.
            • No, that's stupid. As I said, the stricter guarantees you must follow under PaX-alikes tend to make programs easier to debug. ASLR and W^X policies cause hard failures to happen more often and earlier in the failure process, rather than letting the program get away with an off-by-one or just hobble along half-dead until it sputters out and dies miles away from where the error actually occurred. Debugger intervention and proper core dumps (with library layouts and other memory mappings) can be us

              • by jythie ( 914043 )
                Stricter programming tends to fall by the wayside as soon as development encounters things that the requirements make more difficult to prevent completely, since even if underwriters are putting pressure on, there is usually more direct pressure to get something hacked together and out the door. If you have the luxury of time and funding you can build something using a strict methodology... but in corporate development that is rarely the case... especially when stricter development only covers a narrow class of e
                • Narrow class? 86% of the first 60 security exploits published as advisories by Ubuntu were covered by ASLR and W^X supplied by PaX.

                  Also, as I've stated repeatedly, the development changes under ASLR policy are: DON'T MAKE BROKEN CODE (really, the only break here is to assume the address of a dynamically allocated resource is always the same and hard-code it; or subtle corruption that semi-works because generally things don't move around). For the W^X protections supplied by PaX, W^X, ExecShield, and a
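                  For what it's worth, here is a small C sketch (my own, assuming a Linux/glibc system; the library and symbol names are just convenient examples) of the coding habit that survives ASLR: resolve addresses at run time through the loader instead of baking in a pointer observed on an earlier run.

```c
/* Under ASLR the base address of a shared library differs on every run,
 * so code must resolve addresses at runtime (dlsym) instead of baking in
 * a pointer observed once.  Build with: cc aslr_demo.c -ldl */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *lib = dlopen("libm.so.6", RTLD_NOW);
    if (!lib) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    /* Correct: ask the loader where "cos" lives *this* run. */
    double (*cosine)(double) = (double (*)(double))dlsym(lib, "cos");
    if (!cosine) {
        fprintf(stderr, "dlsym: %s\n", dlerror());
        dlclose(lib);
        return 1;
    }

    printf("cos(0) = %f\n", cosine(0.0));
    printf("resolved address this run: %p\n", (void *)cosine);
    /* Hard-coding the number printed above and reusing it next run is
     * exactly the pattern ASLR breaks -- and breaks loudly. */

    dlclose(lib);
    return 0;
}
```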

          • This is actually not as much of a problem as you think. Programs don't last forever, and they get rewritten a lot. Upgrades. New features. Refactoring, replacement. Remember the shiny new thing is more attractive, so businesses throw out the old and make new just for the sake of being more up-to-date than the competition.

            As such, a feature that says "Requires you to disable the security features of your OS for this particular program" and a big scary warning box that says "Program is requesting policy

            • by cusco ( 717999 )
              businesses throw out the old and make new just for the sake of being more up-to-date than the competition.

              Not so much for SCADA systems and the like. I worked at a place that still had a knee-high pile of 386 laptops in 2005, because the control program for their half-million-dollar radio tower wouldn't run on any other architecture. Finding a SCADA system less than five years old that runs, for example, a power dam will be difficult, since that's really a situation where "if it's not broke d
              • businesses throw out the old and make new just for the sake of being more up-to-date than the competition. Not so much for SCADA systems and the like.

                Your entire argument becomes irrelevant because such systems aren't going to upgrade the underlying OS into a "secure OS" anyway. The OS is going to rot away with the underlying program because the WHOLE PACKAGE works, and if management wants it upgraded they're going to want the WHOLE PACKAGE upgraded--why the hell would you move a program from the 90s off an NT4 box onto an XP box? You get a NEW program and install it on shiny new Server 2008r2.

                You want a secure architecture without bolt-on bullshit,

                • by cusco ( 717999 )
                  you can change...

                  Well, maybe YOU can, but I can't. I just install the stuff and make it work; I rely on folks like you to give me the parts to put it together. I don't really **want** to move that 2002 package from NT to Server 2008 or Red Hat version whatever, and then repeat the exercise again in a few years. If the underlying OS is secure enough that it doesn't have to go through a major upgrade every two years (with all the necessary testing and downtime that involves), we can start poking vendors wit
                  • Well, maybe YOU can, but I can't. I just install the stuff and make it work; I rely on folks like you to give me the parts to put it together.

                    Bolt-on bullshit is bolt-on bullshit because refactoring a huge multi-million-line codebase is really freaking hard. Microkernels and other modular programming designs rely on small, specialized tasks so that the pieces can be shuffled, improved, and have other pieces interposed without the whole system needing a reworking front-to-back. Even Andrew Morton would have an easier time adding stuff like LSM to Minix rather than the (already completed, obviously) task of adding it to Linux.

                    Micro-modular soft

              • Qubes OS, previously seen on Slashdot, ought to do nicely.

                It looks like Linux, but it isn't. It's more like VMWare ESX Server. It's a tiny kernel that provides a desktop consisting of composited windows from guest VMs. Each window is labeled according to the VM it runs in, with window borders colored according to security compartment.

    • by AK Marc ( 707885 )
      It won't work. SCADA was built off the assumption of physical security. It would take firing everyone who ever worked in SCADA to design the next generation of applications to get anything with security in it. Why would anyone switch to this "secure" system, when it's already more secure (physically) than any "program" can make it?
      • It won't work. SCADA was built off the assumption of physical security. It would take firing everyone who ever worked in SCADA to design the next generation of applications to get anything with security in it. Why would anyone switch to this "secure" system, when it's already more secure (physically) than any "program" can make it?

        That's a bit of a stretch. I work in SCADA and it's not the developers who are the problem. I, myself, harp on about security every other week. However, the marketing and development managers don't give a rat's arse. If it's not a new "shiny" or something the competition has, then no money gets put into it.

        Even when security is called out in the standards or client specifications, it's usually just security theater. The maintainers and end users don't want security and tend to bypass the token metho

        • by AK Marc ( 707885 )
          I wasn't talking about developers specifically, but usually SCADA is written by engineers and supported by IT, so you'd have to get rid of the engineers that made/designed it, as well as the managers that supported it. There's so much inertia around SCADA, and it started as an electrical spec for monitoring, with an idea of someday using electricity to control. The "original" SCADA was electrically driven analogue gauges for monitoring, with no control ability. Later, it evolved into two-way control, assuming you ar
          • by rioki ( 1328185 )
            What's the deal?! Modern SCADA systems are secure by themselves. (Save a few implementation flaws, like any other system.) The problem is that the installations are often ridiculously insecure. You get things like the Industrial Ethernet being switched with the office network, default passwords, the SPS (PLC) left in program/run mode, and the like. The important thing to remember about SCADA systems is that physical security is also part of the deal; who cares about the SCADA systems when you can access the petro chemic
            • by AK Marc ( 707885 )
              The engineers are the worst, almost as bad as doctors. They have a "technical" degree, as if a degree in petroleum engineering made you an expert in computers or programming. So they'll develop the thing themselves and then demand the IT department hook it up to the network, and the IT manager doesn't want a fight with the COO, so yes, it goes on the network. And they are one virus away from a serious problem.
      • Why do critical control systems need to be connected to the Internet? Computerized control systems such as SCADA have existed far longer than the Internet. There is also a difference between allowing critical quantities to be controlled remotely and having those same critical quantities merely monitored remotely. It MAY be permissible to allow read-only monitoring over the Internet, but certainly, critical controls should never, ever, at any time, under any circumstances, be accessible from the Internet, so that some

        • by mikael ( 484 )

          It's cheaper to run traffic over an Internet link than it is to buy a dedicated line. I knew one oil company that wanted to run an RGB composite video cable all the way underwater from an oil rig to the head office, just so the CEOs could see what was happening on the control system offshore. Fortunately, the consultants persuaded them that converting the video signal to digital and running it through the existing fibre-optic network would do just as well.

        • by cusco ( 717999 )
          Actually very few are connected to the Internet, and then only by morons. More often what happens is the dedicated SCADA network gets "upgraded" and put on the corporate backbone by IT executives who think that a VLAN is "just as good" as a private network. This makes it a lot easier for the operator, who no longer needs a dedicated workstation just for the SCADA part of his job and who no longer has to sneaker-net reports from one network to the other, but the utter security failure is obvious to anyone
  • by EmperorOfCanada ( 1332175 ) on Tuesday October 16, 2012 @01:28PM (#41672071)
    Aren't Kaspersky Labs the bozos who supported Internet passports? That is such a dumb idea that my computer lost 100 MHz just browsing the article. These guys just have VeriSign envy and want to get between users and hardware in order to charge rent.
  • by Billly Gates ( 198444 ) on Tuesday October 16, 2012 @01:29PM (#41672091) Journal

    Make the client OS use DNSSEC and encrypted traffic for a secure network that is not physically connected to the Internet or any network with a gateway to the Internet. Why is this so hard?

    This secure OS will eventually get compromised too if it has USB ports enabled, allows physical access to the machine, or sits on a network.

    • by Anonymous Coward on Tuesday October 16, 2012 @01:38PM (#41672203)

      All of the SCADA systems I have installed are wireless. A potential hacker doesn't need physical access, they just need to be in range.

      True story: The largest wireless SCADA system I did was for an oilfield company. I originally set up passwords made of random letters and numbers, making them as secure as possible. But less than a week after the system was up and running, they complained the passwords were too difficult to remember. So I was forced to change them all to something similar to President Skroob's luggage combination or not get paid.

      (The SCADA radios ran Linux, in case you're interested...)

    • There's more than one way to infect a system, and yes, most of them require user stupidity, but there is no end to the supply of stupid users. The most commonly described example is to drop a few USB thumb drives in the parking lot with your worm on them, then just wait for some well-intentioned (or not) employee to pick one up and either plug it in to see who it might belong to or start using it for day-to-day activities, such as updating software on the 'secure', air-gapped systems.

      • by cusco ( 717999 )
        The worst problems aren't in the hardware or the software, but in the wetware. You can't fix stupid.
    • by Chris Mattern ( 191822 ) on Tuesday October 16, 2012 @02:44PM (#41673033)

      Make the client OS use DNSSEC and encrypted traffic for a secure network that is not physically connected to the Internet or any network with a gateway to the Internet. Why is this so hard?

      Because management wants the real-time reports on their desks. What do you mean, it's not secure? Everybody else does it. You're the only one who seems to have trouble doing this!

      • What's so hard about a one-way dump of data? Open an outgoing port and send the data. Block all incoming requests.
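        A minimal sketch of that one-way dump, assuming a plain POSIX socket API; the reporting host 192.0.2.10 and port 5140 are placeholders. The SCADA side only ever sends; it never binds a listening socket, and the perimeter firewall drops everything inbound.

```c
/* One-way reporting sketch: send a UDP datagram to a collector and never
 * listen for anything.  Host and port are placeholders. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) {
        perror("socket");
        return 1;
    }

    struct sockaddr_in report = {0};
    report.sin_family = AF_INET;
    report.sin_port = htons(5140);                        /* placeholder port */
    if (inet_pton(AF_INET, "192.0.2.10", &report.sin_addr) != 1) {
        close(fd);
        return 1;                                         /* placeholder host */
    }

    const char *msg = "pump_3 pressure=101.3kPa status=OK";
    if (sendto(fd, msg, strlen(msg), 0,
               (struct sockaddr *)&report, sizeof report) < 0) {
        perror("sendto");
        close(fd);
        return 1;
    }

    close(fd);
    return 0;
}
```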

  • by i.r.id10t ( 595143 ) on Tuesday October 16, 2012 @01:30PM (#41672099)

    Why waste time on new development? Start with one of the BSD systems (already approved under ISO 9001/9002-type stuff) and either set up custom configurations or fix what needs fixing.

    • by gentryx ( 759438 ) *
      Exactly what I thought. Why reinvent the wheel? Shouldn't be too difficult to make BSD real-time capable.
    • Minix. It's easier to do a major modification--the fact that it's only basically functional is not an issue, since it's a functional Unix OS without bells and whistles and you're going to be designing and implementing most of the bells and whistles.
    • Agreed, but why beat around the bush? Start with OpenBSD.
      • by Anonymous Coward

        Once again the OpenBSD clan has stepped forward with a reasonable plan for saving the world, unfortunately as is par for the course with OpenBSD, their chosen representative refers to himself as buttfuckinpimpnugget to which the world replied with "oh god, what is wrong with you, go away, just go away!" and promptly returned to their course of apathetic self destruction.

  • I like the idea (Score:4, Interesting)

    by kasperd ( 592156 ) on Tuesday October 16, 2012 @01:32PM (#41672117) Homepage Journal
    I do like the idea of an operating system designed with such security in mind. The operating system is probably also going to require some sort of real-time guarantees, but otherwise there are no requirements for ultra-high performance.

    As far as security goes, I think one important aspect is transparency. Code running on the operating system should probably not have much freedom to modify the underlying system, but it is crucial that you can see what is going on, so that you can monitor that nothing unexpected is running on the system.

    I guess for most SCADA systems the risk of bad stuff happening due to unauthorized changes is a much greater concern than leaking information from the system.

    Are Kaspersky the right people to build the OS? Time will tell.
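    To make the transparency point above concrete, here is a rough C sketch (mine; the whitelist entries are invented) of the kind of check such an OS could make trivial: walk /proc, read each task's name, and flag anything that is not on a known-good list.

```c
/* Rough sketch of "monitor that nothing unexpected is running": walk
 * /proc, read each task's name, and flag anything not on a whitelist.
 * The whitelist entries are made-up examples. */
#include <ctype.h>
#include <dirent.h>
#include <stdio.h>
#include <string.h>

static const char *whitelist[] = { "init", "scada_hmi", "historian", NULL };

static int allowed(const char *name)
{
    for (int i = 0; whitelist[i]; i++)
        if (strcmp(name, whitelist[i]) == 0)
            return 1;
    return 0;
}

int main(void)
{
    DIR *proc = opendir("/proc");
    if (!proc) {
        perror("/proc");
        return 1;
    }

    struct dirent *entry;
    while ((entry = readdir(proc)) != NULL) {
        if (!isdigit((unsigned char)entry->d_name[0]))
            continue;                       /* only numeric PID directories */

        char path[64], comm[64] = "";
        snprintf(path, sizeof path, "/proc/%s/comm", entry->d_name);

        FILE *f = fopen(path, "r");
        if (!f)
            continue;                       /* process may have exited */
        if (fgets(comm, sizeof comm, f))
            comm[strcspn(comm, "\n")] = '\0';
        fclose(f);

        if (!allowed(comm))
            printf("unexpected process: pid %s (%s)\n", entry->d_name, comm);
    }

    closedir(proc);
    return 0;
}
```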
    • by Anonymous Coward

      They want their capabilities architecture back.

      • by kasperd ( 592156 )

        They want their capabilities architecture back.

        Capabilities lead to complexity, complexity leads to bugs, and bugs lead to vulnerabilities.

        To build something secure, you need to aim for simplicity.

    • As far as security goes, I think one important aspect is transparency.

      FTFA:

      Threatpost: What are the most important features for the new OS?

      Eugene Kaspersky: Alas, I cannot disclose many details about it.

      A secure OS shouldn't need to be kept secret.
      It should be publicly vetted, like an encryption algorithm.

      • by kasperd ( 592156 )

        As far as security goes, I think one important aspect is transparency.

        FTFA:

        Threatpost: What are the most important features for the new OS?

        Eugene Kaspersky: Alas, I cannot disclose many details about it.

        A secure OS shouldn't need to be kept secret. It should be publicly vetted, like an encryption algorithm.

        That wasn't the sort of transparency I had in mind when writing my comment, but I still agree with you. However, I don't see a problem with it being kept secret during the design phase, as long as it is publicly vetted before being put into production. A bug bounty would also be a good idea.

  • by jader3rd ( 2222716 ) on Tuesday October 16, 2012 @01:40PM (#41672229)
    "re-designing ICS applications is not really an option". If redesigning the apps isn't an option, how would a new OS help?
    • by JWW ( 79176 )

      I'm assuming they want to sandbox access to lower level hardware, which can be done with a modified OS.

      Except SCADA's a strange bird in that respect. While low level access to network hardware might not be needed by the control interface, low level access to the controllers and monitoring systems is needed.

      They're onto something when they're talking about a custom OS. But that problem had largely been solved in the past, until all the engineers and operators wanted SCADA interfaces that ran on Windows. A
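      One hedged sketch of what "sandboxing access" could look like on an ordinary Linux host today (my illustration, not anything Kaspersky has announced): a libseccomp syscall allowlist that leaves a legacy control loop its read/write/ioctl on descriptors it already holds, while everything else kills the process.

```c
/* Hedged sketch: confine an existing control loop with a seccomp syscall
 * allowlist.  Anything not listed below kills the process.
 * Uses libseccomp; build with: cc confine.c -lseccomp */
#include <seccomp.h>
#include <unistd.h>

int main(void)
{
    /* Default action: kill the process on any syscall not allowed below. */
    scmp_filter_ctx ctx = seccomp_init(SCMP_ACT_KILL);
    if (!ctx)
        return 1;

    /* Keep the minimum the control loop needs on descriptors it already owns. */
    int keep[] = { SCMP_SYS(read), SCMP_SYS(write), SCMP_SYS(ioctl),
                   SCMP_SYS(nanosleep), SCMP_SYS(exit), SCMP_SYS(exit_group) };
    for (size_t i = 0; i < sizeof keep / sizeof keep[0]; i++)
        if (seccomp_rule_add(ctx, SCMP_ACT_ALLOW, keep[i], 0) < 0)
            goto fail;

    if (seccomp_load(ctx) < 0)
        goto fail;
    seccomp_release(ctx);

    /* ... continue into the legacy control loop here ... */
    write(STDOUT_FILENO, "confined\n", 9);
    return 0;

fail:
    seccomp_release(ctx);
    return 1;
}
```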

      • I've made several OSes -- it's not that hard. Protected Mode should have been the end of it, but you see, even hardware has bugs. Perfect software can be vulnerable on different hardware. Just look at any BIOS interrupt listing -- some BIOS interrupts are known to trash certain registers in an undocumented way. If our hardware doesn't always perform per spec, then you're barking up the wrong fucking tree when it comes to security... Software is only part of the problem.
    • by Kaenneth ( 82978 )

      I would guess the new OS would be binary compatible with the old OS; just like you can run Windows 1.0 applications on Windows 7 (subject to quirks...)

      Which would also be why just using BSD wouldn't work.

    • how would a new OS help?

      Magic.

    • I'm confused about what, exactly, is supposed to run on top of this new operating system.

      Is it supposed to be a new OS for devices with physical-layer control capability like PLCs (Programmable Logic Controllers), DCSs (Distributed Control Systems) and RTUs (Remote Terminal Units)?

      If so, I don't see how it would help, since each of these devices has its own unique proprietary hardware architecture. It's highly unlikely Kaspersky could effectively support the hardware.

      Or is it supposed to be for hosting cent

  • by Anonymous Coward

    The problem isn't the OS. The problem is the programmers.
    The culture, the style, the programming best practices... they need to change when it comes to embedded systems that need high security.

    You need high standards for these things. Instead, they are mostly still slapped together like the crappiest web apps.

    So now you're going to build an idiot-proof OS?
    Well, you know what they say about that...

  • Keep M$ out of mission-critical and high-danger environments? Good, and it's about time. Nothing could be smarter.
  • That's how my eyes are rolling right now.

    Kaspersky Lab is a company that has its whole business centered around digging through compromised insecure systems. They wouldn't know a secure design if it bit them in their faces.

    • That's how my eyes are rolling right now.

      Kaspersky Lab is a company that has its whole business centered around digging through compromised insecure systems. They wouldn't know a secure design if it bit them in their faces.

      Man, I've still got this head-ache...
      Well, damn, man. Don't you think you should see about getting that bear trap removed from your head.
      What bear trap?!

  • We aren't done completely hosing Iran's nuke program just yet. Once that is completely kaput then have at it!
  • But then again, anyone who knows the solution would have to kill you if they told you.

  • assumption 1: we can have remote control work. FALSE. any backdoor anywhere will open.

    assumption 2: the vendor is secure. FALSE. any fixed system password is known to somebody bad.

    assumption 3: we can use lowest-cost hardware. FALSE. there will be flash drives and dongles and games placed on these machines from who knows where.

    assumption 4: we can firewall the net and have Smart Grid work safely. FALSE. it's ALL fixed passwords out there in StupidGrid, wireless here and there, customer acces

  • Those of us who have been around a while will remember Microsoft trumpeting Windows NT's security.

    "Microsoft included security as part of the initial design specifications for Windows NT, and it is pervasive in the operating system"

    The whole Orange Book / Red Book, C2 security level and so on.

    They would be better off improving the failings of the existing system, rather than inventing a whole new set of ways to fail.

  • Wow.... You know what I really want... a Trusted Computing Platform for SCADA. Because, hey... if I don't have verifiable challenge-response between a sensor and a controller, how can I really trust it? Maybe they can even make the Thunderbolt connector the standard, with authentication for all the cables! That would be great... Then we could just blame a system failure on a bug in the authentication layer!

    The need for interoperability is where most of the problems seem to come from. Properly securing and ma
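    For the record, the challenge-response being mocked above is not exotic. A back-of-the-envelope C sketch using OpenSSL's one-shot HMAC() helper (my illustration, with a placeholder shared key and all key management hand-waved) looks roughly like this: the controller sends a random challenge, the sensor returns HMAC-SHA256 over it, and the controller recomputes and compares in constant time.

```c
/* Back-of-the-envelope sensor/controller challenge-response sketch.
 * Placeholder key, no key management.  Build with: cc cr_demo.c -lcrypto */
#include <openssl/crypto.h>
#include <openssl/hmac.h>
#include <openssl/rand.h>
#include <stdio.h>

#define KEY     "not-a-real-key"      /* placeholder shared secret */
#define KEY_LEN (sizeof(KEY) - 1)

/* What the sensor would compute and send back. */
static void sensor_respond(const unsigned char *challenge, size_t len,
                           unsigned char *out, unsigned int *out_len)
{
    HMAC(EVP_sha256(), KEY, KEY_LEN, challenge, len, out, out_len);
}

int main(void)
{
    unsigned char challenge[16];
    if (RAND_bytes(challenge, sizeof challenge) != 1)
        return 1;

    /* Sensor side: compute the response to the controller's challenge. */
    unsigned char response[EVP_MAX_MD_SIZE];
    unsigned int response_len = 0;
    sensor_respond(challenge, sizeof challenge, response, &response_len);

    /* Controller side: recompute and compare in constant time. */
    unsigned char expected[EVP_MAX_MD_SIZE];
    unsigned int expected_len = 0;
    HMAC(EVP_sha256(), KEY, KEY_LEN, challenge, sizeof challenge,
         expected, &expected_len);

    if (response_len == expected_len &&
        CRYPTO_memcmp(response, expected, expected_len) == 0)
        puts("sensor authenticated");
    else
        puts("sensor rejected");

    return 0;
}
```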

  • Or maybe Kaspersky will engineer some tightly security-checked distribution of Linux/*BSD/Solaris?
    Kaspersky should also take into account solutions like L4 or Minix3. I fear that really writing an OS from scratch would be overkill.
  • Ice Cream Sandwich? Android?

  • This is all good in theory, but let's not forget WHY we have ended up here:

    The Customer.

    The Customer WANTED to have Windows-based servers; the customer wanted integration onto their business networks using Windows protocols and standards.

    DCS vendors for DECADES had their own OSes, from the PLC up to the HMI; granted, they were not secure, but they didn't need to be, as they were not externally accessible, nor could they run anything untoward.

    When the customer sees this new OS and can't get the data t

  • "After nearly three years of development, Invisible Things Labs has finally released Qubes 1.0, a Fedora 17-based Linux distribution that tries to be as secure as possible by isolating various applications in their own virtual machines using Xen. If one of the applications is compromised, the damage is isolated to the domain it's running in" link [linuxuser.co.uk]
  • They're writing it in C/C++. What could possibly go wrong? There are already embedded operating systems with reasonably good levels of assurance, such as LynxOS and QNX.


