Security

New Siemens SCADA Vulnerabilities Kept Secret, Says Schneier

From the article: "SCADA systems -- computer systems that control industrial processes -- are one of the ways a computer hack can directly affect the real world. Here, the fears multiply. It's not bad guys deleting your files, or getting your personal information and taking out credit cards in your name; it's bad guys spewing chemicals into the atmosphere and dumping raw sewage into waterways. It's Stuxnet: centrifuges spinning out of control and destroying themselves. Never mind how realistic the threat is, it's scarier." What worries Bruce Schneier most is that industry leader Siemens is keeping its SCADA vulnerabilities secret, at least in part due to pressure from the Department of Homeland Security.
  • Uh oh, this story looks exactly like this story [slashdot.org].
  • I find the idea of Iranian centrifuges spinning out of control and destroying themselves comforting rather than scary. It's a shame the same hasn't happened to Pakistan [dailymail.co.uk].
    • by smelch ( 1988698 )
      Yeah, well how would you like incubators for human babies to start spinning out of control and destroying themselves?

      I'm not so worried about what terrorists might do in a cyber attack, I'm worried about the trolls.
    • You all keep on pissing and moaning about Iranian nukes, while part of the new Saudi arms deal is to protect future Saudi nuclear ambitions... which, by the way, also involves Pakistan [google.com] (had to use google cache to get the whole article)

      And what did this clown [foreignpolicy.com] ever do to deserve all those medals?

    • by Anonymous Coward

      "spinning out of control and destroying themselves"

      The image the author creates is of a machine spinning at such velocity that it explodes in a shower of fragments. While that makes for great copy, it's hardly what happened. In reality, Stuxnet caused the affected centrifuges to alter their rotational speed by only a few percent, which resulted in lower output from the enrichment cascade. This result has several advantages over a "self-destructing" centrifuge: 1) a destroyed centrifuge is an obvious problem which would trigger immediate investigation ...

  • by AmiMoJo ( 196126 ) on Tuesday May 24, 2011 @08:16AM (#36226966) Homepage Journal

    Seems like Israel and the US are playing a dangerous game here. Say that Stuxnet caused an accident that released radioactive material into the environment...

    • by Lehk228 ( 705449 )
      the whole thing would have been denied and covered up instead of bragged about.
      • I think he meant in terms of danger to people's lives, not blowback.

        • by sjames ( 1099 )

          Government officials care only about their own lives and those of their friends and family. That's why we can have wars. Note that the draft exemptions are generally met by the family of anyone who is involved in mandating a draft.

    • Re: (Score:3, Interesting)

      by MRe_nl ( 306212 )

      The Japanese nuclear plant in Fukushima ran on Siemens computers that the Stuxnet worm was programmed to infect; in fact, the virus was found in Fukushima systems last year.
      Makes you wonder why the cooling system wasn't functioning. Maybe the tsunami caused failures which Stuxnet made the reactors unable to handle.
      There were failures at four other plants in Japan, and German and South African reactors shut down.
      Were they using Siemens systems as well?

      • Quit reading tinfoil-hat nonsense sites. Have you seen the 1970s systems that control those GE Mark I's? No virus exists for those old things.
      • by Yvanhoe ( 564877 )
        Of course you have sources for that?

        From what I understand, Stuxnet was targeting enrichment facilities, which are very different from what Fukushima is.
        • by Svartalf ( 2997 ) on Tuesday May 24, 2011 @09:39AM (#36227854) Homepage

          Stuxnet doesn't "target" anything other than Windows SCADA systems (which should cause concern when you see those three words together...), notably those from Seimens. Anywhere you've got one of those SCADA systems, you've got a possibility of Stuxnet. It's just that Iran was using them for their process control systems for the enrichment plant.

          • by dachshund ( 300733 ) on Tuesday May 24, 2011 @10:15AM (#36228306)

            Stuxnet doesn't "target" anything other than Windows SCADA systems (which should cause concern when you see those three words together...), notably those from Siemens. Anywhere you've got one of those SCADA systems, you've got a possibility of Stuxnet. It's just that Iran was using them for their process control systems for the enrichment plant.

            Stuxnet targets a Siemens centrifuge controller that's programmed by an (air-gapped) Windows machine. Unfortunately this same basic pattern repeats itself all over the place.

            For any given SCADA system --- regardless of manufacturer --- you're extremely likely to see it connected to a modern PC, typically a Windows machine. Even if the Windows machine is just running a terminal program, it's connected.

            What Stuxnet showed us is that these Windows boxes are a critical vulnerability, even if they're just an ingredient in the programming chain, even if the box is separated by an air gap. I'm sure Israel/US would have found a way to those centrifuge controllers, but without the Windows infection vector it would have been a whole hell of a lot more difficult.

            • Yes, they should have been using macs. They don't get viruses.
              • Yes, they should have been using macs. They don't get viruses.

                But in all seriousness, the question I would ask is: how many known USB drive infection and privilege escalation vulnerabilities can you download for a 1-year-unpatched* Mac right now? How many can you download for the same Windows machine? In each category, how many have already been weaponized? How many of these can be tied together with widespread malware vectors that will get them near the machine to be infected?

                Of course many of the same vulnerabilities ...

            • What? If it is connected to a Linux machine, they will program Stuxnet to work with Linux. Or are you implying that non-Windows machines are invulnerable?
          • by TubeSteak ( 669689 ) on Tuesday May 24, 2011 @11:50AM (#36229540) Journal

            Stuxnet doesn't "target" anything other than Windows SCADA systems (which should cause concern when you see those three words together...), notably those from Siemens.

            You might want to do a little more research on the matter.
            Stuxnet's code has been picked apart: the trojan was designed to infect SCADA systems, but only to attack very specific hardware configurations.

            Stuxnet's payload was designed to (1) spin the uranium centrifuges used by Iran at certain known-to-be-destructive RPMs,
            (2) lie to the monitoring software which was supposed to prevent out-of-bounds conditions and set off alarms if they occur,
            and (3) should 1 & 2 not ruin the centrifuges, go dormant and reawaken to try (1) and (2) again.

            Stuxnet is completely harmless unless you happen to attach the exact same hardware the Iranians had plugged into their SCADA controllers.
            Just to be very clear: Stuxnet's payload was specifically crafted to attack the known configuration of Iran's uranium centrifuging program.
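
            One defensive takeaway from point (2): the values the HMI reports can't be the only thing operators trust, because an independent measurement path makes that kind of lie detectable. A minimal Python sketch of such a cross-check; every name, number and tolerance here is made up for illustration, not taken from any real plant.

                # Compare what the control/HMI layer *reports* with an independently
                # measured value (e.g. a separate vibration/RPM sensor on its own wiring).
                def readings_agree(reported_rpm, independent_rpm, tolerance=0.02):
                    """True if the two readings agree within `tolerance` (fractional)."""
                    if independent_rpm == 0:
                        return reported_rpm == 0
                    return abs(reported_rpm - independent_rpm) / independent_rpm <= tolerance

                def audit(samples):
                    """samples: iterable of (timestamp, reported_rpm, independent_rpm)."""
                    for ts, reported, measured in samples:
                        if not readings_agree(reported, measured):
                            print(f"{ts}: mismatch - HMI reports {reported} rpm, sensor reads {measured} rpm")

                if __name__ == "__main__":
                    audit([
                        ("10:00", 1064, 1065),   # agrees
                        ("10:05", 1064, 1410),   # HMI says nominal while the machine overspeeds
                    ])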

          • Actually Stuxnet does have a target. Each Siemens system has a unique serial number. Stuxnet only manipulated systems with certain serial numbers, and it so happens that these serial numbers only existed in Siemens systems in Iran.
          • Most of the oil rigs in the North Sea and the land plants supporting them are programmed using Windows XP machines.
            Some of the rigs also have HMI systems (800xA by ABB) that run on Windows 2003 servers.

            There really are no modern control systems that do not have a Windows component... not if they have the feature set required by most customers.

            It isn't remotely perfect, but the options to avoid it are extremely limited.

    • by gl4ss ( 559668 )

      Negligible amounts, if any. The real reason it would have ended up in the atmosphere would have been faults by the engineers working at the plant. If you don't know how to build control systems, you shouldn't be building a nuke in the first place.

      Anyhow, SCADA networks, because of how practically all of them are designed, should be separated from untrusted networks anyway, and preferably all control should go through some bridge that wouldn't pass "wrong" things (a rough sketch follows); better yet, all control should pass through a human - but this ...
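
      A rough sketch of the kind of bridge meant above: a gateway that forwards only commands on an explicit allowlist and drops everything else. The "protocol" here is just a dict and the allowlist entries are invented; real traffic would be Modbus, DNP3, OPC or similar, and the filter would live on a hardened host (or behind a data diode) between the office and process networks.

          # Toy command-filtering bridge: forward only allowlisted (unit, operation) pairs.
          ALLOWED = {
              ("pump_station_3", "read_status"),
              ("pump_station_3", "set_flow"),   # writes permitted for this one unit only
          }

          def bridge(cmd, forward, log):
              """cmd: dict with 'unit' and 'op' keys. Forward if allowlisted, else drop and log."""
              key = (cmd.get("unit"), cmd.get("op"))
              if key in ALLOWED:
                  forward(cmd)
              else:
                  log(f"dropped command {key!r} from {cmd.get('source', 'unknown')}")

          if __name__ == "__main__":
              bridge({"unit": "pump_station_3", "op": "read_status"}, forward=print, log=print)
              bridge({"unit": "reactor_1", "op": "set_rods", "source": "office LAN"}, forward=print, log=print)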

    • If there was an accident surely all the danger would be in Iran. Why would Israel and the US be playing a dangerous game?

    • I'm sorry, but I must have missed the proof that the US or Israel was responsible for Stuxnet. But of course actual facts are old-fashioned and can be quite problematic when they don't support your world view.
    • ... I can see not publicizing vulnerabilities. We don't, for instance, want our military publicly posting our vulnerabilities, because they sure as anything aren't going to ask for public patches. Public disclosure only really works if someone in the public can help. On the other hand, if you are running legacy systems in any number of unknown locations, you can't apply the patches anyway.

      We always talk about how bad obfuscation is as a security vector. However, it is a vector. Knowledge of a thing ...

    • Jus in Bello.

      In the event that a cyber attack did cause collateral damage (unlikely, in this case, but maybe not for future ones), whoever is pressing the launch button had better be in uniform.

      Why? Military operations against legitimate targets are legitimate acts of war. The Laws of Armed Conflict (LOAC) are the legal basis for determining whether an act is a legitimate act of war or a war crime.

      This is why we don't prosecute fighter pilots for targeting a bus with a JDAM that is known to be carrying Al ...

      • by lennier ( 44736 )

        Civilian casualties are regrettable, but kinetic operations are not going to be shelved on that basis alone.

        Wow, that's cold.

        It's easy for you to say that when it's not your wife and children on that bus. If they were, you might have a different view on whether or not merely being a uniformed cog in an industrial death machine should allow "regrettable" murders to be shrugged off as if they were heavy rainfall.

        Funny thing is, I thought I was taught in high school that the "can't prosecute me, I was just following orders, sir" defence was smashed apart at Nuremberg. Apparently that wasn't the case?

  • by Anonymous Coward on Tuesday May 24, 2011 @08:16AM (#36226968)

    How do you think Reese's initially got chocolate in their peanut butter?

  • by Pecisk ( 688001 ) on Tuesday May 24, 2011 @08:19AM (#36226986)

    ...simply good old network security with hardened OSes (Linux, BSD, OS X), with all other services turned off, and firewalls and proxies with filtering won't do the trick?

    Who is running industrial systems with direct contact with the Internet anyway?

    • by Anonymous Coward

      ...simply good old network security with hardened OSes (Linux, BSD, OS X), with all other services turned off, and firewalls and proxies with filtering won't do the trick?

      Who is running industrial systems with direct contact with the Internet anyway?

      Stuxnet infected the Iranian nuclear plant through USB.

      • by Pecisk ( 688001 )

        And the excuse for sticking in USB drives without a security policy (no executing stuff from USB drives) is...? Still similarly stupid to connecting the boxes to the Internet.

        • by jimicus ( 737525 ) on Tuesday May 24, 2011 @08:50AM (#36227300)

          I'm not sure it would have done much good. The general consensus is that this was a case of a determined attacker with a lot of resources, not some nutter on the Internet with a copy of the latest Virus Generator Toolkit (TM).

          How much weight we should give that opinion is something I'm not going to discuss.

          In any case, you think a determined attacker is going to be put off by a small thing like that? Hell, if it boils down to it you either organise double agents to apply for jobs at the target site or you target someone who already works there with a brown envelope full of unmarked, non-sequential notes. The latter is high risk, but find the right person, someone who's in debt up to their eyeballs and has been keeping it from their family for some time perhaps, and away you go.

    • by wiredog ( 43288 )

      Many systems are remotely accessible, just not over the internet, and no one thought that heavy security would be needed. Even though those networks were getting compromised back in the 60's.

      Just pulling the cable when remote access isn't needed is a highly effective, and often neglected, security practice.

      • no one thought that heavy security would be needed

        Who is this imaginary "no one", that never thinks anything could go wrong? I and all my friends who grew up watching "War Games" are always thinking about how things could go wrong.

        It always seemed to me that what "no one" is thinking is not that bad things can happen, but "I'll put in just enough security so that the failure won't happen until after I retire."

        • by wiredog ( 43288 )

          Why would you need heavy security for something that is air-gapped? If the Bad Guys (TM) get physical access you've lost anyway! We didn't even require passwords for access, because the keyboard was locked in the control cabinet. The only time it was networked was when someone hooked up a modem so it could be remotely debugged or upgraded. After which the modem was disconnected.

      • Just pulling the cable when remote access isn't needed is a highly effective, and often neglected, security practice.

        I tried that, but my screen went black.

    • Imagine a power plant that takes little to no intervention throughout the year. At most the engineer(s) only need to make adjustments when changing out fuel rods or during an emergency. Now imagine the engineers that make these changes make $200k+ a year and your company has 10 such reactors. Introduce the internet and... profit!!!
    • by cshake ( 736412 )

      I'd imagine it would be because the company that makes the machines you're controlling only makes drivers and control software for its own special computer systems, which you have to buy from it. The advantage there is that if any part of the system goes wrong, from computer to end product, you have a single point of contact to get support from.
      I think the mistake that many people on /. make is thinking that everyone is a 'computer guy', where in reality the people running these computers just know how ...

    • by nnull ( 1148259 )
      Quite a lot of people lately. Management wants to see their production on their office computers. They're easy to network and of course easy to hack. Siemens is not the only one vulnerable here, *cough* Schneider *cough* Schweitzer *cough* ABB...
    • Who is running industrial systems with direct contact with the Internet anyway?

      Here's a thought, why does it matter? So far there has been only one demonstrated attack on a SCADA system, and that attack didn't use the internet as its vector.

      SCADA systems benefit greatly from being connected to the world, but not directly. There should be many tiers of security both virtual and physical. It is the physical security here that was lacking. The best airgap in the network doesn't help you if one of your underlings plugs an infected USB stick into a machine on the process control network.
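
      If removable media can't be banned outright, an allowlist of known device IDs is one cheap extra tier. A rough sketch for a Linux host; the sysfs paths are real, but the allowlisted IDs are placeholders, and a Windows shop would use Group Policy device-installation restrictions instead.

          import glob, os

          # Vendor:product IDs of the few USB sticks actually issued to staff (placeholders).
          ALLOWED_IDS = {"0781:5583", "0951:1666"}

          def connected_usb_ids():
              """Collect vendor:product IDs of currently connected USB devices from sysfs."""
              ids = set()
              for dev in glob.glob("/sys/bus/usb/devices/*"):
                  try:
                      with open(os.path.join(dev, "idVendor")) as v, \
                           open(os.path.join(dev, "idProduct")) as p:
                          ids.add(f"{v.read().strip()}:{p.read().strip()}")
                  except OSError:
                      continue  # hubs and interface entries lack these attributes
              return ids

          if __name__ == "__main__":
              for dev_id in sorted(connected_usb_ids() - ALLOWED_IDS):
                  print(f"unapproved USB device present: {dev_id}")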

  • I would leave the exposed SCADA interface as it is; after Stuxnet it should be clear that securing SCADA interfaces has to be done at a higher level, by putting them in a separate VPN, etc.
    Whether the vulnerabilities are public or not doesn't change the fact that a given setup is secure or insecure by design...

    • Now imagine the scenario where you have windows machines on the same network as your SCADA devices because the tools you've bought or built work this way. Someone attaches an unauthorized device to your network and fail, fail.

      Now, I think we can probably agree that you can and should take steps to prevent something like that from happening, but there is the issue of getting from point A, where your network is insecure, to point B, which requires at least buying or developing a whole bunch of new software. ...

      • Now imagine the scenario where you have windows machines on the same network as your SCADA devices because the tools you've bought or built work this way. Someone attaches an unauthorized device to your network and fail, fail.

        Aren't those development tools rather than run-time tools? If so, isolate your system and get serious about how you allow stuff to be moved over to it.

        • by Interfacer ( 560564 ) on Tuesday May 24, 2011 @09:10AM (#36227498)

          Not really. The process control is done on real-time controllers, but visualization is usually on Windows machines. Data historians, configuration databases, OPC servers, etc. are often Windows servers. Add to that the fact that hotfixes and service packs have to be vendor-approved before putting them on the live system. This means that those systems often run whatever was approved at the time of installation, which can be years out of date.

          Many SCADA and DCS systems are also horribly insecure, with default or hard-coded administrative passwords, etc. What doesn't help is that they are often managed by people who are good at the actual process stuff, but not necessarily at security or system administration.
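
          The "approved at install time, never since" problem is at least easy to measure. A small audit sketch; the CSV layout, column names and the one-year threshold are assumptions for illustration, not any vendor's format.

              import csv
              from datetime import date, datetime

              MAX_AGE_DAYS = 365  # arbitrary threshold for this sketch

              def stale_hosts(inventory_csv):
                  """Assumed columns: hostname, role, last_approved_patch (YYYY-MM-DD)."""
                  stale = []
                  with open(inventory_csv, newline="") as f:
                      for row in csv.DictReader(f):
                          patched = datetime.strptime(row["last_approved_patch"], "%Y-%m-%d").date()
                          age = (date.today() - patched).days
                          if age > MAX_AGE_DAYS:
                              stale.append((row["hostname"], row["role"], age))
                  return stale

              if __name__ == "__main__":
                  for host, role, age in stale_hosts("scada_inventory.csv"):
                      print(f"{host} ({role}): last approved patch set is {age} days old")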

  • What worries Bruce Schneier most is that industry leader Siemens is keeping its SCADA vulnerabilities secret

    If you want to prevent the bad guys from exploiting a vulnerability, then don't... um... tell them about the vulnerability? But do tell the affected parties about it.

    • by Anonymous Coward

      Yeah, security by obscurity. That works great. Keeps the world safe.

    • Re:Duh? (Score:5, Insightful)

      by nedlohs ( 1335013 ) on Tuesday May 24, 2011 @08:38AM (#36227196)

      or fix it, that works really well too.

    • Re:Duh? (Score:5, Insightful)

      by markus_baertschi ( 259069 ) <markus@@@markus...org> on Tuesday May 24, 2011 @08:45AM (#36227246)

      That is exactly what will not happen.

      The one who should be telling its customers about the problem is Siemens. But they will play the problem down because it might affect sales of the next batch of stuff.

      The evil hacker will just buy a bunch of systems, analyze them and find the vulnerabilities, completely independent of the disclosure. Stuxnet was developed before this disclosure and I think the vulnerabilities used by Stuxnet are still there.

      This is why security by obscurity does not work in the real world.

      • by Svartalf ( 2997 )

        The evil hacker will just buy a bunch of systems, analyze them and find the vulnerabilities, completely independent of the disclosure. Stuxnet was developed before this disclosure and I think the vulnerabilities used by Stuxnet are still there.

        This is why security by obscurity does not work in the real world.

        Most definitely. Comments about someone not being able to afford to buy the devices notwithstanding, it is very much what someone would do if they were to attack a system or come up with a new Stuxnet ...

        • And yet obscurity is a valuable tool in security. Absolutely it should not be the only tool - but discarding it completely is like saying we should discard firewalls because firewalls can't stop all attack vectors. They are all tools, and none is sufficient security on its own - but many different tools used in conjunction can make for a formidable defense.
      • That is exactly what will not happen.

        The one who should be telling its customers about the problem is Siemens. But they will play the problem down because it might affect sales of the next batch of stuff.

        Speaking of the real world, this is something that doesn't happen with a typical control systems vendor. Your typical vendor releases pages of errata, knows what you have purchased, and typically comes running in with a fix, be it hardware or software, or a temporary workaround while they come up with the fix.

        This has happened to us on many occasions. In one instance, a certain series of commands issued on the display graphic would cause an alarm manager to stop responding.

    • If you want to prevent the bad guys from exploiting a vulnerability, then don't... um... tell them about the vulnerability? But do tell the affected parties about it.

      I think nuclear power plants and the like warrant something a bit more than security through obscurity...

    • Except that the bad guys have already been made aware of the vulnerability, since Stuxnet is out there for anyone to analyze. Do you think the Iranians have not been picking apart Stuxnet and trying to figure out how they can use it?
    • by Svartalf ( 2997 )

      That's specifically not what they're doing...telling the affected people about it. They're keeping that information to themselves- because it might reveal the exploits in question. As for not disclosing because the bad-guys might figure it out...heh...keep fooling yourselves folks. The bad-guys almost always KNOW about them- it's why they call 'em "0-dayz".

  • by Anonymous Coward on Tuesday May 24, 2011 @08:29AM (#36227094)

    Actually it's probably the CIA, NSA and other TLAs that truly want the security holes. They're just using the DHS as the mouthpiece to convince the companies to keep quiet and not plug the holes. After all, without those holes, Stuxnet (and likely other worms/viruses/trojans) wouldn't be as effective as they apparently have been.

    • by fuzzyfuzzyfungus ( 1223518 ) on Tuesday May 24, 2011 @09:00AM (#36227408) Journal
      I'm not so sure: Obviously, assorted sinister TLAs are happy to exploit available holes; but all but the really stupid ones have to realize that they don't exactly live in a unipolar world when it comes to writing viruses, and that the US (and its assorted western buddies) have a lot to lose in an atmosphere of general SCADA-smashing.

      If all SCADA systems become deeply vulnerable, who loses more? Industrial or post-industrial societies with high levels of complexity that could be on the edge of collapse with a few days of supply chain disruption, or the dusty low-GDP countries of the world where disenfranchised hackers and cheap laptops (and/or exploits provided by friendly powers using them as proxies) are still easily available?
      • There's another layer here. Having vulnerabilities allows TLAs to do sneaky things, empowers them, and helps them do their job. And you're right, those things cut both ways. But, having them cut both ways ALSO empowers the TLAs. They all got a big budget boost in the aught years. Security was suddenly a really important thing.

        So what the TLAs want isn't always their intended purpose.
    • by jonwil ( 467024 )

      If the CIA etc. really wanted to infect these things, why wouldn't they just infect the machines at the factory and then use a front company to sell them to Iran or whoever on the cheap?

      I remember seeing a JAG episode once where the spooks deliberately allowed some bad guys to steal an F-14 and extract its control software (knowing that the software was to be given to Iran for an upgrade of its F-14s and knowing that the software was deliberately defective) and I see no reason it couldn't happen in the real world.

      • by Jaysyn ( 203771 )

        Didn't the CIA do something like this to the USSR concerning oil pipeline control software?

  • by Aladrin ( 926209 ) on Tuesday May 24, 2011 @08:36AM (#36227168)

    Last I checked, 'responsible disclosure' meant giving the company time to fix the vulnerabilities before you released the info to the public.

    Am I missing the part where we've gone beyond that point?

    • Responsible disclosure meant giving the company time to fix the vulnerabilities before you released the info to the public.
      So the companies stalled and never fixed the vulnerabilities, tried to sue the researchers, etc.
      Responsible disclosure came to mean giving the company a set amount of time to fix the vulnerabilities before you released the info to the public.
      So the companies kept threatening researchers, called the grace period "extortion" and such, stalled for more time to "test", and didn't fix the vulnerabilities ...
  • Sounds like exactly the sort of thing Wikileaks exists for.

  • A hole or bug's lifetime is forever. If you find a bug or a hole and you choose to ignore it, it will not go away. It will be there, waiting for its moment to ruin your morning. Maybe bugs and holes are not as important as people in the security-racket industry think, so if you can't fix them in the morning you can fix them after tea, as long as you fix them today.

  • Open Secret (Score:5, Informative)

    by adavies42 ( 746183 ) on Tuesday May 24, 2011 @09:11AM (#36227526)

    I did my master's thesis on SCADA security. tl;dr: there isn't any. We're talking about an industry that uses unencrypted radio links in their control systems....

    • by Anonymous Coward
      Believe me, I know all about it, as I'm a guy who designs and puts those links in. I've been making a big push with the uppers for more secure links and it is starting to get in now: licensed bands with AES, etc. Previously it would all be modpacs or Canopy links. We're starting to move to licensed stuff with AES now, but damn near everything in this area was already done like this. Anyone could pretty much get on the network with the right know-how. Most of the stuff is the water control systems for the city ...
      • by dargaud ( 518470 )
        I do control/command systems as well and there are many reasons why there's little if any security. One of them is that in the (rare) case of a reboot, you want the system back online automatically as quickly as possible. You don't want to wait until 9am when the first employee who knows the password can type it in... hence no passwords. Note: I'm NOT saying it's a good thing!
    • by Svartalf ( 2997 )

      Heh... They're "thinking" about using crypto on things like the radio links. They're "concerned" about things like "latency" (here's a hint: if you're worried about injecting 1-2 characters' worth of transmission time delay at 9600 baud, you're doing it wrong), so the industry's been reticent about trying to at least lock down some aspects of the remote links. The biggest problem is the downtime of some systems, in addition to the overall expense, while they retrofit to higher data rates, end-to-end ...
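
      The latency point is easy to put numbers on: at 9600 baud with 8N1 framing a character costs about a millisecond, so even a generous 16 bytes of per-message crypto overhead is under 20 ms, which a telemetry poll every few seconds will never notice. A quick back-of-envelope check:

          # What does crypto overhead cost on a 9600 baud serial link?
          BAUD = 9600
          BITS_PER_CHAR = 10                              # 8 data bits + start + stop (8N1)

          ms_per_char = 1000.0 / (BAUD / BITS_PER_CHAR)   # ~1.04 ms per character

          for overhead_chars in (1, 2, 16):               # 16 = a full padded cipher block
              print(f"{overhead_chars:2d} extra chars -> {overhead_chars * ms_per_char:5.1f} ms per message")
          # prints roughly 1.0 ms, 2.1 ms and 16.7 ms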

    • by Anonymous Coward

      Wait a second. Unencrypted *radio* links? Seriously? I sure hope it's for controlling the ice cream and chocolate factories of the world and not something important.

    • Master's Thesis on SCADA Sec? Really? Published anywhere?

      SCADA security isn't. I'm sorry but it's true. And the entire "security industry" is talking just like all the slashtards commenting.

      Doing security right in this environment is non-trivial. The SCADA/ICS vendor community isn't providing it because SCADA/ICS customers aren't asking for it. The downside of course is that the SCADA/ICS customer is NOT the individual who is going to suffer when the screwups happen. The SCADA/ICS vendors and customers have ...

      • Master's Thesis on SCADA Sec? Really?

        Yes, the cite would be something like

        Davies, Aaron G. "A Toolkit for Intrusion Detection in a SCADA Environment." MA thesis. University of Louisville, 2005.

        Published anywhere?

        Astonishingly, yes, at least if you count ProQuest [umi.com]. Not that I'd bother reading it (or at least anything but the background material) if I were you--it was basically about hooking up a SCADA emulator to Snort and an alert correlator to make a testbed you could deploy potential attacks against to see if your filter configuration worked. I have no idea if ...
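
        For anyone wondering what "an alert correlator" amounts to here: at its simplest it just groups alerts that share a source within a time window, so one probing host becomes one incident instead of hundreds of log lines. A toy sketch; the alert tuple format is assumed for illustration and is not Snort's actual output format.

            from collections import defaultdict

            WINDOW_SECONDS = 300  # arbitrary correlation window

            def correlate(alerts):
                """alerts: iterable of (timestamp_seconds, source_ip, signature) tuples.
                Returns {source_ip: [(window_start, [signatures, ...]), ...]}."""
                incidents = defaultdict(list)
                for ts, src, sig in sorted(alerts):
                    buckets = incidents[src]
                    if buckets and ts - buckets[-1][0] <= WINDOW_SECONDS:
                        buckets[-1][1].append(sig)
                    else:
                        buckets.append((ts, [sig]))
                return incidents

            if __name__ == "__main__":
                demo = [(0, "10.0.0.5", "modbus write to coil"),
                        (40, "10.0.0.5", "modbus write to coil"),
                        (900, "10.0.0.5", "dnp3 cold restart")]
                for src, buckets in correlate(demo).items():
                    for start, sigs in buckets:
                        print(f"{src}: {len(sigs)} alert(s) starting at t={start}s: {sigs}")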

    • by rabtech ( 223758 )

      Long ago, I worked as an IT admin for a grocery company that owned its own bakery, ice cream and drink plants. The "industrial control systems" I saw in use were the worst-engineered pieces of junk I've ever encountered. I am talking unpatched Windows 95 systems running a crappy VB 4 UI that talked to a poorly written VxD to control the ice cream mixer, which was a massive piece of equipment that could easily kill someone standing too close to it.

      I just got one of those TI $4 embedded development kits and ...

      • From what I hear from my friends (one of whom used to program for defense embedded systems), it is all like that. Terrible platforms, terrible code, security through obscurity (if that), etc.

        There's no practical way to defend the embedded system from the device which programs it. So while it's true that the tools used to program embedded systems are often primitive, that has little to do with attacking them.

        The fact that the video signals from some of our drones are broadcast unencrypted over the air?

        IIRC, the ...

    • As the person somewhat responsible for managing encryption keys for wireless telemetry and wireless process control systems on my site, I call bullshit. Either that or you did your thesis in the '80s, old-timer.

  • "SCADA systems -- computer systems that control industrial processes -- are one of the ways a computer hack can directly affect the real world".

    Only if you connect the SCADA systems directly to the Internet and run them on top of Windows [wired.com], instead of running them behind a secure VPN connection on embedded hardware.

    • by PPH ( 736903 )

      The problem with such systems is that they can be 'infected' through the programming platform. In the case of Stuxnet [wikipedia.org] it was the PCs used to program the PLCs that were infected. And one of the vectors of infection was the use of infected USB flash drives on these (Windows) systems. Programming PLCs is often done through a direct cable connection, so while keeping industrial control systems off 'The Internet' may be a good idea, it isn't sufficient to prevent such an attack.

    • by geekoid ( 135745 )

      These types of attacks can happen regardless of the OS.

      These are specifically targeted attacks; they mainly take advantage of human trust.

      And yes, the SCADA systems need to be isolated, and then they need to only allow access from specific machines.

      OTOH, that level of security is expensive, and for some reason everyone thinks infrastructure programs don't cost money and all money the government gets is 'waste'.

  • there should also be strict government oversight to ensure the vulnerabilities are being fixed.

    • there should also be strict government oversight to ensure the vulnerabilities are being fixed.

      ... And that the fixes don't make it to other governments. See: VUPEN's alleged Chrome exploit.

      VUPEN released a video of the exploit in action to demonstrate a drive-by download attack that successfully launches the calculator app without any user action.

      The exploit shown in this video is one of the most sophisticated codes we have seen and created so far as it bypasses all security features including ASLR/DEP/Sandbox (and without exploiting a Windows kernel vulnerability), it is silent (no crash after executing the payload), it relies on undisclosed (0day) vulnerabilities discovered by VUPEN and it works on all Windows systems (32-bit and x64).

      VUPEN, which sells vulnerability and exploit information to business and government customers, does not plan to provide technical details of the attack to anyone, including Google.

      Guess it depends on who you think the "bad guys" are. I say, show the world and let the good 'n malicious duke it out -- hint: Bug fixes are often easier to code than full exploits.

  • Someone has been watching too much 24 Season 7.
  • Hi, I’m Dirk Gebert, system manager for security for Siemens Industrial Automation Systems. I’m on the team working on the topic mentioned in this article. We are posting updates on this website: http://www.siemens.com/industrialsecurity [siemens.com]. Let me know if you have questions that are not answered there.
