Do Complex Systems Require Higher Safety Standards From Managers and Engineers? (techcrunch.com) 137

An anonymous reader quotes TechCrunch: Automotive emissions, nuclear power plants, airplanes, application platforms, and electrical grids all share one thing in common: they are very complex, highly coupled systems... Engineers have matched some of this growing complexity with more sophisticated tools, mostly derived from greater computing power and better modeling. But there are limits to how far the technical tools can help here, given the limits of organizational behavior around complexity in these systems. Even if engineers are (potentially) acquiring more sophisticated tools, management itself most definitely is not.... One pattern that binds all of these engineering disasters together is that they all had whistleblowers who were aware of the looming danger before it happened. Someone, somewhere knew what was about to transpire, and couldn't hit the red button to stop the line...

Engineering managers probably have the most challenging role, since they need to sell both upwards and downwards within an organization in order to maintain safety standards. The pattern that I have gleaned from reading many reports on disasters over the years indicates that most safety breakdowns start right here. The eng manager starts to prioritize business concerns from their leadership over the safety of their own product. Resisting these pecuniary impulses is not enough -- safety has to be the watchword for everyone...

Finally, for individual contributors and employees, the key is to always be observant, to be thinking about safety and security while conducting engineering work, and to bring up any concerns early and often. Safety requires tenacity. And if the organization you are working for is sufficiently corrupt, then frankly, it might be incumbent on you to pull that proverbial red button and whistleblow to stop the madness.... [T]he demise of the ethical engineer doesn't have to be a fait accompli.

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday April 28, 2019 @07:41PM (#58506982)
    You will see a very sudden interest in the culture of safety.
    • PE powers and more unions!

    • by sysrammer ( 446839 ) on Sunday April 28, 2019 @07:50PM (#58507006) Homepage

      You will see a very sudden interest in the culture of safety.

      Unfortunately, many of the most "accomplished" executives are masters at deflecting blame. A scapegoat will get fired for cause, and the responsible personnel will maybe possibly perhaps "retire in a few months to spend time pursuing other families" or whatever. Snugly strapped into their golden parachute, ofc.

      • by evanh ( 627108 ) on Monday April 29, 2019 @01:12AM (#58507976)

        I've experienced, as an electrical worker, the reaction execs have when they suddenly become, by law, always at fault for "preventable" accidents. "Follow the book" suddenly becomes the only solution. If there isn't one, then write it. And if you tell them it'll cost 10x as much that way, they just shrug, because they know every other company is in the same boat.

        Of course, that may also trigger them into looking for another country to move the factory to.

        • And if you tell them it'll cost 10x as much that way, they just shrug because they know every other company is in the same boat.
          Of course, that may also trigger them into looking for another country to move the factory to.

          That's why we have standards, right? If you demand they follow the standard no matter where they go, then they will only move if they can find trained labor that will work for less than what it will cost to ship the product back to the market. The standard has to require inspection, of course, to ensure that it's being followed.

            By standards, do you mean "PCI compliance", "SOX compliance", "security policies", and "corporate mandates"? Because for many industries, including airlines and medical safety, many of these standards are not followed. They are merely checklists to note compliance, not actually used to guide practices and policies that might interfere with profit, with quarterly reports of goals met, or that might yield turf to other departments. The bureaucracy of the organization itself becomes the goal of the company.

            I'v

      • "Pursuing other families"
        I think this is what sometimes actually happens.

  • Quoting Betteridge.

  • by Registered Coward v2 ( 447531 ) on Sunday April 28, 2019 @07:54PM (#58507022)
    The article raises some very valid points. However, in focusing on management vs. engineers and safety vs. profits as the issues, it misses another big challenge: assuming something is safe because nothing has happened (yet). The author alludes to that but doesn't really flesh out the argument. Engineers test something repeatedly, and because it doesn't fail they assume the safety limit is higher than it really is. Richard Feynman, in his appendix to the Rogers Commission Report, makes the case clearly and eloquently about that danger.
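
    As a back-of-the-envelope illustration of that trap (a minimal sketch, not from the post): zero failures in n tests only puts a weak upper bound on the true failure rate, which is the statistical core of Feynman's argument.

      # Sketch (not from the post): what "no failures in n tests" really tells you.
      # If the true per-test failure probability is p, the chance of seeing zero
      # failures in n independent tests is (1 - p)**n. Setting (1 - p)**n = 0.05
      # and solving for p gives a 95% upper confidence bound on the failure rate.

      def failure_rate_upper_bound(n_tests: int, confidence: float = 0.95) -> float:
          """Upper bound on per-test failure probability after zero failures."""
          return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)

      for n in (10, 100, 1000):
          print(f"{n:5d} clean tests -> failure rate may still be ~{failure_rate_upper_bound(n):.2%}")

      # 100 clean tests are still consistent with a ~3% per-test failure rate --
      # nowhere near the 1-in-100,000 figure Shuttle management claimed.
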
  • Neural networks (Score:5, Insightful)

    by Anonymous Coward on Sunday April 28, 2019 @07:54PM (#58507026)

    This seems like a big problem for automated systems built around algorithms like neural networks. It isn't practical to examine a neural network of reasonable complexity and try to extract information about why it produces the outputs it does. Training is fitting a complex mathematical function to the training data set. Due to the complexity of the neural network (or another algorithm of similar complexity), it seems very difficult to be confident that the mathematical functions haven't been fit in such a way that the inputs at certain points won't lead to undesired behavior. If these algorithms are automating systems like aircraft or self-driving cars, that seems like a big problem. The complexity of the system allows for better performance, but the inability to extract meaningful information about why the system behaves the way it does would seem to be a big problem. I recognize that simpler techniques like decision trees might not provide the skill of a neural network, but at least one can be more confident that they won't behave in an unexpected manner.
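
    For a toy view of that interpretability gap (a minimal sketch, assuming scikit-learn is available): a small decision tree can be dumped as explicit rules and audited line by line, which a trained network does not offer.

      # Sketch (assumes scikit-learn): a decision tree's learned logic can be
      # printed and reviewed as explicit if/else rules; a neural network gives
      # a safety reviewer no comparable artifact to audit.
      from sklearn.datasets import load_iris
      from sklearn.tree import DecisionTreeClassifier, export_text

      data = load_iris()
      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(data.data, data.target)

      # Every decision the model can ever make is visible here:
      print(export_text(tree, feature_names=list(data.feature_names)))
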

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      This seems like a big problem for automated systems built around algorithms like neural networks.

      I believe the oft quoted answer is you don't necessarily rigorously validate the AI/neural network/whatever. You validate the watchdog / verifier bit, and try to prove that it does limit the worst cases within the limits that are required, or that the solution found actually works and meets some definition of correct.

      For instance, maybe an AI finds a good path from A to B while avoiding all obstacles, and getting a good reward score internally for however it figured out the path. Perhaps the AI is on a ro
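
      A minimal sketch of that watchdog pattern (hypothetical names and geometry, not from the post): the path proposer can stay opaque, because only the small checker has to be rigorously validated.

        # Sketch of the watchdog/verifier pattern (hypothetical example): the
        # path *proposer* may be an opaque neural network; only this small,
        # auditable *checker* needs rigorous validation.

        Point = tuple[float, float]

        def path_is_safe(path: list[Point], obstacles: set[Point],
                         goal: Point, tol: float = 0.5) -> bool:
            """Accept a proposed path only if it ends at the goal and never
            comes within `tol` of a known obstacle."""
            if not path:
                return False
            gx, gy = goal
            ex, ey = path[-1]
            if (ex - gx) ** 2 + (ey - gy) ** 2 > tol ** 2:
                return False  # never actually reaches the goal
            for px, py in path:
                for ox, oy in obstacles:
                    if (px - ox) ** 2 + (py - oy) ** 2 < tol ** 2:
                        return False  # clips an obstacle
            return True

        def plan(opaque_planner, start: Point, goal: Point, obstacles: set[Point]):
            path = opaque_planner(start, goal)       # unverified, arbitrarily complex
            if path_is_safe(path, obstacles, goal):  # verified, a page of plain logic
                return path
            return [start]                           # safe default: halt in place

        # Demo with a trivial stand-in "planner":
        print(plan(lambda s, g: [s, g], (0.0, 0.0), (5.0, 5.0), {(2.0, 2.0)}))
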

    • by AmiMoJo ( 196126 )

      Safe systems don't rely on unpredictable control systems.

      For example, Japanese high speed rail is the safest in the world. Drivers are not allowed to use their own initiative to deal with problems. When there is a fault or unusual situation, they must open the operating manual and follow the instructions precisely.

      Their network was the first high-speed rail network in the world, opening in 1964, and has never had a single passenger fatality or serious injury from a derailment or collision.

      For safe systems AI will be limited to doing things like image

  • by 0100010001010011 ( 652467 ) on Sunday April 28, 2019 @07:55PM (#58507028)

    ISO 26262, DO-178C, IEC 61508, etc, etc.

    The problem with American aerospace is they figured out how to game the requirements process so that no one is culpable. At this point I would trust VW's safety record (because the spotlight is already on them) over Boeing's (because they played fast and loose).

    • At this point I would trust VW's safety record (because the spotlight is already on them) over Boeing's (because they played fast and loose).

      VW has a tolerably but not amazingly good safety record. Several of the brands under their umbrella have had design failure-related unattended vehicle fires and the like. I would expect them to be true to form.

  • This is beyond fake news, it's full on joke news.

  • by mschaffer ( 97223 ) on Sunday April 28, 2019 @08:05PM (#58507090)

    Many complex systems require greater safety standards when compared to systems that cannot significantly harm people. That is why many of these systems are already subject to safety standards. For example, consider the NFPA standards that are commonly adopted in the USA and regulate electrical wiring, fuel storage, building safety, and many, many more everyday items. There are hundreds and hundreds of other ISO, ANSI, FM standards (to name a few) that also cover safety and standard tests.

    Now consider programmers and developers. What standards are used (other than, perhaps, the standards that define various versions of computer languages) to put together very complex systems?

    Also consider that there is no such thing as a PE for Software Engineering even though many people's lives depend on software working properly.

    • by goombah99 ( 560566 ) on Sunday April 28, 2019 @08:34PM (#58507216)

      Engineering is precisely the science and training required to manage complex engineered systems. That's exactly what engineering is, period.

      Computer science is unrelated to engineering. Possibly one might study the science of computer engineering, but that's going all meta on this and is a distraction from the point.

      Computer scientists should not be allowed to manage complex safety systems unless they are qualified engineers.

      Engineers learn to work in teams where they manage complexity across interfaces and scales.

      If you think GitHub takes care of that, or that you only need to work on your own part of the tool and a git merge or an apt-get build will take care of the integration, then you are not an engineer.

      So yes, people with engineering credentials can manage safety.

      Doesn't mean they will do it right. For example, the poster child for engineering safety across every scale used to be Boeing. I've always been awestruck that they can build machines that fly when no one person knows where every nut or bolt goes and how it was certified to be a sufficient fastener. That's not a made-up example: many early Boeing crashes in World War 2 were attributable to things like a nut coming loose and rattling its way into an electrical junction box. Engineering practice incorporates continuous safety improvements as complex systems (structural, like a nut on a door) interact with electronic subsystems whose respective engineers don't even work in the same building.

      The tragedy of Boeing is that their last three planes have been engineering duds. Something has happened to that company.

      • Computer science is unrelated to engineering.

        I guess it all depends on how you view it and how you define Computer Science. The engineering aspect is very primitive right now, but it does exist, even if code monkeys never put into practice any engineering principles.

    • by Keick ( 252453 )

      What standards are used (other than, perhaps, the standards that define various versions of computer languages) to put together very complex systems?

      As far as software standards go, there are many, but only a few that are ubiquitous across the aerospace and automotive fields. MISRA started as a safety standard for coding automotive applications, but is widely used as a starting point in aero as well. DO-178B is what you're going to follow, and test to, if you do any aerospace development.

      IEEE 12207 is generally the process you're going to follow for developing large applications for aero or auto, but you'll still use MISRA and DO-178B as needed.

      If your hardco

  • by Anonymous Coward

    A priority on safety means nothing will work. The safest plane is one that stays on the ground, immobile.

    • Very myopic point of view. Planes that only stay on the ground are not planes. If the hard constraint is that the plane needs to fly, not providing this is not meeting the specification.

      Incidentally, most aviation mishaps occur when a plane is touching the ground or rapidly becomes immobile.

      • Incidentally, most aviation mishaps occur when a plane is touching the ground or rapidly becomes immobile.

        I vote that we remove the component causing the failure. If we get rid of the ground, we should eliminate a pretty high percentage of failures.

      • Kind of like saying it's not the fall that kills you but the sudden stop. Though the worst aviation disaster (which hopefully will never be exceeded) was two planes colliding on the ground.

    • The safest plane is one that stays on the ground, immobile.

      It isn't. I made one like that once and a truck crashed into it. That couldn't have happened if it was safely in the sky.

    • Hardly. Acceptable margins of safety are the first priority, not the only one. Engineering is the art of meeting all the other necessary priorities (functionality, cost, etc) without compromising the first priority.

      Nothing will ever be perfectly safe, that must be the accepted starting point of any conversation around safety. But that doesn't mean it's acceptable to throw away acceptable safety margins to get on with the job. "What percentage of users is it acceptable to injure or kill, and with what fr

  • by weilawei ( 897823 ) on Sunday April 28, 2019 @08:15PM (#58507144)

    Put together a team to do nothing but rip holes in designs. Separate chain of command, with the power to independently veto and halt processes. Make their pay based on how well they find errors, not on how few errors come up (i.e., properly oriented incentives).

    We used to call them testers and QA.

    • Apparently that violates the agile commandments.

      Saw a thread on stackexchange about it while looking for something else. Those guys were like religious zealots.

    • by Gim Tom ( 716904 )
    I did QA on systems written and used by our large organization for the first 25 years of my career. I AM a graduate engineer from the 1960's and we HAD to take courses in engineering ethics back then. When management decided to move to "enterprise solutions" to replace the in-house ones, there were some pretty major failures that would never have happened with engineers and programmers who actually understood what was really needed and what the issues would be if some of those needs were not met.

      I m
      • " I AM a graduate engineer from the 1960's and we HAD to take courses in engineering ethics back then."

        I am a graduate engineer from the 1960s as well. I went to the U of Arizona and there was no mention of engineering ethics that I recall. Could have used it: a few years after graduating, I was fired because I would not certify the MTBF of an isolation thermocouple module for a nuclear plant. (First look at the schematics found 400 mW on a 250 mW resistor.)

        I wonder what schools had courses in engineer

        • by Gim Tom ( 716904 )
          I graduated from Georgia Tech in 1970 -- where I started in Aeronautical Engineering, but changed majors after my sophomore year (for several reasons) to Industrial and Systems Engineering. I can't remember whether that course was part of the AE or ISYE courses, but I found the book we used when I cleaned out my late parents' house a few years ago. A lot of it had to do with contracts and contracting, but there was a good bit on general ethics. I ended up migrating into computers early after college a
  • Safety costs money. Safety costs profits. Investors are not going to be shoveling money into useless things like safety and liability. They want profit. The raison d'être of these limited liability corporations is to allow profits to flow one way while liabilities are stopped. These are the lakes of liability in the rivers of commerce created by the LLCs.

    Don't even think of creating laws to ensure safety. They are job killing regulations and they have no place in the modern free market governed by the

  • by JBMcB ( 73720 ) on Sunday April 28, 2019 @08:45PM (#58507256)

    I have a pal who writes code for medical imaging machines. Each bug he fixes, he creates a new *trunk* of the codebase, adjusts or writes new unit tests, checks the integration tests, fixes the bug, runs the whole unit and integration suite against his change, writes out an explanation of what he did and why, along with failure mode and impact analyses, gets it code reviewed with the whole team, merges back with the trunk, then re-runs all the unit and integration tests *again*. That's the process for a simple bug.

    So, yeah, mission critical stuff has a more strict development process.

    • *branch* I meant, but he branches the entire codebase.
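
      The "pin the bug down with a test, fix it, re-run everything" core of that loop might look like this (a hypothetical sketch, assuming pytest; the bug and names are invented, and the real process adds review, traceability, and failure-mode analysis around it):

        # Hypothetical sketch of the regression-test step (assumes pytest; the
        # bug and names are invented). The real process wraps this in code
        # review, traceability records, and failure-mode/impact analysis.

        def rescale_intensity(pixel: int, bits: int = 16) -> float:
            """Map a raw detector value into [0.0, 1.0]."""
            max_val = (1 << bits) - 1             # the bug used `1 << bits` here,
            return min(pixel, max_val) / max_val  # which never quite reached 1.0

        def test_saturated_pixel_maps_to_exactly_one():
            # Written *before* the fix, reproducing the field report: a saturated
            # 16-bit pixel must map to exactly 1.0, not ~0.99998.
            assert rescale_intensity(65535) == 1.0

        def test_dark_pixel_maps_to_zero():
            assert rescale_intensity(0) == 0.0
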

  • These days, most safety standards, as well as other standards, are not specific to any one industry, and can be applied to organizations of any size and complexity. They apply equally well to the manufacture of pins, airliner production, or a Tokamak fusion reactor.

    Unfortunately, failure to meet safety standards may or may not be illegal, depending on the law, rules and regulations governing a specific industry. If metal shards are found in food, there is a recall. If there are more than 50 reports of fault

  • by BobC ( 101861 ) on Sunday April 28, 2019 @10:59PM (#58507676)

    Yes, management and tools are always important factors. However, it is fundamental to the very concept of being a "professional" engineer to take complete responsibility for one's own work product. It is not permissible to blame others for giving you bad specifications or a bad design or bad tools if you didn't even try to do the due diligence needed to assess and properly perform your own assigned tasks.

    I have built aircraft and spacecraft avionics. I have built nuclear reactor instrumentation and control systems. I have built automated X-ray and neutron inspection systems, some of which were for munitions plants where one mistake could kill dozens or hundreds.

    I refuse to take assignments where I don't know and understand where and how my piece fits into the whole. I insist on being fully informed, and being given enough time and information to come to my own informed judgments about the project as a whole and the tasks at hand.

    I own my work. I also take full responsibility for any and all errors I ship. This attitude and commitment become evident in two main areas:

    1. The requirements and specifications. Do they make sense? Do they cover all the needs of the product? Do they fully take into account outside regulations and standards? Do they have any gaps? Most importantly, is each and every requirement explicitly testable? Are there ways and methods in place to ensure every requirement is being fully met?

    2. Testing. Not only is this needed to ensure requirements are met, but also to ensure the product always works as intended. I test things until they break, not to find weakness in the product, but to ensure strength and completeness both in my testing and in the product definition itself.
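
    As a toy illustration of "explicitly testable" (a minimal sketch; the requirement and its numbers are invented): a requirement stated with hard limits maps one-to-one onto an automated check, so "is it met?" never becomes a matter of opinion.

      # Toy illustration (requirement and numbers invented): hard limits in the
      # requirement map one-to-one onto an automated check.
      #
      # REQ-042: "Dose-rate readings shall update at least every 200 ms and
      #           shall never read more than 2% below the calibrated reference."
      from dataclasses import dataclass

      @dataclass
      class Reading:
          t_ms: int      # timestamp, milliseconds
          value: float   # measured dose rate

      def check_req_042(readings: list[Reading], reference: float) -> list[str]:
          """Return violations of REQ-042; an empty list means it is met."""
          violations = []
          for a, b in zip(readings, readings[1:]):
              if b.t_ms - a.t_ms > 200:
                  violations.append(f"update gap of {b.t_ms - a.t_ms} ms at t={b.t_ms}")
          for r in readings:
              if r.value < 0.98 * reference:
                  violations.append(f"reading {r.value} below tolerance at t={r.t_ms}")
          return violations

      # Example: a 250 ms gap and a low reading both get flagged.
      log = [Reading(0, 10.0), Reading(150, 10.1), Reading(400, 9.5)]
      print(check_req_042(log, reference=10.0))
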

    Any engineer who thinks they can work with their blinders on, seeking the minimum context needed to get the job done, focusing on the schedule more than thoroughness and quality, is not any kind of engineer in my book. They are droids, working mechanically and blindly, never seeking nor seeing the bigger picture. Droids, not engineers.

    They are the folks who help get people killed. No excuses.

    I have left employers who tried to keep me in the dark, sometimes with large negative financial impacts. I refuse to have non-engineers try to tell me what engineering is.

    Unfortunately, this extends to engineering professors who have never engineered or delivered a safety-critical system. It extends to universities that don't include Requirements Analysis and Design of Experiments in the engineering curriculum. Most importantly, education never exposes students to unsafe situations to illustrate the extreme need for safety, and the correct ways to pursue it. At best, students may see a risky chemistry demo, but that's about it.

    Where did I learn it, if not in college? I served in the US Navy before college (to get the GI Bill so I could afford to go), and I saw first-hand what it means to have one's daily job be around extremely hazardous systems, with the need to always practice safety in all its forms, to maintain those systems to maximize safety, even when the maintenance itself has risks. And to be trained and prepared when it all goes wrong, when people get hurt, when systems get destroyed, when the fires burn.

    I have seen what happens when safety fails. There is no better education than the real world. Which, unfortunately, to most engineers today is almost a virtual world, an abstraction. It is hard to connect a code editing session to hundreds dead in a crash. Yet that line is present, direct and clear.

    I have a silver ring that looks much like a wedding band, but is worn on a different finger. It is based on the Canadian Iron Ring ceremony (https://en.wikipedia.org/wiki/Iron_Ring). Please read the link. Wearing that ring is a visible reminder to both myself and others that failure to always pursue excellence can cost innocent lives.

    It sometimes means standing up to management when situations arise that I believe need

    • Walking away from the project doesn't stop it going ahead. Someone eventually caves to management's request to do it cheaply.

      If management doesn't allow the engineers to spend on big up-front safety costs, then it doesn't happen. Management can only blame the workers if they can first prove they weren't taking shortcuts themselves.

      This applies equally to production too. If a reported failing process is allowed to continue in production just to meet targets, it's not the workers at fault.

      The execs must be to

    • > However, it is fundamental to the very concept of being a "professional" engineer to take complete responsibility for one's own work product.

      This is often not permitted. For security in a large environment, the tasks are often deliberately fragmented in order to expand the size of the group and limit responsibility, and authority, to increasingly small areas and avoid the risks of a single developer or engineer being a master of the entire system. Network firewalls are in distinct hands from applicati

    • I refuse to take assignments where I don't know and understand where and how my piece fits into the whole. I insist on being fully informed, and being given enough time and information to come to my own informed judgments about the project as a whole and the tasks at hand.

      It is awesome that you take your responsibility seriously. It is rare to see that, especially with medical doctors... but we are discussing engineering.

      The thing is, while we can demand that other engineers have your scruples and morals, the only way to find out that someone is "faking it" is for something to fail. Most things won't fail, so we have a surfeit of engineers who do NOT have the same scruples and morals as you.

      How do you intend to solve that problem? It is true, you have accurately assigned bla

    • Hear! Hear! @BobC So well put. As engineers, we are responsible for the things we build. Full stop. We are creating a new generation of developers whose "products" will have engineering-level consequences. In my experience, they are largely unaware and unconcerned.
  • Engineering managers probably have the most challenging role, since they need to sell both upwards and downwards within an organization in order to maintain safety standards. The pattern that I have gleaned from reading many reports on disasters over the years indicates that most safety breakdowns start right here. The eng manager starts to prioritize business concerns from their leadership over the safety of their own product. Resisting these pecuniary impulses is not enough -- safety has to be the watchword for everyone...

    The real problem is that safety is not an explicit business goal that sits right alongside all the other goals for the project at the top level, from day one. If we dare to think that safety is just something an engineering manager needs to remind his/her reports about often enough, the project is already set up to be inherently very risky.

  • "All had whistleblowers" doesn't mean much if we don't know how many safe-enough systems were also whistleblown upon. Can we reasonably review everything all the whistleblowers have whistleblown, without halting the whole economy?

"The medium is the massage." -- Crazy Nigel

Working...