WHY USE ELIMINATIVE ARGUMENTATION

Over the last few years, CSL has embraced Eliminative Argumentation (EA) as a notation for describing system assurance cases and, more importantly, as a mode of thinking about overall system assurance.

EA was proposed by Goodenough and Weinstock as an extension to Goal Structuring Notation (GSN), the conventional method used to express assurance cases [1]. Their objective was to create a method to improve “confidence” in an argument by systematically eliminating reasons to doubt the argument’s validity.

In this blog post, we will argue that EA is both a natural means of arguing about system safety and a route to more comprehensive and useful safety arguments.

A TEASPOON OF DOUBT
As engineers, we are trained to question the validity of claims about the systems we design. Any engineer with a deep understanding of their system will be able to rattle off a handful of reasons why a system might not deliver its stated functionality.

Despite what marketing campaigns might have you believe, the reality is that safety (and security) engineering is a messy business. At CSL we are suspicious of assurance cases that do not have some “warts” or reasons to doubt their validity, and we are not alone in our doubt. For example, it is all but axiomatic among software engineers that “defect-free” software is an impossibility!

Engineers’ doubts should not be swept under the rug. These doubts should be embraced, and the goal should be to express them as clearly as possible! Look no further than the infamous Challenger disaster to see the consequences of a lack of clear communication about engineering doubts. The good news is that EA explicitly permits (and indeed encourages) the expression of doubt.

EA is based on the principle that “when you have eliminated the impossible, whatever remains, however improbable, must be the truth” (Sir Arthur Conan Doyle). More concretely, EA is founded on the premise that one can systematically express, and then subsequently refute, doubts about the validity of a claim. If doubts are not adequately addressed, they become “residual” and should be subject to further consideration. Maybe you need to change the design to address a doubt, or perhaps it is a fundamental risk that you must accept. If it is the latter, then you have a nice tidy list of “residual doubts” that you can show to stakeholders and ask, “do we accept the risk?”

There is a bit more to EA than making a laundry list of doubts. In fact, EA provides a framework for structuring arguments around doubt. See more in this report published by the SEI [1].
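
To make the idea of recording and tracking doubts concrete, here is a minimal sketch (in Python, purely illustrative and not the SEI’s EA notation itself) of how a claim, the doubts raised against it, and the resulting residual doubts might be captured. The claim text, doubt wording, and evidence identifier are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Doubt:
    """A reason to doubt a claim; if nothing eliminates it, it is residual."""
    description: str
    eliminated_by: Optional[str] = None  # e.g. a test report or analysis


@dataclass
class Claim:
    """A claim about the system together with the doubts raised against it."""
    statement: str
    doubts: List[Doubt] = field(default_factory=list)

    def residual_doubts(self) -> List[Doubt]:
        return [d for d in self.doubts if d.eliminated_by is None]


# Hypothetical example: one doubt is eliminated by evidence, the other remains residual.
claim = Claim(
    statement="The braking subsystem commands full braking within 50 ms of a detected fault",
    doubts=[
        Doubt("Timing analysis ignores worst-case bus load",
              eliminated_by="Bus-load stress test report TR-042"),
        Doubt("Fault injection did not cover sensor power loss"),
    ],
)

for d in claim.residual_doubts():
    print("Residual doubt to put before stakeholders:", d.description)
```

The point of the sketch is simply the bookkeeping: every doubt is either eliminated by identified evidence or surfaces on the list of residual doubts that stakeholders must consciously accept.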

Of course, too much of a good thing can be a problem. Endlessly doubting oneself might adversely affect a project (and possibly your health). There is certainly a sweet spot where all the critical doubts have been identified. As with many aspects of engineering, this becomes a matter of judgement and experience.

DOUBT AND CONFIRMATION BIAS
Confirmation bias is a serious concern when developing a safety case. When you set out to “prove” that a system is safe, it is often tempting to take the path of least resistance and not question the claims made.

The fatal Nimrod crash in 2006 is a notable example of how confirmation bias can adversely affect a safety case. Charles Haddon-Cave’s “Nimrod Review” provides the following assessment: “the Nimrod Safety Case [was] fatally undermined by an assumption by all the organisations and individuals involved that the Nimrod was ‘safe anyway’, because the Nimrod fleet had successfully flown for 30 years, and they were merely documenting something which they already knew. … The Nimrod Safety Case became essentially a paperwork and ‘tickbox’ exercise” [2].

Hindsight is always 20/20, and it would be hubris to presume that EA would have been the “silver bullet” that prevented this accident. But it is worth reflecting on how the deliberate and careful consideration of doubts (which were almost certainly in the minds of these engineers) might have changed the outcome of the Nimrod safety case.

ONE STEP AHEAD
In addition to providing a medium for the expression and consideration of doubt, we find that EA helps anticipate the questions that arise during certification audits. Safety and security auditors are driven by doubt: they demand evidence that supports the claims being made and addresses that doubt. In our experience, EA helps engineering teams prepare the right evidence for an auditor.

Moreover, using an EA-style argument structures the evidence and ensures that it is well organized and ready to go. This avoids the often uncomfortable and stressful “scramble” to collect evidence in response to an auditor’s queries. Not only does this save time (and reduce stress), it also demonstrates to an auditor that the project’s safety and security programs were run diligently, which helps increase confidence that risk has been carefully managed.

CONCLUSION
To date, CSL has employed EA in a number of projects across the automotive, rail, and industrial control domains. We have found that careful treatment of engineering doubts is a natural way to reduce confirmation bias and to increase confidence in an assurance case. From an organizational perspective, EA ensures teams are well prepared for a safety audit and helps to reduce certification risk.

Overall, we see EA as the next step in the evolution of safety and security assurance cases and a key tool in our toolbox for handling increasingly complex systems.

For more details on EA, please see the forthcoming CSL paper in the proceedings of IEEE SysCon 2020 [3], which provides a worked example and concrete ‘lessons learned’ from our industrial work using EA.

REFERENCES
[1] J. B. Goodenough, C. B. Weinstock and A. Z. Klein, "Eliminative Argumentation: A Basis for Arguing Confidence in System Properties," Software Engineering Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania, 2015.
[2] C. Haddon-Cave, "The Nimrod Review," The Stationery Office, London, UK, 2009.
[3] S. Diemert and J. Joyce, "Eliminative Argumentation for Arguing System Safety - A Practitioner's Experience," in IEEE SysCon, Montreal, 2020.

 
