Archive | July, 2012

DO-178C Level C Clarification

25 Jul

DO-178C Level C Clarification – A Synopsis excerpt from Vance Hilderman’s private technical research.

“There has been much controversy, in fact more than a few arguments, in the normally staid avionics engineering community about DO-178C’s alleged requirement to greatly increase the amount of testing required for Level C (and even Level B) software. Some have even said that DO-178C will require performing MCDC coverage analysis on Level C software. Is it true? No: DO-178C does not require MCDC coverage for Level C software. However, a closer look at the facts is necessary to understand that DO-178C will force more detailed requirements for Level C (and Level A and B) software; more test cases of those requirements will therefore be necessary, which emulates a portion of the added testing normally associated with MCDC. A closer analysis of the facts is warranted …

First, it should be noted that this topic is partially addressed within the book “Avionics Certification – A Complete Guide To DO-178 & DO-254,” which has sold thousands of copies (all royalties are donated to charity directly by the publisher, Avionics Communications Inc., the world’s largest publisher of avionics-related technical materials). Vance Hilderman, the principal author of that book, states:

“The DO-178 book chapter on this topic is limited in scope and, like all books, simply cannot be ‘all things to all people’. The book is meant as an overview, as it was generally written from the DO-178 training I developed during my 2004–2005 sabbatical. MCDC (Modified Condition/Decision Coverage) is a formally required objective for DO-178C’s Level A software. In an all-too-brief synopsis, MCDC structural coverage attempts to affirm that each source code statement has been verified to ensure that each condition within that statement properly, and independently, affects the outcome of that statement (with the intent to correlate such conditions to the underlying requirements). More details are provided in my DO-178C Verification Whitepaper, but here is an extract:

As an example, consider the following logic:

Park_Status = (Engines_Off && Brakes_Set) || (Weight_On_Wheels || Parked_Discrete);

Clearly, the True/False value of Park_Status is a function of four conditions:

Engines_Off

Brakes_Set

Weight_On_Wheels

Parked_Discrete

Each of the above four conditions can independently affect the value of Park_Status. Level A requires MCDC coverage, meaning a minimum of N+1 test cases, where N equals the number of conditions. There are sixteen possible test cases (2**4 = 16) for the above line of code, and Level A requires execution and analysis of at least five of them (N+1, with N = 4).
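To make this concrete, here is a minimal sketch of one valid MCDC test set for the expression above, written as a self-checking C harness. The park_status() helper and the vector labels T1–T5 are hypothetical illustrations, not taken from the original paper, and this five-vector set is only one of several that would satisfy the criterion:

#include <assert.h>
#include <stdbool.h>

/* Hypothetical stand-in for the flight logic discussed above. */
static bool park_status(bool engines_off, bool brakes_set,
                        bool weight_on_wheels, bool parked_discrete)
{
    return (engines_off && brakes_set) || (weight_on_wheels || parked_discrete);
}

int main(void)
{
    /* One valid MCDC set: N + 1 = 5 of the 16 possible vectors.
       T1/T2 toggle only Engines_Off, T1/T3 only Brakes_Set,
       T2/T4 only Weight_On_Wheels, T2/T5 only Parked_Discrete;
       in each pair the outcome flips, demonstrating that the toggled
       condition independently affects the result. */
    assert(park_status(true,  true,  false, false) == true );  /* T1 */
    assert(park_status(false, true,  false, false) == false);  /* T2 */
    assert(park_status(true,  false, false, false) == false);  /* T3 */
    assert(park_status(false, true,  true,  false) == true );  /* T4 */
    assert(park_status(false, true,  false, true ) == true );  /* T5 */
    return 0;
}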

Now, Level C merely requires statement coverage, meaning each statement must be executed. Therefore, it could be implied that only one of the sixteen possible test cases requires execution for Level C. So, which is correct: does Level C require just one test case for the above line of code? Two? Three? Four? Or five? Can you simply decide for yourself? The answer requires a true knowledge of the intent of DO-178.

First, DO-178C Level C requires both high- and low-level requirements, and verification of those requirements, with the intent that such low-level requirements cover normal operating conditions and algorithms. Both high- and low-level requirements must be written, uniquely identified, traced, and verified (reviewed and tested). The above logic block is clearly a normal operating condition expressed via an algorithm. Ideally, then, there should have been low-level requirements (note the plural form) describing the above logic prior to actually writing that logic. Note that I use the term “logic” instead of software, since realistically the implementation could have been either via software (executing in a CPU or microcontroller) or via firmware in silicon (more properly termed Complex Electronic Hardware (CEH), implemented via an FPGA, PLD, or ASIC, for example).

Under DO-178B, the objective was to ensure all major logic blocks were associated with requirements, typically low-level requirements. But did DO-178B formally require such? No; however, thorough and good engineers took DO-178B’s intent to heart and wrote more detailed requirements than were minimally necessary, then performed more functional (requirements-based) testing than was minimally necessary. Those thorough and good engineers are rewarded under DO-178C by having very little additional work to do. However, those doing the least amount of work allowable under DO-178B will find themselves needing more detailed requirements, and thus more test cases, under DO-178C than they previously were able to get away with under DO-178B …

For the above example, the high-level requirement could state “The capability shall be provided to determine the plane’s parking status based upon the status of engines, brakes, weight-on-wheels, and parking mode.” A corresponding low-level requirement might be “Parking Status shall be set to Parked when the engines are off and the brakes are set.” Another associated low-level requirement might be “Parking Status shall be set to Parked when both the plane’s Parked discrete is set to Parked and its weight-on-wheels status is positive.” (Note: for extra credit, please find the error in the logic, based upon the aforementioned requirements … and THAT is why you must do code reviews against the traced requirements, and test reviews against the traced requirements!)

If the low-level requirements were not completely elucidated like the above, or if the software verifier missed the intent of such low-level requirements, then test cases could potentially be missing on a Level C project. In that case, any combination of the four conditions which yielded a Park_Status of True would satisfy the DO-178C Level C structural coverage objective. And such could be rationalized under the seemingly innocent claim that “this isn’t a life-critical Level A or B system, since it’s designated Level C.” And there’s the paradox: by fully verifying properly elucidated Level C requirements, many or all of the MCDC test cases normally reserved for Level A would be executed via Level C. And that is a good thing. But it is simply not required. Remember: DO-178C has five distinct criticality levels, even though everyone recognizes that systems, and the aircraft, would be safer if we simply developed all logic to Level A standards. The five different criticality levels are provided simply because “not all systems are created equal” in terms of contribution to flight safety.
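For contrast, here is a minimal, hypothetical sketch of what the Level C statement-coverage objective alone demands here (the park_status() helper is restated from the earlier sketch, and the particular vector is an arbitrary choice): a single test case that executes the assignment statement satisfies the objective, while fifteen of the sixteen combinations, and every MCDC independence pair, go unexamined.

#include <assert.h>
#include <stdbool.h>

/* park_status() restated from the earlier hypothetical sketch. */
static bool park_status(bool e, bool b, bool w, bool p)
{
    return (e && b) || (w || p);
}

int main(void)
{
    /* One executed vector covers the statement, which is all that
       statement coverage demands; Weight_On_Wheels alone yields True
       here, masking the effect of the other three conditions. */
    assert(park_status(false, false, true, false) == true);
    return 0;
}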

Level C systems or components are not as critical to flight safety as their Level A or B counterparts. Should your requirements standard and verification plan require such stringent verification for Level C? “I personally believe so,” states Vance Hilderman. Does DO-178C mandate such? “Not specifically; however, ‘good practices’ imply such,” explains Hilderman. “It should be noted that DO-248 makes a modest (though universally acknowledged as ‘weak’) attempt to further explain this; however, any avionics veteran knows that DO-248 applies to DO-178B, not DO-178C.” Will Level C software be more reliable under DO-178C than under DO-178B? Level C software will have more detailed requirements under DO-178C than most developers employed for DO-178B; thus there will be more test cases required for Level C software under DO-178C than there were for DO-178B. However, Mr. Hilderman notes, “Testing software does not directly improve its quality; rather, testing assesses software quality, and it is that assessment which can be used to improve quality. Also, the avionics system’s Design Assurance Level (DAL) plays a large role in system reliability: Level B software is 100 times more reliable than Level C, due to the architecture of the system (usually mandating redundancy for Level B to achieve the requisite 1 x 10^-7 reliability factor versus Level C’s 1 x 10^-5 value).”

<This section deleted from this general-distribution synopsis.>

This is but one of many “gray areas” which DO-178C helped to illuminate but still left open for interpretation within individual projects. Projects adhere to DO-178C criteria in many different ways, and the certification authorities or oversight entities have the responsibility to clarify such areas on a case-by-case basis for each project. For more details, contact Vance Hilderman directly.

DO-178C Core Changes Synopsis

25 Jul

<Excerpted from Vance Hilderman’s private research>

 

DO-178C Changes To Core Body: MCDC, Requirement Detail, and Traceability, by Vance Hilderman

Everyone knows that DO-178C adds significant, useful information and clarification regarding formal methods, tool qualification, object-oriented programming, and model-based development; this greatly anticipated information is placed in all-new annexes appended to the new DO-178C.

What is less appreciated, however, are the seemingly subtle changes to the core body text of DO-178C itself, as opposed to the significant additions represented by the Supplements. Specifically, DO-178A and DO-178B were notoriously “flexible” regarding the level of detail specified within the requisite software requirements. This flexibility meant that less rigorous software requirement specifications contained less-than-desirable low-level requirement detail; the resultant requirements ambiguities led to increased software errors and/or increased structural coverage analysis associated with uncovered structures, particularly at Level A, which requires statement, DC, and MCDC coverage. Therefore, DO-178C tightens the requirement to verify MCDC cases during the software functional testing of requirements: software test case development should consider low-level perturbations of conditions, thereby providing near-complete MCDC coverage at the requirement level. Also, DO-178B implied the requirement for closed-loop traceability, to ensure both that all requirements were covered within design/code/tests (“top-to-bottom” coverage) and that only the specified requirements were implemented (“bottom-to-top” coverage). DO-178C clarifies this, specifically requiring top-to-bottom and bottom-to-top traceability to remove any ambiguity.
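As a rough illustration of the closed-loop idea (the identifiers, data layout, and checker below are hypothetical; real projects hold the trace matrix in a requirements-management tool, not in code), a sketch of a bidirectional traceability check might look like this:

#include <stdio.h>
#include <string.h>

/* Hypothetical trace matrix linking low-level requirements to the test
   cases that verify them. "Top-to-bottom" coverage: every requirement
   appears in the matrix. "Bottom-to-top" coverage: every matrix entry
   traces back to a known requirement (no unrequired functionality). */
struct trace { const char *req; const char *test; };

static const char *reqs[] = { "LLR-001", "LLR-002", "LLR-003" };
static const struct trace matrix[] = {
    { "LLR-001", "TC-010" },
    { "LLR-002", "TC-011" },
    { "LLR-004", "TC-012" },  /* traces to an unknown requirement */
};

int main(void)
{
    const size_t nr = sizeof reqs / sizeof reqs[0];
    const size_t nm = sizeof matrix / sizeof matrix[0];

    /* Top-to-bottom: flag requirements with no verifying test case. */
    for (size_t i = 0; i < nr; i++) {
        int covered = 0;
        for (size_t j = 0; j < nm; j++)
            if (strcmp(matrix[j].req, reqs[i]) == 0) covered = 1;
        if (!covered) printf("Top-to-bottom gap: %s has no test\n", reqs[i]);
    }

    /* Bottom-to-top: flag test cases tracing to no known requirement. */
    for (size_t j = 0; j < nm; j++) {
        int known = 0;
        for (size_t i = 0; i < nr; i++)
            if (strcmp(matrix[j].req, reqs[i]) == 0) known = 1;
        if (!known) printf("Bottom-to-top gap: %s traces to unknown %s\n",
                           matrix[j].test, matrix[j].req);
    }
    return 0;
}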

Quality Assurance within DO-178 And DO-254

25 Jul

<extracted from private research by Vance Hilderman>

Quality Assurance (“QA”) is arguably the most critical aspect of avionics software and hardware certification within DO-178C and DO-254. However, QA is almost never given the attention, or credit, befitting its crucial role. Consider the following statements and assess whether they are true or false; answers and explanations are provided within this paper:

1. QA’s most important role is assessing final product quality (?)
2. QA personnel perform technical reviews (?)
3. QA personnel assess avionics development engineers’ adherence to criteria specified in DO-178C and DO-254 (?)
4. The four Stage Of Involvement (“SOI”) events represent the four QA audits of the avionics development process (?)
5. Every Level A and Level B avionics development project requires at least three persons, and the most crucial of those is the QA person (?)

 

Easy questions? Not at first glance; in fact, answering the above without understanding the overall DO-178C and DO-254 framework is like understanding Fourier transforms without first understanding The Calculus: impossible for mere mortals.

 

 

In DO-178 and DO-254, QA has two primary responsibilities:

1. Ensuring the project-specific development plans and standards comply with DO-178/254, and
2. Assessing, then ensuring, that the separate development organization has followed those plans, from project inception through completed delivery.

 

In other, non-avionics development activities, “quality assurance” implies a more adjunct, and more passive, measurement role. Wikipedia, for example, aptly states QA “is the systematic measurement, comparison with a standard, monitoring of processes and an associated feedback loop that confers error prevention.” The weakness of such a traditional, yet common, QA interpretation within safety-critical systems is obvious:

• Who ensures those “standards” are correct and compliant to ARP 4754A, DO-178C, and DO-254?
• Who ensures those “processes” are deterministic, repeatable, clearly defined, and compliant to ARP 4754A, DO-178C, and DO-254?
• Who is responsible for ensuring proof exists that errors were appropriately dispositioned and closed?

 

In avionics, the answers to the above are simple: “Quality Assurance”.

DO-178 and DO-254 are somewhat “flexible” regarding the manner in which QA processes are defined, scheduled, and performed. The Quality Assurance Plan, typically authored by, and always signed by, QA, is one of five project-specific plans which define how a project intends to meet the applicable DO-178 and DO-254 objectives. While flexible in process, DO-178 and DO-254 are not pushovers: QA must ensure the following objectives are met:

• All requisite development plans and standards are developed per DO-178/254 and are then followed (including by suppliers)
• Transition criteria are satisfied
• Conformity Reviews are performed
• Audits are performed to ensure data exists which affirms the above

 

Upon first glance, the above list of avionics QA objectives seems easy, almost obvious. In fact, in other, non-avionics development environments, “quality assurance” appears to embody similar objectives: assess product implementation to measure and improve quality. Virtually every consumer electronics product manufactured today has some form of basic quality assurance performed upon it.

 

However, closer examination and understanding of DO-178/254’s QA framework reveals more proactive and robust QA guidance. In avionics, there are five required plans for every safety-related airborne system. While all are important (and the FAA/EASA wisely decline to state which plan or objectives are most important), avionics certification experts generally agree that the order of importance of the five required plans is:

1. Certification Plan
2. Quality Assurance Plan
3. Configuration Management Plan
4. Development Plan
5. Verification Plan

 

The second most important plan is thus the QA Plan, which must describe the QA processes that ensure each system:

• has a complete set of plans/standards which embody 100% of all applicable DO-178/254 objectives
• has mechanisms to assess whether the actual processes used throughout development comply with those plans

<Extracted from Vance Hilderman’s primary paper; see main paper for remaining content>