Automated Generation of Test Artifacts and Traceability for a Safety-Critical, Distributed Avionics Platform

Author(s)
Christian Block (University of Stuttgart, Germany) [email protected]
Serkan Dikmen (University of Stuttgart, Germany) [email protected]
Reinhard Reichel (University of Stuttgart, Germany) [email protected]

Abstract

A large share of the high effort and cost of developing and certifying safety-critical, distributed avionics systems, such as fly-by-wire systems, is consumed by testing. Hence, automation not only of test execution but also of test artifact generation promises significant cost and effort reductions. In this paper we present an approach that enables this. The approach is one of three automation subprocesses that together form the AAA process. This model-driven process is designed in conjunction with the Flexible Avionics Platform, developed at the Institute of Aircraft Systems at the University of Stuttgart in cooperation with Aviotech GmbH. The Flexible Avionics Platform enables the highly automated instantiation of integrated, distributed, safety-critical systems via the first subprocess of the AAA process. The second subprocess generates the corresponding requirements, which are the input to the third subprocess for test artifact generation. This third subprocess automatically generates the verification specifications at the system and software high level, as required by aeronautical standards such as ARP4754A and DO-178C, as well as test scripts that enable the automated execution of all tests on a hardware-in-the-loop test system. To this end, a suitable set of test vectors is determined for each requirement. Test cases are then generated via simulation of the system behavior. Based on these, test artifacts are instantiated and traceability between them and the requirements is added. We are currently validating the Flexible Avionics Platform and the AAA process in the context of an in-flight demonstrator for unmanned flight in non-segregated airspace.

Keywords—System testing, Model-driven development, Automatic testing
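The pipeline sketched in the abstract (per-requirement test vectors → behavioral simulation → test cases with traceability links) can be illustrated as follows. This is a minimal, hypothetical sketch: all class names, the trivial threshold "system model" in simulate(), and the requirement/case ID scheme are illustrative assumptions, not the actual Flexible Avionics Platform or AAA process implementation.

```python
# Hypothetical sketch of the third AAA subprocess: derive test cases from a
# requirement's test vectors via simulation, and attach traceability links.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str

@dataclass
class TestCase:
    case_id: str
    inputs: dict
    expected: dict
    traces_to: list = field(default_factory=list)  # requirement IDs this case verifies

def simulate(inputs: dict) -> dict:
    # Stand-in for the platform's behavioral simulation: a trivial monitor
    # that flags a sensor value exceeding its limit.
    return {"fault_flag": inputs["sensor_value"] > 100.0}

def generate_test_cases(req: Requirement, test_vectors: list) -> list:
    # One test case per vector; the expected outputs come from simulation,
    # and each case carries a trace link back to its originating requirement.
    cases = []
    for i, vector in enumerate(test_vectors, start=1):
        cases.append(TestCase(
            case_id=f"{req.req_id}-TC{i:02d}",
            inputs=vector,
            expected=simulate(vector),
            traces_to=[req.req_id],
        ))
    return cases

# Illustrative requirement and boundary test vectors around the limit.
req = Requirement("SYS-REQ-042", "The monitor shall flag sensor values above 100.")
vectors = [{"sensor_value": 99.9}, {"sensor_value": 100.1}]
cases = generate_test_cases(req, vectors)
for c in cases:
    print(c.case_id, c.inputs, c.expected, c.traces_to)
```

In a real instantiation the simulation would exercise the distributed system model and the generated cases would be emitted as verification specifications and hardware-in-the-loop test scripts; the point here is only the shape of the data flow and the traceability linkage.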