Author :
Shannon, Russell ; Modi, Mukund ; Stanco, Joe
Author_Institution :
Naval Air Syst. Command, Lakehurst, NJ, USA
Abstract :
Over the past two decades, the US Department of Defense (DoD) has seen the introduction of weapon systems that do not meet their diagnostic requirements when initially fielded; some suffer false alarm rates over eighty percent [1]. Over a product's lifecycle, the ability to determine how well it performs depends on the capability to test it and to evaluate the results of those tests. Different tests and evaluations are required at different points in the lifecycle, and those tests must be governed by standards that define the parameters, techniques, and procedures for measurements so that information is communicated accurately and precisely. To accomplish this, the development of quality tests capable of supporting these activities must be defined and documented for the design, production, and operations-and-support phases of a system's lifecycle. As a product proceeds through its lifecycle, the information collected in each phase must be used to support subsequent phases. Demonstrations of avionics system and subsystem diagnostic capability are performed before a system or subsystem is verified; this ordinarily happens during the System Design and Demonstration phase of a program. In the case of aircraft or ground vehicles, there are several subsystem demonstrations, followed by a single system-level event. A major issue plaguing the development of aircraft avionics systems is the lack of standardized methods to demonstrate full testability in a scientific and efficient manner before an aircraft is fielded, usually due to budgetary, schedule, and knowledge constraints. Shannon and Knecht [1] surveyed diagnostic managers in government and industry regarding the current state of test on major aircraft acquisition programs. The authors reported that there was "agreement that current guidance was insufficient to prevent erroneous, incomplete or insufficient testing at the system or subsystem level" before an aircraft was fielded. Each phase of the acquisition process can be decomposed into a set of processes that can be grouped together, and these groups can be further decomposed until there is sufficient information to completely describe the process. Using this technique, a hierarchical set of data items and their associated processes can be defined. From this hierarchy, the processes involved in testing can be identified and reused in activities that have multiple applications. In this paper, the authors present the process flows associated with the development and testing of a system and identify where those processes typically break down "in the real world", due to constraints such as budget, schedule, lack of training or guidance for those involved, and other factors. The authors then recommend the use of existing standards at appropriate "break points" and suggest where new industry-wide standards need to be developed where none currently exist.
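The hierarchical decomposition of acquisition phases into processes and data items described above can be pictured with a minimal sketch. The node structure, field names, and the collect_test_processes helper below are hypothetical illustrations and are not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical illustration: each acquisition phase is decomposed into groups
# of processes, each carrying the data items needed to describe it completely.
@dataclass
class ProcessNode:
    name: str
    data_items: List[str] = field(default_factory=list)  # information produced or consumed
    is_test: bool = False                                 # marks test-related processes
    children: List["ProcessNode"] = field(default_factory=list)

def collect_test_processes(node: ProcessNode) -> List[ProcessNode]:
    """Walk the hierarchy and gather every process flagged as a test activity."""
    found = [node] if node.is_test else []
    for child in node.children:
        found.extend(collect_test_processes(child))
    return found

# Entirely illustrative example hierarchy (not from the paper):
sdd_phase = ProcessNode(
    name="System Design and Demonstration",
    children=[
        ProcessNode(
            name="Subsystem diagnostic demonstration",
            data_items=["fault coverage report", "false alarm rate"],
            is_test=True,
        ),
        ProcessNode(
            name="System-level demonstration",
            data_items=["integrated test results"],
            is_test=True,
        ),
        ProcessNode(name="Design review", data_items=["design baseline"]),
    ],
)

for proc in collect_test_processes(sdd_phase):
    print(proc.name, proc.data_items)
```

Once the test-related processes are isolated in this way, they can be mapped to the standards (existing or proposed) that should govern them at each "break point" in the flow.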
Keywords :
automatic testing; defence industry; equipment evaluation; military equipment; military standards; standardisation; weapons; US Department of Defense; diagnostic requirements; industry-wide standards; optimum test; process flow; product lifecycle; quality tests; standardization; weapon systems; Aircraft; Electric breakdown; IEEE standards; Maintenance engineering; Testing; XML; ATS Framework; diagnostics; prognostics; testability