Title :
Test-suite reduction for model based tests: effects on test quality and implications for testing
Author :
Heimdahl, Mats P. E.; George, Devaraj
Author_Institution :
Dept. of Comput. Sci. & Eng., Minnesota Univ., Minneapolis, MN, USA
Abstract :
Model checking techniques can be successfully employed to generate test cases from formal models. The number of test cases produced, however, is typically large for complex coverage criteria such as MCDC. Test-suite reduction can provide a smaller set of test cases that preserves the original coverage, often a dramatically smaller set. One potential drawback of test-suite reduction is that it might degrade the quality of the test-suite in terms of fault finding. Previous empirical studies provide conflicting evidence on this issue. To further investigate the problem and determine its effect when testing formal models of software, we performed an experiment using a large case study of a flight guidance system: we generated reduced test-suites for a variety of structural coverage criteria while preserving coverage, and recorded their fault finding effectiveness. Our results show that the size of specification-based test-suites can be dramatically reduced and that the fault detection ability of the reduced test-suites is adversely affected. In this report we describe our experiment, analyze the results, and discuss the implications for testing based on formal specifications.
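The coverage-preserving reduction the abstract describes is commonly implemented as a greedy set-cover heuristic: repeatedly keep the test that satisfies the most still-uncovered obligations. A minimal sketch of that idea (the test names and coverage obligations below are made up for illustration; this is not the authors' tool):

```python
def reduce_suite(coverage):
    """Greedy test-suite reduction.

    coverage: dict mapping test name -> set of coverage obligations it satisfies.
    Returns a subset of tests whose union of obligations equals the full suite's.
    """
    remaining = set().union(*coverage.values())  # obligations still uncovered
    reduced = []
    while remaining:
        # Pick the test that covers the most still-uncovered obligations.
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        gained = coverage[best] & remaining
        if not gained:
            break  # no test adds coverage; should not happen with valid input
        reduced.append(best)
        remaining -= gained
    return reduced

# Hypothetical suite: four tests over five coverage obligations c1..c5.
suite = {
    "t1": {"c1", "c2", "c3"},
    "t2": {"c1", "c4"},
    "t3": {"c2", "c3"},
    "t4": {"c4", "c5"},
}
print(reduce_suite(suite))  # -> ['t1', 't4'] (covers all of c1..c5 with 2 tests)
```

Note that the reduced suite preserves coverage by construction, but, as the paper's experiment shows, discarding the "redundant" tests (here t2 and t3) can reduce fault-finding effectiveness.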
Keywords :
conformance testing; fault tolerant computing; formal specification; program testing; program verification; MCDC; automated test generation; complex coverage criteria; fault detection; flight guidance system; formal model testing; formal specification; model based testing; model checking; software testing; specification based test-suites; specification-based testing; structural coverage criteria; test case generation technique; test quality; test-suite reduction; Automatic testing; Computer science; DC generators; Electronic mail; Fault detection; NASA; Performance evaluation; Software performance; Software testing; System testing;
Conference_Title :
Proceedings of the 19th International Conference on Automated Software Engineering (ASE 2004)
Print_ISBN :
0-7695-2131-2
DOI :
10.1109/ASE.2004.1342735