Title :
Scalable isolation of failure-inducing changes via version comparison
Author :
Ghanavati, Mojgan; Andrzejak, Artur; Dong, Zhen
Author_Institution :
Institute of Computer Science, Heidelberg University, Heidelberg, Germany
Abstract :
Despite indisputable progress, automated debugging methods still face difficulties with scalability and runtime efficiency. To scale to large projects, we propose an approach which reports small sets of suspicious code changes. Its essential strength is that the size of these reports is proportional to the amount of change between code commits, not to the total project size. Our method combines version comparison and information on failed tests with static and dynamic analysis. We evaluate the method on real bugs from Apache Hadoop, an open source project with over 2 million LOC. In 2 out of 4 cases, the set of suspects produced by our approach contains exactly the location of the defective code (and no false positives). A further defect could be pinpointed with small extensions to the approach. Moreover, the time overhead of our approach is moderate, namely 3-4 times the duration of a failed software test.
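A minimal sketch of the core idea (not the authors' implementation; all class and method names are hypothetical): the suspect set is obtained by intersecting the methods changed between the two compared versions with the methods covered by the failing test, so its size is bounded by the change set rather than by the project size.

import java.util.LinkedHashSet;
import java.util.Set;

// Sketch: intersect version-comparison results with failing-test coverage.
public class SuspectIsolationSketch {

    // changedMethods: methods that differ between the two compared commits
    // coveredByFailingTest: methods executed during the failed test run
    static Set<String> suspiciousChanges(Set<String> changedMethods,
                                         Set<String> coveredByFailingTest) {
        Set<String> suspects = new LinkedHashSet<>(changedMethods);
        // keep only changed code that the failing test actually executes
        suspects.retainAll(coveredByFailingTest);
        return suspects;
    }

    public static void main(String[] args) {
        // Hypothetical example inputs.
        Set<String> changed = Set.of(
                "org.example.Foo#parse(String)",
                "org.example.Bar#flush()");
        Set<String> covered = Set.of(
                "org.example.Foo#parse(String)",
                "org.example.Baz#init()");
        System.out.println(suspiciousChanges(changed, covered));
        // prints: [org.example.Foo#parse(String)]
    }
}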
Keywords :
configuration management; parallel processing; program debugging; program diagnostics; public domain software; Apache Hadoop; automated debugging methods; code commits; dynamic analysis; failure-inducing code changes; open source project; software test; static analysis; version comparison; Debugging; Instruments; Java; Libraries; Runtime; Software; Testing; Automated debugging; failure-inducing changes; large-scale projects; software tests; thin slicing; version comparison
Conference_Titel :
2013 IEEE International Symposium on Software Reliability Engineering Workshops (ISSREW)
Conference_Location :
Pasadena, CA
DOI :
10.1109/ISSREW.2013.6688895