Title :
A Model Building Process for Identifying Actionable Static Analysis Alerts
Author :
Heckman, Sarah; Williams, Laurie
Author_Institution :
North Carolina State Univ., Raleigh, NC
Abstract :
Automated static analysis can identify, early in the software process, potential source code anomalies that could lead to field failures. However, only a small portion of static analysis alerts may be important to the developer (actionable); the remainder are false positives (unactionable). We propose a process for building false positive mitigation models that use machine learning techniques to classify static analysis alerts as actionable or unactionable. For two open source projects, we identify, out of 51 candidate characteristics, the sets of alert characteristics predictive of actionable and unactionable alerts. From these selected characteristics, we evaluate 15 machine learning algorithms, which build models to classify alerts. Using 3 to 14 alert characteristics, we obtained 88-97% average accuracy in classifying alerts for both projects. Additionally, the sets of selected alert characteristics and the best models differed between the two projects, suggesting that false positive mitigation models should be project-specific.
Keywords :
learning (artificial intelligence); program diagnostics; software process improvement; source coding; actionable static analysis alerts; automated static analysis; false positive mitigation models; machine learning techniques; open source projects; software process; source code anomalies; Buildings; Data mining; Failure analysis; Inspection; Machine learning; Machine learning algorithms; Predictive models; Programming profession; Software testing; Software tools; false positive mitigation; machine learning; static analysis
Conference_Title :
2009 International Conference on Software Testing Verification and Validation (ICST '09)
Conference_Location :
Denver, CO
Print_ISBN :
978-1-4244-3775-7
Electronic_ISBN :
978-0-7695-3601-9
DOI :
10.1109/ICST.2009.45
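The abstract describes training classifiers on alert characteristics to separate actionable alerts from unactionable ones. A minimal sketch of that idea is below; the characteristic names, data, and the 1-nearest-neighbour classifier are all hypothetical stand-ins (the paper evaluates 15 machine learning algorithms over 51 candidate characteristics, none of which are reproduced here):

```python
# Hypothetical sketch of a false positive mitigation model: classify static
# analysis alerts as actionable or unactionable from a few alert
# characteristics. Features and labels below are illustrative only.

# Each alert: (characteristics dict, label). The characteristics are made-up
# stand-ins for the paper's candidate alert characteristics.
alerts = [
    ({"type": "null_deref", "churn": "high", "depth": "deep"}, "actionable"),
    ({"type": "null_deref", "churn": "high", "depth": "shallow"}, "actionable"),
    ({"type": "style", "churn": "low", "depth": "shallow"}, "unactionable"),
    ({"type": "style", "churn": "low", "depth": "deep"}, "unactionable"),
    ({"type": "null_deref", "churn": "low", "depth": "deep"}, "actionable"),
    ({"type": "style", "churn": "high", "depth": "shallow"}, "unactionable"),
]

def hamming_distance(a, b):
    """Count characteristics on which two alerts differ."""
    return sum(a[k] != b[k] for k in a)

def classify(alert, training):
    """1-nearest-neighbour: return the label of the most similar alert."""
    nearest = min(training, key=lambda item: hamming_distance(alert, item[0]))
    return nearest[1]

# Leave-one-out evaluation, a stand-in for the paper's model assessment step:
# classify each alert using a model built from all the others.
correct = sum(
    classify(feats, alerts[:i] + alerts[i + 1:]) == label
    for i, (feats, label) in enumerate(alerts)
)
print(f"accuracy: {correct}/{len(alerts)}")
```

The leave-one-out loop mirrors the paper's point that models must be evaluated on alerts they were not trained on; in practice a per-project model would be fit on that project's own audited alert history.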