• DocumentCode
    1888290
  • Title
    An experiment to assess different defect detection methods for software requirements inspections
  • Author
    Porter, A.A.; Votta, L.G.
  • Author_Institution
    Dept. of Comput. Sci., Maryland Univ., College Park, MD, USA
  • fYear
    1994
  • fDate
    16-21 May 1994
  • Firstpage
    103
  • Lastpage
    112
  • Abstract
    Software requirements specifications (SRS) are usually validated by inspections, in which several reviewers read all or part of the specification and search for defects. We hypothesize that different methods for conducting these searches may have significantly different rates of success. Using a controlled experiment, we show that a scenario-based detection method, in which each reviewer executes a specific procedure to discover a particular class of defects, has a higher defect detection rate than either ad hoc or checklist methods. We describe the design, execution, and analysis of the experiment so others may reproduce it and test our results for different kinds of software development and different populations of software engineers.
  • Keywords
    formal specification; program debugging; program diagnostics; program verification; checklist method; defect detection rate; scenario-based detection method; software defect detection methods; software development; software engineers; software requirements inspections; software requirements specifications; Computer science; Design engineering; Educational institutions; Fault detection; Hardware; Inspection; Production; Software performance; Software standards; Software testing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 16th International Conference on Software Engineering (ICSE-16), 1994
  • Conference_Location
    Sorrento, Italy
  • ISSN
    0270-5257
  • Print_ISBN
    0-8186-5855-X
  • Type
    conf
  • DOI
    10.1109/ICSE.1994.296770
  • Filename
    296770