DocumentCode :
1689381
Title :
Automated test case generation for spreadsheets
Author :
Fisher, Marc ; Cao, Mingming ; Rothermel, Gregg ; Cook, Curtis R. ; Burnett, Margaret M.
Author_Institution :
Dept. of Comput. Sci., Oregon State Univ., Corvallis, OR, USA
fYear :
2002
Firstpage :
141
Lastpage :
151
Abstract :
Spreadsheet languages, which include commercial spreadsheets and various research systems, have had a substantial impact on end-user computing. Research shows, however, that spreadsheets often contain faults. Thus, in previous work, we presented a methodology that assists spreadsheet users in testing their spreadsheet formulas. Our empirical studies have shown that this methodology can help end users test spreadsheets more adequately and efficiently; however, the process of generating test cases can still represent a significant impediment. To address this problem, we have been investigating how to automate test case generation for spreadsheets in ways that support incremental testing and provide immediate visual feedback. We have utilized two techniques for generating test cases, one involving random selection and one involving a goal-oriented approach. We describe these techniques, and report results of an experiment examining their relative costs and benefits.
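The sketch below is not the authors' implementation; it is a minimal Python illustration of the random-selection strategy named in the abstract, assuming a toy spreadsheet with two input cells and one formula cell. The cell names, value range, and formula are illustrative assumptions only.

```python
# Illustrative sketch (assumed, not from the paper): randomly select values
# for a spreadsheet's input cells, evaluate the formula cells, and present
# each generated test case for the user to validate.

import random

INPUT_CELLS = ["A1", "A2"]      # assumed input cells
VALUE_RANGE = (-100, 100)       # assumed numeric range for random values

def formula_B1(cells):
    """Toy formula cell: B1 = IF(A1 > A2, A1 - A2, A2 - A1)."""
    if cells["A1"] > cells["A2"]:
        return cells["A1"] - cells["A2"]
    return cells["A2"] - cells["A1"]

def generate_random_test_case():
    """Randomly choose input-cell values and compute the formula output."""
    cells = {name: random.randint(*VALUE_RANGE) for name in INPUT_CELLS}
    cells["B1"] = formula_B1(cells)
    return cells

if __name__ == "__main__":
    for _ in range(3):
        case = generate_random_test_case()
        # In the methodology described, the user inspects each generated case
        # and marks its output correct or incorrect, which in turn updates
        # the visual "testedness" feedback shown in the spreadsheet.
        print(case)
```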
Keywords :
cost-benefit analysis; program debugging; program testing; spreadsheet programs; automated test case generation; cost benefit analysis; end-user computing; experiment; goal-oriented approach; incremental testing; methodology; program testing; random selection; spreadsheets; visual feedback; Automatic testing; Business; Computer aided software engineering; Computer science; Impedance; Output feedback; Permission; Programming profession; Software engineering; Tail;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Software Engineering, 2002. ICSE 2002. Proceedings of the 24th International Conference on
Conference_Location :
Orlando, FL, USA
Print_ISBN :
1-58113-472-X
Type :
conf
Filename :
1007963