Title :
CrowdOracles: Can the Crowd Solve the Oracle Problem?
Author :
Pastore, Fabrizio ; Mariani, Leonardo ; Fraser, Gordon
Author_Institution :
Univ. of Milano-Bicocca, Milan, Italy
Abstract :
Despite recent advances in test generation, fully automatic software testing remains a dream: ultimately, any generated test input depends on a test oracle that determines correctness, and, except for generic properties such as “the program shall not crash”, such oracles require human input in one form or another. CrowdSourcing is an increasingly popular technique for automating computations that cannot be performed by machines but only by humans: a problem is split into small chunks that are then solved by a crowd of users on the Internet. In this paper we investigate whether it is possible to exploit CrowdSourcing to solve the oracle problem: we produce tasks asking users to evaluate CrowdOracles, assertions that reflect the current behavior of the program. If the crowd determines that an assertion does not match the behavior described in the code documentation, then a bug has been found. Our experiments demonstrate that CrowdOracles are a viable solution to automate the oracle problem, yet taming the crowd to obtain useful results is a difficult task.
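For illustration only (the class, method, and values below are hypothetical, not taken from the paper), a CrowdOracle task might pair an automatically generated assertion, which encodes the implementation's current output, with the Javadoc of the method under test; a crowd worker then judges whether the two agree:

// Minimal sketch of a CrowdOracle task, assuming JUnit 4.
// All names and values here are invented for illustration.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

class Accounts {
    /** Returns the balance after the deposit; negative amounts are
     *  rejected and leave the balance unchanged. */
    static int deposit(int balance, int amount) {
        return balance + amount;  // bug: negative amounts are accepted
    }
}

public class CrowdOracleTaskSketch {
    @Test
    public void generatedAssertionReflectsCurrentBehavior() {
        // Assertion generated from the observed output:
        // deposit(100, -50) currently returns 50. The Javadoc says a
        // negative amount leaves the balance unchanged (100), so a
        // crowd worker comparing the two should flag this as a bug.
        assertEquals(50, Accounts.deposit(100, -50));
    }
}

Note that the assertion itself passes, since it encodes the program's current behavior; the bug surfaces only when the crowd compares that behavior against the documentation.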
Keywords :
Internet; automatic test pattern generation; program testing; CrowdOracles; CrowdSourcing; Oracle problem; automatic software testing; code documentation; generic properties; program behavior; test generation; test oracle; Computer crashes; Documentation; Educational institutions; Java; Measurement; Software; Software testing; crowd sourcing; test case generation; test oracles;
Conference_Titel :
2013 IEEE Sixth International Conference on Software Testing, Verification and Validation (ICST)
Conference_Location :
Luxembourg
Print_ISBN :
978-1-4673-5961-0
DOI :
10.1109/ICST.2013.13