DocumentCode :
1538081
Title :
Obtaining High-Quality Relevance Judgments Using Crowdsourcing
Author :
Vuurens, Jeroen B P ; De Vries, Arjen P.
Volume :
16
Issue :
5
fYear :
2012
Firstpage :
20
Lastpage :
27
Abstract :
The performance of information retrieval (IR) systems is commonly evaluated using a test set with known relevance. Crowdsourcing is one method for learning the relevant documents for each query in the test set. However, the quality of relevance judgments learned through crowdsourcing can be questionable, because it relies on workers of unknown quality, possibly including spammers. To detect spammers, the authors' algorithm compares judgments between workers; they evaluate their approach by comparing the consistency of crowdsourced ground truth to that obtained from expert annotators and conclude that crowdsourcing can match the quality obtained from the latter.
Keywords :
document handling; information retrieval; outsourcing; IR systems; crowdsourcing; expert annotators; high-quality relevance judgments; information retrieval systems; learning method; detection algorithms; Internet; query processing; unsolicited electronic mail; judgment; quality; relevance; spam
fLanguage :
English
Journal_Title :
IEEE Internet Computing
Publisher :
IEEE
ISSN :
1089-7801
Type :
jour
DOI :
10.1109/MIC.2012.71
Filename :
6216343