DocumentCode :
2280466
Title :
QualityCrowd — A framework for crowd-based quality evaluation
Author :
Keimel, Christian ; Habigt, Julian ; Horch, Clemens ; Diepold, Klaus
Author_Institution :
Inst. for Data Process., Tech. Univ. München, Munich, Germany
fYear :
2012
fDate :
7-9 May 2012
Firstpage :
245
Lastpage :
248
Abstract :
Video quality assessment with subjective testing is both time-consuming and expensive. An interesting new alternative to traditional testing is crowdsourcing, which moves the testing effort onto the Internet. In this contribution we therefore propose the QualityCrowd framework for easily performing subjective quality assessment with crowdsourcing. QualityCrowd enables codec-independent quality assessment through a simple web interface usable with common web browsers. We compared the results of an online subjective test using this framework with the results of a test in a standardized environment. This comparison shows that QualityCrowd delivers equivalent results within the acceptable inter-lab correlation. While we only consider video quality in this contribution, QualityCrowd can also be used for multimodal quality assessment.
Keywords :
online front-ends; user interfaces; video signal processing; Internet; QualityCrowd framework; Web browsers; crowd-based quality evaluation; crowdsourcing; online subjective test; simple Web interface; subjective testing; traditional testing; video quality assessment; Browsers; Codecs; Correlation; Internet; Servers; Testing; Video sequences;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Picture Coding Symposium (PCS), 2012
Conference_Location :
Kraków, Poland
Print_ISBN :
978-1-4577-2047-5
Electronic_ISBN :
978-1-4577-2048-2
Type :
conf
DOI :
10.1109/PCS.2012.6213338
Filename :
6213338