• DocumentCode
    1788694
  • Title
    Crowdsourcing 2.0: Enhancing execution speed and reliability of web-based QoE testing

  • Author
    Gardlo, Bruno ; Egger, Sebastian ; Seufert, Michael ; Schatz, Roland

  • Author_Institution
    Telecommun. Res. Center, Vienna (FTW), Vienna, Austria
  • fYear
    2014
  • fDate
    10-14 June 2014
  • Firstpage
    1070
  • Lastpage
    1075
  • Abstract
    Since its introduction a few years ago, the concept of 'Crowdsourcing' has been heralded as a highly attractive alternative approach to evaluating the Quality of Experience (QoE) of networked multimedia services. The main reason is that, in comparison to traditional laboratory-based subjective quality testing, crowd-based QoE assessment over the Internet promises to be not only much more cost-effective (no lab facilities required, lower cost per subject) but also much faster in terms of shorter campaign setup and turnaround times. However, the reliability of remote test subjects and, consequently, the trustworthiness of study results remain issues that prevent the widespread adoption of crowd-based QoE testing. Various ideas for improving user rating reliability and test efficiency have been proposed, the majority of them relying on a posteriori analysis of results. However, such methods introduce a major lag that significantly affects the efficiency of campaign execution. In this paper we address these shortcomings by introducing in momento methods for crowdsourced video QoE assessment, which improve result reliability by a factor of two and campaign execution efficiency by a factor of ten. The proposed in momento methods are applicable to existing crowd-based QoE testing approaches and are suitable for a variety of service scenarios.
  • Keywords
    Internet; human factors; outsourcing; quality of experience; Web-based QoE testing execution speed enhancement; Web-based QoE testing reliability enhancement; a-posteriori analysis; campaign execution efficiency; campaign setup; crowd-based QoE assessment; crowdsourced video QoE assessment; crowdsourcing 2.0; in-momento methods; networked multimedia services; quality of experience; remote test subject reliability; test efficiency improvement; trustworthiness; turnaround times; user rating reliability improvement; Crowdsourcing; Quality assessment; Quality of service; Reliability engineering; Testing; Video recording; Crowdsourcing; Quality Evaluation; Quality of Experience; Reliability verification;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2014 IEEE International Conference on Communications (ICC)
  • Conference_Location
    Sydney, NSW
  • Type
    conf

  • DOI
    10.1109/ICC.2014.6883463
  • Filename
    6883463