DocumentCode :
248828
Title :
Benchmarking result diversification in social image retrieval
Author :
Ionescu, Bogdan ; Popescu, Adrian ; Müller, Henning ; Menendez, Maria ; Radu, Anca-Livia
Author_Institution :
Univ. Politeh. of Bucharest, Bucharest, Romania
fYear :
2014
fDate :
27-30 Oct. 2014
Firstpage :
3072
Lastpage :
3076
Abstract :
This article addresses the issue of retrieval result diversification in the context of social image retrieval and discusses the results achieved during the MediaEval 2013 benchmarking campaign. The 38 submitted runs and their results are described and analyzed. A comparison of expert versus crowdsourcing annotations shows that crowdsourcing results differ slightly and exhibit higher inter-observer disagreement, but are comparable at a lower cost. Multimodal approaches achieve the best results in terms of cluster recall. Manual approaches can lead to high precision but often lower diversity. With this detailed analysis of the results, we provide insights for future work on this topic.
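As a minimal illustrative sketch (not from the paper or the MediaEval evaluation tools), the two metrics named in the abstract, cluster recall and precision at a cutoff N, can be computed from a ranked result list in which each relevant image carries a ground-truth cluster id; all names below are hypothetical:

```python
# Sketch of diversity/relevance metrics at a cutoff n.
# Assumption: each ranked image is represented by its ground-truth cluster id,
# or by None if the image is irrelevant to the query.

def cluster_recall_at_n(ranked_cluster_ids, all_cluster_ids, n):
    """Fraction of ground-truth clusters covered by the top-n results."""
    covered = {c for c in ranked_cluster_ids[:n] if c is not None}
    return len(covered) / len(all_cluster_ids) if all_cluster_ids else 0.0

def precision_at_n(ranked_cluster_ids, n):
    """Fraction of the top-n results that are relevant (belong to some cluster)."""
    top = ranked_cluster_ids[:n]
    return sum(1 for c in top if c is not None) / n if n else 0.0

# Example run: images from clusters 1, 1, 2, an irrelevant image, then cluster 3.
run = [1, 1, 2, None, 3]
print(cluster_recall_at_n(run, {1, 2, 3, 4}, n=5))  # 0.75 (3 of 4 clusters covered)
print(precision_at_n(run, n=5))                     # 0.8  (4 of 5 results relevant)
```

A run that maximizes precision alone may repeat images from one cluster, while a diversified run covers more clusters at the same depth, which is the trade-off the abstract describes.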
Keywords :
image retrieval; social networking (online); MediaEval 2013 benchmarking; crowdsourcing annotations; expert annotations; retrieval result diversification; social image retrieval; Benchmark testing; Cultural differences; Global Positioning System; Image retrieval; Media; Optimization; Visualization; crowdsourcing; image content description; re-ranking; result diversification; social photo retrieval;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Image Processing (ICIP), 2014 IEEE International Conference on
Conference_Location :
Paris
Type :
conf
DOI :
10.1109/ICIP.2014.7025621
Filename :
7025621