DocumentCode
3724131
Title
LambdaMF: Learning Nonsmooth Ranking Functions in Matrix Factorization Using Lambda
Author
Guang-He Lee;Shou-De Lin
Year
2015
Firstpage
823
Lastpage
828
Abstract
This paper focuses on optimizing ranking measures in a recommendation problem. Since ranking measures are non-differentiable, previous works have dealt with this problem via approximations or lower/upper bounds of the loss. However, such a mismatch between ranking measures and their approximations/bounds can lead to suboptimal ranking results. To solve this problem, we propose to model the gradient of a non-differentiable ranking measure using the idea of a virtual gradient, called lambda in learning to rank. In addition, noting the differences between learning-to-rank and recommendation models, we prove that under certain circumstances the existence of popular items can lead to unbounded norm growth of the latent factors in a matrix factorization model. We further introduce a novel regularization term to remedy this concern. Finally, we demonstrate that our model, LambdaMF, outperforms several state-of-the-art methods. We further show in experiments that in all cases our model achieves the global optimum of normalized discounted cumulative gain during training. Detailed implementation and supplementary material can be found at http://www.csie.ntu.edu.tw/~b00902055/.
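The lambda idea the abstract refers to can be illustrated with a short sketch. The snippet below shows generic LambdaRank-style virtual gradients, where each pairwise logistic gradient is scaled by the NDCG change from swapping the two items; the exact lambda used by LambdaMF and its novel regularizer are defined in the paper, so all names here (`lambda_gradients`, the commented update rule, `reg`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dcg(rels):
    # Discounted cumulative gain of a relevance list given in ranked order.
    return sum((2**r - 1) / np.log2(pos + 2) for pos, r in enumerate(rels))

def lambda_gradients(scores, rels):
    """Generic LambdaRank-style 'virtual gradients' for one user.

    For each item pair (i, j) with rels[i] > rels[j], the lambda is the
    pairwise logistic gradient scaled by |delta NDCG| from swapping i and j.
    This sketches the general lambda idea, not LambdaMF's exact form.
    """
    n = len(scores)
    order = np.argsort(-scores)              # current ranking by score
    rank = np.empty(n, dtype=int)
    rank[order] = np.arange(n)
    ideal = dcg(sorted(rels, reverse=True))  # ideal DCG for normalization
    lambdas = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if rels[i] <= rels[j]:
                continue
            # NDCG change if items i and j swapped rank positions.
            delta = abs((2**rels[i] - 2**rels[j])
                        * (1 / np.log2(rank[i] + 2)
                           - 1 / np.log2(rank[j] + 2))) / ideal
            rho = 1.0 / (1.0 + np.exp(scores[i] - scores[j]))
            lambdas[i] += delta * rho        # push more-relevant item i up
            lambdas[j] -= delta * rho        # push less-relevant item j down
    return lambdas

# In a matrix factorization model, scores for user u come from the latent
# factors (scores = U[u] @ V.T), and one hypothetical stochastic update would
# ascend along the lambdas while regularizing the norm growth the abstract
# warns about (here plain L2 via `reg`, standing in for the paper's term):
#   U[u] += lr * (lambda_gradients(U[u] @ V.T, rels) @ V - reg * U[u])
```

As a sanity check, when a highly relevant item is scored below an irrelevant one, its lambda is positive (pushing it up) and the other item's is negative by the same amount, since lambdas are accumulated pairwise.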
Keywords
"Loss measurement","Approximation methods","Collaboration","Data models","Predictive models","Optimization","Matrix decomposition"
Publisher
IEEE
Conference_Titel
2015 IEEE International Conference on Data Mining (ICDM)
ISSN
1550-4786
Type
conf
DOI
10.1109/ICDM.2015.108
Filename
7373396
Link To Document