Title :
Learning Preferences with Millions of Parameters by Enforcing Sparsity
Author :
Chen, Xi ; Bai, Bing ; Qi, Yanjun ; Lin, Qihang ; Carbonell, Jaime
Author_Institution :
NEC Labs America, Princeton, NJ, USA
Abstract :
We study the retrieval task of ranking a set of objects for a given query in the pairwise preference learning framework. Recent work has shown that raw features (e.g., words for text retrieval) and pairwise features describing relationships between two raw features (e.g., word synonymy or polysemy) can greatly improve retrieval precision. However, most existing methods cannot scale to problems with many raw features (e.g., the English vocabulary), due to the prohibitive computational cost of learning and the memory required to store a quadratic number of parameters. In this paper, we propose to learn a sparse representation of the pairwise features under the preference learning framework using L1 regularization. Building on stochastic gradient descent, we devise an online algorithm that enforces sparsity via a mini-batch shrinkage strategy. On multiple benchmark datasets, we show that our method achieves better performance with fast convergence and requires much less memory for models with millions of parameters.
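Algorithm_Sketch :
The abstract does not spell out the mini-batch shrinkage strategy; the sketch below illustrates the general idea under common assumptions: a pairwise hinge loss trained by stochastic gradient descent, with the L1 proximal (soft-thresholding) step applied once per mini-batch rather than after every update. All names here (soft_threshold, train_sgd_l1, triples) are hypothetical, not the authors' actual implementation.

```python
import numpy as np

def soft_threshold(w, tau):
    # Proximal operator of the L1 norm: shrink every weight toward
    # zero by tau, setting small weights exactly to zero (sparsity).
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

def train_sgd_l1(triples, dim, lam=1e-4, lr=0.1, batch=100, epochs=5):
    """triples: list of (x_pos, x_neg) pairwise feature vectors,
    where x_pos should be ranked above x_neg for the same query."""
    w = np.zeros(dim)
    for _ in range(epochs):
        for i, (x_pos, x_neg) in enumerate(triples):
            diff = x_pos - x_neg
            # Pairwise hinge loss: take a gradient step whenever the
            # preferred item fails to outscore the other by a margin of 1.
            if w @ diff < 1.0:
                w += lr * diff
            # Mini-batch shrinkage: apply the O(dim) L1 proximal step
            # only every `batch` updates, accumulating the regularization
            # that would otherwise be applied at each step.
            if (i + 1) % batch == 0:
                w = soft_threshold(w, lr * lam * batch)
    return w
```

Deferring the shrinkage to mini-batch boundaries amortizes the full pass over all parameters, which matters when the pairwise feature space is quadratic in the vocabulary size.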
Keywords :
data mining; feature extraction; gradient methods; learning (artificial intelligence); query processing; stochastic processes; text analysis; L1 regularization; mini-batch shrinkage strategy; multiple benchmark datasets; object set; online algorithm; pairwise feature; pairwise preference learning; raw feature; retrieval task; sparse representation; stochastic gradient descent algorithm; text mining; learning to rank; online learning; preference learning; sparse model
Conference_Title :
2010 IEEE 10th International Conference on Data Mining (ICDM)
Conference_Location :
Sydney, NSW
Print_ISBN :
978-1-4244-9131-5
Electronic_ISSN :
1550-4786
DOI :
10.1109/ICDM.2010.67