DocumentCode :
1791675
Title :
Boosting Stochastic Newton Descent for Bigdata large scale classification
Author :
D'Ambrosio, Roberto ; Belhajali, Wafa ; Barlaud, Michel
Author_Institution :
ICTEAM, Univ. catholique de Louvain, Louvain-la-Neuve, Belgium
fYear :
2014
fDate :
27-30 Oct. 2014
Firstpage :
36
Lastpage :
41
Abstract :
Efficient Big Data classification requires low-cost learning methods. A standard approach is the Stochastic Gradient Descent (SGD) algorithm, which minimizes the hinge loss in the primal space. Although the complexity of SGD is linear in the number of samples, the method suffers from slow convergence. To cope with this issue, we propose a Boosting Stochastic Newton Descent (BSND) method for the minimization of any calibrated loss in the primal space. BSND approximates the inverse Hessian by its best low-rank approximation. We validate BSND by benchmarking it against several variants of the state-of-the-art SGD algorithm on the large-scale ImageNet and Higgs datasets, and we provide further core-level optimizations for fast convergence. The results on these big data sets show that BSND significantly improves the accuracy of the SGD baseline while being faster by orders of magnitude.
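The record contains no implementation, but the core idea the abstract describes (a stochastic Newton step that replaces the exact inverse Hessian with a rank-k approximation for a calibrated loss) can be sketched as below. This is an illustrative sketch only, not the authors' BSND algorithm: the choice of logistic loss, the eigendecomposition-based rank-k inverse, and all names and parameters (newton_step_lowrank, k, lr, damping, lam) are assumptions made for the example.

```python
import numpy as np

def logistic_grad_hess(w, X, y, lam=1e-4):
    # Gradient and Hessian of the L2-regularised logistic loss
    # (a calibrated loss) for a linear model on a mini-batch.
    # Labels y are in {-1, +1}.
    p = 1.0 / (1.0 + np.exp(-y * (X @ w)))          # P(correct label)
    g = -(X.T @ (y * (1.0 - p))) / len(y) + lam * w
    d = p * (1.0 - p)                               # per-sample curvature
    H = (X.T * d) @ X / len(y) + lam * np.eye(X.shape[1])
    return g, H

def newton_step_lowrank(w, X, y, k=5, lr=1.0, damping=1e-2):
    # One stochastic Newton step with a rank-k approximation of the
    # inverse Hessian: keep the top-k eigenpairs of H exactly and
    # treat the remaining subspace with a scalar damping term.
    g, H = logistic_grad_hess(w, X, y)
    vals, vecs = np.linalg.eigh(H)                  # eigenvalues ascending
    Vk, lk = vecs[:, -k:], vals[-k:]                # top-k curvature directions
    coef = Vk.T @ g
    # H^{-1} g  ~=  Vk diag(1/lk) Vk^T g + (1/damping) (I - Vk Vk^T) g
    step = Vk @ (coef / lk) + (g - Vk @ coef) / damping
    return w - lr * step

# Tiny usage example on synthetic data, one mini-batch per step.
rng = np.random.default_rng(0)
X = rng.standard_normal((256, 20))
y = np.sign(X @ rng.standard_normal(20))
w = np.zeros(20)
for _ in range(10):
    batch = rng.choice(len(y), 64, replace=False)
    w = newton_step_lowrank(w, X[batch], y[batch], k=5)
```

Note that the full eigendecomposition above is shown only for clarity and costs O(d^3); a practical large-scale method would estimate the top-k curvature directions stochastically rather than forming H explicitly, which is what keeps the per-iteration cost close to SGD's.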
Keywords :
Big Data; Newton method; approximation theory; data mining; gradient methods; minimisation; stochastic processes; BSND; Higgs dataset; big data large-scale classification; boosting stochastic Newton descent; hinge loss; inverse Hessian; large-scale ImageNet; low-rank approximation; minimization; stochastic gradient descent algorithm; Accuracy; Complexity theory; Convergence; Covariance matrices; Fasteners; Training; Vectors; Big Data; Boosting; Calibrated risks; Large Scale; Stochastic Newton Method; Supervised Classification;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2014 IEEE International Conference on Big Data (Big Data)
Conference_Location :
Washington, DC
Type :
conf
DOI :
10.1109/BigData.2014.7004354
Filename :
7004354