Title :
On Accelerated Gradient Approximation for Least Square Regression with L1-Regularization
Author :
Yongquan Zhang; Jianyong Sun
Author_Institution :
Dept. of Inf. &
Abstract :
In this paper, we consider an online least squares regression problem whose objective function is composed of a quadratic loss and an L1 regularization term on the model parameters. For each training sample, we propose to approximate the L1 regularization by a convex function, which yields an overall convex approximation to the original objective function. We apply an efficient accelerated stochastic approximation algorithm to solve this approximation. The developed algorithm does not need to store previous samples, which reduces the space complexity. We further prove that the algorithm is guaranteed to converge to the global optimum at a rate of O(ln n/√n), where n is the number of training samples. The proof relies on a weaker assumption than those used in similar research work.
Keywords :
Acceleration; Convergence; Stochastic processes; Approximation algorithms; Least squares approximations; Algorithm design and analysis
Conference_Titel :
2015 IEEE Symposium Series on Computational Intelligence
Print_ISBN :
978-1-4799-7560-0
DOI :
10.1109/SSCI.2015.221