DocumentCode :
431941
Title :
Gradient sparse optimization via competitive learning
Author :
Zhang, Nan ; Zeng, Shuqing ; Weng, Juyang
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., East Lansing, MI, USA
Volume :
4
fYear :
2005
fDate :
18-23 March 2005
Abstract :
In this paper, we propose a new method to achieve sparseness via a competitive learning principle for linear kernel regression and classification tasks. We form the dual of the LASSO criterion, transferring an ℓ1 norm minimization into an ℓ∞ norm maximization problem. We introduce a novel solution derived from gradient descent, which links sparse representation to the competitive learning scheme. This framework is applicable to a variety of problems, such as regression, classification, feature selection, and data clustering.
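For reference, the ℓ1/ℓ∞ relationship mentioned in the abstract follows from standard convex duality (the dual norm of ℓ1 is ℓ∞); the exact formulation and the competitive-learning update used by the authors are given in the full paper. A minimal sketch of that standard duality, assuming the Lagrangian form of the LASSO:

\min_{w} \; \tfrac{1}{2}\|y - Xw\|_2^2 + \lambda \|w\|_1

admits the dual problem

\max_{\nu} \; \tfrac{1}{2}\|y\|_2^2 - \tfrac{1}{2}\|y - \nu\|_2^2 \quad \text{subject to} \quad \|X^{\top}\nu\|_{\infty} \le \lambda,

where the ℓ∞ constraint arises because \|w\|_1 = \max_{\|z\|_{\infty} \le 1} z^{\top} w.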
Keywords :
classification; data structures; gradient methods; maximum likelihood estimation; optimisation; regression analysis; unsupervised learning; ℓ1 norm minimization; ℓ∞ norm maximization; LASSO criterion duality; classification; competitive learning; data clustering; feature selection; gradient descent; gradient sparse optimization; linear kernel regression; maximum likelihood estimator; sparse representation; sparseness; supervised learning; Additive white noise; Computer science; Function approximation; Kernel; Laplace equations; Least squares approximation; Maximum likelihood estimation; Random variables; Supervised learning; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Acoustics, Speech, and Signal Processing, 2005. Proceedings. (ICASSP '05). IEEE International Conference on
ISSN :
1520-6149
Print_ISBN :
0-7803-8874-7
Type :
conf
DOI :
10.1109/ICASSP.2005.1416091
Filename :
1416091