Title :
Gradient sparse optimization via competitive learning
Author :
Zhang, Nan ; Zeng, Shuqing ; Weng, Juyang
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., East Lansing, MI, USA
Abstract :
In this paper, we propose a new method to achieve sparseness via a competitive learning principle for linear kernel regression and classification tasks. We form the dual of the LASSO criterion, thereby transforming an ℓ1 norm minimization into an ℓ∞ norm maximization problem. We introduce a novel solution based on gradient descent, which links sparse representation to the competitive learning scheme. This framework is applicable to a variety of problems, such as regression, classification, feature selection, and data clustering.
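For context, the classical LASSO criterion is min_b (1/2)||y - Xb||² + λ||b||₁, and its standard Lagrangian dual involves the constraint ||Xᵀθ||∞ ≤ λ, which is where the ℓ∞ structure mentioned in the abstract arises; the exact dual maximization and competitive-learning update used by the authors are not reproduced here. The minimal Python sketch below shows a standard proximal-gradient (ISTA) solver for the LASSO primal, included only to illustrate gradient-based sparse optimization; the names soft_threshold and lasso_ista and all parameter choices are illustrative assumptions, not the authors' algorithm.

import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iters=500):
    # Minimize 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient (ISTA).
    # Standard baseline solver for the LASSO primal; NOT the dual /
    # competitive-learning scheme proposed in the paper.
    n, d = X.shape
    b = np.zeros(d)
    # Step size 1/L, with L the Lipschitz constant of the smooth term's gradient.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iters):
        grad = X.T @ (X @ b - y)                    # gradient of 0.5*||y - Xb||^2
        b = soft_threshold(b - grad / L, lam / L)   # proximal (soft-threshold) step
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    true_b = np.zeros(20)
    true_b[:3] = [2.0, -1.5, 1.0]                   # sparse ground truth
    y = X @ true_b + 0.01 * rng.standard_normal(100)
    b_hat = lasso_ista(X, y, lam=1.0)
    print("nonzero coefficients:", np.flatnonzero(np.abs(b_hat) > 1e-3))

The soft-thresholding step is what drives most coefficients exactly to zero, which is the sparseness property the abstract refers to; the paper's contribution is to obtain a comparable effect by working on the dual problem with a competitive learning rule rather than on the primal as above.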
Keywords :
classification; data structures; gradient methods; maximum likelihood estimation; optimisation; regression analysis; unsupervised learning; ℓ1 norm minimization; ℓ∞ norm maximization; LASSO criterion duality; classification; competitive learning; data clustering; feature selection; gradient descent; gradient sparse optimization; linear kernel regression; maximum likelihood estimator; sparse representation; sparseness; supervised learning; Additive white noise; Computer science; Function approximation; Kernel; Laplace equations; Least squares approximation; Maximum likelihood estimation; Random variables; Supervised learning; Vectors;
Conference_Title :
Proceedings of the 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '05)
Print_ISBN :
0-7803-8874-7
DOI :
10.1109/ICASSP.2005.1416091