Title :
A gradient descending solution to the LASSO criteria
Author :
Zhang, Nan ; Zeng, Shuqing
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., USA
Date :
31 July-4 Aug. 2005
Abstract :
In this paper, we propose a new perspective on achieving sparseness via the winner-take-all principle for linear kernel regression and classification tasks. We form the dual of the LASSO criterion, transforming an ℓ1-norm minimization into an ℓ∞-norm maximization problem. We introduce a novel winner-take-all neural network solution derived from gradient descent, which links sparse representation with the competitive learning scheme. This scheme is a form of unsupervised learning in which each input pattern comes, through learning, to be associated with the activity of one or at most a few neurons. In this model, however, the lateral interaction between neurons in the same layer is strictly preemptive. The framework is applicable to a variety of problems, such as independent component analysis (ICA), feature selection, and data clustering.
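The paper's specific winner-take-all network operates on the LASSO dual (an ℓ∞-norm maximization) and is not reproduced here. As a minimal illustrative sketch of the sparsity behavior the abstract describes, the following uses standard proximal gradient descent (ISTA) on the primal LASSO; the function name, data sizes, and parameter choices are all assumptions for the demo, not the authors' method.

```python
# Illustrative sketch only: standard ISTA on the primal LASSO, showing how the
# l1 penalty drives most coefficients exactly to zero so that only a few
# "winner" units remain active. Not the paper's dual/WTA network.
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize 0.5*||y - X w||^2 + lam*||w||_1 by proximal gradient (ISTA)."""
    n, d = X.shape
    step = 1.0 / np.linalg.norm(X, ord=2) ** 2  # 1/L, L = Lipschitz const. of the gradient
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                 # gradient of the smooth data-fit term
        z = w - step * grad                      # plain gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding prox
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
w_true = np.zeros(20)
w_true[[2, 7, 13]] = [3.0, -2.0, 1.5]            # 3-sparse ground truth (assumed demo data)
y = X @ w_true + 0.01 * rng.standard_normal(50)

lam = 0.1 * np.max(np.abs(X.T @ y))              # fraction of lambda_max (heuristic choice)
w_hat = lasso_ista(X, y, lam)
# Most coefficients end up exactly zero; the few survivors play the role of "winning" neurons.
```

The soft-thresholding step is what produces exact zeros; a plain gradient step on a smooth penalty would only shrink coefficients toward zero without ever switching them off.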
Keywords :
gradient methods; independent component analysis; minimisation; neural nets; pattern classification; pattern clustering; regression analysis; unsupervised learning; LASSO criteria; classification task; competitive learning; data clustering; feature selection; gradient descending; independent component analysis; linear kernel regression; neural network; norm maximization; norm minimization; unsupervised learning; winner-take-all principle; Computer science; Independent component analysis; Kernel; Laplace equations; Maximum likelihood estimation; Neural networks; Neurons; Quadratic programming; Supervised learning; Unsupervised learning;
Conference_Titel :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
DOI :
10.1109/IJCNN.2005.1556393