DocumentCode :
2325306
Title :
Sparse representation from a winner-take-all neural network
Author :
Zhang, Nan ; Weng, Juyang
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., USA
Volume :
3
fYear :
2004
fDate :
25-29 July 2004
Firstpage :
2209
Abstract :
We introduce an incremental algorithm for independent component analysis (ICA) based on maximizing a sparseness criterion. We propose a new sparseness-measure criterion function. The learning algorithm derived from this criterion leads to a winner-take-all learning mechanism. It avoids optimizing high-order nonlinear functions and density estimation, which other ICA methods require. We show that when the latent independent random variables have super-Gaussian distributions, the network efficiently extracts the independent components.
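The abstract does not state the exact update rule, so the following is only a minimal Python sketch of the general idea it describes, not the authors' algorithm: mixtures of super-Gaussian (Laplacian) sources are whitened, and then, per sample, only the winning unit (strongest response) receives an incremental update. Kurtosis is used here as a stand-in sparseness measure, and the hyperparameters (learning-rate schedule, number of passes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent super-Gaussian (Laplacian) sources, linearly mixed.
n = 5000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
V = E @ np.diag(d ** -0.5) @ E.T
Z = V @ X

# Incremental winner-take-all learning: each sample updates only the
# unit with the strongest response, pushing its weight vector toward a
# direction of maximal kurtosis (a sparseness measure appropriate for
# super-Gaussian sources); rows are then re-orthonormalized.
W = np.linalg.qr(rng.standard_normal((2, 2)))[0]
for t in range(3 * n):
    z = Z[:, t % n]
    y = W @ z
    j = int(np.argmax(np.abs(y)))    # winner unit
    eta = 1.0 / (50 + t)             # decaying learning rate (assumed)
    W[j] += eta * (y[j] ** 3) * z    # stochastic kurtosis ascent
    U, _, Vt = np.linalg.svd(W)      # symmetric re-orthonormalization
    W = U @ Vt

Y = W @ Z  # recovered components, up to sign and permutation
```

Because the data are whitened and `W` is kept orthonormal, the recovered components can only match the true sources up to sign and permutation, which is the usual ICA indeterminacy.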
Keywords :
Gaussian distribution; estimation theory; independent component analysis; learning (artificial intelligence); neural nets; optimisation; random processes; sparse matrices; ICA methods; density estimation; high order nonlinear function; incremental algorithm; independent component analysis; independent random variables; learning algorithm; maximization; optimization; sparse representation; sparseness measure criteria function; super Gaussian distributions; winner take all learning mechanism; winner take all neural network; Computer science; Convergence; Data mining; Electronic mail; Independent component analysis; Machine learning algorithms; Maximum likelihood estimation; Neural networks; Optimization methods; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2004 IEEE International Joint Conference on Neural Networks, Proceedings
ISSN :
1098-7576
Print_ISBN :
0-7803-8359-1
Type :
conf
DOI :
10.1109/IJCNN.2004.1380963
Filename :
1380963