Title :
Sparse representation from a winner-take-all neural network
Author :
Zhang, Nan ; Weng, Juyang
Author_Institution :
Dept. of Comput. Sci. & Eng., Michigan State Univ., USA
Abstract :
We introduce an incremental algorithm for independent component analysis (ICA) based on the maximization of a sparseness criterion. We propose a new sparseness measure as the criterion function. The learning algorithm derived from this criterion leads to a winner-take-all learning mechanism. It avoids the optimization of high-order nonlinear functions and density estimation, which have been used by other ICA methods. We show that when the latent independent random variables have super-Gaussian distributions, the network efficiently extracts the independent components.
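The abstract does not spell out the update rule, so the following is only a minimal sketch of a generic winner-take-all incremental learner for sparse components: for each (assumed whitened) sample, only the unit with the largest absolute response is updated toward the sample and renormalized. The function name, learning rate, and Hebbian-style update are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def wta_sparse_components(X, n_components, n_epochs=10, lr=0.05, seed=0):
    """Illustrative winner-take-all (WTA) incremental learning sketch.

    X: whitened data, shape (n_samples, n_dim). The update rule below
    (move only the winning weight vector toward the current sample,
    then renormalize) is a generic WTA Hebbian scheme used here for
    illustration; it is not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_components, X.shape[1]))
    W /= np.linalg.norm(W, axis=1, keepdims=True)

    for _ in range(n_epochs):
        for x in X:
            y = W @ x                        # responses of all units
            k = np.argmax(np.abs(y))         # winner: largest absolute response
            W[k] += lr * np.sign(y[k]) * x   # update only the winner
            W[k] /= np.linalg.norm(W[k])     # keep unit norm
    return W

# Example: two Laplacian (super-Gaussian) sources, mixed then whitened.
S = np.random.default_rng(1).laplace(size=(5000, 2))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
X = S @ A.T
X -= X.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Xw = U * np.sqrt(X.shape[0])                 # whitened data (unit covariance)
W = wta_sparse_components(Xw, n_components=2)
```

With super-Gaussian sources, each whitened sample tends to be dominated by one source, which is why a winner-take-all update of this kind can align weight vectors with individual independent components.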
Keywords :
Gaussian distribution; estimation theory; independent component analysis; learning (artificial intelligence); neural nets; optimisation; random processes; sparse matrices; ICA methods; density estimation; high-order nonlinear function; incremental algorithm; independent component analysis; independent random variables; learning algorithm; maximization; optimization; sparse representation; sparseness measure criterion function; super-Gaussian distributions; winner-take-all learning mechanism; winner-take-all neural network; Computer science; Convergence; Data mining; Electronic mail; Independent component analysis; Machine learning algorithms; Maximum likelihood estimation; Neural networks; Optimization methods; Vectors;
Conference_Titel :
2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004), Proceedings
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1380963