Title of article :
A unifying information-theoretic framework for independent component analysis
Author/Authors :
Te-Won Lee, M. Girolami, A. J. Bell, T. J. Sejnowski
Issue Information :
Biweekly, consecutive issue numbering, year 2000
Pages :
21
From page :
1
To page :
21
Abstract :
We show that different theories recently proposed for independent component analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review those theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra [1] and Cardoso [2] showed that the infomax approach of Bell and Sejnowski [3] and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization also has equivalent properties, and therefore, all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe [4] have shown that the nonlinear principal component analysis (PCA) algorithm of Karhunen and Joutsensalo [5] and Oja [6] can also be viewed from information-theoretic principles since it minimizes the sum of squares of the fourth-order marginal cumulants, and therefore, approximately minimizes the mutual information [7]. Lambert [8] has proposed different Bussgang cost functions for multichannel blind deconvolution. We show how the Bussgang property relates to the infomax principle. Finally, we discuss convergence and stability as well as future research issues in blind source separation.
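The abstract's central claim is that infomax, maximum likelihood estimation, and negentropy maximization all reduce to one iterative learning rule for a fixed nonlinearity. A common natural-gradient form of that shared rule is dW = eta * (I - g(y) y^T) W with g = tanh, a choice suited to super-Gaussian sources. The following is a minimal illustrative sketch under those assumptions; the function name, step size, and iteration count are placeholders chosen for the demo, not values from the paper.

import numpy as np

def infomax_ica(X, lr=0.05, n_iter=500, seed=0):
    """Sketch of the shared ICA learning rule discussed in the abstract:
    natural-gradient update dW = lr * (I - g(y) y^T) W with g = tanh
    (assumed fixed nonlinearity for super-Gaussian sources)."""
    rng = np.random.default_rng(seed)
    n, T = X.shape
    W = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # start near identity
    I_n = np.eye(n)
    for _ in range(n_iter):
        Y = W @ X                # current source estimates y = Wx
        G = np.tanh(Y)           # fixed score nonlinearity g(y)
        W += lr * (I_n - (G @ Y.T) / T) @ W  # natural-gradient step
    return W

# Toy usage: unmix two independent super-Gaussian (Laplacian) sources.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))          # independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # mixing matrix
X = A @ S                                # observed mixtures
W = infomax_ica(X)
print(W @ A)  # should approach a scaled permutation matrix

If the separation succeeds, W @ A is close to a scaled permutation matrix, reflecting the usual scale and order indeterminacy of blind source separation.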
Keywords :
Blind source separation, ICA, Entropy, Information maximization, Maximum likelihood estimation
Journal title :
Computers and Mathematics with Applications
Serial Year :
2000
Record number :
918992