Title :
Complex Independent Component Analysis Using Three Types of Diversity: Non-Gaussianity, Nonwhiteness, and Noncircularity
Author :
Fu, Geng-Shen ; Phlypo, Ronald ; Anderson, Matthew ; Adali, Tulay
Author_Institution :
Dept. of CSEE, Univ. of Maryland, Baltimore, MD, USA
Abstract :
By assuming that the latent sources are statistically independent, independent component analysis (ICA) separates the underlying sources from a given linear mixture. Since in many applications the latent sources are non-Gaussian, noncircular, and exhibit sample dependence, it is desirable to exploit all of these properties jointly. The mutual information rate, whose minimization leads to the minimization of entropy rates, provides a natural cost for the task. In this paper, we establish the theory for complex-valued ICA, giving the Cramér-Rao lower bound and the identification conditions, and present a new algorithm that takes all of these properties into account. We propose an effective estimator of the entropy rate and, based on it, a complex-valued entropy rate bound minimization algorithm. We show that the new method exploits all of these properties effectively by comparing its estimation performance with the Cramér-Rao lower bound and through a number of examples.
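The following is not the paper's algorithm, only a minimal NumPy sketch of one of the three diversity types the abstract names: noncircularity. It uses the strong-uncorrelating transform (whitening followed by a Takagi factorization of the pseudo-covariance), which alone separates complex sources whose circularity coefficients are distinct; the source definitions, sample size, and mixing matrix below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000

# Two unit-variance sources with distinct circularity coefficients:
# a real Gaussian (coefficient 1) and a circular complex Gaussian (coefficient 0).
s = np.vstack([
    rng.standard_normal(N).astype(complex),
    (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2),
])

# Random invertible complex mixing matrix (illustrative assumption).
A = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
x = A @ s

# Step 1: whiten using the sample covariance C = E[x x^H].
C = x @ x.conj().T / N
d, E = np.linalg.eigh(C)
z = (E @ np.diag(d ** -0.5) @ E.conj().T) @ x

# Step 2: diagonalize the pseudo-covariance P = E[z z^T] via Takagi
# factorization P = U diag(lam) U^T, computed from the Hermitian
# matrix P conj(P) = U diag(lam^2) U^H plus a per-column phase fix.
P = z @ z.T / N
_, V = np.linalg.eigh(P @ P.conj())
D = V.conj().T @ P @ V.conj()               # diagonal up to phases
U = V @ np.diag(np.exp(1j * np.angle(np.diag(D)) / 2))

# Strong-uncorrelating transform output: recovers the sources up to
# permutation and phase because the circularity coefficients differ.
y = U.conj().T @ z

# Correlation between estimates and true sources (near a permutation matrix).
G = np.abs(y @ s.conj().T / N)
```

Because both sources here are Gaussian, non-Gaussianity carries no information; the separation succeeds purely through noncircularity, which is why a cost exploiting all three diversity types, as the paper proposes, covers cases where any single one fails.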
Keywords :
entropy; independent component analysis; signal processing; Cramer-Rao lower bound; complex valued ICA; complex valued entropy rate bound minimization algorithm; entropy estimation; identification condition; independent component analysis; latent sources; mutual information rate; noncircularity diversity; nongaussianity diversity; nonwhiteness diversity; Cost function; Covariance matrices; Entropy; Minimization; Mutual information; Signal processing algorithms; Vectors; Cramér-Rao lower bound; Fisher information matrix; entropy rate; independent component analysis; maximum entropy distributions; mutual information rate;
Journal_Title :
IEEE Transactions on Signal Processing
DOI :
10.1109/TSP.2014.2385047