Title :
Expectation-maximization Bernoulli-Gaussian approximate message passing
Author :
Vila, Jeremy ; Schniter, Philip
Author_Institution :
Dept. of ECE, Ohio State Univ., Columbus, OH, USA
Abstract :
The approximate message passing (AMP) algorithm originally proposed by Donoho, Maleki, and Montanari yields a computationally attractive solution to the usual ℓ1-regularized least-squares problem faced in compressed sensing, whose solution is known to be robust to the signal distribution. When the signal is drawn i.i.d. from a marginal distribution that is not least-favorable, better performance can be attained using a Bayesian variation of AMP. The latter, however, assumes that the distribution is perfectly known. In this paper, we navigate the space between these two extremes by modeling the signal as i.i.d. Bernoulli-Gaussian (BG) with unknown prior sparsity, mean, and variance, and the noise as zero-mean Gaussian with unknown variance, and we simultaneously reconstruct the signal while learning the prior signal and noise parameters. To accomplish this task, we embed the BG-AMP algorithm within an expectation-maximization (EM) framework. Numerical experiments confirm the excellent performance of our proposed EM-BG-AMP on a range of signal types.
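The core scalar operation behind a Bernoulli-Gaussian variant of AMP is the MMSE denoiser: given a pseudo-observation r = x + N(0, tau) and a BG prior x ~ (1-λ)δ₀ + λN(θ, φ), compute the posterior mean and variance of x. The sketch below is an illustrative implementation under those standard assumptions, not the authors' exact algorithm; the function and variable names are hypothetical.

```python
import math

def gauss(x, mean, var):
    """Gaussian pdf N(x; mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bg_denoise(r, tau, lam, theta, phi):
    """MMSE estimate of x from r = x + N(0, tau), where the prior is
    Bernoulli-Gaussian: x ~ (1-lam)*delta_0 + lam*N(theta, phi).
    (Hypothetical sketch; names are illustrative.)"""
    # Evidence under the 'inactive' (x = 0) and 'active' hypotheses.
    p_zero = (1 - lam) * gauss(r, 0.0, tau)
    p_active = lam * gauss(r, theta, tau + phi)
    pi = p_active / (p_active + p_zero)      # posterior activity probability
    # Conditional (Gaussian-times-Gaussian) mean and variance given active.
    nu = 1.0 / (1.0 / tau + 1.0 / phi)
    gamma = nu * (r / tau + theta / phi)
    xhat = pi * gamma                        # posterior mean of x
    xvar = pi * (nu + gamma ** 2) - xhat ** 2  # posterior variance of x
    return xhat, xvar
```

In an EM wrapper of the kind the abstract describes, the posterior quantities (pi, xhat, xvar) returned across all coefficients would then be averaged to re-estimate the prior parameters (λ, θ, φ) and the noise variance at each EM iteration.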
Keywords :
Bayes methods; Gaussian noise; compressed sensing; expectation-maximisation algorithm; message passing; signal reconstruction; BG-AMP algorithm; Bayesian variation; Bernoulli-Gaussian approximate message passing; EM framework; expectation-maximization; marginal distribution; noise parameter; signal distribution; signal parameter; zero-mean Gaussian noise; Noise measurement; Sensors
Conference_Titel :
2011 Conference Record of the Forty Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
Conference_Location :
Pacific Grove, CA
Print_ISBN :
978-1-4673-0321-7
DOI :
10.1109/ACSSC.2011.6190117