Title :
Exploiting Statistical Dependencies in Sparse Representations for Signal Recovery
Author :
Peleg, Tomer ; Eldar, Yonina C. ; Elad, Michael
Author_Institution :
Dept. of Electr. Eng., Technion - Israel Inst. of Technol., Haifa, Israel
fDate :
5/1/2012
Abstract :
Signal modeling lies at the core of numerous signal and image processing applications. A recent approach that has drawn considerable attention is sparse representation modeling, in which the signal is assumed to be generated as a combination of a few atoms from a given dictionary. In this work we consider a Bayesian setting and go beyond the classic assumption of independence between the atoms. The main goal of this paper is to introduce a statistical model that takes such dependencies into account and show how this model can be used for sparse signal recovery. We follow the suggestion of two recent works and assume that the sparsity pattern is modeled by a Boltzmann machine, a commonly used graphical model. For general dependency models, exact MAP and MMSE estimation of the sparse representation becomes computationally complex. To simplify the computations, we propose greedy approximations of the MAP and MMSE estimators. We then consider a special case in which exact MAP is feasible, by assuming that the dictionary is unitary and the dependency model corresponds to a certain sparse graph. Exploiting this structure, we develop an efficient message passing algorithm that recovers the underlying signal. When the model parameters defining the underlying graph are unknown, we suggest an algorithm that learns these parameters directly from the data, leading to an iterative scheme for adaptive sparse signal recovery. The effectiveness of our approach is demonstrated on real-life signals, namely patches of natural images, where we compare the denoising performance to that of previous recovery methods that do not exploit the statistical dependencies.
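As a minimal sketch of the dependency model named in the abstract (the notation below is assumed for illustration and is not part of this record): a Boltzmann machine places a prior on the binary support vector $s \in \{-1,1\}^m$ indicating which of the $m$ dictionary atoms are active,
$$
p(s) = \frac{1}{Z} \exp\!\left( b^{T} s + \tfrac{1}{2}\, s^{T} W s \right),
$$
where $b \in \mathbb{R}^m$ controls the individual tendency of each atom to be active, the symmetric matrix $W \in \mathbb{R}^{m \times m}$ (with zero diagonal) encodes pairwise interactions between atoms, and $Z$ is the normalization constant. Setting $W = 0$ recovers the classic assumption of independent atoms, while a sparse $W$ corresponds to the sparse dependency graph exploited by the message passing algorithm described above.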
Keywords :
least mean squares methods; message passing; signal representation; Bayesian setting; Boltzmann machine; MAP estimation; MMSE estimation; adaptive sparse signal recovery; image processing; message passing algorithm; natural images; signal modeling; signal processing; sparse representation modeling; sparsity pattern; statistical dependencies; statistical model; Approximation algorithms; Atomic clocks; Bayesian methods; Computational modeling; Dictionaries; Discrete cosine transforms; Hidden Markov models; Bayesian estimation; Boltzmann machine (BM); MRF; decomposable model; denoising; greedy pursuit; image patches; maximum a posteriori (MAP); message passing; pseudo-likelihood; sequential subspace optimization (SESOP); signal synthesis; sparse representations; unitary dictionary;
Journal_Title :
IEEE Transactions on Signal Processing
DOI :
10.1109/TSP.2012.2188520