Title :
Parsimonious dictionary learning
Author :
Yaghoobi, Mehrdad ; Blumensath, Thomas ; Davies, Michael E.
Author_Institution :
Inst. for Digital Commun., Univ. of Edinburgh, Edinburgh
Abstract :
Sparse modeling of signals has recently received considerable attention. Often, a linear under-determined generative model is proposed for the signals of interest and a sparsity constraint is imposed on the representation. When the generative model is not given, choosing an appropriate one is important so that the given class of signals admits approximate sparse representations. In this paper we introduce a new scheme for dictionary learning and impose an additional constraint to reduce the dictionary size. Small dictionaries are desirable for coding applications and are more likely to "work" with suboptimal algorithms such as Basis Pursuit. Another benefit of small dictionaries is a faster implementation, e.g. fewer multiplications and additions in each matrix-vector multiplication, which is the bottleneck in sparse approximation algorithms.
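The following is a minimal, illustrative sketch (Python/NumPy) of the general idea described above: dictionary learning in which a crude sparse coding step alternates with a penalized dictionary update, where a column-wise shrinkage term drives rarely used atoms toward zero so the dictionary can be pruned. The coder, the penalty form, and all parameter values are assumptions made for illustration; this is not the majorization method developed in the paper.

import numpy as np

rng = np.random.default_rng(0)

def sparse_code(Y, D, k):
    """Crude sparse coding: keep the k largest-magnitude coefficients per signal."""
    X = np.linalg.pinv(D) @ Y                      # least-squares coefficients
    drop = np.argsort(np.abs(X), axis=0)[:-k, :]   # indices of the smaller entries
    np.put_along_axis(X, drop, 0.0, axis=0)        # zero everything but the top k
    return X

def learn_dictionary(Y, n_atoms=30, k=3, lam=0.5, n_iter=100, prune_tol=1e-6):
    """Alternate crude sparse coding with a size-penalized dictionary update."""
    m = Y.shape[0]
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        X = sparse_code(Y, D, k)
        # Gradient step on the fit term 0.5 * ||Y - D X||_F^2; the Frobenius norm
        # of X X^T upper-bounds its Lipschitz constant, so this step is stable.
        step = 1.0 / max(np.linalg.norm(X @ X.T), 1.0)
        D -= step * (D @ X - Y) @ X.T
        # Column-wise shrinkage: atoms the coder rarely uses get little gradient
        # from the fit term and are steadily shrunk toward zero norm.
        norms = np.linalg.norm(D, axis=0)
        D *= np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
        # Keep surviving atoms inside the unit ball so the penalty stays meaningful.
        norms = np.linalg.norm(D, axis=0)
        big = norms > 1.0
        D[:, big] /= norms[big]
    return D[:, np.linalg.norm(D, axis=0) > prune_tol]  # drop atoms shrunk to ~zero

# Toy usage: signals that are sparse in a 15-atom ground-truth dictionary.
D_true = rng.standard_normal((20, 15))
D_true /= np.linalg.norm(D_true, axis=0)
X_true = rng.standard_normal((15, 500)) * (rng.random((15, 500)) < 0.2)
Y = D_true @ X_true
print("learned dictionary shape:", learn_dictionary(Y).shape)

In practice the coding step would use a proper sparse approximation algorithm such as OMP or Basis Pursuit, and the weight lam trades reconstruction quality against the number of atoms retained.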
Keywords :
matrix multiplication; signal representation; linear under-determined generative model; matrix vector multiplication; parsimonious dictionary learning; sparse modeling; sparse representations; sparsity constraint; Approximation algorithms; Costs; Dictionaries; Digital communication; Image coding; Image processing; Pursuit algorithms; Signal generators; Signal processing; Sparse matrices; Dictionary Learning; Majorization Method; Sparse Approximation; Sparse Coding;
Conference_Title :
2009 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2009)
Conference_Location :
Taipei
Print_ISBN :
978-1-4244-2353-8
Electronic_ISBN :
1520-6149
DOI :
10.1109/ICASSP.2009.4960222