DocumentCode :
3522775
Title :
Parsimonious dictionary learning
Author :
Yaghoobi, Mehrdad ; Blumensath, Thomas ; Davies, Michael E.
Author_Institution :
Inst. for Digital Commun., Univ. of Edinburgh, Edinburgh
fYear :
2009
fDate :
19-24 April 2009
Firstpage :
2869
Lastpage :
2872
Abstract :
Sparse modeling of signals has recently received considerable attention. Often, a linear under-determined generative model is proposed for the signals of interest and a sparsity constraint is imposed on the representation. When the generative model is not given, choosing an appropriate one is important, so that the given class of signals admits approximate sparse representations. In this paper we introduce a new scheme for dictionary learning and impose an additional constraint to reduce the dictionary size. Small dictionaries are desirable for coding applications and are more likely to "work" with suboptimal algorithms such as Basis Pursuit. Another benefit of small dictionaries is faster implementation, e.g. fewer multiplications/additions in each matrix-vector product, which is the bottleneck in sparse approximation algorithms.
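The setting the abstract describes (learn a dictionary D so that signals Y have sparse codes X, with Y ≈ DX under-determined) can be illustrated with a generic alternating-minimization sketch. This is not the authors' parsimonious/majorization method; it is a minimal, assumed baseline using ISTA soft-thresholding for the sparse-coding step and a least-squares dictionary update, with all dimensions and the penalty `lam` chosen arbitrarily for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: signals generated from an unknown overcomplete dictionary
# with ~10% nonzero coefficients (hypothetical sizes for illustration).
n, m, N = 16, 32, 200                      # signal dim, # atoms, # signals
D_true = rng.standard_normal((n, m))
D_true /= np.linalg.norm(D_true, axis=0)   # unit-norm atoms
X_true = np.where(rng.random((m, N)) < 0.1,
                  rng.standard_normal((m, N)), 0.0)
Y = D_true @ X_true

def soft(x, t):
    """Soft-thresholding: proximal operator of the l1 sparsity penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Alternating minimization of ||Y - DX||_F^2 + lam*||X||_1:
# sparse coding (ISTA) alternates with a dictionary update.
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)
lam = 0.1
for _ in range(30):
    # Sparse-coding step: a few ISTA iterations with step 1/L.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    X = np.zeros((m, N))
    for _ in range(50):
        X = soft(X - (D.T @ (D @ X - Y)) / L, lam / L)
    # Dictionary-update step: least squares, then renormalize columns
    # (unit-norm atoms remove the scale ambiguity between D and X).
    D = Y @ np.linalg.pinv(X)
    D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)

rel_err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
```

The paper's contribution sits on top of a loop like this: an extra constraint in the dictionary-update step that penalizes the number of atoms, so unused columns of D can be pruned and the learned dictionary stays small.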
Keywords :
matrix multiplication; signal representation; linear under-determined generative model; matrix vector multiplication; parsimonious dictionary learning; sparse modeling; sparse representations; sparsity constraint; Approximation algorithms; Costs; Dictionaries; Digital communication; Image coding; Image processing; Pursuit algorithms; Signal generators; Signal processing; Sparse matrices; Dictionary Learning; Majorization Method; Sparse Approximation; Sparse Coding;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2009 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2009)
Conference_Location :
Taipei
ISSN :
1520-6149
Print_ISBN :
978-1-4244-2353-8
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2009.4960222
Filename :
4960222