Title :
Supervised Matrix Factorization with sparseness constraints and fast inference
Author :
Thom, Markus ; Schweiger, Roland ; Palm, Günther
Author_Institution :
Dept. Environ. Perception (GR/PAP), Daimler AG, Ulm, Germany
fDate :
July 31 - August 5, 2011
Abstract :
Non-negative Matrix Factorization is a technique for decomposing large data sets into bases and code words, where all entries of the resulting matrices are non-negative. A recently proposed variant also incorporates sparseness constraints, such that the number of nonzero entries in both bases and code words becomes controllable. This paper extends Non-negative Matrix Factorization with Sparseness Constraints in two ways. First, a modification of the optimization criterion ensures fast inference of the code words, making the approach real-time capable for time-critical applications. Second, when a teacher signal is associated with the samples, it is taken into account to ensure that inferred code words of different classes are well distinguished. The derived bases thus generate discriminative code words, a crucial prerequisite for training powerful classifiers. Experiments on natural image patches show, in line with recent results on sparse coding algorithms, that Gabor-like filters minimize the reconstruction error while retaining inference capabilities. Applying the approach with the teacher signal to handwritten digits, however, yields morphologically completely different bases while achieving superior classification results.
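To make the factorization setup in the abstract concrete, the following Python/NumPy snippet is a minimal illustrative sketch, not the authors' algorithm: it minimizes the Frobenius reconstruction error by projected gradient descent and uses a hypothetical top-k hard-thresholding projection (project_sparse, k_nonzero) as a simplified stand-in for the sparseness constraints described above; the fast-inference and teacher-signal extensions of the paper are not modeled here.

```python
# Illustrative sketch only: projected gradient NMF with a simple top-k
# sparseness projection on the code words (an assumption, not the paper's
# sparseness-constrained optimization).
import numpy as np

def project_sparse(H, k):
    """Project each column of H onto non-negative vectors with at most k nonzeros."""
    H = np.maximum(H, 0.0)
    for j in range(H.shape[1]):
        col = H[:, j]
        if k < col.size:
            threshold = np.partition(col, -k)[-k]
            col[col < threshold] = 0.0
    return H

def sparse_nmf(X, n_components, k_nonzero, n_iter=500, seed=0):
    """Factor a non-negative data matrix X (d x n) into bases W (d x r) and
    code words H (r x n), with at most k_nonzero nonzeros per code word."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.random((d, n_components))
    H = rng.random((n_components, n))
    for _ in range(n_iter):
        # Gradient step on H with step size 1 / Lipschitz constant, then project.
        grad_H = W.T @ (W @ H - X)
        H = project_sparse(H - grad_H / (np.linalg.norm(W.T @ W, 2) + 1e-12), k_nonzero)
        # Gradient step on W, then project onto the non-negative orthant.
        grad_W = (W @ H - X) @ H.T
        W = np.maximum(W - grad_W / (np.linalg.norm(H @ H.T, 2) + 1e-12), 0.0)
    return W, H

if __name__ == "__main__":
    # Toy data standing in for vectorized 8x8 image patches.
    X = np.abs(np.random.default_rng(1).normal(size=(64, 500)))
    W, H = sparse_nmf(X, n_components=25, k_nonzero=5)
    rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
    print(f"relative reconstruction error: {rel_err:.3f}")
```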
Keywords :
matrix algebra; optimisation; very large databases; Gabor-like filters; discriminative code words; fast inference; handwritten digits; inference capabilities; large data set decomposition; natural image patches; nonnegative matrix factorization; nonzero entries; optimization criteria; sparse coding algorithms; sparseness constraints; supervised matrix factorization; teacher signal; time critical applications; Computer architecture; Encoding; Matrix decomposition; Optimization; Sparse matrices; Training; Transfer functions;
Conference_Title :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033328