Title :
A diagonalized Newton algorithm for non-negative sparse coding
Author_Institution :
Dept. of Electrical Engineering (ESAT), KU Leuven, Leuven, Belgium
Abstract :
Signal models in which non-negative vector data are represented by a sparse linear combination of non-negative basis vectors have attracted much attention in problems including image classification, document topic modeling, sound source segregation and robust speech recognition. In this paper, an iterative algorithm based on Newton updates is proposed to minimize the Kullback-Leibler divergence between the data and the model. It finds the sparse activation weights of the basis vectors more efficiently than the expectation-maximization (EM) algorithm. To avoid the computational burden of a matrix inversion, a diagonal approximation of the Hessian is made; the resulting method is therefore called the diagonal Newton algorithm (DNA). DNA is several times faster than EM, especially on undercomplete problems, and it also performs surprisingly well on overcomplete problems.
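As a rough illustration of the update described above, the following Python sketch applies projected Newton steps with a diagonal Hessian approximation to the generalized KL-divergence cost of a non-negative sparse coding model. The function name diag_newton_nsc, the sparsity weight lam, the initialization, and the simple projection onto the non-negative orthant are assumptions for illustration; this is not the paper's exact DNA algorithm, which includes additional safeguards.

import numpy as np

def diag_newton_nsc(v, W, lam=0.1, n_iter=100, eps=1e-12):
    """Hypothetical sketch: estimate non-negative sparse activations h with
    v ~= W @ h by minimizing the generalized KL divergence D(v || W h)
    plus lam * sum(h), using diagonal-Newton steps (not the paper's exact DNA)."""
    K = W.shape[1]
    # Positive initialization scaled so that sum(W @ h) roughly matches sum(v).
    h = np.full(K, v.sum() / (W.sum() + eps))
    col_sum = W.sum(axis=0)                          # sum_f W[f, k]
    for _ in range(n_iter):
        r = W @ h + eps                              # current model W h
        grad = col_sum - W.T @ (v / r) + lam         # gradient of KL + L1 term
        hess_diag = (W.T ** 2) @ (v / r ** 2) + eps  # diagonal of the Hessian
        h = np.maximum(h - grad / hess_diag, 0.0)    # projected Newton step
    return h

# Illustrative usage with synthetic non-negative data.
rng = np.random.default_rng(0)
W = rng.random((40, 10))             # 40-dim observations, 10 basis vectors
h_true = np.zeros(10)
h_true[[2, 7]] = [1.0, 0.5]          # sparse ground-truth activations
v = W @ h_true
h_est = diag_newton_nsc(v, W, lam=0.01)

Skipping the off-diagonal Hessian terms replaces a K-by-K matrix inversion per update with an elementwise division, which is the source of the speed advantage claimed over EM-style multiplicative updates.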
Keywords :
Newton method; approximation theory; encoding; matrix inversion; signal processing; vectors; Kullback-Leibler divergence minimisation; Newton updates; diagonal Newton algorithm; diagonal approximation; diagonalized Newton algorithm; iterative algorithm; nonnegative basis vectors; nonnegative sparse coding; nonnegative vector data; signal models; sparse activation weights; sparse linear combination; Approximation algorithms; Approximation methods; Convergence; DNA; Dictionaries; Speech recognition; Vectors; Kullback-Leibler divergence; non-negative matrix factorization; source separation; sparse coding; vocabulary acquisition
Conference_Title :
2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Vancouver, BC, Canada
DOI :
10.1109/ICASSP.2013.6639080