Title :
Learning Stable Multilevel Dictionaries for Sparse Representations
Author :
Thiagarajan, Jayaraman J. ; Natesan Ramamurthy, Karthikeyan ; Spanias, Andreas
Author_Institution :
Sensor Signal and Information Processing Center, Arizona State University, Tempe, AZ, USA
Abstract :
Sparse representations using learned dictionaries are increasingly being used with success in several data processing and machine learning applications. The growing need for learning sparse models in large-scale applications motivates the development of efficient, robust, and provably good dictionary learning algorithms. Algorithmic stability and generalizability are desirable characteristics for dictionary learning algorithms that aim to build global dictionaries capable of efficiently modeling any test data similar to the training samples. In this paper, we propose an algorithm to learn dictionaries for sparse representations from large-scale data and prove that the proposed learning algorithm is asymptotically stable and generalizable. The algorithm employs a 1-D subspace clustering procedure, K-hyperline clustering, to learn a hierarchical dictionary with multiple levels. We also propose an information-theoretic scheme to estimate the number of atoms needed at each level of learning and develop an ensemble approach to learn robust dictionaries. Using the proposed dictionaries, the sparse code for novel test data can be computed with a low-complexity pursuit procedure. We demonstrate the stability and generalization characteristics of the proposed algorithm using simulations, and evaluate the utility of the multilevel dictionaries in compressed recovery and subspace learning applications.
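The sketch below, in Python with NumPy, illustrates the two ideas summarized in the abstract: one level of K-hyperline clustering, where each cluster center is a 1-D subspace updated as the principal direction of its assigned samples, and a greedy multilevel scheme that fits each successive level to the residual left by the previous one. Function names such as k_hyperline and multilevel_dictionary, and parameters such as atoms_per_level and n_iter, are illustrative assumptions and not the paper's notation; the information-theoretic estimation of the number of atoms per level and the ensemble variant are not shown.

```python
import numpy as np

def k_hyperline(X, K, n_iter=50, seed=0):
    """One level of K-hyperline clustering (sketch).

    X: (d, n) data matrix with samples as columns.
    Returns D: (d, K) matrix of unit-norm cluster directions (atoms).
    Assignment uses maximum absolute correlation; the update is the
    principal direction (top left singular vector) of the assigned samples.
    """
    rng = np.random.default_rng(seed)
    # Initialize atoms with randomly chosen, unit-normalized data samples.
    D = X[:, rng.choice(X.shape[1], K, replace=False)].copy()
    D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
    for _ in range(n_iter):
        # Assign each sample to the line it correlates with most strongly.
        corr = np.abs(D.T @ X)                      # (K, n)
        labels = np.argmax(corr, axis=0)
        for k in range(K):
            Xk = X[:, labels == k]
            if Xk.shape[1] == 0:
                continue
            # Replace the atom by the principal direction of its cluster.
            U, _, _ = np.linalg.svd(Xk, full_matrices=False)
            D[:, k] = U[:, 0]
    return D

def multilevel_dictionary(X, atoms_per_level, n_iter=50):
    """Sketch of multilevel dictionary learning: each level runs
    K-hyperline clustering on the residual from the previous level."""
    R = X.copy()
    levels = []
    for K in atoms_per_level:
        D = k_hyperline(R, K, n_iter=n_iter)
        # 1-sparse approximation at this level (atoms are unit norm).
        coeffs = D.T @ R
        best = np.argmax(np.abs(coeffs), axis=0)
        approx = D[:, best] * coeffs[best, np.arange(R.shape[1])]
        R = R - approx
        levels.append(D)
    return levels
```

Sparse codes for novel test data would be computed analogously to the residual loop above: at each level, pick the single best-correlated atom, record its coefficient, and pass the residual to the next level, which corresponds to the low-complexity pursuit mentioned in the abstract.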
Keywords :
learning (artificial intelligence); pattern clustering; signal representation; 1D subspace clustering procedure; algorithmic stability; compressed recovery applications; data processing applications; dictionary learning algorithms; ensemble approach; generalizability; global dictionaries; hierarchical dictionary; information-theoretic scheme; K-hyperline clustering; large scale data; low-complexity pursuit procedure; machine learning applications; multilevel dictionaries; sparse code; sparse representations; subspace learning applications; test data; Algorithm design and analysis; Asymptotic stability; Clustering algorithms; Dictionaries; Stability analysis; Training; Vectors; Compressed sensing; dictionary learning; generalization; sparse representations; stability
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
DOI :
10.1109/TNNLS.2014.2361052