Title :
Joint multiple dictionary learning for Tensor sparse coding
Author :
Yifan Fu ; Junbin Gao ; Yanfeng Sun ; Xia Hong
Author_Institution :
Sch. of Comput. & Math., Charles Sturt Univ., Bathurst, NSW, Australia
Abstract :
Traditional dictionary learning algorithms find a sparse representation of high-dimensional data by transforming each sample into a one-dimensional (1D) vector. This 1D model loses the inherent spatial structure of the data. An alternative solution is to employ tensor decomposition for dictionary learning on the data's original structural form - a tensor - by learning multiple dictionaries, one along each mode, and the corresponding sparse representation with respect to the Kronecker product of these dictionaries. To learn the dictionaries along each mode, all existing methods update each dictionary iteratively in an alternating manner. Because atoms from every mode dictionary jointly contribute to the sparsity of the tensor, these methods, by treating each mode dictionary independently, ignore the correlations between atoms of different mode dictionaries. In this paper, we propose a joint multiple dictionary learning method for tensor sparse coding, which exploits atom correlations in the sparse representation and updates multiple atoms from each mode dictionary simultaneously. In this algorithm, the Frequent-Pattern Tree (FP-tree) mining algorithm is employed to discover frequent atom patterns in the sparse representation. Inspired by the idea of K-SVD, we develop a new dictionary update method that jointly updates the atoms in each pattern. Experimental results demonstrate that our method outperforms other tensor-based dictionary learning algorithms.
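The Kronecker-product structure underlying the abstract can be illustrated with a minimal NumPy sketch (made-up dimensions and random dictionaries, not the authors' code): reconstructing a 2D signal from two mode dictionaries and a sparse coefficient core is equivalent to 1D sparse coding with the Kronecker product of those dictionaries.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 6x8 signal, mode dictionaries with 10 and 12 atoms.
D1 = rng.standard_normal((6, 10))   # mode-1 dictionary
D2 = rng.standard_normal((8, 12))   # mode-2 dictionary
S = np.zeros((10, 12))              # sparse coefficient core
S[2, 5] = 1.3                       # a few active atom pairs: each nonzero
S[7, 1] = -0.8                      # couples one atom from D1 with one from D2

# Reconstruction using the per-mode dictionaries...
X = D1 @ S @ D2.T

# ...equals 1D sparse coding with the Kronecker dictionary:
# vec(D1 @ S @ D2.T) == (D2 kron D1) @ vec(S), with column-major vec().
x_vec = np.kron(D2, D1) @ S.reshape(-1, order="F")
assert np.allclose(x_vec, X.reshape(-1, order="F"))
```

This equivalence is why each nonzero coefficient ties together one atom per mode, and hence why the paper argues that mode dictionaries should not be updated independently of one another.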
Keywords :
data mining; dictionaries; image coding; image denoising; learning (artificial intelligence); tensors; trees (mathematics); vectors; FP-tree mining algorithm; 1D model; 1D vector; K-SVD; Kronecker product; atom correlations; data spatial structure property; dictionary update method; frequent-pattern tree mining algorithm; high dimensional data; image coding; image denoising; joint multiple dictionary learning; mode dictionaries; one-dimensional vector; sparse representation; tensor based dictionary learning algorithm; tensor decomposition; tensor sparse coding; tensor sparsity; Correlation; Databases; Dictionaries; Encoding; Joints; Tensile stress; Vectors;
Conference_Titel :
Neural Networks (IJCNN), 2014 International Joint Conference on
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
DOI :
10.1109/IJCNN.2014.6889490