DocumentCode :
1797428
Title :
Semi-supervised sparse coding
Author :
Wang, Jim Jing-Yan ; Gao, Xin
Author_Institution :
State Univ. of New York, Buffalo, NY, USA
fYear :
2014
fDate :
6-11 July 2014
Firstpage :
1630
Lastpage :
1637
Abstract :
Sparse coding approximates a data sample as a sparse linear combination of basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By exploiting the manifold structure spanned by the data set of both labeled and unlabeled samples, together with the constraints provided by the known labels, we learn class-label variables for all samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels can be predicted directly from the sparse codes by a linear classifier. By solving for the codebook, the sparse codes, the class labels, and the classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed method over supervised sparse coding methods on partially labeled data sets.
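The abstract describes a single objective coupling the codebook, the sparse codes, the class labels, and a linear classifier, optimized jointly on partially labeled data. The paper's exact formulation and solver are not reproduced in this record, so the listing below is only a minimal sketch of one plausible alternating-minimization scheme, assuming the objective combines reconstruction error, an l1 sparsity penalty, a linear classification loss, and a graph Laplacian smoothness term over a k-nearest-neighbor graph. All function names, hyperparameters, and update rules are illustrative assumptions, not taken from the paper.

# Minimal sketch of semi-supervised sparse coding via alternating minimization.
# Assumed objective (illustrative, not the paper's exact formulation):
#   min_{D,S,W,Y}  ||X - D S||_F^2 + alpha ||S||_1
#                + beta ||W^T S - Y||_F^2 + gamma tr(Y L Y^T)
# with Y fixed to the given one-hot labels on the labeled samples.
import numpy as np

def soft_threshold(A, t):
    """Element-wise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized k-NN graph with 0/1 weights."""
    n = X.shape[1]
    d2 = np.sum((X[:, :, None] - X[:, None, :]) ** 2, axis=0)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]          # skip the point itself
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)                        # symmetrize
    return np.diag(W.sum(axis=1)) - W

def semi_supervised_sparse_coding(X, y, n_atoms=32, n_classes=None,
                                  alpha=0.1, beta=1.0, gamma=0.1,
                                  n_iter=50, seed=None):
    """X: (d, n) data matrix; y: length-n label vector with -1 for unlabeled samples."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    c = n_classes or int(y.max()) + 1
    labeled = y >= 0

    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    S = np.zeros((n_atoms, n))
    W = np.zeros((n_atoms, c))
    Y = np.full((c, n), 1.0 / c)                  # soft labels for unlabeled samples
    Y[:, labeled] = 0.0
    Y[y[labeled], np.where(labeled)[0]] = 1.0     # one-hot columns for labeled samples
    L = knn_laplacian(X)

    for _ in range(n_iter):
        # Sparse codes S: one proximal-gradient (ISTA) step on the smooth terms.
        grad = D.T @ (D @ S - X) + beta * W @ (W.T @ S - Y)
        step = 1.0 / (np.linalg.norm(D, 2) ** 2 + beta * np.linalg.norm(W, 2) ** 2 + 1e-8)
        S = soft_threshold(S - step * grad, step * alpha)

        # Codebook D: regularized least squares, then renormalize the atoms.
        D = X @ S.T @ np.linalg.inv(S @ S.T + 1e-6 * np.eye(n_atoms))
        D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12

        # Classifier W: ridge regression from the codes to the current labels.
        W = np.linalg.solve(S @ S.T + 1e-6 * np.eye(n_atoms), S @ Y.T)

        # Labels Y (unlabeled only): balance classifier output and graph smoothness.
        F = W.T @ S                               # classifier predictions, shape (c, n)
        Y_new = np.linalg.solve(beta * np.eye(n) + gamma * L, beta * F.T).T
        Y[:, ~labeled] = Y_new[:, ~labeled]       # labeled columns stay fixed

    return D, S, W, Y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 100))
    y = np.full(100, -1)
    y[:10] = rng.integers(0, 3, 10)               # only 10 of 100 samples are labeled
    D, S, W, Y = semi_supervised_sparse_coding(X, y, n_classes=3)
    print("predicted labels:", Y.argmax(axis=0)[:20])

In this sketch the codebook and classifier updates are closed-form ridge solutions and the codes take a single ISTA step per outer iteration; the paper's actual optimization procedure may differ.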
Keywords :
codes; data structures; pattern classification; class labels; codebook; codewords; data representation; data sample approximation; learning discriminative sparse codes; linear classifier; manifold structure; semisupervised sparse coding; sparse linear combination; unified objective function; Compounds; Encoding; Linear programming; Optimization; Sparse matrices; Training; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2014 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Beijing, China
Print_ISBN :
978-1-4799-6627-1
Type :
conf
DOI :
10.1109/IJCNN.2014.6889449
Filename :
6889449