Title :
A Method of Discriminative Information Preservation and In-Dimension Distance Minimization for Feature Selection
Author :
Shangrong Huang ; Jian Zhang ; Xinwang Liu ; Lei Wang
Author_Institution :
Adv. Analytics Inst., Univ. of Technol. Sydney, Sydney, NSW, Australia
Abstract :
Preserving samples' pairwise similarity is essential for feature selection. In supervised learning, labels can be used as a direct measure of whether two samples are similar to each other. In unsupervised learning, however, such similarity information is usually unavailable. In this paper, we propose a new feature selection method based on spectral clustering, which uses discriminative information as the underlying data structure. The Laplacian matrix is used to obtain more partitioning information than previously proposed structures such as the eigenspace of the original data. The high-dimensional sample data are projected into a low-dimensional space, and the in-dimension distance is also minimized to obtain a more compact clustering result. The proposed method can be solved efficiently by alternately updating the projection matrix and its inverse normalized diagonal matrix. A comprehensive experimental study demonstrates that the proposed method outperforms many state-of-the-art feature selection algorithms under different criteria, including clustering/classification accuracy and Jaccard score.
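To make the similarity-preservation idea concrete, the sketch below shows a generic Laplacian-score-style ranking of features against a sample-similarity graph. It is not the authors' algorithm (their method learns a projection matrix and minimizes in-dimension distance); the function name, the Gaussian-affinity construction, and the sigma parameter are illustrative assumptions.

```python
# Hypothetical sketch: rank features by how well each one preserves the
# pairwise-similarity structure encoded in a graph Laplacian (lower = better).
# This is a generic spectral criterion, not the paper's exact objective.
import numpy as np

def laplacian_style_feature_scores(X, sigma=1.0):
    """X: (n_samples, n_features). Returns one relevance score per feature."""
    n = X.shape[0]
    # Gaussian affinity W from pairwise squared Euclidean distances.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # unnormalized graph Laplacian
    ones = np.ones(n)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        f = X[:, j]
        # Project out the trivial all-ones direction before scoring.
        f = f - (f @ D @ ones) / (ones @ D @ ones) * ones
        denom = f @ D @ f
        scores[j] = (f @ L @ f) / denom if denom > 1e-12 else np.inf
    return scores

# Usage: keep the 10 features that best preserve the similarity graph.
X = np.random.rand(100, 50)
selected = np.argsort(laplacian_style_feature_scores(X))[:10]
```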
Keywords :
feature selection; matrix algebra; pattern clustering; unsupervised learning; Laplacian matrix; discriminative information preservation; in-dimension distance minimization; inverse normalized diagonal matrix; projection matrix; spectral clustering; Accuracy; Clustering algorithms; Face; Feature extraction; Laplace equations; Support vector machines;
Conference_Titel :
Pattern Recognition (ICPR), 2014 22nd International Conference on
Conference_Location :
Stockholm
DOI :
10.1109/ICPR.2014.286