Title of article :
Iterative Nearest Neighbors
Author/Authors :
Timofte, Radu and Van Gool, Luc
Issue Information :
Journal with serial number, year 2015
Pages :
13
From page :
60
To page :
72
Abstract :
Representing data as a linear combination of a set of selected known samples is of interest for various machine learning applications such as dimensionality reduction or classification. k-Nearest Neighbors (kNN) and its variants are still among the best-known and most widely used techniques. Some popular richer representations are Sparse Representation (SR), based on solving an l1-regularized least squares formulation; Collaborative Representation (CR), based on l2-regularized least squares; and Locally Linear Embedding (LLE), based on an l1-constrained least squares problem. We propose a novel sparse representation, the Iterative Nearest Neighbors (INN). It combines the power of SR and LLE with the computational simplicity of kNN. We empirically validate our representation in terms of sparse support signal recovery and compare with Matching Pursuit (MP) and Orthogonal Matching Pursuit (OMP), two other similar iterative methods. We also test our method in terms of dimensionality reduction and classification, using standard benchmarks for faces (AR), traffic signs (GTSRB), and objects (PASCAL VOC 2007). INN compares favorably to NN, MP, and OMP, and is on par with CR and SR, while being orders of magnitude faster than the latter. On the downside, INN does not scale well with higher dimensionalities of the data.
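The abstract positions INN against the greedy iterative baselines Matching Pursuit (MP) and Orthogonal Matching Pursuit (OMP). As a minimal sketch of that iterative select-and-subtract family (this illustrates plain MP, not the authors' INN algorithm), one could write in NumPy:

```python
import numpy as np

def matching_pursuit(D, y, n_iter=10):
    """Greedy Matching Pursuit: approximate y as a sparse linear
    combination of the columns (atoms) of dictionary D.
    Assumes the atoms have unit l2 norm."""
    residual = y.astype(float).copy()
    coef = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Select the atom most correlated with the current residual.
        corr = D.T @ residual
        j = int(np.argmax(np.abs(corr)))
        # Add its contribution and subtract it from the residual.
        coef[j] += corr[j]
        residual -= corr[j] * D[:, j]
    return coef, residual
```

With a trivial orthonormal dictionary such as the identity, MP recovers the signal exactly in as many iterations as there are nonzero coefficients; with correlated atoms the residual only shrinks, which is where OMP's re-projection step (and, per the abstract, INN's kNN-style selection) differ.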
Keywords :
Least squares , Classification , Collaborative representation , Sparse representation , Iterative Nearest Neighbors , Dimensionality reduction
Journal title :
PATTERN RECOGNITION
Serial Year :
2015
Record number :
1879839