Author/Authors :
Zhang Qi Yu, Yoan Miche, Antti Sorjamaa, Alberto Guillen, Amaury Lendasse, and Eric Severin
Abstract :
This paper presents a methodology named Optimally Pruned K-Nearest Neighbors (OP-KNN) which has the advantage of competing with state-of-the-art methods while remaining fast. It builds a one hidden-layer feedforward neural network using K-Nearest Neighbors as kernels to perform regression. Multiresponse Sparse Regression (MRSR) is used to rank each kth nearest neighbor, and finally Leave-One-Out estimation is used to select the optimal number of neighbors and to estimate the generalization performance. Since the computational time of this method is small, this paper presents a strategy using OP-KNN to perform Variable Selection, which is tested successfully on eight real-life data sets from different application fields. In summary, the most significant characteristic of this method is that it provides good performance and a comparatively simple model at extremely high learning speed.
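To make the pipeline described above concrete, here is a minimal sketch of an OP-KNN-style regressor. It is not the authors' implementation: the ranking step simply keeps the natural 1..k neighbor ordering instead of MRSR, and the Leave-One-Out error is computed with the closed-form PRESS statistic for a linear output layer; all function and variable names are illustrative.

```python
# Illustrative OP-KNN-style sketch (assumptions noted above, not the paper's code).
import numpy as np
from scipy.spatial import cKDTree


def loo_press(H, y):
    """Closed-form leave-one-out (PRESS) error for a linear output layer."""
    P = H @ np.linalg.pinv(H)                      # hat matrix via pseudoinverse
    residuals = y - P @ y
    leverage = np.clip(np.diag(P), 0.0, 0.999999)  # guard against division by ~0
    return np.mean((residuals / (1.0 - leverage)) ** 2)


def fit_op_knn(X, y, k_max=20):
    """Build KNN 'hidden layer', pick the neighbor count by LOO, fit output weights."""
    tree = cKDTree(X)
    # Query k_max + 1 neighbors; column 0 is the point itself, so drop it.
    _, idx = tree.query(X, k=k_max + 1)
    H_full = y[idx[:, 1:]]                         # (n, k_max): targets of neighbors as kernels
    # Evaluate LOO error for each candidate number of neighbors and keep the best.
    errors = [loo_press(H_full[:, :k], y) for k in range(1, k_max + 1)]
    best_k = int(np.argmin(errors)) + 1
    H = H_full[:, :best_k]
    weights = np.linalg.pinv(H) @ y                # output-layer weights by least squares
    return best_k, weights


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 3))
    y = np.sin(X.sum(axis=1)) + 0.05 * rng.standard_normal(200)
    best_k, w = fit_op_knn(X, y)
    print("selected number of neighbors:", best_k)
```

In the same spirit as the abstract, the expensive part is only the nearest-neighbor search and a few small least-squares fits, which is why the overall method remains fast enough to be wrapped inside a variable-selection loop.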