DocumentCode
2035752
Title
Optimizing the k-NN metric weights using differential evolution
Author
AlSukker, Akram; Khushaba, Rami; Al-Ani, Ahmad
Author_Institution
University of Technology, Sydney, NSW, Australia
fYear
2010
fDate
2-4 March 2010
Firstpage
89
Lastpage
92
Abstract
The traditional k-NN classifier suffers from several limitations: it does not take into account the distribution of each class, the importance of each feature, the contribution of each neighbor, or the number of instances per class. A differential evolution (DE) optimization technique is utilized to enhance the performance of k-NN by optimizing the metric weights of features, neighbors and classes. Several datasets are used to evaluate the performance of the proposed DE-based metrics and to compare them with some k-NN variants from the literature. Practical experiments indicate that, in most cases, incorporating DE in k-NN classification provides more accurate performance.
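The following is a minimal sketch (not the authors' code) of the idea described in the abstract: weight the k-NN metric at three levels, features, neighbors and classes, and tune those weights with differential evolution. The toy dataset, the value of k, the bounds and the DE settings below are illustrative assumptions, and SciPy's differential_evolution is used in place of whatever DE implementation the paper employs.

import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Toy 2-class data with 4 features (placeholder for the paper's datasets).
n, d, k = 200, 4, 5
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)
classes = np.unique(y)
X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

def weighted_knn_accuracy(w):
    """Validation accuracy of k-NN using feature, neighbor (rank) and class weights."""
    w_feat = w[:d]                 # scales each feature in the distance
    w_nbr = w[d:d + k]             # weights the vote of each ranked neighbor
    w_cls = w[d + k:]              # per-class vote weights
    correct = 0
    for xq, yq in zip(X_va, y_va):
        dist = np.sqrt(((X_tr - xq) ** 2 * w_feat).sum(axis=1))
        nearest = np.argsort(dist)[:k]
        votes = np.zeros(len(classes))
        for rank, idx in enumerate(nearest):
            votes[y_tr[idx]] += w_nbr[rank] * w_cls[y_tr[idx]]
        correct += int(classes[np.argmax(votes)] == yq)
    return correct / len(y_va)

# DE searches for the weight vector that maximises validation accuracy
# (SciPy minimises, so the objective is negated).
bounds = [(0.0, 1.0)] * (d + k + len(classes))
result = differential_evolution(lambda w: -weighted_knn_accuracy(w),
                                bounds, seed=0, maxiter=30, tol=1e-3)
print("best validation accuracy:", -result.fun)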
Keywords
learning (artificial intelligence); optimisation; pattern classification; differential evolution; k-NN classifier; k-NN metric weights; optimization technique; Australia; Classification algorithms; Euclidean distance; H infinity control; Machine learning; Machine learning algorithms; Nearest neighbor searches; Testing; Voting; Weight measurement;
fLanguage
English
Publisher
IEEE
Conference_Title
2010 International Conference on Multimedia Computing and Information Technology (MCIT)
Conference_Location
Sharjah
Print_ISBN
978-1-4244-7001-3
Type
conf
DOI
10.1109/MCIT.2010.5444845
Filename
5444845