  • DocumentCode
    527366
  • Title
    2-Stage instance selection algorithm for KNN based on Nearest Unlike Neighbors
  • Author
    Dong, Chun-Ru; Chan, Patrick P. K.; Ng, Wing W. Y.; Yeung, Daniel S.
  • Author_Institution
    Machine Learning & Cybern. Res. Center, South China Univ. of Technol., Guangzhou, China
  • Volume
    1
  • fYear
    2010
  • fDate
    11-14 July 2010
  • Firstpage
    134
  • Lastpage
    140
  • Abstract
    Owing to virtues such as its simplicity, high generalization capability, and low training cost, the K-Nearest-Neighbor (KNN) classifier is widely used in pattern recognition and machine learning. However, its computational complexity grows considerably when classifying large data sets, which greatly reduces its efficiency. This paper proposes a general two-stage training-set condensing algorithm for the KNN classifier. First, noisy data points are identified and removed from the original training set. Second, a general condensed nearest neighbor rule based on the so-called Nearest Unlike Neighbor (NUN) is presented to further eliminate redundant samples from the training set. To verify the performance of the proposed method, numerical experiments are conducted on several UCI benchmark data sets.
    An illustrative sketch of one possible two-stage selection procedure is given after this record.
  • Keywords
    pattern classification; K-nearest-neighbor; KNN classifier; condensed nearest neighbor rule; instance selection algorithm; nearest unlike neighbor; Artificial neural networks; Classification algorithms; Nearest neighbor searches; Noise; Noise measurement; Testing; Training; Condensed nearest neighbor rule; KNN; Nearest unlike neighbor
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2010 International Conference on Machine Learning and Cybernetics (ICMLC)
  • Conference_Location
    Qingdao, China
  • Print_ISBN
    978-1-4244-6526-2
  • Type
    conf
  • DOI
    10.1109/ICMLC.2010.5581078
  • Filename
    5581078
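
  The abstract above describes a two-stage instance selection scheme (noise removal, then NUN-based condensation) but not its exact rules. The Python sketch below only illustrates that general idea and is not the authors' algorithm: it assumes an ENN-style majority-vote filter for the noise-removal stage, a keep-the-boundary-points criterion based on each sample's distance to its Nearest Unlike Neighbor for the condensation stage, integer-coded class labels, and a hypothetical keep_ratio parameter controlling how many samples are retained.

```python
import numpy as np

def pairwise_dist(X):
    """Full Euclidean distance matrix (fine for small, UCI-sized data sets)."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def two_stage_selection(X, y, k=3, keep_ratio=0.3):
    """Return a condensed (X, y); labels y must be integer-coded (0, 1, ...)."""
    D = pairwise_dist(X)
    np.fill_diagonal(D, np.inf)  # a point is not its own neighbour

    # Stage 1 (assumed noise filter): drop samples whose label disagrees
    # with the majority label of their k nearest neighbours (ENN-style).
    nn_idx = np.argsort(D, axis=1)[:, :k]
    majority = np.array([np.bincount(y[row]).argmax() for row in nn_idx])
    clean = majority == y
    Xc, yc = X[clean], y[clean]

    # Stage 2 (assumed condensation criterion): compute each remaining
    # sample's distance to its Nearest Unlike Neighbor (closest point of a
    # different class) and keep only the points nearest the class boundary.
    Dc = pairwise_dist(Xc)
    unlike = yc[:, None] != yc[None, :]
    nun_dist = np.where(unlike, Dc, np.inf).min(axis=1)
    n_keep = max(1, int(keep_ratio * len(yc)))
    keep = np.argsort(nun_dist)[:n_keep]
    return Xc[keep], yc[keep]

# Example usage on synthetic two-class data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)
    Xs, ys = two_stage_selection(X, y, k=3, keep_ratio=0.2)
    print(Xs.shape, ys.shape)
```

  With keep_ratio=0.2, the example retains roughly 20% of the noise-filtered samples, favoring those closest to the opposite class; the retained set would then serve as the reference set for an ordinary KNN classifier.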