• DocumentCode
    2223940
  • Title
    Adaptive metric nearest neighbor classification
  • Author
    Domeniconi, Carlotta; Peng, Jing; Gunopulos, Dimitrios
  • Author_Institution
    Dept. of Computer Science, University of California, Riverside, CA, USA
  • Volume
    1
  • fYear
    2000
  • fDate
    2000
  • Firstpage
    517
  • Abstract
    Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality, and severe bias can be introduced under these conditions when the nearest neighbor rule is used. We propose a locally adaptive nearest neighbor classification method that aims to minimize this bias. We use a Chi-squared distance analysis to compute a flexible metric that produces neighborhoods highly adaptive to the query location: neighborhoods are elongated along less relevant feature dimensions and constricted along the most influential ones. As a result, the class conditional probabilities tend to be smoother in the modified neighborhoods, and better classification performance can be achieved. The efficacy of our method is validated and compared against other techniques on a variety of simulated and real-world data.
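  • Illustrative_Sketch
    A minimal sketch of the idea described in the abstract, written in Python with NumPy: per-feature relevance at a query point is estimated with a Chi-squared-style comparison of class probabilities, and the resulting weights constrict the neighborhood along influential features while elongating it along weakly relevant ones before a weighted nearest neighbor vote. This is a simplification under stated assumptions (single-feature conditioning, a fixed candidate neighborhood, hypothetical function names), not the authors' exact estimator.

        import numpy as np

        def local_feature_weights(X, y, query, k=50, eps=1e-9):
            """Chi-squared-style relevance of each feature near `query`:
            how much conditioning on a single feature shifts the locally
            estimated class probabilities (illustrative simplification)."""
            classes = np.unique(y)
            # Plain Euclidean neighborhood around the query.
            d = np.linalg.norm(X - query, axis=1)
            nbrs = np.argsort(d)[:k]
            p_local = np.array([(y[nbrs] == c).mean() for c in classes])

            relevance = np.zeros(X.shape[1])
            for j in range(X.shape[1]):
                # Neighborhood measured along feature j only.
                nbrs_j = np.argsort(np.abs(X[:, j] - query[j]))[:k]
                p_j = np.array([(y[nbrs_j] == c).mean() for c in classes])
                # Discrepancy between the conditioned and local estimates.
                relevance[j] = np.sum((p_j - p_local) ** 2 / (p_local + eps))
            # Influential features get large weights (constricted neighborhood),
            # weakly relevant ones get small weights (elongated neighborhood).
            w = relevance + eps
            return w / w.sum()

        def adaptive_knn_predict(X, y, query, k=5, k_rel=50):
            """Majority vote among the k nearest neighbors under the locally
            adapted, feature-weighted Euclidean metric."""
            w = local_feature_weights(X, y, query, k=k_rel)
            d = np.sqrt(((X - query) ** 2 * w).sum(axis=1))
            nbrs = np.argsort(d)[:k]
            vals, counts = np.unique(y[nbrs], return_counts=True)
            return vals[np.argmax(counts)]

        # Toy usage: only feature 0 carries class information, so the adapted
        # metric should weight it heavily.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 5))
        y = (X[:, 0] > 0).astype(int)
        print(adaptive_knn_predict(X, y, rng.normal(size=5)))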
  • Keywords
    image classification; class conditional probabilities; classification performance; conditional probabilities; dimensionality; flexible metric; nearest neighbor classification; Computer science; Error analysis; Euclidean distance; Kernel; Nearest neighbor searches; Q measurement
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Computer Vision and Pattern Recognition, 2000. Proceedings. IEEE Conference on
  • Conference_Location
    Hilton Head Island, SC
  • ISSN
    1063-6919
  • Print_ISBN
    0-7695-0662-3
  • Type
    conf
  • DOI
    10.1109/CVPR.2000.855863
  • Filename
    855863