  • DocumentCode
    1253141
  • Title
    Minimax nonparametric classification. I. Rates of convergence
  • Author
    Yang, Yuhong
  • Author_Institution
    Dept. of Stat., Iowa State Univ., Ames, IA, USA
  • Volume
    45
  • Issue
    7
  • fYear
    1999
  • fDate
    11/1/1999
  • Firstpage
    2271
  • Lastpage
    2284
  • Abstract
    This paper studies minimax aspects of nonparametric classification. We first study minimax estimation of the conditional probability of a class label given the feature variable. This function, say f, is assumed to lie in a general nonparametric class. We show that the minimax rate of convergence under squared L2 loss is determined by the massiveness of the class as measured by its metric entropy. The second part of the paper studies minimax classification. The loss of interest is the difference between the probability of misclassification of a classifier and that of the Bayes decision. As is well known, an upper bound on the risk for estimating f gives an upper bound on the risk for classification, but the resulting rate is known to be suboptimal for the class of monotone functions. This suggests that one does not have to estimate f well in order to classify well. However, we show that the two problems are in fact of the same difficulty in terms of rates of convergence under a sufficient condition, which is satisfied by many function classes, including Besov (Sobolev), Lipschitz, and bounded variation. This is somewhat surprising in view of a result of Devroye, Györfi, and Lugosi (see A Probabilistic Theory of Pattern Recognition, New York: Springer-Verlag, 1996).
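    As a sketch of the two relations the abstract invokes (standard formulations from the minimax literature, not taken from the paper itself; the notation M(ε), ε_n, ĝ, g* is introduced here for illustration): writing M(ε) for the metric entropy (the log ε-covering number) of the class containing f, the critical rate ε_n in results of this type typically balances entropy against sample size,

        M(\varepsilon_n) \asymp n\,\varepsilon_n^2,
        \qquad
        \inf_{\hat f}\,\sup_{f}\,\mathbb{E}\,\|\hat f - f\|_2^2 \asymp \varepsilon_n^2,

    and the classical plug-in bound behind "an upper bound on the risk for estimating f gives an upper bound on the risk for classification" reads, for the plug-in classifier \hat g(x) = \mathbf{1}\{\hat f(x) > 1/2\} and the Bayes rule g^*,

        P(\hat g(X) \neq Y) - P(g^*(X) \neq Y)
        \;\le\; 2\,\mathbb{E}\,|\hat f(X) - f(X)|
        \;\le\; 2\,\bigl(\mathbb{E}\,(\hat f(X) - f(X))^2\bigr)^{1/2}.

    The paper's point is that this estimation-to-classification route, while valid, can give a suboptimal classification rate for some classes (e.g., monotone functions), yet under its sufficient condition the two rates nevertheless coincide.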
  • Keywords
    Bayes methods; approximation theory; convergence of numerical methods; entropy; estimation theory; minimax techniques; nonparametric statistics; probability; signal classification; Bayes decision; Besov function; Lipschitz function; Sobolev function; approximation; bounded variation; class label; conditional probability; convergence rates; feature variable; general nonparametric class; metric entropy; minimax convergence rate; minimax estimation; minimax nonparametric classification; misclassification probability; monotone functions; suboptimal rate; sufficient condition; upper bound; Convergence; Density measurement; Entropy; Error probability; Estimation error; Loss measurement; Minimax techniques; Neural networks; Sufficient conditions; Upper bound;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/18.796368
  • Filename
    796368