• DocumentCode
    1256459
  • Title
    On exponential bounds on the Bayes risk of the kernel classification rule
  • Author
    Krzyzak, Adam
  • Author_Institution
    Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
  • Volume
    37
  • Issue
    3
  • fYear
    1991
  • fDate
    5/1/1991
  • Firstpage
    490
  • Lastpage
    499
  • Abstract
    Exponential, distribution-free bounds on the Bayes risk of the kernel classification rule are derived. The equivalence of all modes of global convergence of the rule is established under optimal assumptions on the smoothing sequence. Also derived is the optimal global rate of convergence of the kernel regression estimate within the class of Lipschitz distributions; the rate is optimal for nonparametric regression, but not for classification. It is shown, using a martingale device, that weak, strong, and complete L1 Bayes risk consistencies are equivalent. Consequently, the conditions h_n → 0 and nh_n → ∞ on the smoothing sequence are necessary and sufficient for Bayes risk consistency of the kernel classification rule. The rate of convergence of the kernel classification rule is also given.
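    A minimal sketch, not taken from the paper, of the kernel classification rule the abstract analyzes: classify a point x as class 1 when a kernel regression estimate of E[Y | X = x] exceeds 1/2. It assumes a naive (window) kernel and binary 0/1 labels; the function name, parameter names, and bandwidth choice below are illustrative assumptions, not the paper's notation.

        import numpy as np

        def kernel_classify(x, X, Y, h):
            # Window kernel K(u) = 1{|u| <= 1}: weight the training points
            # whose distance to x is at most the bandwidth h.
            w = (np.linalg.norm(X - x, axis=1) <= h).astype(float)
            if w.sum() == 0:
                return 0  # arbitrary decision when the window is empty
            m_hat = (w * Y).sum() / w.sum()  # kernel regression estimate of E[Y | X = x]
            return 1 if m_hat > 0.5 else 0   # plug-in decision against the 1/2 threshold

        # One bandwidth sequence with h_n -> 0 while n * h_n -> infinity,
        # as the consistency conditions in the abstract require (an
        # illustrative choice, e.g. in one dimension):
        # h_n = n ** (-0.5)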
  • Keywords
    Bayes methods; convergence; information theory; Bayes risk; Lipschitz distributions; distribution-free bounds; exponential bounds; global convergence; kernel classification rule; kernel regression estimate; martingale device; nonparametric regression; optimal global rate of convergence; smoothing sequence; Computer science; Convergence; Kernel; Neural networks; Random variables; Smoothing methods;
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/18.79905
  • Filename
    79905