• DocumentCode
    1245243
  • Title
    Concept learning using complexity regularization
  • Author
    Lugosi, Gábor ; Zeger, Kenneth
  • Author_Institution
    Fac. of Electr. Eng., Tech. Univ. Budapest, Hungary
  • Volume
    42
  • Issue
    1
  • fYear
    1996
  • fDate
    1/1/1996 12:00:00 AM
  • Firstpage
    48
  • Lastpage
    54
  • Abstract
    In pattern recognition or, as it has also been called, concept learning, the value of a {0,1}-valued random variable Y is to be predicted based upon observing an R^d-valued random variable X. We apply the method of complexity regularization to learn concepts from large concept classes. The method is shown to automatically find a good balance between the approximation error and the estimation error. In particular, the error probability of the obtained classifier is shown to decrease as O(√(log n / n)) to the achievable optimum, for large nonparametric classes of distributions, as the sample size n grows. We also show that if the Bayes error probability is zero and the Bayes rule is in a known family of decision rules, the error probability is O(log n / n) for many large families, possibly with infinite VC dimension.
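    The abstract's idea can be illustrated with a toy sketch. The following is not the paper's algorithm, but a minimal, assumed form of complexity regularization: over nested finite classes C_k of threshold classifiers (|C_k| = 2^k), pick the empirical risk minimizer of each class and then select the class index k that minimizes empirical error plus a penalty on the order of √(log|C_k| / n). The data, classes, and penalty constant here are all illustrative choices.

    ```python
    import math
    import random

    random.seed(0)

    # Toy sample: X uniform on [0, 1], Y = 1{X > 0.62}, so the Bayes error is zero
    # and the Bayes rule is a threshold rule (the zero-error setting in the abstract).
    n = 500
    sample = [(x, 1 if x > 0.62 else 0)
              for x in (random.random() for _ in range(n))]

    def empirical_error(threshold, sample):
        """Fraction of points misclassified by the rule 1{x > threshold}."""
        return sum((1 if x > threshold else 0) != y for x, y in sample) / len(sample)

    # Nested classes C_k: thresholds on a grid of 2^k points, so log|C_k| = k log 2.
    # The penalty sqrt(log|C_k| / n) is an assumed, simplified regularization term.
    best = None  # (penalized score, class index k, chosen threshold)
    for k in range(1, 11):
        grid = [i / 2 ** k for i in range(2 ** k)]
        t_k = min(grid, key=lambda t: empirical_error(t, sample))  # ERM in C_k
        score = empirical_error(t_k, sample) + math.sqrt(k * math.log(2) / n)
        if best is None or score < best[0]:
            best = (score, k, t_k)

    score, k, threshold = best
    print(f"selected class k={k}, threshold={threshold:.4f}, penalized score={score:.4f}")
    ```

    The penalty automatically balances the two error terms from the abstract: small k keeps the estimation error (penalty) low but approximates the true threshold coarsely, while large k fits the sample better at the cost of a larger penalty, so the minimizer lands at an intermediate complexity near the true threshold 0.62.
    
    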
  • Keywords
    Bayes methods; computational complexity; decision theory; pattern recognition; prediction theory; probability; random processes; Bayes error probability; Bayes rule; approximation error; classifier; complexity regularization; concept learning; decision rules; distributions; error probability; estimation error; nonparametric classes; prediction rule; random variable; sample size; Computer science; Error analysis; Error probability; Estimation error; Estimation theory; Mathematics; Pattern recognition; Random variables; Risk management
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/18.481777
  • Filename
    481777