  • DocumentCode
    285144
  • Title
    Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension
  • Author
    Kearns, Michael
  • Author_Institution
    AT&T Bell Labs., Murray Hill, NJ, USA
  • Volume
    2
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Abstract
    Summary form only given. A Bayesian, or average-case, model of concept learning is presented. The model provides more precise characterizations of learning-curve (sample complexity) behaviour, which depends on properties of both the prior distribution over concepts and the sequence of instances seen by the learner. It unites the statistical physics and VC dimension theories of learning curves in a common framework. A systematic investigation and comparison of two fundamental quantities in learning and information theory is undertaken: the probability of an incorrect prediction for an optimal learning algorithm, and the Shannon information gain. The paper provides an understanding of the sample complexity of learning in several existing models.
  • Keywords
    Bayes methods; computational complexity; information theory; learning (artificial intelligence); neural nets; Bayesian learning; Shannon information gain; VC dimension; average-case model; optimal learning algorithm; prior distribution; sample complexity; Physics; Probability
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.226964
  • Filename
    226964
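
The two fundamental quantities compared in the abstract, the probability of an incorrect prediction for the Bayes-optimal learner and the Shannon information gain, can be sketched on a toy example. The setup below is purely illustrative and not taken from the paper: threshold concepts on the points 0..N-1 with a uniform prior, where concept c labels x positive iff x >= c. All names (`label`, `history`, `N`, `true_c`) are hypothetical.

```python
import math
import random

# Hypothetical toy concept class (not from the paper): threshold functions
# on 0..N-1, where concept c labels x positive iff x >= c, with a uniform
# prior over the N+1 possible thresholds.
N = 20

def label(c, x):
    return 1 if x >= c else 0

random.seed(0)
true_c = random.randint(0, N)          # target concept drawn from the prior
posterior = [1.0 / (N + 1)] * (N + 1)  # start from the uniform prior
history = []

for t in range(10):
    x = random.randrange(N)
    # Posterior probability that the label of x is 1.
    p1 = sum(w for c, w in enumerate(posterior) if label(c, x) == 1)
    # Probability of error for the Bayes-optimal prediction on x:
    # predict the more likely label, so the error is min(p1, 1 - p1).
    bayes_err = min(p1, 1.0 - p1)
    y = label(true_c, x)
    # Instantaneous Shannon information gain (in bits) from observing y,
    # i.e. -log2 of the probability the learner assigned to that label.
    p_y = p1 if y == 1 else 1.0 - p1
    info_gain = -math.log2(p_y)
    history.append((bayes_err, info_gain))
    # Bayes update: zero out concepts inconsistent with (x, y), renormalize.
    posterior = [w if label(c, x) == y else 0.0
                 for c, w in enumerate(posterior)]
    z = sum(posterior)
    posterior = [w / z for w in posterior]
```

By the chain rule, the information gains summed over a sequence equal the negative log prior probability of the observed labelling, which is one way the two learning curves (prediction error per trial, bits gained per trial) can be related in such an average-case analysis.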