• DocumentCode
    3318581
  • Title
    Some notes on the stability of learning

  • Author
    Deng, Yufeng; Guo, Jun; Luo, Shoushan
  • Author_Institution
    Sch. of Inf. Eng., Beijing Univ. of Posts & Telecommun., China
  • fYear
    2005
  • fDate
    30 Oct.-1 Nov. 2005
  • Firstpage
    756
  • Lastpage
    759
  • Abstract
    Learning theory based on the ERM (empirical risk minimization) principle, in particular as developed in VC theory, provides conditions on the hypothesis space that ensure generalization. However, several successful learning algorithms, including regularization learning, SVM, bagging, and boosting, are not strictly ERM, so researchers are seeking a new foundation for learning. Stability conditions are a promising candidate for such a foundation. We give an exponential bound on generalization performance, derived from a concentration inequality under strong CV stability.
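    The abstract states its exponential bound only informally. For orientation, a representative stability-based generalization bound of this general type is the uniform-stability result of Bousquet and Elisseeff (2002); it is given here purely as an illustration of the genre and is not necessarily the theorem proved in this paper, which uses strong CV stability:

    ```latex
    % Representative uniform-stability bound (Bousquet & Elisseeff, 2002).
    % Assumptions (illustrative, not taken from this paper): the algorithm A is
    % beta-uniformly stable, the loss is bounded by M, and S is an i.i.d. sample
    % of size m. Then with probability at least 1 - \delta over the draw of S,
    R(A_S) \;\le\; R_{\mathrm{emp}}(A_S) \;+\; 2\beta
           \;+\; \bigl(4m\beta + M\bigr)\sqrt{\frac{\ln(1/\delta)}{2m}}.
    ```

    The bound is exponential in the sense that the deviation probability decays exponentially in the sample size m; bounds under CV-type stability notions have the same flavor but differ in constants and in the stability parameter used.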
  • Keywords
    generalisation (artificial intelligence); learning (artificial intelligence); CV stability; VC theory; empirical risk minimization principle; learning theory; Bagging; Learning systems; Machine learning; Mathematics; Risk management; Stability; Support vector machines; Topology; Concentration inequality; generalization bound; strong CV stability
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2005 IEEE International Conference on Natural Language Processing and Knowledge Engineering (IEEE NLP-KE '05)
  • Print_ISBN
    0-7803-9361-9
  • Type
    conf
  • DOI
    10.1109/NLPKE.2005.1598837
  • Filename
    1598837