• DocumentCode
    437463
  • Title
    Lower bounds of stochastic complexities in variational Bayes learning of Gaussian mixture models
  • Author
    Watanabe, Kazuho ; Watanabe, Sumio
  • Author_Institution
    Dept. of Comput. Intelligence & Syst., Tokyo Inst. of Technol., Yokohama, Japan
  • Volume
    1
  • fYear
    2004
  • fDate
    1-3 Dec. 2004
  • Firstpage
    99
  • Abstract
    Bayesian learning is widely used and has proved effective in many data modelling problems. However, the computations it involves are costly and generally cannot be performed exactly. The Variational Bayes approach, proposed as an approximation of Bayesian learning, has provided computational tractability and good generalization performance in many applications. Despite these advantages, the properties and capabilities of Variational Bayes learning itself have not yet been clarified; it is still unknown how good an approximation the Variational Bayes approach can achieve. In this paper, we discuss the Variational Bayes learning of Gaussian mixture models and derive lower bounds of the stochastic complexities. The stochastic complexity is important not only in addressing the model selection problem but also in assessing the accuracy of the Variational Bayes approach as an approximation of true Bayesian learning.
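    The quantity the abstract studies, the stochastic complexity F(X^n) = -log p(X^n), and the variational free energy that upper-bounds it can be illustrated numerically. The sketch below uses a toy conjugate Gaussian model (not the paper's Gaussian mixture), chosen only because its exact evidence is available in closed form, so the gap between the variational bound and the true stochastic complexity is visible directly; all model choices here are illustrative assumptions, not the paper's setting.

    ```python
    import numpy as np

    # Toy conjugate model (NOT the paper's GMM): x_i ~ N(mu, 1), prior mu ~ N(0, 1).
    # Here the exact stochastic complexity F = -log p(x) has a closed form, so we
    # can see how tight the variational (mean-field) bound F_VB >= F is.
    rng = np.random.default_rng(0)
    n = 20
    x = rng.normal(1.0, 1.0, size=n)  # synthetic data

    # Exact evidence: marginally x ~ N(0, I + 1 1^T).
    S = np.eye(n) + np.ones((n, n))
    _, logdet = np.linalg.slogdet(S)
    quad = x @ np.linalg.solve(S, x)
    F_exact = 0.5 * (n * np.log(2 * np.pi) + logdet + quad)  # -log p(x)

    def F_vb(m, s2):
        """Variational free energy for q(mu) = N(m, s2); always >= F_exact."""
        e_loglik = -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((x - m) ** 2) - 0.5 * n * s2
        e_logprior = -0.5 * np.log(2 * np.pi) - 0.5 * (m ** 2 + s2)
        entropy = 0.5 * np.log(2 * np.pi * np.e * s2)
        return -(e_loglik + e_logprior + entropy)

    # In this conjugate toy case the optimal q equals the exact posterior
    # N(n*xbar/(n+1), 1/(n+1)), so the bound becomes tight (KL term vanishes);
    # for mixtures, as the paper discusses, a gap generally remains.
    m_opt, s2_opt = n * x.mean() / (n + 1), 1.0 / (n + 1)
    print("F_exact:", F_exact)
    print("F_VB (optimal q):", F_vb(m_opt, s2_opt))
    print("F_VB (suboptimal q = prior):", F_vb(0.0, 1.0))
    ```

    The difference F_VB - F_exact is exactly KL(q || posterior), which is why, as the abstract notes, bounds on the stochastic complexity let one discuss the accuracy of the Variational Bayes approximation.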
  • Keywords
    Bayes methods; Gaussian processes; learning (artificial intelligence); stochastic processes; Gaussian mixture model; Variational Bayes learning; data modelling problem; stochastic complexity; Bayesian methods; Costs; Distributed computing; Gaussian distribution; Machine learning; Neural networks; Pattern recognition; Postal services; Probability; Stochastic processes;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Cybernetics and Intelligent Systems, 2004 IEEE Conference on
  • Print_ISBN
    0-7803-8643-4
  • Type
    conf
  • DOI
    10.1109/ICCIS.2004.1460394
  • Filename
    1460394