  • DocumentCode
    2708487
  • Title
    A gradient-based algorithm competitive with variational Bayesian EM for mixture of Gaussians
  • Author
    Kuusela, Mikael; Raiko, Tapani; Honkela, Antti; Karhunen, Juha
  • Author_Institution
    Adaptive Inf. Res. Center, Helsinki Univ. of Technol. (TKK), Helsinki, Finland
  • fYear
    2009
  • fDate
    14-19 June 2009
  • Firstpage
    1688
  • Lastpage
    1695
  • Abstract
    While variational Bayesian (VB) inference is typically done with the so-called VB EM algorithm, there are models where it cannot be applied because either the E-step or the M-step cannot be solved analytically. In 2007, Honkela et al. introduced a recipe for a gradient-based algorithm for VB inference that does not have this restriction. In this paper, we derive the algorithm for the mixture-of-Gaussians model. For the first time, the algorithm is experimentally compared to VB EM and its variant on both artificial and real data. We conclude that the algorithms are roughly comparable in speed, with the faster one depending on the problem.
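    The E-step/M-step structure the abstract refers to can be illustrated with a minimal sketch. Note this is plain maximum-likelihood EM for a diagonal-covariance Gaussian mixture, not the paper's VB EM or its gradient-based counterpart; all function and variable names here are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def em_gmm(X, K, n_iter=50, seed=0):
        """Maximum-likelihood EM for a mixture of K diagonal-covariance
        Gaussians. A simplified stand-in for the VB EM variants compared
        in the paper; names and defaults are illustrative."""
        rng = np.random.default_rng(seed)
        N, D = X.shape
        # Initialise means at random data points, unit variances, uniform weights.
        mu = X[rng.choice(N, K, replace=False)]
        var = np.ones((K, D))
        pi = np.full(K, 1.0 / K)
        for _ in range(n_iter):
            # E-step: responsibilities r[n, k] proportional to pi_k * N(x_n | mu_k, var_k),
            # computed in log space for numerical stability.
            log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                             + np.log(2 * np.pi * var)).sum(axis=2)
                     + np.log(pi))
            log_p -= log_p.max(axis=1, keepdims=True)
            r = np.exp(log_p)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: closed-form updates of weights, means, and variances.
            Nk = r.sum(axis=0)
            pi = Nk / N
            mu = (r.T @ X) / Nk[:, None]
            var = (r.T @ (X ** 2)) / Nk[:, None] - mu ** 2 + 1e-6
        return pi, mu, var, r
    ```

    In VB EM the same two-step alternation updates posterior distributions over the parameters instead of point estimates; the gradient-based algorithm studied in the paper instead ascends the variational lower bound directly, which is what removes the requirement that both steps have closed-form solutions.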
  • Keywords
    Gaussian processes; expectation-maximisation algorithm; gradient methods; inference mechanisms; variational techniques; E-step; Gaussian mixture; Gaussian model; M-step; VB EM algorithm; VB inference; gradient-based algorithm; variational Bayesian EM; variational Bayesian inference; Algorithm design and analysis; Bayesian methods; Inference algorithms; Least squares approximation; Machine learning; Machine learning algorithms; Neural networks; Probability distribution; Sampling methods
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    International Joint Conference on Neural Networks (IJCNN 2009)
  • Conference_Location
    Atlanta, GA
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-3548-7
  • Electronic_ISBN
    1098-7576
  • Type
    conf
  • DOI
    10.1109/IJCNN.2009.5178726
  • Filename
    5178726