• DocumentCode
    3432845
  • Title
    Bayesian supervised learning with non-Gaussian latent variables
  • Author
    Siwei Lyu
  • Author_Institution
    Comput. Sci. Dept., Univ. at Albany, Albany, NY, USA
  • fYear
    2013
  • fDate
    6-10 July 2013
  • Firstpage
    659
  • Lastpage
    663
  • Abstract
    We describe a Bayesian learning scheme for the hierarchical Bayesian linear model, based on Gaussian scale mixture (GSM) modeling of the distribution of the latent variables. The proposed method exploits the hierarchical Gaussian structure to obtain a simple Monte Carlo sampling algorithm. In particular, with a single hidden scale parameter controlling the distribution of the latent variables, it leads to an efficient algorithm that requires no explicit matrix inversion.
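    The idea sketched in the abstract can be illustrated with a minimal toy Gibbs sampler for a Bayesian linear model whose latent weights follow a GSM prior with a single hidden scale. Everything below (the inverse-gamma mixing density, hyperparameters, and toy data) is an illustrative assumption, not the paper's exact algorithm; in particular, the straightforward Cholesky-based Gaussian draw used here is exactly the kind of matrix factorization the paper's method is designed to avoid.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: y = X @ w_true + noise, with heavy-tailed weights.
    n, d, sigma2 = 200, 5, 0.1
    X = rng.normal(size=(n, d))
    w_true = rng.standard_t(df=3, size=d)
    y = X @ w_true + np.sqrt(sigma2) * rng.normal(size=n)

    # GSM prior with one hidden scale z: w | z ~ N(0, z I), z ~ Inv-Gamma(a0, b0).
    # This mixture makes the marginal of w a (heavy-tailed) multivariate Student-t.
    a0, b0 = 2.0, 2.0
    z = 1.0
    XtX, Xty = X.T @ X, X.T @ y
    samples = []
    for it in range(500):
        # Conditionally on z, the model is Gaussian, so w | z, y is Gaussian
        # with precision XtX/sigma2 + I/z; draw via a Cholesky factor.
        prec = XtX / sigma2 + np.eye(d) / z
        L = np.linalg.cholesky(prec)
        mean = np.linalg.solve(prec, Xty / sigma2)
        w = mean + np.linalg.solve(L.T, rng.normal(size=d))
        # Conditionally on w, the scale z is inverse-gamma by conjugacy.
        z = 1.0 / rng.gamma(a0 + d / 2.0, 1.0 / (b0 + 0.5 * w @ w))
        samples.append(w)

    # Posterior mean estimate after a burn-in period.
    w_hat = np.mean(samples[100:], axis=0)
    ```

    Alternating between the Gaussian draw of `w` and the inverse-gamma draw of `z` is what "takes advantage of the hierarchical Gaussian structure" means here: each conditional is a standard distribution that can be sampled exactly.
    
    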
  • Keywords
    Gaussian processes; Monte Carlo methods; belief networks; learning (artificial intelligence); sampling methods; Bayesian supervised learning; GSM modeling; Gaussian scale mixture modeling; Monte Carlo sampling algorithm; hierarchical Bayesian linear model; hierarchical Gaussian structure; latent variable distribution; non-Gaussian latent variables; Bayes methods; Computational modeling; Eigenvalues and eigenfunctions; Estimation; GSM; Monte Carlo methods; Vectors; Bayesian learning; Gaussian scale mixtures; latent variable models
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2013 IEEE China Summit & International Conference on Signal and Information Processing (ChinaSIP)
  • Conference_Location
    Beijing
  • Type
    conf
  • DOI
    10.1109/ChinaSIP.2013.6625424
  • Filename
    6625424