Title :
Bayesian supervised learning with non-Gaussian latent variables
Author_Institution :
Comput. Sci. Dept., Univ. at Albany, Albany, NY, USA
Abstract :
We describe a Bayesian learning scheme for the hierarchical Bayesian linear model, based on a Gaussian scale mixture (GSM) model of the latent-variable distribution. The proposed method exploits the hierarchical Gaussian structure to obtain a simple Monte Carlo sampling algorithm. In particular, with a single hidden scale parameter controlling the distribution of the latent variables, it yields an efficient algorithm that requires no explicit matrix inversion.
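The abstract's idea can be illustrated with a minimal sketch: a Student-t prior on the regression weights expressed as a Gaussian scale mixture, where a single hidden scale parameter makes both Gibbs conditionals standard distributions, and the Gaussian draw uses a Cholesky solve rather than an explicit matrix inverse. This is an illustrative example only, not the authors' algorithm; all variable names, the synthetic data, and the hyperparameters (`a0`, `b0`, `sigma2`) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: y = X @ w_true + noise.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.0, 0.5, 0.0])
sigma2 = 0.25                                 # noise variance, assumed known here
y = X @ w_true + np.sqrt(sigma2) * rng.normal(size=n)

# GSM prior: w | s ~ N(0, s * I) with 1/s ~ Gamma(a0, b0),
# so marginally w has a heavier-tailed (Student-t type) prior.
a0, b0 = 1.0, 1.0

n_iter, burn = 2000, 500
s = 1.0
XtX, Xty = X.T @ X, X.T @ y
samples = []
for t in range(n_iter):
    # Conditional w | s, y is Gaussian with precision A;
    # draw it via Cholesky/solve, avoiding an explicit inverse of A.
    A = XtX / sigma2 + np.eye(d) / s          # posterior precision
    L = np.linalg.cholesky(A)
    mean = np.linalg.solve(A, Xty / sigma2)
    w = mean + np.linalg.solve(L.T, rng.normal(size=d))  # ~ N(mean, A^{-1})
    # Conditional 1/s | w is Gamma(a0 + d/2, b0 + ||w||^2 / 2).
    s = 1.0 / rng.gamma(a0 + d / 2, 1.0 / (b0 + 0.5 * (w @ w)))
    if t >= burn:
        samples.append(w)

w_hat = np.mean(samples, axis=0)              # posterior-mean estimate of w
print(np.round(w_hat, 1))
```

The Cholesky-based draw works because if `A = L @ L.T`, then `mean + solve(L.T, z)` with standard-normal `z` has covariance `A^{-1}`, so no matrix is ever inverted explicitly.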
Keywords :
Gaussian processes; Monte Carlo methods; belief networks; learning (artificial intelligence); sampling methods; Bayesian supervised learning; GSM modeling; Gaussian scale mixture modeling; Monte Carlo sampling algorithm; hierarchical Bayesian linear model; hierarchical Gaussian structure; latent variable distribution; non-Gaussian latent variables; Bayes methods; Computational modeling; Eigenvalues and eigenfunctions; Estimation; GSM; Monte Carlo methods; Vectors; Bayesian learning; Gaussian scale mixtures; latent variable models;
Conference_Titel :
2013 IEEE China Summit & International Conference on Signal and Information Processing (ChinaSIP)
Conference_Location :
Beijing, China
DOI :
10.1109/ChinaSIP.2013.6625424