Title :
Infinite mixtures of multivariate Gaussian processes
Author_Institution :
Dept. of Comput. Sci. & Technol., East China Normal Univ., Shanghai, China
Abstract :
This paper presents a new model, the infinite mixture of multivariate Gaussian processes, which can be used to learn vector-valued functions and applied to multitask learning. As an extension of the single multivariate Gaussian process, the mixture model has the advantages of modeling multimodal data and alleviating the cubic computational complexity of the multivariate Gaussian process. A Dirichlet process prior is adopted so that the (possibly infinite) number of mixture components can be inferred automatically from the training data, and Markov chain Monte Carlo sampling techniques are used for parameter and latent variable inference. Preliminary experimental results on multivariate regression show the feasibility of the proposed model.
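To make the idea concrete, below is a minimal illustrative sketch (not the authors' code) of the key mechanism the abstract describes: a Dirichlet process mixture of Gaussian processes whose number of components is inferred by collapsed Gibbs sampling over the Chinese restaurant process. For brevity it assumes scalar outputs, 1-D inputs, and a fixed RBF kernel, whereas the paper's model handles vector-valued functions; all function and parameter names here are hypothetical.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_log_marginal(x, y, noise=0.1):
    """Log marginal likelihood of y under a zero-mean GP with an RBF kernel."""
    K = rbf_kernel(x, x) + noise ** 2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

def gibbs_dp_gp_mixture(x, y, alpha=1.0, n_iters=20, seed=0):
    """Collapsed Gibbs sampling of component assignments under a CRP prior.

    Each component is scored by the GP marginal likelihood of the points
    currently assigned to it, so the number of GP experts is inferred from
    the data rather than fixed in advance.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    z = np.zeros(n, dtype=int)              # start with a single component
    for _ in range(n_iters):
        for i in range(n):
            mask = np.ones(n, dtype=bool)
            mask[i] = False
            labels, counts = np.unique(z[mask], return_counts=True)
            log_probs = []
            # Existing components: CRP weight times the gain in GP marginal
            # likelihood from adding point i to that component.
            for k, c in zip(labels, counts):
                members = mask & (z == k)
                with_i = np.append(np.where(members)[0], i)
                gain = (gp_log_marginal(x[with_i], y[with_i])
                        - gp_log_marginal(x[members], y[members]))
                log_probs.append(np.log(c) + gain)
            # New component: concentration alpha times the marginal of point i alone.
            log_probs.append(np.log(alpha) + gp_log_marginal(x[[i]], y[[i]]))
            log_probs = np.array(log_probs)
            probs = np.exp(log_probs - log_probs.max())
            probs /= probs.sum()
            choice = rng.choice(len(probs), p=probs)
            z[i] = labels[choice] if choice < len(labels) else z.max() + 1
        # Relabel components to 0..K-1 to keep indices compact.
        _, z = np.unique(z, return_inverse=True)
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two regimes drawn from different latent functions (multimodal data).
    x = np.concatenate([np.linspace(0, 3, 20), np.linspace(0, 3, 20)])
    y = np.concatenate([np.sin(2 * x[:20]), 2 * np.cos(3 * x[20:]) + 3])
    y += 0.1 * rng.standard_normal(len(y))
    z = gibbs_dp_gp_mixture(x, y, alpha=1.0, n_iters=20)
    print("inferred number of components:", len(np.unique(z)))
```

Because each component only ever inverts the kernel matrix of its own members, the sampler also illustrates how a mixture of GP experts alleviates the cubic cost of fitting one Gaussian process to all of the data.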
Keywords :
Gaussian processes; Markov processes; Monte Carlo methods; computational complexity; learning (artificial intelligence); mixture models; regression analysis; sampling methods; vectors; Dirichlet process; Markov chain Monte Carlo sampling techniques; computationally cubic complexity; infinite multivariate Gaussian process mixtures; latent variable inference; mixture model; multimodal data modeling; multitask learning; multivariate regression; parameter inference; vector-valued functions; Dirichlet process; Gaussian process; Markov chain Monte Carlo; Multitask learning; Regression; Vector-valued function;
Conference_Title :
2013 International Conference on Machine Learning and Cybernetics (ICMLC)
Conference_Location :
Tianjin
DOI :
10.1109/ICMLC.2013.6890744