Title :
Learning mixtures of experts with a hybrid generative-discriminative algorithm
Author :
Xue, Ya ; Hu, Xiao ; Yan, Weizhong ; Qiu, Rai
Author_Institution :
Industrial Artificial Intelligence Lab., General Electric Global Research Center, Niskayuna, NY, USA
Abstract :
The hierarchical mixtures of experts (HME) model is a flexible model that stochastically partitions the feature space into sub-regions in which simple surfaces can be fitted to the data. However, learning an HME model with Expectation-Maximization (EM) inference poses a model-selection problem, and the EM algorithm itself suffers from the well-known problem of local minima. In this paper, we present a hybrid generative-discriminative approach that inherits the flexibility of the HME model while decomposing the learning process into a few simple steps. The proposed algorithm resolves the model-selection problem, and empirical experiments on public benchmark datasets demonstrate its advantages in classification accuracy and efficiency.
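As background on the mixture-of-experts framework referenced in the abstract, the sketch below shows a minimal, flat (one-level) mixture of linear experts with a softmax gate, trained by a standard EM-style loop in Python/NumPy. It is illustrative only and does not implement the paper's hybrid generative-discriminative algorithm or the hierarchical HME; the function and variable names (fit_moe, predict, K, the toy data) are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_moe(X, y, K=2, n_iter=100, lr=0.5):
    # EM-style loop: the E-step assigns soft responsibilities to experts;
    # the M-step refits each linear expert by weighted least squares and
    # takes a gradient-ascent step on the softmax gate.
    N, D = X.shape
    Xb = np.hstack([X, np.ones((N, 1))])        # append a bias column
    W = rng.normal(scale=0.1, size=(K, D + 1))  # expert (regression) weights
    V = rng.normal(scale=0.1, size=(K, D + 1))  # gate weights
    sigma2 = np.ones(K)                         # expert noise variances
    for _ in range(n_iter):
        # E-step: responsibility of expert k for each data point
        gate = softmax(Xb @ V.T)                                   # (N, K)
        mean = Xb @ W.T                                            # (N, K)
        lik = np.exp(-0.5 * (y[:, None] - mean) ** 2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
        R = gate * lik
        R /= R.sum(axis=1, keepdims=True) + 1e-12
        # M-step: weighted least squares for each expert
        for k in range(K):
            A = Xb.T @ (R[:, k, None] * Xb) + 1e-6 * np.eye(D + 1)
            W[k] = np.linalg.solve(A, Xb.T @ (R[:, k] * y))
            resid = y - Xb @ W[k]
            sigma2[k] = (R[:, k] * resid ** 2).sum() / (R[:, k].sum() + 1e-12)
        # M-step for the gate: gradient step on the expected complete-data log-likelihood
        V += lr * (R - gate).T @ Xb / N
    return W, V

def predict(X, W, V):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (softmax(Xb @ V.T) * (Xb @ W.T)).sum(axis=1)

# Toy piecewise-linear data with two regimes, split at x = 0
X = rng.uniform(-3, 3, size=(400, 1))
y = np.where(X[:, 0] < 0, -2 * X[:, 0], 3 * X[:, 0]) + 0.1 * rng.standard_normal(400)
W, V = fit_moe(X, y, K=2)
print("train MSE:", np.mean((predict(X, W, V) - y) ** 2))

The HME model discussed in the paper nests such gates hierarchically; the sketch only illustrates the basic soft partitioning of the feature space that the abstract refers to.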
Keywords :
expectation-maximisation algorithm; inference mechanisms; learning (artificial intelligence); expectation-maximization inference; hierarchical mixtures of experts model; hybrid generative-discriminative algorithm; learning process; model-selection problem
Conference_Title :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona, Spain
Print_ISBN :
978-1-4244-6916-1
DOI :
10.1109/IJCNN.2010.5596680