• DocumentCode
    2493068
  • Title
    Learning mixtures of experts with a hybrid generative-discriminative algorithm
  • Author
    Xue, Ya ; Hu, Xiao ; Yan, Weizhong ; Qiu, Hai

  • Author_Institution
    Industrial Artificial Intelligence Lab., General Electric Global Research Center, Niskayuna, NY, USA
  • fYear
    2010
  • fDate
    18-23 July 2010
  • Firstpage
    1
  • Lastpage
    8
  • Abstract
    The hierarchical mixtures of experts (HME) model is a flexible model that stochastically partitions the feature space into sub-regions, within which simple surfaces can be fitted to the data. However, learning an HME model with Expectation-Maximization (EM) inference raises model-selection issues, and the EM algorithm also suffers from the well-known problem of local minima. In this paper, we present a hybrid generative-discriminative approach that inherits the flexibility of the HME while decomposing the learning process into a few simple steps. The proposed algorithm solves the model-selection problem, and experiments on public benchmark datasets show its advantages in classification accuracy and efficiency.
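    The HME setting the abstract describes can be illustrated with a minimal, non-hierarchical mixture of two linear experts trained by EM. This is a generic textbook sketch on toy data (all names and initial values are invented for illustration), not the authors' hybrid generative-discriminative algorithm: a sigmoid gate stochastically partitions the 1-D feature space, and each expert fits a simple (linear) surface in its region.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: y = |x| + noise.  One line cannot fit this, but a
    # mixture of two linear experts, gated on x, can.
    X = rng.uniform(-1, 1, size=200)
    y = np.abs(X) + 0.05 * rng.standard_normal(200)

    # Expert k predicts a[k]*x + b[k]; gate sigmoid(v*x + c) picks expert 1.
    a = np.array([-0.5, 0.5])   # initial slopes (a rough guess)
    b = np.zeros(2)
    v, c = 1.0, 0.0             # gating parameters
    sigma2 = 0.1                # fixed expert noise variance

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for _ in range(50):
        # E-step: posterior responsibility of each expert for each point,
        # combining the gate prior with the Gaussian likelihood of y.
        g1 = sigmoid(v * X + c)
        gates = np.stack([1 - g1, g1])                        # shape (2, N)
        lik = np.exp(-(y - (a[:, None] * X + b[:, None]))**2 / (2 * sigma2))
        r = gates * lik
        r /= r.sum(axis=0, keepdims=True)

        # M-step for the experts: responsibility-weighted least squares.
        A = np.stack([X, np.ones_like(X)], axis=1)
        for k in range(2):
            sw = np.sqrt(r[k])
            sol, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
            a[k], b[k] = sol

        # M-step for the gate: a few gradient-ascent steps on the
        # responsibility-weighted logistic log-likelihood.
        for _ in range(20):
            p1 = sigmoid(v * X + c)
            grad = r[1] - p1
            v += 0.5 * np.mean(grad * X)
            c += 0.5 * np.mean(grad)

    # Soft mixture prediction and its mean squared error.
    g1 = sigmoid(v * X + c)
    pred = (1 - g1) * (a[0] * X + b[0]) + g1 * (a[1] * X + b[1])
    mse = float(np.mean((pred - y) ** 2))
    print(round(mse, 3))
    ```

    The fit depends on the initial slopes `a`: a symmetric or badly placed start can leave EM in a poor local optimum, which is the sensitivity the abstract's criticism of EM refers to.
    
    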
  • Keywords
    expectation-maximisation algorithm; inference mechanisms; learning (artificial intelligence); expectation-maximization inference; hierarchical mixtures of experts model; hybrid generative-discriminative algorithm; learning process; model-selection problem; Hybrid power systems
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    The 2010 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    Barcelona, Spain
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-6916-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2010.5596680
  • Filename
    5596680