• Title of article

    A recurrent log-linearized Gaussian mixture network

  • Author/Authors

    T. Tsuji, N. Bu, O. Fukuda, M. Kaneko

  • Issue Information
    Journal issue, serial year 2003
  • Pages
    304-316
  • From page
    304
  • To page
    316
  • Abstract
    Context in time series is one of the most useful and interesting characteristics for machine learning. In some cases, dynamic characteristics are the only basis on which classification is possible. This paper proposes a novel neural network for the classification of time series, named the recurrent log-linearized Gaussian mixture network (R-LLGMN). The structure of this network is based on the hidden Markov model (HMM), which has been well developed in the area of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network using a log-linearized Gaussian mixture model, into which recurrent connections have been incorporated to make use of temporal information. Simulation experiments are carried out to compare R-LLGMN with a traditional HMM-based classifier, and pattern classification experiments on EEG signals are then conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
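    The record stops at the abstract, so as a rough illustration only, the following is a minimal NumPy sketch of the mechanism the abstract describes: an HMM-style forward recursion with Gaussian mixture emissions, in which the recurrent step carries temporal context from one sample to the next. All names, shapes, and the fixed (untrained) parameters here are assumptions made for the sketch; the actual R-LLGMN instead expresses the mixture densities through log-linearized network weights learned from data.

        # Illustrative sketch only: HMM forward recursion with Gaussian-mixture
        # emissions, the mechanism the abstract attributes to R-LLGMN. Parameter
        # names and shapes are assumptions, not the paper's formulation.
        import numpy as np

        def gaussian_pdf(x, mean, var):
            """Diagonal-covariance Gaussian density at x."""
            d = x.size
            norm = (2.0 * np.pi) ** (-d / 2) * np.prod(var) ** -0.5
            return norm * np.exp(-0.5 * np.sum((x - mean) ** 2 / var))

        def forward_score(x_seq, trans, means, variances, weights):
            """Score a time series under one class's HMM.

            x_seq:     (T, d) time series to score
            trans:     (K, K) state-transition probabilities
            means:     (K, M, d) component means per state
            variances: (K, M, d) diagonal covariances per state
            weights:   (K, M) mixture weights per state
            Returns an (unnormalized) likelihood of the whole sequence.
            """
            K, M, _ = means.shape

            def emit(x):
                # Emission probability of x under each state's mixture.
                return np.array([
                    sum(weights[k, m] * gaussian_pdf(x, means[k, m], variances[k, m])
                        for m in range(M))
                    for k in range(K)
                ])

            alpha = emit(x_seq[0]) / K      # uniform initial state distribution
            for x in x_seq[1:]:             # recurrence: alpha carries the context
                alpha = (alpha @ trans) * emit(x)
            return alpha.sum()

    Under this sketch, classifying a sequence amounts to evaluating forward_score against one such model per class and choosing the class with the highest score, which mirrors the abstract's account of R-LLGMN acting as a classifier for time series.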
  • Keywords
    Time-series classification , recurrent neural networks , hidden Markov model (HMM) , Gaussian mixture model , EEG signals
  • Journal title
    IEEE TRANSACTIONS ON NEURAL NETWORKS
  • Serial Year
    2003
  • Record number

    62812