• DocumentCode
    295834
  • Title
    Simple recurrent networks as generalized hidden Markov models with distributed representations
  • Author
    Sakakibara, Yasubumi; Golea, Mostefa
  • Author_Institution
    Institute for Social Information Science, Fujitsu Laboratories Ltd., Shizuoka, Japan
  • Volume
    2
  • fYear
    1995
  • fDate
    Nov/Dec 1995
  • Firstpage
    979
  • Abstract
    The authors propose simple recurrent neural networks as probabilistic models for representing and predicting time sequences. The model has the advantage of providing forecasts in the form of probability distributions over future values rather than single point estimates. It turns out that the model can be viewed as a generalized hidden Markov model with a distributed state representation. The authors devise an efficient learning algorithm, based on dynamic programming, for estimating the model's parameters, and they present preliminary simulation results that demonstrate the model's potential capabilities. The analysis provides a new probabilistic formulation of learning in simple recurrent networks.
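    The abstract gives no equations, so the sketch below is only a rough illustration, not the authors' formulation: a minimal Elman-style simple recurrent network whose softmax output at each step is a predictive probability distribution over the next symbol, with the sequence log-likelihood accumulated in the spirit of an HMM forward pass (the distributed hidden state plays the role of the HMM belief state). All names, sizes, and the parameterization here are assumptions, and the paper's dynamic-programming parameter-estimation algorithm is not reproduced.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

class SimpleRecurrentNet:
    """Hypothetical Elman-style SRN whose output is a probability
    distribution over the next symbol (assumed parameterization,
    not taken from the paper)."""
    def __init__(self, n_symbols, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_symbols))
        self.W_rec = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.W_out = rng.normal(0.0, 0.1, (n_symbols, n_hidden))

    def sequence_log_likelihood(self, seq):
        """Accumulate log P(x_{t+1} | x_1..x_t): the recurrent hidden
        state h summarizes the history, analogous to an HMM's state
        distribution in the forward algorithm."""
        h = np.zeros(self.W_rec.shape[0])
        loglik = 0.0
        for t in range(len(seq) - 1):
            x = np.zeros(self.W_in.shape[1])
            x[seq[t]] = 1.0                      # one-hot current symbol
            h = np.tanh(self.W_in @ x + self.W_rec @ h)
            p = softmax(self.W_out @ h)          # predictive distribution
            loglik += np.log(p[seq[t + 1]])
        return loglik

# Toy usage: log-likelihood of a short symbol sequence under random weights.
net = SimpleRecurrentNet(n_symbols=3, n_hidden=8)
print(net.sequence_log_likelihood([0, 1, 2, 1, 0]))
```

    In an actual implementation the weights would be fitted by maximizing this log-likelihood, which is where a dynamic-programming-based estimation procedure such as the paper's would come in.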
  • Keywords
    dynamic programming; hidden Markov models; learning (artificial intelligence); probability; recurrent neural nets; sequences; distributed representations; generalized hidden Markov models; learning algorithm; probabilistic models; probability densities; simple recurrent networks; time sequences; neural networks; predictive models; probability density function; probability distribution; recurrent neural networks
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    Proceedings of the IEEE International Conference on Neural Networks, 1995
  • Conference_Location
    Perth, WA, Australia
  • Print_ISBN
    0-7803-2768-3
  • Type
    conf
  • DOI
    10.1109/ICNN.1995.487553
  • Filename
    487553