• DocumentCode
    540204
  • Title
    Application of temporal supervised learning algorithm to generation of natural language
  • Author
    Kamimura, Ryotaro
  • fYear
    1990
  • fDate
    17-21 June 1990
  • Firstpage
    201
  • Abstract
    An attempt is made to generate natural language using a recurrent neural network with the temporal supervised learning algorithm (TSLA) developed by R.J. Williams and D. Zipser (1989). Because TSLA uses an explicit representation of consecutive events, it can handle time-varying phenomena without increasing the number of units in the network. However, its performance had previously been evaluated only on short sequences or sequences with explicit regularity, not on natural-language sequences, which exhibit complex, long-distance correlations. It was found that TSLA was extremely unstable during learning and that learning took a long time to converge. The author therefore proposes two methods to improve the performance of TSLA. The first is a variable learning rate, used to remove the instability of the learning process. The second is the Minkowski-r power metric, used to shorten the learning time.
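    The Minkowski-r power metric mentioned in the abstract is commonly defined as E = (1/r) Σ|t − y|^r, which reduces to the usual squared-error criterion at r = 2. The sketch below illustrates that standard definition and its gradient; the function names, the sample values, and the choice r = 1.5 are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def minkowski_r_loss(targets, outputs, r=1.5):
        """Minkowski-r power error: (1/r) * sum(|t - y|^r).
        r = 2 recovers half the summed squared error; smaller r
        down-weights large deviations (illustrative choice)."""
        diff = np.abs(targets - outputs)
        return np.sum(diff ** r) / r

    def minkowski_r_grad(targets, outputs, r=1.5):
        """Gradient w.r.t. the outputs:
        d/dy (|t - y|^r / r) = -sign(t - y) * |t - y|^(r - 1)."""
        diff = targets - outputs
        return -np.sign(diff) * np.abs(diff) ** (r - 1)

    # Example: at r = 2 this matches the familiar squared-error case.
    t = np.array([1.0, 0.0, 1.0])
    y = np.array([0.8, 0.2, 0.4])
    loss = minkowski_r_loss(t, y, r=2.0)   # (1/2) * sum((t - y)**2)
    grad = minkowski_r_grad(t, y, r=2.0)   # equals (y - t) at r = 2
    ```

    At r = 2 the gradient reduces to y − t, so a network trained with this metric at r = 2 behaves exactly like one trained with squared error; varying r changes how strongly large output errors dominate the weight updates.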
  • Keywords
    learning systems; natural languages; neural nets; Minkowski-r power metrics; instability; natural language generation; performance; recurrent neural network; temporal supervised learning algorithm;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1990 IJCNN International Joint Conference on Neural Networks
  • Conference_Location
    San Diego, CA, USA
  • Type
    conf
  • DOI
    10.1109/IJCNN.1990.137570
  • Filename
    5726530