• DocumentCode
    3744852
  • Title
    Incremental sentence compression using LSTM recurrent networks
  • Author
    Sakriani Sakti; Faiz Ilham; Graham Neubig; Tomoki Toda; Ayu Purwarianti; Satoshi Nakamura
  • Author_Institution
    Graduate School of Information Science, Nara Institute of Science and Technology, Japan
  • fYear
    2015
  • Firstpage
    252
  • Lastpage
    258
  • Abstract
    Many current sentence compression techniques produce a shortened form of a sentence by relying on syntactic structure such as dependency tree representations. While the performance of sentence compression has been improving, these approaches require a full parse of the sentence before compression can begin, making it difficult to perform compression in real time. In this paper, we examine the possibility of performing incremental sentence compression using long short-term memory (LSTM) recurrent neural networks (RNNs). The decision of whether to remove a word is made at each time step, without waiting for the end of the sentence. Various RNN parameters are investigated, including the number of layers and network connections. Furthermore, we also propose a pretraining method in which the network is pretrained as an autoencoder. Experimental results reveal that our method achieves compression rates similar to human references and better accuracy than state-of-the-art tree transduction models.
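    The abstract's core idea — an LSTM that emits a keep/drop decision for each word as it arrives, rather than after a full parse — can be illustrated with a minimal sketch. This is not the authors' implementation: the embedding size, hidden size, and randomly initialised weights below are placeholders standing in for a trained model.

    ```python
    # Minimal sketch of incremental sentence compression: one LSTM step
    # per incoming word, with an immediate binary keep(1)/drop(0) decision.
    # All parameters are random placeholders, not trained weights.
    import numpy as np

    rng = np.random.default_rng(0)
    EMB, HID = 8, 16  # assumed embedding / hidden sizes

    W = rng.normal(scale=0.1, size=(4 * HID, EMB + HID))  # gate weights
    b = np.zeros(4 * HID)                                  # gate biases
    w_out = rng.normal(scale=0.1, size=HID)                # keep/drop classifier
    b_out = 0.0

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h, c):
        """One LSTM time step: the four gates are computed from [x; h]."""
        z = W @ np.concatenate([x, h]) + b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c + i * g          # update cell state
        h = o * np.tanh(c)         # emit hidden state
        return h, c

    def compress_incrementally(embeddings):
        """Label each word keep(1)/drop(0) as soon as it is read."""
        h, c = np.zeros(HID), np.zeros(HID)
        labels = []
        for x in embeddings:
            h, c = lstm_step(x, h, c)
            p_keep = sigmoid(w_out @ h + b_out)
            labels.append(int(p_keep >= 0.5))  # decided at this time step
        return labels

    sentence = ["the", "cat", "sat", "on", "the", "mat"]
    embs = [rng.normal(size=EMB) for _ in sentence]  # placeholder embeddings
    labels = compress_incrementally(embs)
    compressed = [w for w, k in zip(sentence, labels) if k]
    ```

    Because each decision depends only on the words seen so far, the model can compress a transcript as it streams in, which is the property that distinguishes this approach from tree-transduction methods requiring a full parse.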
  • Keywords
    "Speech","Recurrent neural networks","Training","Real-time systems","Logic gates","Training data","Speech recognition"
  • Publisher
    IEEE
  • Conference_Titel
    2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU)
  • Type
    conf
  • DOI
    10.1109/ASRU.2015.7404802
  • Filename
    7404802