• DocumentCode
    183302
  • Title
    Dropout Improves Recurrent Neural Networks for Handwriting Recognition
  • Author
    Pham, Vu; Bluche, Théodore; Kermorvant, Christopher; Louradour, Jérôme
  • Author_Institution
    A2iA, Paris, France
  • fYear
    2014
  • fDate
    1-4 Sept. 2014
  • Firstpage
    285
  • Lastpage
    290
  • Abstract
    Recurrent neural networks (RNNs) with Long Short-Term Memory cells currently hold the best known results in unconstrained handwriting recognition. We show that their performance can be greatly improved using dropout, a recently proposed regularization method for deep architectures. While previous work showed that dropout gave superior performance in the context of convolutional networks, it had never been applied to RNNs. In our approach, dropout is carefully used in the network so that it does not affect the recurrent connections; hence the power of RNNs in modeling sequences is preserved. Extensive experiments on a broad range of handwritten databases confirm the effectiveness of dropout on deep architectures even when the network mainly consists of recurrent and shared connections. (A minimal code sketch of this dropout placement follows the record.)
  • Keywords
    handwriting recognition; recurrent neural nets; RNN; convolutional network; dropout; handwritten databases; long short-term memory cells; recurrent connection; recurrent neural network; regularization method; unconstrained handwriting recognition; Computer architecture; Databases; Error analysis; Handwriting recognition; Hidden Markov models; Recurrent neural networks; Training; Dropout; Handwriting Recognition; Recurrent Neural Networks
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2014 14th International Conference on Frontiers in Handwriting Recognition (ICFHR)
  • Conference_Location
    Heraklion, Greece
  • ISSN
    2167-6445
  • Print_ISBN
    978-1-4799-4335-7
  • Type
    conf
  • DOI
    10.1109/ICFHR.2014.55
  • Filename
    6981034
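
The dropout placement described in the abstract can be illustrated with a short sketch. The following is a minimal PyTorch example, not the authors' implementation: dropout is inserted only on the feed-forward connections between stacked LSTM layers (and before the output layer), while the recurrent connections inside each LSTM are never dropped. The feature size, hidden size, class count, and dropout rate are placeholder assumptions; the paper itself evaluates deeper LSTM architectures for handwriting recognition.

    # Minimal sketch (assumed sizes, not the authors' code): dropout on the
    # feed-forward paths between stacked LSTM layers, recurrence untouched.
    import torch
    import torch.nn as nn

    class DropoutBetweenLSTMs(nn.Module):
        def __init__(self, n_features=48, n_hidden=128, n_classes=80, p=0.5):
            super().__init__()
            # Each nn.LSTM keeps its recurrent (step-to-step) weights free of dropout.
            self.lstm1 = nn.LSTM(n_features, n_hidden, batch_first=True)
            self.lstm2 = nn.LSTM(n_hidden, n_hidden, batch_first=True)
            self.drop = nn.Dropout(p)              # used between layers only
            self.out = nn.Linear(n_hidden, n_classes)

        def forward(self, x):                      # x: (batch, time, n_features)
            h1, _ = self.lstm1(x)                  # recurrent pass, no dropout inside
            h2, _ = self.lstm2(self.drop(h1))      # dropout on the layer-to-layer path
            return self.out(self.drop(h2))         # per-timestep class scores

    model = DropoutBetweenLSTMs()
    frames = torch.randn(4, 100, 48)               # dummy batch: 4 sequences, 100 frames
    print(model(frames).shape)                     # torch.Size([4, 100, 80])

Calling model.eval() before inference disables the dropout layers as usual, so the recurrent dynamics at test time are the same ones the dropout-regularized training preserved.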