  • DocumentCode
    3695064
  • Title
    Framewise and CTC training of Neural Networks for handwriting recognition
  • Author
    Théodore Bluche; Hermann Ney; Jérôme Louradour; Christopher Kermorvant
  • Author_Institution
    A2iA SA, Paris, France
  • fYear
    2015
  • Firstpage
    81
  • Lastpage
    85
  • Abstract
    In recent years, Long Short-Term Memory Recurrent Neural Networks (LSTM-RNNs) trained with the Connectionist Temporal Classification (CTC) objective have won many international handwriting recognition evaluations. The CTC algorithm is based on a forward-backward procedure, avoiding the need for a segmentation of the input before training. The network outputs are character labels and a special non-character label. On the other hand, in the hybrid Neural Network / Hidden Markov Model (NN/HMM) framework, networks are trained with framewise criteria to predict state labels. In this paper, we show that CTC training is close to forward-backward training of NN/HMMs and can be extended to more standard HMM topologies. We apply this method to Multi-Layer Perceptrons (MLPs) and investigate the properties of CTC, namely the modeling of characters by single labels and the role of the special label. (A sketch of the CTC forward recursion is given after this record.)
  • Keywords
    "Hidden Markov models","Artificial neural networks","Continuous wavelet transforms","Training","Topology","Labeling","Text recognition"
  • Publisher
    IEEE
  • Conference_Titel
    2015 13th International Conference on Document Analysis and Recognition (ICDAR)
  • Type
    conf
  • DOI
    10.1109/ICDAR.2015.7333730
  • Filename
    7333730
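
The abstract describes CTC training as a forward-backward procedure over character labels plus a special non-character (blank) label, summing over all frame-level alignments so that no prior segmentation of the input is needed. Below is a minimal sketch of that forward (alpha) recursion in NumPy, given only as an illustration of the standard CTC formulation with the usual blank-insertion topology; the function name ctc_forward and its interface are assumptions for this example, not code from the paper.

    import numpy as np

    def ctc_forward(log_probs, labels, blank=0):
        """Log-likelihood of a label sequence under the CTC forward recursion.

        log_probs: (T, K) per-frame log posteriors over K outputs
                   (character labels plus the special blank label).
        labels:    target character sequence as a list of label indices.
        blank:     index of the special non-character label.
        """
        T = log_probs.shape[0]
        # Extended sequence with blanks between and around characters: b, l1, b, l2, ..., b
        ext = [blank]
        for l in labels:
            ext += [l, blank]
        S = len(ext)

        alpha = np.full((T, S), -np.inf)
        # A valid path may start with the leading blank or the first character.
        alpha[0, 0] = log_probs[0, ext[0]]
        if S > 1:
            alpha[0, 1] = log_probs[0, ext[1]]

        for t in range(1, T):
            for s in range(S):
                # Allowed transitions: stay in the same state or advance by one;
                # the blank may be skipped when two different characters meet.
                terms = [alpha[t - 1, s]]
                if s > 0:
                    terms.append(alpha[t - 1, s - 1])
                if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                    terms.append(alpha[t - 1, s - 2])
                alpha[t, s] = np.logaddexp.reduce(terms) + log_probs[t, ext[s]]

        # A valid path ends on the final character or the trailing blank.
        if S > 1:
            return np.logaddexp(alpha[T - 1, S - 1], alpha[T - 1, S - 2])
        return alpha[T - 1, S - 1]

The backward (beta) recursion is symmetric, and the training gradient is obtained from the per-frame products of alpha and beta, which is what makes the paper's comparison with forward-backward training of NN/HMMs natural.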