DocumentCode
730809
Title
Annealed dropout trained maxout networks for improved LVCSR
Author
Rennie, Steven J. ; Dognin, Pierre L. ; Cui, Xiaodong ; Goel, Vaibhava
Author_Institution
IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
fYear
2015
fDate
19-24 April 2015
Firstpage
5181
Lastpage
5185
Abstract
A significant barrier to progress in automatic speech recognition (ASR) capability is the empirical reality that techniques rarely "scale": the yield of many apparently fruitful techniques rapidly diminishes to zero as the training criterion or decoder is strengthened, or as the size of the training set is increased. Recently we showed that annealed dropout, a regularization procedure that gradually reduces the percentage of neurons randomly zeroed out during DNN training, leads to substantial word error rate reductions for small to moderate amounts of training data and acoustic models trained with the cross-entropy (CE) criterion [1]. In this paper we show that deep Maxout networks trained using annealed dropout can substantially improve the quality of commercial-grade LVCSR systems even when the acoustic model is trained with a sequence-level training criterion and on large amounts of data.
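The core idea of annealed dropout, as described above, is to decay the dropout probability over the course of training. A minimal illustrative sketch is shown below; the linear schedule and parameter names are assumptions for illustration, not the authors' exact recipe:

```python
# Hypothetical sketch of an annealed dropout schedule (illustrative only,
# not the paper's implementation). The dropout probability decays linearly
# from an initial rate p0 toward 0 as training progresses.

def annealed_dropout_rate(epoch, total_epochs, p0=0.5):
    """Return the dropout probability for the given epoch.

    Linearly anneals from p0 at epoch 0 down to 0 at total_epochs,
    so that late in training the network behaves deterministically.
    """
    return max(0.0, p0 * (1.0 - epoch / float(total_epochs)))
```

For example, with `p0=0.5` over 10 epochs, the rate starts at 0.5, reaches 0.25 at the midpoint, and hits 0 by the final epoch, so the fully trained network no longer drops units.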
Keywords
learning (artificial intelligence); neural nets; speech recognition; DNN training; large vocabulary continuous speech recognition; acoustic models; annealed dropout trained maxout networks; automatic speech recognition; cross-entropy criterion; decoder; deep maxout networks; deep neural nets; improved LVCSR; regularization procedure; training criterion; training set size; word error rate reduction; Acoustics; Annealing; Data models; Schedules; Topology; Training; Training data; Deep Neural Networks; Deterministic Annealing; Dropout Training; Maxout Networks; Model aggregation
fLanguage
English
Publisher
ieee
Conference_Titel
2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location
South Brisbane, QLD, Australia
Type
conf
DOI
10.1109/ICASSP.2015.7178959
Filename
7178959
Link To Document