DocumentCode :
179603
Title :
A family of discriminative training criteria based on the f-divergence for deep neural networks
Author :
Nussbaum-Thom, Markus ; Cui, Xiaodong ; Schlüter, Ralf ; Goel, Vikas ; Ney, Hermann
Author_Institution :
IBM T.J. Watson Res. Center, Yorktown Heights, NY, USA
fYear :
2014
fDate :
4-9 May 2014
Firstpage :
5612
Lastpage :
5616
Abstract :
We present novel bounds on the classification error which are based on the f-divergence and, at the same time, can be used as practical training criteria. Virtually no studies have investigated the link between the f-divergence, the classification error, and practical training criteria. So far, only the Kullback-Leibler divergence has been examined in this context to formulate a bound on the classification error and to derive the cross-entropy criterion. We extend this concept to a larger class of f-divergences. We also show that the novel training criteria based on the f-divergence are suited for frame-wise training of deep neural networks on the Babel Vietnamese and Bengali speech recognition tasks.
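Note: the following is a minimal sketch, in LaTeX, of the standard f-divergence definition the abstract refers to; the specific generator functions, error bounds, and training criteria studied in the paper are not reproduced here. For true class posteriors p(c|x), a model q_\theta(c|x), and a convex generator f with f(1) = 0:

% f-divergence between true and modeled class posteriors at observation x
D_f\bigl(p \,\|\, q_\theta \mid x\bigr) \;=\; \sum_{c} q_\theta(c \mid x)\, f\!\left(\frac{p(c \mid x)}{q_\theta(c \mid x)}\right)

% Choosing f(t) = t \log t recovers the Kullback-Leibler divergence; with
% one-hot frame labels c_x this yields the familiar frame-wise cross-entropy criterion
F_{\mathrm{CE}}(\theta) \;=\; -\sum_{x} \log q_\theta(c_x \mid x)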
Keywords :
neural nets; pattern classification; Babel Vietnamese; Bengali speech recognition; f-divergence; Kullback-Leibler divergence; classification error; deep neural networks; discriminative training criteria; frame-wise training; Acoustics; Conferences; Data models; Neural networks; Optimization; Speech; Training; classification error bound; deep neural network; discriminative training; f-divergence;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Acoustics, Speech and Signal Processing (ICASSP), 2014 IEEE International Conference on
Conference_Location :
Florence, Italy
Type :
conf
DOI :
10.1109/ICASSP.2014.6854677
Filename :
6854677