DocumentCode :
2919037
Title :
A KLD Based Method for Initial Set Selection in Active Learning
Author :
Chen, Wei ; Liu, Gang ; Guo, Jun
Author_Institution :
Pattern Recognition & Intell. Syst. Lab., Beijing Univ. of Posts & Telecommun., Beijing
fYear :
2009
fDate :
20-22 Feb. 2009
Firstpage :
33
Lastpage :
37
Abstract :
Speech recognition systems are usually trained on large amounts of transcribed data, and preparing such training data is time-consuming and costly. To achieve better acoustic-model performance with fewer transcribed samples, active learning is applied to acoustic model training. This learning scheme first selects and transcribes a small initial training set, then iteratively selects the most informative samples from the unlabeled data according to a given criterion, transcribes them, and adds the newly transcribed samples to the training set to update the acoustic model. Since the initial set strongly influences the performance and convergence rate of active learning, we propose a method for initial set selection based on the Kullback-Leibler divergence (KLD). Our experiments show that active learning using an initial set selected by the proposed method achieves better performance.
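A minimal sketch of the idea of KLD-based initial set selection described above, assuming a greedy procedure that picks samples so that the selected subset's feature histogram stays close (in KL divergence) to the distribution of the whole unlabeled pool. The per-sample statistic, histogram features, and greedy loop are illustrative assumptions; the abstract does not specify the paper's exact selection procedure.

```python
# Illustrative sketch (not the paper's exact algorithm): greedily select an
# initial training set whose empirical distribution has minimal KL divergence
# from the distribution of the full unlabeled pool.
import numpy as np


def kld(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))


def histogram(samples, edges):
    """Histogram of 1-D per-sample statistics over fixed bin edges."""
    counts, _ = np.histogram(samples, bins=edges)
    return counts


def select_initial_set(pool_stats, set_size, bins=20):
    """Greedily pick `set_size` samples whose histogram best matches the pool's."""
    pool_stats = np.asarray(pool_stats, dtype=float)
    edges = np.histogram_bin_edges(pool_stats, bins=bins)
    pool_hist = histogram(pool_stats, edges)
    selected, remaining = [], list(range(len(pool_stats)))
    for _ in range(set_size):
        best_i, best_div = None, np.inf
        for i in remaining:
            cand_hist = histogram(pool_stats[selected + [i]], edges)
            div = kld(cand_hist, pool_hist)
            if div < best_div:
                best_i, best_div = i, div
        selected.append(best_i)
        remaining.remove(best_i)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pool = rng.normal(size=500)  # stand-in per-utterance statistic (hypothetical)
    init = select_initial_set(pool, set_size=10)
    print("initial set indices:", init)
```

In this sketch the initial set is the only part chosen by distribution matching; subsequent active-learning iterations would still pick samples by an informativeness criterion, as described in the abstract.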
Keywords :
learning (artificial intelligence); speech recognition; KLD based method; Kullback-Leibler Divergence; acoustic model training; active learning; initial set selection; learning scheme; speech recognition systems; Convergence; Databases; Hidden Markov models; Intelligent systems; Laboratories; Learning systems; Pattern recognition; Speech recognition; Telecommunication computing; Training data; Initial Set Selection; KLD; active learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Electronic Computer Technology, 2009 International Conference on
Conference_Location :
Macau
Print_ISBN :
978-0-7695-3559-3
Type :
conf
DOI :
10.1109/ICECT.2009.102
Filename :
4795915