DocumentCode
578432
Title
Genetic algorithm on fuzzy codebook training for speech recognition
Author
Pan, Shing-tai; Chen, Ching-fa; Lee, Ying-wei
Author_Institution
Dept. of Comput. Sci. & Inf. Eng., Nat. Univ. of Kaohsiung, Kaohsiung, Taiwan
Volume
4
fYear
2012
fDate
15-17 July 2012
Firstpage
1552
Lastpage
1558
Abstract
A genetic algorithm (GA) is used to train the fuzzy membership functions of a fuzzy codebook for modeling the Discrete Hidden Markov Model (DHMM) applied to Mandarin speech recognition. Vector quantization of speech features against a codebook is a fundamental step in recognizing speech signals with a DHMM. A codebook in which each codeword carries a fuzzy membership function is first trained by a GA on speech features. The trained fuzzy codebook is then used to quantize the speech features, and the statistics of the quantized features are used to model a DHMM for each speech class. In addition, every speech feature to be recognized passes through the fuzzy codebook for quantization before being fed into the DHMM for recognition. Experimental results show that the proposed strategy improves both the speech recognition rate and the computation time required for recognition.
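A minimal sketch of the pipeline described in the abstract, not the authors' implementation: a fuzzy codebook whose Gaussian membership widths are tuned by a simple genetic algorithm, then used to map feature vectors to discrete symbols for a DHMM. All function names, the Gaussian membership form, the fitness measure, and the GA settings are illustrative assumptions.

```python
# Sketch only: fuzzy codebook quantization with GA-tuned membership widths.
# Assumed design: one Gaussian membership width per codeword, fitness =
# membership-weighted distortion; the paper's actual encoding may differ.
import numpy as np

rng = np.random.default_rng(0)

def fuzzy_membership(x, centers, widths):
    """Normalized Gaussian membership of feature vector x to each codeword."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    mu = np.exp(-d2 / (2.0 * widths ** 2))
    return mu / (mu.sum() + 1e-12)

def quantize(features, centers, widths):
    """Assign each feature vector to the codeword with the highest membership."""
    return np.array([np.argmax(fuzzy_membership(x, centers, widths))
                     for x in features])

def distortion(features, centers, widths):
    """Membership-weighted distortion, used here as the GA fitness (lower is better)."""
    total = 0.0
    for x in features:
        mu = fuzzy_membership(x, centers, widths)
        total += np.sum(mu * np.sum((centers - x) ** 2, axis=1))
    return total / len(features)

def evolve_widths(features, centers, pop=20, gens=50, sigma=0.1):
    """Toy GA: evolve one width per codeword via selection, crossover, mutation."""
    k = len(centers)
    population = rng.uniform(0.5, 2.0, size=(pop, k))
    for _ in range(gens):
        fitness = np.array([distortion(features, centers, w) for w in population])
        parents = population[np.argsort(fitness)[:pop // 2]]   # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(k) < 0.5                          # uniform crossover
            child = np.where(mask, a, b) + rng.normal(0, sigma, k)  # mutation
            children.append(np.clip(child, 1e-3, None))
        population = np.vstack([parents, np.array(children)])
    scores = [distortion(features, centers, w) for w in population]
    return population[int(np.argmin(scores))]

# Usage with synthetic 12-dimensional "MFCC-like" features and a 16-word codebook.
features = rng.normal(size=(200, 12))
centers = features[rng.choice(len(features), 16, replace=False)]
widths = evolve_widths(features, centers)
symbols = quantize(features, centers, widths)   # discrete observation symbols for a DHMM
```

The discrete symbol sequence produced this way would then serve as the observation sequence for standard DHMM training and recognition.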
Keywords
genetic algorithms; hidden Markov models; learning (artificial intelligence); quantisation (signal); speech recognition; DHMM model; GA; Mandarin speech recognition; discrete hidden Markov model; fuzzy codebook training; fuzzy membership function; genetic algorithm; quantized speech statistical features; speech feature quantization; speech signal; speech signal recognition; vector quantization; Fuzzy codebook; Genetic algorithm; Hidden Markov Model; Speech recognition;
fLanguage
English
Publisher
ieee
Conference_Titel
2012 International Conference on Machine Learning and Cybernetics (ICMLC)
Conference_Location
Xi'an, China
ISSN
2160-133X
Print_ISBN
978-1-4673-1484-8
Type
conf
DOI
10.1109/ICMLC.2012.6359596
Filename
6359596
Link To Document