Title :
Further investigation into multilingual training and adaptation of stacked bottle-neck neural network structure
Author :
Grezl, Frantisek ; Egorova, Ekaterina ; Karafiat, Martin
Author_Institution :
Speech@FIT & IT4I Center of Excellence, Brno Univ. of Technol., Brno, Czech Republic
Abstract :
Multilingual training of neural networks for ASR is widely studied these days. It has been shown that languages with little training data can benefit greatly from multilingual resources. We have evaluated possible ways of adapting a multilingual stacked bottle-neck hierarchy to a target domain. This paper extends our previous work and focuses on the impact certain aspects have on the performance of an adapted neural-network feature extractor. First, the performance of adapted multilingual networks pre-trained on different languages is studied. Next, the effect of different target units - phonemes vs. triphone states - used for multilingual NN training is evaluated. Then, the impact of an increasing number of languages used for multilingual NN training is investigated; here, a constant-amount-of-data condition is added to separate the influence of greater language variability from that of a larger amount of data. The effect of adding languages from a different domain is also evaluated. Finally, a study is performed in which a language with a phonetic structure similar to the target's is added to the multilingual training data.
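The stacked bottle-neck hierarchy named in the abstract can be illustrated with a minimal forward-pass sketch: a first NN with a narrow bottle-neck layer processes the input features, its bottle-neck outputs are stacked over a context window, and a second NN with its own bottle-neck consumes the stacked frames to produce the final features. All dimensions, layer sizes, and the context width below are illustrative assumptions, not values from the paper, and random weights stand in for trained ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, sizes, bottleneck_idx):
    """Forward pass through a toy MLP with random weights;
    returns (final output, activation at the bottleneck layer)."""
    bn = None
    for i, (d_in, d_out) in enumerate(zip(sizes[:-1], sizes[1:])):
        W = rng.standard_normal((d_in, d_out)) * 0.1  # untrained stand-in weights
        x = np.tanh(x @ W)
        if i == bottleneck_idx:
            bn = x
    return x, bn

# Hypothetical dimensions (not from the paper): 40-dim input features,
# 30-dim bottle-neck, 40 phoneme-like training targets, 100 frames.
T, feat_dim, bn_dim, n_targets = 100, 40, 30, 40
feats = rng.standard_normal((T, feat_dim))

# First-stage NN: input -> hidden -> bottle-neck -> hidden -> targets.
_, bn1 = mlp_forward(feats, [feat_dim, 500, bn_dim, 500, n_targets],
                     bottleneck_idx=1)

# Stack bottle-neck outputs over a +/-2 frame context (5 frames total).
context = 2
padded = np.pad(bn1, ((context, context), (0, 0)), mode="edge")
stacked = np.hstack([padded[i:i + T] for i in range(2 * context + 1)])

# Second-stage NN consumes the stacked bottle-necks; its bottle-neck
# activations are the features handed to the ASR back-end.
_, bn2 = mlp_forward(stacked,
                     [stacked.shape[1], 500, bn_dim, 500, n_targets],
                     bottleneck_idx=1)
print(bn2.shape)
```

Adaptation to a target language, as studied in the paper, would correspond to fine-tuning the trained weights of one or both stages on target-language data rather than using the random weights shown here.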
Keywords :
feature extraction; learning (artificial intelligence); natural language processing; neural nets; ASR; language variability; multilingual NN training; multilingual networks; multilingual resources; multilingual stacked bottle-neck hierarchy; multilingual training data; neural network feature extractor; phonemes; phonetic structure; triphone states; Adaptation models; Artificial neural networks; Complexity theory; Training; multilingual training; neural network adaptation; neural networks; stacked bottle-neck;
Conference_Title :
Spoken Language Technology Workshop (SLT), 2014 IEEE
DOI :
10.1109/SLT.2014.7078548