Title :
Towards Scaling Up Classification-Based Speech Separation
Author :
Yuxuan Wang ; DeLiang Wang
Author_Institution :
Dept. of Comput. Sci. & Eng., Ohio State Univ., Columbus, OH, USA
Abstract :
Formulating speech separation as a binary classification problem has been shown to be effective. While good separation performance is achieved in matched test conditions using kernel support vector machines (SVMs), separation in unmatched conditions involving new speakers and environments remains a significant challenge. A simple yet effective way to cope with the mismatch is to include many different acoustic conditions in the training set. However, large-scale training is nearly intractable for kernel machines due to their computational complexity. To enable training on relatively large datasets, we propose to learn more linearly separable and discriminative features from raw acoustic features and to train linear SVMs, which are much easier and faster to train than kernel SVMs. For feature learning, we employ standard pre-trained deep neural networks (DNNs). The proposed DNN-SVM system is trained on a variety of acoustic conditions within a reasonable amount of time. Experiments on various test mixtures demonstrate good generalization to unseen speakers and background noises.
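For illustration, the following is a minimal sketch of the two-stage DNN-SVM pipeline described in the abstract: a small feed-forward network learns discriminative features from raw acoustic features of time-frequency (T-F) units, and a linear SVM is then trained on the learned features for binary mask estimation. The synthetic data, network sizes, PyTorch/scikit-learn usage, and plain supervised training (in place of the paper's deep belief network pre-training and large multi-condition corpora) are illustrative assumptions, not the authors' exact configuration.

# Sketch of the DNN-SVM pipeline: DNN hidden activations as learned
# features, linear SVM as the final binary classifier per T-F unit.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import LinearSVC

# Synthetic stand-ins: 64-dim "acoustic features" per T-F unit and binary
# ideal-binary-mask labels (1 = target-dominant, 0 = interference-dominant).
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 64)).astype(np.float32)
y = (X[:, :8].sum(axis=1) > 0).astype(np.float32)   # toy labels

class FeatureDNN(nn.Module):
    """Feed-forward DNN; its last hidden layer serves as the learned features."""
    def __init__(self, in_dim=64, hidden=128):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Sigmoid(),
            nn.Linear(hidden, hidden), nn.Sigmoid(),
        )
        self.head = nn.Linear(hidden, 1)  # used only to train the feature layers

    def forward(self, x):
        h = self.body(x)
        return self.head(h), h

net = FeatureDNN()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
Xt, yt = torch.from_numpy(X), torch.from_numpy(y).unsqueeze(1)

# Discriminative training of the DNN (the paper additionally pre-trains it
# as a deep belief network; plain supervised training is used here for brevity).
for _ in range(200):
    opt.zero_grad()
    logits, _ = net(Xt)
    loss = loss_fn(logits, yt)
    loss.backward()
    opt.step()

# Learned features = last hidden-layer activations; train a linear SVM on them,
# which scales to larger training sets than a kernel SVM.
with torch.no_grad():
    _, feats = net(Xt)
svm = LinearSVC(C=1.0)
svm.fit(feats.numpy(), y.astype(int))
print("training accuracy:", svm.score(feats.numpy(), y.astype(int)))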
Keywords :
computational complexity; neural nets; speech processing; support vector machines; very large databases; DNN system; acoustic conditions; background noises; binary classification problem; discriminative features; kernel support vector machines; linear SVM; relatively large datasets; scaling up classification-based speech separation; standard pre-trained deep neural networks; test mixtures; training set; unseen speakers; Acoustics; Feature extraction; Kernel; Neural networks; Noise; Speech; Training; Computational auditory scene analysis (CASA); deep belief networks; feature learning; monaural speech separation
Journal_Title :
IEEE Transactions on Audio, Speech, and Language Processing
DOI :
10.1109/TASL.2013.2250961