DocumentCode :
3154135
Title :
An animated display of tongue, lip and jaw movements during speech: A proper basis for speech aids to the handicapped and other speech technologies
Author :
Hochberg, J.G. ; Laroche, F. ; Levy, S. ; Papcun, G. ; Thomas, T.R.
Author_Institution :
Los Alamos Nat. Lab., NM, USA
fYear :
1992
fDate :
1-5 Feb 1992
Firstpage :
57
Lastpage :
59
Abstract :
The authors have developed a method for inferring articulatory parameters from acoustics. For this method, an X-ray microbeam records the movements of the lower lip, tongue tip and tongue dorsum during normal speech. A neural network is then trained to map from concurrently recorded acoustic data to the articulatory data. The device has applications in speech therapy, as a lip-reading aid, and as a basis for other speech technologies, including speech and speaker recognition and low-data-rate speech transmission.
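The core idea in the abstract is a learned regression from acoustic features to articulator positions. A minimal sketch of that idea follows, using synthetic data and a simple linear model trained by stochastic gradient descent as a stand-in for the paper's neural network; the feature counts, data, and variable names are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): learn a mapping from acoustic
# features to articulator positions by gradient descent on a linear model.
# All data here is synthetic; real input would be spectral features paired
# with X-ray microbeam pellet coordinates (lower lip, tongue tip, tongue dorsum).
import random

random.seed(0)

N_ACOUSTIC = 4   # e.g. band energies per frame (assumption)
N_ARTIC = 3      # e.g. positions of the three tracked articulators (assumption)

# Synthetic "ground truth" relation used only to generate training pairs.
true_w = [[0.5, -0.2, 0.1, 0.3],
          [0.0, 0.4, -0.3, 0.2],
          [0.1, 0.1, 0.2, -0.4]]

def articulators(acoustic, w):
    """Predict articulator positions as a linear function of acoustic features."""
    return [sum(wi * a for wi, a in zip(row, acoustic)) for row in w]

frames = [[random.uniform(-1, 1) for _ in range(N_ACOUSTIC)] for _ in range(200)]
targets = [articulators(f, true_w) for f in frames]

# Train the model weights by per-frame stochastic gradient descent.
w = [[0.0] * N_ACOUSTIC for _ in range(N_ARTIC)]
lr = 0.1
for epoch in range(200):
    for x, y in zip(frames, targets):
        pred = articulators(x, w)
        for i in range(N_ARTIC):
            err = pred[i] - y[i]
            for j in range(N_ACOUSTIC):
                w[i][j] -= lr * err * x[j]

# Mean squared error over the training frames; should be near zero after training.
mse = sum((p - t) ** 2
          for x, y in zip(frames, targets)
          for p, t in zip(articulators(x, w), y)) / (len(frames) * N_ARTIC)
print(mse)
```

In the paper's setting the linear map would be replaced by a neural network, and the predicted articulator trajectories would drive the animated display of tongue, lip and jaw movements.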
Keywords :
computer animation; handicapped aids; natural language interfaces; neural nets; speech analysis and processing; speech recognition; X-ray microbeam; acoustics; animated display; articulatory parameters; handicapped; lip-reading aid; low data-rate speech transmission; neural network; speaker recognition; speech recognition; speech therapy; Acoustic devices; Acquired immune deficiency syndrome; Animation; Computer displays; Laboratories; Loudspeakers; Neural networks; Postal services; Speech; Tongue;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the Johns Hopkins National Search for Computing Applications to Assist Persons with Disabilities, 1992
Conference_Location :
Laurel, MD
Print_ISBN :
0-8186-2730-1
Type :
conf
DOI :
10.1109/CAAPWD.1992.217393
Filename :
217393