DocumentCode :
1035380
Title :
Rendering localized spatial audio in a virtual auditory space
Author :
Zotkin, Dmitry N. ; Duraiswami, Ramani ; Davis, Larry S.
Author_Institution :
Perceptual Interfaces & Reality Lab., Univ. of Maryland, College Park, MD, USA
Volume :
6
Issue :
4
fYear :
2004
Firstpage :
553
Lastpage :
564
Abstract :
High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for the creation of virtual auditory spaces by rendering cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing the head-related transfer functions (HRTFs) from a database, based on anatomical measurements. Details of algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation are presented. Our system runs in real time on an office PC without specialized DSP hardware.
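The abstract mentions HRTF interpolation and convolution-based scene presentation. The following is a minimal sketch of that general idea (blend head-related impulse responses from nearby measured directions, then convolve the source with the left/right responses); the database layout, weighting scheme, and function names are illustrative assumptions, not details from the paper.

import numpy as np
from scipy.signal import fftconvolve

def interpolate_hrir(hrirs, weights):
    """Blend time-domain head-related impulse responses from the nearest
    measured directions; `hrirs` is (n_neighbors, 2, taps), `weights` need not sum to 1."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                              # normalize the blending weights
    return np.tensordot(w, hrirs, axes=(0, 0))   # -> (2, taps): left and right responses

def render_binaural(mono, hrir_pair):
    """Convolve a mono source with the interpolated left/right impulse responses."""
    left = fftconvolve(mono, hrir_pair[0])
    right = fftconvolve(mono, hrir_pair[1])
    return np.stack([left, right])               # (2, samples) binaural output

# Example with synthetic data: three neighboring measurement directions,
# 128-tap impulse responses, and hypothetical distance-based weights.
rng = np.random.default_rng(0)
hrirs = rng.standard_normal((3, 2, 128)) * 0.05
hrir = interpolate_hrir(hrirs, weights=[0.5, 0.3, 0.2])
out = render_binaural(rng.standard_normal(4800), hrir)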
Keywords :
audio signal processing; audio user interfaces; augmented reality; rendering (computer graphics); 3-D audio processing; audio user interfaces; augmented reality; data sonification; head related transfer functions; perceptual user interfaces; spatial audio; virtual audio scene rendering; virtual auditory spaces; virtual reality environments; Audio databases; Augmented reality; Digital signal processing; Interpolation; Layout; Real time systems; Rendering (computer graphics); Scattering; Transfer functions; User interfaces; 3-D audio processing; Audio user interfaces; head-related transfer function; spatial audio; user interfaces; virtual auditory spaces; virtual environments; virtual reality;
fLanguage :
English
Journal_Title :
IEEE Transactions on Multimedia
Publisher :
IEEE
ISSN :
1520-9210
Type :
jour
DOI :
10.1109/TMM.2004.827516
Filename :
1315647