DocumentCode
1361467
Title
Training Surrogate Sensors in Musical Gesture Acquisition Systems
Author
Tindale, Adam; Kapur, Ajay; Tzanetakis, George
Author_Institution
Dept. of Comput. Sci., Univ. of Victoria, Victoria, BC, Canada
Volume
13
Issue
1
fYear
2011
Firstpage
50
Lastpage
59
Abstract
Capturing the gestures of music performers is a common task in interactive electroacoustic music. The captured gestures can be mapped to sounds, synthesis algorithms, visuals, etc., or used for music transcription. Two of the most common approaches for acquiring musical gestures are: 1) “hyper-instruments,” which are “traditional” musical instruments enhanced with sensors that directly detect the gestures, and 2) “indirect acquisition,” in which the only sensor is a microphone capturing the audio signal. Hyper-instruments require invasive modification of existing instruments, which is frequently undesirable; however, they provide relatively straightforward and reliable sensor measurements. Indirect acquisition approaches, on the other hand, typically require sophisticated signal processing and possibly machine learning algorithms in order to extract the relevant information from the audio signal. This paper proposes using direct sensor(s) to train a machine learning model for indirect acquisition. The resulting trained “surrogate” sensor can then be used in place of the original invasive direct sensor(s) that were used for training. In this way, the instrument can be used unmodified in performance while still providing the gesture information that a hyper-instrument would provide. In addition, this approach allows large amounts of training data to be collected with minimal effort. Experimental results supporting this idea are provided for two detection contexts: 1) strike position on a drum surface and 2) strum direction on a sitar.
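The surrogate-sensor workflow summarized in the abstract lends itself to a short sketch. The snippet below is a minimal illustration, not the authors' implementation: it assumes fixed-length audio feature vectors extracted around each strike, labels supplied by a hypothetical direct sensor during a calibration session, and an off-the-shelf SVM classifier standing in for whatever learner the paper actually evaluates; the random arrays are placeholders for real recordings.

# Minimal sketch of the surrogate-sensor idea (assumptions noted above).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# --- Calibration phase: invasive direct sensor still attached ------------
# Each row is an audio feature vector for one strike; each label is the
# strike position reported by the direct sensor (placeholder data here).
n_strikes, n_features, n_positions = 500, 13, 3
audio_features = rng.normal(size=(n_strikes, n_features))
direct_sensor_labels = rng.integers(0, n_positions, size=n_strikes)

X_train, X_test, y_train, y_test = train_test_split(
    audio_features, direct_sensor_labels, test_size=0.25, random_state=0)

# Train the "surrogate sensor": a model mapping audio features to the
# gesture information the direct sensor used to provide.
surrogate_sensor = SVC(kernel="rbf").fit(X_train, y_train)

# --- Performance phase: direct sensor removed, microphone only -----------
predicted_positions = surrogate_sensor.predict(X_test)
print("surrogate accuracy:", accuracy_score(y_test, predicted_positions))

Because the direct sensor labels the training data automatically during calibration, large labeled datasets can be collected with little manual effort, which is the practical advantage the paper highlights.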
Keywords
audio signal processing; gesture recognition; information retrieval; learning (artificial intelligence); microphones; musical instruments; sensors; audio signal; direct invasive sensor; gesture information; indirect acquisition approach; information extraction; interactive electroacoustic music; machine learning; musical gesture acquisition system; signal processing; surrogate sensor; training data; Context; Feature extraction; Instruments; Machine learning; Music; Sensors; Training; Gesture recognition; machine learning; new interfaces for musical expression; surrogate sensors; virtual sensors;
fLanguage
English
Journal_Title
IEEE Transactions on Multimedia
Publisher
IEEE
ISSN
1520-9210
Type
jour
DOI
10.1109/TMM.2010.2089786
Filename
5610728
Link To Document