Abstract:
In this paper, a method for controlling electronic digital music instruments is proposed, based on visual capture of the baton and hand motion of a conductor. The approach is suited to mixed ensembles of human musicians and electronic instruments. Well-established computer vision methods are used to track the motion of the baton and to deduce either musical parameters for sound creation (volume, pitch, expression) or cues for the time-synchronized replay of previously recorded music notation sequences (beat, tempo, expression). Combined with acoustic signal processing, the method enables a computer-based instrument to play automatically in an orchestra in which the conductor conducts both this instrument and the human musicians. This allows intuitive control of timing and expression towards a unique interpretation. The paper introduces the concept and discusses its feasibility.
Keywords:
acoustic signal processing; computer vision; electronic music; music; musical acoustics; musical instruments; tracking; automatic playing; computer vision tracking; computer-based instrument; electronic digital music instrument; electronic instruments; human musicians; automatic control; conductors; digital control; motion control
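As an illustration of how beat and tempo cues might be deduced from a tracked baton trajectory, here is a minimal sketch under assumptions not stated in the abstract: the vision front end is assumed to already yield the baton tip's vertical position per video frame, downbeats are taken as local minima of that position, and tempo is the mean inter-beat interval. All function names and the synthetic test trajectory are illustrative, not the authors' method.

```python
import math

def detect_beats(y):
    """Return frame indices of downbeats, taken here as strict local
    minima of the baton tip's vertical position y (lower = beat point).
    """
    return [i for i in range(1, len(y) - 1)
            if y[i] < y[i - 1] and y[i] < y[i + 1]]

def estimate_tempo_bpm(beat_frames, fps):
    """Convert the mean inter-beat interval (in frames) to beats per minute."""
    if len(beat_frames) < 2:
        return None
    intervals = [(b - a) / fps for a, b in zip(beat_frames, beat_frames[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic trajectory: a 1.5 Hz up-down motion sampled at 30 fps,
# i.e. a conductor beating at 90 BPM; minima fall on frames 10, 30, ...
fps = 30
y = [math.cos(math.pi * t / 10) for t in range(101)]
beats = detect_beats(y)
tempo = estimate_tempo_bpm(beats, fps)
```

On this synthetic input the sketch finds the five downbeat frames and recovers the 90 BPM tempo; a real system would additionally need smoothing of the noisy tracked positions and rejection of spurious minima.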