DocumentCode
635383
Title
Automatic accompaniment generation to evoke specific emotion
Author
Pei-Chun Chen ; Keng-Sheng Lin ; He Henry Chen
Author_Institution
Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
fYear
2013
fDate
15-19 July 2013
Firstpage
1
Lastpage
6
Abstract
In most genres, a music piece consists of a melody and an accompaniment. In this paper, we present a system that automatically generates, for a given melody, accompaniment that evokes a specified emotion. In particular, we propose harmonic progression and onset rate as two key features for emotion-based accompaniment generation. The former refers to the progression of chords; the latter refers to the number of music events (such as notes and drum hits) per unit time. The harmonic progression and the onset rate are altered according to the valence and arousal parameters, respectively, of the specified emotion. The performance of the system is evaluated subjectively, and the results show a perfect positive Spearman correlation between the specified emotion and the perceived emotion.
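The mapping described in the abstract, valence driving the choice of harmonic progression and arousal driving the onset rate, can be sketched as follows. This is a minimal illustration under assumed conventions (valence and arousal in [-1, 1], example chord progressions, a linear arousal-to-onset-rate scaling), not the authors' actual implementation.

```python
# Hypothetical sketch of emotion-to-feature mapping for accompaniment
# generation: valence selects the harmonic progression, arousal sets
# the onset rate (music events per beat).

MAJOR_PROGRESSION = ["C", "F", "G", "C"]     # brighter harmony for positive valence
MINOR_PROGRESSION = ["Am", "Dm", "E", "Am"]  # darker harmony for negative valence

def accompaniment_features(valence: float, arousal: float):
    """Map an emotion point (valence, arousal), each in [-1, 1],
    to a chord progression and an onset rate."""
    progression = MAJOR_PROGRESSION if valence >= 0 else MINOR_PROGRESSION
    # Scale arousal in [-1, 1] linearly to 1..4 onsets per beat.
    onset_rate = round(1 + 1.5 * (arousal + 1))
    return progression, onset_rate
```

For example, a happy/excited emotion (high valence, high arousal) yields the major progression at 4 onsets per beat, while a sad/calm one yields the minor progression at 1 onset per beat.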
Keywords
acoustic signal processing; behavioural sciences; correlation methods; emotion recognition; harmonic analysis; music; Spearman correlation; arousal parameters; automatic accompaniment generation; chord progression; emotion-based accompaniment generation; harmonic progression; melody; music events; music piece; onset rate; specific emotion; Correlation; Correlation coefficient; Databases; Educational institutions; Harmonic analysis; Rhythm; Harmonic progression; accompaniment; music emotion; onset rate
fLanguage
English
Publisher
IEEE
Conference_Titel
2013 IEEE International Conference on Multimedia and Expo (ICME)
Conference_Location
San Jose, CA
ISSN
1945-7871
Type
conf
DOI
10.1109/ICME.2013.6607423
Filename
6607423
Link To Document