Title :
Music mood classification by rhythm and bass-line unit pattern analysis
Author :
Tsunoo, Emiru ; Akase, Taichi ; Ono, Nobutaka ; Sagayama, Shigeki
Author_Institution :
Grad. Sch. of Inf. Sci. & Technol., Univ. of Tokyo, Tokyo, Japan
Abstract :
This paper discusses a feature extraction approach for audio mood classification, an important and difficult problem in the field of music information retrieval (MIR). Timbral information has been widely used for this task; however, many musical moods are characterized not only by timbre but also by musical scale and by temporal features such as rhythm patterns and bass-line patterns. In particular, most modern music pieces have certain fixed rhythm and bass-line patterns, and these patterns characterize the impression a song makes. We have previously proposed methods for extracting rhythm and bass-line patterns; here, this unit pattern analysis is combined with statistical feature extraction for mood classification. Experimental results show that the automatically computed unit pattern information can be used to classify musical mood effectively.
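Since the paper body is not reproduced in this record, the following is a minimal, hypothetical sketch of the kind of pipeline the abstract describes: bar-length units are cut from a spectrogram, their patterns are clustered with k-means (one of the listed keywords), each song is summarized by a histogram of its unit-pattern occurrences, and that histogram feeds an off-the-shelf classifier. All function names, parameters, and data below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of unit-pattern histogram features for mood
# classification. NOT the authors' method; it only illustrates the general
# pipeline sketched in the abstract. All names/parameters are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def song_to_bar_patterns(spectrogram, frames_per_bar=32):
    """Chop a (freq x time) spectrogram into bar-length units and flatten
    each unit into one pattern vector. A real system would locate bar
    lines first (e.g., via beat tracking) rather than using a fixed hop."""
    n = spectrogram.shape[1] // frames_per_bar
    bars = spectrogram[:, :n * frames_per_bar]
    return (bars.reshape(spectrogram.shape[0], n, frames_per_bar)
                .transpose(1, 0, 2)
                .reshape(n, -1))

# Synthetic stand-in data: 20 "songs" as small random spectrograms,
# each with a binary mood label (e.g., 0 = calm, 1 = energetic).
songs = [rng.random((12, 320)) for _ in range(20)]
labels = rng.integers(0, 2, size=20)

# 1) Pool bar-level patterns from all songs and learn K unit patterns.
all_patterns = np.vstack([song_to_bar_patterns(s) for s in songs])
K = 8
km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(all_patterns)

# 2) Represent each song by the normalized histogram of its unit patterns.
def song_histogram(spectrogram):
    ids = km.predict(song_to_bar_patterns(spectrogram))
    hist = np.bincount(ids, minlength=K).astype(float)
    return hist / hist.sum()

X = np.array([song_histogram(s) for s in songs])

# 3) Train any off-the-shelf classifier on the histogram features.
clf = SVC(kernel="linear").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

In the actual paper, bar segmentation and pattern extraction would involve the dynamic programming mentioned in the keywords, and the bass-line analysis would contribute a second set of unit patterns; the sketch above only conveys the overall histogram-feature structure.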
Keywords :
acoustic signal processing; feature extraction; music; musical acoustics; audio mood classification; bass-line unit pattern analysis; music information retrieval; music mood classification; rhythm unit pattern analysis; statistical feature extraction; timbral information; Data mining; Dynamic programming; Mood; Multiple signal classification; Pattern analysis; Rhythm; Spectrogram; Testing; Audio classification; k-means clustering; pattern clustering method
Conference_Title :
2010 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4244-4295-9
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2010.5495964