DocumentCode :
1947496
Title :
Notice of Retraction
Multi-modal music genre classification approach
Author :
Chao Zhen ; Jieping Xu
Author_Institution :
Multimedia Lab. of Inf. Sch., Renmin Univ. of China, Beijing, China
Volume :
8
fYear :
2010
fDate :
9-11 July 2010
Firstpage :
398
Lastpage :
402
Abstract :
Notice of Retraction

After careful and considered review of the content of this paper by a duly constituted expert committee, this paper has been found to be in violation of IEEE's Publication Principles.

We hereby retract the content of this paper. Reasonable effort should be made to remove all past references to this paper.

The presenting author of this paper has the option to appeal this decision by contacting TPII@ieee.org.

As a fundamental and critical component of music information retrieval (MIR) systems, automatically classifying music by genre is a challenging problem. Traditional approaches that depend solely on low-level audio features may not obtain satisfactory results. In recent years, social tags have emerged as an important source of information about resources on the web. In this paper we therefore propose a novel multi-modal music genre classification approach that uses acoustic features and social tags together to classify music by genre. For the audio content-based classification, we design a new feature selection algorithm called IBFFS (Interaction-Based Forward Feature Selection). This algorithm selects features according to pre-computed rules that take the interactions between different features into account. In addition, we investigate how to perform automatic music genre classification from the available tag data. Two classification methods based on social tags (both music-tags and artist-tags) crawled from the website Last.fm are developed in our work: (1) we use the generative probabilistic model Latent Dirichlet Allocation (LDA) to analyze the music-tags, obtaining the probability that each tag belongs to each music genre; (2) the second method starts from the observation that a piece of music's artist is often more closely associated with music genres, so we compute the similarity between artist-tag vectors to infer which genre the music belongs to. Finally, our experimental results demonstrate the benefit of our multi-modal music genre classification approach.
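The artist-tag similarity method described in the abstract (method 2) can be sketched roughly as follows. This is a minimal illustrative sketch only: the tag vectors, genre names, and function names are assumptions, not the authors' retracted implementation, and cosine similarity is one plausible choice of vector similarity.

```python
from math import sqrt

def cosine_similarity(u, v):
    # Cosine similarity between two sparse tag vectors (dicts: tag -> weight).
    shared = set(u) & set(v)
    dot = sum(u[t] * v[t] for t in shared)
    norm_u = sqrt(sum(w * w for w in u.values()))
    norm_v = sqrt(sum(w * w for w in v.values()))
    if norm_u == 0.0 or norm_v == 0.0:
        return 0.0
    return dot / (norm_u * norm_v)

def infer_genre(artist_tags, genre_profiles):
    # Pick the genre whose reference tag vector is most similar
    # to the artist's crawled tag vector.
    return max(genre_profiles,
               key=lambda g: cosine_similarity(artist_tags, genre_profiles[g]))

# Hypothetical genre tag profiles (e.g. aggregated from crawled tags).
genre_profiles = {
    "rock": {"guitar": 5, "rock": 10, "loud": 2},
    "jazz": {"saxophone": 6, "jazz": 10, "swing": 3},
}
infer_genre({"guitar": 3, "rock": 4}, genre_profiles)  # -> "rock"
```

In practice the genre profiles would be built from tag statistics over labeled artists; here they are hand-written toy data.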
Keywords :
Web sites; content-based retrieval; feature extraction; music; musical acoustics; pattern classification; probability; IBFFS; Latent Dirichlet allocation; Website; acoustic feature; artist tag; audio content based classification; generative probabilistic model; interaction based forward feature selection; multimodal music genre classification; music information retrieval system; music tag; social tag; Acoustics; Analytical models; Art; Atmospheric modeling; Computational modeling; Neodymium; IBFFS; LDA; artist tag; music genre classification; music tag;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Computer Science and Information Technology (ICCSIT), 2010 3rd IEEE International Conference on
Conference_Location :
Chengdu
Print_ISBN :
978-1-4244-5537-9
Type :
conf
DOI :
10.1109/ICCSIT.2010.5564489
Filename :
5564489