• DocumentCode
    3697409
  • Title
    Modeling musical rhythm at scale with the Music Genome Project
  • Author
    Matthew Prockup; Andreas F. Ehmann; Fabien Gouyon; Erik M. Schmidt; Youngmoo E. Kim
  • Author_Institution
    Drexel University, ECE Dept., 3141 Chestnut Street, Philadelphia, PA 19104
  • fYear
    2015
  • Firstpage
    1
  • Lastpage
    5
  • Abstract
    Musical meter and attributes of rhythmic feel such as swing, syncopation, and danceability are crucial when defining musical style. However, they have attracted relatively little attention from the Music Information Retrieval (MIR) community and, when addressed, have proven difficult to model from music audio signals. In this paper, we propose a number of audio features for modeling meter and rhythmic feel. These features are first evaluated and compared to timbral features in the common task of ballroom genre classification. These features are then used to learn individual models for a total of nine rhythmic attributes covering meter and feel using an industrial-sized corpus of over one million examples labeled by experts from Pandora® Internet Radio's Music Genome Project®. Linear models are shown to be powerful, representing these attributes with high accuracy at scale.
  • Keywords
    "Rhythm","Transforms","Multiple signal classification","Context","Genomics","Bioinformatics"
  • Publisher
    IEEE
  • Conference_Titel
    2015 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA)
  • Type
    conf
  • DOI
    10.1109/WASPAA.2015.7336891
  • Filename
    7336891