SATIN: A persistent musical database for music information retrieval and a supporting deep learning experiment on song instrumental classification

Abstract: This paper introduces SATIN, the Set of Audio Tags and Identifiers Normalized. SATIN is a database of 400k audio-related metadata entries and identifiers that aims to facilitate reproducibility and comparison among Music Information Retrieval (MIR) algorithms. The idea is to take advantage of partnerships between scientists and private companies that host millions of tracks. Scientists can send their feature extraction algorithm to a company along with SATIN identifiers and retrieve the corresponding features. This procedure gives the MIR community access to more tracks for classification purposes. Afterwards, scientists can provide the MIR community with the classification result for each track, which can then be compared with the results of other algorithms. SATIN thus resolves the major problems of accessing more tracks, managing copyright locks, saving computation time, and guaranteeing consistency across research databases. We introduce SOFT1, the first Set Of FeaTures extracted by a company thanks to SATIN. We propose a supporting experiment classifying instrumentals and songs to illustrate a possible use of SATIN. We compare a deep learning approach, which has emerged in recent years in MIR, with a knowledge-based approach.
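The exchange procedure described in the abstract can be sketched as follows. This is a minimal illustration only: the identifiers, the toy feature extractor, and the threshold classifier are all hypothetical, not the actual SATIN/SOFT1 interface.

```python
# Hedged sketch of the SATIN exchange loop: the scientist sends an
# extractor plus identifiers, the company returns features (never audio),
# and the scientist publishes per-track classification results.

def extract_features(audio):
    """Toy feature extractor a scientist would send to the company."""
    n = len(audio)
    mean = sum(audio) / n
    energy = sum(x * x for x in audio) / n
    return {"mean": mean, "energy": energy}

# Company side: audio stays behind copyright locks.
PRIVATE_CATALOGUE = {
    "FRZ039800212": [0.1, -0.2, 0.3, 0.0],   # hypothetical identifier -> samples
    "USUM71703692": [0.5, 0.4, -0.1, 0.2],
}

def company_run(extractor, satin_ids):
    """Run the submitted extractor on the requested identifiers and
    return only the computed features, never the audio itself."""
    return {tid: extractor(PRIVATE_CATALOGUE[tid]) for tid in satin_ids}

# Scientist side: request features by identifier, then classify each track.
features = company_run(extract_features, ["FRZ039800212", "USUM71703692"])

def classify(feat):
    """Hypothetical threshold rule: low energy -> 'instrumental'."""
    return "instrumental" if feat["energy"] < 0.08 else "song"

results = {tid: classify(f) for tid, f in features.items()}
```

Because only identifiers and features cross the company boundary, two research groups can compare results track by track without ever sharing audio.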
https://hal.archives-ouvertes.fr/hal-01762796
Contributor: Yann Bayle
Submitted on: Tuesday, April 10, 2018 - 2:15:26 PM
Last modification on: Tuesday, May 29, 2018 - 9:14:04 AM

Citation

Yann Bayle, Matthias Robine, Pierre Hanna. SATIN: A persistent musical database for music information retrieval and a supporting deep learning experiment on song instrumental classification. Multimedia Tools and Applications, Springer Verlag, 2018, ⟨10.1007/s11042-018-5797-8⟩. ⟨hal-01762796⟩
