Conference paper, 2021

What Musical Knowledge Does Self-Attention Learn?

Abstract

Since their conception for NLP tasks in 2017, Transformer neural networks have been used increasingly, with compelling results, for a variety of symbolic MIR tasks including music analysis, classification, and generation. Although the concept of self-attention between words in a text can intuitively be transposed to relations between musical objects such as notes or chords in a score, it remains relatively unknown precisely which kinds of musical relations tend to be captured by self-attention mechanisms when they are applied to musical data. Moreover, the principle of self-attention was elaborated in NLP to help model the “meaning” of a sentence, whereas in the musical domain this concept appears to be more subjective. In this exploratory work, we open the music Transformer black box, looking to identify which aspects of music are actually learnt by the self-attention mechanism. We apply this approach to two MIR probing tasks: composer classification and cadence identification.
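As a concrete illustration of the probing methodology the abstract alludes to, the sketch below freezes a self-attention layer, extracts its attention maps over token sequences, and fits a lightweight linear probe on those maps for a binary classification task. This is a minimal sketch only: the single randomly initialised attention layer, the toy vocabulary, and the random "scores" and composer labels are all stand-ins, since the abstract does not specify the authors' actual model, tokenisation, or data.

```python
# Minimal probing sketch: do a model's self-attention maps carry enough
# signal to predict a musical label? All sizes and data here are toy
# assumptions, not the authors' setup.
import torch
import torch.nn as nn
from sklearn.linear_model import LogisticRegression

torch.manual_seed(0)

VOCAB, DIM, HEADS, SEQ = 128, 64, 4, 32  # assumed toy dimensions

# Stand-in for a pretrained music transformer: one embedding layer
# followed by one self-attention layer, both frozen (no training here).
embed = nn.Embedding(VOCAB, DIM)
attn = nn.MultiheadAttention(DIM, HEADS, batch_first=True)

def attention_features(tokens):
    """Run self-attention and flatten the head-averaged attention map."""
    x = embed(tokens)                                  # (batch, seq, dim)
    _, weights = attn(x, x, x, need_weights=True)      # (batch, seq, seq)
    return weights.reshape(tokens.shape[0], -1).detach().numpy()

# Random token sequences with random binary "composer" labels.
tokens = torch.randint(0, VOCAB, (200, SEQ))
labels = torch.randint(0, 2, (200,)).numpy()

# The probe itself: a logistic regression trained on attention maps,
# evaluated on held-out sequences.
feats = attention_features(tokens)
probe = LogisticRegression(max_iter=1000).fit(feats[:150], labels[:150])
print("held-out probe accuracy:", probe.score(feats[150:], labels[150:]))
```

With real data, a probe accuracy well above chance would suggest the attention maps encode information relevant to the task (here, composer identity); near-chance accuracy on this random stand-in data is the expected baseline.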
Main file: keller_loiseau_bigo_NLP4MuSA.pdf (441.07 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03419236, version 1 (09-11-2021)
hal-03419236, version 2 (25-11-2021)

Identifiers

  • HAL Id: hal-03419236, version 2

Cite

Mikaela Keller, Gabriel Loiseau, Louis Bigo. What Musical Knowledge Does Self-Attention Learn?. Workshop on NLP for Music and Spoken Audio (NLP4MuSA 2021), 2021, Online, France. ⟨hal-03419236v2⟩
