The Augmented String Quartet: Experiments and Gesture Following
Abstract
We present interdisciplinary research undertaken in the development of an ‘augmented’ string quartet. Hardware and software components were designed specifically to enable mixed acoustic/electronic music in which bow gestures drive digital sound processes. Specifically, inertial motion sensors and a bow force sensor were added to each musician's bow, and dedicated modules allowed for wireless data transmission to an online gesture analysis system. Prior to the performance, a research phase was conducted to qualitatively evaluate the variability of the gesture data. Recording sessions of both gesture and audio data were carried out with a professional string quartet. The musical material included a set of prototypical musical phrases covering various bowing styles and playing techniques, as well as a complete music composition. Analysis of the recorded sessions allowed us to compare consistency within and between players. While each individual player was found to be generally consistent, the comparison between players revealed significant gestural idiosyncrasies. These results helped us adapt a real-time gesture analysis system called the gesture follower. This tool successfully and automatically synchronized the live performance with electronic sound transformations in two concerts. A quantitative assessment is reported for a specific section of the piece, illustrating the accuracy achieved and the types of errors encountered.
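The core idea of a gesture follower is to align an incoming stream of sensor frames against a pre-recorded reference gesture in real time, so that the estimated position in the reference can trigger synchronized electronic processes. The abstract does not specify the algorithm used; the sketch below is an illustrative stand-in based on online dynamic time warping, not the authors' actual system. The function name `follow` and the feature representation are assumptions for illustration.

```python
import numpy as np

def follow(template, stream):
    """Online alignment of an incoming gesture stream to a reference
    template (an illustrative online-DTW stand-in for gesture following).

    template : (T, d) array of reference feature frames
    stream   : iterable of d-dimensional frames arriving one at a time
    Yields the estimated template position after each incoming frame.
    """
    T = len(template)
    prev = None  # accumulated alignment cost for each template position
    for frame in stream:
        # Local distance between the new frame and every template frame.
        d = np.linalg.norm(template - np.asarray(frame), axis=1)
        if prev is None:
            # First frame: the path must start at template[0]; a run of
            # vertical steps accumulates distances along the template.
            cur = np.cumsum(d)
        else:
            cur = np.empty(T)
            cur[0] = prev[0] + d[0]
            for j in range(1, T):
                # Standard DTW recurrence, computed column by column
                # as frames arrive (insertion, match, deletion steps).
                cur[j] = d[j] + min(prev[j], prev[j - 1], cur[j - 1])
        prev = cur
        # Current best guess of where we are in the reference gesture.
        yield int(np.argmin(cur))
```

For example, feeding the template back in as the stream yields the positions 0, 1, 2, ... in order. A production follower would normalize costs by path length and bound the search band for constant per-frame cost.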
Domains
Sound [cs.SD]; Human-Computer Interaction [cs.HC]; Music, musicology and performing arts; Signal and Image Processing [eess.SP]; Machine Learning [cs.LG]; Artificial Intelligence [cs.AI]; Computer-Aided Engineering; Multimedia [cs.MM]; Computer Vision and Pattern Recognition [cs.CV]; Other [cs.OH]
Origin: Files produced by the author(s)