High level control of sound synthesis for sonification processes

Richard Kronland-Martinet (1), Solvi Ystad (1), Mitsuko Aramaki (2)
(1) Sons, LMA - Laboratoire de Mécanique et d'Acoustique (Marseille), UPR 7051
(2) Sons, INCM - Institut de neurosciences cognitives de la Méditerranée, UMR 6193
Abstract: Methods of sonification based on the design and control of sound synthesis are presented in this paper. The semiotics of isolated sounds was investigated through fundamental studies combining an acoustical and a brain imaging (Event-Related Potentials) approach. The perceptual cues (known as invariants) responsible for the evocations elicited by sounds generated by impacts, moving sound sources, dynamic events and vehicles (car-door closing and car engine noise) were then identified on the basis of physical and perceptual considerations. Lastly, some examples of the high-level control of a synthesis process simulating immersive 3-D auditory scenes, interacting objects and evoked dynamics are presented.
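The impact sounds mentioned in the abstract are commonly modeled as sums of exponentially damped sinusoids, whose modal frequencies and damping coefficients act as perceptual invariants (e.g. for the perceived material of a struck object). A minimal sketch of this kind of synthesis is given below; the modal values are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def impact_sound(freqs, decays, amps, sr=44100, dur=1.0):
    """Synthesize an impact-like sound as a sum of exponentially
    damped sinusoids (a standard modal model for struck objects).

    freqs  : modal frequencies in Hz
    decays : damping coefficients in 1/s (larger -> shorter ring)
    amps   : initial amplitude of each mode
    """
    t = np.arange(int(sr * dur)) / sr
    signal = np.zeros_like(t)
    for f, d, a in zip(freqs, decays, amps):
        signal += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    # normalize to avoid clipping when written to audio output
    return signal / np.max(np.abs(signal))

# Illustrative "metallic" impact: sparse, slowly decaying modes.
s = impact_sound(freqs=[523.0, 1308.0, 2417.0],
                 decays=[3.0, 5.0, 8.0],
                 amps=[1.0, 0.6, 0.3])
```

In such models, raising the damping coefficients shortens the ring and shifts the percept toward wood- or plastic-like materials, which is the sense in which these low-level parameters can be mapped to the high-level, evocation-based controls discussed in the paper.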
Contributor: Solvi Ystad
Submitted on : Wednesday, December 1, 2010 - 11:04:46 AM
Last modification on : Monday, March 4, 2019 - 2:04:25 PM




Richard Kronland-Martinet, Solvi Ystad, Mitsuko Aramaki. High level control of sound synthesis for sonification processes. AI and Society, Springer Verlag, 2012, 24 (1), pp.245-255. ⟨10.1007/s00146-011-0340-8⟩. ⟨hal-00541776⟩


