
Sound context modulates perceived vocal emotion

Abstract: Many animal vocalizations contain nonlinear acoustic phenomena as a consequence of physiological arousal. In humans, nonlinear features are processed early in the auditory system, and are used to efficiently detect alarm calls and other urgent signals. Yet, high-level emotional and semantic contextual factors likely guide the perception and evaluation of roughness features in vocal sounds. Here we examined the relationship between perceived vocal arousal and auditory context. We presented listeners with nonverbal vocalizations (yells of a single vowel) at varying levels of portrayed vocal arousal, in two musical contexts (clean guitar, distorted guitar) and one non-musical context (modulated noise). As predicted, vocalizations with higher levels of portrayed vocal arousal were judged as more negative and more emotionally aroused than the same voices produced with low vocal arousal. Moreover, both the perceived valence and emotional arousal of vocalizations were significantly affected by both musical and non-musical contexts. These results show the importance of auditory context in judging emotional arousal and valence in voices and music, and suggest that nonlinear features in music are processed similarly to communicative vocal signals.

Cited literature: 20 references
Contributor: Jean-Julien Aucouturier
Submitted on: Friday, January 31, 2020 - 3:00:46 PM
Last modification on: Saturday, March 28, 2020 - 1:55:31 AM
Long-term archiving on: Friday, May 1, 2020 - 3:38:02 PM





Marco Liuni, Emmanuel Ponsot, Gregory Bryant, Jean-Julien Aucouturier. Sound context modulates perceived vocal emotion. Behavioural Processes, Elsevier, 2020, ⟨10.1016/j.beproc.2020.104042⟩. ⟨hal-02462759⟩


