
Realistic manipulation of facial and vocal smiles in real-world video streams.

Abstract: Research in affective computing and cognitive science has shown the importance of emotional facial and vocal expressions in human-computer and human-human interactions. However, while models exist to control the display and interactive dynamics of emotional expressions, such as smiles, in embodied agents, these techniques cannot be applied to video interactions between humans. In this work, we propose an audiovisual smile transformation algorithm that manipulates an incoming video stream in real time to parametrically control the amount of smile seen on the user's face and heard in their voice, while preserving other characteristics such as the user's identity and the timing and content of the interaction. The transformation is composed of separate audio and visual pipelines, both based on a warping technique informed by real-time detection of audio and visual landmarks. Taken together, these two parts constitute a unique audiovisual algorithm which, in addition to providing simultaneous real-time transformations of a real person's face and voice, makes it possible to investigate the integration of both smile modalities in real-world social interactions.
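The parametric control described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the landmark names, displacement vectors, and intensity convention below are assumptions for illustration. The idea is that detected 2D facial landmarks are displaced toward a full-intensity smile configuration by a scalar amount, and the displaced landmarks would then drive the image warp.

```python
import numpy as np

def warp_landmarks(landmarks, smile_delta, alpha):
    """Displace facial landmarks toward a smile configuration.

    landmarks   : (N, 2) array of (x, y) points detected in the frame
    smile_delta : (N, 2) array of per-landmark displacement vectors
                  defining a full-intensity smile (hypothetical values)
    alpha       : scalar; 0 = unchanged, 1 = full smile,
                  negative values attenuate an existing smile
    """
    landmarks = np.asarray(landmarks, dtype=float)
    return landmarks + alpha * np.asarray(smile_delta, dtype=float)

# Toy example: two mouth corners pulled up and outward at half intensity.
corners = np.array([[120.0, 200.0], [180.0, 200.0]])
delta   = np.array([[-4.0, -6.0], [4.0, -6.0]])  # hypothetical smile vectors
print(warp_landmarks(corners, delta, 0.5))       # [[118. 197.] [182. 197.]]
```

In a real-time pipeline, the same scalar would simultaneously parameterize the audio warping, keeping the two modalities synchronized frame by frame.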
Contributor: Jean-Julien Aucouturier
Submitted on: Friday, July 12, 2019 - 1:53:16 PM
Last modified on: Wednesday, October 14, 2020 - 4:18:58 AM





Pablo Arias, Catherine Soladie, Oussema Bouafif, Axel Roebel, Renaud Seguier, et al. Realistic manipulation of facial and vocal smiles in real-world video streams. IEEE Transactions on Affective Computing, Institute of Electrical and Electronics Engineers, 2019, PP (99), pp. 1-1. ⟨10.1109/TAFFC.2018.2811465⟩. ⟨hal-01712834⟩


