Multimodal recognition of emotions using a formal computational model

Abstract: In this paper, we present a multimodal approach to emotion recognition that takes into account multiple sources of information (physiological signals, facial expressions, speech, etc.). This approach is based on an algebraic representation of emotional states as multidimensional vectors. This multidimensional model provides powerful mathematical tools for the analysis and processing of emotions. It allows information from different modalities (speech, facial expressions, gestures) to be integrated in order to obtain a more reliable estimation of emotional states. Indeed, our proposal aims at efficient recognition of emotional states even when they appear superposed or masked.
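To illustrate the idea described in the abstract, the following is a minimal sketch of representing each modality's output as a vector over basic-emotion components and fusing them into one estimate. The component basis, the weights, and the weighted-sum fusion rule are illustrative assumptions, not the paper's actual algebraic model.

```python
import numpy as np

# Illustrative basis of basic-emotion components; the paper's actual
# dimensions and their number are defined by its algebraic model.
BASIS = ["joy", "sadness", "anger", "fear", "surprise", "disgust"]

def fuse_modalities(estimates, weights=None):
    """Combine per-modality emotion vectors into a single estimate.

    `estimates` maps a modality name (e.g. "speech", "face", "gesture")
    to a vector over BASIS. A weighted sum followed by normalization is
    used here purely as a placeholder fusion rule.
    """
    modalities = list(estimates)
    if weights is None:
        weights = {m: 1.0 / len(modalities) for m in modalities}
    fused = np.zeros(len(BASIS))
    for m in modalities:
        fused += weights[m] * np.asarray(estimates[m], dtype=float)
    total = fused.sum()
    return fused / total if total > 0 else fused

# Example: the facial expression masks the emotion (near-neutral face),
# while speech and gesture still carry the underlying sadness.
estimates = {
    "speech":  [0.05, 0.70, 0.10, 0.10, 0.00, 0.05],
    "face":    [0.20, 0.20, 0.15, 0.15, 0.15, 0.15],
    "gesture": [0.05, 0.60, 0.15, 0.10, 0.05, 0.05],
}
fused = fuse_modalities(estimates)
print(dict(zip(BASIS, fused.round(3))))  # "sadness" dominates the fused vector
```

In this toy setting, combining the three modality vectors lets the masked emotion emerge in the fused estimate, which is the kind of robustness the multimodal approach targets.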
Document type: Conference paper
2012 IEEE International Conference on Complex Systems (ICCS), Nov 2012, Agadir, Morocco. IEEE Xplore, pp. 1-6, 2013, <http://ieeexplore.ieee.org/document/6458511/>. <10.1109/ICoCS.2012.6458511>

https://hal.archives-ouvertes.fr/hal-01516378
Contributor: Nhan Le Thanh
Submitted on: Sunday, April 30, 2017 - 17:58:02
Last modified on: Monday, May 1, 2017 - 01:05:44

Citation

Imen Tayari-Meftah, Nhan Le-Thanh, Chokri Ben-Amar. Multimodal recognition of emotions using a formal computational model. 2012 IEEE International Conference on Complex Systems (ICCS) , Nov 2012, Agadir, Morocco. IEEE Xplore, pp.1-6, 2013, <http://ieeexplore.ieee.org/document/6458511/>. <10.1109/ICoCS.2012.6458511>. <hal-01516378>
