Multimodal Recognition of Emotions Using Physiological Signals with the Method of Decision-Level Fusion for Healthcare Applications

Abstract: Automatic emotion recognition dramatically enhances the development of human/machine dialogue. It allows computers to determine the emotion felt by the user and to adapt their behavior accordingly. This paper presents a new signal-fusion method for the multimodal recognition of eight basic emotions from physiological signals. After a learning phase in which an emotion database is constructed, we apply the recognition algorithm to each modality separately. We then merge these individual decisions with a decision-level fusion approach to improve the recognition rate. The experiments show that the proposed method achieves highly accurate emotion recognition, reaching a recognition rate of 81.69% under certain conditions.
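The abstract does not spell out the fusion rule, but the general idea of decision-level fusion can be illustrated with a short sketch. The snippet below assumes a simple, optionally weighted majority vote over per-modality classifier outputs; the function name fuse_decisions, the weighting scheme, and the example labels are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter

def fuse_decisions(per_modality_labels, weights=None):
    """Fuse per-modality decisions by an (optionally weighted) majority vote.

    per_modality_labels: one predicted emotion label per modality, e.g. the
        outputs of classifiers trained on different physiological signals.
    weights: optional per-modality confidence weights (defaults to 1.0 each).
    Returns the fused emotion label.
    """
    if weights is None:
        weights = [1.0] * len(per_modality_labels)
    scores = Counter()
    for label, weight in zip(per_modality_labels, weights):
        scores[label] += weight
    # The label with the highest accumulated score is the fused decision.
    return scores.most_common(1)[0][0]

# Example: two of three hypothetical modality classifiers agree, so "joy" wins.
print(fuse_decisions(["joy", "joy", "surprise"]))  # -> joy
```

The exact fusion rule used by the authors may differ; the paper only reports that fusing per-modality decisions in this spirit yields the 81.69% recognition rate cited above.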

https://hal.archives-ouvertes.fr/hal-01174349
Contributor: Jean-Pierre Damiano
Submitted on: Thursday, July 9, 2015 - 8:23:36 AM
Last modification on: Monday, November 5, 2018 - 3:52:09 PM

Identifiers

  • HAL Id: hal-01174349, version 1

Citation

Chaka Koné, Imen Tayari-Meftah, Nhan Le Thanh, Cécile Belleudy. Multimodal Recognition of Emotions Using Physiological Signals with the Method of Decision-Level Fusion for Healthcare Applications. 13th International Conference on Smart Homes and Health Telematics (Inclusive Smart Cities and e-Health), Jun 2015, Genève, Switzerland. pp. 301-306, 2015. 〈hal-01174349〉
