Multimodal Recognition of Emotions Using Physiological Signals with the Method of Decision-Level Fusion for Healthcare Applications

Abstract: Automatic emotion recognition dramatically enhances the development of human/machine dialogue. Indeed, it allows computers to determine the emotion felt by the user and to adapt their behavior accordingly. This paper presents a new signal-fusion method for the multimodal recognition of eight basic emotions from physiological signals. After a learning phase in which an emotion database is constructed, we apply the recognition algorithm to each modality separately. We then merge all of these per-modality decisions with a decision-level fusion approach to improve the recognition rate. The experiments show that the proposed method achieves high-accuracy emotion recognition, reaching a recognition rate of 81.69% under certain conditions.
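The abstract describes a two-stage pipeline: a classifier is run on each physiological modality independently, and the per-modality decisions are then merged at the decision level. The paper does not specify the fusion rule here, so the sketch below assumes a simple majority vote over labels; the emotion names and the modality examples in the comments are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

def fuse_decisions(decisions):
    """Decision-level fusion by majority vote: each modality's
    classifier outputs one emotion label, and the most frequent
    label wins. Ties resolve to the label that appears first
    (Counter preserves insertion order)."""
    return Counter(decisions).most_common(1)[0][0]

# Hypothetical decisions from three modalities (e.g. ECG, EDA,
# respiration), each already classified separately:
fused = fuse_decisions(["joy", "joy", "surprise"])
print(fused)  # joy
```

More elaborate decision-level schemes (weighted votes based on each modality's validation accuracy, for instance) follow the same structure: only the reduction from the list of decisions to a single label changes.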
Document type: Conference paper
13th International Conference On Smart homes and health Telematics Inclusive Smart Cities and e-Health, Jun 2015, Genève, Switzerland. pp.301-306, 2015

https://hal.archives-ouvertes.fr/hal-01174349
Contributor: Jean-Pierre Damiano
Submitted on: Thursday, July 9, 2015 - 08:23:36
Last modified on: Thursday, October 12, 2017 - 01:14:14

Identifiers

  • HAL Id : hal-01174349, version 1

Citation

Chaka Koné, Imen Tayari Meftah, Nhan Le Thanh, Cécile Belleudy. Multimodal Recognition of Emotions Using Physiological Signals with the Method of Decision-Level Fusion for Healthcare Applications. 13th International Conference On Smart homes and health Telematics Inclusive Smart Cities and e-Health, Jun 2015, Genève, Switzerland. pp.301-306, 2015. 〈hal-01174349〉
