Book sections

Multimodal Recognition of Emotions Using Physiological Signals with the Method of Decision-Level Fusion for Healthcare Applications

Abstract: Automatic emotion recognition dramatically enhances the development of human/machine dialogue. Indeed, it allows computers to determine the emotion felt by the user and to adapt their behavior accordingly. This paper presents a new signal-fusion method for the multimodal recognition of eight basic emotions from physiological signals. After a learning phase in which an emotion database is constructed, we apply the recognition algorithm to each modality separately. We then merge these per-modality decisions by applying a decision-level fusion approach to improve the recognition rate. Experiments show that the proposed method achieves high-accuracy emotion recognition, reaching a recognition rate of 81.69% under certain conditions.
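To illustrate the decision-level fusion step described in the abstract, here is a minimal sketch in Python. The abstract does not specify the exact fusion rule, so this example assumes a simple majority vote over per-modality decisions; the modality names and labels are hypothetical and chosen only for illustration.

```python
from collections import Counter

# Hypothetical per-modality decisions for one sample: each physiological
# modality (names assumed for illustration) outputs its own predicted
# emotion label from its separately trained classifier.
modality_decisions = {
    "heart_rate": "joy",
    "skin_conductance": "joy",
    "respiration": "surprise",
}

def fuse_decisions(decisions):
    """Decision-level fusion by simple majority vote.

    This is one possible fusion rule; the paper's exact rule is not
    given in the abstract, so treat this as an assumption.
    """
    votes = Counter(decisions.values())
    label, _ = votes.most_common(1)[0]
    return label

print(fuse_decisions(modality_decisions))  # -> "joy"
```

In a weighted variant, each modality's vote could be scaled by its individual recognition accuracy before counting, which is a common refinement of majority-vote fusion.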

https://hal.archives-ouvertes.fr/hal-01516225
Contributor: Nhan Le Thanh
Submitted on: Saturday, April 29, 2017 - 10:05:20 AM
Last modification on: Tuesday, May 26, 2020 - 6:50:48 PM

Citation

Chaka Koné, Imen Tayari-Meftah, Nhan Le Thanh, Cecile Belleudy. Multimodal Recognition of Emotions Using Physiological Signals with the Method of Decision-Level Fusion for Healthcare Applications. In: Inclusive Smart Cities and e-Health, Lecture Notes in Computer Science, vol. 9102, Springer, Cham, 2015, pp. 301-306, ISBN 978-3-319-19312-0. ⟨10.1007/978-3-319-19312-0_26⟩. ⟨hal-01516225⟩
