Multimodal Approach for Emotion Recognition Using an Algebraic Representation of Emotional States
Abstract
Emotions play a key role in human-computer interaction. They are generally expressed through several channels (e.g., facial expressions, speech, body postures, and gestures). In this paper, we present a multimodal approach to emotion recognition that integrates information coming from different cues and modalities. It is based on a formal multidimensional model using an algebraic representation of emotional states. This multidimensional model can represent an infinity of emotions and provides powerful mathematical tools for the analysis and processing of these emotions. It estimates the human emotional state by combining information from different modalities (e.g., facial expressions, speech, body postures, and gestures), allowing a more reliable estimation of emotional states. Our proposal recognizes not only basic emotions (e.g., anger, sadness, fear) but also different types of complex emotions, such as simulated and masked emotions. Experimental results show that the proposed approach increases recognition rates in comparison with unimodal approaches.
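As a rough illustration of the idea of combining modality estimates in a multidimensional emotion space, the sketch below fuses per-modality emotion vectors by confidence-weighted averaging. The dimension names, the weighting scheme, and the `fuse` function are illustrative assumptions, not the paper's actual algebraic model.

```python
# Hypothetical sketch: emotional states as vectors over basic-emotion
# dimensions, fused across modalities by confidence-weighted averaging.
# Dimensions and weights are illustrative, not taken from the paper.

EMOTIONS = ["anger", "sadness", "fear", "joy"]

def fuse(estimates):
    """Combine per-modality emotion vectors into one emotional state.

    estimates: list of (vector, confidence) pairs, one per modality
    (e.g. face, speech, posture). Returns the normalized weighted sum.
    """
    total = [0.0] * len(EMOTIONS)
    weight_sum = 0.0
    for vector, confidence in estimates:
        for i, value in enumerate(vector):
            total[i] += confidence * value
        weight_sum += confidence
    return [value / weight_sum for value in total]

# Example: the face strongly suggests joy while speech weakly suggests
# sadness -- the kind of inconsistency seen with a masked emotion.
face = ([0.1, 0.0, 0.0, 0.9], 0.8)
speech = ([0.1, 0.6, 0.1, 0.2], 0.4)
fused = fuse([face, speech])
dominant = EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]
print(dominant)  # the modality with higher confidence dominates
```

In such a vector representation, a disagreement between modality vectors (as above) is itself usable evidence for complex states like masked emotions, which is one motivation for fusing modalities rather than classifying each in isolation.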