Conference paper. Year: 2007

Emotion Classification or face identification depend on which part of the face is analyzed

Abstract

Gosselin and Schyns (2001) demonstrated that two distinct categorizations of the same faces require different visual information: the mouth is the only diagnostic region for expression, whereas the eyes and the centre of the mouth are needed to recognize gender. Using images from their database (five men and five women, each displaying three different emotions), we propose a model of the human visual system (HVS) dedicated to face analysis. Our HVS model has two stages: a retina model that enhances structure and texture information (so that the video data are well conditioned), and a cortical (V1) model that extracts a description of the orientation and frequency content of the visual stimuli. The model confirms the behavioural results of Gosselin and Schyns and, in addition, shows that the upper part of the face conveys the identity (not only the gender) of a person (more than 80% correct identification), whereas only the lower part is needed to classify emotions (angry, happy, or neutral; more than 85% correct classification). Further experiments are being carried out to test the model on larger databases.
No file deposited

Dates and versions

hal-00193535, version 1 (03-12-2007)

Identifiers

  • HAL Id: hal-00193535, version 1

Cite

Alexandre Benoit, Nathalie Guyader, Alice Caplier, Jeanny Hérault. Emotion Classification or face identification depend on which part of the face is analyzed. ECVP 2007 - European Conference on Visual Perception, Aug 2007, Arezzo, Italy. ⟨hal-00193535⟩
