Conference papers

Associating textual features with visual ones to improve affective image classification

Ningning Liu 1 Emmanuel Dellandréa 1 Bruno Tellez 1 Liming Chen 1 
1 imagine - Extraction de Caractéristiques et Identification
LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information
Abstract : Many images carry a strong emotional semantic. In recent years, a few investigations have attempted to automatically identify the emotions that images may induce in viewers, based on low-level image properties. Since such features capture only the overall atmosphere of an image, they may fail when the emotional meaning is carried by objects in the image, such as a crying child. Additional information is therefore needed, and in this paper we propose to exploit textual information describing the image, such as tags. We have developed features based on this text to capture its emotional meaning, which are then combined with visual features using an approach based on the evidence theory. Experiments were conducted on two datasets to evaluate the visual and textual features and their fusion. The results show that our textual features can improve the classification accuracy of affective images.
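The abstract mentions fusing textual and visual predictions through evidence theory. The paper's exact fusion scheme is not given here, but the core operation in evidence (Dempster–Shafer) theory is Dempster's rule of combination. The sketch below is a minimal, hypothetical illustration: it assumes each modality outputs a mass function over the two affect classes {positive, negative}, with some mass left on the whole frame of discernment to represent ignorance. All names and the example numbers are assumptions for illustration, not values from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses
    to masses) with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible evidence: mass goes to the intersection
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:      # incompatible evidence: counts as conflict
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    norm = 1.0 - conflict  # renormalise over non-conflicting mass
    return {h: w / norm for h, w in combined.items()}

# Hypothetical example: a visual and a textual classifier each give
# beliefs over positive/negative affect, plus residual ignorance.
POS, NEG = frozenset({"pos"}), frozenset({"neg"})
THETA = POS | NEG  # the full frame of discernment
visual  = {POS: 0.5, NEG: 0.2, THETA: 0.3}
textual = {POS: 0.6, NEG: 0.1, THETA: 0.3}
fused = dempster_combine(visual, textual)
```

Because both sources lean towards the positive class, the fused mass on `POS` exceeds either input's, which is the behaviour that makes this rule attractive for combining complementary modalities.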
Contributor : Équipe gestionnaire des publications SI LIRIS
Submitted on : Thursday, August 18, 2016 - 7:28:19 PM
Last modification on : Tuesday, June 1, 2021 - 2:08:09 PM

Ningning Liu, Emmanuel Dellandréa, Bruno Tellez, Liming Chen. Associating textual features with visual ones to improve affective image classification. International Conference on Affective Computing and Intelligent Interaction (ACII2011), Oct 2011, Memphis, TN, United States. pp.195-204, ⟨10.1007/978-3-642-24600-5_23⟩. ⟨hal-01354456⟩