A Multi-Componential Analysis of Emotions during Complex Learning with an Intelligent Multi-agent System

Abstract: This paper evaluates the synchronization of three emotional measurement methods (automatic facial expression recognition, self-report, and electrodermal activity) and their agreement regarding learners' emotions. Data were collected from 67 undergraduates enrolled at a North American university who learned about a complex science topic while interacting with MetaTutor, a multi-agent computerized learning environment. Videos of learners' facial expressions captured with a webcam were analyzed using automatic facial recognition software (FaceReader 5.0). Learners' physiological arousal was recorded using Affectiva's Q-Sensor 2.0 electrodermal activity measurement bracelet. Learners self-reported their experience of 19 different emotional states on five occasions during the learning session; these reports were used as markers to synchronize the FaceReader and Q-Sensor data. We found high agreement between the facial and self-report data (75.6%), but low agreement between each of these and the Q-Sensor data, suggesting that a tightly coupled relationship does not always exist between emotional response components.
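As an illustration of the kind of agreement analysis the abstract describes, the sketch below computes simple percent agreement between two aligned streams of emotion labels (e.g., FaceReader classifications versus self-reports at the five report occasions). The function name, labels, and data are hypothetical and for illustration only; the paper's actual alignment and coding procedure is described in its Method section.

```python
# Hypothetical sketch: percent agreement between two aligned emotion-label
# streams (e.g., FaceReader output vs. self-reports at each report occasion).
# Labels and values are illustrative, not taken from the study's data.

def percent_agreement(labels_a, labels_b):
    """Share of aligned observations on which both sources give the same label."""
    if len(labels_a) != len(labels_b):
        raise ValueError("Label streams must be aligned (equal length).")
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return 100.0 * matches / len(labels_a)

# One learner's labels at five self-report occasions (illustrative values).
facereader = ["neutral", "happy", "neutral", "bored", "neutral"]
self_report = ["neutral", "happy", "confused", "bored", "neutral"]

print(f"Agreement: {percent_agreement(facereader, self_report):.1f}%")  # 80.0%
```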
Identifiers

https://hal.archives-ouvertes.fr/hal-01340608
Citation

Jason M. Harley, François Bouchet, M. Sazzad Hussain, Roger Azevedo, Rafael A. Calvo. A Multi-Componential Analysis of Emotions during Complex Learning with an Intelligent Multi-agent System. Computers in Human Behavior, Elsevier, 2015, 48, pp. 615-625. ⟨10.1016/j.chb.2015.02.013⟩. ⟨hal-01340608⟩
