Gaze and face-to-face interaction

Gérard Bailly (1), Alaeddine Mihoub (1, 2), Christian Wolf (2), Frédéric Elisei (1, 3)
(1) CRISSP team (GIPSA-CRISSP), Département Parole et Cognition (GIPSA-DPC), GIPSA-lab - Grenoble Images Parole Signal Automatique
(3) GIPSA-Services
Abstract: This chapter describes experimental and modeling work that characterizes the gaze patterns mutually exchanged by interlocutors during situated, task-directed, face-to-face two-way interactions. We show that these gaze patterns (including blinking rate) are significantly influenced by the cognitive states of the interlocutors (speaking, listening, thinking, etc.), by their respective roles in the conversation (e.g. instruction giver, respondent), and by their social relationship (e.g. colleague, supervisor). The chapter provides insights into the (micro-)coordination of gaze with other components of attention management, as well as methodologies for capturing and modeling the behavioral regularities observed in experimental data. A particular emphasis is put on statistical models, which can learn behaviors in a data-driven way. We introduce several statistical models of multimodal behavior that can be trained on such multimodal signals and that generate behaviors given perceptual cues. In particular, we compare the performance and properties of models that explicitly capture the temporal structure of the studied signals and relate them to internal cognitive states (Hidden Semi-Markov Models and Dynamic Bayesian Networks) with classifiers that ignore sequential structure (Support Vector Machines and Decision Trees). We further show that the gaze of conversational agents (virtual talking heads, speaking robots) can have a strong impact on communication efficiency. One conclusion we draw from these experiments is that multimodal behavioral models generating the co-verbal gaze patterns of interactive avatars should be designed with great care, so as not to increase the cognitive load of their human partners.
Experiments in which the gaze of artificial agents (virtual talking heads and humanoid robots) was controlled in an impoverished or irrelevant way have demonstrated the negative impact of such control on communication (Garau, Slater, Bee, & Sasse, 2001).
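The contrast drawn above between models that capture temporal structure and frame-wise classifiers can be sketched with a toy example: a pure-Python Viterbi decoder that infers the most likely sequence of hidden cognitive states (speaking, listening, thinking) from a sequence of observed gaze targets, exploiting state-transition probabilities that a frame-by-frame classifier would ignore. This is only an illustrative sketch of the general technique (a plain first-order HMM, not the chapter's Hidden Semi-Markov Models or Dynamic Bayesian Networks); all state names, gaze labels, and probabilities are invented for illustration and are not taken from the chapter's data.

```python
# Toy Viterbi decoder: infer hidden cognitive states from observed gaze
# targets. All parameters below are hypothetical, for illustration only.
import math

STATES = ["speaking", "listening", "thinking"]

# Hypothetical model parameters (probabilities, converted to logs below)
start = {"speaking": 0.4, "listening": 0.4, "thinking": 0.2}
trans = {
    "speaking":  {"speaking": 0.6, "listening": 0.3, "thinking": 0.1},
    "listening": {"speaking": 0.3, "listening": 0.6, "thinking": 0.1},
    "thinking":  {"speaking": 0.2, "listening": 0.2, "thinking": 0.6},
}
emit = {
    "speaking":  {"partner_face": 0.5, "task_area": 0.4, "averted": 0.1},
    "listening": {"partner_face": 0.7, "task_area": 0.2, "averted": 0.1},
    "thinking":  {"partner_face": 0.1, "task_area": 0.3, "averted": 0.6},
}

def viterbi(observations):
    """Return the most likely hidden-state sequence for the observations."""
    # delta[s] = best log-probability of any state path ending in s
    delta = {s: math.log(start[s]) + math.log(emit[s][observations[0]])
             for s in STATES}
    backptr = []
    for obs in observations[1:]:
        new_delta, ptr = {}, {}
        for s in STATES:
            # Best predecessor of s, given the scores so far
            prev = max(STATES, key=lambda p: delta[p] + math.log(trans[p][s]))
            new_delta[s] = (delta[prev] + math.log(trans[prev][s])
                            + math.log(emit[s][obs]))
            ptr[s] = prev
        delta = new_delta
        backptr.append(ptr)
    # Backtrack from the best final state
    state = max(STATES, key=delta.get)
    path = [state]
    for ptr in reversed(backptr):
        state = ptr[state]
        path.append(state)
    return list(reversed(path))

gaze = ["averted", "averted", "partner_face", "partner_face", "task_area"]
print(viterbi(gaze))
# → ['thinking', 'thinking', 'speaking', 'speaking', 'speaking']
```

Note how the transition probabilities smooth the decoded states into plausible runs (a stretch of averted gaze decoded as "thinking", then sustained partner-directed gaze as "speaking"), whereas a classifier scoring each frame independently could not favor such temporal coherence.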
Identifiers

hal-01939223 · https://hal.archives-ouvertes.fr/hal-01939223
Contributor: Gérard Bailly
Submitted on: 26 February 2019; last modified: 8 March 2019

Citation

Gérard Bailly, Alaeddine Mihoub, Christian Wolf, Frédéric Elisei. Gaze and face-to-face interaction. In: Geert Brône & Bert Oben (eds.), Eye-tracking in Interaction: Studies on the role of eye gaze in dialogue, Benjamins, pp. 139-168, 2018. ⟨10.1075/ais.10.07bai⟩ ⟨hal-01939223⟩
