GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits

Abstract: An increasing number of domains, including aeronautics, are adopting touchscreens. However, several drawbacks limit their operational use; in particular, eyes-free interaction is almost impossible, making it difficult to perform other tasks simultaneously. We introduce GazeForm, an adaptive touch interface with shape-changing capability that offers an interaction modality adapted to gaze direction. When the user's eyes are focused on the interaction area, the surface is flat and the system acts as a touchscreen; when the eyes are directed towards another area, physical knobs emerge from the surface. Compared to a touch-only mode, experimental results showed that GazeForm generated lower subjective mental workload and higher execution efficiency (20% faster). Furthermore, GazeForm required less visual attention, and participants were able to concentrate more on a secondary monitoring task. Complementary interviews with pilots led us to explore timings and levels of control for using gaze to adapt the modality.
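As a rough illustration of the gaze-adaptive behaviour described in the abstract (not the authors' implementation), the following Python sketch shows how a controller might switch the surface between a flat touch mode and a raised-knob mode depending on whether the tracked gaze falls inside the interaction area. All class names, the interaction-area geometry, and the dwell threshold are hypothetical.

from dataclasses import dataclass

@dataclass
class Rect:
    # Axis-aligned rectangle describing the interaction area of the surface.
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class GazeAdaptiveSurface:
    """Toy controller selecting the interaction modality from gaze position."""

    def __init__(self, interaction_area: Rect, dwell_frames: int = 10):
        self.interaction_area = interaction_area
        self.dwell_frames = dwell_frames  # debounce to avoid flickering between modes
        self._counter = 0
        self.mode = "touch"  # "touch" (flat surface) or "knobs" (raised controls)

    def update(self, gaze_x: float, gaze_y: float) -> str:
        # Flat touch surface while the user looks at the interaction area,
        # raised physical knobs while the gaze is directed elsewhere.
        desired = "touch" if self.interaction_area.contains(gaze_x, gaze_y) else "knobs"
        if desired == self.mode:
            self._counter = 0
        else:
            self._counter += 1
            if self._counter >= self.dwell_frames:
                self.mode = desired
                self._counter = 0
        return self.mode

if __name__ == "__main__":
    surface = GazeAdaptiveSurface(Rect(0, 0, 800, 480), dwell_frames=3)
    # Gaze first on the surface, then away towards a monitoring display.
    for x, y in [(100, 100)] * 4 + [(2000, 50)] * 4:
        print(surface.update(x, y))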
Document type: Conference paper

Cited literature: 44 references

https://hal-enac.archives-ouvertes.fr/hal-01809299
Contributor: Sylvain Pauchet
Submitted on: Wednesday, June 6, 2018 - 3:52:42 PM
Last modification on: Tuesday, October 2, 2018 - 3:36:08 PM
Long-term archiving on: Friday, September 7, 2018 - 2:00:29 PM

File: file139_HAL.pdf (produced by the author(s))

Identifiers

HAL Id: hal-01809299
DOI: ⟨10.1145/3196709.3196712⟩

Collections

ENAC | LII

Citation

Sylvain Pauchet, Catherine Letondal, Jean-Luc Vinot, Mickaël Causse, Mathieu Cousy, et al. GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits. DIS '18: Designing Interactive Systems Conference, Jun 2018, Hong Kong, China. pp. 1193-1205. ISBN 978-1-4503-5198-0. ⟨10.1145/3196709.3196712⟩. ⟨hal-01809299⟩
