WHY and HOW to study multimodal interaction in cockpit design
Conference paper, Year: 2016


Jérôme Barbé
  • Role: Author
  • PersonId: 994548
Laurent Spaggiari
  • Role: Author
  • PersonId: 994549
Patxi Berard
  • Role: Author
  • PersonId: 994550
Azdine Aissani
  • Role: Author
  • PersonId: 994551

Abstract

Technological evolution opens up new ways of interacting and broadens the design space for the use of multimodality in the future cockpit. Adaptive multimodality is expected to provide more natural and intuitive interactions and to increase pilots' physical and cognitive performance within a given context. However, this augmented set of input/output modalities, and their combinations, greatly increases the number of possible design solutions to be considered. The design consequently becomes more complex and requires more iterative loops to conceive, develop, and validate interaction concepts. There is therefore a strong need (1) to define a method that marks out the boundaries of the design space by giving recommendations and guidelines, and (2) to develop a multi-agent platform to iterate quickly on potentially interesting multimodal design solutions that could enhance human performance in the future cockpit. This paper presents a preliminary approach to addressing multimodal interaction in cockpit design.
No file deposited

Dates and versions

hal-01403892, version 1 (28-11-2016)

Identifiers

  • HAL Id: hal-01403892, version 1

Cite

Jérôme Barbé, Laurent Spaggiari, Régis Mollard, Alexis Clay, Patxi Berard, et al.. WHY and HOW to study multimodal interaction in cockpit design. Ergo'IA, Jul 2016, Bidart, France. ⟨hal-01403892⟩
