A new step to optimize sound localization adaptation through the use of vision - Archive ouverte HAL
Conference paper, 2020

A new step to optimize sound localization adaptation through the use of vision

Abstract

Following a previous experiment, this paper presents our ongoing effort to improve a rapid multisensory training method for adaptation to non-individual HRTFs. We first present the motivations for modifying our earlier audio-visuo-proprioceptive training. The paper then focuses on two aspects: the inclusion of a non-individual HRTF pre-selection phase, and the design of visual cues that are consistent with the auditory cues. We detail the design of an experiment planned to investigate the effect of vision on adaptation to non-individual HRTFs, and conclude with expected results, future research directions, and a new context of use: the rehabilitation of patients suffering from spatial cognition disorders.
No file deposited

Dates and versions

hal-02922711, version 1 (26-08-2020)

Identifiers

  • HAL Id: hal-02922711, version 1

Cite

Tristan-Gaël Bara, Alma Guilbert, Tifanie Bouchara. A new step to optimize sound localization adaptation through the use of vision. AES International Conference on Audio for Virtual and Augmented Reality, Aug 2020, Seattle, United States. ⟨hal-02922711⟩