Journal articles

Audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments

Abstract: A study of audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments is presented. Audio-visual rendering is provided using tracked passive visual stereoscopy and acoustic wave field synthesis (WFS). Distances are estimated using indirect blind-walking (triangulation) under each rendering condition. Experimental results show that distances perceived in the virtual environment are accurately estimated or overestimated for rendered distances closer than the position of the audio-visual rendering system, and underestimated for distances farther away. Interestingly, participants perceived each virtual object at a modality-independent distance, whether using the audio modality, the visual modality, or the combination of both. The results show that WFS is capable of synthesizing perceptually meaningful sound fields in terms of distance. Dynamic audio-visual cues were used by participants when estimating distances in the virtual world. Moving may have provided participants with better visual perception of close distances than if they had been static. No correlation was found between the feeling of presence and the visual distance underestimation. To explain the observed perceptual distance compression, it is proposed that, due to conflicting distance cues, the audio-visual rendering system physically anchors the virtual world to the real world. Virtual objects are thus attracted by the physical audio-visual rendering system.
Complete list of metadata

https://hal.archives-ouvertes.fr/hal-00743233
Contributor: Xavier Boutillon
Submitted on: Tuesday, December 11, 2012 - 11:16:45 AM
Last modification on: Monday, February 10, 2020 - 6:14:04 PM
Document(s) archived on: Saturday, December 17, 2016 - 2:03:54 AM

File

A_V_AV_perception-HAL.pdf
Files produced by the author(s)

Citation

Marc Rébillat, Xavier Boutillon, Étienne Corteel, Brian F. G. Katz. Audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments. ACM Transactions on Applied Perception (Association for Computing Machinery), 9(4), article 19, pp. 1-17, 2012. ⟨10.1145/2355598.2355602⟩. ⟨hal-00743233⟩
