Reducing Simulator Sickness with Perceptual Camera Control
Abstract
Virtual reality provides an immersive environment but can induce cybersickness
due to the discrepancy between visual and vestibular cues. To avoid
this problem, the movement of the virtual camera needs to match the motion
of the user in the real world. Unfortunately, this is usually difficult due to
the mismatch between the size of the virtual environments and the space
available to the users in the physical domain. The resulting constraints on
the camera movement significantly hamper the adoption of virtual-reality
headsets in many scenarios and make the design of the virtual environments
very challenging. In this work, we study how the characteristics of the virtual
camera movement (e.g., translational acceleration and rotational velocity)
and the composition of the virtual environment (e.g., scene depth) contribute
to perceived discomfort. Based on the results from our user experiments, we
devise a computational model for predicting the magnitude of the discomfort
for a given scene and camera trajectory. We further apply our model in a
new path planning method that optimizes the input motion trajectory to
reduce predicted sickness. We evaluate the effectiveness of our method in
improving perceptual comfort in a series of user studies targeting different
applications. The results indicate that our method can reduce the perceived
discomfort while maintaining the fidelity of the original navigation, and
perform better than simpler alternatives.
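To make the idea of a trajectory-level discomfort predictor concrete, the sketch below scores a camera path from the cues the abstract names: translational acceleration, rotational velocity, and scene depth. The functional form, the weights, and the inverse-depth term are illustrative assumptions, not the model developed in this work.

```python
import numpy as np

# Hypothetical discomfort predictor. The weights (w_acc, w_rot, w_depth)
# and the additive form are assumptions for illustration only.
def discomfort_score(positions, orientations, depths, dt,
                     w_acc=1.0, w_rot=1.0, w_depth=0.5):
    """Score a sampled camera trajectory; higher means more predicted discomfort.

    positions:    (N, 3) camera positions over time
    orientations: (N,) yaw angles in radians (rotation simplified to one axis)
    depths:       (N,) average visible scene depth at each sample
    dt:           time step between samples, in seconds
    """
    vel = np.gradient(positions, dt, axis=0)   # translational velocity
    acc = np.gradient(vel, dt, axis=0)         # translational acceleration
    rot_vel = np.gradient(orientations, dt)    # rotational velocity
    # Nearby geometry produces stronger optic flow, so penalize small depth.
    per_sample = (w_acc * np.linalg.norm(acc, axis=1)
                  + w_rot * np.abs(rot_vel)
                  + w_depth / np.maximum(depths, 1e-3))
    return per_sample.sum() * dt  # integrate over the trajectory
```

A path planner in this spirit would treat the score as a cost term and trade it off against fidelity to the input trajectory, so that a smoother path through deeper parts of the scene scores lower than a jerky one near geometry.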
Origin: Files produced by the author(s)