Using relative head and hand-target features to predict intention in 3D moving-target selection - Archive ouverte HAL
Conference paper, 2014


Abstract

Selection of moving targets is a common yet complex task in human-computer interaction (HCI) and virtual reality (VR). Predicting user intention may help address the challenges inherent in interaction techniques for moving-target selection. This article extends previous models by integrating relative head-target and hand-target features to predict intended moving targets. The features are calculated in a time window ending at roughly two-thirds of the total target-selection time and evaluated using decision trees. With two targets, this model predicts user choice with up to ~72% accuracy on general moving-target selection tasks, and up to ~78% when task-related target properties are also included.
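
The sketch below is not the paper's code; it only illustrates, under stated assumptions, the kind of pipeline the abstract describes: relative head-target and hand-target features computed over a window ending at roughly two-thirds of the selection time, then fed to a decision tree classifier. The specific feature set, the windowing helper, and the synthetic two-target trials are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (not the authors' implementation) of predicting the intended
# target from relative head-target and hand-target features with a decision tree.
# Feature choices, window fraction usage, and synthetic data are assumptions.

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score


def relative_features(head, hand, target):
    """Distance and approach-direction features between the user and one target.

    head, hand, target: (T, 3) arrays of positions over the time window.
    Returns a fixed-length feature vector (illustrative choice of features).
    """
    hand_vec = target - hand                      # hand-to-target vectors
    head_vec = target - head                      # head-to-target vectors
    hand_dist = np.linalg.norm(hand_vec, axis=1)
    head_dist = np.linalg.norm(head_vec, axis=1)

    # Cosine of the angle between hand motion and the hand-to-target direction.
    hand_motion = np.diff(hand, axis=0)
    motion_dir = hand_motion / (np.linalg.norm(hand_motion, axis=1, keepdims=True) + 1e-9)
    to_target = hand_vec[1:] / (np.linalg.norm(hand_vec[1:], axis=1, keepdims=True) + 1e-9)
    cos_angle = np.sum(motion_dir * to_target, axis=1)

    return np.array([
        hand_dist[-1], hand_dist.mean(),          # hand closeness at window end / on average
        head_dist[-1], head_dist.mean(),          # head closeness (proxy for gaze)
        cos_angle[-1], cos_angle.mean(),          # how directly the hand heads for the target
    ])


def trial_features(head, hand, targets, window_fraction=2 / 3):
    """Concatenate per-target features over a window ending at ~2/3 of the trial."""
    end = max(2, int(len(hand) * window_fraction))
    return np.concatenate([
        relative_features(head[:end], hand[:end], t[:end]) for t in targets
    ])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = [], []
    for _ in range(200):                          # synthetic two-target trials
        T = 60
        t = np.linspace(0, 1, T)[:, None]
        targets = [rng.normal(size=3) + t * rng.normal(size=3) for _ in range(2)]
        chosen = int(rng.integers(2))
        head = 0.2 * rng.normal(size=(T, 3))
        hand = t * targets[chosen] + 0.1 * rng.normal(size=(T, 3))  # hand drifts toward chosen target
        X.append(trial_features(head, hand, targets))
        y.append(chosen)

    clf = DecisionTreeClassifier(max_depth=4, random_state=0)
    print("CV accuracy:", cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())
```

On synthetic data like this the tree separates the two targets easily; the ~72% and ~78% figures reported in the abstract refer to the paper's own experiments, not to this sketch.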
Main file: LE2I_VR_2014_CASALLAS.pdf (3.01 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01133927, version 1 (20-03-2015)

Identifiers

Cite

Juan Sebastian Casallas, James H. Oliver, Jonathan W. Kelly, Frédéric Merienne, Samir Garbaya. Using relative head and hand-target features to predict intention in 3D moving-target selection. IEEE Virtual Reality, Mar 2014, Minneapolis, Minnesota, United States. pp.51-56, ⟨10.1109/VR.2014.6802050⟩. ⟨hal-01133927⟩
