Journal article in Sensors, 2022

Improving Haptic Response for Contextual Human Robot Interaction

Abstract

In haptic applications, a user in a virtual environment interacts with physical proxies attached to a robot. The device must reach the exact location defined in the virtual environment in time; however, due to device limitations, delays are unavoidable. One way to improve device response is to infer the user's intended motion and move the robot toward the desired goal as early as possible. This paper presents an experimental study to improve prediction time and reduce the time the robot needs to reach the desired position. We developed motion strategies based on hand motion and eye-gaze direction to determine the point of user interaction in a virtual environment. To assess the performance of these strategies, we conducted a subject-based experiment using an exergame with reach-and-grab tasks designed for upper-limb rehabilitation training. Experimental results revealed that eye-gaze-based prediction significantly improved detection time, by 37%, and the robot's time to reach the target, by 27%. Further analysis provides insight into the effect of the eye-gaze window and the hand-motion threshold on device response for the experimental task.
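The abstract describes predicting the user's intended target from eye-gaze direction and hand motion, governed by an eye-gaze window and a hand-motion threshold. The sketch below illustrates one plausible form such a predictor could take; it is not the authors' implementation, and all names (`predict_target`, `gaze_window`, `hand_threshold`, `agreement`) are hypothetical.

```python
from collections import deque

def predict_target(gaze_hits, hand_speed, num_targets,
                   gaze_window=10, hand_threshold=0.05, agreement=0.8):
    """Guess which target the user intends to reach, or None if unsure.

    gaze_hits: sequence of target indices the gaze ray has hit, oldest first.
    hand_speed: current hand speed in m/s (hypothetical units).
    """
    # Only commit once the hand is actually moving: gaze alone is too noisy.
    if hand_speed < hand_threshold:
        return None
    # Consider only the most recent fixations within the gaze window.
    recent = list(deque(gaze_hits, maxlen=gaze_window))
    if len(recent) < gaze_window:
        return None
    # Commit early if one target dominates the window.
    for t in range(num_targets):
        if recent.count(t) / gaze_window >= agreement:
            return t
    return None
```

With this scheme, a larger `gaze_window` delays commitment but reduces false predictions, while a lower `hand_threshold` triggers the robot earlier at the cost of reacting to spurious hand motion, which is consistent with the trade-off the abstract alludes to.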
Main file: main.pdf (6.3 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03599147, version 1 (07-03-2022)

Identifiers

  • HAL Id: hal-03599147, version 1

Cite

Stanley Mugisha, Vamsi Krishna Guda, Christine Chevallereau, Matteo Zoppi, Rezia Molfino, et al. Improving Haptic Response for Contextual Human Robot Interaction. Sensors, 2022, 22. ⟨hal-03599147⟩
