SoundGuides: Adapting Continuous Auditory Feedback to Users

Conference Paper, 2016

Abstract

We introduce SoundGuides, a user-adaptable tool for auditory feedback on movement. The system is based on an interactive machine learning approach in which gestures and sounds are first jointly designed and jointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the particular way each user performs a given gesture. SoundGuides is suitable for designing continuous auditory feedback that guides users' movements and helps them perform a specific movement consistently over time. Applications span from movement-based interaction techniques to auditory-guided rehabilitation. We describe our system and report a study that demonstrates a 'stabilizing effect' of our adaptive auditory feedback method.
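The abstract's two stages — learning a joint gesture–sound mapping from a design demonstration, then adapting it to how a new user performs the gesture — can be illustrated with a deliberately simplified sketch. The paper itself does not specify these models; the ridge-regression mapping, the per-user mean/variance rescaling, and all function names below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def fit_mapping(gesture, sound, reg=1e-3):
    """Learn a linear gesture->sound mapping from one joint demonstration.

    gesture: (T, d) array of movement features over time.
    sound:   (T, k) array of sound-synthesis parameters aligned with it.
    Uses ridge-regularized least squares (an assumed stand-in model).
    """
    X = np.hstack([gesture, np.ones((len(gesture), 1))])  # append bias column
    W = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ sound)
    return W

def adapt_to_user(template, user_demo):
    """Return a normalizer that rescales a new user's gesture features so
    their mean and spread match the original design template, accounting
    for the particular way this user performs the gesture."""
    scale = template.std(0) / (user_demo.std(0) + 1e-9)
    offset = template.mean(0) - user_demo.mean(0) * scale
    return lambda g: g * scale + offset

def feedback(W, normalize, gesture):
    """Continuous feedback: map the user-adapted gesture to sound parameters."""
    g = normalize(gesture)
    X = np.hstack([g, np.ones((len(g), 1))])
    return X @ W
```

In this sketch, a new user performs the gesture once, `adapt_to_user` estimates their personal range, and the learned mapping then drives sound parameters frame by frame from their live movement.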

Dates and versions

hal-01317069 , version 1 (18-05-2016)

Cite

Jules Françoise, Olivier Chapuis, Sylvain Hanneton, Frédéric Bevilacqua. SoundGuides: Adapting Continuous Auditory Feedback to Users. CHI 2016 - Conference Extended Abstracts on Human Factors in Computing Systems, May 2016, San Jose, United States. pp.2829--2836, ⟨10.1145/2851581.2892420⟩. ⟨hal-01317069⟩