SoundGuides: Adapting Continuous Auditory Feedback to Users

Abstract: We introduce SoundGuides, a user-adaptable tool for auditory feedback on movement. The system is based on an interactive machine learning approach, in which gestures and sounds are first jointly designed and jointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the particular way each user performs a given gesture. SoundGuides is suitable for designing continuous auditory feedback aimed at guiding users' movements and helping them perform a specific movement consistently over time. Applications span from movement-based interaction techniques to auditory-guided rehabilitation. We describe our system and report a study demonstrating a 'stabilizing effect' of our adaptive auditory feedback method.
Document type: Conference papers

Cited literature: 23 references


https://hal.archives-ouvertes.fr/hal-01317069
Contributor: Olivier Chapuis
Submitted on: Wednesday, May 18, 2016 - 3:29:27 PM
Last modification on: Tuesday, October 10, 2017 - 11:28:02 AM
Document(s) archived on: Thursday, November 17, 2016 - 10:45:04 AM

Citation

Jules Françoise, Olivier Chapuis, Sylvain Hanneton, Frédéric Bevilacqua. SoundGuides: Adapting Continuous Auditory Feedback to Users. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '16), May 2016, San Jose, United States. ACM, pp. 2829–2836. DOI: 10.1145/2851581.2892420. HAL: hal-01317069.
