Assisted Music Score Reading Using Fixed-Gaze Head Movement: Empirical Experiment and Design Implications

Qinjie Ju 1, 2 René Chalon 2 Stéphane Derrode 1
1 imagine - Extraction de Caractéristiques et Identification
LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information
2 SICAL - Situated Interaction, Collaboration, Adaptation and Learning
LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information
Abstract: Eye-tracking has strong potential as an input modality for human-computer interaction (HCI), particularly in mobile situations, but it lacks convenient methods for triggering actions. In our research, we investigate combining eye-tracking with fixed-gaze head movements, which allows various commands to be triggered without using the hands or changing gaze direction. To this end, we propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted on the front of the head-mounted eye-tracker, in order to save computation time. To evaluate the performance of our fixed-gaze head movement detection algorithm, and the acceptance of triggering commands with these movements when the user's hands are occupied by another task, we designed and developed an experimental application called EyeMusic. EyeMusic is a music reading system that plays the notes of a measure in a music score that the user cannot read. By making a voluntary head movement while keeping his/her gaze fixed on a point in the music score, the user obtains the desired audio feedback. This paper presents the design, development, and usability testing of the first prototype of this application. The experimental results confirm the usability of the application: 85% of participants were able to use all the head movements implemented in the prototype. The average success rate of the application is 70%, which is partly influenced by the performance of the eye-tracker we used. The detection rate of our fixed-gaze head movement algorithm is 85%, with no significant differences in performance between the individual head movements.
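The paper's actual detection algorithm is not reproduced on this record page, so the following is only an illustrative sketch of the general idea the abstract describes: when the gaze stays fixed while the head moves, the scene camera pans, so the global shift between consecutive scene frames indicates the head movement. This sketch (not the authors' implementation) estimates that shift with phase correlation; the `classify_head_move` mapping, its sign conventions, and the threshold are hypothetical.

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the global (dx, dy) translation between two grayscale
    frames via phase correlation: the peak of the inverse FFT of the
    normalized cross-power spectrum sits at the displacement."""
    F1 = np.fft.fft2(prev)
    F2 = np.fft.fft2(curr)
    R = F2 * np.conj(F1)
    R /= np.abs(R) + 1e-12          # normalize to keep phase only
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    # Wrap large positive indices to negative shifts (circular FFT).
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dx), int(dy)

def classify_head_move(dx, dy, thresh=4):
    """Hypothetical mapping from scene shift to head movement.
    The scene content moves opposite to the head: a rightward head
    turn shifts the scene image leftward (dx < 0)."""
    if abs(dx) < thresh and abs(dy) < thresh:
        return "still"
    if abs(dx) >= abs(dy):
        return "head_right" if dx < 0 else "head_left"
    return "head_down" if dy < 0 else "head_up"
```

In a real pipeline, consecutive scene-camera frames would be converted to grayscale and fed to `estimate_shift`; working on the whole image in the frequency domain avoids per-feature tracking, which is in the spirit of the computation-saving goal stated in the abstract, though the paper's own method may differ.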
Document type: Journal articles
Contributor: René Chalon
Submitted on: Tuesday, April 30, 2019 - 2:40:28 PM
Last modification on: Tuesday, June 1, 2021 - 2:08:10 PM

Qinjie Ju, René Chalon, Stéphane Derrode. Assisted Music Score Reading Using Fixed-Gaze Head Movement: Empirical Experiment and Design Implications. Proceedings of the ACM on Human-Computer Interaction, Association for Computing Machinery (ACM), 2019, 3 (EICS), Article 3, pp. 1-29. ⟨10.1145/3300962⟩. ⟨hal-02115757⟩