Thesis of Qinjie Ju
Subject:
Start date: 01/02/2016
Defense date: 09/04/2019
Advisor: Stéphane Derrode
Co-advisor: René Chalon
Summary:
Eye-tracking has strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations. In this thesis, we concentrate on demonstrating this potential by highlighting scenarios in which eye-tracking offers clear advantages over other interaction modalities. During our research, we found that this technology lacks convenient action-triggering methods, which can degrade the performance of gaze-based interaction. We therefore investigate the combination of eye-tracking and fixed-gaze head movements, which allows various commands to be triggered without using the hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted on the front of the head-mounted eye-tracker, in order to save computation time.

To test the performance of our fixed-gaze head movement detection algorithm, and the acceptance of triggering commands with these movements when the user's hands are occupied by another task, we ran experiments with the EyeMusic application that we designed and developed. EyeMusic is a music reading system that can play the notes of a measure in a music score that the user does not understand. By making a voluntary head movement while fixing his/her gaze on the same point of the score, the user obtains the desired audio feedback. The design, development and usability testing of the first prototype of this application are presented in this thesis. The experimental results confirm the usability of EyeMusic: 85% of participants were able to use all the head movements implemented in the prototype. The application's average success rate is 70%, which is partly limited by the performance of the eye-tracker we used. Our fixed-gaze head movement detection algorithm itself reaches 85% accuracy, with no significant differences between the individual head movements.

Beyond EyeMusic, we also designed two other scenarios, named EyeRecipe and EyePay, to reinforce the potential of eye-tracking in HCI when coupled with fixed-gaze head movements; the details of both applications are also presented in this thesis.
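The thesis detects fixed-gaze head movements from the scene camera alone: when the gaze stays on one point while the head moves, the whole scene image shifts in a consistent direction. The sketch below illustrates that general idea only; the function name, thresholds, and direction labels are illustrative assumptions, not the thesis's actual algorithm.

```python
# Hypothetical sketch of classifying a fixed-gaze head movement from the
# global motion of scene-camera frames. Input: per-frame (dx, dy) image
# displacements, e.g. estimated by optical flow between consecutive frames.
# All names and thresholds here are illustrative, not from the thesis.

def classify_head_movement(flow_vectors, threshold=2.0):
    """Return a coarse head-movement label from a list of (dx, dy) shifts."""
    if not flow_vectors:
        return "none"
    # Average the image displacement over the frame sequence.
    mean_dx = sum(dx for dx, _ in flow_vectors) / len(flow_vectors)
    mean_dy = sum(dy for _, dy in flow_vectors) / len(flow_vectors)
    # Small global motion: the head is steady, no command triggered.
    if abs(mean_dx) < threshold and abs(mean_dy) < threshold:
        return "none"
    # The scene image moves opposite to the head: a leftward head turn
    # shifts the image to the right (positive dx in this convention).
    if abs(mean_dx) >= abs(mean_dy):
        return "head_left" if mean_dx > 0 else "head_right"
    return "head_up" if mean_dy > 0 else "head_down"
```

In a real pipeline the displacements would come from frame-to-frame motion estimation on the scene video, and the classifier would only be consulted while the eye-tracker reports a stable fixation.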
Jury:
Christophe Kolski | Professor | Université Polytechnique Hauts-de-France | Reviewer |
Jean Vanderdonckt | Professor | Université Catholique de Louvain | Reviewer |
Patrick Girard | Professor | Université de Poitiers | Examiner |
Francis Jambon | Associate Professor | Université Grenoble Alpes | Examiner |
Christine Michel | Associate Professor | INSA de Lyon | Examiner |
René Chalon | Associate Professor | École Centrale de Lyon | Co-advisor |
Stéphane Derrode | Professor | École Centrale de Lyon | Thesis advisor |