Thesis of Samuel Berlemont

Instrumented gesture classification using sensor fusion


The main subject of this thesis is smartphone-based 3D gesture recognition. Existing methods already provide gesture recognition services in a “closed world”, with a limited number of predefined gestures and specific training for each user. However, they do not generalize to multiple users: the required memory and computation times are incompatible with the limitations of a mobile terminal and with real-time constraints.

This work aims to provide users with generic models of commonly used gestures carrying strong semantic value, while preserving their ability to personalize the system with their own gesture vocabulary.

Thus, on the one hand, we will study multimodal sensor fusion (accelerometer, gyroscope, and magnetometer) to characterize gestures more accurately. On the other hand, we will work on new machine learning models, in order to improve generalization and discrimination during the recognition process.
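As an illustration of one simple fusion strategy (early fusion by concatenation, which is only a sketch and not necessarily the approach developed in this thesis), synchronized windows from the three sensors could be combined into a single feature vector; the function name and window sizes below are hypothetical:

```python
import numpy as np

def fuse_sensor_window(acc, gyr, mag):
    """Early fusion: concatenate synchronized 3-axis sensor windows
    (accelerometer, gyroscope, magnetometer) into one feature vector.
    Each modality is scaled to unit energy so that no single sensor
    dominates the representation. This is an illustrative sketch only.
    """
    features = []
    for window in (acc, gyr, mag):
        w = np.asarray(window, dtype=float).ravel()
        norm = np.linalg.norm(w)
        # avoid dividing by zero for an all-zero window
        features.append(w / norm if norm > 0 else w)
    return np.concatenate(features)

# Hypothetical example: a 4-sample window per sensor, 3 axes each
rng = np.random.default_rng(0)
acc = rng.normal(size=(4, 3))
gyr = rng.normal(size=(4, 3))
mag = rng.normal(size=(4, 3))
vec = fuse_sensor_window(acc, gyr, mag)
print(vec.shape)  # (36,): 3 modalities x 4 samples x 3 axes
```

The resulting vector could then feed a gesture classifier; per-modality normalization is one common way to handle the different physical units of the three sensors.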

Advisor: Christophe Garcia
Co-advisor: Stefan Duffner

Defense date: Thursday, February 11, 2016