Thesis of Charles Javerliat


Subject:
Intangible Cultural Heritage Preservation Through Multisensory Virtual Reality: Contributions to 3D Markerless Motion Capture, Olfactory Restitution, and User Experience Evaluation

Start date: 01/09/2022
End date (estimated): 01/09/2025

Advisor: Guillaume Lavoué
Coadvisor: Pierre Raimbaud

Summary:

Intangible Cultural Heritage (ICH) encompasses living practices, traditional craftsmanship, and embodied knowledge that represent a critical yet vulnerable dimension of human cultural patrimony. Unlike tangible heritage, which benefits from well-established digitization technologies, ICH remains underserved by existing preservation methods. Its ephemeral, embodied, and multisensory nature resists conventional documentation approaches, creating an urgent need for technological frameworks capable of capturing and restituting the full experiential richness of cultural practices.

This thesis addresses the challenge of preserving ICH through the development of an integrated pipeline for creating multisensory digital twins, that is, digital representations that extend beyond traditional 2D audiovisual dimensions to encompass the complete sensory spectrum of cultural practices and restitute them in multisensory virtual reality. Using the Guédelon Castle experimental medieval construction site as a living laboratory, where artisans employ thirteenth-century techniques in an outdoor, uncontrolled environment, we confront the practical challenges of documenting traditional craftsmanship under authentic working conditions.

The research advances three complementary technological contributions spanning the main stages of the preservation pipeline: capture, restitution, and evaluation. For capture, we introduce Kineo, a calibration-free markerless motion capture system that reconstructs high-fidelity 3D human movement from sparse RGB cameras without requiring complex setups, dedicated hardware, or tedious and error-prone calibration procedures. For restitution, we present Nebula, an open-source portable olfactory display that delivers precisely controlled scent stimulation in six-degrees-of-freedom virtual reality experiences, compatible with both PC-based and standalone headsets. For evaluation, we propose PLUME, a comprehensive software framework for recording, replaying, analyzing, and sharing multimodal behavioral and physiological data from immersive experiments.

Beyond immediate heritage applications, this work establishes methodological foundations and open-source tools that lower technical barriers for researchers across disciplines, supporting advances in human motion analysis, multisensory virtual reality, and user experience evaluation.

To promote reproducibility and facilitate further research, all developments presented in this thesis are made available as open source, including software implementations and hardware designs.