Doctoral thesis
OA Policy
English

An Illumination Registration Model for Dynamic Virtual Humans in Mixed Reality

Number of pages: 151
Imprimatur date: 2006-10-09
Abstract

From the Platonic notion of a world of ideas and a world of the senses to computer graphics simulations across the Mixed Reality continuum, the principal aim has been common: to invent an empirical realism that sustains its inextricably integrated strands of the ideal and the sensual, the virtual and the real, through pragmatic and structuralist narratives. This process has been exemplified in cinematic film narratives, where the composition of real and virtual dynamic elements (such as compositing virtual creatures into real video scenes) has been evolving at a constantly growing pace. Recent developments in computer graphics hardware and algorithms allow for consistent and believable compositions in the interactive equivalent of cinematic compositing, that of Augmented Reality (AR) simulations. For such interactive, real-time AR synergies to occur, the geometric and illumination registration of the virtual elements with the real ones is crucial. The focus of this thesis is the research for such an illumination model, one that allows for consistent compositions in the AR 'world of the senses' and, conversely, in the VR 'ideal' world, specifically for virtual character simulations. The reason for focusing on virtual humans is that research has so far eluded the problem of real-time simulation of dynamic virtual characters in AR scenes. Furthermore, most real-time VR illumination models for virtual humans have been based either on extensions of local illumination or on ad-hoc approaches, and rarely on physically plausible illumination models for such deformable, dynamic and multi-hierarchical geometric meshes. In light of the above narratives involving virtual characters in Mixed Realities, we derive two algorithms that build on our observations regarding virtual human topology and its response to environment illumination, in a quest for a physically plausible model. Our first algorithm enhances the latest developments in low-frequency (diffuse, continuous) Precomputed Radiance Transfer models by applying them to virtual characters in AR. Our second algorithm presents an all-frequency (including glossy, discontinuous lights) virtual character illumination model, inspired by the key-fill cinematographic lighting setup used in live-action movies. We finally present two virtual heritage case studies involving our combined MR illumination model: a) a desktop VR simulation and b) a mobile AR on-site experience, both involving fully simulated virtual characters re-enacting digital narratives via illumination registration with 'natural' captured light.
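
The abstract references low-frequency Precomputed Radiance Transfer (PRT) without detailing its runtime evaluation. As an illustration only, and not code from the thesis, the following minimal sketch (assuming numpy; the function name shade_vertices and its parameters are hypothetical) shows how diffuse PRT shading is commonly evaluated: per-vertex exit radiance is a dot product between precomputed transfer coefficients and the spherical-harmonics-projected environment lighting, such as captured 'natural' light.

# Illustrative sketch only (not from the thesis): runtime evaluation of
# low-frequency (diffuse) Precomputed Radiance Transfer. Per-vertex transfer
# vectors and the environment lighting are assumed to be projected onto the
# same spherical-harmonics (SH) basis.
import numpy as np

N_SH = 9  # 3 SH bands (l = 0..2) are typical for low-frequency diffuse PRT

def shade_vertices(transfer, env_sh):
    """Exit radiance per vertex = dot(transfer vector, environment SH coefficients).

    transfer : (num_vertices, N_SH) precomputed transfer coefficients
               (visibility and cosine terms baked in during precomputation)
    env_sh   : (N_SH, 3) RGB environment lighting projected onto SH,
               e.g. a captured light probe of the real scene
    returns  : (num_vertices, 3) RGB radiance, clamped to non-negative
    """
    radiance = transfer @ env_sh  # one dot product per vertex, per colour channel
    return np.maximum(radiance, 0.0)

# Hypothetical usage with placeholder data:
transfer = np.random.rand(1000, N_SH)  # would come from a precomputation pass
env_sh = np.random.rand(N_SH, 3)       # would come from a captured light probe
colors = shade_vertices(transfer, env_sh)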

Citation (ISO format)
PAPAGIANNAKIS, Georgios. An Illumination Registration Model for Dynamic Virtual Humans in Mixed Reality. Doctoral Thesis, 2006. doi: 10.13097/archive-ouverte/unige:155590
Main files (1)
Thesis
Access level: Public

Technical information

Creation: 19/10/2021 09:43:00
First validation: 19/10/2021 09:43:00
Update time: 16/03/2023 01:34:14
Status update: 16/03/2023 01:34:12
Last indexation: 13/05/2025 18:48:21
All rights reserved by Archive ouverte UNIGE and the University of Geneva