Doctoral thesis
English

Real-time Animation of Interactive Virtual Humans

Contributors: Egges, Jan
Number of pages: 134
Imprimatur date: 2006-10-10
Abstract

In recent years, there has been considerable interest in the area of Interactive Virtual Humans (IVHs). Virtual characters that interact naturally with users in mixed realities have many applications, such as interactive video games, virtual training and rehabilitation, or virtual heritage. The main purpose of using interactive virtual humans in such applications is to increase the realism of the environment by adding life-like characters. How convincing these characters are depends greatly on the means by which they perceive their environment and express themselves.

The work presented in this thesis aims to improve the expressive capabilities of virtual characters, notably the animation of IVHs. Because of the complexity of interaction, a high level of control is required over the face and body motions of the virtual humans. To achieve this, current approaches try to generate face and body motions from a high-level description. Although this indeed allows precise control over the movement of the virtual human, it is difficult to generate a natural-looking motion from such a high-level description. Another problem that arises when animating IVHs is that motions are not generated all the time; a flexible animation scheme is therefore required that ensures a natural posture even when no animation is playing. Finally, because an Interactive Virtual Human consists of many different components, the animation model should be as efficient as possible.

In this thesis, we present a new animation model based on a combination of motion synthesis from motion capture and a statistical analysis of prerecorded motion clips. As opposed to existing approaches that create new motions with limited flexibility, our model adapts existing motions by automatically adding dependent joint motions.
This renders the animation more natural. Moreover, since our model does not impose any conditions on the input motion, it can be linked easily with existing gesture synthesis techniques for IVHs. To ensure a continuous, realistic motion, a basic layer of motions, called idle motions, is always present. These motions are generated by sequencing prerecorded motion segments organised in a graph. The path followed through this graph is controlled by high-level constraints, such as an emotional state. On top of that, small variations in posture are added so that the character is never static. Because we use a linear representation for joint orientations, blending and interpolation are performed very efficiently, resulting in an animation engine especially suitable for real-time applications.
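The abstract's final point — that a linear (vector-space) representation of joint orientations reduces blending and interpolation to plain arithmetic — can be sketched as follows. This is an illustrative sketch only, not the thesis implementation: the joint names, the example poses, and the use of exponential-map-style 3-vectors for orientations are all assumptions introduced for the example.

```python
import numpy as np

def blend_poses(pose_a, pose_b, w):
    """Linearly interpolate two poses with blend weight w in [0, 1].

    Each pose maps a joint name to a 3-vector encoding its orientation
    in a linear representation (e.g. an exponential-map rotation vector),
    so blending is a simple per-joint weighted sum -- no quaternion
    renormalisation or slerp is needed at this stage.
    """
    return {joint: (1.0 - w) * pose_a[joint] + w * pose_b[joint]
            for joint in pose_a}

# Hypothetical example poses: an idle posture and a gesture keyframe.
idle = {"l_shoulder": np.array([0.0, 0.0, 0.0]),
        "l_elbow":    np.array([0.4, 0.0, 0.0])}
gesture = {"l_shoulder": np.array([0.2, 0.1, 0.0]),
           "l_elbow":    np.array([1.2, 0.0, 0.3])}

# Halfway blend between the idle layer and the gesture.
half = blend_poses(idle, gesture, 0.5)
```

Because every blend is a constant-time weighted sum per joint, layering idle motions, posture variations, and gestures stays cheap enough for real-time use; converting the blended vectors back to rotation matrices or quaternions can be deferred to the final skinning step.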

Citation (ISO format)
EGGES, Jan. Real-time Animation of Interactive Virtual Humans. Doctoral Thesis, 2006. doi: 10.13097/archive-ouverte/unige:155421
Main files (1)
Thesis
Access level: Public (CC BY)

Technical informations

Creation: 15/10/2021 09:54:00
First validation: 15/10/2021 09:54:00
Update time: 16/03/2023 01:30:58
Status update: 16/03/2023 01:30:57
Last indexation: 13/05/2025 18:47:48
All rights reserved by Archive ouverte UNIGE and the University of Geneva