Proceedings chapter

Impression detection and management using an embodied conversational agent

Presented at Copenhagen (Denmark), 19-24 July 2020
Publication date: 2020

During interactions with Embodied Conversational Agents (ECAs), users form an impression of the ECAs by evaluating their warmth and competence. This impression affects the interaction and can even persist afterwards. Measuring users' reactions to detect these impressions, and adjusting ECAs' non-verbal behaviours accordingly, could lead to a more natural interaction, closer to human-human interactions. Motivated by the state of the art in affect recognition, we investigated three research questions: 1) which users' reactions (facial expressions, eye movements, and physiological signals) reveal most about the impressions formed in the continuous warmth-competence space; 2) whether an adaptive ECA could leave a better impression by maximizing the impression it produces; 3) whether there are differences in impression formation between human-human and human-agent interaction. Firstly, our results showed the value of combining different modalities to detect impressions (best concordance correlation coefficient of 0.681, obtained for warmth), with facial expressions outperforming the other modalities. Secondly, an adaptive ECA was created and tested in an experiment outside the laboratory. Results showed that impression ratings were higher in the conditions where the ECA adapted its behaviour based on the detected impressions than under random behaviour selection. Thirdly, we found similar behaviour during human-human and human-agent interaction: people treated the ECA much as they would a human, spending more time observing the face area when forming an impression.
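The abstract evaluates impression detection with the concordance correlation coefficient (CCC), which penalises both poor correlation and systematic bias between predicted and rated impressions. A minimal NumPy sketch of the standard CCC formula is below; the function name and array inputs are illustrative, not taken from the chapter:

```python
import numpy as np

def ccc(y_true, y_pred):
    """Concordance correlation coefficient between two 1-D score series.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    It equals 1 only for perfect agreement (identical values), unlike
    Pearson correlation, which ignores scale and offset differences.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()   # population variances
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
```

For identical series the coefficient is 1.0; a constant offset between predictions and ratings lowers it even when the Pearson correlation stays at 1, which is why CCC is preferred for continuous affect prediction tasks such as the warmth-competence estimation described here.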

  • Affective computing
  • Impression detection
  • Virtual agent
  • Eye gaze
  • Impression management
  • Machine learning
  • Reinforcement learning
Funding: Swiss National Science Foundation - 2000221E-164326.
Citation (ISO format)
WANG, Chen et al. Impression detection and management using an embodied conversational agent. In: HCII 2020: human-computer interaction. multimodal and natural interaction. Copenhagen (Denmark). [s.l.] : Springer, 2020. doi: 10.1007/978-3-030-49062-1_18
Main files (1)
Proceedings chapter (Submitted version)

Technical information

Creation: 08/04/2020 10:29:00 AM
First validation: 08/04/2020 10:29:00 AM
Update time: 03/15/2023 10:41:07 PM
Status update: 03/15/2023 10:41:06 PM
Last indexation: 05/05/2024 5:15:31 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva