Scientific article

MultiModal Emotion Recognition in Response to Videos

Published in: IEEE Transactions on Affective Computing, vol. 3, no. 2, p. 211-223
Publication date: 2012

This paper presents a user-independent emotion recognition method with the goal of recovering affective tags for videos using electroencephalogram (EEG), pupillary response, and gaze distance. We first selected 20 video clips with extrinsic emotional content from movies and online resources. Then, EEG responses and eye gaze data were recorded from 24 participants while they watched the emotional video clips. Ground truth was defined based on the median arousal and valence scores given to the clips in a preliminary study using an online questionnaire. Based on the participants' responses, three classes were defined for each dimension: the arousal classes were calm, medium aroused, and activated, and the valence classes were unpleasant, neutral, and pleasant. One of the three affective labels of either valence or arousal was determined by classification of bodily responses. Leave-one-participant-out cross validation was employed to investigate classification performance in a user-independent approach. The best classification accuracies of 68.5 percent for the three valence labels and 76.4 percent for the three arousal labels were obtained using a modality fusion strategy and a support vector machine. The results over a population of 24 participants demonstrate that user-independent emotion recognition can outperform individual self-reports for arousal assessments and does not underperform for valence assessments.
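The evaluation protocol described in the abstract — holding out all trials from one participant per fold — can be sketched as follows. This is a minimal illustration with synthetic feature vectors and a nearest-centroid classifier standing in for the paper's support vector machine; the data shapes, class encoding, and helper names are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_participants, trials_per_participant, n_features = 24, 20, 8

# Synthetic "bodily response" features and 3-class labels
# (e.g. arousal: 0 = calm, 1 = medium aroused, 2 = activated).
X = rng.normal(size=(n_participants * trials_per_participant, n_features))
y = rng.integers(0, 3, size=len(X))
groups = np.repeat(np.arange(n_participants), trials_per_participant)
# Shift class means so the synthetic classes are separable.
X += y[:, None] * 1.5

def nearest_centroid_predict(X_train, y_train, X_test):
    """Assign each test trial to the class with the closest training centroid."""
    centroids = np.stack([X_train[y_train == c].mean(axis=0)
                          for c in np.unique(y_train)])
    dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

accuracies = []
for p in range(n_participants):
    test = groups == p      # hold out every trial from participant p
    train = ~test
    y_pred = nearest_centroid_predict(X[train], y[train], X[test])
    accuracies.append((y_pred == y[test]).mean())

print(f"mean leave-one-participant-out accuracy: {np.mean(accuracies):.3f}")
```

Because each fold's test participant contributes no training data, the accuracy estimates how the classifier generalizes to an unseen user, which is the point of the user-independent setting.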

  • BMI
  • Emotion recognition
  • EEG
  • Pupillary reflex
  • Pattern classification
  • Affective computing
Citation (ISO format)
SOLEYMANI, Mohammad, PANTIC, Maja, PUN, Thierry. MultiModal Emotion Recognition in Response to Videos. In: IEEE Transactions on Affective Computing, 2012, vol. 3, no. 2, p. 211–223. doi: 10.1109/T-AFFC.2011.37
Main files (1)
Article (Published version)
ISSN of the journal: 1949-3045

Technical information

Creation: 03/03/2015 4:36:05 PM
First validation: 03/03/2015 4:36:05 PM
Update time: 03/14/2023 10:56:40 PM
Status update: 03/14/2023 10:56:39 PM
Last indexation: 10/18/2023 10:13:12 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva