CorrNet: Fine-grained emotion recognition for video watching using wearable physiological sensors

Tianyi Zhang*, Abdallah El Ali, Chen Wang, Alan Hanjalic, Pablo Cesar

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

23 Citations (Scopus)
132 Downloads (Pure)

Abstract

Recognizing user emotions while they watch short-form videos anytime and anywhere is essential for facilitating video content customization and personalization. However, most works either classify a single emotion per video stimulus or are restricted to static, desktop environments. To address this, we propose a correlation-based emotion recognition algorithm (CorrNet) to recognize the valence and arousal (V-A) of each instance (fine-grained segment of signals) using only wearable, physiological signals (e.g., electrodermal activity, heart rate). CorrNet takes advantage of features both inside each instance (intra-modality features) and between different instances for the same video stimulus (correlation-based features). We first test our approach on an indoor-desktop affect dataset (CASE), and thereafter on an outdoor-mobile affect dataset (MERCA) which we collected using a smart wristband and wearable eyetracker. Results show that for subject-independent binary classification (high-low), CorrNet yields promising recognition accuracies: 76.37% and 74.03% for V-A on CASE, and 70.29% and 68.15% for V-A on MERCA. Our findings show: (1) instance segment lengths between 1–4 s result in the highest recognition accuracies; (2) accuracies between laboratory-grade and wearable sensors are comparable, even under low sampling rates (≤64 Hz); and (3) large amounts of neutral V-A labels, an artifact of continuous affect annotation, result in varied recognition performance.
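The abstract describes CorrNet only at a high level. As a rough, hedged illustration of the instance-based idea, the Python sketch below segments a synthetic wearable signal into short fixed-length instances and derives both per-instance statistics and cross-instance Pearson correlations for the same stimulus. The 2 s window, the 64 Hz sampling rate, and the chosen statistics are assumptions for illustration only; they are stand-ins and not the authors' actual CorrNet features or architecture.

```python
import numpy as np


def segment_instances(signal, fs=64, win_s=2.0):
    """Split a 1-D physiological signal into fixed-length instances.

    signal: 1-D array (e.g., EDA sampled at `fs` Hz).
    Returns an array of shape (n_instances, win_samples).
    """
    win = int(fs * win_s)
    n = len(signal) // win
    return signal[: n * win].reshape(n, win)


def intra_features(instances):
    """Simple per-instance statistics (stand-ins for learned intra-modality features)."""
    mean = instances.mean(axis=1)
    std = instances.std(axis=1)
    slope = instances[:, -1] - instances[:, 0]
    return np.stack([mean, std, slope], axis=1)


def correlation_features(instances):
    """Pearson correlation of each instance with every other instance of the
    same video stimulus (a stand-in for correlation-based features)."""
    corr = np.corrcoef(instances)   # shape: (n_instances, n_instances)
    np.fill_diagonal(corr, 0.0)     # ignore self-correlation
    return corr


if __name__ == "__main__":
    fs = 64                                    # wearable-grade sampling rate (Hz), assumed
    eda = np.cumsum(np.random.randn(fs * 60))  # 60 s of synthetic EDA-like drift
    inst = segment_instances(eda, fs=fs, win_s=2.0)
    X = np.hstack([intra_features(inst), correlation_features(inst)])
    print(X.shape)  # one feature row per 2 s instance
```

In such a setup, each 2 s instance would receive its own binary high/low valence and arousal label, which is what distinguishes fine-grained recognition from assigning a single label to the whole video.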

Original language: English
Article number: 52
Pages (from-to): 1-25
Number of pages: 25
Journal: Sensors (Switzerland)
Volume: 21
Issue number: 1
DOIs
Publication status: Published - 2020

Keywords

  • Emotion recognition
  • Machine learning
  • Physiological signals
  • Video
