MUMBAI: multi-person, multimodal board game affect and interaction analysis dataset

Metehan Doyran, Arjan Schimmel, Pınar Baki, Kübra Ergin, Batıkan Türkmen, Almıla Akdağ Salah, Sander C.J. Bakkes, Heysem Kaya, Ronald Poppe, Albert Ali Salah

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Board games are fertile grounds for the display of social signals, and they provide insights into psychological indicators in multi-person interactions. In this work, we introduce a new dataset collected from four-player board game sessions, recorded via multiple cameras, and containing over 46 hours of visual material. The new MUMBAI dataset is extensively annotated with emotional moments for all game sessions. Additional data comes from personality and game experience questionnaires. Our four-person setup allows the investigation of non-verbal interactions beyond dyadic settings. We present three benchmarks for expression detection and emotion classification and discuss potential research questions for the analysis of social interactions and group dynamics during board games.

Original language: English
Number of pages: 19
Journal: Journal on Multimodal User Interfaces
DOIs
Publication status: Published - 2021

Keywords

  • Affective computing
  • Board games
  • Facial expression analysis
  • Game experience
  • Group dynamics
  • Multimodal interaction
  • Social interactions

