Designing real-time, continuous emotion annotation techniques for 360° VR videos

Tong Xue, Surjya Ghosh, Gangyi Ding, Abdallah El Ali, Pablo Cesar

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › Peer-reviewed

Abstract

With the increasing availability of head-mounted displays (HMDs) that show immersive 360° VR content, it is important to understand to what extent these immersive experiences can evoke emotions. Typically, to collect emotion ground-truth labels, users rate videos through post-experience self-reports that are discrete in nature. However, post-stimulus self-reports are temporally imprecise, especially after watching 360° videos. In this work, we design six continuous emotion annotation techniques for the Oculus Rift HMD, aimed at minimizing workload and distraction. Based on a co-design session with six experts, we contribute HaloLight and DotSize, two continuous annotation methods deemed unobtrusive and easy to understand. We discuss the upcoming challenges in evaluating the usability of these techniques and the reliability of continuous annotations.

Original language: English
Title of host publication: CHI EA 2020 - Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 10
ISBN (Electronic): 9781450368193
DOI: 10.1145/3334480.3382895
Publication status: Published - 2020
Event: 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI EA 2020 - Honolulu, United States
Duration: 25 Apr 2020 - 30 Apr 2020

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings

Conference

Conference: 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI EA 2020
Country: United States
City: Honolulu
Period: 25/04/20 - 30/04/20
Other: Virtual/online event due to COVID-19

Keywords

  • 360 video
  • Continuous
  • Emotion annotation
  • Visualization


Cite this

Xue, T., Ghosh, S., Ding, G., El Ali, A., & Cesar, P. (2020). Designing real-time, continuous emotion annotation techniques for 360° VR videos. In CHI EA 2020 - Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems [3382895] (Conference on Human Factors in Computing Systems - Proceedings). Association for Computing Machinery (ACM). https://doi.org/10.1145/3334480.3382895