Abstract
We present the collection and annotation of a multimodal database of negative human-human interactions. The work supports behavior recognition in the context of a virtual reality aggression prevention training system. The data consist of dyadic interactions between professional aggression training actors (actors) and naive participants (students). In addition to audio and video, we recorded motion capture data with Kinect, head tracking, and physiological data: heart rate (ECG), galvanic skin response (GSR) and electromyography (EMG) of the biceps, triceps and trapezius muscles. Aggression level, fear, valence, arousal and dominance were rated separately for actors and students. We observe higher inter-rater agreement for ratings of the actors than of the students, consistently across every annotated dimension, and higher inter-rater agreement for speaking behavior than for listening behavior. The data can be used, among other things, for research on affect recognition, multimodal fusion and the relation between different bodily manifestations.
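The abstract compares inter-rater agreement across annotator targets (actors vs. students) and dimensions (aggression level, fear, valence, arousal, dominance). The paper does not specify the agreement statistic in this abstract; as a minimal illustration of how such agreement could be quantified for two raters with nominal labels, here is a small self-contained sketch of Cohen's kappa. The rating data shown are hypothetical, not from the database.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items (nominal labels)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[lab] * freq_b.get(lab, 0) for lab in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical aggression-level ratings from two annotators:
a = ["low", "mid", "high", "mid", "low", "high"]
b = ["low", "mid", "high", "low", "low", "high"]
print(round(cohens_kappa(a, b), 3))  # → 0.75
```

For more than two raters, or for ordinal/interval dimensions such as valence and arousal, a statistic like Krippendorff's alpha or an intraclass correlation coefficient would be the usual choice.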
Original language | English |
---|---|
Title of host publication | Proceedings of International Conference on Affective Computing and Intelligent Interaction (ACII2017) |
Place of Publication | Piscataway, NJ |
Publisher | IEEE |
Pages | 21-27 |
Number of pages | 7 |
ISBN (Electronic) | 978-1-5386-0563-9 |
Publication status | Published - 2017 |
Event | ACIIW 2017: 7th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, San Antonio, TX, United States. Duration: 23 Oct 2017 → 26 Oct 2017. Conference number: 7 |
Conference
Conference | ACIIW 2017 |
---|---|
Country/Territory | United States |
City | San Antonio, TX |
Period | 23/10/17 → 26/10/17 |
Keywords
- Training
- Sensors
- Databases
- Forensics
- Magnetic heads
- Electromyography
- Tracking