TY - JOUR
T1 - Towards an Engagement-Aware Attentive Artificial Listener for Multi-Party Interactions
AU - Oertel, Catharine
AU - Jonell, Patrik
AU - Kontogiorgos, Dimosthenis
AU - Funes Mora, Kenneth
AU - Odobez, Jean-Marc
AU - Gustafson, Joakim
PY - 2021
AB - Listening to one another is essential to human-human interaction. In fact, we humans spend a substantial part of our day listening to other people, in private as well as in work settings. Attentive listening serves to gather information for oneself, but at the same time it signals to the speaker that they are being heard. To deduce whether our interlocutor is listening to us, we rely on reading their non-verbal cues, very much as we also use non-verbal cues to signal our own attention. Such signaling becomes more complex when we move from dyadic to multi-party interactions. Understanding how humans use non-verbal cues in a multi-party listening context not only increases our understanding of human-human communication but also aids the development of successful human-robot interactions. This paper brings together previous analyses of listener behavior in human-human multi-party interaction and provides novel insights into gaze patterns between the listeners in particular. We investigate whether the gaze patterns and feedback behavior observed in human-human dialogue are also beneficial for the perception of a robot in multi-party human-robot interaction. To answer this question, we implement an attentive listening system that generates multi-modal listening behavior based on our human-human analysis. We compare our system to a baseline system that does not differentiate between listener types in its behavior generation, and we evaluate it in terms of participants' perception of the robot, their behavior, as well as the perception of third-party observers.
KW - artificial listener
KW - eye-gaze patterns
KW - head gestures
KW - human-robot interaction
KW - multi-party interactions
KW - non-verbal behaviors
KW - social signal processing
UR - http://www.scopus.com/inward/record.url?scp=85110106028&partnerID=8YFLogxK
DO - 10.3389/frobt.2021.555913
M3 - Article
AN - SCOPUS:85110106028
SN - 2296-9144
VL - 8
JO - Frontiers in Robotics and AI
JF - Frontiers in Robotics and AI
M1 - 555913
ER -