Look together: Using gaze for assisting co-located collaborative search

Yanxia Zhang*, Ken Pfeuffer, Ming Ki Chong, Jason Alexander, Andreas Bulling, Hans Gellersen

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

57 Citations (Scopus)
47 Downloads (Pure)

Abstract

Gaze information provides an indication of a user's focus, which complements remote collaboration tasks, as distant users can see their partner's focus. In this paper, we apply gaze to co-located collaboration, where users' gaze locations are presented on a shared display to support collaboration between partners. We integrated various types of gaze indicators into the user interface of a collaborative search system, and we conducted two user studies to understand how gaze enhances coordination and communication between co-located users. Our results show that gaze does enhance co-located collaboration, but with a trade-off between the visibility of gaze indicators and user distraction. Users acknowledged that seeing gaze indicators eases communication, because it lets them be aware of their partner's interests and attention. However, users can be reluctant to share their gaze information due to trust and privacy concerns, as gaze potentially divulges their interests.

Original language: English
Pages (from-to): 173-186
Number of pages: 14
Journal: Personal and Ubiquitous Computing
Volume: 21
Issue number: 1
DOIs: yes
Publication status: Published - 1 Feb 2017

Keywords

  • Collaborative task
  • Eye tracking
  • Gaze awareness
  • Gaze interaction
  • Large pervasive display
  • Multi-user

