Head-tracked off-axis perspective projection improves gaze readability of 3D virtual avatars

Tamas Bates, Jens Kober, Michael Gienger

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review



Virtual avatars have been employed in many contexts, from simple conversational agents to communicating the internal state and intentions of large robots when interacting with humans. Rarely, however, are they employed in scenarios which require non-verbal communication of spatial information or dynamic interaction from a variety of perspectives. When presented on a flat screen, many illusions and visual artifacts interfere with such applications, which leads to a strong preference for physically-actuated heads and faces.

By adjusting the perspective projection used to render 3D avatars to match a viewer's physical perspective, they could provide a useful middle ground between typical 2D/3D avatar representations, which are often ambiguous in their spatial relationships, and physically-actuated heads/faces, which can be difficult to construct or impractical to use in some environments. A user study was conducted to determine to what extent a head-tracked perspective projection scheme was able to mitigate the issues in readability of a 3D avatar's expression or gaze target compared to use of a standard perspective projection. To the authors' knowledge, this is the first user study to perform such a comparison, and the results show not only an overall improvement in viewers' accuracy when attempting to follow the avatar's gaze, but also a reduction in spatial biases in predictions made from oblique viewing angles.
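Head-tracked off-axis projection of this kind is commonly implemented with a generalized perspective projection built from the tracked eye position and the physical screen's corner positions. The sketch below, in NumPy, is illustrative only and is not taken from the paper; the function name, corner convention (lower-left, lower-right, upper-left), and OpenGL-style clip-space conventions are assumptions:

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Off-axis perspective projection from a tracked eye position.

    pa, pb, pc: screen corners (lower-left, lower-right, upper-left), world coords.
    pe: tracked eye (head) position, world coords.
    Returns a 4x4 OpenGL-style clip matrix (illustrative sketch, not the paper's code).
    """
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))
    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, toward eye
    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                               # eye-to-screen distance
    # Frustum extents, scaled onto the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    # Standard asymmetric frustum matrix.
    P = np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])
    # Rotate the world into the screen basis, then translate the eye to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4); T[:3, 3] = -pe
    return P @ M @ T
```

Re-deriving this matrix every frame from the latest head-tracking sample is what keeps the rendered avatar's gaze direction consistent with the viewer's actual vantage point; with the eye centered in front of the screen it reduces to an ordinary symmetric frustum.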
Original language: English
Title of host publication: Proceedings SIGGRAPH Asia 2018 (SA '18)
Subtitle of host publication: Technical Briefs
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 4
ISBN (Electronic): 978-1-4503-6062-3
Publication status: Published - 2018
Event: SIGGRAPH Asia 2018 - Tokyo, Japan
Duration: 4 Dec 2018 – 7 Dec 2018


Conference: SIGGRAPH Asia 2018
Abbreviated title: SA '18

Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care

Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.


  • Human-Computer Interaction
  • Virtual Reality
  • Augmented Reality
  • Mixed Reality
  • Eye Gaze

