Split-Depth Image Generation and Optimization

Jingtang Liao, Martin Eisemann, Elmar Eisemann

Research output: Contribution to journal › Article › Scientific › peer-review

2 Citations (Scopus)

Abstract

Split-depth images exploit an optical illusion that can enhance the 3D impression of a 2D animation. In split-depth images (often called split-depth GIFs, after the commonly used file format), static virtual occluders in the form of vertical or horizontal bars are added to a video clip; the resulting occlusions are interpreted by the observer as a depth cue. In this paper, we study the different factors that contribute to the illusion and propose a solution to generate split-depth images for a given RGB + depth image sequence. The presented solution builds upon a motion summarization of the object of interest (OOI) through space and time, which allows us to formulate the bar positioning as an energy-minimization problem that we solve efficiently. We take a variety of important features into account, such as changes of the 3D effect due to changes in the motion topology, occlusion, the proximity of bars to each other or to the OOI, and scene saliency. We conducted a number of psycho-visual experiments to derive an appropriate energy formulation. Our method helps in finding optimal positions for the bars and thus improves the 3D perception of the original animation. We demonstrate the effectiveness of our approach on a variety of examples, and our study with novice users shows that our approach allows them to quickly create satisfying results even for complex animations.
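To make the abstract's energy-minimization idea concrete, the sketch below places two vertical bars by minimizing a toy energy over candidate image columns. It is not the authors' implementation: the term names (crossing, proximity, saliency), the weights, the exhaustive solver, and the data (ooi_track, SALIENCY) are all illustrative assumptions; the paper derives its actual energy formulation from psycho-visual experiments.

```python
"""Hypothetical sketch: split-depth bar placement as energy minimization.

All terms, weights, and data below are assumptions for illustration only,
not the formulation from the paper.
"""
from itertools import combinations

# Motion summary: per-frame horizontal extent (x_min, x_max) of the
# object of interest (OOI), e.g. extracted from an RGB+D sequence.
ooi_track = [(40, 80), (60, 100), (90, 140), (130, 180), (170, 220)]

IMAGE_WIDTH = 256
SALIENCY = [0.1] * IMAGE_WIDTH  # per-column saliency (assumed precomputed)

def crossing_term(x):
    """Reward bar positions the OOI actually crosses: the occlusion
    event is what produces the depth cue."""
    crossings = sum(1 for lo, hi in ooi_track if lo < x < hi)
    return -crossings  # more crossings -> lower energy

def proximity_term(xs, min_gap=40):
    """Penalize bars that sit too close to one another."""
    return sum(max(0, min_gap - abs(a - b)) for a, b in combinations(xs, 2))

def saliency_term(xs):
    """Penalize bars that cover salient image regions."""
    return sum(SALIENCY[x] for x in xs)

def energy(xs, w_cross=1.0, w_prox=0.1, w_sal=1.0):
    """Weighted sum of the individual terms (weights are made up here)."""
    return (w_cross * sum(crossing_term(x) for x in xs)
            + w_prox * proximity_term(xs)
            + w_sal * saliency_term(xs))

# Brute-force minimization over candidate column pairs (two vertical bars);
# the paper instead solves the problem efficiently.
candidates = range(10, IMAGE_WIDTH - 10, 5)
best = min(combinations(candidates, 2), key=energy)
print("bar x-positions:", best, "energy:", energy(best))
```

In this toy setup, the crossing term pulls bars toward columns the OOI sweeps across in many frames, while the proximity and saliency terms push them apart and away from visually important regions; the real method balances the additional factors the abstract lists, such as motion topology changes.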
Original language: English
Pages (from-to): 175-182
Number of pages: 8
Journal: Computer Graphics Forum (online)
Volume: 36
Issue number: 7
DOIs
Publication status: Published - 2017
Event: Pacific Graphics 2017: 25th Pacific Conference on Computer Graphics and Applications - Taipei, Taiwan
Duration: 16 Oct 2017 - 19 Oct 2017
https://www.eg.org/wp/event/pacific-graphics-2017/
