Depth for Multi-Modal Contour Ensembles

N. F. Chaves-de-Plaza*, M. Molenaar, P. Mody, M. Staring, R. van Egmond, E. Eisemann, A. Vilanova, K. Hildebrandt

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

The contour depth methodology enables non-parametric summarization of contour ensembles by extracting their representatives, confidence bands, and outliers for visualization (via contour boxplots) and for robust downstream procedures. We address two shortcomings of these methods. First, we significantly expedite the computation and recomputation of Inclusion Depth (ID), introducing a linear-time algorithm for epsilon Inclusion Depth (eID), a variant that handles ensembles whose contours intersect multiple times. We also present the inclusion matrix, which encodes the pairwise inclusion relationships between contours, and leverage it to accelerate the recomputation of ID. Second, moving beyond the single-distribution assumption, we present Relative Depth (ReD), a generalization of contour depth to ensembles with multiple modes. Building upon the linear-time eID, we introduce CDclust, a clustering algorithm that untangles an ensemble's modes of variation by optimizing ReD. Synthetic and real datasets from medical image segmentation and meteorological forecasting showcase the speed advantages, illustrate the use case of progressive depth computation, and enable non-parametric multimodal analysis. To promote research and adoption, we offer the contour-depth Python package.
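To make the core ideas concrete, the following is a minimal NumPy sketch of the pairwise inclusion matrix and a simplified inclusion-depth score for an ensemble of binary masks. It is an illustration only: the helper names are hypothetical, and the paper's exact ID/eID definitions and its linear-time algorithm differ from this quadratic-time toy version.

```python
import numpy as np

def nested_disk_masks(n=5, size=64):
    """Synthetic ensemble: n nested binary disks (toy stand-in for contours)."""
    yy, xx = np.mgrid[:size, :size]
    c = size / 2.0
    r2 = (yy - c) ** 2 + (xx - c) ** 2
    radii = np.linspace(8, 28, n)
    return [r2 <= r * r for r in radii]

def inclusion_matrix(masks):
    """M[i, j] = 1 if contour i lies inside contour j (mask subset test)."""
    n = len(masks)
    M = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(masks[i] <= masks[j]):
                M[i, j] = 1
    return M

def inclusion_depth(M):
    """Simplified depth: min(#contours containing i, #contours inside i) / (n - 1)."""
    n = M.shape[0]
    contained_in = M.sum(axis=1)   # how many contours contain contour i
    contains = M.sum(axis=0)       # how many contours lie inside contour i
    return np.minimum(contained_in, contains) / (n - 1)

masks = nested_disk_masks(5)
depths = inclusion_depth(inclusion_matrix(masks))
```

For five nested disks, the middle disk is both contained by and contains others, so it attains the highest depth, while the innermost and outermost disks score zero; this is the intuition behind using depth to pick a representative contour.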

Original language: English
Article number: e15083
Number of pages: 12
Journal: Computer Graphics Forum
Volume: 43
Issue number: 3
DOIs
Publication status: Published - 2024

Keywords

  • CCS concepts
  • cluster analysis
  • statistical graphics
  • human-centered computing
  • mathematics of computing
  • scientific visualization
  • nonparametric statistics

