Uncertainty quantification and out-of-distribution detection using surjective normalizing flows

Simon Dirmeier, Ye Hong, Yanan Xin, Fernando Perez-Cruz

Research output: Working paper/Preprint

Abstract

Reliable quantification of epistemic and aleatoric uncertainty is of crucial importance in applications where models are trained in one environment but applied to multiple different ones, as is often the case in real-world settings such as climate science or mobility analysis. We propose a simple approach using surjective normalizing flows to identify out-of-distribution data sets in deep neural network models; the detection score can be computed in a single forward pass. The method builds on recent developments in deep uncertainty quantification and generative modeling with normalizing flows. We apply our method to a synthetic data set simulated from a mechanistic model from the mobility literature, as well as to several data sets simulated from interventional distributions induced by soft and atomic interventions on that model, and demonstrate that our method can reliably discern out-of-distribution data from in-distribution data. We compare the surjective flow model to a Dirichlet process mixture model and a bijective flow and find that the surjection is a crucial component for reliably distinguishing in-distribution from out-of-distribution data.
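
The core recipe described in the abstract, likelihood-based out-of-distribution scoring with a dimension-reducing ("surjective") transformation evaluated in a single forward pass, can be illustrated with a minimal sketch. The linear projection, Gaussian latent density, quantile threshold, and toy data below are simplified stand-ins chosen for illustration; they are not the authors' flow architecture, likelihood correction terms, or experimental setup.

```python
# Minimal sketch: score samples by latent log-likelihood under a fixed
# dimension-reducing projection, then flag points whose score exceeds a
# threshold calibrated on in-distribution data. All components are
# simplified stand-ins for the learned surjective normalizing flow.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Toy in-distribution data: 10-dimensional observations on a 3-dim. subspace.
D, d, n = 10, 3, 2000
A = rng.normal(size=(D, d))
x_train = rng.normal(size=(n, d)) @ A.T + 0.1 * rng.normal(size=(n, D))

# "Surjection": a fixed linear map from data space R^D to latent space R^d
# (a real surjective flow would use trainable layers and likelihood corrections).
W = np.linalg.svd(x_train, full_matrices=False)[2][:d]

# Latent base density fitted on the projected in-distribution data.
z_train = x_train @ W.T
base = multivariate_normal(mean=z_train.mean(axis=0),
                           cov=np.cov(z_train, rowvar=False))

def ood_score(x):
    """Negative latent log-likelihood; higher means more out-of-distribution."""
    return -base.logpdf(x @ W.T)

# Calibrate a threshold on in-distribution scores (here the 99th percentile).
threshold = np.quantile(ood_score(x_train), 0.99)

# A shifted copy of the data stands in for an interventional distribution.
x_ood = x_train + 3.0
print("in-distribution flagged:    ", np.mean(ood_score(x_train) > threshold))
print("out-of-distribution flagged:", np.mean(ood_score(x_ood) > threshold))
```

The single forward pass here is the projection plus latent density evaluation; in the paper this role is played by the surjective flow's change-of-variables likelihood.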
Original language: English
Publisher: Cornell University Library - arXiv.org
Number of pages: 14
DOIs
Publication status: Published - 2023
Externally published: Yes
