BNN-DP: Robustness Certification of Bayesian Neural Networks via Dynamic Programming

Steven Adams*, Andrea Patanè, Morteza Lahijanian, Luca Laurenti

*Corresponding author for this work

Research output: Conference contribution (scientific, peer-reviewed)



In this paper, we introduce BNN-DP, an efficient algorithmic framework for the analysis of adversarial robustness of Bayesian Neural Networks (BNNs). Given a compact set of input points T ⊂ ℝⁿ, BNN-DP computes lower and upper bounds on the BNN's predictions for all the points in T. The framework is based on an interpretation of BNNs as stochastic dynamical systems, which enables the use of Dynamic Programming (DP) algorithms to bound the prediction range along the layers of the network. Specifically, the method uses bound propagation techniques and convex relaxations to derive a backward recursion procedure that over-approximates the prediction range of the BNN with piecewise affine functions. The algorithm is general and can handle both regression and classification tasks. On a set of experiments covering various regression and classification tasks and BNN architectures, we show that BNN-DP outperforms state-of-the-art methods by up to four orders of magnitude in both tightness of the bounds and computational efficiency.
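To give a flavor of the bound-propagation step the abstract refers to, the sketch below shows generic interval bound propagation through a single stochastic affine layer followed by a ReLU. This is an illustrative simplification, not the BNN-DP algorithm itself: it assumes the layer's weights and biases have been confined to element-wise intervals (e.g. via truncation of their posterior), whereas BNN-DP derives tighter piecewise affine relaxations via a DP backward recursion. All function and variable names here are hypothetical.

```python
import numpy as np

def affine_bounds(l, u, W_lo, W_hi, b_lo, b_hi):
    """Interval bounds on W @ x + b for x in [l, u], W in [W_lo, W_hi],
    b in [b_lo, b_hi], using standard interval arithmetic.

    Each product term W_ij * x_j attains its extremes at one of the four
    corner combinations of the weight and input intervals.
    """
    cands = np.stack([W_lo * l, W_lo * u, W_hi * l, W_hi * u])  # (4, m, n)
    lo = cands.min(axis=0).sum(axis=1) + b_lo
    hi = cands.max(axis=0).sum(axis=1) + b_hi
    return lo, hi

def relu_bounds(l, u):
    """ReLU is monotone, so it maps interval endpoints to interval endpoints."""
    return np.maximum(l, 0.0), np.maximum(u, 0.0)

# Example: one hidden unit, inputs in [0,1]^2, weights known exactly (degenerate
# intervals), zero bias -> pre-activation range [0, 2].
l, u = affine_bounds(np.array([0.0, 0.0]), np.array([1.0, 1.0]),
                     np.array([[1.0, 1.0]]), np.array([[1.0, 1.0]]),
                     np.zeros(1), np.zeros(1))
l, u = relu_bounds(l, u)
```

Composing such per-layer bounds front to back yields a sound but generally loose over-approximation; the paper's contribution is precisely to tighten this by reasoning backwards over the layers with DP.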

Original language: English
Title of host publication: ICML'23: Proceedings of the 40th International Conference on Machine Learning
Editors: Andreas Krause, Emma Brunskill, Kyunghyun Cho
Publisher: Association for Computing Machinery (ACM)
Number of pages: 19
Publication status: Published - 2023
Event: 40th International Conference on Machine Learning, ICML 2023 - Honolulu, United States
Duration: 23 Jul 2023 - 29 Jul 2023

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Print): 2640-3498


Conference: 40th International Conference on Machine Learning, ICML 2023
Country/Territory: United States


