Learning Stochastic Graph Neural Networks With Constrained Variance

Zhan Gao*, Elvin Isufi

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Stochastic graph neural networks (SGNNs) are information processing architectures that learn representations from data over random graphs. SGNNs are trained with respect to the expected performance, which comes with no guarantee about deviations of particular output realizations around the optimal expectation. To overcome this issue, we propose a variance-constrained optimization problem for SGNNs, balancing the expected performance and the stochastic deviation. The problem is solved with an alternating primal-dual learning procedure that updates the SGNN parameters with gradient descent and the dual variable with gradient ascent. To characterize the explicit effect of variance-constrained learning, we theoretically analyze the variance of the SGNN output and identify a trade-off between stochastic robustness and discrimination power. We further analyze the duality gap of the variance-constrained optimization problem and the convergence behavior of the primal-dual learning procedure. The former indicates the optimality loss induced by the dual transformation and the latter characterizes the limiting error of the iterative algorithm, both of which guarantee the performance of the variance-constrained learning. Through numerical simulations, we corroborate our theoretical findings and observe a strong expected performance with a controllable variance.
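The alternating primal-dual procedure in the abstract can be illustrated on a toy problem. The sketch below is an assumption-laden stand-in, not the authors' SGNN: a scalar parameter, random entry-dropping masks playing the role of random graph realizations, a hypothetical variance bound `beta`, and finite-difference gradients with common random numbers. It minimizes the Lagrangian of "expected loss subject to loss variance ≤ beta" by gradient descent on the parameter and projected gradient ascent on the dual variable.

```python
import numpy as np

# Hedged sketch of variance-constrained primal-dual learning on a toy
# stochastic model (NOT the paper's SGNN architecture):
#   minimize_theta  E[loss(theta; S)]  s.t.  Var[loss(theta; S)] <= beta,
# via the Lagrangian  L = E[loss] + lam * (Var[loss] - beta).
rng = np.random.default_rng(0)
x = rng.normal(size=20)      # fixed input signal
y = 2.0 * x                  # target: the optimum is theta = 2
beta = 0.05                  # assumed variance bound (hyperparameter)
theta, lam = 0.0, 0.0        # primal parameter and dual variable
eta_p, eta_d = 0.05, 0.5     # primal / dual step sizes
n_samples = 64               # Monte Carlo realizations per iteration
eps = 1e-4                   # finite-difference step

def losses_for(theta, masks):
    # Each mask randomly drops entries, mimicking a random graph realization.
    preds = theta * (masks * x)
    return ((preds - masks * y) ** 2).mean(axis=1)  # one loss per realization

for _ in range(500):
    masks = rng.random((n_samples, x.size)) < 0.9   # keep each entry w.p. 0.9
    l0 = losses_for(theta, masks)
    l1 = losses_for(theta + eps, masks)             # common random numbers
    L0 = l0.mean() + lam * (l0.var() - beta)        # sampled Lagrangian
    L1 = l1.mean() + lam * (l1.var() - beta)
    theta -= eta_p * (L1 - L0) / eps                # primal gradient descent
    lam = max(0.0, lam + eta_d * (l0.var() - beta)) # dual ascent, lam >= 0
```

At the optimum theta ≈ 2 both the expected loss and its variance vanish, so the constraint becomes inactive and the dual variable decays toward zero; away from it, a large sample variance inflates `lam` and pushes the primal update toward low-variance parameters, mirroring the robustness/discrimination trade-off discussed above.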
Original language: English
Pages (from-to): 358-371
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Publication status: Published - 2023

Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care
Otherwise, as indicated in the copyright section, the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.


Keywords

  • Stochastic graph neural networks
  • variance constraint
  • primal-dual learning
  • duality gap
  • convergence


