Variance-Constrained Learning for Stochastic Graph Neural Networks

Zhan Gao, Elvin Isufi, Alejandro Ribeiro

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

3 Citations (Scopus)

Abstract

Stochastic graph neural networks (SGNNs) are information processing architectures that learn representations from data over random graphs. SGNNs are trained with respect to the expected performance, but this training comes with no guarantee about the deviation of particular output realizations around the optimal mean. To overcome this issue, we propose a learning strategy for SGNNs based on a variance-constrained optimization problem that balances the expected performance against the stochastic deviation. To handle the variance constraint in the stochastic optimization problem, training is undertaken in the dual domain. We propose an alternating primal-dual learning algorithm that updates the primal variable (the SGNN parameters) with gradient descent and the dual variable with gradient ascent. We show that the stochastic deviation is explicitly controlled through the Chebyshev inequality, and we analyze the optimality loss induced by the primal-dual learning. Through numerical simulations, we observe strong performance in expectation with a controllable deviation, corroborating the theoretical findings.
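As a toy illustration of the alternating primal-dual scheme described in the abstract (primal gradient descent on the Lagrangian, projected dual gradient ascent on the variance constraint), the sketch below applies it to a scalar model with a random coefficient standing in for the random graph. The quadratic loss, step sizes, and variance budget `eps` are illustrative assumptions, not the paper's actual SGNN setup.

```python
# Minimal sketch of variance-constrained primal-dual learning:
#   minimize E[loss]  subject to  Var[loss] <= eps
# via the Lagrangian  mean(loss) + mu * (var(loss) - eps).
# The random coefficient `a` stands in for the random graph realization;
# the model, step sizes, and eps are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(0)

theta, mu = 3.0, 0.0        # primal parameter, dual multiplier
eps = 0.5                   # variance budget (assumed)
eta_p, eta_d = 0.05, 0.01   # primal / dual step sizes

for _ in range(2000):
    # One minibatch of random realizations of the stochastic model.
    a = 1.0 + 0.3 * rng.standard_normal(256)
    loss = (a * theta - 1.0) ** 2            # per-realization loss
    dloss = 2.0 * (a * theta - 1.0) * a      # its gradient w.r.t. theta
    # Gradient of the batch Lagrangian mean(loss) + mu*(var(loss) - eps),
    # using d var / d theta = 2*(E[loss*dloss] - E[loss]*E[dloss]).
    grad = dloss.mean() + mu * 2.0 * ((loss * dloss).mean()
                                      - loss.mean() * dloss.mean())
    theta -= eta_p * grad                                # primal descent
    mu = max(0.0, mu + eta_d * (loss.var() - eps))       # dual ascent, mu >= 0

# Evaluate the learned parameter on a large held-out batch.
final = ((1.0 + 0.3 * rng.standard_normal(20000)) * theta - 1.0) ** 2
print(theta, final.mean(), final.var())
```

The projection `max(0.0, ...)` keeps the multiplier feasible; whenever the minibatch variance exceeds the budget, `mu` grows and the variance term is penalized more heavily in the next primal step.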
Original language: English
Title of host publication: ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Place of Publication: Piscataway
Publisher: IEEE
Pages: 5245-5249
Number of pages: 5
ISBN (Electronic): 978-1-7281-7605-5
ISBN (Print): 978-1-7281-7606-2
Publication status: Published - 2021
Event: ICASSP 2021 - The IEEE International Conference on Acoustics, Speech, and Signal Processing, Virtual Conference/Toronto, Canada
Duration: 6 Jun 2021 - 11 Jun 2021

Conference

Conference: ICASSP 2021
Country/Territory: Canada
City: Virtual Conference/Toronto
Period: 6/06/21 - 11/06/21

Keywords

  • Distributed learning
  • Primal-dual learning
  • Stochastic graph neural networks
  • Variance constraint
