Graph-Adaptive Activation Functions for Graph Neural Networks

Bianca Iancu, Luana Ruiz, Alejandro Ribeiro, Elvin Isufi

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

2 Citations (Scopus)

Abstract

Activation functions are crucial in graph neural networks (GNNs) as they allow defining a nonlinear family of functions to capture the relationship between the input graph data and their representations. This paper proposes activation functions for GNNs that not only incorporate the graph topology into the nonlinearity but are also distributable. To embed the feature-topology coupling into all GNN components, nodal features are nonlinearized and combined with a set of trainable parameters in a form akin to graph convolutions. This yields a graph-adaptive, trainable nonlinear component of the GNN that can be implemented directly or via kernel transformations, thereby enriching the class of functions the network can represent. We show that, in either the direct or the kernel form, permutation equivariance is always preserved. We also prove that the subclass of graph-adaptive max activation functions is Lipschitz stable to input perturbations. Numerical experiments on distributed source localization, finite-time consensus, distributed regression, and recommender systems corroborate our findings and show improved performance compared with pointwise as well as state-of-the-art localized nonlinearities.
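The abstract describes nonlinearizing nodal features and mixing them with trainable parameters in a convolution-like form. As a concrete illustration, below is a minimal NumPy sketch of one such nonlinearity, assuming the graph-adaptive max-activation form sigma(x) = sum_k theta_k * m_k(x), where [m_k(x)]_i is the maximum of the signal over node i's k-hop neighborhood. The function name, the coefficient layout, and the small path-graph example are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def graph_adaptive_max_activation(x, S, theta):
    """Sketch of a graph-adaptive max activation (hypothetical form).

    x     : (N,) graph signal, one feature value per node
    S     : (N, N) graph shift operator (e.g., adjacency matrix)
    theta : (K+1,) trainable coefficients, one per neighborhood radius

    Returns sigma(x) = sum_k theta[k] * m_k(x), where [m_k(x)]_i is
    the maximum of x over node i's k-hop neighborhood.
    """
    N = x.shape[0]
    A = (S != 0).astype(int)            # sparsity pattern of the shift operator
    reach = np.eye(N, dtype=int)        # k = 0: each node reaches only itself
    out = np.zeros_like(x, dtype=float)
    for theta_k in theta:
        neighborhood = reach > 0        # nodes within k hops of each node
        m_k = np.array([x[neighborhood[i]].max() for i in range(N)])
        out += theta_k * m_k            # trainable, convolution-like mixing
        reach = reach @ (A + np.eye(N, dtype=int))  # extend reach by one hop
    return out

# Example: path graph on 4 nodes, K = 1
S = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
x = np.array([1.0, -2.0, 3.0, 0.5])
print(graph_adaptive_max_activation(x, S, theta=np.array([0.5, 0.5])))

Because each m_k depends only on local neighborhood structure, relabeling the nodes permutes the output accordingly, consistent with the permutation-equivariance claim above; each node also needs only k-hop exchanges with its neighbors, which is what makes the nonlinearity distributable.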
Original language: English
Title of host publication: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)
Subtitle of host publication: Proceedings
Publisher: IEEE
Pages: 1-6
Number of pages: 6
ISBN (Electronic): 978-1-7281-6662-9
ISBN (Print): 978-1-7281-6663-6
Publication status: Published - 2020
Event: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP) - Espoo, Finland
Duration: 21 Sep 2020 - 24 Sep 2020
Conference number: 30

Keywords

  • Activation functions
  • Graph neural networks
  • Graph signal processing
  • Lipschitz stability
  • Permutation equivariance
