TY - JOUR
T1 - Convolutional Neural Operators for robust and accurate learning of PDEs
AU - Raonić, Bogdan
AU - Molinaro, Roberto
AU - De Ryck, Tim
AU - Rohner, Tobias
AU - Bartolucci, Francesca
AU - Alaifari, Rima
AU - Mishra, Siddhartha
AU - de Bézenac, Emmanuel
PY - 2023
Y1 - 2023
N2 - Although very successfully used in conventional machine learning, convolution-based neural network architectures, believed to be inconsistent in function space, have been largely ignored in the context of learning solution operators of PDEs. Here, we present novel adaptations for convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed convolutional neural operators (CNOs), is designed specifically to preserve its underlying continuous nature, even when implemented in a discretized form on a computer. We prove a universality theorem to show that CNOs can approximate operators arising in PDEs to desired accuracy. CNOs are tested on a novel suite of benchmarks, encompassing a diverse set of PDEs with possibly multi-scale solutions, and are observed to significantly outperform baselines, paving the way for an alternative framework for robust and accurate operator learning.
AB - Although very successfully used in conventional machine learning, convolution-based neural network architectures, believed to be inconsistent in function space, have been largely ignored in the context of learning solution operators of PDEs. Here, we present novel adaptations for convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed convolutional neural operators (CNOs), is designed specifically to preserve its underlying continuous nature, even when implemented in a discretized form on a computer. We prove a universality theorem to show that CNOs can approximate operators arising in PDEs to desired accuracy. CNOs are tested on a novel suite of benchmarks, encompassing a diverse set of PDEs with possibly multi-scale solutions, and are observed to significantly outperform baselines, paving the way for an alternative framework for robust and accurate operator learning.
UR - http://www.scopus.com/inward/record.url?scp=85191166636&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85191166636
SN - 1049-5258
VL - 36
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Y2 - 10 December 2023 through 16 December 2023
ER -