TY - GEN
T1 - Machine Learning in Adaptive FETI-DP
T2 - European Conference on Numerical Mathematics and Advanced Applications, ENUMATH 2019
AU - Heinlein, Alexander
AU - Klawonn, Axel
AU - Lanser, Martin
AU - Weber, Janine
PY - 2021
Y1 - 2021
N2 - The convergence rate of classical domain decomposition methods generally deteriorates severely for large discontinuities in the coefficient functions of the considered partial differential equation. To retain robustness for such highly heterogeneous problems, the coarse space can be enriched by additional coarse basis functions. These can be obtained by solving local generalized eigenvalue problems on subdomain edges. To reduce the number of eigenvalue problems and thus the computational cost, we use a neural network to predict the geometric location of critical edges, i.e., edges where the eigenvalue problem is indispensable. As input data for the neural network, we use function evaluations of the coefficient function within the two subdomains adjacent to an edge. In the present article, we examine the effect of computing the input data only in a neighborhood of the edge, i.e., on slabs next to the edge. We show numerical results both for the training data and for a concrete test problem in the form of a subsection of a microsection for linear elasticity problems. We observe that computing the sampling points in only one half or one quarter of each subdomain still yields robust algorithms.
AB - The convergence rate of classical domain decomposition methods generally deteriorates severely for large discontinuities in the coefficient functions of the considered partial differential equation. To retain robustness for such highly heterogeneous problems, the coarse space can be enriched by additional coarse basis functions. These can be obtained by solving local generalized eigenvalue problems on subdomain edges. To reduce the number of eigenvalue problems and thus the computational cost, we use a neural network to predict the geometric location of critical edges, i.e., edges where the eigenvalue problem is indispensable. As input data for the neural network, we use function evaluations of the coefficient function within the two subdomains adjacent to an edge. In the present article, we examine the effect of computing the input data only in a neighborhood of the edge, i.e., on slabs next to the edge. We show numerical results both for the training data and for a concrete test problem in the form of a subsection of a microsection for linear elasticity problems. We observe that computing the sampling points in only one half or one quarter of each subdomain still yields robust algorithms.
UR - http://www.scopus.com/inward/record.url?scp=85106416276&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-55874-1_58
DO - 10.1007/978-3-030-55874-1_58
M3 - Conference contribution
AN - SCOPUS:85106416276
SN - 978-3-030-55873-4
T3 - Lecture Notes in Computational Science and Engineering
SP - 593
EP - 603
BT - Numerical Mathematics and Advanced Applications, ENUMATH 2019 - European Conference
A2 - Vermolen, Fred J.
A2 - Vuik, Cornelis
PB - Springer
Y2 - 30 September 2019 through 4 October 2019
ER -