Paper Title
Multipole Graph Neural Operator for Parametric Partial Differential Equations
Paper Authors
Paper Abstract
One of the main challenges in using deep learning-based methods for simulating physical systems and solving partial differential equations (PDEs) is formulating physics-based data in the desired structure for neural networks. Graph neural networks (GNNs) have gained popularity in this area since graphs offer a natural way of modeling particle interactions and provide a clear way of discretizing the continuum models. However, the graphs constructed for approximating such tasks usually ignore long-range interactions due to unfavorable scaling of the computational complexity with respect to the number of nodes. The errors due to these approximations scale with the discretization of the system, thereby not allowing for generalization under mesh-refinement. Inspired by the classical multipole methods, we propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity. Our multi-level formulation is equivalent to recursively adding inducing points to the kernel matrix, unifying GNNs with multi-resolution matrix factorization of the kernel. Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
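The abstract describes the core idea of replacing a dense, O(n²) kernel evaluation over all node pairs with an approximation through inducing points, which the multi-level formulation then applies recursively. A minimal single-level sketch of that idea is below; it is not the paper's implementation, and the RBF kernel, lengthscale, and evenly spaced inducing locations are illustrative assumptions chosen for the example.

```python
import numpy as np

def rbf(x, y, ls=0.5):
    """Pairwise RBF kernel between 1-D point sets x and y (illustrative choice)."""
    d2 = (x[:, None] - y[None, :]) ** 2
    return np.exp(-d2 / (2 * ls ** 2))

rng = np.random.default_rng(0)
n, m = 1000, 50                      # n fine-level nodes, m inducing points
x = rng.uniform(0, 1, n)             # node locations on [0, 1]
v = rng.standard_normal(n)           # node features

# Inducing (coarse-level) points; one level of the recursive construction.
z = np.linspace(0, 1, m)

# Dense kernel matvec: captures all long-range interactions but costs O(n^2).
full = rbf(x, x) @ v

# Nystrom-style approximation K v ~= K_nm K_mm^+ K_mn v, costing O(n*m).
# A truncated least-squares solve stands in for the pseudo-inverse for stability.
Knm = rbf(x, z)
Kmm = rbf(z, z)
w = np.linalg.lstsq(Kmm, Knm.T @ v, rcond=1e-10)[0]
approx = Knm @ w

rel_err = np.linalg.norm(full - approx) / np.linalg.norm(full)
print(f"relative error: {rel_err:.2e}")
```

Applying this compression recursively, with a fresh set of inducing points at each coarser level, is what lets the multi-level formulation cover interactions at all ranges while keeping the total cost linear in the number of nodes.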