Paper Title

Learning Physics-Informed Neural Networks without Stacked Back-propagation

Paper Authors

Di He, Shanda Li, Wenlei Shi, Xiaotian Gao, Jia Zhang, Jiang Bian, Liwei Wang, Tie-Yan Liu

Paper Abstract

Physics-Informed Neural Network (PINN) has become a commonly used machine learning approach to solve partial differential equations (PDE). But, facing high-dimensional second-order PDE problems, PINN will suffer from severe scalability issues since its loss includes second-order derivatives, the computational cost of which will grow along with the dimension during stacked back-propagation. In this work, we develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks. In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation. We further discuss the model capacity and provide variance reduction methods to address key limitations in the derivative estimation. Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is significantly faster. Our code is released at https://github.com/LithiumDA/PINN-without-Stacked-BP.
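The core idea the abstract describes can be illustrated concretely. For a Gaussian smoothed model f_sigma(x) = E_{d ~ N(0, sigma^2 I)}[f(x + d)], Stein's Identity expresses the Hessian as an expectation over forward evaluations only, Hess f_sigma(x) = E[ f(x + d) (d d^T - sigma^2 I) / sigma^4 ], so second-order derivatives can be estimated by Monte Carlo sampling with no stacked back-propagation. Below is a minimal NumPy sketch of this estimator; it illustrates the identity only and is not the authors' released implementation (which parameterizes f with a neural network and adds the variance reduction methods mentioned above). The function name `stein_hessian` and the sample count are illustrative choices.

```python
import numpy as np

def stein_hessian(f, x, sigma=0.1, n_samples=200_000, rng=None):
    """Estimate the Hessian of the Gaussian-smoothed function
    f_sigma(x) = E_{d ~ N(0, sigma^2 I)}[f(x + d)]
    via Stein's Identity, using forward evaluations of f only:
        Hess f_sigma(x) = E[ f(x + d) * (d d^T - sigma^2 I) / sigma^4 ].
    """
    rng = np.random.default_rng(0) if rng is None else rng
    dim = x.shape[0]
    d = rng.normal(scale=sigma, size=(n_samples, dim))    # Gaussian perturbations
    fvals = f(x + d)                                      # forward passes only, shape (n_samples,)
    outer = np.einsum("ni,nj->nij", d, d)                 # d d^T for each sample
    weight = (outer - sigma**2 * np.eye(dim)) / sigma**4  # Stein weight matrices
    return np.mean(fvals[:, None, None] * weight, axis=0)

# Sanity check: f(x) = ||x||^2 has Hessian 2I, and so does its Gaussian
# smoothing, so the estimate should be close to 2I.
x0 = np.array([0.5, -0.3])
H = stein_hessian(lambda x: np.sum(x**2, axis=-1), x0)
print(H)  # approximately [[2, 0], [0, 2]], up to Monte Carlo noise
```

Running this sketch also makes the paper's key limitation visible: the Stein weights scale as 1/sigma^2, so the plain Monte Carlo estimate becomes noisy as sigma shrinks, which is precisely why the paper pairs the estimator with variance reduction methods.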
