Paper Title

Quantum-Inspired Tensor Neural Networks for Partial Differential Equations

Authors

Raj Patel, Chia-Wei Hsing, Serkan Sahin, Saeed S. Jahromi, Samuel Palmer, Shivam Sharma, Christophe Michel, Vincent Porte, Mustafa Abid, Stephane Aubert, Pierre Castellani, Chi-Guhn Lee, Samuel Mugel, Roman Orus

Abstract

Partial Differential Equations (PDEs) are used to model a variety of dynamical systems in science and engineering. Recent advances in deep learning have enabled us to solve them in a higher dimension by addressing the curse of dimensionality in new ways. However, deep learning methods are constrained by training time and memory. To tackle these shortcomings, we implement Tensor Neural Networks (TNN), a quantum-inspired neural network architecture that leverages Tensor Network ideas to improve upon deep learning approaches. We demonstrate that TNN provide significant parameter savings while attaining the same accuracy as compared to the classical Dense Neural Network (DNN). In addition, we also show how TNN can be trained faster than DNN for the same accuracy. We benchmark TNN by applying them to solve parabolic PDEs, specifically the Black-Scholes-Barenblatt equation, widely used in financial pricing theory, empirically showing the advantages of TNN over DNN. Further examples, such as the Hamilton-Jacobi-Bellman equation, are also discussed.
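For context, the Black-Scholes-Barenblatt benchmark named in the abstract is usually posed as a terminal-value problem of the following form. This is the commonly used setup in the deep-learning PDE literature (constant volatility sigma, interest rate r, terminal payoff g); it is stated here as an assumption and may differ in details from the paper's exact formulation.

```latex
% Commonly used d-dimensional Black-Scholes-Barenblatt benchmark
% (assumed standard form, not necessarily the paper's exact setup):
\begin{equation}
  \frac{\partial u}{\partial t}
  + \frac{\sigma^{2}}{2}\sum_{i=1}^{d} x_i^{2}\,\frac{\partial^{2} u}{\partial x_i^{2}}
  = r\left(u - \sum_{i=1}^{d} x_i\,\frac{\partial u}{\partial x_i}\right),
  \qquad u(T, x) = g(x).
\end{equation}
```

The parameter savings claimed for TNN come from replacing dense weight matrices with tensor-network factorizations. The sketch below is a minimal illustration of that general idea: a two-core matrix product operator (tensor-train) layer in PyTorch compared against a dense layer of the same input/output size. The layer sizes, bond dimension, and the class name `MPOLinear` are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a linear layer whose weight matrix
# is factorized as a two-core matrix product operator (MPO / tensor train).
import torch
import torch.nn as nn


class MPOLinear(nn.Module):
    """Maps R^(n1*n2) -> R^(m1*m2) with an MPO-factorized weight matrix."""

    def __init__(self, n1, n2, m1, m2, bond_dim):
        super().__init__()
        # Two small cores replace the full (n1*n2) x (m1*m2) weight matrix.
        self.core1 = nn.Parameter(0.1 * torch.randn(n1, m1, bond_dim))
        self.core2 = nn.Parameter(0.1 * torch.randn(bond_dim, n2, m2))
        self.bias = nn.Parameter(torch.zeros(m1 * m2))
        self.shapes = (n1, n2, m1, m2)

    def forward(self, x):
        n1, n2, m1, m2 = self.shapes
        batch = x.shape[0]
        x = x.reshape(batch, n1, n2)
        # Contract the input with both cores (sum over n1, n2 and the bond index).
        y = torch.einsum("bij,iar,rjc->bac", x, self.core1, self.core2)
        return y.reshape(batch, m1 * m2) + self.bias


if __name__ == "__main__":
    count = lambda m: sum(p.numel() for p in m.parameters())
    dense = nn.Linear(64 * 64, 64 * 64)          # roughly 16.8M parameters
    mpo = MPOLinear(64, 64, 64, 64, bond_dim=8)  # roughly 0.07M parameters
    print("dense params:", count(dense))
    print("MPO params:  ", count(mpo))
```

Increasing the bond dimension interpolates back toward the expressiveness (and parameter count) of the dense layer; that bond dimension is the knob that trades accuracy against parameter savings in tensor-network layers.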
