Paper Title

Proximal Implicit ODE Solvers for Accelerating Learning Neural ODEs

Authors

Justin Baker, Hedi Xia, Yiwei Wang, Elena Cherkaev, Akil Narayan, Long Chen, Jack Xin, Andrea L. Bertozzi, Stanley J. Osher, Bao Wang

Abstract

Learning neural ODEs often requires solving very stiff ODE systems, primarily using explicit adaptive step size ODE solvers. These solvers are computationally expensive, requiring the use of tiny step sizes for numerical stability and accuracy guarantees. This paper considers learning neural ODEs using implicit ODE solvers of different orders leveraging proximal operators. The proximal implicit solver consists of inner-outer iterations: the inner iterations approximate each implicit update step using a fast optimization algorithm, and the outer iterations solve the ODE system over time. The proximal implicit ODE solver guarantees superiority over explicit solvers in numerical stability and computational efficiency. We validate the advantages of proximal implicit solvers over existing popular neural ODE solvers on various challenging benchmark tasks, including learning continuous-depth graph neural networks and continuous normalizing flows.
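As a rough illustration of the inner-outer structure described in the abstract, the sketch below marches an ODE forward with a backward-Euler outer loop and approximates each implicit update with a damped fixed-point (gradient-descent-style) inner loop. The function name, the damping parameter, the inner iteration count, and the stiff test problem y' = -50y are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def proximal_implicit_euler(f, y0, t0, t1, num_steps, inner_iters=10, alpha=0.5):
    """Sketch of an inner-outer implicit (backward Euler) ODE solver.

    Outer loop: step the ODE y' = f(t, y) forward in time with backward Euler.
    Inner loop: approximately solve the implicit update
        y_{n+1} = y_n + h * f(t_{n+1}, y_{n+1})
    by damped fixed-point iterations on the residual
        r(z) = z - y_n - h * f(t_{n+1}, z);
    when f = -grad F this is gradient descent on the proximal objective
        ||z - y_n||^2 / 2 + h * F(z).
    """
    h = (t1 - t0) / num_steps
    y = np.asarray(y0, dtype=float)
    t = t0
    for _ in range(num_steps):           # outer iterations: time stepping
        t_next = t + h
        z = y.copy()                     # warm start the inner solve at y_n
        for _ in range(inner_iters):     # inner iterations: approximate implicit update
            residual = z - y - h * f(t_next, z)
            z = z - alpha * residual     # damped fixed-point (gradient-style) update
        y, t = z, t_next
    return y

if __name__ == "__main__":
    # Stiff linear test problem y' = -50 * y (assumed example), stable at h = 0.05
    # even though an explicit Euler step of the same size would diverge.
    f = lambda t, y: -50.0 * y
    print(proximal_implicit_euler(f, y0=1.0, t0=0.0, t1=1.0, num_steps=20))
```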
