Paper Title

Multi-fidelity surrogate modeling using long short-term memory networks

Authors

Paolo Conti, Mengwu Guo, Andrea Manzoni, Jan S. Hesthaven

Abstract

When evaluating quantities of interest that depend on the solutions to differential equations, we inevitably face the trade-off between accuracy and efficiency. Especially for parametrized, time-dependent problems in engineering computations, it is often the case that acceptable computational budgets limit the availability of high-fidelity, accurate simulation data. Multi-fidelity surrogate modeling has emerged as an effective strategy to overcome this difficulty. Its key idea is to leverage abundant low-fidelity simulation data, which are less accurate but much faster to compute, to improve the approximations obtained from limited high-fidelity data. In this work, we introduce a novel data-driven framework of multi-fidelity surrogate modeling for parametrized, time-dependent problems using long short-term memory (LSTM) networks, to enhance output predictions both for unseen parameter values and forward in time simultaneously, a task known to be particularly challenging for data-driven models. We demonstrate the wide applicability of the proposed approaches in a variety of engineering problems with high- and low-fidelity data generated through fine versus coarse meshes, small versus large time steps, or finite element full-order versus deep learning reduced-order models. Numerical results show that the proposed multi-fidelity LSTM networks not only improve single-fidelity regression significantly, but also outperform multi-fidelity models based on feed-forward neural networks.
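The abstract contains no code, so the following is only a minimal, hedged sketch of the general idea it describes, not the authors' actual architecture: a low-fidelity LSTM branch learns the parameter-to-time-series map from plentiful coarse data, and a high-fidelity LSTM branch takes the low-fidelity prediction as an additional input feature and is trained on the scarce accurate data. All names (MultiFidelityLSTM, lf_lstm, hf_lstm, etc.) are hypothetical, and the layer sizes are placeholders.

```python
import torch
import torch.nn as nn

class MultiFidelityLSTM(nn.Module):
    """Illustrative multi-fidelity surrogate (sketch, not the paper's exact network):
    an LF branch predicts the output time series from (parameters, time), and an HF
    branch refines it using the LF prediction as an extra input feature."""

    def __init__(self, n_params: int, hidden: int = 32):
        super().__init__()
        # LF branch: (parameters, time) -> low-fidelity output at each time step
        self.lf_lstm = nn.LSTM(input_size=n_params + 1, hidden_size=hidden, batch_first=True)
        self.lf_head = nn.Linear(hidden, 1)
        # HF branch: (parameters, time, LF prediction) -> high-fidelity output
        self.hf_lstm = nn.LSTM(input_size=n_params + 2, hidden_size=hidden, batch_first=True)
        self.hf_head = nn.Linear(hidden, 1)

    def forward(self, params: torch.Tensor, t: torch.Tensor):
        # params: (batch, n_params), t: (batch, n_steps, 1)
        p_seq = params.unsqueeze(1).expand(-1, t.shape[1], -1)  # repeat parameters over time
        lf_in = torch.cat([p_seq, t], dim=-1)
        y_lf = self.lf_head(self.lf_lstm(lf_in)[0])              # LF prediction of the output history
        hf_in = torch.cat([p_seq, t, y_lf], dim=-1)
        y_hf = self.hf_head(self.hf_lstm(hf_in)[0])              # HF prediction informed by the LF one
        return y_lf, y_hf

# Typical usage in this setting: fit the LF branch on abundant low-fidelity data first,
# then fit the HF branch on the limited high-fidelity data while keeping the LF branch fixed.
model = MultiFidelityLSTM(n_params=2)
mu = torch.rand(8, 2)                                    # 8 parameter samples
t = torch.linspace(0.0, 1.0, 50).view(1, 50, 1).expand(8, -1, -1)
y_lf, y_hf = model(mu, t)
print(y_lf.shape, y_hf.shape)                            # torch.Size([8, 50, 1]) for both
```

The two-branch layout mirrors the multi-fidelity principle stated in the abstract: the cheap low-fidelity model supplies the trend over parameters and time, and the data-hungry part of the learning problem left to the high-fidelity branch is only a correction, which is why few accurate samples can suffice.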
