Title

Shape-constrained Symbolic Regression with NSGA-III

Author

Haider, Christian

Abstract

Shape-constrained symbolic regression (SCSR) allows prior knowledge to be included in data-based modeling. This inclusion ensures that certain expected behavior is better reflected by the resulting models. The expected behavior is defined via constraints that refer to the function's shape, e.g. monotonicity, concavity, convexity, or the model's image boundaries. In addition to yielding more robust and reliable models by constraining the function's shape, SCSR makes it possible to find models that are more robust to noise and have better extrapolation behavior. This paper presents a multi-criteria approach that minimizes the approximation error as well as the constraint violations. Specifically, the two algorithms NSGA-II and NSGA-III are implemented and compared against each other in terms of model quality and runtime. Both algorithms can handle multiple objectives: NSGA-II is a well-established multi-objective approach that performs well on instances with up to 3 objectives, while NSGA-III is an extension of the NSGA-II algorithm developed to handle "many-objective" problems (more than 3 objectives). Both algorithms are executed on a selected set of benchmark instances from physics textbooks. The results indicate that both algorithms are able to find largely feasible solutions, with NSGA-III providing slight improvements in model quality. Moreover, an improvement in runtime can be observed using the many-objective approach.
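To illustrate the two objective families the abstract describes, the following is a minimal sketch (not the paper's implementation) of how a candidate model might be scored on both approximation error and shape-constraint violation. The specific constraint (monotonic increase on an interval, checked by finite differences on a grid) and the penalty form are illustrative assumptions; an NSGA-style algorithm would then minimize the resulting objective vector jointly.

```python
def approximation_error(f, xs, ys):
    """Mean squared error of candidate model f on the training data."""
    return sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def monotonicity_violation(f, lo, hi, n=100):
    """Illustrative shape-constraint penalty: total negative slope of f
    on a uniform grid over [lo, hi], approximated by finite differences.
    A value of 0.0 means the monotonicity constraint is satisfied on
    the sampled grid."""
    step = (hi - lo) / n
    grid = [lo + i * step for i in range(n + 1)]
    violation = 0.0
    for a, b in zip(grid, grid[1:]):
        slope = (f(b) - f(a)) / step
        if slope < 0:
            violation += -slope  # accumulate only decreasing segments
    return violation

# A quadratic candidate is not monotone on [-1, 1], so it incurs a
# nonzero constraint-violation objective in addition to its error.
model = lambda x: x * x
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [1.0, 0.3, 0.1, 0.2, 0.9]

objectives = (approximation_error(model, xs, ys),
              monotonicity_violation(model, -1.0, 1.0))
print(objectives)  # the (error, violation) vector to be minimized
```

A multi-objective selection scheme such as NSGA-II or NSGA-III would rank candidates by Pareto dominance over such objective vectors rather than collapsing them into a single weighted score, which is the motivation for the comparison carried out in the paper.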
