Paper Title

Multi-Objective Hyperparameter Optimization in Machine Learning -- An Overview

Paper Authors

Florian Karl, Tobias Pielok, Julia Moosbauer, Florian Pfisterer, Stefan Coors, Martin Binder, Lennart Schneider, Janek Thomas, Jakob Richter, Michel Lang, Eduardo C. Garrido-Merchán, Juergen Branke, Bernd Bischl

Paper Abstract

Hyperparameter optimization constitutes a large part of typical modern machine learning workflows. This arises from the fact that machine learning methods and corresponding preprocessing steps often only yield optimal performance when hyperparameters are properly tuned. But in many applications, we are not only interested in optimizing ML pipelines solely for predictive accuracy; additional metrics or constraints must be considered when determining an optimal configuration, resulting in a multi-objective optimization problem. This is often neglected in practice, due to a lack of knowledge and readily available software implementations for multi-objective hyperparameter optimization. In this work, we introduce the reader to the basics of multi-objective hyperparameter optimization and motivate its usefulness in applied ML. Furthermore, we provide an extensive survey of existing optimization strategies, both from the domain of evolutionary algorithms and Bayesian optimization. We illustrate the utility of MOO in several specific ML applications, considering objectives such as operating conditions, prediction time, sparseness, fairness, interpretability and robustness.
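To make the multi-objective setting concrete, below is a minimal illustrative sketch of tuning a classifier for two competing objectives: cross-validated accuracy and a crude model-complexity proxy for prediction time. The use of Optuna and scikit-learn, the chosen hyperparameters, and the complexity proxy are assumptions of this sketch and not tooling prescribed by the paper; the point is that such a search yields a set of Pareto-optimal configurations rather than a single best one.

import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    # Hyperparameters of the ML pipeline to tune
    n_estimators = trial.suggest_int("n_estimators", 10, 300)
    max_depth = trial.suggest_int("max_depth", 2, 16)
    clf = RandomForestClassifier(n_estimators=n_estimators,
                                 max_depth=max_depth, random_state=0)
    # Objective 1: predictive accuracy (to be maximized)
    accuracy = cross_val_score(clf, X, y, cv=3).mean()
    # Objective 2: crude complexity proxy for prediction time (to be minimized);
    # this proxy is an assumption made only for illustration
    complexity = n_estimators * max_depth
    return accuracy, complexity

# One optimization direction per objective
study = optuna.create_study(directions=["maximize", "minimize"])
study.optimize(objective, n_trials=30)

# study.best_trials holds the Pareto front: trials not dominated in both objectives
for t in study.best_trials:
    print(t.values, t.params)

Inspecting the resulting Pareto front lets a practitioner trade off accuracy against secondary objectives such as prediction time, sparseness, or interpretability, which is the kind of trade-off the survey discusses.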
