Title
Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces
Authors
Abstract
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search, as they achieve good predictive performance with little or no manual tuning, naturally handle discrete feature spaces, and are relatively insensitive to outliers in the training data. Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piecewise-constant acquisition function. To address both points simultaneously, we propose using the kernel interpretation of tree ensembles as a Gaussian process prior to obtain model variance estimates, and we develop a compatible optimization formulation for the acquisition function. The latter further allows us to seamlessly integrate known constraints to improve sampling efficiency by considering domain knowledge in engineering settings and modeling search-space symmetries, e.g., hierarchical relationships in neural architecture search. Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features, and outperforms competing methods on problems combining mixed-variable feature spaces and known input constraints.
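To make the "kernel interpretation of tree ensembles" concrete, the sketch below computes the standard tree-agreement kernel: k(x, x') is the fraction of trees in the ensemble that route x and x' to the same leaf, which yields a valid positive semi-definite Gram matrix usable as a Gaussian process covariance. This is a minimal illustration built on scikit-learn's `RandomForestRegressor.apply`, not the paper's implementation; the function name `tree_kernel` and the toy data are our own.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def tree_kernel(forest, X1, X2):
    """Fraction of trees that place each pair (x1, x2) in the same leaf.

    This average of per-tree indicator kernels is symmetric and positive
    semi-definite, so it can serve as a GP covariance function.
    """
    leaves1 = forest.apply(X1)  # shape (n1, n_trees): leaf index per tree
    leaves2 = forest.apply(X2)  # shape (n2, n_trees)
    # Broadcast-compare leaf assignments for all pairs, average over trees.
    return (leaves1[:, None, :] == leaves2[None, :, :]).mean(axis=2)

# Toy black-box observations (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2

forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
K = tree_kernel(forest, X, X)  # Gram matrix over the training inputs
```

Because each tree trivially agrees with itself on any point, the diagonal of `K` is exactly 1, and off-diagonal entries in [0, 1] measure how often two inputs share a leaf; the kernel is piecewise constant in the inputs, which is what motivates the dedicated acquisition-function optimization formulation mentioned in the abstract.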