Paper Title

Weighted Random Search for Hyperparameter Optimization

Authors

Florea, Adrian-Catalin, Andonie, Razvan

Abstract

We introduce an improved version of Random Search (RS), used here for hyperparameter optimization of machine learning algorithms. Unlike the standard RS, which generates new values for all hyperparameters at every trial, we generate a new value for each hyperparameter with a probability of change. The intuition behind our approach is that a value that already triggered a good result is a good candidate for the next step and should be tested in new combinations of hyperparameter values. Within the same computational budget, our method yields better results than the standard RS; our theoretical results prove this statement. We test our method on a variation of one of the most commonly used objective functions for this class of problems (the Griewank function) and on the hyperparameter optimization of a deep learning CNN architecture. Our results can be generalized to any optimization problem defined on a discrete domain.
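The core idea in the abstract can be illustrated with a short sketch: at each trial, every hyperparameter is resampled only with its own probability of change; otherwise it keeps the value from the best configuration found so far. Setting all probabilities to 1 recovers standard Random Search. This is a simplified illustration, not the authors' exact algorithm; the function and parameter names (`weighted_random_search`, `p_change`) are illustrative, and the fixed per-hyperparameter probabilities here stand in for whatever schedule the paper derives.

```python
import random

def weighted_random_search(objective, domains, p_change, n_trials, seed=0):
    """Illustrative sketch: resample each hyperparameter with its
    probability of change, otherwise keep the best value found so far.
    `domains` maps names to discrete value lists; minimizes `objective`."""
    rng = random.Random(seed)
    # First trial: sample every hyperparameter, as in standard RS.
    best = {name: rng.choice(vals) for name, vals in domains.items()}
    best_score = objective(best)
    for _ in range(n_trials - 1):
        candidate = {
            name: (rng.choice(domains[name])
                   if rng.random() < p_change[name]
                   else best[name])        # keep the value that worked
            for name in domains
        }
        score = objective(candidate)
        if score < best_score:
            best, best_score = candidate, score
    return best, best_score

# Toy usage: minimize a separable quadratic over a discrete grid.
domains = {"x": list(range(-5, 6)), "y": list(range(-5, 6))}
p_change = {"x": 0.5, "y": 0.5}
best, best_score = weighted_random_search(
    lambda h: h["x"] ** 2 + h["y"] ** 2, domains, p_change, n_trials=200)
```

Because unchanged hyperparameters are carried over from the incumbent, each evaluation spends its budget probing new combinations around values already known to work, which is the source of the improvement over standard RS claimed above.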
