Paper Title

A Simplified Run Time Analysis of the Univariate Marginal Distribution Algorithm on LeadingOnes

Paper Authors

Benjamin Doerr, Martin Krejca

Paper Abstract


With elementary means, we prove a stronger run time guarantee for the univariate marginal distribution algorithm (UMDA) optimizing the LeadingOnes benchmark function in the desirable regime with low genetic drift. If the population size is at least quasilinear, then, with high probability, the UMDA samples the optimum within a number of iterations that is linear in the problem size divided by the logarithm of the UMDA's selection rate. This improves over the previous guarantee, obtained by Dang and Lehre (2015) via the deep level-based population method, both in terms of the run time and by demonstrating further run time gains from small selection rates. With similar arguments as in our upper-bound analysis, we also obtain the first lower bound for this problem. Under similar assumptions, we prove that a bound that matches our upper bound up to constant factors holds with high probability.
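To make the setting concrete, the following is a minimal sketch of the UMDA optimizing LeadingOnes. It is not the authors' implementation; the parameter names (`lam` for the population size λ, `mu` for the number of selected individuals, so the selection rate is λ/μ) and the margin clamping to [1/n, 1 − 1/n] follow the standard formulation of the algorithm.

```python
import random

def leading_ones(x):
    """LeadingOnes fitness: the number of consecutive 1-bits
    at the start of the bit string x."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def umda(n, lam, mu, max_iters=10_000, seed=0):
    """Minimal UMDA on LeadingOnes (illustrative sketch).

    n: problem size; lam: population size; mu: number of selected
    individuals. Each iteration samples lam individuals from the
    product distribution given by the frequency vector p, selects
    the mu fittest, and sets p to their marginal frequencies,
    clamped to the standard margins [1/n, 1 - 1/n].
    Returns (iterations_used, optimum_or_None).
    """
    rng = random.Random(seed)
    p = [0.5] * n  # univariate marginal frequencies
    for it in range(max_iters):
        # Sample the population bit-wise independently.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        best = max(pop, key=leading_ones)
        if leading_ones(best) == n:
            return it + 1, best
        # Truncation selection: keep the mu fittest individuals.
        pop.sort(key=leading_ones, reverse=True)
        selected = pop[:mu]
        # Model update: frequencies become the selected marginals.
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(max(freq, 1 / n), 1 - 1 / n)
    return max_iters, None
```

With a quasilinear population size relative to the problem size (the low-genetic-drift regime discussed in the abstract), e.g. `umda(n=20, lam=200, mu=50)`, the sampled optimum is typically found within a number of iterations linear in n.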
