Paper Title
SGLB: Stochastic Gradient Langevin Boosting
Paper Authors
Paper Abstract
This paper introduces Stochastic Gradient Langevin Boosting (SGLB), a powerful and efficient machine learning framework that can handle a wide range of loss functions and comes with provable generalization guarantees. The method is based on a special form of the Langevin diffusion equation specifically designed for gradient boosting. This allows us to theoretically guarantee global convergence even for multimodal loss functions, whereas standard gradient boosting algorithms can guarantee only convergence to a local optimum. We also show empirically that SGLB outperforms classic gradient boosting when applied to classification tasks with the 0-1 loss function, which is known to be multimodal.
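To make the core idea concrete, below is a minimal, illustrative sketch of a Langevin-style boosting loop for squared loss, using scikit-learn regression trees as weak learners. This is not the paper's exact algorithm: the parameter names (`beta` for the inverse diffusion temperature, `shrinkage` for the implicit L2 regularization) and the precise noise scaling are assumptions based on the standard Euler discretization of a Langevin diffusion, where each step fits a tree to noise-perturbed negative gradients and multiplicatively shrinks the ensemble.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def sglb_fit(X, y, n_iters=100, lr=0.1, shrinkage=1e-3, beta=1e4,
             max_depth=3, seed=0):
    """Fit a Langevin-style boosted tree ensemble for squared loss.

    Illustrative sketch only; parameter names and noise scaling are
    assumptions, not the paper's exact formulation.
    """
    rng = np.random.default_rng(seed)
    F = np.zeros(len(y))          # current ensemble predictions
    trees, weights = [], []
    # Noise scale chosen so that the lr-scaled step matches the Euler
    # discretization of a Langevin diffusion with inverse temperature beta.
    noise_scale = np.sqrt(2.0 / (beta * lr))
    for _ in range(n_iters):
        # Negative gradient of squared loss, perturbed with Gaussian noise.
        target = (y - F) + rng.normal(scale=noise_scale, size=len(y))
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, target)
        # Multiplicative shrinkage acts as implicit L2 regularization;
        # decay earlier tree weights so stored weights stay consistent
        # with the shrunken ensemble predictions.
        weights = [w * (1.0 - lr * shrinkage) for w in weights]
        F = (1.0 - lr * shrinkage) * F + lr * tree.predict(X)
        trees.append(tree)
        weights.append(lr)
    return trees, weights

def sglb_predict(trees, weights, X):
    """Sum the weighted predictions of all weak learners."""
    return sum(w * t.predict(X) for t, w in zip(trees, weights))
```

On a toy regression problem, `sglb_fit(X, y)` followed by `sglb_predict` behaves like ordinary gradient boosting with a small amount of injected noise; the injected noise and the shrinkage term are what let the Langevin dynamics escape local optima when the loss is multimodal.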