Paper Title

MBGDT: Robust Mini-Batch Gradient Descent

Paper Authors

Hanming Wang, Haozheng Luo, Yue Wang

Paper Abstract

In high dimensions, most machine learning methods perform fragilely even when there are only a few outliers. To address this, we introduce a new method built on a base learner, such as Bayesian regression or stochastic gradient descent, to solve the problem of vulnerability in the model. Because mini-batch gradient descent allows for more robust convergence than batch gradient descent, we develop a method based on mini-batch gradient descent, called Mini-Batch Gradient Descent with Trimming (MBGDT). Our method shows state-of-the-art performance and greater robustness than several baselines when applied to our designed datasets.
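The abstract describes trimming within mini-batch gradient descent but does not spell out the update rule here. Below is a minimal sketch in Python of one plausible reading, assuming "trimming" means dropping the highest-loss samples from each mini-batch before taking the gradient step; the function name `mbgdt_linear_regression`, the `trim_k` parameter, and all hyperparameters are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def mbgdt_linear_regression(X, y, batch_size=32, trim_k=4,
                            lr=0.01, epochs=100, seed=0):
    """Sketch of mini-batch gradient descent with trimming for
    linear regression. Within each mini-batch, the trim_k samples
    with the largest squared residuals are treated as suspected
    outliers and excluded from the gradient update. All names and
    hyperparameters here are assumptions for illustration."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        perm = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            residuals = Xb @ w - yb
            # Keep the batch points with the smallest squared
            # residuals; drop the trim_k largest as suspected outliers.
            keep = np.argsort(residuals ** 2)[:max(len(idx) - trim_k, 1)]
            Xk, rk = Xb[keep], residuals[keep]
            # Gradient of the mean squared error over the kept points.
            grad = 2 * Xk.T @ rk / len(keep)
            w -= lr * grad
    return w

# Toy usage: a clean linear signal with a few gross outliers injected.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
w_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=500)
y[:10] += 50.0  # inject outliers
w_hat = mbgdt_linear_regression(X, y)
```

Under this reading, trimming makes each update resemble a least-trimmed-squares step at the mini-batch level: gross outliers produce the largest residuals, so they are discarded before they can pull the gradient off course.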
