Paper Title
On the Convergence to a Global Solution of Shuffling-Type Gradient Algorithms
Paper Authors
Paper Abstract
The stochastic gradient descent (SGD) algorithm is the method of choice in many machine learning tasks thanks to its scalability and efficiency in dealing with large-scale problems. In this paper, we focus on the shuffling version of SGD, which matches mainstream practical heuristics. We show the convergence of shuffling SGD to a global solution for a class of non-convex functions in over-parameterized settings. Compared with the previous literature, our analysis employs more relaxed non-convex assumptions. Nevertheless, we maintain the computational complexity that shuffling SGD has achieved in the general convex setting.
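For readers unfamiliar with the scheme, below is a minimal Python sketch of a shuffling-type (random-reshuffling) SGD loop of the kind the abstract refers to, assuming a finite-sum objective f(w) = (1/n) * sum_i f_i(w). The names shuffling_sgd and grads, the toy least-squares problem, and all hyperparameters are illustrative assumptions, not the paper's notation or experiments:

import numpy as np

def shuffling_sgd(grads, w0, n, lr, epochs, rng=None):
    """Shuffling-type SGD for f(w) = (1/n) * sum_i f_i(w).

    grads : callable grads(w, i) returning the gradient of the i-th
            component f_i at w (illustrative interface).
    """
    rng = np.random.default_rng() if rng is None else rng
    w = w0.copy()
    for _ in range(epochs):
        perm = rng.permutation(n)   # reshuffle once per epoch
        for i in perm:              # one pass: each component used exactly once
            w -= lr * grads(w, i)   # incremental step on the i-th gradient
    return w

# Toy usage: least squares with f_i(w) = 0.5 * (x_i @ w - y_i)**2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true
g = lambda w, i: (X[i] @ w - y[i]) * X[i]   # gradient of f_i
w_hat = shuffling_sgd(g, np.zeros(5), n=100, lr=0.05, epochs=50, rng=rng)

The key difference from vanilla SGD is sampling without replacement: each epoch draws a fresh permutation and visits every component exactly once, matching the epoch-based data passes common in practice.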