Paper Title

A general framework for decentralized optimization with first-order methods

Paper Authors

Ran Xin, Shi Pu, Angelia Nedić, Usman A. Khan

Paper Abstract

Decentralized optimization to minimize a finite sum of functions over a network of nodes has been a significant focus within control and signal processing research due to its natural relevance to optimal control and signal estimation problems. More recently, the emergence of sophisticated computing and large-scale data science needs have led to a resurgence of activity in this area. In this article, we discuss decentralized first-order gradient methods, which have found tremendous success in control, signal processing, and machine learning problems, where such methods, due to their simplicity, serve as the first method of choice for many complex inference and training tasks. In particular, we provide a general framework of decentralized first-order methods that is applicable to undirected and directed communication networks alike, and show that much of the existing work on optimization and consensus can be related explicitly to this framework. We further extend the discussion to decentralized stochastic first-order methods that rely on stochastic gradients at each node and describe how local variance reduction schemes, previously shown to have promise in the centralized settings, are able to improve the performance of decentralized methods when combined with what is known as gradient tracking. We motivate and demonstrate the effectiveness of the corresponding methods in the context of machine learning and signal processing problems that arise in decentralized environments.
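As a rough illustration of the gradient-tracking idea mentioned in the abstract, the sketch below runs decentralized gradient descent with gradient tracking on a toy least-squares problem over a ring network. The network size, local data, mixing matrix W, and step size alpha are illustrative assumptions, not the paper's setup or experiments; in the stochastic variants the abstract refers to, the local full gradients would be replaced by (variance-reduced) stochastic gradient estimates.

```python
# Minimal sketch (assumptions, not the paper's exact formulation) of
# decentralized gradient descent with gradient tracking.
# Each node i holds local data (A_i, b_i) and the network minimizes the sum of
# f_i(x) = 0.5 * ||A_i x - b_i||^2 using only neighbor-to-neighbor averaging
# over a ring with a doubly stochastic mixing matrix W.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, m_local = 8, 5, 20

# Local data and local gradients (illustrative synthetic problem).
A = [rng.standard_normal((m_local, dim)) for _ in range(n_nodes)]
b = [rng.standard_normal(m_local) for _ in range(n_nodes)]
grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic weights for a ring: each node averages with its two neighbors.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

alpha = 2e-3                                            # constant step size (assumed)
x = np.zeros((n_nodes, dim))                            # x[i] is node i's estimate
y = np.array([grad(i, x[i]) for i in range(n_nodes)])   # gradient trackers

for _ in range(3000):
    x_new = W @ x - alpha * y                           # mix with neighbors, step along tracker
    g_old = np.array([grad(i, x[i]) for i in range(n_nodes)])
    g_new = np.array([grad(i, x_new[i]) for i in range(n_nodes)])
    y = W @ y + g_new - g_old                           # y tracks the network-average gradient
    x = x_new

# Compare against the centralized least-squares solution.
x_star, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
print("max node error:", np.abs(x - x_star).max())
```

The key design point illustrated here is that the tracker y preserves the sum of local gradients across iterations, which is what allows a constant step size to drive every node to the exact minimizer rather than to a neighborhood of it.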
