Paper Title

A General Framework for Distributed Partitioned Optimization

Authors

Savelii Chezhegov, Anton Novitskii, Alexander Rogozin, Sergei Parsegov, Pavel Dvurechensky, Alexander Gasnikov

Abstract

Decentralized optimization is widely used in large-scale and privacy-preserving machine learning, as well as in various distributed control and sensing systems. It is assumed that every agent in the network possesses a local objective function and that the nodes interact via a communication network. In the standard scenario, which is the one mostly studied in the literature, the local functions depend on a common set of variables, and therefore the whole variable set has to be sent at each communication round. In this work, we study a different problem statement, where each of the local functions held by the nodes depends only on some subset of the variables. Given a network, we build a general, algorithm-independent framework for decentralized partitioned optimization that allows one to construct algorithms with reduced communication load using a generalization of the Laplacian matrix. Moreover, our framework yields algorithms with non-asymptotic convergence rates that depend explicitly on the parameters of the network, including accelerated and optimal first-order methods. We illustrate the efficacy of our approach on a synthetic example.
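
The partitioned setting the abstract describes can be made concrete with a small sketch. Below is a minimal, hypothetical Python example, not the paper's algorithm: each node stores local copies of only the variable components its objective touches, and a per-component disagreement penalty (playing the role of the generalized Laplacian) couples the nodes that share a component. The node layout, quadratic objectives, step size, and penalty weight are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of decentralized partitioned optimization (assumptions
# throughout; not the paper's method). Three nodes, two scalar components.

S = [[0], [0, 1], [1]]          # S[i]: variable components node i depends on
b = [np.array([1.0]),           # hypothetical local objectives:
     np.array([2.0, -1.0]),     # f_i(x_{S_i}) = 0.5 * ||x_{S_i} - b_i||^2
     np.array([3.0])]
x = [np.zeros(len(Si)) for Si in S]   # local copies of the relevant components

nodes_of = {0: [0, 1], 1: [1, 2]}     # which nodes share each component
eta, rho = 0.1, 10.0                  # step size and penalty weight (assumed)

for _ in range(300):
    grads = [x[i] - b[i] for i in range(3)]   # local gradient steps
    pen = [np.zeros_like(xi) for xi in x]
    for j, nodes in nodes_of.items():         # Laplacian-style penalty: pull
        mean = np.mean([x[i][S[i].index(j)] for i in nodes])
        for i in nodes:                       # each copy toward the mean over
            k = S[i].index(j)                 # the nodes sharing component j
            pen[i][k] += x[i][k] - mean
    for i in range(3):
        x[i] -= eta * (grads[i] + rho * pen[i])

# Copies of each shared component approximately agree (a penalty method is
# biased; exact consensus needs dual or gradient-tracking corrections).
print([xi.round(2) for xi in x])
```

The point of the sketch is that nodes 0 and 2 never exchange anything: communication happens only between nodes that share a variable component, which is the source of the reduced communication load mentioned in the abstract.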
