Paper Title
Towards accelerated rates for distributed optimization over time-varying networks
Paper Authors
Paper Abstract
We study the problem of decentralized optimization over time-varying networks with strongly convex smooth cost functions. In our approach, nodes run a multi-step gossip procedure after making each gradient update, thus ensuring approximate consensus at each iteration, while the outer loop is based on an accelerated Nesterov scheme. The algorithm achieves precision $\varepsilon > 0$ in $O(\sqrt{\kappa_g}\,\chi\log^2(1/\varepsilon))$ communication steps and $O(\sqrt{\kappa_g}\log(1/\varepsilon))$ gradient computations at each node, where $\kappa_g$ is the condition number of the global objective and $\chi$ characterizes the connectivity of the communication network. In the case of a static network, $\chi = 1/\gamma$, where $\gamma$ denotes the normalized spectral gap of the communication matrix $\mathbf{W}$. The complexity bound depends on $\kappa_g$, which can be significantly better than the worst-case condition number among the nodes.
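To make the nested structure concrete, the following is a minimal sketch of the scheme the abstract describes: a Nesterov-accelerated outer loop in which each node takes a local gradient step and then runs several rounds of gossip averaging with a mixing matrix $\mathbf{W}$. The function names, the momentum coefficient, and the use of a single static mixing matrix are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def accelerated_gossip_descent(grads, W, x0, L, mu, n_outer, n_gossip):
    """Sketch: Nesterov outer loop + multi-step gossip inner loop.

    grads    -- list of n per-node gradient oracles, grads[i](x) -> ndarray(d)
    W        -- (n, n) symmetric doubly stochastic mixing matrix (assumed static)
    x0       -- (n, d) initial local iterates, one row per node
    L, mu    -- smoothness and strong convexity constants of the global objective
    n_outer  -- number of outer (gradient) iterations
    n_gossip -- gossip rounds per outer iteration (hypothetical fixed count)
    """
    n, d = x0.shape
    kappa = L / mu
    alpha = 1.0 / L                                     # gradient step size
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # standard momentum choice
    x, y = x0.copy(), x0.copy()
    for _ in range(n_outer):
        # Each node takes a gradient step using only its local function.
        g = np.stack([grads[i](y[i]) for i in range(n)])
        x_new = y - alpha * g
        # Multi-step gossip: repeated mixing drives nodes toward consensus.
        for _ in range(n_gossip):
            x_new = W @ x_new
        # Nesterov extrapolation on the (approximately) averaged iterates.
        y = x_new + beta * (x_new - x)
        x = x_new
    return x.mean(axis=0)
```

Repeating the gossip step shrinks the consensus error geometrically, which is what lets the outer loop behave like centralized Nesterov acceleration; the $\chi\log(1/\varepsilon)$ factor in the communication bound reflects the number of gossip rounds needed per iteration.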