Title
Byzantine-Robust Decentralized Learning via ClippedGossip
Authors
Abstract
In this paper, we study the challenging task of Byzantine-robust decentralized training on arbitrary communication graphs. Unlike federated learning, where workers communicate through a server, workers in the decentralized environment can only talk to their neighbors, making it harder to reach consensus and benefit from collaborative training. To address these issues, we propose a ClippedGossip algorithm for Byzantine-robust consensus and optimization, which is the first to provably converge to a $O(\delta_{\max}\zeta^2/\gamma^2)$ neighborhood of the stationary point for non-convex objectives under standard assumptions. Finally, we demonstrate the encouraging empirical performance of ClippedGossip under a large number of attacks.
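The core consensus step behind a clipped-gossip scheme can be sketched as follows: each worker averages its neighbors' values, but clips each neighbor's displacement to a radius $\tau$ around its own value, so a Byzantine neighbor's influence per round is bounded. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names, the scalar shared clipping radius `tau`, and the mixing-matrix convention are assumptions made here for illustration.

```python
import numpy as np

def clip(z, tau):
    """Scale z so that its Euclidean norm is at most tau."""
    norm = np.linalg.norm(z)
    if norm <= tau or norm == 0.0:
        return z
    return (tau / norm) * z

def clipped_gossip_step(x, W, tau, gamma=1.0):
    """One clipped-gossip consensus step (illustrative sketch).

    x:     (n, d) array, one parameter vector per worker
    W:     (n, n) nonnegative mixing matrix over the communication graph
    tau:   clipping radius (a single scalar here, for simplicity)
    gamma: consensus step size

    Each worker i moves toward its neighbors j, but the displacement
    x_j - x_i is clipped to norm at most tau, so any single (possibly
    Byzantine) neighbor can pull worker i by at most gamma * W[i, j] * tau.
    """
    n = x.shape[0]
    x_new = x.copy()
    for i in range(n):
        update = np.zeros_like(x[i])
        for j in range(n):
            if W[i, j] > 0:
                update += W[i, j] * clip(x[j] - x[i], tau)
        x_new[i] = x[i] + gamma * update
    return x_new
```

With plain gossip averaging, a single worker reporting an arbitrarily large value can drag the average arbitrarily far; with clipping, its per-round influence on each honest neighbor is capped by the radius, which is the mechanism that makes consensus robust to Byzantine participants.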