Paper Title

Global Update Guided Federated Learning

Authors

Qilong Wu, Lin Liu, Shibei Xue

Abstract

Federated learning protects data privacy and security by exchanging models instead of data. However, unbalanced data distributions among participating clients compromise the accuracy and convergence speed of federated learning algorithms. To alleviate this problem, unlike previous studies that limit the update distance of local models, we propose global-update-guided federated learning (FedGG), which introduces a model-cosine loss into the local objective functions so that local models can fit local data distributions under the guidance of the global model's update direction. Furthermore, considering that the update direction of the global model is informative in the early stage of training, we propose adaptive loss weights based on the update distances of local models. Numerical simulations show that, compared with other advanced algorithms, FedGG achieves significant improvements in model convergence accuracy and speed. Additionally, compared with traditional fixed loss weights, adaptive loss weights make our algorithm more stable and easier to implement in practice.
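The abstract describes the method only at a high level, so the snippet below is a minimal PyTorch-style sketch of the idea, not the paper's actual formulas: it assumes the global update direction is reconstructed from two consecutive global models, and the function name `fedgg_style_local_loss`, the default `mu0`, and the `1 / (1 + distance)` decay of the adaptive weight are all hypothetical choices for illustration.

```python
import torch
import torch.nn.functional as F

def flat(model):
    # Concatenate all parameters into one vector (keeps the autograd graph).
    return torch.cat([p.view(-1) for p in model.parameters()])

def fedgg_style_local_loss(task_loss, local_model, global_model,
                           prev_global_model, mu0=0.1):
    # Direction of the latest global update (previous round -> current round);
    # treated as a constant during local training.
    with torch.no_grad():
        g_dir = flat(global_model) - flat(prev_global_model)
    # Direction of the current local update, measured from the received global model.
    l_dir = flat(local_model) - flat(global_model).detach()
    # Model-cosine loss: shrinks as the local update aligns with the global update.
    cos_loss = 1.0 - F.cosine_similarity(l_dir, g_dir, dim=0)
    # Assumed adaptive weight: decays with the local update distance, so the
    # guidance term matters most early in local training (hypothetical form).
    mu = mu0 / (1.0 + l_dir.norm().detach())
    return task_loss + mu * cos_loss
```

In each local step a client would compute its usual task loss, e.g. `F.cross_entropy(local_model(x), y)`, pass it through this function, and backpropagate the returned total, so the cosine term steers early local updates toward the global update direction and fades as the local model drifts farther from the received global model.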
