Paper Title

Dynamic Sampling and Selective Masking for Communication-Efficient Federated Learning

Authors

Shaoxiong Ji, Wenqi Jiang, Anwar Walid, Xue Li

Abstract

Federated learning (FL) is a machine learning setting that enables on-device intelligence via decentralized training and federated optimization. The rapid development of deep neural networks has provided powerful techniques for modeling complex problems, which, under the federated setting, gives rise to federated deep learning. However, the large number of model parameters imposes a heavy transmission load on the communication network. This paper introduces two approaches for improving communication efficiency: dynamic sampling and top-$k$ selective masking. The former dynamically controls the fraction of clients selected in each round, while the latter transmits only the parameters whose differences from the global model are among the top-$k$ largest. Experiments on convolutional image classification and recurrent language modeling, conducted on three public datasets, demonstrate the effectiveness of the proposed methods.
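To make the top-$k$ selective masking idea concrete, below is a minimal NumPy sketch: a client computes its update as the difference between its local weights and the global weights, then zeros out all but the $k$ entries with the largest absolute difference before transmission. This is an illustrative reconstruction from the abstract, not the authors' implementation; the function name `topk_mask` and the toy tensors are hypothetical.

```python
import numpy as np

def topk_mask(global_w: np.ndarray, local_w: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k entries of the client update with the largest
    absolute difference from the global model; mask the rest to zero
    so they need not be transmitted."""
    diff = local_w - global_w
    # indices of the k largest |differences| over the flattened tensor
    idx = np.argpartition(np.abs(diff).ravel(), -k)[-k:]
    mask = np.zeros(diff.size, dtype=bool)
    mask[idx] = True
    return np.where(mask.reshape(diff.shape), diff, 0.0)

# toy example: a 2x3 "layer", keep only the 2 largest-difference weights
g = np.zeros((2, 3))
l = np.array([[0.1, -0.5, 0.2],
              [0.9,  0.0, -0.3]])
u = topk_mask(g, l, k=2)  # only the entries 0.9 and -0.5 survive
```

In a full FL round, each client would send its sparse update `u` (e.g. as index–value pairs) to the server, which aggregates the received updates into the global model; how ties and per-layer vs. global selection are handled is a design choice not specified by the abstract.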
