Paper Title

Federated Learning with Matched Averaging

Authors

Hongyi Wang, Mikhail Yurochkin, Yuekai Sun, Dimitris Papailiopoulos, Yasaman Khazaeni

Abstract

Federated learning allows edge devices to collaboratively learn a shared model while keeping the training data on device, decoupling the ability to do model training from the need to store the data in the cloud. We propose the Federated Matched Averaging (FedMA) algorithm, designed for federated learning of modern neural network architectures, e.g., convolutional neural networks (CNNs) and LSTMs. FedMA constructs the shared global model in a layer-wise manner by matching and averaging hidden elements (i.e., channels for convolution layers, hidden states for LSTMs, and neurons for fully connected layers) with similar feature extraction signatures. Our experiments indicate that FedMA not only outperforms popular state-of-the-art federated learning algorithms on deep CNN and LSTM architectures trained on real-world datasets, but also reduces the overall communication burden.
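The core idea of matching before averaging can be illustrated with a toy sketch. Because a neural network's neurons have no canonical ordering, naively averaging two clients' weight matrices can average unrelated neurons together; aligning neurons by similarity first avoids this. The sketch below (an assumption for illustration, not FedMA's actual BBP-MAP formulation or its layer-wise communication protocol) uses the Hungarian algorithm on pairwise L2 distances to align one client's neurons to another's before averaging:

```python
# Illustrative sketch of the matched-averaging idea: align neurons (rows of
# a layer's weight matrix) across two clients before averaging them.
# NOTE: this is a simplified stand-in; FedMA itself uses a Bayesian
# nonparametric matching objective (BBP-MAP) and proceeds layer by layer.
import numpy as np
from scipy.optimize import linear_sum_assignment

def matched_average(weight_a, weight_b):
    """Average two clients' layer weights after permuting B's neurons to match A's."""
    # cost[i, j] = L2 distance between neuron i of client A and neuron j of client B
    cost = np.linalg.norm(weight_a[:, None, :] - weight_b[None, :, :], axis=2)
    row, col = linear_sum_assignment(cost)  # optimal one-to-one neuron matching
    aligned_b = weight_b[col]               # reorder B's neurons to align with A's
    return 0.5 * (weight_a + aligned_b)

# Toy example: client B holds the same neurons as client A, but permuted.
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 8))        # 4 neurons, 8 inputs each
b = a[[2, 0, 3, 1]]                # permuted copy of A's neurons
avg = matched_average(a, b)
# Matching undoes the permutation, so the matched average recovers A exactly,
# whereas a naive elementwise average of a and b would not.
assert np.allclose(avg, a)
assert not np.allclose(0.5 * (a + b), a)
```

The toy case makes the payoff concrete: when two clients learn the same features in different orders, matched averaging recovers the shared model, while plain coordinate-wise averaging (as in FedAvg) blurs mismatched neurons together.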
