Paper Title
Heterogeneous Data-Aware Federated Learning
Paper Authors
Paper Abstract
Federated learning (FL) is an appealing concept for performing distributed training of neural networks (NNs) while keeping data private. With the industrialization of the FL framework, we identify several problems hampering its successful deployment, such as the presence of non-i.i.d. data, disjoint classes, and signal multi-modality across datasets. In this work, we address these problems by proposing a novel method that not only (1) aggregates generic model parameters (e.g., a common set of task-generic NN layers) on the server, as in traditional FL, but also (2) keeps a set of parameters (e.g., a set of task-specific NN layers) specific to each client. We validate our method on traditionally used public benchmarks (e.g., Femnist) as well as on a proprietary dataset we collected (i.e., traffic classification). Results show the benefit of our method, with a significant advantage in extreme cases.
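The split described in the abstract, with generic layers aggregated on the server and task-specific layers kept on each client, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the layer names, the `local_update` placeholder, and the FedAvg-style weighted averaging are all illustrative choices.

```python
# Minimal sketch of split aggregation: generic layers are averaged on the
# server (FedAvg-style), task-specific layers stay private to each client.
# All names below (GENERIC_KEYS, local_update, etc.) are hypothetical.
import numpy as np

GENERIC_KEYS = ["feature_extractor.w", "feature_extractor.b"]  # shared across clients
SPECIFIC_KEYS = ["task_head.w", "task_head.b"]                 # kept per client


def init_params(rng, in_dim=16, hidden=8, out_dim=4):
    """Random parameters for a tiny two-layer model."""
    return {
        "feature_extractor.w": rng.normal(size=(in_dim, hidden)),
        "feature_extractor.b": np.zeros(hidden),
        "task_head.w": rng.normal(size=(hidden, out_dim)),
        "task_head.b": np.zeros(out_dim),
    }


def local_update(params, rng, lr=0.01):
    """Placeholder for one client's local training step (random perturbation here)."""
    return {k: v - lr * rng.normal(size=v.shape) for k, v in params.items()}


def server_aggregate(client_params, weights):
    """Weighted average of the *generic* parameters only."""
    total = sum(weights)
    return {
        k: sum(w * p[k] for w, p in zip(weights, client_params)) / total
        for k in GENERIC_KEYS
    }


rng = np.random.default_rng(0)
clients = [init_params(rng) for _ in range(3)]
sizes = [100, 250, 50]  # e.g., number of local samples per client

for round_ in range(5):
    # 1) each client trains locally on its own (possibly non-i.i.d.) data
    clients = [local_update(p, rng) for p in clients]
    # 2) the server aggregates only the generic layers
    shared = server_aggregate(clients, sizes)
    # 3) clients adopt the aggregated generic layers, keep their task-specific head
    for p in clients:
        p.update(shared)
```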