Paper Title

DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing

Paper Authors

Jiawei Shao, Yuchang Sun, Songze Li, Jun Zhang

Paper Abstract

Federated learning (FL) strives to enable collaborative training of machine learning models without centrally collecting clients' private data. Unlike in centralized training, the local datasets across clients in FL are not independent and identically distributed (non-IID). In addition, data-owning clients may drop out of the training process arbitrarily. Both characteristics can significantly degrade training performance. This paper proposes a Dropout-Resilient Secure Federated Learning (DReS-FL) framework based on Lagrange coded computing (LCC) to tackle both the non-IID and dropout problems. The key idea is to use Lagrange coding to secretly share the private datasets among clients, so that each client receives an encoded version of the global dataset and its local gradient computation over this dataset is unbiased. For the server to correctly decode the gradient, the gradient function must be a polynomial over a finite field, and we therefore construct polynomial integer neural networks (PINNs) to enable the framework. Theoretical analysis shows that DReS-FL is resilient to client dropouts and provides privacy protection for the local datasets. Furthermore, we experimentally demonstrate that DReS-FL consistently yields significant performance gains over baseline methods.
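To make the coding idea concrete, below is a minimal sketch of Lagrange coded secret sharing over a prime field, the primitive underlying LCC: K data blocks plus T random masks are interpolated by a single polynomial, and each client receives one evaluation of that polynomial. All names, evaluation points, and the toy field size here are illustrative assumptions, not the authors' implementation, and the evaluation of polynomial gradients (PINNs) on the encoded shares is omitted.

```python
import random

P = 2_147_483_647  # a prime modulus; all arithmetic is in GF(P)

def _interp_eval(xs, ys, x):
    """Evaluate the Lagrange polynomial through points (xs, ys) at x, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        num, den = 1, 1
        for j, xj in enumerate(xs):
            if j != i:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

def lagrange_encode(blocks, num_clients, num_masks=1):
    """Share `blocks` (field elements) among `num_clients` clients.

    The encoding polynomial u satisfies u(i) = blocks[i-1] at the data
    points and takes uniformly random values at `num_masks` extra points,
    which hides the data from small sets of colluding clients.
    """
    k = len(blocks)
    xs = list(range(1, k + num_masks + 1))                 # interpolation points
    ys = list(blocks) + [random.randrange(P) for _ in range(num_masks)]
    alphas = list(range(k + num_masks + 1,
                        k + num_masks + 1 + num_clients))  # one point per client
    shares = [_interp_eval(xs, ys, a) for a in alphas]
    return shares, alphas

def lagrange_decode(shares, alphas, x_target):
    """Recover u(x_target) from any deg(u)+1 surviving shares."""
    return _interp_eval(alphas, shares, x_target)

# Toy usage: K = 2 data blocks, T = 1 mask, 5 clients, 2 of whom drop out.
blocks = [42, 7]
shares, alphas = lagrange_encode(blocks, num_clients=5, num_masks=1)
survivors = [0, 2, 4]
recovered = [lagrange_decode([shares[i] for i in survivors],
                             [alphas[i] for i in survivors], x)
             for x in (1, 2)]
assert recovered == blocks
```

With K = 2 blocks and T = 1 mask the encoding polynomial has degree 2, so any 3 of the 5 shares reconstruct the data (resilience to 2 dropouts), while any single share is uniformly random and reveals nothing about the blocks; increasing T strengthens privacy at the cost of tolerating fewer dropouts. In DReS-FL itself, clients would evaluate a polynomial gradient function (a PINN) on their shares and the server would decode the aggregate gradient by the same interpolation, a step this sketch does not cover.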
