Paper Title

Cluster Based Secure Multi-Party Computation in Federated Learning for Histopathology Images

Paper Authors

S. Maryam Hosseini, Milad Sikaroudi, Morteza Babaei, H. R. Tizhoosh

Paper Abstract

Federated learning (FL) is a decentralized method enabling hospitals to collaboratively learn a model without sharing private patient data for training. In FL, participating hospitals periodically exchange training results, rather than raw training samples, with a central server. However, access to model parameters or gradients can still expose private training data samples. To address this challenge, we adopt secure multi-party computation (SMC) to establish a privacy-preserving federated learning framework. In our proposed method, the hospitals are divided into clusters. After local training, each hospital splits its model weights among the other hospitals in the same cluster, such that no single hospital can retrieve another hospital's weights on its own. Each hospital then sums the weight shares it receives and sends the result to the central server. Finally, the central server aggregates these results, recovering the average of the models' weights and updating the global model without ever accessing any individual hospital's weights. We conduct experiments on a publicly available repository, The Cancer Genome Atlas (TCGA). We compare the performance of the proposed framework against differential privacy, with federated averaging as the baseline. The results reveal that, compared to differential privacy, our framework achieves higher accuracy with no privacy leakage risk, at the cost of higher communication overhead.
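
To make the aggregation step concrete, below is a minimal Python/NumPy sketch of additive secret sharing within a single cluster, in the spirit of the protocol the abstract describes. The function names are hypothetical, and floats stand in for the finite-field arithmetic a real SMC scheme would use; this is an illustration under those assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal sketch of cluster-level additive secret sharing for federated
# averaging, assuming model weights are flattened NumPy float vectors.
# Hypothetical helper names; real SMC protocols share integers over a
# finite field, while floats are used here only for clarity.

def make_shares(weights, n_parties, rng):
    """Split a weight vector into n_parties additive shares that sum back to it."""
    shares = [rng.normal(size=weights.shape) for _ in range(n_parties - 1)]
    shares.append(weights - np.sum(shares, axis=0))  # final share completes the sum
    return shares

def cluster_round(local_weights, rng):
    """Each hospital sends one share to every cluster member; each member
    returns the sum of the shares it received (a masked partial sum)."""
    n = len(local_weights)
    all_shares = [make_shares(w, n, rng) for w in local_weights]
    # partial sum j = sum over hospitals i of the j-th share from hospital i
    return [np.sum([all_shares[i][j] for i in range(n)], axis=0) for j in range(n)]

def server_aggregate(partial_sums, n_hospitals):
    """The server adds the masked partial sums; the random masks cancel,
    leaving only the average of the hospitals' weights."""
    return np.sum(partial_sums, axis=0) / n_hospitals

rng = np.random.default_rng(0)
hospitals = [rng.normal(size=4) for _ in range(3)]   # toy local model weights
avg = server_aggregate(cluster_round(hospitals, rng), len(hospitals))
assert np.allclose(avg, np.mean(hospitals, axis=0))  # matches plain FedAvg
```

Because each peer sees only one random share per hospital, no single party can reconstruct another's weights, while the server recovers exactly the average it would compute under federated averaging.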
