Title
Federated learning with class imbalance reduction
Authors
Abstract
Federated learning (FL) is a promising technique that enables a large number of edge-computing devices to collaboratively train a global learning model. Due to privacy concerns, the raw data on the devices cannot be made available to the centralized server. Constrained by spectrum limitations and computation capacity, only a subset of the devices can be engaged to train locally and transmit their trained models to the centralized server for aggregation. Since the local data distribution varies across devices, a class imbalance problem arises under unfavorable client selection, resulting in a slow convergence rate for the global model. In this paper, an estimation scheme is designed to reveal the class distribution without access to the raw data. Based on this scheme, a device selection algorithm that minimizes class imbalance is proposed, which improves the convergence performance of the global model. Simulation results demonstrate the effectiveness of the proposed algorithm.
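The core idea of class-imbalance-aware device selection can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes each device's per-class data distribution has already been estimated (the paper's estimation scheme is not reproduced here), and it measures imbalance as the L2 distance between the aggregated class distribution of the selected devices and the uniform distribution. The function names (`imbalance`, `aggregate`, `select_devices`) and the greedy strategy are illustrative assumptions.

```python
def imbalance(dist):
    """L2 distance between a class distribution and the uniform distribution.

    Illustrative imbalance metric; the paper may use a different measure.
    """
    k = len(dist)
    return sum((p - 1.0 / k) ** 2 for p in dist) ** 0.5


def aggregate(dists):
    """Average the class distributions of the selected devices.

    Assumes equal data volume per device for simplicity.
    """
    k = len(dists[0])
    return [sum(d[c] for d in dists) / len(dists) for c in range(k)]


def select_devices(class_dists, budget):
    """Greedily pick `budget` devices whose combined data is most balanced.

    class_dists: one estimated class distribution (list of probabilities)
    per candidate device. Returns the indices of the selected devices.
    """
    selected = []
    remaining = list(range(len(class_dists)))
    for _ in range(budget):
        # At each step, add the device that most reduces the imbalance
        # of the aggregated class distribution.
        best = min(
            remaining,
            key=lambda i: imbalance(
                aggregate([class_dists[j] for j in selected + [i]])
            ),
        )
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, with two-class distributions `[[0.9, 0.1], [0.1, 0.9], [0.8, 0.2]]` and a budget of 2, the greedy selection ends up pairing devices whose skews cancel, yielding a near-uniform aggregate.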