Paper Title

Toward Multiple Federated Learning Services Resource Sharing in Mobile Edge Networks

Authors

Nguyen, Minh N. H., Tran, Nguyen H., Tun, Yan Kyaw, Han, Zhu, Hong, Choong Seon

Abstract

Federated Learning is a new learning scheme for collaboratively training a shared prediction model while keeping data locally on participating devices. In this paper, we study a new model of multiple federated learning services at the multi-access edge computing server. Accordingly, the sharing of CPU resources among learning services at each mobile device for the local training process, and the allocation of communication resources among mobile devices for exchanging learning information, must be considered. Furthermore, the convergence performance of the different learning services depends on a hyper-learning-rate parameter that needs to be precisely decided. Toward this end, we propose a joint resource optimization and hyper-learning-rate control problem, namely MS-FEDL, which accounts for the energy consumption of mobile devices and the overall learning time. We design a centralized algorithm based on the block coordinate descent method and a decentralized JP-miADMM algorithm for solving the MS-FEDL problem. Unlike the centralized approach, the decentralized approach requires more iterations to converge, but it allows each learning service to independently manage its local resources and learning process without revealing learning-service information. Our simulation results demonstrate the convergence of our proposed algorithms and their superior performance compared to a heuristic strategy.
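The centralized algorithm mentioned above relies on block coordinate descent (BCD): the joint problem is split into blocks of variables, and each block is minimized in turn while the others are held fixed. The following is a minimal sketch of this idea on a toy two-block quadratic objective; the variable names and the objective are illustrative assumptions, not the paper's actual MS-FEDL formulation.

```python
# Minimal BCD sketch on a toy objective f(x, y) = (x - 1)^2 + (y - 2)^2 + x*y.
# Here x and y stand in for two coupled decision blocks (e.g., CPU shares and
# communication resources in the MS-FEDL setting); the objective itself is a
# hypothetical surrogate chosen only so each block has a closed-form minimizer.

def block_coordinate_descent(iters: int = 50) -> tuple[float, float]:
    x, y = 0.0, 0.0  # initial point
    for _ in range(iters):
        # Minimize over x with y fixed: df/dx = 2(x - 1) + y = 0
        x = 1.0 - y / 2.0
        # Minimize over y with the updated x fixed: df/dy = 2(y - 2) + x = 0
        y = 2.0 - x / 2.0
    return x, y

x_star, y_star = block_coordinate_descent()
print(x_star, y_star)  # converges to the stationary point (0, 2)
```

Each sweep contracts the error by a constant factor for this strongly convex objective, which is why the centralized BCD approach can converge quickly when all learning-service information is available in one place.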
