Paper Title
Better Methods and Theory for Federated Learning: Compression, Client Selection and Heterogeneity
Paper Authors
Paper Abstract
Federated learning (FL) is an emerging machine learning paradigm involving multiple clients, e.g., mobile phone devices, with an incentive to collaborate in solving a machine learning problem coordinated by a central server. FL was proposed in 2016 by Konečný et al. and McMahan et al. as a viable privacy-preserving alternative to traditional centralized machine learning since, by construction, the training data points are decentralized and never transferred by the clients to a central server. Therefore, to a certain degree, FL mitigates the privacy risks associated with centralized data collection. Unfortunately, optimization for FL faces several specific issues that centralized optimization usually does not need to handle. In this thesis, we identify several of these challenges and propose new methods and algorithms to address them, with the ultimate goal of enabling practical FL solutions supported by mathematically rigorous guarantees.
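
To make the training loop concrete, below is a minimal FedAvg-style sketch in the spirit of McMahan et al.: in each communication round, a sampled subset of clients runs local updates on data that never leaves the device, and the server only averages the resulting model parameters. All names, the quadratic local loss, and the toy data are illustrative assumptions, not code from the thesis.

import numpy as np

# Each client holds private data (X, y) that is never sent to the server;
# only model parameters travel between server and clients.

def local_sgd(w, X, y, lr=0.1, steps=10):
    """A few SGD steps on one client's private least-squares objective."""
    for _ in range(steps):
        i = np.random.randint(len(y))
        grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5 * (x_i^T w - y_i)^2
        w = w - lr * grad
    return w

def fedavg_round(w_global, client_data, sampled):
    """One round: sampled clients train locally; the server averages the models."""
    local_models = [local_sgd(w_global.copy(), *client_data[c]) for c in sampled]
    return np.mean(local_models, axis=0)

# Toy run: 5 clients, each with its own (possibly heterogeneous) local dataset.
rng = np.random.default_rng(0)
d = 3
clients = [(rng.normal(size=(20, d)), rng.normal(size=20)) for _ in range(5)]
w = np.zeros(d)
for _ in range(50):
    sampled = rng.choice(len(clients), size=3, replace=False)  # client selection
    w = fedavg_round(w, clients, sampled)
print("final model:", w)

The per-round sampling step mirrors the client-selection theme in the title: only a subset of clients participates in any given communication round.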