Paper Title

Federated Representation Learning via Maximal Coding Rate Reduction

Paper Authors

Juan Cervino, Navid NaderiAlizadeh, Alejandro Ribeiro

Abstract

We propose a federated methodology to learn low-dimensional representations from a dataset that is distributed among several clients. In particular, we move away from the commonly-used cross-entropy loss in federated learning, and seek to learn shared low-dimensional representations of the data in a decentralized manner via the principle of maximal coding rate reduction (MCR2). Our proposed method, which we refer to as FLOW, utilizes MCR2 as the objective of choice, hence resulting in representations that are both between-class discriminative and within-class compressible. We theoretically show that our distributed algorithm achieves a first-order stationary point. Moreover, we demonstrate, via numerical experiments, the utility of the learned low-dimensional representations.
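The abstract's key ingredient is the MCR2 objective: the coding rate of the full representation minus the class-conditional coding rates, so that maximizing it expands the volume spanned across classes while compressing each class. A minimal NumPy sketch of this objective, as it is typically defined, is below; the function names, the default precision parameter `eps`, and the `d x n` data layout are illustrative choices, not taken from the paper.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 * logdet(I + d / (n * eps^2) * Z @ Z.T) for a d x n matrix Z."""
    d, n = Z.shape
    gram = np.eye(d) + (d / (n * eps**2)) * (Z @ Z.T)
    # slogdet is numerically safer than log(det(...)) for near-singular matrices.
    return 0.5 * np.linalg.slogdet(gram)[1]

def mcr2(Z, labels, eps=0.5):
    """Coding rate reduction: R(Z) minus the label-weighted per-class rates.

    The second term sums (n_c / n) * R(Z_c) over classes c, which equals the
    usual (n_c / (2n)) * logdet(...) class-compression term.
    """
    _, n = Z.shape
    compressed = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        compressed += (Zc.shape[1] / n) * coding_rate(Zc, eps)
    return coding_rate(Z, eps) - compressed
```

With a single class, the two terms cancel and the objective is zero; the value becomes positive only when the classes occupy different directions, which is what "between-class discriminative and within-class compressible" refers to.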
