Paper Title
Decentralized Communication-Efficient Multi-Task Representation Learning
Paper Authors
Paper Abstract
This work develops a provably accurate, fully decentralized alternating projected gradient descent (GD) algorithm for recovering a low-rank (LR) matrix from mutually independent projections of each of its columns, in a fast and communication-efficient fashion. To the best of our knowledge, this is the first attempt to develop a provably correct decentralized algorithm (i) for any problem involving the use of an alternating projected GD algorithm, and (ii) for any problem in which the constraint set to be projected onto is non-convex.
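To make the abstract's setting concrete, the following is a minimal centralized (not decentralized) sketch of projected GD for recovering a low-rank matrix from independent linear projections of each of its columns, where the projection onto the set of rank-r matrices, a non-convex set, is computed by truncated SVD. The function names, dimensions, step size, and iteration count are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def project_rank_r(X, r):
    # Projection onto the (non-convex) set of rank-<=r matrices,
    # computed via truncated SVD (Eckart-Young theorem).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def projected_gd_lr_recovery(A_list, y_list, shape, r, step=0.9, iters=200):
    """Sketch: recover a low-rank n x q matrix X from independent
    linear projections y_k = A_k @ X[:, k] of each column k."""
    n, q = shape
    X = np.zeros((n, q))
    for _ in range(iters):
        G = np.zeros_like(X)
        for k, (A, y) in enumerate(zip(A_list, y_list)):
            # Gradient of 0.5 * ||A_k x_k - y_k||^2 w.r.t. column k.
            G[:, k] = A.T @ (A @ X[:, k] - y)
        # GD step followed by projection onto the rank-r set.
        X = project_rank_r(X - step * G, r)
    return X
```

In the paper's decentralized setting the columns (and their measurements) are distributed across nodes, so the coupling introduced by the rank projection is what communication must pay for; this sketch only illustrates the centralized alternating projected GD template.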