Paper Title

Distributed Nonconvex Optimization: Gradient-free Iterations and $ε$-Globally Optimal Solution

Authors

Zhiyu He, Jianping He, Cailian Chen, Xinping Guan

Abstract

Distributed optimization utilizes local computation and communication to realize a global aim of optimizing the sum of local objective functions. This article addresses a class of constrained distributed nonconvex optimization problems involving univariate objectives, aiming to achieve global optimization without requiring local evaluations of gradients at every iteration. We propose a novel algorithm named CPCA, which combines Chebyshev polynomial approximation, average consensus, and polynomial optimization. The proposed algorithm is i) able to obtain $ε$-globally optimal solutions for any arbitrarily small given accuracy $ε$, ii) efficient in both zeroth-order queries (i.e., evaluations of function values) and inter-agent communication, and iii) distributedly terminable when the specified precision requirement is met. The key insight is to use polynomial approximations to substitute for general local objectives, distribute these approximations via average consensus, and solve an easier approximate version of the original problem. Due to the nice analytic properties of polynomials, this approximation not only facilitates efficient global optimization, but also allows the design of gradient-free iterations to reduce the cumulative cost of queries and achieve geometric convergence for solving nonconvex problems. We provide a comprehensive analysis of the accuracy and complexities of the proposed algorithm.
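
The three-stage pipeline described in the abstract (approximate locally, agree globally, optimize the shared proxy) can be illustrated compactly. Below is a minimal Python sketch, not the authors' implementation: it assumes a common constraint interval [a, b], three toy univariate objectives, a fixed doubly stochastic mixing matrix `W`, and a hypothetical approximation degree `deg`. Each agent interpolates its own objective at Chebyshev nodes (the only zeroth-order queries), the agents average the coefficient vectors via consensus iterations that require no further function evaluations, and any agent then globally minimizes the resulting polynomial proxy.

```python
# A minimal sketch of the CPCA idea, assuming toy objectives and a
# hand-picked mixing matrix; all names below are illustrative.
import numpy as np
from numpy.polynomial import chebyshev as C

a, b = -1.0, 2.0   # common constraint interval (assumed)
deg = 32           # Chebyshev degree (hypothetical choice)

# Local nonconvex objectives held by three agents (toy examples).
objectives = [
    lambda x: np.sin(3 * x) + 0.1 * x**2,
    lambda x: np.cos(2 * x) * x,
    lambda x: 0.5 * np.sin(5 * x + 1.0),
]

# Step 1: each agent builds a Chebyshev proxy of its own objective
# using only zeroth-order queries at Chebyshev nodes.
coeffs = np.array([
    C.Chebyshev.interpolate(f, deg, domain=[a, b]).coef for f in objectives
])

# Step 2: average consensus on the coefficient vectors; every agent
# converges to the coefficients of the averaged proxy. These iterations
# are gradient-free: no further queries to the f_i are made.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])   # doubly stochastic (assumed)
for _ in range(200):
    coeffs = W @ coeffs

# Step 3: any agent globally minimizes the shared polynomial proxy
# over [a, b] by checking stationary points and the endpoints.
p = C.Chebyshev(coeffs[0], domain=[a, b])
crit = p.deriv().roots()
crit = crit[np.isreal(crit)].real
candidates = np.concatenate(([a, b], crit[(crit >= a) & (crit <= b)]))
x_star = candidates[np.argmin(p(candidates))]
print(f"approximate global minimizer: {x_star:.4f}, value: {p(x_star):.4f}")
```

Averaging the Chebyshev coefficients yields exactly the proxy of the average objective, whose minimizer coincides with that of the sum; this is why, under this sketch's assumptions, all function queries happen once during local interpolation and the consensus phase stays gradient-free.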
