Paper Title

Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework

Authors

Pranay Sharma, Kaidi Xu, Sijia Liu, Pin-Yu Chen, Xue Lin, Pramod K. Varshney

Abstract

In this work, we focus on the study of stochastic zeroth-order (ZO) optimization, which does not require first-order gradient information and uses only function evaluations. The problem of ZO optimization has emerged in many recent machine learning applications, where the gradient of the objective function is either unavailable or difficult to compute. In such cases, we can approximate the full gradients or stochastic gradients through function value based gradient estimates. Here, we propose a novel hybrid gradient estimator (HGE), which takes advantage of the query efficiency of random gradient estimates as well as the variance reduction of coordinate-wise gradient estimates. We show that with a graceful design in coordinate importance sampling, the proposed HGE-based ZO optimization method is efficient both in terms of iteration complexity and function query cost. We provide a thorough theoretical analysis of the convergence of our proposed method for non-convex, convex, and strongly convex optimization. We show that the convergence rate that we derive generalizes the results for some prominent existing methods in the non-convex case, and matches the optimal result in the convex case. We also corroborate the theory with a real-world black-box attack generation application to demonstrate the empirical advantage of our method over state-of-the-art ZO optimization approaches.
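
For intuition, here is a minimal NumPy sketch of the two building blocks the abstract refers to: a random-direction ZO gradient estimate and a coordinate-wise finite-difference estimate, plus a simple blend of the two on a hand-picked coordinate subset. The function names, the blending weight `alpha`, and the choice of "important" coordinates are illustrative assumptions; the paper's actual HGE and its importance-sampling design are specified in the full text.

```python
import numpy as np

def rand_grad_est(f, x, mu=1e-3, q=10, rng=None):
    """Random-direction ZO gradient estimate averaged over q Gaussian directions.
    Query-efficient (q + 1 function calls) but relatively high variance."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    fx = f(x)
    g = np.zeros(d)
    for _ in range(q):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - fx) / mu * u
    return g / q

def coord_grad_est(f, x, coords, mu=1e-3):
    """Coordinate-wise finite-difference estimate on a chosen coordinate subset.
    Low variance, but one extra function call per coordinate."""
    g = np.zeros(x.size)
    fx = f(x)
    for i in coords:
        e = np.zeros(x.size)
        e[i] = 1.0
        g[i] = (f(x + mu * e) - fx) / mu
    return g

def hybrid_grad_est(f, x, coords, alpha=0.5, mu=1e-3, q=10):
    """Illustrative hybrid: blend the random estimate with coordinate-wise
    estimates on the coordinates in `coords`. The blending rule here is a
    placeholder, not the paper's exact HGE."""
    g = rand_grad_est(f, x, mu=mu, q=q)
    gc = coord_grad_est(f, x, coords, mu=mu)
    g[coords] = alpha * gc[coords] + (1.0 - alpha) * g[coords]
    return g

if __name__ == "__main__":
    # Toy quadratic objective queried only through function values.
    f = lambda z: float(np.sum(z ** 2))
    x = np.ones(20)
    coords = [0, 1, 2]          # hypothetical "important" coordinates
    for _ in range(200):        # plain ZO gradient descent loop
        x -= 0.05 * hybrid_grad_est(f, x, coords)
    print("final objective:", f(x))
```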
