Paper Title

Geom-SPIDER-EM: Faster Variance Reduced Stochastic Expectation Maximization for Nonconvex Finite-Sum Optimization

Authors

Gersende Fort, Eric Moulines, Hoi-To Wai

Abstract

The Expectation Maximization (EM) algorithm is a key reference for inference in latent variable models; unfortunately, its computational cost is prohibitive in the large scale learning setting. In this paper, we propose an extension of the Stochastic Path-Integrated Differential EstimatoR EM (SPIDER-EM) and derive complexity bounds for this novel algorithm, designed to solve smooth nonconvex finite-sum optimization problems. We show that it reaches the same state of the art complexity bounds as SPIDER-EM; and provide conditions for a linear rate of convergence. Numerical results support our findings.
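The abstract refers to the SPIDER (Stochastic Path-Integrated Differential EstimatoR) family of variance-reduced estimators. As a rough illustration of the general idea (not the paper's Geom-SPIDER-EM update, whose details are not given here), the sketch below applies a SPIDER-style recursion to a toy finite-sum problem: the estimator is refreshed periodically with a full-batch gradient and otherwise updated with a minibatch gradient difference evaluated at consecutive iterates. All names and the refresh schedule are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a SPIDER-style variance-reduced gradient estimator,
# applied to the toy finite sum f(x) = (1/n) * sum_i 0.5 * (x - a_i)^2,
# whose minimizer is mean(a). This is NOT the paper's Geom-SPIDER-EM
# algorithm; it only illustrates the path-integrated estimator idea.
rng = np.random.default_rng(0)
n = 200
a = rng.normal(size=n)  # toy per-sample data

def minibatch_grad(x, idx):
    """Minibatch gradient of the toy objective over samples `idx`."""
    return np.mean(x - a[idx])

x, x_prev = 0.0, 0.0
step, batch, refresh_every = 0.1, 10, 50  # illustrative choices
for t in range(500):
    if t % refresh_every == 0:
        v = np.mean(x - a)  # periodic full-batch refresh of the estimator
    else:
        idx = rng.integers(0, n, size=batch)
        # SPIDER correction: same minibatch at current and previous iterate
        v = v + minibatch_grad(x, idx) - minibatch_grad(x_prev, idx)
    x_prev = x
    x = x - step * v

print(abs(x - a.mean()))  # distance to the minimizer, small after 500 steps
```

In the full-batch EM setting the same pattern is applied to sufficient statistics rather than gradients; the "Geom" variant studied in the paper additionally randomizes the epoch structure, which is what yields the linear-rate conditions mentioned in the abstract.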
