Paper Title


Algorithms for Nonnegative Matrix Factorization with the Kullback-Leibler Divergence

Authors

Le Thi Khanh Hien and Nicolas Gillis

Abstract


Nonnegative matrix factorization (NMF) is a standard linear dimensionality reduction technique for nonnegative data sets. To measure the discrepancy between the input data and the low-rank approximation, the Kullback-Leibler (KL) divergence is one of the most widely used objective functions for NMF. It corresponds to the maximum likelihood estimator when the underlying statistics of the observed data samples follow a Poisson distribution, and KL NMF is particularly meaningful for count data sets, such as documents or images. In this paper, we first collect important properties of the KL objective function that are essential to study the convergence of KL NMF algorithms. Second, in addition to reviewing existing algorithms for solving KL NMF, we propose three new algorithms that guarantee the nonincreasingness of the objective function. We also provide a global convergence guarantee for one of our proposed algorithms. Finally, we conduct extensive numerical experiments to provide a comprehensive picture of the performance of KL NMF algorithms.
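For concreteness, the KL NMF problem minimizes the generalized KL divergence D(V ‖ WH) over nonnegative factors W and H. The sketch below illustrates the classic multiplicative updates of Lee and Seung, a well-known baseline that also does not increase the objective at each step; it is an illustration of the problem setting, not one of the three new algorithms proposed in the paper. The function names and the small epsilon safeguard are our own choices.

```python
import numpy as np

def kl_divergence(V, W, H, eps=1e-12):
    """Generalized KL divergence D(V || WH) = sum(V*log(V/WH) - V + WH)."""
    WH = W @ H
    return np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)

def kl_nmf_mu(V, r, n_iter=200, seed=0):
    """KL NMF via Lee-Seung multiplicative updates (illustrative baseline).

    Each update multiplies the current factor by a nonnegative ratio,
    so W and H stay nonnegative and the objective is nonincreasing.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    eps = 1e-12
    W = rng.random((m, r)) + 0.1  # positive random initialization
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        # H <- H * (W^T (V / WH)) / (W^T 1)
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        # W <- W * ((V / WH) H^T) / (1 H^T)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H
```

On a small count matrix, one can check that the fitted factors attain a lower KL objective than the random initialization, consistent with the monotonicity property discussed in the paper.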
