Paper Title
A Dimensionality Reduction Method for Finding Least Favorable Priors with a Focus on Bregman Divergence
Paper Authors
Paper Abstract
A common way of characterizing minimax estimators in point estimation is to move the problem into the Bayesian estimation domain and find a least favorable prior distribution. Under mild conditions, the Bayesian estimator induced by a least favorable prior is then known to be minimax. However, finding least favorable distributions can be challenging because of the inherent optimization over the space of probability distributions, which is infinite-dimensional. This paper develops a dimensionality reduction method that allows us to move the optimization to a finite-dimensional setting with an explicit bound on the dimension. The benefit of this dimensionality reduction is that it permits the use of popular algorithms, such as projected gradient ascent, to find least favorable priors. Throughout the paper, in order to make progress on the problem, we restrict ourselves to Bayesian risks induced by a relatively large class of loss functions, namely Bregman divergences.
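To make the optimization in the abstract concrete, below is a minimal sketch, not taken from the paper, of projected gradient ascent over priors supported on a fixed finite grid. The toy problem, grid sizes, step size, iteration count, and sample-space quadrature are all illustrative assumptions: we estimate a normal mean bounded in [-m, m] under squared error loss, which is the Bregman divergence generated by phi(t) = t^2, so the Bayes estimator is the posterior mean. Because the Bayes rule is optimal for the current prior, the envelope theorem gives the gradient of the Bayes risk with respect to the prior weights as the frequentist risk of the Bayes rule at each support point.

import numpy as np

# Toy setup (assumption, not the paper's setting): X ~ N(theta, 1), theta in [-m, m],
# squared error loss. The prior is restricted to weights w on a fixed grid of thetas.
m = 2.0
thetas = np.linspace(-m, m, 41)           # candidate support points for the prior
xs = np.linspace(-m - 5.0, m + 5.0, 401)  # quadrature grid over the sample space
dx = xs[1] - xs[0]
lik = np.exp(-0.5 * (xs[:, None] - thetas[None, :]) ** 2) / np.sqrt(2 * np.pi)  # p(x | theta_j)

def bayes_estimator(w):
    """Posterior mean under prior weights w (the Bayes rule for squared error)."""
    post = lik * w                         # unnormalized posterior, shape (len(xs), len(thetas))
    return post @ thetas / post.sum(axis=1)

def risk_per_theta(w):
    """Frequentist risk of the Bayes rule at each support point theta_j.
    By the envelope theorem, this vector is the gradient of the Bayes risk in w."""
    est = bayes_estimator(w)
    return ((thetas[None, :] - est[:, None]) ** 2 * lik).sum(axis=0) * dx

def project_simplex(v):
    """Euclidean projection onto the probability simplex (sort-based algorithm)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    tau = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - tau, 0.0)

w = np.full(len(thetas), 1.0 / len(thetas))    # start from the uniform prior
for _ in range(500):                            # projected gradient ascent on the Bayes risk
    w = project_simplex(w + 0.5 * risk_per_theta(w))

print("Bayes risk of the candidate least favorable prior:", w @ risk_per_theta(w))
print("Support points with non-negligible mass:", thetas[w > 1e-3])

In runs of this sketch, the weights tend to concentrate on a few support points, consistent with the classical fact that least favorable priors for bounded normal mean estimation are discrete with finite support.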