Paper Title

Normalized Power Prior Bayesian Analysis

Paper Authors

Keying Ye, Zifei Han, Yuyan Duan, Tianyu Bai

Paper Abstract

The elicitation of power priors, based on the availability of historical data, is realized by raising the likelihood function of the historical data to a fractional power δ, which quantifies the degree of discounting of the historical information when making inference with the current data. When δ is not pre-specified and is treated as random, it can be estimated from the data using the Bayesian updating paradigm. However, in the original form of the joint power prior Bayesian approach, the likelihood of the historical data may be multiplied by different positive constants when different settings of sufficient statistics are employed. This changes the power prior with the choice of constant, and hence violates the likelihood principle. In this article, we investigate a normalized power prior approach that obeys the likelihood principle and is a modified form of the joint power prior. The optimality properties of the normalized power prior, in the sense of minimizing the weighted Kullback-Leibler divergence, are investigated. By examining the posteriors of several commonly used distributions, we show that the discrepancy between the historical and the current data can be well quantified by the power parameter under the normalized power prior setting. Efficient algorithms to compute the scale factor are also proposed. In addition, we illustrate the use of the normalized power prior Bayesian analysis with three data examples, and provide an implementation with the R package NPP.
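For context, the normalized power prior discussed in the abstract has the following standard form in the power prior literature (the notation here is ours, not necessarily the paper's): with historical data D0, an initial prior π0(θ), and a prior π(δ) on the power parameter,

    π(θ, δ | D0) ∝ [ L(θ | D0)^δ π0(θ) / C(δ) ] · π(δ),   where   C(δ) = ∫ L(θ | D0)^δ π0(θ) dθ.

The denominator C(δ) is the scale factor mentioned in the abstract. Multiplying L(θ | D0) by a positive constant c rescales both the numerator and C(δ) by c^δ, leaving the prior unchanged; in the unnormalized joint power prior the factor c^δ survives and effectively reweights the prior on δ, which is the likelihood principle violation the paper addresses.

As a concrete illustration, consider Bernoulli data with a Beta(a, b) initial prior, where the scale factor has the closed form C(δ) = B(a + δ·x0, b + δ·(n0 − x0)) / B(a, b), so the marginal posterior of δ can be evaluated directly. The following base-R sketch (our own illustration, not the NPP package's implementation; all data values are hypothetical) computes that marginal posterior on a grid under a uniform prior on δ:

## Normalized power prior for Bernoulli data with a Beta(a, b) initial prior.
## The conditional prior of theta given delta is Beta(a + delta*x0, b + delta*(n0 - x0)),
## and the marginal posterior of delta (uniform prior on delta) is proportional to
##   B(a + delta*x0 + x, b + delta*(n0 - x0) + n - x) / B(a + delta*x0, b + delta*(n0 - x0)).

a <- 1; b <- 1            # hypothetical initial Beta prior on theta
x0 <- 30; n0 <- 100       # hypothetical historical data: successes / trials
x  <- 45; n  <- 100       # hypothetical current data: successes / trials

# Log marginal posterior of delta, up to an additive constant (lbeta for stability)
log_post_delta <- function(delta) {
  lbeta(a + delta * x0 + x, b + delta * (n0 - x0) + n - x) -
    lbeta(a + delta * x0, b + delta * (n0 - x0))
}

delta <- seq(0, 1, length.out = 1001)    # grid over the power parameter
lp <- log_post_delta(delta)
w  <- exp(lp - max(lp))
w  <- w / sum(w)                         # normalized grid weights
post_mean_delta <- sum(delta * w)        # posterior mean of delta
post_mean_delta

A posterior for δ concentrated near 1 indicates agreement between the historical and current data (little discounting), while concentration near 0 indicates heavy discounting, consistent with the abstract's claim that the power parameter quantifies the discrepancy between the two data sources.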
