Paper Title

Empirical Likelihood Based Bayesian Variable Selection

Authors

Yichen Cheng and Yichuan Zhao

Abstract

Empirical likelihood is a popular nonparametric statistical tool that does not require any distributional assumptions. In this paper, we explore the possibility of conducting variable selection via Bayesian empirical likelihood. We show theoretically that when the prior distribution satisfies certain mild conditions, the corresponding Bayesian empirical likelihood estimators are posterior consistent and variable selection consistent. As special cases, we show that the priors of the Bayesian empirical likelihood LASSO and SCAD satisfy these conditions, so the resulting procedures identify the non-zero elements of the parameter with probability tending to 1. In addition, it is easy to verify that these conditions are met for other widely used priors such as the ridge, elastic net and adaptive LASSO. The empirical likelihood depends on a parameter that must be obtained by numerically solving a non-linear equation. Consequently, no conjugate prior exists for the posterior distribution, which can cause slow convergence of the MCMC sampling algorithm in some cases. To address this problem, we propose a novel approach that uses an approximate distribution as the proposal. The computational results demonstrate quick convergence for the examples used in the paper. We use both simulation and real data analyses to illustrate the advantages of the proposed methods.
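To make the two computational ingredients mentioned in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it (i) evaluates the profile log empirical likelihood of a regression coefficient vector by numerically solving the non-linear equation for the Lagrange multiplier with damped Newton steps, and (ii) runs an independence Metropolis-Hastings sampler whose proposal is a rough normal approximation centred at the least-squares fit, combined with a Laplace (LASSO-type) prior. All function names, the choice of estimating equations, and the tuning constants here are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_el(beta, X, y, max_iter=100, tol=1e-9):
    """Log empirical likelihood ratio for the score equations g_i = x_i (y_i - x_i' beta)."""
    n, p = X.shape
    g = X * (y - X @ beta)[:, None]                     # n x p matrix of estimating functions
    lam = np.zeros(p)
    for _ in range(max_iter):
        d = 1.0 + g @ lam
        grad = (g / d[:, None]).sum(axis=0)             # gradient of sum_i log(1 + lam'g_i)
        hess = -(g / d[:, None]).T @ (g / d[:, None])   # its (negative definite) Hessian
        step = np.linalg.solve(hess, -grad)             # Newton direction for the maximisation
        t = 1.0
        while np.any(1.0 + g @ (lam + t * step) <= 1.0 / n):
            t *= 0.5                                    # stay inside the feasible region
            if t < 1e-12:
                return -np.inf                          # 0 is not in the convex hull of the g_i
        lam = lam + t * step
        if np.linalg.norm(grad) < tol:
            break
    return -np.sum(np.log(1.0 + g @ lam))               # equals sum_i log(n p_i)

def log_post(beta, X, y, lasso_rate=1.0):
    """Pseudo-posterior: empirical likelihood combined with a Laplace (LASSO) prior."""
    return log_el(beta, X, y) - lasso_rate * np.sum(np.abs(beta))

def independence_mh(X, y, n_draws=5000, seed=0):
    rng = np.random.default_rng(seed)
    # Crude stand-in for an approximate posterior used as the proposal:
    # OLS centre with an inflated sigma^2 (X'X)^{-1} spread.
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_hat
    cov = resid.var(ddof=X.shape[1]) * np.linalg.inv(X.T @ X) * 2.0
    prop = multivariate_normal(mean=beta_hat, cov=cov)
    beta, lp = beta_hat.copy(), log_post(beta_hat, X, y)
    draws = np.empty((n_draws, X.shape[1]))
    for t in range(n_draws):
        cand = rng.multivariate_normal(beta_hat, cov)
        lp_cand = log_post(cand, X, y)
        # Independence MH ratio: target ratio times reversed proposal-density ratio.
        log_alpha = (lp_cand - lp) + (prop.logpdf(beta) - prop.logpdf(cand))
        if np.log(rng.uniform()) < log_alpha:
            beta, lp = cand, lp_cand
        draws[t] = beta
    return draws
```

In the paper's setting the proposal would come from the authors' approximate distribution rather than this crude OLS-based normal, and the prior could be swapped for SCAD, ridge, elastic net or adaptive LASSO penalties; the Metropolis-Hastings correction step keeps the same form in each case.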
