Paper Title

Approximate blocked Gibbs sampling for Bayesian neural networks

Authors

Papamarkou, Theodore

Abstract

In this work, minibatch MCMC sampling for feedforward neural networks is made more feasible. To this end, it is proposed to sample subgroups of parameters via a blocked Gibbs sampling scheme. By partitioning the parameter space, sampling is possible irrespective of layer width. It is also possible to alleviate vanishing acceptance rates for increasing depth by reducing the proposal variance in deeper layers. Increasing the length of a non-convergent chain increases the predictive accuracy in classification tasks, so avoiding vanishing acceptance rates and consequently enabling longer chain runs have practical benefits. Moreover, non-convergent chain realizations aid in the quantification of predictive uncertainty. An open problem is how to perform minibatch MCMC sampling for feedforward neural networks in the presence of augmented data.
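To make the sampling scheme described in the abstract concrete, the following is a minimal sketch (not the paper's implementation) of Metropolis-within-Gibbs over per-layer parameter blocks of a toy feedforward classifier, with minibatched likelihood evaluations and a smaller proposal standard deviation for the deeper layer. All names (`log_post`, `blocked_gibbs_step`), the network size, and the specific proposal scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(blocks, X, y):
    """Unnormalized log-posterior of a one-hidden-layer binary classifier:
    standard normal prior on all weights plus a Bernoulli likelihood.
    (Evaluating this on a minibatch makes the sampler approximate.)"""
    W1, W2 = blocks
    h = np.tanh(X @ W1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2).ravel()))
    log_lik = np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    log_prior = -0.5 * sum(np.sum(W**2) for W in blocks)
    return log_lik + log_prior

def blocked_gibbs_step(blocks, X, y, proposal_stds):
    """One blocked-Gibbs sweep: a random-walk Metropolis update for each
    parameter block in turn, conditioning on the remaining blocks.
    Deeper blocks get a smaller proposal std to counter vanishing
    acceptance rates with depth."""
    blocks = [W.copy() for W in blocks]
    for i, std in enumerate(proposal_stds):
        proposal = [W.copy() for W in blocks]
        proposal[i] = blocks[i] + std * rng.standard_normal(blocks[i].shape)
        if np.log(rng.uniform()) < log_post(proposal, X, y) - log_post(blocks, X, y):
            blocks = proposal  # accept the block update
    return blocks

# Synthetic binary-classification data (illustrative).
X = rng.standard_normal((64, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# One block per layer; smaller proposal variance for the deeper layer.
blocks = [0.1 * rng.standard_normal((2, 8)), 0.1 * rng.standard_normal((8, 1))]
proposal_stds = [0.05, 0.02]

chain = []
for step in range(200):
    idx = rng.choice(64, size=32, replace=False)  # minibatch of the data
    blocks = blocked_gibbs_step(blocks, X[idx], y[idx], proposal_stds)
    chain.append(blocks)
```

Averaging predictions over the saved `chain` states (even without convergence diagnostics passing) is the sense in which non-convergent chain realizations can still inform predictive accuracy and uncertainty, as the abstract notes.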
