Paper Title

Sliced Kernelized Stein Discrepancy

Authors

Wenbo Gong, Yingzhen Li, José Miguel Hernández-Lobato

Abstract

Kernelized Stein discrepancy (KSD), though extensively used in goodness-of-fit tests and model learning, suffers from the curse of dimensionality. We address this issue by proposing the sliced Stein discrepancy and its scalable and kernelized variants, which employ kernel-based test functions defined on the optimal one-dimensional projections. When applied to goodness-of-fit tests, extensive experiments show that the proposed discrepancy significantly outperforms KSD and various baselines in high dimensions. For model learning, we show its advantages over existing Stein discrepancy baselines by training independent component analysis models with different discrepancies. We further propose a novel particle inference method called sliced Stein variational gradient descent (S-SVGD), which alleviates the mode-collapse issue of SVGD in training variational autoencoders.
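To make the "kernel-based test functions on one-dimensional projections" idea concrete, here is a minimal NumPy sketch that estimates a sliced kernelized Stein discrepancy along a single, fixed pair of directions: r projects the score and g projects the kernel inputs, with a 1-D RBF kernel. The function names, the V-statistic form, and the fixed bandwidth ell are illustrative assumptions, not the paper's reference implementation; in particular, the paper optimizes the projection directions rather than fixing them.

import numpy as np

def rbf_kernel_terms(a, b, ell=1.0):
    # 1-D RBF kernel k(a,b) = exp(-(a-b)^2 / (2 ell^2)) and the derivatives
    # needed by the Stein kernel, evaluated on scalar projections a = g^T x.
    diff = a[:, None] - b[None, :]
    k = np.exp(-diff**2 / (2 * ell**2))
    dk_da = -diff / ell**2 * k                  # dk/da
    dk_db = diff / ell**2 * k                   # dk/db
    d2k = (1 / ell**2 - diff**2 / ell**4) * k   # d^2 k / (da db)
    return k, dk_da, dk_db, d2k

def sliced_ksd_estimate(x, score, r, g, ell=1.0):
    # Schematic V-statistic estimate of a sliced KSD for fixed unit vectors
    # r (score direction) and g (input-projection direction): the projected
    # score r^T s_p(x) and the scalar inputs g^T x replace their
    # d-dimensional counterparts in the usual KSD Stein kernel.
    sr = score(x) @ r          # projected scores, shape (n,)
    xg = x @ g                 # projected inputs, shape (n,)
    k, dk_da, dk_db, d2k = rbf_kernel_terms(xg, xg, ell)
    rg = r @ g
    h = (sr[:, None] * sr[None, :] * k
         + rg * sr[:, None] * dk_db
         + rg * dk_da * sr[None, :]
         + rg**2 * d2k)
    return h.mean()

# Usage: for samples drawn from the target, the estimate should be near zero.
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 10))
score = lambda x: -x             # score of the standard Gaussian N(0, I)
r = g = np.eye(10)[0]            # slice along the first coordinate axis
print(sliced_ksd_estimate(x, score, r, g))

Because every quantity inside h is a scalar projection, the test function effectively lives on a one-dimensional subspace, which is what lets the sliced variants sidestep the curse of dimensionality that afflicts the full d-dimensional KSD.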
