Paper Title

A Discrete Variational Recurrent Topic Model without the Reparametrization Trick

Paper Authors

Rezaee, Mehdi, Ferraro, Francis

Paper Abstract

We show how to learn a neural topic model with discrete random variables---one that explicitly models each word's assigned topic---using neural variational inference that does not rely on stochastic backpropagation to handle the discrete variables. The model we utilize combines the expressive power of neural methods for representing sequences of text with the topic model's ability to capture global, thematic coherence. Using neural variational inference, we show improved perplexity and document understanding across multiple corpora. We examine the effect of prior parameters both on the model and variational parameters and demonstrate how our approach can compete and surpass a popular topic model implementation on an automatic measure of topic quality.
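The key idea in the abstract, avoiding stochastic backpropagation through a discrete topic variable, can be illustrated with a minimal sketch (not the paper's implementation, and all names here are hypothetical): because the topic assignment z ranges over a small finite set of K topics, the expectation in the ELBO can be computed exactly by summing over all K values, so no sampling, and hence no reparametrization trick or score-function estimator, is needed.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

K, V = 4, 10                       # number of topics, vocabulary size (toy sizes)
rng = np.random.default_rng(0)

logits_q = rng.normal(size=K)      # variational logits for q(z | context)
q = softmax(logits_q)              # q(z = k), a distribution over K topics

# topic_word[k] = p(w | z = k): per-topic distribution over the vocabulary
topic_word = softmax(rng.normal(size=(K, V)), axis=1)

w = 3                              # index of one observed word

# Exact expectation, no sampling:
#   E_q[log p(w | z)] = sum_k q(k) * log p(w | z = k)
expected_log_lik = float(q @ np.log(topic_word[:, w]))

# KL(q || uniform prior over topics) stands in for the ELBO's KL term
kl = float(np.sum(q * (np.log(q) - np.log(1.0 / K))))

elbo_term = expected_log_lik - kl
print(elbo_term)
```

Because `elbo_term` is a deterministic, differentiable function of `logits_q`, gradients with respect to the variational parameters could be taken directly in an autodiff framework; this marginalization is what removes the need for reparametrized samples of the discrete variable.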
