Paper Title

End to End Dialogue Transformer

Authors

Ondřej Měkota, Memduh Gökırmak, Petr Laitoch

Abstract

Dialogue systems attempt to facilitate conversations between humans and computers, for purposes ranging from small talk to booking a vacation. We are inspired here by the performance of the recurrent neural network-based model Sequicity, which, when conducting a dialogue, uses a sequence-to-sequence architecture to first produce a textual representation of what is going on in the dialogue, and in a further step uses this representation, along with database findings, to produce a reply to the user. We propose a dialogue system that replaces Sequicity's RNN-based architecture with the Transformer architecture and works similarly in an end-to-end, sequence-to-sequence fashion.
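A minimal sketch of the two-stage, end-to-end decoding the abstract describes, not the authors' implementation: one sequence-to-sequence pass produces a textual dialogue-state representation (a "belief span"), a database lookup is made, and a second pass conditioned on both produces the reply. The vocabulary size, special-token ids, and the toy `db_lookup` function are hypothetical placeholders; positional encodings and attention masks are omitted for brevity.

```python
import torch
import torch.nn as nn

VOCAB = 1000        # hypothetical vocabulary size
D_MODEL = 64
BOS, EOS = 1, 2     # hypothetical special-token ids


class TwoStageTransformer(nn.Module):
    """Single Transformer used twice: once for the belief span, once for the reply."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True)
        self.out = nn.Linear(D_MODEL, VOCAB)

    def decode(self, src_ids, max_len=10):
        """Greedy sequence-to-sequence decoding of one output sequence."""
        memory = self.transformer.encoder(self.embed(src_ids))
        tgt = torch.full((src_ids.size(0), 1), BOS, dtype=torch.long)
        for _ in range(max_len):
            h = self.transformer.decoder(self.embed(tgt), memory)
            next_id = self.out(h[:, -1]).argmax(-1, keepdim=True)
            tgt = torch.cat([tgt, next_id], dim=1)
            if (next_id == EOS).all():
                break
        return tgt


def db_lookup(belief_ids):
    # Hypothetical database query: map the belief span to a match-count token,
    # e.g. "3 matching restaurants".
    return torch.tensor([[3]])


model = TwoStageTransformer()
user_turn = torch.randint(3, VOCAB, (1, 12))      # stand-in for a tokenized user utterance
belief = model.decode(user_turn)                  # stage 1: textual dialogue-state representation
context = torch.cat([user_turn, belief, db_lookup(belief)], dim=1)
reply = model.decode(context)                     # stage 2: system reply conditioned on state + DB result
print(reply)
```

Sharing one encoder-decoder for both passes mirrors the single sequence-to-sequence model the abstract implies; a full system would add positional encodings, causal masks, and training on annotated belief spans and responses.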
