Paper Title


Learning Knowledge Bases with Parameters for Task-Oriented Dialogue Systems

Authors

Andrea Madotto, Samuel Cahyawijaya, Genta Indra Winata, Yan Xu, Zihan Liu, Zhaojiang Lin, Pascale Fung

Abstract

Task-oriented dialogue systems are either modularized with separate dialogue state tracking (DST) and management steps or end-to-end trainable. In either case, the knowledge base (KB) plays an essential role in fulfilling user requests. Modularized systems rely on DST to interact with the KB, which is expensive in terms of annotation and inference time. End-to-end systems use the KB directly as input, but they cannot scale when the KB is larger than a few hundred entries. In this paper, we propose a method to embed the KB, of any size, directly into the model parameters. The resulting model does not require any DST or template responses, nor the KB as input, and it can dynamically update its KB via fine-tuning. We evaluate our solution in five task-oriented dialogue datasets with small, medium, and large KB size. Our experiments show that end-to-end models can effectively embed knowledge bases in their parameters and achieve competitive performance in all evaluated datasets.
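To illustrate the idea of embedding a KB into model parameters via fine-tuning, the sketch below flattens KB records into plain-text (query, response) pairs that a language model could be fine-tuned on. This is a minimal, hypothetical sketch: the record fields, templates, and pairing scheme are assumptions for illustration, not the paper's actual data format or training procedure.

```python
# Hypothetical sketch: flatten knowledge-base records into text pairs
# so a language model can absorb the KB by fine-tuning on them.
# Field names and templates below are illustrative assumptions.

def kb_to_training_pairs(kb):
    """Turn each (entity, field, value) fact into a (query, response) pair."""
    pairs = []
    for record in kb:
        name = record["name"]
        for field, value in record.items():
            if field == "name":
                continue  # the entity name anchors the other fields
            query = f"What is the {field} of {name}?"
            response = f"The {field} of {name} is {value}."
            pairs.append((query, response))
    return pairs

# Toy restaurant KB with two entries and two attributes each.
kb = [
    {"name": "Golden Wok", "area": "centre", "food": "chinese"},
    {"name": "Pizza Hut", "area": "south", "food": "italian"},
]

pairs = kb_to_training_pairs(kb)
# Each pair would serve as an input/target example during fine-tuning;
# updating the KB means regenerating the pairs and fine-tuning again.
```

Because the KB is consumed only at fine-tuning time, the model needs neither the KB as input nor a DST module at inference, matching the setup the abstract describes.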
