Paper Title


Fine-Tuning BERT for Schema-Guided Zero-Shot Dialogue State Tracking

Authors

Yu-Ping Ruan, Zhen-Hua Ling, Jia-Chen Gu, Quan Liu

Abstract


We present our work on Track 4 of the Dialogue System Technology Challenges 8 (DSTC8). DSTC8 Track 4 aims to perform dialogue state tracking (DST) in a zero-shot setting, in which the model must generalize to unseen service APIs given only a schema definition of those target APIs. Serving as the core of many virtual assistants such as Siri, Alexa, and Google Assistant, DST keeps track of the user's goal and what has happened in the dialogue history; it mainly includes intent prediction, slot filling, and user state tracking, which test a model's natural language understanding ability. Recently, pretrained language models have achieved state-of-the-art results and shown impressive generalization ability on various NLP tasks, which provides a promising way to perform zero-shot learning for language understanding. Based on this, we propose a schema-guided paradigm for zero-shot dialogue state tracking (SGP-DST) by fine-tuning BERT, one of the most popular pretrained language models. The SGP-DST system contains four modules, for intent prediction, slot prediction, slot transfer prediction, and user state summarizing respectively. According to the official evaluation results, our SGP-DST (team12) ranked 3rd on joint goal accuracy (the primary evaluation metric for ranking submissions) and 1st on requested-slots F1 among 25 participating teams.
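The key idea behind the schema-guided paradigm is that no model parameters are tied to a specific service: each candidate intent or slot is represented by its natural-language description from the schema, and the model scores (utterance, description) pairs, so it can handle unseen APIs at test time. A minimal sketch of this matching loop is below; the paper fine-tunes a BERT cross-encoder to produce the score, while here a trivial token-overlap similarity stands in for BERT so the example stays self-contained, and the schema and intent names are hypothetical, not taken from the DSTC8 dataset.

```python
# Sketch of schema-guided zero-shot intent prediction (illustrative only).
# A fine-tuned BERT cross-encoder would produce overlap_score in SGP-DST;
# token overlap is a stand-in so this example runs without any model weights.

def overlap_score(utterance: str, description: str) -> float:
    """Stand-in scorer: Jaccard overlap between token sets."""
    u = set(utterance.lower().split())
    d = set(description.lower().split())
    return len(u & d) / max(len(u | d), 1)

def predict_intent(utterance: str, schema: dict) -> str:
    """Pair the utterance with every intent description in the (possibly
    unseen) service schema and return the best-scoring intent. Because the
    decision depends only on descriptions, new APIs need no retraining."""
    return max(schema, key=lambda intent: overlap_score(utterance, schema[intent]))

# Hypothetical schema for an unseen restaurant service API.
schema = {
    "FindRestaurants": "search restaurants by cuisine and location",
    "ReserveRestaurant": "reserve a table at a restaurant",
}

print(predict_intent("search restaurants by cuisine near this location", schema))
# prints "FindRestaurants"
```

The same pattern extends to slot prediction: score the utterance against each slot's schema description instead of each intent's.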
