Paper Title

TSGP: Two-Stage Generative Prompting for Unsupervised Commonsense Question Answering

Authors

Yueqing Sun, Yu Zhang, Le Qi, Qi Shi

Abstract

Unsupervised commonsense question answering requires mining effective commonsense knowledge without relying on labeled task data. Previous methods typically retrieved knowledge from traditional knowledge bases or used pre-trained language models (PrLMs) to generate fixed types of knowledge, which have poor generalization ability. In this paper, we aim to address this limitation by leveraging the implicit knowledge stored in PrLMs, and we propose a two-stage prompt-based unsupervised commonsense question answering framework (TSGP). Specifically, we first use knowledge generation prompts to generate the knowledge required by the question, without restricting it to fixed knowledge types. Then, we further utilize answer generation prompts to generate possible candidate answers that are independent of the specified choices. Experimental results and analysis on three commonsense reasoning tasks, CommonsenseQA, OpenBookQA, and SocialIQA, demonstrate that TSGP significantly improves the reasoning ability of language models in unsupervised settings. Our code is available at: https://github.com/Yueqing-Sun/TSGP.
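For concreteness, below is a minimal sketch of the two-stage generative prompting idea described in the abstract, assuming a GPT-2 model loaded through Hugging Face transformers. The prompt templates, sampling settings, and the final overlap-based choice scoring are illustrative assumptions rather than the authors' exact implementation; see the linked repository for the official code.

```python
# Minimal sketch of two-stage generative prompting for unsupervised commonsense QA.
# Prompt wording, sampling settings, and the scoring heuristic are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def generate(prompt, num_return=3, max_new_tokens=30):
    """Sample several continuations of `prompt` from the PrLM."""
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        outputs = model.generate(
            **inputs,
            do_sample=True,
            top_p=0.9,
            max_new_tokens=max_new_tokens,
            num_return_sequences=num_return,
            pad_token_id=tokenizer.eos_token_id,
        )
    # Keep only the newly generated tokens after the prompt.
    new_tokens = outputs[:, inputs["input_ids"].shape[1]:]
    return [tokenizer.decode(t, skip_special_tokens=True).strip() for t in new_tokens]

question = "Where would you put a dirty dish after dinner?"
choices = ["sink", "cupboard", "table"]

# Stage 1: knowledge generation prompt -> free-form commonsense statements.
knowledge = generate(f"Generate some knowledge about the question. Question: {question} Knowledge:")

# Stage 2: answer generation prompt -> candidate answers independent of the given choices.
candidates = []
for k in knowledge:
    candidates += generate(f"{k} Question: {question} Answer:", num_return=2, max_new_tokens=5)

# Final step (assumed heuristic): pick the choice with the most word overlap
# against the generated candidate answers.
def overlap(choice, cands):
    c = set(choice.lower().split())
    return sum(len(c & set(x.lower().split())) for x in cands)

prediction = max(choices, key=lambda ch: overlap(ch, candidates))
print(prediction)
```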
