Paper Title

QURIOUS: Question Generation Pretraining for Text Generation

Authors

Shashi Narayan, Gonçalo Simoes, Ji Ma, Hannah Craighead, Ryan McDonald

Abstract

Recent trends in natural language processing using pretraining have shifted focus towards pretraining and fine-tuning approaches for text generation. Often the focus has been on task-agnostic approaches that generalize the language modeling objective. We propose question generation as a pretraining method, which better aligns with the text generation objectives. Our text generation models pretrained with this method are better at understanding the essence of the input and are better language models for the target task. When evaluated on two text generation tasks, abstractive summarization and answer-focused question generation, our models result in state-of-the-art performances in terms of automatic metrics. Human evaluators also found our summaries and generated questions to be more natural, concise and informative.
