Paper Title

Task Transfer and Domain Adaptation for Zero-Shot Question Answering

Paper Authors

Xiang Pan, Alex Sheng, David Shimshoni, Aditya Singhal, Sara Rosenthal, Avirup Sil

Paper Abstract

Pretrained language models have shown success in various areas of natural language processing, including reading comprehension tasks. However, when applying machine learning methods to new domains, labeled data may not always be available. To address this, we use supervised pretraining on source-domain data to reduce sample complexity on domain-specific downstream tasks. We evaluate zero-shot performance on domain-specific reading comprehension tasks by combining task transfer with domain adaptation to fine-tune a pretrained model with no labeled data from the target task. Our approach outperforms Domain-Adaptive Pretraining on downstream domain-specific reading comprehension tasks in 3 out of 4 domains.
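
The abstract describes a two-step recipe: supervised fine-tuning on labeled source-domain QA data (task transfer), followed by zero-shot evaluation on a target-domain reading comprehension task whose labels are never seen in training. The sketch below illustrates that setup with HuggingFace transformers. The choice of roberta-base, SQuAD as the source-domain dataset, and all hyperparameters are illustrative assumptions, not the authors' exact configuration, and the domain-adaptation component of the paper's method is omitted here.

```python
# Minimal sketch of task transfer + zero-shot QA evaluation.
# Assumptions (not from the paper): roberta-base as the pretrained model,
# SQuAD as the labeled source-domain dataset, illustrative hyperparameters.
from datasets import load_dataset
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    pipeline,
)

MODEL_NAME = "roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForQuestionAnswering.from_pretrained(MODEL_NAME)

source = load_dataset("squad")  # labeled source-domain QA data

def preprocess(examples):
    # Tokenize (question, context) pairs and map character-level answer
    # spans onto token start/end positions for extractive QA training.
    inputs = tokenizer(
        examples["question"],
        examples["context"],
        max_length=384,
        truncation="only_second",
        return_offsets_mapping=True,
        padding="max_length",
    )
    start_positions, end_positions = [], []
    for i, offsets in enumerate(inputs["offset_mapping"]):
        answer = examples["answers"][i]
        start_char = answer["answer_start"][0]
        end_char = start_char + len(answer["text"][0])
        seq_ids = inputs.sequence_ids(i)
        ctx_start = seq_ids.index(1)
        ctx_end = len(seq_ids) - 1 - seq_ids[::-1].index(1)
        if offsets[ctx_start][0] > start_char or offsets[ctx_end][1] < end_char:
            # Answer fell outside the truncated window; point at token 0.
            start_positions.append(0)
            end_positions.append(0)
        else:
            idx = ctx_start
            while idx <= ctx_end and offsets[idx][0] <= start_char:
                idx += 1
            start_positions.append(idx - 1)
            idx = ctx_end
            while idx >= ctx_start and offsets[idx][1] >= end_char:
                idx -= 1
            end_positions.append(idx + 1)
    inputs["start_positions"] = start_positions
    inputs["end_positions"] = end_positions
    inputs.pop("offset_mapping")
    return inputs

train_ds = source["train"].map(
    preprocess, batched=True, remove_columns=source["train"].column_names
)

# Task-transfer step: supervised fine-tuning on the source-domain task.
trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="qa-task-transfer",
        per_device_train_batch_size=16,
        num_train_epochs=2,
        learning_rate=3e-5,
    ),
    train_dataset=train_ds,
)
trainer.train()

# Zero-shot step: run the fine-tuned model on a target-domain example.
# No target-domain labels were used anywhere in training.
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
print(qa(
    question="What receptor does the virus bind to?",
    context="The spike protein binds to the ACE2 receptor on host cells.",
))
```

In the paper's full method, this task-transfer step is additionally combined with domain adaptation on the target domain; the sketch above covers only the supervised source-task fine-tuning and the zero-shot evaluation.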
