Paper Title

In-BoXBART: Get Instructions into Biomedical Multi-Task Learning

Paper Authors

Mihir Parmar, Swaroop Mishra, Mirali Purohit, Man Luo, M. Hassan Murad, Chitta Baral

Paper Abstract

Single-task models have proven pivotal in solving specific tasks; however, they have limitations in real-world applications where multi-tasking is necessary and domain shifts are exhibited. Recently, instructional prompts have shown significant improvement towards multi-task generalization; however, the effect of instructional prompts and Multi-Task Learning (MTL) has not been systematically studied in the biomedical domain. Motivated by this gap, this paper explores the impact of instructional prompts for biomedical MTL. We introduce the BoX, a collection of 32 instruction tasks for Biomedical NLP across (X) various categories. Using this meta-dataset, we propose a unified model termed In-BoXBART that can jointly learn all tasks of the BoX without any task-specific modules. To the best of our knowledge, this is the first attempt to propose a unified model in the biomedical domain and to use instructions to achieve generalization across several biomedical tasks. Experimental results indicate that the proposed model 1) outperforms the single-task baseline by ~3% and the multi-task (without instruction) baseline by ~18% on average, and 2) shows ~23% improvement over the single-task baseline in few-shot learning (i.e., 32 instances per task) on average. Our analysis indicates that there is significant room for improvement across tasks in the BoX, suggesting directions for future research.
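The abstract describes a single sequence-to-sequence model that learns all 32 BoX tasks jointly by prepending each task's natural-language instruction to its input, with no task-specific modules. Below is a minimal sketch of that setup, assuming the HuggingFace transformers library and a generic BART checkpoint; the instruction text, example instance, and gold answer are hypothetical placeholders, not drawn from the actual BoX meta-dataset.

```python
# Minimal sketch: instruction-prompted multi-task fine-tuning with BART.
# Assumes HuggingFace transformers; all task content below is hypothetical.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def build_input(instruction: str, instance: str) -> str:
    # One model handles every task because the task identity lives in the
    # natural-language instruction, not in a task-specific module.
    return f"Instruction: {instruction} Input: {instance}"

source = build_input(
    "Identify the disease mentioned in the given sentence.",  # hypothetical instruction
    "The patient was diagnosed with type 2 diabetes.",        # hypothetical instance
)
target = "type 2 diabetes"  # hypothetical gold answer

inputs = tokenizer(source, return_tensors="pt", truncation=True)
labels = tokenizer(text_target=target, return_tensors="pt", truncation=True).input_ids

# Standard seq2seq cross-entropy loss; during training, instances from all
# tasks would be mixed into a single stream and optimized jointly.
loss = model(**inputs, labels=labels).loss
loss.backward()
```

Because the model never sees an explicit task ID, adapting to a new task amounts to writing a new instruction, which is the generalization property the paper evaluates, including in its few-shot setting.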
