Paper Title


Multi-Task Learning with Deep Neural Networks: A Survey

Author

Crawshaw, Michael

Abstract


Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are simultaneously learned by a shared model. Such approaches offer advantages like improved data efficiency, reduced overfitting through shared representations, and fast learning by leveraging auxiliary information. However, the simultaneous learning of multiple tasks presents new design and optimization challenges, and choosing which tasks should be learned jointly is in itself a non-trivial problem. In this survey, we give an overview of multi-task learning methods for deep neural networks, with the aim of summarizing both the well-established and most recent directions within the field. Our discussion is structured according to a partition of the existing deep MTL techniques into three groups: architectures, optimization methods, and task relationship learning. We also provide a summary of common multi-task benchmarks.
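The abstract's core idea, multiple tasks learned simultaneously by a shared model, is most often realized as hard parameter sharing: a shared trunk feeds task-specific heads, and the network is trained on the sum of the per-task losses. The NumPy sketch below illustrates this on toy data; the two regression tasks, layer sizes, and all variable names are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 4 features, two regression tasks (illustrative).
X = rng.normal(size=(100, 4))
y1 = X @ np.array([1.0, -1.0, 0.5, 0.0])   # targets for task 1
y2 = X @ np.array([0.5, 0.0, -0.5, 1.0])   # targets for task 2

# Hard parameter sharing: one shared projection, two task-specific heads.
W_shared = rng.normal(scale=0.1, size=(4, 8))
w_head1 = rng.normal(scale=0.1, size=8)
w_head2 = rng.normal(scale=0.1, size=8)

def joint_mse():
    """Joint loss = sum of the per-task mean squared errors."""
    H = X @ W_shared
    return (np.mean((H @ w_head1 - y1) ** 2)
            + np.mean((H @ w_head2 - y2) ** 2))

initial_loss = joint_mse()

lr = 0.02
for _ in range(2000):
    H = X @ W_shared                  # shared representation
    e1 = H @ w_head1 - y1             # residuals, task 1
    e2 = H @ w_head2 - y2             # residuals, task 2
    n = len(X)
    # Gradients of the joint loss L = mean(e1^2) + mean(e2^2):
    # each head sees only its own task's error, while the shared
    # trunk accumulates gradient signal from both tasks.
    g1 = 2 * H.T @ e1 / n
    g2 = 2 * H.T @ e2 / n
    gS = 2 * X.T @ (np.outer(e1, w_head1) + np.outer(e2, w_head2)) / n
    w_head1 -= lr * g1
    w_head2 -= lr * g2
    W_shared -= lr * gS

final_loss = joint_mse()
```

Because the trunk's gradient is the sum of both tasks' contributions, the shared representation is shaped by both objectives at once; this is also where the optimization challenges the abstract mentions (e.g., conflicting task gradients) arise.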
