Paper Title


HydaLearn: Highly Dynamic Task Weighting for Multi-task Learning with Auxiliary Tasks

Paper Authors

Verboven, Sam; Chaudhary, Muhammad Hafeez; Berrevoets, Jeroen; Verbeke, Wouter

Paper Abstract


Multi-task learning (MTL) can improve performance on a task by sharing representations with one or more related auxiliary tasks. Usually, MTL networks are trained on a composite loss function formed by a constant weighted combination of the separate task losses. In practice, constant loss weights lead to poor results for two reasons: (i) the relevance of the auxiliary tasks can gradually drift throughout the learning process; (ii) for mini-batch based optimisation, the optimal task weights vary significantly from one update to the next depending on mini-batch sample composition. We introduce HydaLearn, an intelligent weighting algorithm that connects main-task gain to the individual task gradients, in order to inform dynamic loss weighting at the mini-batch level, addressing (i) and (ii). Using HydaLearn, we report performance increases on synthetic data, as well as on two supervised learning domains.
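To make the idea of mini-batch-level dynamic loss weighting concrete, here is a minimal illustrative sketch. It is not the authors' HydaLearn algorithm: the model (a scalar linear regression shared by a main and an auxiliary task), the update rule, and the simple gain-based weight adjustment are all assumptions chosen for illustration. It does show the two ingredients the abstract names: a composite loss with per-task weights, and a "main-task gain" signal (the change in main-task loss produced by the last update) that adjusts the auxiliary weight at every mini-batch.

```python
# Illustrative sketch only -- not the HydaLearn algorithm from the paper.
# One shared parameter w is trained on a main task (true slope 2.0) and a
# related auxiliary task (true slope 1.8). The auxiliary loss weight w_aux
# is adjusted after every mini-batch based on main-task gain.
import random

random.seed(0)

w = 0.0        # shared model parameter: y_hat = w * x
lr = 0.05      # learning rate
w_aux = 1.0    # dynamic auxiliary-task weight (main-task weight fixed at 1.0)

def make_batch(slope, n=8):
    """Noiseless mini-batch for a scalar regression task y = slope * x."""
    xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
    return [(x, slope * x) for x in xs]

def loss_and_grad(w, data):
    """Mean squared error and its gradient w.r.t. w on one mini-batch."""
    n = len(data)
    loss = sum((w * x - y) ** 2 for x, y in data) / n
    grad = sum(2.0 * (w * x - y) * x for x, y in data) / n
    return loss, grad

for step in range(200):
    main_batch = make_batch(2.0)
    aux_batch = make_batch(1.8)
    l_main, g_main = loss_and_grad(w, main_batch)
    _, g_aux = loss_and_grad(w, aux_batch)

    # Composite gradient under the current task weights.
    w_new = w - lr * (g_main + w_aux * g_aux)

    # Main-task gain: did this update reduce the main-task loss on the
    # current mini-batch? Raise the auxiliary weight if it helped,
    # lower it if it hurt (clamped to [0, 1]).
    gain = l_main - loss_and_grad(w_new, main_batch)[0]
    w_aux = min(1.0, max(0.0, w_aux + (0.1 if gain > 0 else -0.1)))
    w = w_new

print(w, w_aux)  # final shared parameter and auxiliary-task weight
```

Because the auxiliary weight is recomputed for every mini-batch, it can react both to gradual drift in auxiliary-task relevance and to batch-to-batch variation in sample composition, which are exactly the two failure modes of constant loss weights described in the abstract.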
