Paper Title
Meta-Learning for Natural Language Understanding under Continual Learning Framework
Paper Authors
Abstract
Neural networks have been recognized for their accomplishments in tackling various natural language understanding (NLU) tasks. Methods have been developed to train a single robust model on multiple tasks in order to obtain a general representation of text. In this paper, we implement the model-agnostic meta-learning (MAML) and online aware meta-learning (OML) meta-objectives under a continual learning framework for NLU tasks. We validate our methods on selected SuperGLUE and GLUE benchmarks.
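The abstract names MAML as one of the two meta-objectives. The following is a minimal first-order sketch of the MAML update on a toy 1-D regression family, purely to illustrate the inner-loop adaptation / outer-loop meta-update structure; it is not the authors' implementation, and all function names, hyperparameters, and the toy task distribution are assumptions.

```python
# Minimal first-order MAML (FOMAML) sketch on a toy 1-D regression family.
# Illustrative only: the paper applies MAML/OML to NLU tasks; everything
# here (task family, learning rates, step counts) is an assumption.
import random

def loss_grad(w, slope):
    """Gradient of mean squared error for model y = w * x on one task,
    where the task's ground truth is y = slope * x."""
    xs = [1.0, 2.0, 3.0]
    g = 0.0
    for x in xs:
        g += 2 * (w * x - slope * x) * x
    return g / len(xs)

def maml_step(w, tasks, inner_lr=0.01, outer_lr=0.01, inner_steps=3):
    """One meta-update: adapt a fast copy of w per task (inner loop),
    then move w using the first-order meta-gradient (outer loop)."""
    meta_grad = 0.0
    for slope in tasks:
        w_fast = w
        for _ in range(inner_steps):          # inner-loop adaptation
            w_fast -= inner_lr * loss_grad(w_fast, slope)
        meta_grad += loss_grad(w_fast, slope)  # first-order approximation
    return w - outer_lr * meta_grad / len(tasks)

random.seed(0)
w = 0.0
for _ in range(200):
    tasks = [random.uniform(0.5, 1.5) for _ in range(4)]  # sample a task batch
    w = maml_step(w, tasks)
print(w)  # meta-learned initialization near the mean task slope (~1.0)
```

The first-order variant drops the second-derivative terms of the full MAML meta-gradient, which keeps the sketch short; the structure (per-task adaptation followed by a shared meta-update) is the same.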