Paper Title

Cyclical Curriculum Learning

Paper Authors

Kesgin, H. Toprak, Amasyali, M. Fatih

Abstract

Artificial neural networks (ANNs) are inspired by human learning. However, unlike human education, classical ANNs do not use a curriculum. Curriculum Learning (CL) refers to a process of ANN training in which examples are used in a meaningful order. When using CL, training either begins with a subset of the dataset and new samples are added throughout training, or it begins with the entire dataset and the number of samples used is gradually reduced. With these changes in training dataset size, better results can be obtained with curriculum, anti-curriculum, or random-curriculum methods than with the vanilla method. However, no generally efficient CL method has been found for various architectures and datasets. In this paper, we propose Cyclical Curriculum Learning (CCL), in which the data size used during training changes cyclically rather than only increasing or decreasing. Instead of using only the vanilla method or only the curriculum method, using both methods cyclically, as in CCL, provides more successful results. We tested the method on 18 different datasets and 15 architectures in image and text classification tasks and obtained more successful results than with no-CL and existing CL methods. We have also shown theoretically that applying CL and the vanilla method cyclically is less erroneous than using only CL or only the vanilla method. The code of Cyclical Curriculum is available at https://github.com/CyclicalCurriculum/Cyclical-Curriculum.
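The core idea described in the abstract is that the size of the training subset varies cyclically between a fraction of the dataset and the full dataset, rather than monotonically growing or shrinking. The sketch below illustrates one way such a cyclical schedule could look; the triangular-wave shape, the function names, and the ordering of samples by a difficulty score are illustrative assumptions, not the paper's official implementation (see the linked repository for that).

```python
import numpy as np

def cyclical_subset_sizes(num_epochs, n_samples, min_frac=0.2, cycle_len=10):
    """Illustrative cyclical schedule: the subset size oscillates between
    min_frac * n_samples and n_samples following a triangular wave
    (an assumption for illustration, not the paper's exact schedule)."""
    sizes = []
    for epoch in range(num_epochs):
        pos = (epoch % cycle_len) / cycle_len                  # position within the cycle, in [0, 1)
        frac = min_frac + (1.0 - min_frac) * (1.0 - abs(2 * pos - 1))  # triangle wave between min_frac and 1
        sizes.append(int(frac * n_samples))
    return sizes

def select_subset(scores, size):
    """Pick the `size` samples with the lowest difficulty score (easiest first),
    a common curriculum ordering; anti- or random curricula would order differently."""
    return np.argsort(scores)[:size]

# Hypothetical usage with a per-sample difficulty score (e.g., loss under a pretrained model).
n_samples = 50_000
difficulty = np.random.rand(n_samples)                         # placeholder scores for illustration
for epoch, size in enumerate(cyclical_subset_sizes(num_epochs=30, n_samples=n_samples)):
    subset_idx = select_subset(difficulty, size)
    # train_one_epoch(model, dataset[subset_idx])              # training step omitted in this sketch
```

In this sketch, epochs at the start and end of each cycle behave like curriculum training on an easy subset, while epochs in the middle of the cycle use the full dataset as in vanilla training, which is the alternation the abstract argues is less erroneous than either extreme alone.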
