Paper Title


FuCiTNet: Improving the generalization of deep learning networks by the fusion of learned class-inherent transformations

Authors

Manuel Rey-Area, Emilio Guirado, Siham Tabik, Javier Ruiz-Hidalgo

Abstract


It is widely known that very small datasets produce overfitting in Deep Neural Networks (DNNs), i.e., the network becomes highly biased to the data it has been trained on. This issue is often alleviated using transfer learning, regularization techniques and/or data augmentation. This work presents a new approach, independent of but complementary to the previously mentioned techniques, for improving the generalization of DNNs on very small datasets in which the involved classes share many visual features. The proposed methodology, called FuCiTNet (Fusion Class-inherent Transformations Network), inspired by GANs, creates as many generators as classes in the problem. Each generator, $k$, learns the transformations that bring the input image into the k-class domain. We introduce a classification loss in the generators to drive the learning of specific k-class transformations. Our experiments demonstrate that the proposed transformations improve the generalization of the classification model on three diverse datasets.
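The abstract describes each class-$k$ generator being trained with a classification loss that pushes its output toward class $k$. The following is a minimal plain-Python sketch of that training signal only, not the authors' implementation: the reconstruction term, the classification term, and the weighting `lam` are illustrative assumptions, and the real method operates on images with learned networks rather than toy vectors.

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length vectors
    (stand-in for a pixel/feature reconstruction loss)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def softmax(z):
    """Numerically stable softmax over a list of logits."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def generator_loss(gen_out, target, class_logits, k, lam=0.5):
    """Combined loss for generator k, as sketched from the abstract:
    a reconstruction term plus a cross-entropy term that rewards the
    classifier assigning class k to the generator's output."""
    recon = mse(gen_out, target)
    probs = softmax(class_logits)
    ce = -math.log(probs[k] + 1e-12)  # classification loss for class k
    return recon + lam * ce
```

At inference, per the abstract's fusion idea, an image would be passed through all $K$ generators and the classifier's predictions on the transformed versions combined; the exact fusion rule is not specified in the abstract, so it is omitted here.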
