Paper Title

A Hybrid Method for Training Convolutional Neural Networks

Authors

Lopes, Vasco, Fazendeiro, Paulo

Abstract

Artificial Intelligence algorithms have been steadily increasing in popularity and usage. Deep Learning allows neural networks to be trained on huge datasets and removes the need for human-extracted features, as it automates the feature-learning process. At the heart of training deep neural networks, such as Convolutional Neural Networks, lies backpropagation, which, by computing the gradient of the loss function with respect to the network's weights for a given input, allows those weights to be adjusted so the network performs better on the given task. In this paper, we propose a hybrid method that uses both backpropagation and evolutionary strategies to train Convolutional Neural Networks, where the evolutionary strategies are used to help avoid local minima and fine-tune the weights, so that the network achieves higher accuracy. We show that the proposed hybrid method is capable of improving upon regular training on the task of image classification on CIFAR-10, where a VGG16 model was used and the final test results increased by 0.61% on average compared to using backpropagation alone.
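The abstract describes alternating gradient-based updates with evolution-strategy perturbations of the weights. As a minimal sketch of that idea (not the authors' actual implementation, which trains a VGG16 on CIFAR-10), the toy example below runs the two phases on a small least-squares problem: each iteration takes one gradient step, then samples a population of Gaussian weight perturbations and keeps the best candidate only if it lowers the loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic regression task standing in for the network's loss surface.
X = rng.normal(size=(64, 8))
true_w = rng.normal(size=8)
y = X @ true_w

def loss(w):
    return float(np.mean((X @ w - y) ** 2))

def grad(w):
    return 2.0 * X.T @ (X @ w - y) / len(y)

w = rng.normal(size=8)       # initial weights
lr, sigma, pop = 0.05, 0.01, 8

for step in range(200):
    # "Backpropagation" phase: one plain gradient-descent step.
    w = w - lr * grad(w)
    # Evolution-strategy phase: sample perturbed candidates and accept
    # the best one only if it improves the loss (fine-tuning the weights
    # and helping to step out of shallow local minima).
    candidates = [w + sigma * rng.normal(size=w.shape) for _ in range(pop)]
    best = min(candidates, key=loss)
    if loss(best) < loss(w):
        w = best

print(loss(w))
```

The acceptance test in the ES phase makes the loss non-increasing there, so the perturbations can only refine what the gradient steps found; the population size, perturbation scale `sigma`, and alternation schedule are illustrative choices, not values from the paper.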
