Paper Title
Transferring learned patterns from ground-based field imagery to predict UAV-based imagery for crop and weed semantic segmentation in precision crop farming
Paper Authors
Paper Abstract
Weed and crop segmentation is becoming an increasingly integral part of precision farming, leveraging current computer vision and deep learning technologies. Research has been carried out extensively on images captured by cameras mounted on various platforms. Unmanned aerial vehicles (UAVs) and ground-based vehicles, including agricultural robots, are the two most popular platforms for data collection in the field. Both contribute to site-specific weed management (SSWM) aimed at maintaining crop yield. Currently, data from these two platforms are processed separately, even though they share the same semantic objects (weeds and crops). In this paper, we develop a deep convolutional network that can perform weed segmentation and mapping on both field and UAV aerial images while being trained only on field images. The network's learning process is visualized through feature maps at shallow and deep layers. The results show that, on the field dataset, the mean intersection over union (IoU) values of the segmentation for the crop (maize), weeds, and soil background are 0.744, 0.577, and 0.979, respectively; when the same model is applied to aerial images from a UAV, the corresponding IoU values are 0.596, 0.407, and 0.875. To estimate the effect on the use of plant protection agents, we quantify the relationship between the herbicide spraying saving rate and grid size (spraying resolution) based on the predicted weed map. The spraying saving rate reaches up to 90% at a spraying resolution of 1.78 × 1.78 cm². The study shows that the developed deep convolutional neural network can classify weeds in both field and aerial images and delivers satisfactory results.
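The two quantities reported in the abstract, per-class IoU and the spraying saving rate derived from a gridded weed map, can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the grid-cell logic, and the rule "a cell is sprayed only if it contains at least one predicted weed pixel" are assumptions for demonstration.

```python
import numpy as np

def per_class_iou(pred, true, num_classes):
    """Intersection over union for each class label in two label maps.

    pred, true: integer arrays of the same shape (0 = soil, 1 = crop, ...).
    Returns a list of IoU values; NaN for classes absent from both maps.
    """
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, true == c).sum()
        union = np.logical_or(pred == c, true == c).sum()
        ious.append(inter / union if union > 0 else float("nan"))
    return ious

def spraying_saving_rate(weed_mask, grid_px):
    """Fraction of grid cells that need no herbicide.

    weed_mask: boolean array, True where weed is predicted.
    grid_px: grid (spraying-resolution) cell edge length in pixels.
    A cell is sprayed only if it contains any predicted weed pixel
    (an illustrative assumption about the spraying rule).
    """
    h, w = weed_mask.shape
    total_cells = 0
    sprayed_cells = 0
    for y in range(0, h, grid_px):
        for x in range(0, w, grid_px):
            total_cells += 1
            if weed_mask[y:y + grid_px, x:x + grid_px].any():
                sprayed_cells += 1
    return 1.0 - sprayed_cells / total_cells

# Toy example: 2x2 label maps (0 = soil, 1 = crop, 2 = weed).
pred = np.array([[0, 1], [2, 0]])
true = np.array([[0, 1], [0, 0]])
print(per_class_iou(pred, true, 3))

# Toy weed map: a single weed pixel in a 4x4 image, 2x2-pixel grid cells.
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True
print(spraying_saving_rate(mask, 2))  # 3 of 4 cells weed-free -> 0.75
```

Evaluating the saving rate at progressively coarser `grid_px` values reproduces the kind of saving-rate-versus-resolution curve the abstract describes: finer grids isolate weed patches better and thus save more herbicide.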