Paper Title

Embedding Earth: Self-supervised contrastive pre-training for dense land cover classification

Paper Authors

Michail Tarasiou, Stefanos Zafeiriou

Abstract

In training machine learning models for land cover semantic segmentation, there is a stark contrast between the availability of satellite imagery to be used as inputs and ground truth data to enable supervised learning. While thousands of new satellite images become freely available on a daily basis, obtaining ground truth data is still very challenging, time-consuming and costly. In this paper we present Embedding Earth, a self-supervised contrastive pre-training method for leveraging the large availability of satellite imagery to improve performance on downstream dense land cover classification tasks. Performing an extensive experimental evaluation spanning four countries and two continents, we use models pre-trained with our proposed method as initialization points for supervised land cover semantic segmentation and observe significant improvements of up to 25% absolute mIoU. In every case tested we outperform random initialization, especially so when ground truth data are scarce. Through a series of ablation studies we explore the qualities of the proposed approach and find that learnt features can generalize between disparate regions, opening up the possibility of using the proposed pre-training scheme as a replacement for random initialization in Earth observation tasks. Code will be uploaded soon at https://github.com/michaeltrs/DeepSatModels.
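The abstract does not spell out the pre-training objective, but contrastive self-supervised methods of this kind are commonly built around an InfoNCE-style loss: embeddings of two augmented views of the same image patch are pulled together while other patches in the batch act as negatives. The following numpy sketch illustrates that general idea; the function name, temperature value, and toy data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """InfoNCE-style contrastive loss between two views' embeddings.

    z1, z2: (N, D) arrays. Row i of z1 and row i of z2 are embeddings of
    the same patch under two augmentations (the positive pair); all other
    rows in the batch serve as negatives.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; minimize their negative log-likelihood
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
anchor = rng.normal(size=(8, 32))
positive = anchor + 0.01 * rng.normal(size=(8, 32))  # near-identical view
unrelated = rng.normal(size=(8, 32))                 # unmatched view
aligned = info_nce_loss(anchor, positive)
misaligned = info_nce_loss(anchor, unrelated)
```

Minimizing this loss drives the encoder to produce features that are stable under augmentation, which is what makes the resulting weights a useful initialization for downstream segmentation: `aligned` comes out much lower than `misaligned` because matched views dominate their softmax rows.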
