Paper Title

Masked Autoencoder for Self-Supervised Pre-training on Lidar Point Clouds

Paper Authors

Georg Hess, Johan Jaxing, Elias Svensson, David Hagerman, Christoffer Petersson, Lennart Svensson

Paper Abstract

Masked autoencoding has become a successful pretraining paradigm for Transformer models for text, images, and, recently, point clouds. Raw automotive datasets are suitable candidates for self-supervised pre-training as they generally are cheap to collect compared to annotations for tasks like 3D object detection (OD). However, the development of masked autoencoders for point clouds has focused solely on synthetic and indoor data. Consequently, existing methods have tailored their representations and models toward small and dense point clouds with homogeneous point densities. In this work, we study masked autoencoding for point clouds in an automotive setting, which are sparse and for which the point density can vary drastically among objects in the same scene. To this end, we propose Voxel-MAE, a simple masked autoencoding pre-training scheme designed for voxel representations. We pre-train the backbone of a Transformer-based 3D object detector to reconstruct masked voxels and to distinguish between empty and non-empty voxels. Our method improves the 3D OD performance by 1.75 mAP points and 1.05 NDS on the challenging nuScenes dataset. Further, we show that by pre-training with Voxel-MAE, we require only 40% of the annotated data to outperform a randomly initialized equivalent. Code available at https://github.com/georghess/voxel-mae
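
To make the pre-training scheme concrete, below is a minimal, hypothetical PyTorch sketch of a Voxel-MAE-style training step: a fraction of the non-empty voxels is masked, only the visible voxel features pass through a Transformer encoder, and a small decoder with learnable mask tokens is trained both to reconstruct the masked voxels and to classify voxels as empty or non-empty. The module sizes, masking ratio, reconstruction target, and class/function names are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
# Hypothetical sketch of a Voxel-MAE-style pre-training step (not the authors' code).
import torch
import torch.nn as nn


class VoxelMAESketch(nn.Module):
    def __init__(self, feat_dim=128, mask_ratio=0.7):
        super().__init__()
        self.mask_ratio = mask_ratio
        # Learnable token that stands in for masked voxels at decode time.
        self.mask_token = nn.Parameter(torch.zeros(1, 1, feat_dim))
        layer = nn.TransformerEncoderLayer(d_model=feat_dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.decoder = nn.TransformerEncoder(layer, num_layers=1)
        # Two heads: reconstruct voxel contents (here, a 3-D point statistic as an
        # illustrative target) and predict an empty / non-empty logit per voxel.
        self.recon_head = nn.Linear(feat_dim, 3)
        self.occupancy_head = nn.Linear(feat_dim, 1)

    def forward(self, voxel_feats, recon_target, occupancy_target):
        # voxel_feats: (B, N, C) voxel features (non-empty voxels plus sampled empty ones).
        B, N, C = voxel_feats.shape
        num_keep = int(N * (1 - self.mask_ratio))

        # Randomly select which voxels stay visible to the encoder.
        noise = torch.rand(B, N, device=voxel_feats.device)
        ids_shuffle = noise.argsort(dim=1)
        ids_keep = ids_shuffle[:, :num_keep]
        visible = torch.gather(voxel_feats, 1, ids_keep.unsqueeze(-1).expand(-1, -1, C))

        # Encode only visible voxels; decode visible tokens together with mask tokens.
        encoded = self.encoder(visible)
        mask_tokens = self.mask_token.expand(B, N - num_keep, -1)
        decoded = self.decoder(torch.cat([encoded, mask_tokens], dim=1))

        # Undo the shuffle so predictions line up with the original voxel order.
        ids_restore = ids_shuffle.argsort(dim=1)
        decoded = torch.gather(decoded, 1, ids_restore.unsqueeze(-1).expand(-1, -1, C))

        # For brevity both losses are computed over all voxels; in practice the
        # reconstruction loss would typically be restricted to the masked ones.
        recon_loss = nn.functional.smooth_l1_loss(self.recon_head(decoded), recon_target)
        occ_loss = nn.functional.binary_cross_entropy_with_logits(
            self.occupancy_head(decoded).squeeze(-1), occupancy_target
        )
        return recon_loss + occ_loss


# Illustrative usage with random tensors standing in for voxelized lidar features.
model = VoxelMAESketch()
feats = torch.randn(2, 200, 128)
loss = model(
    feats,
    recon_target=torch.randn(2, 200, 3),
    occupancy_target=torch.randint(0, 2, (2, 200)).float(),
)
loss.backward()
```

As in image MAE, feeding only the visible voxels to the encoder keeps pre-training cheap even at high masking ratios, while the occupancy head forces the model to reason about where points exist at all, which matters for sparse lidar scenes.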
