Paper Title


GMA3D: Local-Global Attention Learning to Estimate Occluded Motions of Scene Flow

Paper Authors

Zhiyang Lu, Ming Cheng

Paper Abstract


Scene flow represents the motion information of each point in a 3D point cloud. It is vital to many downstream tasks, such as motion segmentation and object tracking. However, there are always occluded points between two consecutive point clouds, whether caused by sparse data sampling or real-world occlusion. In this paper, we focus on addressing occlusion in scene flow by exploiting the semantic self-similarity and motion consistency of moving objects. We propose the GMA3D module, based on the transformer framework, which utilizes local and global semantic similarity to infer the motion information of occluded points from the motion information of local and global non-occluded points respectively, and then uses an offset aggregator to combine them. Ours is the first module to apply a transformer-based architecture to the scene flow occlusion problem on point clouds. Experiments show that GMA3D can solve the occlusion problem in scene flow, especially in real scenes. We evaluated the proposed method on the occluded versions of point cloud datasets and obtained state-of-the-art results on the real-scene KITTI dataset. To verify that GMA3D remains beneficial for non-occluded scene flow, we also conducted experiments on the non-occluded versions of the datasets and achieved promising performance on FlyingThings3D and KITTI. The code is available at https://anonymous.4open.science/r/GMA3D-E100.
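The abstract's core idea, inferring an occluded point's motion from semantically similar non-occluded points via attention, can be sketched in a few lines. This is only an illustrative NumPy sketch of global semantic-attention motion aggregation, not the authors' implementation: the function name `global_motion_aggregation` and the random projection matrices `Wq`, `Wk`, `Wv` (which would be learned in a real network, alongside the local branch and the offset aggregator the paper adds) are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_motion_aggregation(features, motions, seed=0):
    """Illustrative sketch: aggregate a motion hint for every point from
    semantically similar points via transformer-style attention.

    features: (N, C) per-point semantic features
    motions:  (N, D) per-point motion estimates (unreliable where occluded)
    Returns an (N, D) aggregated motion hint per point.
    """
    rng = np.random.default_rng(seed)
    N, C = features.shape
    D = motions.shape[1]
    # Hypothetical projections; a trained model would learn these weights.
    Wq = rng.standard_normal((C, C)) / np.sqrt(C)
    Wk = rng.standard_normal((C, C)) / np.sqrt(C)
    Wv = rng.standard_normal((D, D)) / np.sqrt(D)
    q, k, v = features @ Wq, features @ Wk, motions @ Wv
    # (N, N) semantic-similarity weights: each row sums to 1, so every
    # point (occluded or not) receives a convex combination of motion hints.
    attn = softmax(q @ k.T / np.sqrt(C), axis=-1)
    return attn @ v
```

Because the attention weights come from feature similarity rather than spatial proximity, an occluded point can borrow motion from any visible point on the same object, which is the motion-consistency assumption the abstract relies on.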
