Paper Title


MoCap-less Quantitative Evaluation of Ego-Pose Estimation Without Ground Truth Measurements

Paper Authors

Quentin Possamaï, Steeven Janny, Guillaume Bono, Madiha Nadri, Laurent Bako, Christian Wolf

Paper Abstract


The emergence of data-driven approaches for control and planning in robotics has highlighted the need for experimental robotic platforms dedicated to data collection. However, their implementation is often complex and expensive, in particular for flying and terrestrial robots where precise position estimation requires motion capture devices (MoCap) or Lidar. In order to simplify the use of a robotic platform dedicated to research across a wide range of indoor and outdoor environments, we present a data validation tool for ego-pose estimation that does not require any equipment other than the on-board camera. The method and tool allow a rapid, visual and quantitative evaluation of the quality of ego-pose sensors and are sensitive to different sources of flaws in the acquisition chain, ranging from desynchronization of the sensor streams to misevaluation of the geometric parameters of the robotic platform. Using computer vision, the information from the sensors is used to calculate the motion of a semantic scene point through its projection into the 2D image space of the on-board camera. The deviations of these keypoints from references created with a semi-automatic tool allow rapid and simple quality assessment of the data collected on the platform. To demonstrate the performance of our method, we evaluate it on two challenging standard UAV datasets as well as one dataset taken from a terrestrial robot.
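As a rough sketch of the projection step described in the abstract (not the authors' actual tool), the Python snippet below shows how an ego-pose estimate and camera intrinsics can be used to project a 3D scene point into the 2D image and measure its pixel deviation from a reference keypoint. The function names, the pinhole camera model, and the pose convention (camera pose expressed in the world frame) are assumptions made purely for illustration.

import numpy as np

def project_point(p_world, R_wc, t_wc, K):
    # Transform the 3D point from the world frame into the camera frame,
    # using the ego-pose estimate (R_wc, t_wc) of the camera in the world frame.
    p_cam = R_wc.T @ (p_world - t_wc)
    # Pinhole projection with intrinsics K, then dehomogenize to pixel coords (u, v).
    uv_h = K @ p_cam
    return uv_h[:2] / uv_h[2]

def keypoint_deviations(p_world, poses, K, reference_uv):
    # Pixel-space deviation between the projected keypoint and the reference
    # annotation in each frame; poses is a list of (R_wc, t_wc) estimates and
    # reference_uv is an (N, 2) array of reference keypoint positions.
    projected = np.array([project_point(p_world, R, t, K) for R, t in poses])
    return np.linalg.norm(projected - reference_uv, axis=1)

Under these assumptions, large or drifting deviations would point to problems in the acquisition chain, such as sensor desynchronization or miscalibrated geometric parameters, which is the kind of quick quality check the paper's tool is meant to provide.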
