Paper Title


Multiscale Latent-Guided Entropy Model for LiDAR Point Cloud Compression

Authors

Tingyu Fan, Linyao Gao, Yiling Xu, Dong Wang, Zhu Li

Abstract


The non-uniform distribution and extremely sparse nature of the LiDAR point cloud (LPC) pose significant challenges to its efficient compression. This paper proposes a novel end-to-end, fully-factorized deep framework that encodes the original LPC into an octree structure and hierarchically decomposes the octree entropy model in layers. The proposed framework utilizes a hierarchical latent variable as side information to encapsulate sibling and ancestor dependencies, which provides sufficient context for modelling the point cloud distribution while enabling parallel encoding and decoding of octree nodes in the same layer. In addition, we propose a residual coding framework for the compression of the latent variable, which exploits the spatial correlation of each layer by progressive downsampling and models the corresponding residual with a fully-factorized entropy model. Furthermore, we propose soft addition and subtraction for residual coding to improve network flexibility. Comprehensive experimental results on the LiDAR benchmark SemanticKITTI and the MPEG-specified dataset Ford demonstrate that our proposed framework achieves state-of-the-art performance among all previous LPC frameworks. Experiments also show that our end-to-end, fully-factorized framework is highly parallelized and time-efficient, saving more than 99.8% of decoding time compared to previous state-of-the-art methods for LPC compression.
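The octree serialization that the abstract takes as its input representation can be illustrated with a minimal sketch: each occupied node is split into 8 octants, and the node is summarized by one 8-bit occupancy code whose bit i is set when octant i contains at least one point. The function name, the normalization step, and the sorted emission order below are assumptions for illustration (real codecs emit codes in breadth-first order of occupied parents); this is not the authors' implementation.

```python
import numpy as np

def octree_occupancy_codes(points, depth):
    """Serialize a point cloud into per-layer 8-bit occupancy codes.

    Illustrative sketch: normalization and ordering are simplified
    assumptions, not the paper's actual pipeline.
    """
    # Normalize points into the unit cube [0, 1)^3.
    pts = np.asarray(points, dtype=np.float64)
    lo, hi = pts.min(0), pts.max(0)
    pts = (pts - lo) / max((hi - lo).max(), 1e-12) * (1 - 1e-9)

    # Quantize to integer voxel coordinates at the target depth.
    vox = np.unique((pts * (1 << depth)).astype(np.int64), axis=0)

    layers = []
    for level in range(depth):
        shift = depth - level - 1
        parents = {}
        for x, y, z in vox >> shift:          # coordinates at level+1
            key = (x >> 1, y >> 1, z >> 1)    # parent node at this level
            child = ((x & 1) << 2) | ((y & 1) << 1) | (z & 1)
            # Set the occupancy bit for this child octant.
            parents[key] = parents.get(key, 0) | (1 << child)
        # Simplification: emit codes in sorted-key order per layer.
        layers.append([int(parents[k]) for k in sorted(parents)])
    return layers
```

On top of such a layer-wise code stream, the paper's entropy model assigns each occupancy byte a probability conditioned on the hierarchical latent variable, so that all nodes within one layer can be encoded and decoded in parallel.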
