Point cloud geometry compression based on the combination of interlayer residual and IRN concatenated residual

Bibliographic Details
Main Authors: Meng Huang, Qian Xu, Wenxuan Xu
Format: Article
Language:English
Published: Elsevier 2025-08-01
Series:Graphical Models
Subjects:
Online Access:http://www.sciencedirect.com/science/article/pii/S1524070325000268
Description
Summary:Point clouds have been attracting increasing attention due to their ability to represent objects precisely in applications such as autonomous vehicle navigation, VR/AR, and cultural heritage protection. However, the enormous amount of data carried in point clouds presents significant challenges for transmission and storage. To address this problem, this paper presents a point cloud compression framework based on the combination of an interlayer residual and an IRN concatenated residual. An upsampling stage is deployed after the point cloud data are downsampled, and the residuals between the downsampled and upsampled point cloud data are calculated, which maintains accuracy and reduces the errors introduced by the downsampling process. In addition, a novel Inception ResNet-Concatenated Residual Module is designed to maintain the spatial correlation between layers and blocks while extracting both global and detailed features from the point cloud data. An attention module is further dedicated to enhancing the focus on salient features. Compared with the traditional G-PCC codec and the learning-based point cloud compression method PCGCv2, a series of solid experiments demonstrates BD-Rate gains of 70% to 90% and 6% to 9%, respectively, on the 8iVFB and Owlii datasets.
ISSN:1524-0703
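
As a concrete illustration of the interlayer-residual pipeline described in the summary above, the sketch below downsamples a toy point cloud, upsamples it back, and computes the residual between the original and reconstructed points that such a codec would encode alongside the coarse layer. This is a minimal sketch of the general idea only, not the authors' implementation: the voxel-centroid downsampling, the nearest-neighbor upsampling, and the function names voxel_downsample and nearest_upsample are all illustrative assumptions.

```python
# Minimal sketch of an interlayer residual between a downsampled and an
# upsampled point cloud (hypothetical helpers, not the paper's code).
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one centroid per occupied voxel; also return each point's voxel index."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    centroids = np.zeros((inverse.max() + 1, 3))
    np.add.at(centroids, inverse, points)            # sum the points in each voxel
    counts = np.bincount(inverse).reshape(-1, 1)
    return centroids / counts, inverse

def nearest_upsample(coarse, inverse):
    """Map every original point back to its voxel centroid (crude upsampling)."""
    return coarse[inverse]

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 1.0, size=(2048, 3))       # toy point cloud
coarse, inverse = voxel_downsample(points, 0.1)       # downsampling stage
reconstructed = nearest_upsample(coarse, inverse)     # upsampling stage
residual = points - reconstructed                     # interlayer residual to encode
print(coarse.shape, residual.shape, float(np.abs(residual).max()))
```

In a learned codec along the lines summarized above, the coarse layer and the residual would be fed to feature-extraction and entropy-coding stages rather than printed; the sketch only shows where the interlayer residual comes from.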