Tracking and Registration Technology Based on Panoramic Cameras

Bibliographic details
Main authors: Chao Xu, Guoxu Li, Ye Bai, Yuzhuo Bai, Zheng Cao, Cheng Han
Material type: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Applied Sciences
Subjects:
Links: https://www.mdpi.com/2076-3417/15/13/7397
Description
Abstract: Augmented reality (AR) has become a research focus in computer vision and graphics, with growing applications driven by advances in artificial intelligence and the emergence of the metaverse. Panoramic cameras offer new opportunities for AR due to their wide field of view but also pose significant challenges for camera pose estimation because of severe distortion and complex scene textures. To address these issues, this paper proposes a lightweight, unsupervised deep learning model for panoramic camera pose estimation. The model consists of a depth estimation sub-network and a pose estimation sub-network, both optimized for efficiency using network compression, multi-scale rectangular convolutions, and dilated convolutions. A learnable occlusion mask is incorporated into the pose network to mitigate errors caused by complex relative motion. Furthermore, a panoramic view reconstruction model is constructed to obtain effective supervisory signals from the predicted depth, pose information, and corresponding panoramic images, and it is trained with a specially designed spherical photometric consistency loss. The experimental results demonstrate that the proposed method achieves competitive accuracy while maintaining high computational efficiency, making it well suited for real-time AR applications with panoramic input.
ISSN: 2076-3417
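
The abstract outlines an unsupervised training scheme: a reconstruction model warps one panorama into another view using the predicted depth and relative pose, and a spherical photometric consistency loss, modulated by a learnable occlusion mask, supplies the supervisory signal. The following is a minimal PyTorch sketch of that general idea, not the authors' implementation; the function names, axis conventions, sampling details, and the mask regularization weight are all assumptions.

```python
# Hypothetical sketch of the unsupervised supervisory signal described in the
# abstract: inverse-warp a source panorama into the target view from predicted
# depth and relative pose, then score the reconstruction with a spherically
# weighted photometric loss and a learnable occlusion mask.
import torch
import torch.nn.functional as F


def equirect_rays(h, w, device):
    """Unit ray directions for each pixel of an H x W equirectangular image."""
    lon = (torch.arange(w, device=device) + 0.5) / w * 2 * torch.pi - torch.pi
    lat = torch.pi / 2 - (torch.arange(h, device=device) + 0.5) / h * torch.pi
    lat, lon = torch.meshgrid(lat, lon, indexing="ij")
    x = torch.cos(lat) * torch.sin(lon)
    y = torch.sin(lat)                         # assumed axis convention
    z = torch.cos(lat) * torch.cos(lon)
    return torch.stack([x, y, z], dim=0)       # (3, H, W)


def reconstruct_target(src_img, tgt_depth, R, t):
    """Inverse-warp the source panorama into the target view.

    src_img:   (B, 3, H, W) source panorama
    tgt_depth: (B, 1, H, W) depth predicted for the target view
    R, t:      (B, 3, 3) and (B, 3) relative pose, target -> source
    """
    b, _, h, w = src_img.shape
    rays = equirect_rays(h, w, src_img.device)             # (3, H, W)
    pts = (tgt_depth * rays).flatten(2)                    # (B, 3, H*W) 3D points
    pts = R @ pts + t.unsqueeze(-1)                        # move into source frame
    x, y, z = pts.unbind(1)
    lon = torch.atan2(x, z)                                # back to sphere angles
    lat = torch.asin((y / pts.norm(dim=1).clamp(min=1e-6)).clamp(-1.0, 1.0))
    # Normalize angles to the [-1, 1] grid expected by grid_sample. Border
    # padding ignores the horizontal wrap-around of equirectangular images;
    # a real implementation would pad circularly.
    grid = torch.stack([lon / torch.pi, -lat / (torch.pi / 2)], dim=-1)
    return F.grid_sample(src_img, grid.view(b, h, w, 2),
                         align_corners=False, padding_mode="border")


def spherical_photometric_loss(tgt_img, recon, mask_logits):
    """Photometric consistency weighted for spherical geometry.

    Rows near the poles cover less solid angle on an equirectangular grid, so
    each row is weighted by cos(latitude). A learnable occlusion mask (here the
    sigmoid of logits, which the paper attributes to the pose network)
    down-weights pixels that violate the static-scene assumption.
    """
    b, _, h, w = tgt_img.shape
    lat = torch.pi / 2 - (torch.arange(h, device=tgt_img.device) + 0.5) / h * torch.pi
    area = torch.cos(lat).view(1, 1, h, 1)
    mask = torch.sigmoid(mask_logits)                      # (B, 1, H, W)
    photo = (tgt_img - recon).abs().mean(dim=1, keepdim=True)
    # Regularize the mask toward 1 so it cannot trivially zero the loss;
    # the 0.1 weight is an arbitrary placeholder.
    return (area * mask * photo).mean() + 0.1 * (1 - mask).mean()
```

The cos(latitude) row weighting is one plausible way to make a photometric loss "spherical": it compensates for the oversampling of polar regions on an equirectangular grid so that all viewing directions contribute equally. The paper's exact loss formulation may differ.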