Robust Drone Video Analysis for Occluded Urban Traffic Monitoring Based on Deep Learning

Bibliographic Details
Main Authors: Carlos Gellida-Coutino, Reyes Rios-Cabrera, Alan Maldonado-Ramirez, Anand Sanchez-Orta
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11048557/
Description
Summary: Urban traffic management (UTM) relies on accurate vehicular flow data to optimize infrastructure and reduce congestion. However, existing video-based methods struggle with occlusions and complex trajectories at multi-directional intersections, limiting their applicability. This paper presents a novel Speed Calculation and Trajectory Observation Algorithm (SCTOA) for robust vehicle counting, classification, and speed estimation in visually obstructed urban environments (e.g., tunnels, overpasses). Our approach integrates drone-captured aerial footage with YOLOv8 detection, ByteTrack tracking, and LoFTR-based motion compensation. To address occlusion challenges, we propose strategically placed Observation Areas (OAs) at points of traffic bifurcation, improving overall performance. By mapping vehicle trajectories between OAs, we achieve up to 96.73% vehicle-counting accuracy in unobstructed zones and 87.8% in complex scenarios, outperforming prior single-direction methods. Experimental validation on 27 hours of drone footage from high-traffic intersections demonstrates the efficacy of the method for multi-directional flow analysis. The results provide precise input for traffic simulators (e.g., PTV-Vissim), supporting data-driven UTM decisions while minimizing costly real-world experimentation.
ISSN:2169-3536
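
Note: The abstract describes a detect-track-count pipeline (YOLOv8 detection, ByteTrack tracking, trajectory mapping between Observation Areas). The sketch below illustrates that general pattern using the public ultralytics YOLOv8 API with its built-in ByteTrack tracker. The OA polygons, video path, and counting rule here are illustrative assumptions, not the paper's SCTOA; the paper's specifics (OA placement at bifurcation points, LoFTR-based motion compensation of the drone footage) are not reproduced.

# Minimal sketch of OA-based counting over tracked detections.
# Assumptions: OA polygons, video filename, and the two-OA counting
# rule are hypothetical; LoFTR motion compensation is omitted.
from collections import defaultdict

from shapely.geometry import Point, Polygon  # assumed helper for OA membership tests
from ultralytics import YOLO

# Hypothetical OAs placed at traffic bifurcation points, given in pixel
# coordinates of the (ideally motion-compensated) aerial frame.
OBSERVATION_AREAS = {
    "north_exit": Polygon([(100, 50), (300, 50), (300, 150), (100, 150)]),
    "east_exit": Polygon([(600, 300), (800, 300), (800, 500), (600, 500)]),
}

model = YOLO("yolov8n.pt")  # any YOLOv8 detection weights
visited = defaultdict(set)  # track_id -> set of OA names the track entered

# stream=True yields one Results object per frame; ByteTrack assigns
# persistent integer IDs to confirmed tracks across frames.
for result in model.track(source="drone_clip.mp4",
                          tracker="bytetrack.yaml", stream=True):
    if result.boxes.id is None:  # no confirmed tracks in this frame
        continue
    ids = result.boxes.id.int().tolist()
    boxes = result.boxes.xyxy.cpu().numpy()
    for track_id, (x1, y1, x2, y2) in zip(ids, boxes):
        center = Point((x1 + x2) / 2, (y1 + y2) / 2)
        for oa_name, oa_poly in OBSERVATION_AREAS.items():
            if oa_poly.contains(center):
                visited[track_id].add(oa_name)

# A trajectory that touched at least two OAs defines one counted
# movement between bifurcation points (e.g., entry OA -> exit OA).
for track_id, oas in visited.items():
    if len(oas) >= 2:
        print(f"track {track_id} traversed: {sorted(oas)}")

Mapping each track to the set of OAs it crosses, rather than counting raw detections, is what makes the count robust to mid-trajectory occlusion: a vehicle hidden under an overpass is still counted once it reappears inside a downstream OA with the same track ID.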