Improving Railway Track Detection With a Mixed-Modality Deep Learning Approach
Main Authors: | , , , |
Format: | Article |
Language: | English |
Published: | IEEE, 2025-01-01 |
Series: | IEEE Access |
Online Access: | https://ieeexplore.ieee.org/document/11098944/ |
Summary: | Artificial intelligence (AI) integration has become crucial in augmenting safety systems at train stations and along railway lines. AI plays a key role in identifying and monitoring individuals, detecting hazardous items, classifying unusual movements, and recognizing obstructions on railway lines; its capacity to analyze data swiftly and precisely enables systems to react promptly to possible dangers. The present study introduces a railway track identification system that uses semantic segmentation and deep learning models to distinguish between objects and persons in close proximity to railway zones. This work combines ResNet-50 with five segmentation models: SegNet, U-Net, FCN-8, FCN-16, and FCN-32. Refinements to the models’ underlying architecture are intended to enhance performance, and a dataset of 5,000 images collected from the internet is used for testing. The paper compares traditional and improved model architectures, including precise learning-rate tuning to maximize accuracy. Empirical findings demonstrate that incorporating ResNet-50 substantially enhances accuracy and precision: the FCN-16 + ResNet-50 model achieved a mean accuracy of 98.72%. Despite their larger parameter counts, the improved models also process images faster across the board. |
ISSN: | 2169-3536 |
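The abstract's best-performing variant pairs a ResNet-50 backbone with an FCN-16 head. As a rough illustration of the general FCN-16 idea only (not the authors' code, whose details are not given in this record), the characteristic skip fusion can be sketched in NumPy, with nearest-neighbour upsampling standing in for the learned transposed convolutions a real model would use:

```python
import numpy as np

def upsample(x, factor):
    """Nearest-neighbour upsampling of an (H, W, C) score map.
    A stand-in for a learned transposed convolution."""
    return x.repeat(factor, axis=0).repeat(factor, axis=1)

def fcn16_fuse(stride32_scores, stride16_scores):
    """FCN-16-style skip fusion: upsample the coarsest (stride-32)
    per-class scores by 2x, add the stride-16 skip scores, then
    upsample the fused map by 16x to full image resolution."""
    fused = upsample(stride32_scores, 2) + stride16_scores
    return upsample(fused, 16)

# Toy example: 2 classes on a 64x64 image, so the stride-32 map is
# 2x2 and the stride-16 map is 4x4 (shapes are illustrative only).
coarse = np.zeros((2, 2, 2))
skip = np.zeros((4, 4, 2))
full = fcn16_fuse(coarse, skip)
print(full.shape)  # (64, 64, 2): one score map per class at image size
```

In an actual implementation the two score maps would come from 1x1 convolutions over the ResNet-50 stage outputs, and the final map would be argmaxed per pixel to produce the track/background segmentation.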