YOLOv8 with Post-Processing for Small Object Detection Enhancement
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/13/7275
Summary: Small-object detection in images, a core task in unstructured big-data analysis, remains challenging due to low resolution, background noise, and occlusion. Despite advancements in object detection models such as You Only Look Once (YOLO) v8 and EfficientDet, small-object detection still faces limitations. This study proposes an enhanced approach that combines the content-aware reassembly of features (CARAFE) upsampling module with a confidence-based re-detection (CR) technique, integrated into the YOLOv8n model, to address these challenges. The CARAFE module is applied to the neck architecture of YOLOv8n to minimize information loss and enhance feature restoration by adaptively generating upsampling kernels based on the input feature map. The CR process then crops the bounding boxes of small objects with low confidence scores from the original image and re-detects them using the YOLOv8n-CARAFE model to improve detection performance. Experimental results demonstrate that the proposed approach significantly outperforms the baseline YOLOv8n model in detecting small objects. These findings highlight the effectiveness of combining advanced upsampling and post-processing techniques for improved small-object detection. The proposed method holds promise for practical applications, including surveillance systems, autonomous driving, and medical image analysis.
ISSN: 2076-3417
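The confidence-based re-detection (CR) step described in the summary can be illustrated with a short sketch. The snippet below is a minimal interpretation, assuming the Ultralytics YOLO Python API; the thresholds (CONF_LOW, SMALL_AREA), the crop margin, the weights file name "yolov8n-carafe.pt", and the box-merging policy are not specified in this record and are illustrative assumptions, not the authors' exact procedure.

```python
# Hedged sketch of confidence-based re-detection (CR): crop low-confidence
# small-object boxes and run the detector again on the enlarged crops.
# Thresholds, margin, and the weights file name are illustrative assumptions.
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n-carafe.pt")  # assumed: YOLOv8n with the CARAFE neck

CONF_LOW = 0.30       # assumed low-confidence cut-off that triggers re-detection
SMALL_AREA = 32 * 32  # assumed "small object" area threshold (COCO convention)
MARGIN = 16           # assumed context margin around each crop, in pixels

def detect_with_cr(image_path):
    img = cv2.imread(image_path)
    h, w = img.shape[:2]
    first = model(img)[0]  # initial pass on the full image
    detections = []

    for box, conf, cls in zip(first.boxes.xyxy.tolist(),
                              first.boxes.conf.tolist(),
                              first.boxes.cls.tolist()):
        x1, y1, x2, y2 = map(int, box)
        area = (x2 - x1) * (y2 - y1)
        if conf >= CONF_LOW or area > SMALL_AREA:
            detections.append((box, conf, cls))  # keep confident / large boxes
            continue

        # Low-confidence small object: crop it (with margin) and re-detect.
        cx1, cy1 = max(0, x1 - MARGIN), max(0, y1 - MARGIN)
        cx2, cy2 = min(w, x2 + MARGIN), min(h, y2 + MARGIN)
        crop = img[cy1:cy2, cx1:cx2]
        second = model(crop)[0]

        if len(second.boxes) == 0:
            detections.append((box, conf, cls))  # fall back to the original box
            continue

        # Map the best re-detected box back to full-image coordinates.
        best = int(second.boxes.conf.argmax())
        bx1, by1, bx2, by2 = second.boxes.xyxy[best].tolist()
        detections.append(([bx1 + cx1, by1 + cy1, bx2 + cx1, by2 + cy1],
                           float(second.boxes.conf[best]),
                           float(second.boxes.cls[best])))
    return detections
```

Because the record does not describe how boxes from the two passes are merged, this sketch simply replaces each low-confidence box with the highest-confidence re-detection from its crop; the published method may use a different reconciliation strategy.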