Lightweight Object Detector Based on Images Captured Using Unmanned Aerial Vehicle
Main Authors:
Format: Article
Language: English
Published: MDPI AG, 2025-07-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/13/7482
Summary: This study investigates the flight endurance problems that unmanned aerial vehicles (UAVs) face when carrying out filming tasks, the relatively limited computational resources of the mini platforms carried by UAVs, and the need for fast decision making and response when processing image data in real time. An improved Yolov8s-CFS model based on Yolov8s is proposed to address the need for a lightweight solution when UAVs perform filming tasks. First, the Bottleneck in C2f is replaced by the FasterNet Block to lighten the model overall; second, to counteract the accuracy degradation caused by excessive lightweighting, the self-weight coordinate attention (SWCA) mechanism is introduced into the C2f-Faster module connected to each detection head, yielding the C2f-Faster-SWCA module. The experimental results show that the number of parameters in the Yolov8s-CFS model is 17.4% lower than that of the baseline on the VisDrone2019 dataset, while its average accuracy remains at 40.1%. In summary, Yolov8s-CFS reduces the parameter count and model complexity while preserving accuracy, facilitating its deployment on mobile platforms.
ISSN: 2076-3417
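The record does not include the paper's implementation, but the two building blocks named in the summary are well documented in the literature. Below is a minimal PyTorch sketch of a FasterNet-style block (partial convolution followed by a pointwise MLP), the unit that replaces the Bottleneck in C2f, together with plain coordinate attention as a stand-in for SWCA, whose self-weighting detail the record does not specify. Class names, the `n_div` split ratio, and the `reduction` factor are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn


class PConv(nn.Module):
    """Partial convolution from FasterNet: a 3x3 conv is applied to only a
    fraction (1/n_div) of the channels; the rest pass through unchanged."""
    def __init__(self, dim: int, n_div: int = 4):
        super().__init__()
        self.dim_conv = dim // n_div          # channels that get convolved
        self.dim_keep = dim - self.dim_conv   # channels passed through
        self.conv = nn.Conv2d(self.dim_conv, self.dim_conv, 3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x1, x2 = torch.split(x, [self.dim_conv, self.dim_keep], dim=1)
        return torch.cat((self.conv(x1), x2), dim=1)


class FasterNetBlock(nn.Module):
    """FasterNet block: PConv followed by a two-layer pointwise MLP with a
    residual connection; a sketch of the unit that replaces the Bottleneck."""
    def __init__(self, dim: int, expansion: int = 2):
        super().__init__()
        hidden = dim * expansion
        self.pconv = PConv(dim)
        self.mlp = nn.Sequential(
            nn.Conv2d(dim, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, dim, 1, bias=False),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.mlp(self.pconv(x))


class CoordinateAttention(nn.Module):
    """Plain coordinate attention (Hou et al., 2021), used here only as a
    stand-in for the paper's SWCA module: direction-aware pooling along H
    and W produces two attention maps that rescale the feature map."""
    def __init__(self, dim: int, reduction: int = 32):
        super().__init__()
        mid = max(8, dim // reduction)
        self.shared = nn.Sequential(
            nn.Conv2d(dim, mid, 1, bias=False),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
        )
        self.attn_h = nn.Conv2d(mid, dim, 1)
        self.attn_w = nn.Conv2d(mid, dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, _, h, w = x.shape
        pool_h = x.mean(dim=3, keepdim=True)                       # (B, C, H, 1)
        pool_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)   # (B, C, W, 1)
        y = self.shared(torch.cat((pool_h, pool_w), dim=2))        # shared 1x1 conv
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.attn_h(y_h))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.attn_w(y_w.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        return x * a_h * a_w


if __name__ == "__main__":
    x = torch.randn(1, 64, 80, 80)  # a typical P3-level feature map
    block = nn.Sequential(FasterNetBlock(64), CoordinateAttention(64))
    print(block(x).shape)           # torch.Size([1, 64, 80, 80])
```

The lightweighting effect comes from PConv touching only a quarter of the channels with the 3x3 kernel, while the attention stage restores spatially aware channel weighting at negligible parameter cost; the summary places this attention only in the C2f-Faster modules feeding the detection heads.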