Deep Learning Enabled Garbage Classification and Detection by Visual Context for Aerial Images
Main Authors: , , , , ,
Format: Article
Language: English
Published: Wiley, 2025-01-01
Series: Applied Computational Intelligence and Soft Computing
Online Access: http://dx.doi.org/10.1155/acis/9106130
Summary: Environmental pollution caused by garbage is a significant problem in most developing countries. Proper garbage waste processing, management, and recycling are crucial for both ecological and economic reasons. Computer vision techniques have shown advanced capabilities in various applications, including object detection and classification. In this study, we conducted an extensive review of the use of artificial intelligence for garbage processing and management. A major limitation in this field, however, is the lack of datasets containing top-view images of garbage. We introduce a new dataset named “KACHARA,” containing 4727 images categorized into seven classes: clothes, decomposable (organic waste), glass, metal, paper, plastic, and wood. Importantly, the dataset exhibits a moderate class imbalance, mirroring the distribution of real-world garbage, which is crucial for training accurate classification models. For classification, we apply transfer learning with the well-known deep learning model MobileNetV3-Large, fine-tuning the top layers to enhance performance. We achieved a classification accuracy of 94.37% and also evaluated performance using precision, recall, F1-score, and a confusion matrix. These results demonstrate the model’s strong generalization in aerial/top-view garbage classification.
ISSN: 1687-9732
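The transfer-learning setup described in the summary — a pretrained MobileNetV3-Large backbone with a newly trained classification head over the seven KACHARA classes — can be sketched roughly as follows in Keras. This is a minimal illustration, not the authors' actual training code; the input size, dropout rate, optimizer, and learning rate are assumptions.

```python
import tensorflow as tf

# Class labels from the KACHARA dataset described in the summary.
CLASSES = ["clothes", "decomposable", "glass", "metal", "paper", "plastic", "wood"]

def build_classifier(weights="imagenet"):
    """MobileNetV3-Large backbone with a fresh trainable classification head.

    The backbone is frozen so that, initially, only the new top layers are
    trained (the paper reports fine-tuning the top layers); all
    hyperparameters here are illustrative assumptions.
    """
    base = tf.keras.applications.MobileNetV3Large(
        input_shape=(224, 224, 3),  # assumed input resolution
        include_top=False,
        weights=weights,
        pooling="avg",
    )
    base.trainable = False  # freeze pretrained ImageNet features

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.Dropout(0.2),  # assumed regularization
        tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-3),  # assumed learning rate
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

After training the head converges, one would typically unfreeze the last few backbone blocks and continue training at a lower learning rate to fine-tune the top layers, as the summary indicates.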