Improving Cell Detection and Tracking in Microscopy Images Using YOLO and an Enhanced DeepSORT Algorithm


Bibliographic Details
Main Authors: Mokhaled N. A. Al-Hamadani, Richard Poroszlay, Gabor Szeman-Nagy, Andras Hajdu, Stathis Hadjidemetriou, Luca Ferrarini, Balazs Harangi
Format: Article
Language:English
Published: MDPI AG 2025-07-01
Series:Sensors
Subjects:
Online Access:https://www.mdpi.com/1424-8220/25/14/4361
Description
Summary: Accurate and automated detection and tracking of cells in microscopy images are a persistent challenge in biotechnology and biomedical research. Effective detection and tracking are crucial for understanding biological processes and extracting meaningful data for subsequent simulations. In this study, we present an integrated pipeline that leverages a fine-tuned YOLOv8x model for detecting cells and cell divisions across microscopy image series. While YOLOv8x exhibits strong detection capabilities, it occasionally misses certain cells, leading to gaps in data. To mitigate this, we incorporate the DeepSORT tracking algorithm, which enhances data association and reduces cell identity (ID) switches by utilizing a pre-trained convolutional network for robust multi-object tracking. This combination ensures continuous detection and compensates for missed detections, thereby improving overall recall. Our approach achieves a recall of 93.21% with the enhanced DeepSORT algorithm, compared to the 53.47% recall obtained by the original YOLOv8x model. The proposed pipeline effectively extracts detailed information from structured image datasets, providing a reliable approximation of cellular processes in culture environments.
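The frame-to-frame data association at the heart of SORT/DeepSORT-style trackers can be illustrated with a minimal greedy IoU matcher. This is a hypothetical sketch, not the authors' implementation: the actual DeepSORT additionally uses Kalman-filtered motion prediction, appearance embeddings from a pre-trained CNN, and Hungarian-algorithm matching.

```python
# Illustrative sketch: greedy IoU-based association of existing tracks
# with a new frame's detections, the basic idea DeepSORT refines.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def associate(tracks, detections, iou_threshold=0.3):
    """Greedily match tracks to detections by descending IoU.
    Returns (matches, unmatched_track_ids, unmatched_detection_ids)."""
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True,
    )
    matched_t, matched_d, matches = set(), set(), []
    for score, ti, di in pairs:
        if score < iou_threshold:
            break  # remaining pairs overlap too little to match
        if ti in matched_t or di in matched_d:
            continue  # track or detection already claimed
        matches.append((ti, di))
        matched_t.add(ti)
        matched_d.add(di)
    unmatched_t = [i for i in range(len(tracks)) if i not in matched_t]
    unmatched_d = [i for i in range(len(detections)) if i not in matched_d]
    return matches, unmatched_t, unmatched_d
```

Unmatched detections would start new tracks, while unmatched tracks are typically kept alive for a few frames. The latter is how a tracker can bridge frames where the detector misses a cell, which is the mechanism behind the recall improvement the abstract reports.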
ISSN:1424-8220