Finger drawing on smartphone screens enables early Parkinson's disease detection through hybrid 1D-CNN and BiGRU deep learning architecture.

Bibliographic Details
Main Authors: Zhaohui Zhu, E Wu, Pengfei Leng, Jiajun Sun, Mingming Ma, Zhigeng Pan
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0327733
Description
Summary:
Background: Parkinson's disease (PD), a progressive neurodegenerative disorder prevalent in aging populations, manifests clinically through characteristic motor impairments including bradykinesia, rigidity, and resting tremor. Early detection and timely intervention may delay disease progression. Spiral drawing tasks have been established as effective auxiliary diagnostic tools. This study developed a hybrid deep learning model to analyze motion data from finger drawings of spiral and wave lines on smartphone screens, aiming to detect early Parkinson's disease.
Methods: We recruited 58 age-matched participants (28 early idiopathic PD patients: 68.4 ± 5.7 years; 30 healthy controls: 68.0 ± 4.5 years) for two smartphone-based drawing tasks (spiral and wave). A custom-developed app recorded finger touch coordinates, instantaneous movement speed, and timestamps at a sampling frequency of 60 Hz. Our hybrid model combined multi-scale convolutional feature extraction (using parallel 1D convolutional branches) with bidirectional temporal pattern recognition (via gated recurrent unit [GRU] networks) to analyze movement abnormalities and detect the disease.
Results: The proposed model demonstrated robust diagnostic performance, achieving a cross-validation accuracy of 87.93% for spiral drawings (89.64% sensitivity, 86.33% specificity). Wave drawings yielded 87.24% accuracy (86.79% sensitivity, 87.67% specificity). The integration of both tasks achieved 91.20% accuracy (95% CI: 89.2%-93.2%) with balanced sensitivity (91.43%) and specificity (91.00%).
Conclusion: This study establishes the technical feasibility of a hybrid deep learning framework for early PD detection using smartphone-captured finger motion dynamics. The developed model effectively combines one-dimensional convolutional neural networks with bidirectional GRUs to analyze drawing tasks. Distinct from existing approaches that rely on clinical rating scales, neuroimaging modalities, or stylus-based digital assessments, this telemedicine-compatible method requires only bare-finger interactions on consumer-grade smartphones and enables operator-independent assessments. Furthermore, it facilitates cost-effective and convenient PD assessment in remote healthcare and patient monitoring, particularly in resource-limited settings.
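Illustrative sketch (not from the article): the abstract describes parallel 1D convolutional branches for multi-scale feature extraction followed by a bidirectional GRU over 60 Hz finger-motion sequences. The PyTorch code below shows one plausible arrangement of such a hybrid; the kernel sizes, channel widths, hidden size, and the four-channel input (x, y, speed, time step) are assumptions for illustration, not the authors' published configuration.

```python
# Hypothetical hybrid 1D-CNN + BiGRU classifier of the kind described in the abstract.
# All hyperparameters here are assumptions, not the authors' reported settings.
import torch
import torch.nn as nn


class HybridCNNBiGRU(nn.Module):
    def __init__(self, in_channels: int = 4, hidden_size: int = 64, num_classes: int = 2):
        super().__init__()
        # Parallel 1D convolutional branches with different kernel sizes approximate
        # "multi-scale" feature extraction over the touch sequence.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_channels, 32, kernel_size=k, padding=k // 2),
                nn.BatchNorm1d(32),
                nn.ReLU(),
            )
            for k in (3, 5, 7)
        ])
        # Bidirectional GRU captures temporal patterns in both directions.
        self.bigru = nn.GRU(
            input_size=32 * 3,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels); Conv1d expects (batch, channels, time)
        x = x.transpose(1, 2)
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        feats = feats.transpose(1, 2)            # back to (batch, time, features)
        _, h_n = self.bigru(feats)               # h_n: (2, batch, hidden)
        h = torch.cat([h_n[0], h_n[1]], dim=-1)  # concatenate forward/backward states
        return self.classifier(h)                # class logits (PD vs. control)


if __name__ == "__main__":
    # A 10-second drawing sampled at 60 Hz gives ~600 steps of (x, y, speed, dt).
    dummy = torch.randn(8, 600, 4)
    logits = HybridCNNBiGRU()(dummy)
    print(logits.shape)  # torch.Size([8, 2])
```

In this sketch the branch outputs are concatenated per time step before the recurrent layer, which is one common way to fuse multi-scale convolutional features with a BiGRU; the paper may fuse them differently.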
ISSN:1932-6203