Development of an explainable machine learning model for Alzheimer’s disease prediction using clinical and behavioural features


Bibliographic Details
Main Authors: Rajkumar Govindarajan, K. Thirunadanasikamani, Komal Kumar Napa, S. Sathya, J. Senthil Murugan, K. G. Chandi Priya
Format: Article
Language: English
Published: Elsevier, 2025-12-01
Series: MethodsX
Online Access: http://www.sciencedirect.com/science/article/pii/S221501612500336X
Description
Summary: This article presents a reproducible machine learning methodology for the early prediction of Alzheimer’s disease (AD) using clinical and behavioural data. A comparative analysis of multiple classification algorithms was conducted, with the Gradient Boosting classifier yielding the best performance (accuracy: 93.9%, F1-score: 91.8%). To improve interpretability, SHapley Additive exPlanations (SHAP) were integrated into the workflow to quantify feature contributions at both global and individual levels. Key predictive variables, such as Mini-Mental State Examination (MMSE), Activities of Daily Living (ADL), cholesterol levels, and functional assessment scores, were identified and visualized using SHAP-based insights. A user-friendly, interactive web application was developed using Streamlit, allowing real-time patient data input and transparent model output visualization. This method offers a practical tool for clinicians and researchers to support early diagnosis and personalized risk assessment of AD, aiding timely and informed clinical decision-making.

Highlights:
- Accurate prediction: the Gradient Boosting model achieved 93.9% accuracy for early Alzheimer’s detection.
- Explainability: SHAP values provided interpretable insights into key clinical features.
- Clinical tool: a Streamlit-based web app enabled real-time, explainable predictions for users.
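The pipeline the summary describes (a gradient-boosted classifier over clinical and behavioural features, with feature attribution for interpretability) can be sketched roughly as below. This is not the authors' code: the data are synthetic, the feature names merely mirror the predictors named in the abstract, and scikit-learn's built-in impurity-based importances stand in for the SHAP attributions used in the article.

```python
# Minimal sketch of a gradient-boosting workflow on synthetic "clinical" data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
n = 600
# Synthetic stand-ins for the predictors named in the abstract.
mmse = rng.normal(24, 4, n)    # Mini-Mental State Examination score
adl = rng.normal(6, 2, n)      # Activities of Daily Living score
chol = rng.normal(200, 30, n)  # cholesterol level (mg/dL)
func = rng.normal(7, 2, n)     # functional assessment score
X = np.column_stack([mmse, adl, chol, func])

# Toy label rule: lower MMSE/ADL and higher cholesterol raise AD risk, plus noise.
logit = (-0.4 * (mmse - 24) - 0.3 * (adl - 6)
         + 0.02 * (chol - 200) - 0.2 * (func - 7))
y = (logit + rng.normal(0, 1, n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

print(f"accuracy: {accuracy_score(y_te, pred):.3f}, "
      f"F1: {f1_score(y_te, pred):.3f}")
# Global importances (a stand-in here for the article's SHAP summary plot).
for name, imp in zip(["MMSE", "ADL", "cholesterol", "functional"],
                     model.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

In the article's actual workflow, SHAP values would replace the last loop, giving per-patient as well as global attributions, and the trained model would sit behind a Streamlit form that accepts patient inputs and displays the explained prediction.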
ISSN:2215-0161