Intelligent System Using Data to Support Decision-Making

Bibliographic Details
Main Authors: Viera Anderková, František Babič, Zuzana Paraličová, Daniela Javorská
Format: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/14/7724
Description
Summary: Interest in explainable machine learning has grown, particularly in healthcare, where transparency and trust are essential. We developed a semi-automated evaluation framework within a clinical decision support system (CDSS-EQCM) that integrates LIME and SHAP explanations with multi-criteria decision-making (TOPSIS and Borda count) to rank model interpretability. After two-phase preprocessing of 2934 COVID-19 patient records spanning four epidemic waves, we applied five classifiers (Random Forest, Decision Tree, Logistic Regression, k-NN, SVM). Five infectious disease physicians used a Streamlit interface to generate patient-specific explanations and rate models on accuracy, separability, stability, response time, understandability, and user experience. Random Forest combined with SHAP consistently achieved the highest rankings in Borda count. Clinicians reported reduced evaluation time, enhanced explanation clarity, and increased confidence in model outputs. These results demonstrate that CDSS-EQCM can effectively streamline interpretability assessment and support clinician decision-making in medical diagnostics. Future work will focus on deeper electronic medical record integration and interactive parameter tuning to further enhance real-time diagnostic support.
ISSN: 2076-3417
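The summary names two multi-criteria aggregation methods, Borda count and TOPSIS. As a minimal sketch of how such methods rank alternatives, the snippet below implements both in textbook form; the model/explainer names, rankings, criterion values, and equal weights are illustrative assumptions, not data or code from the article.

```python
import math

def borda_count(rankings):
    """Aggregate evaluators' rankings: the i-th of n candidates gets n-1-i points."""
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for i, candidate in enumerate(ranking):
            scores[candidate] = scores.get(candidate, 0) + (n - 1 - i)
    return scores

def topsis(matrix, weights, benefit):
    """TOPSIS closeness coefficients (higher is better).

    matrix[i][j]: score of alternative i on criterion j;
    weights[j]: criterion weight; benefit[j]: True if higher is better.
    """
    n = len(matrix[0])
    # Vector-normalise each criterion column, then apply weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    cols = list(zip(*v))
    # Ideal and anti-ideal points per criterion.
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    d_pos = [math.dist(row, ideal) for row in v]
    d_neg = [math.dist(row, anti) for row in v]
    return [dn / (dp + dn) for dp, dn in zip(d_pos, d_neg)]

# Hypothetical rankings of model+explainer pairs by five evaluators (best first).
rankings = [
    ["RF+SHAP", "RF+LIME", "LR+SHAP"],
    ["RF+SHAP", "LR+SHAP", "RF+LIME"],
    ["RF+LIME", "RF+SHAP", "LR+SHAP"],
    ["RF+SHAP", "RF+LIME", "LR+SHAP"],
    ["RF+SHAP", "LR+SHAP", "RF+LIME"],
]
scores = borda_count(rankings)
winner = max(scores, key=scores.get)

# Hypothetical TOPSIS input: two alternatives scored on
# (understandability, response time in seconds); time is a cost criterion.
cc = topsis([[0.9, 0.2], [0.6, 0.5]], weights=[0.5, 0.5], benefit=[True, False])
```

In this toy example the first alternative dominates on both criteria, so its closeness coefficient is 1.0 and it also wins the Borda aggregation.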