Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction

Bibliographic Details
Main Authors: Maxim Majlatow, Fahim Ahmed Shakil, Andreas Emrich, Nijat Mehdiyev
Format: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/14/7925
_version_ 1839616599367090176
author Maxim Majlatow
Fahim Ahmed Shakil
Andreas Emrich
Nijat Mehdiyev
author_facet Maxim Majlatow
Fahim Ahmed Shakil
Andreas Emrich
Nijat Mehdiyev
author_sort Maxim Majlatow
collection DOAJ
description In high-stakes decision-making environments, predictive models must deliver not only high accuracy but also reliable uncertainty estimates and transparent explanations. This study explores the integration of probability calibration techniques with Conformal Prediction (CP) within a predictive process monitoring (PPM) framework tailored to healthcare analytics. CP is renowned for its distribution-free prediction regions and formal coverage guarantees under minimal assumptions; however, its practical utility depends critically on well-calibrated probability estimates. We compare a range of post-hoc calibration methods (parametric approaches such as Platt scaling and Beta calibration, and non-parametric techniques such as isotonic regression and spline calibration) to assess their impact on aligning raw model outputs with observed outcomes. By incorporating these calibrated probabilities into the CP framework, our multilayer analysis evaluates improvements in prediction region validity, including tighter coverage gaps and reduced minority error contributions. Furthermore, we employ SHAP-based explainability to examine how calibration influences feature attribution for both high-confidence and ambiguous predictions. Experimental results on process-driven healthcare data indicate that integrating calibration with CP not only enhances the statistical robustness of uncertainty estimates but also improves the interpretability of predictions, thereby supporting safer and more robust clinical decision-making.
format Article
id doaj-art-c834e77a92934c419c18c34635e3d38c
institution Matheson Library
issn 2076-3417
language English
publishDate 2025-07-01
publisher MDPI AG
record_format Article
series Applied Sciences
spelling doaj-art-c834e77a92934c419c18c34635e3d38c2025-07-25T13:12:44ZengMDPI AGApplied Sciences2076-34172025-07-011514792510.3390/app15147925Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal PredictionMaxim Majlatow0Fahim Ahmed Shakil1Andreas Emrich2Nijat Mehdiyev3German Research Center for Artificial Intelligence (DFKI), 66123 Saarbrücken, GermanyGerman Research Center for Artificial Intelligence (DFKI), 66123 Saarbrücken, GermanyGerman Research Center for Artificial Intelligence (DFKI), 66123 Saarbrücken, GermanyGerman Research Center for Artificial Intelligence (DFKI), 66123 Saarbrücken, GermanyIn high-stakes decision-making environments, predictive models must deliver not only high accuracy but also reliable uncertainty estimations and transparent explanations. This study explores the integration of probability calibration techniques with Conformal Prediction (CP) within a predictive process monitoring (PPM) framework tailored to healthcare analytics. CP is renowned for its distribution-free prediction regions and formal coverage guarantees under minimal assumptions; however, its practical utility critically depends on well-calibrated probability estimates. We compare a range of post-hoc calibration methods—including parametric approaches like Platt scaling and Beta calibration, as well as non-parametric techniques such as Isotonic Regression and Spline calibration—to assess their impact on aligning raw model outputs with observed outcomes. By incorporating these calibrated probabilities into the CP framework, our multilayer analysis evaluates improvements in prediction region validity, including tighter coverage gaps and reduced minority error contributions. Furthermore, we employ SHAP-based explainability to explain how calibration influences feature attribution for both high-confidence and ambiguous predictions. 
Experimental results on process-driven healthcare data indicate that the integration of calibration with CP not only enhances the statistical robustness of uncertainty estimates but also improves the interpretability of predictions, thereby supporting safer and robust clinical decision-making.https://www.mdpi.com/2076-3417/15/14/7925conformal predictionexplainable artificial intelligenceprobability calibrationpredictive process monitoring
spellingShingle Maxim Majlatow
Fahim Ahmed Shakil
Andreas Emrich
Nijat Mehdiyev
Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
Applied Sciences
conformal prediction
explainable artificial intelligence
probability calibration
predictive process monitoring
title Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
title_full Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
title_fullStr Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
title_full_unstemmed Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
title_short Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
title_sort uncertainty aware predictive process monitoring in healthcare explainable insights into probability calibration for conformal prediction
topic conformal prediction
explainable artificial intelligence
probability calibration
predictive process monitoring
url https://www.mdpi.com/2076-3417/15/14/7925
work_keys_str_mv AT maximmajlatow uncertaintyawarepredictiveprocessmonitoringinhealthcareexplainableinsightsintoprobabilitycalibrationforconformalprediction
AT fahimahmedshakil uncertaintyawarepredictiveprocessmonitoringinhealthcareexplainableinsightsintoprobabilitycalibrationforconformalprediction
AT andreasemrich uncertaintyawarepredictiveprocessmonitoringinhealthcareexplainableinsightsintoprobabilitycalibrationforconformalprediction
AT nijatmehdiyev uncertaintyawarepredictiveprocessmonitoringinhealthcareexplainableinsightsintoprobabilitycalibrationforconformalprediction
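The pipeline named in the record's abstract, post-hoc probability calibration feeding split conformal prediction, can be sketched roughly as below. This is not the authors' implementation: the synthetic dataset, random-forest base model, miscoverage level, and all variable names are illustrative assumptions; Platt scaling stands in for the several calibrators the paper compares.

```python
# Sketch: Platt scaling followed by split conformal prediction (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary data standing in for process-driven healthcare features.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Platt scaling: fit a logistic regression on the raw scores of a held-out set.
raw_cal = model.predict_proba(X_cal)[:, [1]]          # shape (n_cal, 1)
platt = LogisticRegression().fit(raw_cal, y_cal)
p_cal = platt.predict_proba(raw_cal)                  # calibrated probs, shape (n_cal, 2)

# Split conformal: nonconformity = 1 - calibrated probability of the true class.
alpha = 0.1                                           # target miscoverage
scores = 1.0 - p_cal[np.arange(len(y_cal)), y_cal]
q = np.quantile(scores, np.ceil((len(scores) + 1) * (1 - alpha)) / len(scores))

# Prediction set for each test point: every class whose nonconformity is <= q.
p_test = platt.predict_proba(model.predict_proba(X_test)[:, [1]])
pred_sets = [np.where(1.0 - row <= q)[0].tolist() for row in p_test]
coverage = np.mean([label in s for label, s in zip(y_test, pred_sets)])
```

Under this split construction, `coverage` should land near the nominal 1 - alpha level; in the paper's framing, better-calibrated inputs shrink the gap between nominal and empirical coverage and tighten the prediction sets.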