Bringing Light Into the Darkness - A Systematic Literature Review on Explainable Predictive Business Process Monitoring Techniques
Stierle Matthias, Brunk Jens, Weinzierl Sven, Zilker Sandra, Matzner Martin, Becker Jörg
Abstract
Predictive business process monitoring (PBPM) provides a set of techniques for performing different prediction tasks in running business processes, such as predicting the next activity, the process outcome, or the remaining time. Nowadays, deep-learning-based techniques yield more accurate predictive models. However, the explainability of these models has long been neglected. While predictive quality is essential for PBPM-based decision support systems, their explainability for human stakeholders also needs to be considered. Explainable artificial intelligence (XAI) describes different approaches for making machine-learning-based techniques explainable. To examine the current state of explainable PBPM techniques, we perform a structured and descriptive literature review. We identify explainable PBPM techniques in the domain and classify them along different XAI-related concepts: prediction purpose, intrinsically interpretable or post-hoc, evaluation objective, and evaluation method. Based on our classification, we identify trends in the domain and remaining research gaps.
Keywords
business process, prediction, interpretability, explainable artificial intelligence