Outcome-oriented local explanation using context-aware process patterns / Vazifehdoostirani, Mozhgan; Buliga, Andrei; Genga, Laura; Di Francescomarino, Chiara; Ronzani, Massimiliano; Ghidini, Chiara; Dijkman, Remco. - In: ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE. - ISSN 0952-1976. - 159:(2025), pp. 111387-111387. [10.1016/j.engappai.2025.111387]

Outcome-oriented local explanation using context-aware process patterns

Di Francescomarino, Chiara; Ghidini, Chiara
2025-01-01

Abstract

Predictive Process Monitoring leverages machine learning models to predict the future states of business processes. Most Predictive Process Monitoring approaches rely on black-box models which, while powerful, lack interpretability, limiting their applicability in critical decision-making scenarios. Explainable Predictive Process Monitoring has emerged to address this gap, focusing on delivering actionable and transparent insights into predictions. Current approaches, however, often fail to incorporate multiple process perspectives and granularity levels in their explanations, overlooking crucial factors that influence process outcomes. This paper proposes a novel Explainable Predictive Process Monitoring approach that delivers explanations integrating multiple process perspectives at various levels of granularity. The proposed approach addresses the limitations of existing methods, providing comprehensive, process context-aware explanations. The effectiveness of the proposed method is assessed through experimental evaluations on real-life event logs, showing how the integration of diverse process perspectives improves the interpretability and predictive insights of local explanations, with improvements ranging from 2%–3% to 15%–20% depending on the event log under analysis.
2025
Vazifehdoostirani, Mozhgan; Buliga, Andrei; Genga, Laura; Di Francescomarino, Chiara; Ronzani, Massimiliano; Ghidini, Chiara; Dijkman, Remco
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, with all rights reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/473940
Warning: the data displayed have not been validated by the university.

Citations
  • Scopus: 0
  • OpenAlex: 0