
Ensembling Uncertainty Measures to Improve Safety of Black-Box Classifiers / Zoppi, T.; Ceccarelli, A.; Bondavalli, A. - 372:(2023), pp. 3156-3164. (Paper presented at the European Conference on Artificial Intelligence (ECAI), held in Krakow, Poland, 30 September to 4 October 2023) [10.3233/FAIA230635].

Ensembling Uncertainty Measures to Improve Safety of Black-Box Classifiers

Zoppi, T.; Ceccarelli, A.; Bondavalli, A.
2023-01-01

Abstract

Machine Learning (ML) algorithms that perform classification may predict the wrong class, i.e., experience misclassifications. It is well known that misclassifications may have cascading effects on the encompassing system, possibly resulting in critical failures. This paper proposes SPROUT, a Safety wraPper thROugh ensembles of UncertainTy measures, which suspects misclassifications by computing uncertainty measures on the inputs and outputs of a black-box classifier. If a misclassification is detected, SPROUT blocks the propagation of the classifier's output to the encompassing system. The resulting impact on safety is that SPROUT transforms erratic outputs (misclassifications) into data omission failures, which can be easily managed at the system level. SPROUT has a broad range of applications, as it fits binary and multi-class classification on both image and tabular datasets. We experimentally show that SPROUT always identifies a large fraction of the misclassifications of supervised classifiers and detects all misclassifications in specific cases. The SPROUT implementation contains pre-trained wrappers; it is publicly available and ready to be deployed with minimal effort.
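The wrapper idea described in the abstract — compute uncertainty measures on a black-box classifier's output and omit the prediction when the measures suggest a misclassification — can be illustrated with a minimal sketch. This is not the SPROUT implementation: the two uncertainty measures (maximum class probability and entropy of the predicted distribution), the thresholds, and the `safety_wrapper` function are all simplified stand-ins chosen here for illustration.

```python
import numpy as np

def entropy(probs):
    # Shannon entropy of the predicted class distribution;
    # higher entropy means a more uncertain prediction.
    p = np.clip(np.asarray(probs, dtype=float), 1e-12, 1.0)
    return float(-np.sum(p * np.log(p)))

def safety_wrapper(predict_proba, x, max_conf_threshold=0.7, entropy_threshold=0.5):
    """Wrap a black-box classifier: propagate its prediction only when
    simple uncertainty measures agree it is trustworthy; otherwise
    omit the output (return None), turning a possible misclassification
    into a data omission failure that the system can handle."""
    probs = np.asarray(predict_proba(x), dtype=float)
    confident = probs.max() >= max_conf_threshold
    low_entropy = entropy(probs) <= entropy_threshold
    if confident and low_entropy:
        return int(probs.argmax())  # propagate the classifier's output
    return None                     # suspected misclassification: omit

# Usage with a stub "black-box" classifier returning class probabilities.
clf = lambda x: [0.9, 0.05, 0.05] if x > 0 else [0.4, 0.35, 0.25]
print(safety_wrapper(clf, 1.0))   # prints 0 (confident, low entropy)
print(safety_wrapper(clf, -1.0))  # prints None (output omitted)
```

In the actual paper, SPROUT ensembles multiple uncertainty measures rather than thresholding two of them independently; the sketch only conveys the wrapper's fail-omission behavior.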
2023
26th European Conference on Artificial Intelligence
Amsterdam, the Netherlands
IOS Press
ISBN: 9781643684369; 9781643684376
Zoppi, T.; Ceccarelli, A.; Bondavalli, A.
Files in this record:

374Zoppi-2.pdf
  Access: repository managers only
  Description: preprint
  Type: Non-refereed preprint
  License: All rights reserved
  Size: 388.07 kB
  Format: Adobe PDF

FAIA-372-FAIA230635.pdf
  Access: open access
  Type: Publisher's version (publisher's layout)
  License: Creative Commons
  Size: 522.43 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/397675
Citations
  • Scopus: 1