
Detecting Errors in Convolutional Neural Networks Using Inter Frame Spatio-Temporal Correlation / Draghetti, L. K.; Santos, F. F. D.; Carro, L.; Rech, P. - (2019), pp. 310-315. (IOLTS 2019, Rhodes, Greece, 01-03 July 2019) [10.1109/IOLTS.2019.8854431].

Detecting Errors in Convolutional Neural Networks Using Inter Frame Spatio-Temporal Correlation

Rech P.
2019-01-01

Abstract

Object detection, a critical feature for autonomous vehicles, is performed today using Convolutional Neural Networks (CNNs). Errors in a CNN execution can modify the way the vehicle senses the surrounding environment, potentially causing accidents or unexpected behaviors. The high computational requirements of CNNs, combined with the need to perform detection in real time, leave little margin for implementing error detection. In this paper, we present an extremely efficient error detection solution for CNNs based on the observation that, in the absence of errors, the differences between the input frames and the detections provided by the CNN should be strictly correlated. In other words, if the image does not change significantly between two subsequent frames, the detection should also be very similar. Likewise, if the detection varies considerably from one frame to the next, the input image should also have changed. Whenever input images and output detections do not correlate, we can detect an error. After formalizing and evaluating the inter-frame and output correlation thresholds, we implement and validate the detection strategy using data from previous radiation experiments. Exploiting the intrinsic efficiency in processing images of the devices used to execute CNNs, we can detect up to 80% of errors while adding low overhead.
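The correlation check described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the difference metrics, the index-based box pairing, the function names, and the thresholds `frame_thresh`/`det_thresh` are all assumptions made for the sake of the example.

```python
import numpy as np

def frame_difference(prev_frame, curr_frame):
    # Mean absolute pixel difference between two consecutive frames
    # (a crude stand-in for a real inter-frame change metric).
    return float(np.mean(np.abs(curr_frame.astype(float) - prev_frame.astype(float))))

def detection_difference(prev_boxes, curr_boxes):
    # Crude detection-change metric: difference in box count plus the mean
    # shift of box centers paired by index. Boxes are (cx, cy, w, h) tuples;
    # a real matcher would pair boxes by IoU, not by list position.
    count_diff = abs(len(prev_boxes) - len(curr_boxes))
    paired = min(len(prev_boxes), len(curr_boxes))
    if paired == 0:
        return float(count_diff)
    shifts = [np.linalg.norm(np.asarray(a[:2]) - np.asarray(b[:2]))
              for a, b in zip(prev_boxes[:paired], curr_boxes[:paired])]
    return count_diff + float(np.mean(shifts))

def error_suspected(frame_diff, det_diff, frame_thresh=5.0, det_thresh=10.0):
    # Flag a mismatch between input change and output change:
    # detections changed a lot while the frames barely changed, or vice versa.
    # The threshold values here are placeholders, not the paper's calibrated ones.
    return (frame_diff < frame_thresh and det_diff > det_thresh) or \
           (frame_diff > frame_thresh and det_diff < det_thresh)
```

In use, one would compute both metrics for every consecutive frame pair and raise an error flag whenever `error_suspected` returns true; the paper's contribution lies in formalizing and calibrating the correlation thresholds that this sketch merely parameterizes.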
2019
2019 IEEE 25th International Symposium on On-Line Testing and Robust System Design (IOLTS)
New York, USA
IEEE Institute of Electrical and Electronics Engineers Inc.
978-1-7281-2490-2
Draghetti, L. K.; Santos, F. F. D.; Carro, L.; Rech, P.
Files in this record:
IOLTS_Detecting_Errors_in_Convolutional_Neural_Networks_Using_Inter_Frame_Spatio-Temporal_Correlation.pdf
Description: IEEE Symposium on On-Line Testing - conference paper
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 251.4 kB
Format: Adobe PDF
Access: restricted to archive managers

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/346649
Citations
  • PMC: ND
  • Scopus: 12
  • Web of Science: 11
  • OpenAlex: 16