Anomaly detection is of great significance for intelligent video surveillance. Current works typically struggle with object detection and localization in crowded and complex scenes. Hence, we propose the Deep Spatiotemporal Translation Network (DSTN), a novel unsupervised anomaly detection and localization method based on a Generative Adversarial Network (GAN) and Edge Wrapping (EW). In training, we use only frames of normal events to generate their corresponding dense optical flow as temporal features. During testing, all video sequences are input into the system, and unknown events are flagged as anomalous because the model knows only the normal patterns. To benefit from the information provided by both appearance and motion features, we introduce (i) a novel fusion of background-removal and real optical flow frames with (ii) a concatenation of the original and background-removal frames. We improve anomaly localization in the pixel-level evaluation by proposing (iii) Edge Wrapping, which reduces noise and suppresses edges unrelated to abnormal objects. DSTN has been tested on publicly available anomaly datasets, including UCSD Pedestrian, UMN, and CUHK Avenue. The results show that it outperforms other state-of-the-art algorithms in the frame-level evaluation, the pixel-level evaluation, and time complexity for abnormal object detection and localization.
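The input pipeline described in the abstract — a temporal (motion) feature per frame pair plus an appearance input built by concatenating the original frame with its background-removed version — can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the paper uses dense optical flow and its own background-removal step, whereas here a simple temporal difference and a thresholded running-average background model stand in, and all function and variable names are assumptions for demonstration.

```python
import numpy as np

def background_removal(frame, bg_model, alpha=0.05, thresh=15):
    """Toy background removal: pixels far from a running-average background
    model are kept as foreground; the model is then updated. Stand-in for
    the paper's background-removal step."""
    diff = np.abs(frame.astype(np.int16) - bg_model.astype(np.int16))
    fg_mask = diff > thresh
    bg_model = (1 - alpha) * bg_model + alpha * frame   # update background
    return frame * fg_mask, bg_model

def temporal_feature(prev_frame, next_frame):
    """Crude stand-in for dense optical flow: the signed temporal gradient
    between consecutive frames."""
    return next_frame.astype(np.int16) - prev_frame.astype(np.int16)

# Toy 64x64 grayscale frames standing in for consecutive surveillance frames.
prev_f = np.zeros((64, 64), dtype=np.uint8)
next_f = prev_f.copy()
next_f[20:30, 20:30] = 200                              # an appearing object

bg = np.zeros((64, 64), dtype=np.float64)               # background estimate
fg_frame, bg = background_removal(next_f, bg)
motion = temporal_feature(prev_f, next_f)

# Appearance input: concatenate the original and background-removed frames
# channel-wise, mirroring item (ii) in the abstract.
appearance = np.stack([next_f, fg_frame], axis=-1)      # shape (64, 64, 2)
print(appearance.shape, motion.shape)
```

In the paper's setting, `motion` would instead be a dense optical flow field (two channels, dx and dy) generated by the GAN for normal events, and large deviations between generated and real flow signal an anomaly.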

Unsupervised Anomaly Detection and Localization Based on Deep Spatiotemporal Translation Network / Ganokratanaa, T.; Aramvith, S.; Sebe, N. - In: IEEE ACCESS. - ISSN 2169-3536. - 8:(2020), pp. 50312-50329. [10.1109/ACCESS.2020.2979869]

Unsupervised Anomaly Detection and Localization Based on Deep Spatiotemporal Translation Network

Sebe N.
2020-01-01

Files in this record:
File: 09031390.pdf
Access: open access
Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 2.68 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/266658
Citations
  • PMC: n/a
  • Scopus: 64
  • Web of Science: 40