
Uncertainty-Aware Contrastive Distillation for Incremental Semantic Segmentation / Yang, Guanglei; Fini, Enrico; Xu, Dan; Rota, Paolo; Ding, Mingli; Nabi, Moin; Alameda-Pineda, Xavier; Ricci, Elisa. - In: IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE. - ISSN 0162-8828. - 45:2(2023), pp. 2567-2581. [10.1109/TPAMI.2022.3163806]

Uncertainty-Aware Contrastive Distillation for Incremental Semantic Segmentation

Guanglei Yang; Enrico Fini; Dan Xu; Paolo Rota; Mingli Ding; Moin Nabi; Xavier Alameda-Pineda; Elisa Ricci
2023-01-01

Abstract

A fundamental and challenging problem in deep learning is catastrophic forgetting, the tendency of neural networks to fail to preserve knowledge acquired from old tasks when learning new ones. This problem has been widely investigated in the research community, and several Incremental Learning (IL) approaches have been proposed in recent years. While earlier works in computer vision mostly focused on image classification and object detection, IL approaches for semantic segmentation have been introduced more recently. These works showed that, despite its simplicity, knowledge distillation can effectively alleviate catastrophic forgetting. In this paper, we follow this research direction and, inspired by recent literature on contrastive learning, propose a novel distillation framework, Uncertainty-aware Contrastive Distillation. In a nutshell, it operates by introducing a novel distillation loss that takes into account all the images in a mini-batch, enforcing similarity between features associated with pixels from the same classes and pulling apart those corresponding to pixels from different classes. Our experimental results demonstrate the advantage of the proposed distillation technique, which can be used in synergy with previous IL approaches and leads to state-of-the-art performance on three commonly adopted benchmarks.
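To make the batch-wise contrastive objective described above concrete, the following PyTorch sketch implements a pixel-level supervised contrastive loss: features of same-class pixels across the whole mini-batch are pulled together, and different-class pixels are pushed apart. This is a minimal illustration, not the authors' code; the function name, tensor shapes, temperature, and pixel-subsampling size are assumptions, and the paper's full method additionally weights contributions by an uncertainty estimate and couples the loss to the old (teacher) model's features, both of which this sketch omits.

```python
import torch
import torch.nn.functional as F

def contrastive_distillation_loss(feats, labels, temperature=0.1, max_pixels=1024):
    """Hypothetical pixel-wise contrastive loss over a mini-batch.

    feats:  (B, C, H, W) per-pixel feature maps from the current model
    labels: (B, H, W)    per-pixel class indices
    """
    B, C, H, W = feats.shape
    # Flatten every pixel in the mini-batch into one pool of feature vectors.
    f = feats.permute(0, 2, 3, 1).reshape(-1, C)          # (B*H*W, C)
    f = F.normalize(f, dim=1)
    y = labels.reshape(-1)                                # (B*H*W,)

    # Subsample pixels so the pairwise similarity matrix stays tractable.
    idx = torch.randperm(f.size(0), device=f.device)[:max_pixels]
    f, y = f[idx], y[idx]

    sim = f @ f.t() / temperature                         # pairwise similarities
    # Positives: pairs of pixels sharing a class label (self-pairs excluded).
    pos_mask = (y.unsqueeze(0) == y.unsqueeze(1)).float()
    pos_mask.fill_diagonal_(0)

    # InfoNCE-style log-probabilities; subtract the row max for stability.
    logits = sim - sim.max(dim=1, keepdim=True).values.detach()
    exp_logits = torch.exp(logits)
    self_mask = torch.eye(sim.size(0), device=sim.device, dtype=torch.bool)
    denom = exp_logits.masked_fill(self_mask, 0).sum(dim=1, keepdim=True)
    log_prob = logits - torch.log(denom + 1e-8)

    # Average the log-probability over each anchor's positives.
    n_pos = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(pos_mask * log_prob).sum(dim=1) / n_pos
    return loss.mean()
```

Subsampling is a practical necessity here: a single segmentation batch contains B x H x W pixels, so the full pairwise similarity matrix would be far too large to materialize.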
Files in this record:

Uncertainty-Aware_Contrastive_Distillation_for_Incremental_Semantic_Segmentation.pdf
  Access: archive administrators only
  Type: Publisher's version (publisher's layout)
  License: All rights reserved
  Size: 1.69 MB
  Format: Adobe PDF

Uncertainty_aware_Contrastive_Distillation___TPAMI_Special_Issue_compressed.pdf
  Access: open access from 02/02/2025
  Description: Accepted manuscript
  Type: Refereed author's manuscript (post-print)
  License: All rights reserved
  Size: 1.74 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/341694
Citations
  • PMC: 2
  • Scopus: 43
  • Web of Science (ISI): 38
  • OpenAlex: not available