Catastrophic Forgetting in Continual Concept Bottleneck Models

Marconato, E.; Teso, S.; Passerini, A.
2022-01-01

Abstract

Almost all Deep Learning models are dramatically affected by Catastrophic Forgetting when learning over continual streams of data. To mitigate this problem, several Continual Learning strategies have been proposed, yet the extent of the forgetting is still unclear. In this paper, we analyze Concept Bottleneck (CB) models in the Continual Learning setting and investigate the effect of high-level feature supervision on Catastrophic Forgetting at the representation layer. To this end, we introduce two metrics that evaluate the loss of information on the learned concepts as new experiences are encountered. We also show that the obtained saliency maps remain more stable under attribute supervision. The code is available at https://github.com/Bontempogianpaolo1/continualExplain
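
For illustration, below is a minimal PyTorch sketch of a jointly trained Concept Bottleneck model of the kind analyzed in the paper: an encoder predicts a concept layer that receives explicit attribute supervision, and a task head predicts the class label from those concepts. The layer sizes, loss weighting (lam) and joint-training choice are assumptions made for this sketch; the authors' actual implementation is in the repository linked above.

```python
# Minimal sketch (not the authors' implementation) of a Concept Bottleneck model
# with joint concept and task supervision.
import torch
import torch.nn as nn


class ConceptBottleneck(nn.Module):
    def __init__(self, in_dim: int, n_concepts: int, n_classes: int, hidden: int = 128):
        super().__init__()
        # Encoder maps the input to concept logits (the "bottleneck" representation).
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_concepts),
        )
        # Task head sees only the (sigmoid-activated) concept predictions.
        self.head = nn.Linear(n_concepts, n_classes)

    def forward(self, x):
        concept_logits = self.encoder(x)
        class_logits = self.head(torch.sigmoid(concept_logits))
        return concept_logits, class_logits


def joint_loss(concept_logits, class_logits, concepts, labels, lam: float = 1.0):
    # Task loss plus lam-weighted concept-supervision loss (lam is a hypothetical weight).
    task = nn.functional.cross_entropy(class_logits, labels)
    attr = nn.functional.binary_cross_entropy_with_logits(concept_logits, concepts.float())
    return task + lam * attr
```

In the Continual Learning setting studied in the paper, such a model would be trained over a sequence of experiences, and the concept-level representations at the bottleneck are where the effect of forgetting, and of attribute supervision, is measured.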
2022
21st International Conference on Image Analysis and Processing
Springer International Publishing AG, Gewerbestrasse 11, CH-6330 Cham, Switzerland
978-3-031-13323-7
978-3-031-13324-4
Marconato, E; Bontempo, G; Teso, S; Ficarra, E; Calderara, S; Passerini, A
Catastrophic Forgetting in Continual Concept Bottleneck Models / Marconato, E.; Bontempo, G.; Teso, S.; Ficarra, E.; Calderara, S.; Passerini, A. - 13374:(2022), pp. 539-547. (Paper presented at the ICIAP conference held in Lecce, May 23-27, 2022) [10.1007/978-3-031-13324-4_46].
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/364858

Citations
  • PubMed Central: ND
  • Scopus: 2
  • Web of Science (ISI): 1
  • OpenAlex: ND