Online Continual Learning under Extreme Memory Constraints

Fini, Enrico; Lathuilière, Stéphane; Sangineto, Enver; Nabi, Moin; Ricci, Elisa
2020-01-01

Abstract

Continual Learning (CL) aims to develop agents emulating the human ability to sequentially learn new tasks while being able to retain knowledge obtained from past experiences. In this paper, we introduce the novel problem of Memory-Constrained Online Continual Learning (MC-OCL) which imposes strict constraints on the memory overhead that a possible algorithm can use to avoid catastrophic forgetting. As most, if not all, previous CL methods violate these constraints, we propose an algorithmic solution to MC-OCL: Batch-level Distillation (BLD), a regularization-based CL approach, which effectively balances stability and plasticity in order to learn from data streams, while preserving the ability to solve old tasks through distillation. Our extensive experimental evaluation, conducted on three publicly available benchmarks, empirically demonstrates that our approach successfully addresses the MC-OCL problem and achieves comparable accuracy to prior distillation methods requiring higher memory overhead.
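For context on the distillation term mentioned in the abstract, the sketch below shows a generic knowledge-distillation regularizer of the kind used in regularization-based continual learning: a cross-entropy loss on the new task combined with a KL-divergence penalty that keeps the model's outputs close to those recorded before the update. This is an illustrative sketch only, not the paper's Batch-level Distillation algorithm; the temperature T, the weight lambda_distill, and the function names are assumptions.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Generic soft-target distillation loss: KL divergence between
    temperature-softened previous-model and current-model outputs."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)

def continual_step(model, old_logits, x, y, lambda_distill=1.0):
    """One training step trading off plasticity (cross-entropy on the new task)
    against stability (distillation towards logits recorded before the update).
    `old_logits` stands in for the previous model's outputs on the same batch."""
    logits = model(x)
    loss_new = F.cross_entropy(logits, y)                        # learn the new task
    loss_old = distillation_loss(logits, old_logits.detach())    # preserve old knowledge
    return loss_new + lambda_distill * loss_old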
2020
European Conference on Computer Vision (ECCV)
Cham, Switzerland
Springer Science and Business Media Deutschland GmbH
978-3-030-58604-1
Fini, Enrico; Lathuilière, Stéphane; Sangineto, Enver; Nabi, Moin; Ricci, Elisa
Online Continual Learning under Extreme Memory Constraints / Fini, Enrico; Lathuilière, Stéphane; Sangineto, Enver; Nabi, Moin; Ricci, Elisa. - ELECTRONIC. - 12373:(2020), pp. 720-735. (Paper presented at the 16th European Conference on Computer Vision, ECCV 2020, held in Glasgow, Scotland, 23-28 August) [10.1007/978-3-030-58604-1_43].
Files in this record:

File: fini.pdf
Access: Restricted (archive managers only)
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 1.42 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/284412
Citations
  • PMC: not available
  • Scopus: 36
  • Web of Science: not available
  • OpenAlex: not available