
Generalising via Meta-Examples for Continual Learning in the Wild / Bertugli, Alessia; Vincenzi, Stefano; Calderara, Simone; Passerini, Andrea. - (2023), pp. 414-429. (Paper presented at the LOD conference held in Certosa di Pontignano, Italy, 19–22 September 2022) [10.1007/978-3-031-25599-1_31].

Generalising via Meta-Examples for Continual Learning in the Wild

Bertugli, Alessia; Passerini, Andrea
2023-01-01

Abstract

Future deep learning systems call for techniques that can deal with the evolving nature of temporal data and the scarcity of annotations when new problems occur. As a step towards this goal, we present FUSION (Few-shot UnSupervIsed cONtinual learning), a learning strategy that enables a neural network to learn quickly and continually on streams of unlabelled data and unbalanced tasks. The objective is to maximise the knowledge extracted from the unlabelled data stream (unsupervised), favour the forward transfer of previously learnt tasks and features (continual), and exploit the supervised information as much as possible when available (few-shot). The core of FUSION is MEML (Meta-Example Meta-Learning), which consolidates a meta-representation through the use of a self-attention mechanism during a single inner loop in the meta-optimisation stage. To further enhance the capability of MEML to generalise from few data, we extend it by creating various augmented surrogate tasks and by optimising over the hardest. An extensive experimental evaluation on public computer vision benchmarks shows that FUSION outperforms existing state-of-the-art solutions in both the few-shot and continual learning experimental settings.
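The abstract describes MEML as consolidating a meta-representation by aggregating examples through self-attention inside a single inner loop. As a rough illustrative sketch only (the function name, shapes, and the simple dot-product attention are assumptions, not the paper's implementation), such an aggregation of per-task embeddings into one "meta-example" could look like:

```python
import numpy as np

def meta_example(embeddings, w_att):
    """Collapse n per-example embeddings (n, d) into a single (d,) vector
    using a softmax-weighted combination driven by learned attention
    parameters w_att of shape (d, 1). Hypothetical sketch, not the
    authors' code."""
    scores = embeddings @ w_att                # (n, 1) attention logits
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()                   # weights sum to 1 over the n examples
    return (weights * embeddings).sum(axis=0)  # (d,) attention-weighted combination

# Toy usage: 5 unlabelled examples with 8-dimensional features.
rng = np.random.default_rng(0)
emb = rng.standard_normal((5, 8))
w = rng.standard_normal((8, 1))
me = meta_example(emb, w)
```

In the inner loop, a single update on such an aggregated representation would stand in for per-example updates, which is consistent with the "single inner loop" the abstract mentions.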
2023
Machine Learning, Optimization, and Data Science
Switzerland
Springer
Bertugli, Alessia; Vincenzi, Stefano; Calderara, Simone; Passerini, Andrea
Files in this record:

2101.12081.pdf
Open Access from 10/03/2024
Type: Refereed author's manuscript (post-print)
Licence: All rights reserved
Size: 569.78 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/364927
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0
  • OpenAlex: n/a