Learning Aggregation Functions

Pellegrini, Giovanni; Tibo, Alessandro; Frasconi, Paolo; Passerini, Andrea; Jaeger, Manfred
2021-01-01

Abstract

Learning on sets is increasingly gaining attention in the machine learning community, due to its widespread applicability. Typically, representations over sets are computed by using fixed aggregation functions such as sum or maximum. However, recent results showed that universal function representation by sum- (or max-) decomposition requires either highly discontinuous (and thus poorly learnable) mappings, or a latent dimension equal to the maximum number of elements in the set. To mitigate this problem, we introduce a learnable aggregation function (LAF) for sets of arbitrary cardinality. LAF can approximate several extensively used aggregators (such as average, sum, maximum) as well as more complex functions (e.g., variance and skewness). We report experiments on semi-synthetic and real data showing that LAF outperforms state-of-the-art sum- (max-) decomposition architectures such as DeepSets and library-based architectures like Principal Neighborhood Aggregation, and can be effectively combined with attention-based architectures.
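To illustrate the idea described in the abstract, the sketch below shows a minimal learnable aggregation layer in PyTorch. This is a hypothetical example, not the LAF parametrization used in the paper: a generalized power mean with a learnable exponent can interpolate between mean and maximum, and a learnable dependence on the set cardinality lets it behave like a sum. The class name, parameters, and tensor shapes are assumptions made for this sketch.

import torch
import torch.nn as nn

class LearnablePowerMean(nn.Module):
    """Hypothetical learnable aggregator (illustration only, not the paper's exact LAF)."""
    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.log_p = nn.Parameter(torch.zeros(1))     # exponent p = exp(log_p) > 0; p = 1 ~ mean, large p ~ max
        self.size_exp = nn.Parameter(torch.zeros(1))  # output scaled by n**size_exp; 0 ~ mean-like, 1 ~ sum-like
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, set_size, features), assumed non-negative (e.g., after a ReLU encoder)
        n = x.shape[1]
        p = self.log_p.exp()
        generalized_mean = ((x.clamp_min(self.eps) ** p).mean(dim=1)) ** (1.0 / p)
        return generalized_mean * (n ** self.size_exp)

if __name__ == "__main__":
    layer = LearnablePowerMean()
    sets = torch.rand(8, 5, 16)   # a batch of 8 sets, each with 5 elements of 16 features
    print(layer(sets).shape)      # torch.Size([8, 16])

With the exponent fixed at 1 and size_exp at 1 this reduces to a plain sum; letting gradient descent adjust both parameters gives the kind of flexibility across mean, sum, and maximum that the abstract attributes to LAF.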
2021
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
International Joint Conferences on Artificial Intelligence Organization
978-0-9992411-9-6
Pellegrini, Giovanni; Tibo, Alessandro; Frasconi, Paolo; Passerini, Andrea; Jaeger, Manfred
Learning Aggregation Functions / Pellegrini, Giovanni; Tibo, Alessandro; Frasconi, Paolo; Passerini, Andrea; Jaeger, Manfred. - (2021), pp. 2892-2898. (Paper presented at the IJCAI conference held in Montreal, Canada, 19th-26th August 2021) [10.24963/ijcai.2021/398].
Files in this record:

ijcai2021.pdf

Open access

Type: Publisher's version (Publisher's layout)
License: All rights reserved
Size: 2.17 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/330913
Citations
  • Scopus 4